By Abbey White, Staff Writer, SatNews
Dispatch from SmallSat Symposium. Coverage and analysis from across the conference, tracking the forces shaping the next phase of the SmallSat market.

MOUNTAIN VIEW. For decades the space industry measured ground-segment power in rack units and heavy copper cabling. That model was effectively retired here at the SmallSat Symposium. Instead of bigger antennas or faster rockets, discussion on the floor focused on the convergence of Silicon Valley software agility with modern defense's strategic requirements.
Industry leaders evaluating the traditional architecture of satellite operations at SmallSat's Space Virtualization, AI, and Future panel presented a challenge to legacy vendors. The U.S. military and intelligence communities are moving away from hardware and toward applications: software that can download a ground station to a laptop at a forward operating base, process targeting data in orbit, and execute solutions before an adversary detects the observation.
The End of the Pizza Box
Nicole Robinson, President of Gilat DataPath, clearly articulated this shift. Her company, deeply entrenched in the defense sector, sees uniformed customers moving away from the proprietary hardware that once defined the industry’s physical infrastructure.
Channeling feedback from her defense clients, Robinson said, “We don’t need a whole bunch of pizza boxes anymore. We’re happy to pay you for your waveform just as much as we were happy to pay you for your pizza box.”
This represents a paradigm shift in which the modem becomes a piece of software running on a generic server. For the Department of Defense, this is less a convenience than an operational necessity. A physical ground station is a fixed target. A virtualized ground station provides instant resiliency: it can be spun up in a cloud container on the other side of the planet. In a contested environment, Robinson noted, operators cannot be “beholden to one path of communication, one orbit, one frequency.”

The Battle for the OODA Loop: Cloud vs. Edge
The panel highlighted a tactical distinction regarding where data processing should occur. On one side sits the centralized power of hyperscale cloud providers. On the other lies the distinct physics of the tactical edge.
Doug Hairfield, Senior GenAI Solutions Architect at AWS, discussed the scale of terrestrial cloud computing. The “impatience of humans,” he argued, drives a need for democratization, where “no longer do you have to be an aerospace engineer to be able to provide value.” In his aggregative vision, large datasets are pulled down to Earth to be processed by large language models and generative AI.
Dennis Gatens, President of LEOcloud, challenged that vision with the constraints of a defense scenario: the latency of sending data to a centralized cloud, along with the cost of data egress, is a potential liability. Processing in orbit, by contrast, “enables the insight versus the raw data to be delivered from space to Earth.”
Gatens instead advocates for the Space Edge. His firm, now part of Voyager Technologies, has deployed data center infrastructure to the International Space Station with a clear strategy. Process the data in orbit. Instead of sending the image of the ocean, send the ship’s coordinates. Such an architecture bypasses the egress costs of the cloud, tightens the OODA (Observe, Orient, Decide, Act) loop, and gains fundamental advantages in “security, latency and transport.”
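The “send the insight, not the raw data” pattern Gatens describes can be sketched in a few lines. This is a purely illustrative toy, not LEOcloud’s or Voyager’s actual pipeline; the detection function, threshold, and byte accounting are all hypothetical stand-ins for a real onboard model and downlink budget.

```python
# Hypothetical sketch of edge processing: downlink coordinates, not imagery.

def detect_ships(image, threshold=200):
    """Scan a grayscale frame (list of rows) and return bright-pixel
    centroids as (row, col) tuples -- a stand-in for a real onboard
    detection model."""
    return [(r, c)
            for r, row in enumerate(image)
            for c, value in enumerate(row)
            if value >= threshold]

# A toy 4x4 "ocean" frame with one bright target.
frame = [
    [10, 12, 11, 10],
    [11, 10, 250, 12],
    [10, 11, 12, 10],
    [12, 10, 11, 11],
]

coords = detect_ships(frame)
raw_pixels = sum(len(row) for row in frame)  # cost of downlinking the full frame
insight_values = len(coords) * 2             # cost of downlinking coordinates only
print(coords)  # [(1, 2)]
```

Even in this toy case, the downlink shrinks from 16 pixel values to 2 coordinate values; at operational image sizes, that ratio is what bypasses cloud egress costs and tightens the OODA loop.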
The Autonomous Kill Chain
Fully autonomous systems formed a significant part of the conversation. Dr. Owen Brown of Scientific Systems (SSCI) steered the discussion to “tip and cue” architectures, in which a satellite detects an object of interest and, without human intervention, autonomously tasks another asset to track it.
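The tip-and-cue hand-off described above reduces to a simple control pattern: a detection above a confidence threshold automatically generates a tasking request for a second asset. The sketch below is illustrative only; the class names, confidence field, and threshold are hypothetical and do not reflect SSCI's implementation.

```python
# Hypothetical sketch of a "tip and cue" hand-off between two assets.

class TrackerSat:
    """A tracking asset that accepts cues and queues tasking requests."""

    def __init__(self, name):
        self.name = name
        self.tasks = []

    def cue(self, target_id):
        # Schedule a tracking task with no human in the loop.
        self.tasks.append(target_id)
        return f"{self.name} tracking {target_id}"

def tip_and_cue(detection, tracker, min_confidence=0.9):
    """If the detecting satellite's confidence clears the threshold,
    autonomously task the tracking asset; otherwise take no action."""
    if detection["confidence"] >= min_confidence:
        return tracker.cue(detection["object_id"])
    return None

tracker = TrackerSat("SAR-2")
result = tip_and_cue({"object_id": "vessel-041", "confidence": 0.97}, tracker)
print(result)  # SAR-2 tracking vessel-041
```

The confidence gate is where the autonomy debate lives: set it too low and the system cues on noise; remove the human entirely and, as the panel's skeptics note below, a bad detection propagates at machine speed.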
Such a process establishes a technical foundation for an autonomous kill chain, which Robinson described as working in practice by fusing different data types to validate targets instantly.
“One SAR collection in black and white is not really going to tell you an awful lot,” Robinson explained. “But the fusion of a SAR change detection algorithm produces an insight that says there are vehicles in this particular location.”
The speed of that insight is its primary value. The system identifies a confirmed set of vehicles and classifies the tank type, allowing commanders to make decisions based on that data.
The Garbage-In Risk
However, the move toward automated defense systems significantly raises the risk profile. Lucy Hoag, CEO of Violet Labs, provided a voice of caution, noting that the industry’s focus on speed must be balanced with quality verification. The software tools used to build these systems are often outdated, creating a gap between AI capabilities and engineering realities.
“The slow pace that we’ve allowed ourselves to deliver space systems is not just untenable,” Hoag stated. “It’s egregious.”
Hoag’s concern addresses the core AI dilemma: flaws in the underlying data may lead an autonomous system not just to act faster but to make mistakes faster. An AI model that misidentifies a rock as a tank is a technical issue in a demonstration; in an operational theater, it is a critical failure.
The “Quindar” Signal
The days of bent-pipe connectivity are fading. The new space architecture is a global, orchestrated system defined by software.
Companies like Quindar are automating control rooms. Co-founder Sunny Bhagavathula noted that operators are “not hiring more people” but expecting software to handle the scale. The infrastructure becomes invisible as it moves into code and containers. While the hardware footprint decreases, the stakes remain high. Beyond selling bandwidth, the industry is now selling the speed-of-decision advantage.


