"Every Starlink Above Critical Mass Is a Money Tree": OrbitsEdge Founder Rick Ward On Edge Compute, Orbital Data Centers, and Why the Biggest Bottleneck in Space Is Not the Rocket
I met Rick Ward at the New Worlds Conference in Houston last November, Rick Tumlinson's annual gathering of space founders, astronauts, and people who tend to say things like "cloud computing above the clouds" without a trace of irony. Ward was wearing a T-shirt with a raccoon on it. The raccoon's name is Orbit. It is also the mascot of his company, OrbitsEdge, and it appears on his business card. The company itself is less whimsical than the branding: Ward is trying to solve what he considers the most overlooked infrastructure problem in the space economy. Not launch. Not communications. Compute.
The problem clicked into focus for him a few years earlier, while he was working at Deep Space Industries, the asteroid mining company founded by Rick Tumlinson, which was also Ward's entry point into the space industry. DSI had built regolith simulants for NASA, experimented with 3D-printing structures on Mars, and was designing spacecraft to prospect near-Earth objects. Ward's job was hands-on: lab work with simulated asteroid materials, figuring out what it would really take to pull water and minerals out of rock in deep space. The answer was autonomy. You cannot send a person with a pickax to an asteroid for three years. It has to be robots. And robots need computers.
When he looked at what was actually flying in orbit, he found the RAD750: a single-core, 200-megahertz processor roughly on par with the first iPhone. It powered the Mars Perseverance rover. It powered military satellites. And it was, by any ground-based standard, ancient. For the kind of autonomous work Ward was imagining, it was not even close.
Bradford Space bought Deep Space Industries in January 2019. The asteroid mining mission disappeared into someone else's business plan. But the gap Ward had spotted, between what space systems needed to think through and what their onboard computers could actually handle, did not go away. That same year, he started OrbitsEdge.
Rick Ward grew up in Alabama, joined the United States Marine Corps after high school, and spent his enlistment carrying a rocket launcher. He later picked up an associate degree in mechanical engineering, kicked around the startup world, and ended up at Deep Space Industries, where the idea for OrbitsEdge took shape. He does not have a PhD. He is not a career aerospace engineer. He is a Marine veteran and company founder who noticed a missing layer in the space economy and went after it.
We sat down again this week to talk about what OrbitsEdge is actually building, how putting a computer on a satellite changes the math for Earth observation, why SpaceX's vertical integration follows a playbook from the 1800s, what a "digital space Pearl Harbor" might look like in practice, and whether AI is really as smart as everyone says it is.
Everyone is talking about AI right now, including in the space sector. You use these models daily. Where do you actually think they stand, and how does the AI boom connect to what OrbitsEdge is building?
Ward's view on AI is enthusiastic and skeptical in roughly equal measure. He uses multiple models regularly, likes different ones for different tasks, and does not consider any of them dominant across the board. "A lot of people are using Claude for coding. Grok is a popular one. There's different models that are good for different things." He pauses to note that he liked Claude's recent ad campaign. But his bigger concern is with the economics underneath.
"The problem I see is, how do you monetize a commodity?" he says. "There are a lot of agents out there. If any one of them had existed in isolation five years ago, it would have been world dominant, but they're all developing in parallel." The competition is a marathon, not a sprint, with each company pouring enormous capital into marginal improvements while consumers pay twenty dollars a month or use the free tier one level down. Ward frames it with an analogy: "What's the profit margin on sugar? And that only makes sense because sugar cane is cheap to grow. But in a world where sugar cane is very expensive to grow, how do I make money selling cheap sugar?" AI models are expensive to train and run. If the product they generate is priced like a commodity, the math gets uncomfortable.
As for what the models actually do, Ward is blunt. "It really does feel like it put 10,000 books into a blender, and when you say something, it reaches into the pile of confetti and pulls out some stuff to give you back." He is not dismissive. He just thinks the hype has outrun the reality in some cases. "Maybe I'm less bullish on AI today than I would have been yesterday."
Where the AI boom connects directly to OrbitsEdge is demand. Every query to every model requires compute, and compute requires infrastructure. "Is it affecting demand? Absolutely. Everybody's using ChatGPT for everything." He catches himself on the brand name. "That's generic shorthand. That's like Kleenex." The hunger for processing power is real. The question of where that processing happens, on the ground or eventually in orbit, is the question Ward's company is built around.
OrbitsEdge started with a vision for full-scale orbital data centers, but your first product is much smaller than that. Walk me through the pivot, what you are actually building now, and why the entry point matters as much as the long-term vision.
Ward does not dance around the pivot. When he started the company in 2019, the long-term idea was big: computing infrastructure in orbit. That has not changed. But the entry point did, because customers told him to his face that it had to. "I was trying to pitch a 1,000-watt platform, and they're like, my whole power budget is 500 watts. I can't dump all that into a computer that also weighs as much as my whole spacecraft. Do you have something smaller?"
He heard that enough times. The result is Edge10, OrbitsEdge's first product: a radiation-shielded, thermally managed computing module that runs on about 10 watts of power and pushes 50 trillion operations per second, enough to run a mid-range vision AI model. It is small, light, and designed to ride on someone else's satellite rather than needing its own bus.
"Edge compute basically means that the data is generated at the same place where the math is done," Ward says. For Earth observation, that one sentence reshapes the whole business.
Earth observation is your first market. How does putting compute onboard the satellite actually change the economics, and how bad are the economics right now for companies that do not have it?
Ward explains the current model with an analogy that sticks with you. "The business model for Earth observation is: the customer wants needles. So we gather these haystacks which have some needles, but mostly hay. We put them on a flatbed truck, we send them across the country, and then we look for needles." It is expensive, slow, and lossy. Some of the haystacks are pure hay. Some of the needles fall off on the road. And for a lot of Earth observation companies, the unit economics have been ugly. "Some of these companies have spent four dollars to make one. Without the Ukraine war, half of the EO companies probably would have been bankrupt by now."
If the satellite can process imagery onboard, the whole sequence flips. "You can pass a magnet over those haystacks as you build them, collect a big fat stack of needles, and just go ahead and send those down to the ground." Faster answers for the customer. Far less data to transmit. And the compression, in some cases, is striking.
Ward puts it concretely. "Can I take a picture of a parking lot or a dockyard or a rail terminal and turn it into a spreadsheet that describes what's going on, without being a photograph?" Depending on the resolution needed, that conversion can compress data by a factor of 10,000 to one. What goes down to the ground is not a picture. It is a summary: what is there, what changed, what matters. The system can also run change analysis, comparing today's image against yesterday's, last week's, or last month's. Ward jokes: "We can figure out what the heck Rick is doing in his backyard. And maybe that's of great interest to someone."
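To make the 10,000-to-1 figure concrete, here is a back-of-the-envelope sketch. The specific numbers are illustrative assumptions, not OrbitsEdge specifications: a large multispectral image tile versus a structured table of detections.

```python
# Back-of-the-envelope: raw image tile vs. structured detection summary.
# All figures below are illustrative assumptions, not OrbitsEdge specs.

# A 10,000 x 10,000 pixel tile, 4 spectral bands, 16 bits per pixel.
raw_bytes = 10_000 * 10_000 * 4 * 2  # 800,000,000 bytes (~800 MB)

# Hypothetical onboard detection output: 1,000 objects, each a short
# CSV row (type, lat, lon, heading, speed, confidence) of ~80 bytes.
summary_bytes = 1_000 * 80  # 80,000 bytes (~80 KB)

ratio = raw_bytes / summary_bytes
print(f"raw: {raw_bytes:,} B, summary: {summary_bytes:,} B, "
      f"compression: {ratio:,.0f}:1")  # prints 10,000:1
```

The exact ratio obviously depends on sensor resolution and scene density, but the shape of the math holds: the summary scales with the number of objects, while the raw product scales with the number of pixels.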
Then there is the communications bottleneck. Satellites are not always talking to the ground. Ground stations tend not to be on the ocean, so big chunks of each orbit pass with no downlink window at all. Even when a satellite does pass over a station, confirmation that the data actually made it down is unreliable, and the prioritization of what gets sent first has historically been poor. "If we can triage the data, figure out what's the most important thing to send down first, we'll be able to significantly improve delivery times."
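Ward's triage idea can be sketched as a simple priority queue: score each data product onboard, then drain the highest-value items first within whatever link budget a ground-station pass provides. The names and scoring here are hypothetical, a minimal sketch of the concept rather than anything OrbitsEdge has described.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Product:
    # heapq is a min-heap, so we store the negated priority
    # to pop the highest-priority product first.
    sort_key: float
    name: str = field(compare=False)
    size_kb: int = field(compare=False)

def enqueue(queue, name, size_kb, priority):
    heapq.heappush(queue, Product(-priority, name, size_kb))

def drain(queue, window_kb):
    """Send products in strict priority order until the next one
    no longer fits in this pass's remaining link budget."""
    sent = []
    while queue and queue[0].size_kb <= window_kb:
        p = heapq.heappop(queue)
        window_kb -= p.size_kb
        sent.append(p.name)
    return sent

q = []
enqueue(q, "ship_detections", 40, priority=0.9)    # small, high value
enqueue(q, "raw_thumbnail", 5_000, priority=0.2)   # big, low value
enqueue(q, "change_alert", 10, priority=0.95)
print(drain(q, window_kb=100))  # -> ['change_alert', 'ship_detections']
```

The design choice worth noting: this drains in strict priority order and stops at the first item that does not fit, which guarantees the most important products always go first even if it leaves some link capacity unused.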
Starlink is starting to chip away at part of this. SpaceX is working with companies like Muon Space to put Starlink terminals directly on observation satellites, which would mean always-on, high-bandwidth links. They are also developing their direct-to-cell network, which Ward describes as essentially "putting a cellular card on your satellite the size of a deck of cards." Some people argue that solves the problem entirely and makes onboard compute unnecessary. Ward does not buy it. "There are still constraints, and it's still beneficial to do onboard processing before you send everything down." The volume of raw data is growing faster than anyone's pipes can handle. The math, he thinks, has to happen where the data is born.
Beyond Earth observation, what other use cases are you pursuing, and have customers come to you with anything unexpected?
Ward lays it out in what he calls a "now, near, and next" framework. Earth observation is now. It pays the bills while the company works toward the bigger stuff.
Next is spacecraft autonomy, specifically rendezvous, proximity operations, and docking. That means one spacecraft approaching another for refueling, repair, servicing, or freeing something that got stuck. Ward being Ward, he also flags the flip side: "Intentionally damaging somebody else's spacecraft, either in a way that everybody knows, or in a way that's plausibly deniable. That's a different set of use cases." Either way, you need real compute onboard to pull it off.
After that comes in-space manufacturing. OrbitsEdge is talking to biopharmaceutical companies about microgravity production. One wants to grow cell cultures and organoids, basically partial organs you grow to study how the real thing works. Another is doing protein crystal synthesis, which touches drug development, optics, ceramics, metallurgy, and even the substrates used to print circuit boards. "There's so much you can do with crystallography in space," Ward says. "And I'm interested in being involved in all of that."
Ward frames OrbitsEdge's position across all of these with a line he knows is well-worn but still useful: "The shovels in the gold rush. Everything that you do is going to require some degree of autonomy, some degree of edge compute, and we can provide a lot of that."
On the unexpected side, OrbitsEdge has fielded inquiries ranging from blockchain-based image allocation systems, figuring out who gets paid when a requested satellite image is delivered, all the way to cryptocurrency mining in orbit. Ward ties it to the Kardashev scale, which measures civilizational advancement by energy use. "To get to Type I, you have to capture all the energy that falls on your planet. And I think a lot of getting to Type I is going to be compute." His long view: "Fifty years from now, the majority of humanity's compute is going to be done in space."
Walk me through the full roadmap. You start at 10 watts. Where does it go from there, and how does the company get from edge compute to actual orbital data centers without running out of money along the way?
The plan moves in steps. Edge10 at 10 watts is the starting point: low power, low weight, lots of potential customers. Then 50-watt and 100-watt systems. A 1,000-watt platform starts to make sense for commercial space stations, where operators need computing for their own systems and also for paying customers who show up with their own data needs.
After that, OrbitsEdge is eyeing a 250-kilowatt node, roughly twice the power output of the International Space Station. One of those would be about 25 server racks and the most compute anyone has put in orbit. Ward ran the numbers against xAI's Colossus data center and figured it would take about 1,000 of those nodes to match its raw throughput, ignoring latency. "That was an absurdly large number," he says, "until a couple of days later SpaceX did a filing saying they want to launch a million satellites."
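Ward's sizing can be reproduced as rough power-budget arithmetic. The ISS figure is the commonly cited solar array output, and the Colossus power draw is an assumption chosen for illustration; Ward did not share his exact inputs.

```python
# Rough power-budget sizing (illustrative assumptions throughout).
node_kw = 250               # one proposed orbital compute node
iss_kw = 120                # ISS solar array output, roughly
assumed_colossus_mw = 250   # hypothetical ground data center draw

print(f"node vs ISS: {node_kw / iss_kw:.1f}x")  # ~2.1x
print(f"nodes to match: {assumed_colossus_mw * 1000 / node_kw:.0f}")  # 1000
```

Under those assumptions, one node is indeed about twice the ISS's power output, and matching a ground hyperscale site takes on the order of a thousand nodes, which is the "absurdly large number" Ward describes.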
The thing Ward keeps coming back to is sequencing. A lot of the other space computing startups have staked everything on the data center vision. Their whole business model is centered on building a space-based data center, and if they cannot do that, they do not have a way to pay the bills. Ward sees companies whose pitch to investors boils down to: give us a huge amount of money, and eventually we will give you a much larger amount back, do not ask when. OrbitsEdge's approach is to start earning revenue at the small end, work through the technical headaches at low stakes, and scale up from a position of credibility rather than pure faith. "We can work out some of the technical challenges to the large stuff at the small scale, while also keeping the lights on and not being completely dependent on investor cash to stay alive one more day."
SpaceX is building orbital data centers on Starlink. It launches the rockets, makes the satellites, uses the network itself, and will likely have huge IPO cash. It also just spent roughly $13 to $17 billion on direct-to-cell spectrum. So where does OrbitsEdge fit when one company controls nearly the whole stack?
Ward does not pretend this is a comfortable question. But he has clearly thought about it, and the answer starts in the 1800s.
"SpaceX has a benefit in verticalization that most people don't have and most people can't and won't ever have,"he says. "It's kind of like the most efficient verticalization since Andrew Carnegie with steel. He owned the mines. He owned the railroads. He owned the processing. Full stack vertical integration." SpaceX owns the rocket, the satellite, the ground network, the spectrum, and more and more of the customer relationship. And the way they got there followed a logic that only makes sense in hindsight.
Ward walks through the economics. Reusable launch with Falcon 9 only works at high volume. "If you go from 10 to 12 launches a year, it'll take you decades to pay for the work. You will never break even. The only way that vertical launch is ever going to make sense is if you go from 10 to 100." But satellite launch demand is inelastic. Third-party customers were not going to 10x their orders. "The only answer is to become your own customer. And that's exactly what SpaceX did."
So they built Starlink. But satellite internet in low Earth orbit has its own broken business model: it takes thousands of satellites, and because LEO orbits are unstable, you have roughly three to five years to fill your constellation before the earliest satellites start decaying. You have to build fast, launch faster, and somehow reach the subscriber base that makes the unit economics work before the hardware falls out of the sky. SpaceX took that broken model and made it work, too.
Ward draws the parallel to Tesla. The company started with the hand-assembled, high-margin Model S and Model X. Then it shifted to the mass-produced, lower-margin Model 3 and Model Y, which have now eaten the luxury line. Next is the Cybercab, which Tesla may never sell at all, instead running it as a software-as-a-service platform. "Every Cybercab becomes a money tree. Just like every Starlink below critical mass is a money pit, but every Starlink above critical mass is a money tree. And then every time you improve the next generation, more performance, more leaves on the money tree. Add compute to it, and it's money plus gold."
With the IPO, SpaceX will have a massive cash reserve. "They can do insolvent things for a long time until they work out the business model. That's the thing. They have a history of doing things that didn't make any sense until they did."
And the direct-to-cell play expands the addressable market by an order of magnitude. SpaceX just spent billions acquiring spectrum and regulatory approvals that took another company a decade to assemble. Ward notes that Gwynne Shotwell recently appeared at a major global telecom conference, signaling serious intent. The business model logic is simple: "Do I want a million dollars from one guy, or a dollar from a million guys?" There are five or six billion cell phones in the world. Starlink users are tied to a fixed location, or at best a Starlink Mini for digital nomads. Cell phone users are everyone. If SpaceX enters the cellular game, they are leveraging a very similar tech stack to unlock something like a 10x expansion of their customer base. "The world's gonna get run over by a steamroller moving half a mile an hour," Ward says.
So where does that leave everyone else? Ward does not frame OrbitsEdge as going toe-to-toe with SpaceX. He sees it as serving the rest of the ecosystem, the satellite operators and mission types that do not live inside the Starlink stack. The next decade is still genuinely up for grabs. "SpaceX is going to do it one way or another. The question is whether or not they'll make any money off of it, or whether it's a giant fiasco. I'm thinking it's definitely going to be a giant fiasco at first, but it might end up not being a giant fiasco."
After our conversation, Ward followed up with a contrarian thought he had been turning over. The case for massive orbital inference data centers assumes demand will keep growing. But Moore's Law has not stopped. In 2026, high-end Qualcomm-based flagship phones already handle 20 to 40 billion parameter models smoothly. By 2027 or 2028, with continued NPU scaling and model densification, on-device capabilities could reach 70 to 100 billion parameter class, running fully private, low-latency AI assistants entirely locally. Ward's question is pointed: if your phone can handle half of what you want to ask it today, and three-quarters of it in two years, how much demand is actually left for a giant distributed inference data center in orbit? Online models will likely continue to outperform local ones, he concedes, but local models running on local data may be more desirable for many use cases. It is the kind of question a founder asks when he is genuinely trying to figure out where the market lands, not just where he wants it to land.
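Those on-device figures can be sanity-checked with simple arithmetic. The assumptions here are mine: a quantized model's weight footprint is roughly parameters times bits per weight divided by eight, ignoring the KV cache, activations, and runtime overhead that a real deployment would add.

```python
# Rough weight-memory footprint of a quantized model:
# params * bits_per_weight / 8 bytes. Illustrative only; real
# footprints add KV cache, activations, and runtime overhead.

def weights_gb(params_billion, bits):
    return params_billion * 1e9 * bits / 8 / 1e9  # gigabytes

for params in (20, 40, 70, 100):
    print(f"{params}B @ 4-bit: ~{weights_gb(params, 4):.0f} GB")
```

At 4-bit quantization, a 40-billion-parameter model needs about 20 GB for weights alone, already within reach of high-memory flagship phones, while the 70-to-100-billion class Ward mentions would need roughly 35 to 50 GB, which is why continued NPU and memory scaling is the load-bearing assumption in his scenario.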
On the risk side, there is a lot of hardware going up, there are adversaries watching, and there are known vulnerabilities in space infrastructure. What are the actual threats to what you and others are building in orbit, and which risks do you think are being overplayed?
Ward is calmer about accidental Kessler syndrome than most people expect. Pointing to the familiar dot-maps of everything in orbit, he says: "If you did that same map with airplanes, the US would basically be a solid mass of dots, and you'd be like, oh my god, we should be having 100 crashes per day. But we don't." Aviation built deconfliction systems over a century. The space industry had more incidents in the early days, too, but it is heading the same direction. More companies are now doing active space situational awareness, strapping cameras and computers onto spacecraft to track what is around them and feeding that into a shared network. Ward recently spoke with one such company. "They're basically wanting to put cameras on everything and then a computer to analyze what the camera sees, so you can have a local understanding of what's going on in your vicinity." Tracking active spacecraft is getting better. The debris is still the hard part.
Intentional Kessler is a different story. Ward will not spell out the method, but he is blunt. "There's a relatively easy way to turn lots of LEO into space trash." He treats it as a Samson option, the kind of thing an adversary does when it has decided that if it cannot use space, nobody will.
What he thinks is more likely is what he calls a "digital space Pearl Harbor." In a conflict with China, the opening move would probably be cyber, not kinetic. "We know there are tons of backdoors. How many of them we know exist and have closed, that they don't know we've closed, and how many we don't know exist, and we'll just have to find out the hard way."
The goal would not be to blow up satellites. It would be to shut them off long enough to matter. "If you can just deny the use of assets that still exist for somewhere between 12 and 96 hours, that's tremendously valuable." Ward draws the comparison to Pearl Harbor: Japan was not looking for a long war. It was looking for a fait accompli, a new reality delivered so fast that the other side accepts it rather than fights. "They wanted to present us with a fait accompli. This is the way the world works now. You can either accept it, or we can fight over it." If Japan had sent a third wave to hit the fuel storage tanks, the Pacific War might have looked very different. The strike was crippling but not decisive enough to prevent the galvanized American response that followed.
Ward sees China studying the same logic. And recent events, he believes, may have disrupted Beijing's calculus. The United States has moved against Venezuela and Iran, both Chinese allies that supplied discounted fuel in exchange for weapons and diplomatic cover. Those actions removed significant energy inputs from China's supply chain and, perhaps more importantly, tested Chinese-exported defense systems in actual combat. The results, in Ward's reading, were not good for Beijing. "Billions of dollars worth of equipment, some of which was actually taken from operational PLA units, all got turned to scrap metal in the first 24 hours." He mentions reports of Chinese-made air defense systems failing to detect stealth aircraft, tanks with barrel life so short the cannon exploded after a small number of rounds, and a general pattern that has undercut China's defense export program.
Ward notes that the daily PLA Air Force harassment flights toward Taiwan reportedly stopped the day bombs started falling in Iran. "Because they're scared," he says, though he adds the caveat that his primary sources on this are Chinese dissident media channels, which have their own editorial motivations. "They are people who want to put out the message that China is a paper tiger."
The broader question Ward raises is about information flow inside the Chinese system. "How much of that is getting up to Xi?" If subordinates are afraid to deliver bad news, and if the equipment is not performing as advertised, then the leadership is making calculations based on assumptions that may not hold. And a country making strategic decisions on bad data, with a leader who has purged anyone willing to challenge him, is, in Ward's estimation, the most dangerous kind of adversary for everyone who depends on space infrastructure. "He worked himself into a corner where everybody's afraid of him, but nobody has anything to lose."
Author's Analysis
It is 2038. A constellation of Earth observation satellites passes over the South China Sea every 47 minutes, imaging every vessel, every wake, every construction barge working on a disputed reef. Each satellite carries an onboard computing module that processes imagery in real time, compresses it into a structured summary, and sends the results to a joint operations center in Guam within minutes of capture. The raw images never touch a ground station. What arrives is a spreadsheet: vessel type, heading, speed, change since the last pass. The analysts on the receiving end are working with data that is less than an hour old and has already been sorted by a machine that knows what to look for.
A decade earlier, that workflow would not have been possible. The satellites would have captured images, stored them, waited for a ground station pass, beamed the raw files down, and then waited again while analysts dug through terabytes looking for the handful of frames that actually mattered. The gap between capture and decision could stretch into days. In a crisis, you do not have days.
The company that built those computing modules does not launch rockets or operate satellites. It builds small, radiation-hardened boxes that turn raw data into answers at the point of collection. It started with a 10-watt product that could run a single vision AI model, and then it scaled. Whether that company turns out to be OrbitsEdge, or whether the market Ward identified gets absorbed by larger players before an independent builder can plant a flag, is still an open question. Ward is betting that the space economy is big enough, and messy enough, to need infrastructure that is not locked inside any single platform. The market has not settled that bet. But the data problem he spotted at Deep Space Industries keeps getting worse, and the satellites keep going up. If the first days of a future crisis are shaped by who can process orbital data fastest, and the computing layer that makes that possible was never built because no one could figure out how to fund it independently of SpaceX, what exactly was the plan?
About Rick Ward
Rick Ward is the founder and CEO of OrbitsEdge, a space computing company building radiation-shielded edge computing platforms for in-orbit data processing. The company's product roadmap runs from 10-watt edge modules to 250-kilowatt orbital data center nodes.
Before starting OrbitsEdge, Ward worked at Deep Space Industries, where he built asteroid regolith simulants for NASA and researched in-space resource utilization, including 3D-printing methods for Mars construction. His work on the computing requirements for autonomous space mining led directly to the founding of OrbitsEdge.
Ward served in the United States Marine Corps and holds an associate degree in mechanical engineering. He speaks regularly at space computing conferences and has been covered by Bloomberg, NASA Spinoff, and Payload Space.