"I've Documented Over 17 Million Control System Cyber Incidents": Father of Industrial Control Systems Cybersecurity Joe Weiss on Why Ground Stations Are the Achilles' Heel of Modern Space Operations
Cybersecurity expert Joe Weiss warns that space infrastructure's greatest vulnerability lies in the decades-old terrestrial ground stations that control satellites—systems built when security wasn't even a consideration.

Joe Weiss speaks with the quiet urgency of someone who has glimpsed catastrophe on the horizon. For four decades, he has inhabited the shadowy intersection between physical engineering and digital security—a no-man's land where the most devastating vulnerabilities in our critical infrastructure hide in plain sight. Having amassed a database of more than 17 million control system cyber incidents—incidents that have collectively claimed over 30,000 lives—Weiss has become a modern Cassandra, his warnings largely unheeded by an industry captivated by data security while blind to the physical components generating that data. Some of those documented incidents are space-related.
"There's a gap between what engineers do and what IT professionals think," Weiss explains, leaning forward with intensity. "Two separate universes with minimal overlap."
This disconnect might once have been merely academic. Today, as billionaires race to colonize low Earth orbit with thousands of satellites, and as nation-states develop increasingly sophisticated capabilities to weaponize space, it represents an existential threat. The most alarming vulnerability, according to Weiss, isn't in the sleek spacecraft capturing public imagination, but in their terrestrial counterparts—the ground stations that communicate with and control these orbital assets. These stations rely on decades-old industrial technologies developed in the 1970s through 1990s, an era when security wasn't even a consideration.
For Weiss, who helped start the control system cybersecurity program for the electric industry 25 years ago and has testified before multiple Congressional committees, this blind spot represents a ticking time bomb—one that could detonate with catastrophic consequences as commercial space operations rapidly expand. I sat down with him to explore the lessons from industrial control systems that have urgent applications for our orbital future, and to understand why the most dangerous threats to space operations might come not from space itself, but from the ground beneath our feet.
After 40+ years in control systems, what security principles remain constant, and how do they apply to commercial space operations? Your work bridging multiple generations of industrial technology offers valuable perspective as the space sector rapidly commercializes.
"To start with, there are really two aspects of security," Joe Weiss explains, setting the framework for our conversation. "One is data security, and the other is what you might call functional or control system security."
This distinction forms the core of Weiss's perspective — a critical difference he believes is often overlooked in modern cybersecurity approaches. While most security professionals focus on protecting information, Weiss has spent decades concentrating on the physical components that generate and act on that information.
"Data security is what most people think of first—making sure, number one, that their data is accurate, and number two, that no one is viewing or altering it without permission," he clarifies. "That's what typically comes to mind when you hear the term 'cybersecurity.'"
But there's another side that receives far less attention—what Weiss calls "the engineering side of cyber."
"It's the process sensors, pumps, valves, relays, transformers, and turbines—the actual equipment that generates and acts on the data," he emphasizes. "Data doesn't just come out of nowhere."
Weiss notes the disconnect was evident at the recent RSA Conference in San Francisco, which drew 46,000 attendees. Named after its founders (Rivest, Shamir, and Adleman, who created the RSA encryption algorithm), the RSA Conference has grown since 1991 to become the world's largest cybersecurity event and a bellwether for industry trends. Yet despite its prominence, Weiss estimates, "there weren't 25 engineers of the 46,000" attendees. "As far as the RSA Conference was concerned, data just pops up out of the ground, must be secured, authenticated, and delivered, and that's the only thing that matters."
This distinction becomes even more critical in space operations. "When you go into space, everything is remote," Weiss notes. "That introduces unique cyber challenges — how do you send and receive data from something that might be a million miles away?"
The most alarming insight from our conversation is how little has changed in the fundamental security approaches to these engineering systems, despite decades of technological advancement. Most sensors and control systems were developed in the 1970s through 1990s, an era when security wasn't a consideration.
"Most of the sensors and control systems still in use were actually developed in the '80s or '90s — with some of the old analog models dating back to the '70s," Weiss explains. "The concept of security was not considered. Who would have thought anyone would want to manipulate these devices?"
These sensors include resistance temperature detectors (RTDs), thermocouples, pressure transmitters, level transmitters, and flow meters—the fundamental instrumentation that measures physical conditions and enables control of everything from power plants to water systems to rockets. Many of these devices use protocols like HART (Highway Addressable Remote Transducer), Modbus, or Profibus that were designed for reliability rather than security.
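What "designed for reliability rather than security" means in practice can be made concrete. The sketch below, a minimal Python example under stated assumptions (the PLC address, unit ID, and register number are all hypothetical), hand-builds a complete Modbus/TCP "write single register" request. The frame layout is the genuine protocol format, and the notable thing is what it lacks: there is no field anywhere for credentials, sessions, or signatures, so any host that can reach a device's TCP port 502 can command it.

```python
# Minimal sketch: a complete, valid Modbus/TCP "write single register"
# request built by hand. The protocol defines no authentication,
# authorization, or integrity fields at all; that is reliability-era design.
# The target host, unit ID, and register number are hypothetical.

import socket
import struct

def write_single_register(host: str, unit_id: int, register: int, value: int,
                          port: int = 502, timeout: float = 3.0) -> bytes:
    """Send Modbus function 0x06 (write single register); return the reply."""
    pdu = struct.pack(">BHH", 0x06, register, value)   # function, address, value
    # MBAP header: transaction id, protocol id (0 = Modbus), length, unit id
    mbap = struct.pack(">HHHB", 0x0001, 0x0000, len(pdu) + 1, unit_id)
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(mbap + pdu)        # 12 bytes total on the wire
        return sock.recv(256)           # device echoes the request on success

if __name__ == "__main__":
    # Anyone with network reach can do this; the device cannot ask "who are you?"
    reply = write_single_register("192.0.2.10", unit_id=1, register=100, value=1)
    print(reply.hex())
```

The openness that makes such protocols trivial to integrate and troubleshoot is exactly the engineering choice Weiss describes as creating cyber vulnerability.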
This legacy presents a particular challenge for space systems, which often incorporate these same industrial control technologies. "Even the latest technologies are still based on or use our old '80s and '90s designs," Weiss emphasizes. "People assume it's all new, but that's not the case."
Perhaps most concerning is the inherent conflict between engineering priorities and security requirements. "Everything you do to make it reliable, safe, or efficient also makes it cyber-vulnerable," Weiss warns. "You’re essentially opening it up so others can see. It’s this innocence of the engineering world."
Meanwhile, cybersecurity professionals often advocate for the opposite approach. "They want everything locked down — need-to-know access, maximum restriction."
This conflict plays out in the design of control systems such as Programmable Logic Controllers (PLCs), Distributed Control Systems (DCS), and SCADA (Supervisory Control and Data Acquisition) systems, which are now ubiquitous in industrial and manufacturing control. These systems were designed with openness, reliability, and efficiency as primary goals—not security.
This fundamental difference in philosophy has created two separate worlds with little overlap or understanding between them. "There’s a gap between what engineers do and what network security professionals do," Weiss says. "The engineering world with its priorities has collided with the network security world, creating two separate universes with almost no overlap."
Table 1: Engineering vs. Cybersecurity - Two Worlds Colliding
In space, where everything is remote, this disconnect becomes a critical vulnerability. "You can’t physically secure or redesign equipment once it’s deployed," Weiss notes. "Unlike IT, you can't just patch it."
Yet the space industry often applies traditional cybersecurity approaches that ignore these engineering realities. "Most of the decision-makers are cybersecurity professionals who have little understanding of engineering," Weiss points out. "They don't care if the rocket lifts off — as long as the data is secure."
The emergence of the term "operational technology" (OT) in 2006 further complicated matters, creating what Weiss describes as an artificial barrier between engineering and cybersecurity.
"OT stands for operational technology," Weiss explains. "It was a term coined by Gartner in 2006 — an IT organization that needed a label for something it didn’t fully understand. That’s where OT came from. If it wasn’t IT, they labeled it OT, even though they had little idea what it really meant."
"Before that, you were either an engineer or you were IT. Engineers didn’t like IT, and IT didn’t like engineers," he continues. "Back then, IT was simply a support service. If engineers needed a firewall installed or something else, they would call IT and say, 'Hey, come here and do this.' But once the term OT emerged, everything changed. This strange new category appeared, and suddenly, all security responsibilities were shifted to the OT organization—away from engineering."
Weiss points to a recent job posting that illustrates this disconnect. "There was a job opening on LinkedIn about a month ago from one of the largest utilities—a very large utility—and they were looking for an OT cybersecurity analyst, actually a senior OT cybersecurity analyst. The requirements? A degree in computer science, not engineering," Weiss explains. "The role was focused on identifying network vulnerabilities and ensuring compliance with the North American Electric Reliability Corporation's (NERC) critical infrastructure protection—essentially, meeting cybersecurity requirements. The job did not require any knowledge of the electric grid itself."
Weiss also recalls receiving what he initially thought was junk mail from a mid-sized water company looking for engineers. The job description outlined responsibilities such as: "Assisting with or leading efforts to provide electrical engineering and technical support to ensure reliable operation of the agency's supervisory control and data acquisition (SCADA) system, including remote terminal units, programmable logic controllers, programmable automation controllers, associated industrial communications, networking equipment, and protective relaying equipment." Notably, the description made no mention of cybersecurity or the term “OT”.
As commercial space operations accelerate, with companies like SpaceX launching thousands of satellites and private firms exploring everything from space tourism to asteroid mining, Weiss's warning becomes increasingly urgent. Security principles must integrate both data and engineering perspectives from the outset—not as an afterthought.
"Those whose responsibility included cybersecurity were removed from dealing with cyber after the term OT was established are now no longer involved with cybersecurity of their own systems," Weiss says. "They're designing these systems to function, but cybersecurity isn't part of the conversation."
The implications for space systems are clear. "Think about all the communications you want back and forth, the rockets on the launch pad, or the radar stations used to track everything. There’s no security built into that."
The 2022 Russian cyberattack against Viasat demonstrated how conflicts increasingly target space-related communications infrastructure. As someone who testified before Congressional committees on critical infrastructure protection, what policy approaches do you believe would better protect the growing commercial space sector from similar threats?
The Russian cyberattack against Viasat in 2022 disrupted satellite communications across Europe just as Russia began its invasion of Ukraine, providing a sobering example of how modern warfare increasingly targets space infrastructure. This wasn't the first time Russia had used cyber capabilities against critical infrastructure—Weiss points to the 2015 attack on Ukraine's power grid as another instructive example.
"When the Russians cyber-attacked Ukraine's power grid in 2015, the Ukrainians operated without networks for almost six to eight months after recovering. They simply couldn't trust them," Weiss recalls. "We operated power grids for 80 to 100 years without these networks. After all, Edison didn't invent the internet."
This historical perspective challenges assumptions about what systems truly need network connectivity. "You don't need those networks to run the power grid," Weiss points out. "On the other hand, if you don't have the sensors and the actuators, you have no power grid."
When asked about policy approaches to protect the growing commercial space sector, Weiss offers a fundamental reframing of how we should think about cybersecurity governance. "If you were going to have a heart bypass operation," he begins with an analogy, "who would you want performing it? The heart surgeon or the network security people who set up the operating room?"
The answer seems obvious—you'd want the heart surgeon. Yet in critical infrastructure, including space systems, Weiss argues, "We're having the people who network the operating room telling the heart surgeon what to do. It's insane."
This inverted dynamic, where network security specialists dictate requirements to engineering experts, creates fundamental vulnerabilities in our most critical systems, including space infrastructure. The first principle Weiss recommends is a reversal of this relationship.
"The first thing is that you need space scientists telling the network people, 'This is what I need to accomplish. Now you tell me how to best secure it,'" he explains. "We can't continue doing it backwards, which is happening everywhere today—network people dictating what you can or cannot do, potentially preventing the heart surgeon from saving your life.”
Weiss points to Tesla as a positive example of security-by-design, though he's quick to note he's not defending Elon Musk specifically. "Musk had a phenomenal approach with Tesla in that he knew he was going to need all of these over-the-air updates. As a result, Tesla made cyber security an integral part of the design of everything in the vehicle."
The results speak for themselves. "I can't recall seeing any cybersecurity notifications about Tesla. Why? Because security was integrated into the basic design from the beginning."
This approach stands in stark contrast to most cybersecurity implementations, which were added as afterthoughts. "Almost everywhere else cyber was an add-on at the end," Weiss says. This pattern repeats across industries, from automotive to industrial control systems.
"There was one control system supplier that was fully secure," Weiss reveals, "and they are no longer in business." The story of this company illustrates both the possibility and challenges of security-by-design. This vendor was able to develop a fully secure system because they started from scratch without legacy compatibility constraints. "They started with a clean sheet of paper. And from day one, this control system was designed to be cyber secure. It was even electromagnetic pulse (EMP) resistant. It was a better control system that was also cybersecure"
Weiss elaborates on the company's origins: "The company was a spin-off of a semiconductor manufacturer. They had brought in the head of R&D from one of the major control system vendors." This experienced leader applied "lessons learned from 20 or 30 years in the business" to create a better, more secure control system. Because they were a spin-off of a semiconductor manufacturer, "they effectively had almost the entire supply chain within their scope."
"The only element outside their scope was the operating system, so they partnered with a vendor who had supplied the Air Force with an exceptionally robust operating system," Weiss explains. "Everything else was self-designed and built from day one with cybersecurity as a fundamental principle."
Despite creating a genuinely secure control system that intelligence organizations purchased for testing and that was installed in water and other systems, the company ultimately disappeared. This cautionary tale highlights how market forces can sometimes work against security innovation—a lesson with direct relevance to the commercial space sector, where competitive pressures and time-to-market often trump security considerations.
When I ask about the Golden Dome missile defense system currently being developed, Weiss is direct about the security implications: "If you haven't integrated cybersecurity from the very beginning of development, you're doomed."
Even more concerning, failure could be worse than simply not working. "The worst case isn't simply system failure. The worst case is the system functioning in ways you never intended," he explains. With missile defense, this could mean "your defensive rocket lands in the wrong place and strikes unintended targets. Essentially, you become the very threat you're trying to stop," Weiss warns.
"I would find it hard to believe that cyber isn't being addressed," Weiss says regarding the Golden Dome project. "However, I go back to what I was telling you before. Are they really looking at both sides of cyber, or only the network side? And that I can tell you from what I've seen to date, both on the DoD as well as the commercial side, that isn't happening."
For the commercial space sector, Weiss's policy recommendations would focus on reversing the current governance model, requiring cyber-physical security to be integrated from the beginning of system design, and ensuring engineers maintain authority over security requirements rather than network specialists.
Weiss concludes with a stark assessment: "We need comprehensive cybersecurity training for both network teams and engineering/operations personnel so they can effectively identify when control system incidents have cyber origins."
Without this fundamental change in approach, our increasingly crowded orbital infrastructure remains vulnerable not just to data breaches, but to physical compromise that could have catastrophic consequences.
Your experience with Estonia following their landmark cyber attacks and your participation in the first NATO cybersecurity conference provides unique insights. How have countries like Estonia evolved their approach to critical infrastructure security, and what lessons could benefit the space sector?
Weiss's experience with Estonia offers a fascinating window into how nations respond to cyber attacks on critical infrastructure—and the lessons that might apply to securing space systems. "After the first Estonian cyber attack," Weiss begins, referencing the 2007 incident when Russia targeted Estonia after the relocation of a Soviet-era statue, "the first NATO cybersecurity conference was held in Tallinn, Estonia."
Through a remarkable coincidence, Weiss was able to attend this pivotal meeting. "I received an invitation to the conference, but traveling from San Francisco to Tallinn wasn't feasible. By coincidence, I happened to be at a nuclear plant in Sweden at that time. The representatives from Vattenfall said, 'We can cover your ticket from Gothenburg, Sweden, to Tallinn, Estonia.' I actually flew Air Estonia—yes, there really is an Air Estonia—and attended that groundbreaking NATO cybersecurity meeting in Tallinn," he adds.
This first NATO cybersecurity meeting would prove to be a turning point, but Weiss notes a critical gap in the conversation even then. "I was the only engineer present," he recalls. "Everyone else focused exclusively on data security—I was the only one addressing physical systems."
This focus on data security rather than operational technology reflected Estonia's particular vulnerability profile at the time. "When Russia cyberattacked Estonia, I asked my taxi driver upon arriving in Tallinn, 'Did you lose power?' No. 'Did you lose water?' No. The attack targeted only networks, primarily affecting banking and other IT systems."
Estonia was particularly vulnerable to this type of attack because "Estonia was far ahead of other countries, having digitized all their banking, driver's licenses, and government services. When Russia disrupted these systems, the impact was significant precisely because everything was interconnected and web-enabled."
At this first NATO cyber meeting, Weiss met a professor who is now at the Naval Postgraduate School—a connection that would later lead to collaborative work on addressing the educational gap between engineering and cybersecurity. This educational gap remains a fundamental problem across industries and nations. "Cybersecurity is taught within computer science departments," Weiss explains. "Very few universities require computer science students to take even introductory engineering courses. Consequently, future cybersecurity experts graduate knowing essentially nothing about the engineering systems they're supposed to protect."
Conversely, "Engineering disciplines—electrical, mechanical, nuclear, civil, structural—typically don't require any coursework in cybersecurity. Consequently, cybersecurity considerations are not part of the thought process".
This creates a dangerous divide where engineers design systems without security considerations, while cybersecurity experts implement protections without understanding the engineering realities and potential impacts. Weiss and his colleague from the Naval Postgraduate School, Professor Bret Michael, are addressing this through publications including an upcoming article in IEEE Computer magazine.
"We're writing this article because our educational systems are fundamentally broken when it comes to control system cybersecurity," Weiss explains. "I received what I initially thought was junk mail," he says, referring to the water company job posting mentioned earlier. "It made me realize that engineers responsible for networking and communications infrastructure have no security requirements in their job descriptions. Even more concerning, this extensive recruiting brochure never once mentioned the term ‘Operational technology (OT)’."
The lessons for the space sector are clear—addressing cybersecurity challenges requires bridging this fundamental educational and conceptual gap. Countries like Estonia, despite being ahead of their time in digital governance, still faced this division between IT security and engineering security. For the rapidly expanding commercial space sector, these lessons suggest the need for integrated approaches that combine IT security expertise with a deep understanding of the engineering systems that drive space operations. Moreover, this gap also exists between engineers and policymakers.
How should security approaches evolve for space systems that operate autonomously for extended periods without physical access? Your experience with embedded systems in hostile environments seems particularly relevant to this challenge.
When asked about securing autonomous space systems that operate for extended periods without physical access, Weiss laughs with a hint of irony. "Here's the ironic part," he says. "Before we started using the term 'autonomous,' what exactly did we think satellites were?"
His point cuts through modern buzzwords to a simple truth: space systems have always been autonomous by necessity. "Once you launch a rocket or satellite into space, all operations must be performed remotely. It's inherently autonomous—there's simply no alternative."
This perspective reframes the question entirely. The challenge of securing autonomous space systems isn't new—it's as old as the space program itself. What's changed is the scale, complexity, communication technologies, and commercial nature of modern space operations.
"We've actually hacked into our own satellites to perform repairs," Weiss notes, highlighting both the necessity and vulnerability of remote access. "If those with legitimate purposes can hack in to fix them, imagine what malicious actors could accomplish."
The security principles for autonomous space systems must account for this reality—that any system designed for remote maintenance inherently creates pathways for unauthorized access.
One particularly instructive example Weiss shares comes not from space, but from another autonomous domain—nuclear power plants. While working at the Electric Power Research Institute (EPRI) managing the Nuclear Instrumentation and Diagnostics Program, Weiss addressed a similar challenge with remote sensor calibration.
"When measuring temperature—or pressure, level, flow, or any physical parameter—the sensing components gradually drift over time. Eventually, you must recalibrate these instruments to restore their accuracy and reliability." This recalibration is expensive and hazardous in nuclear environments, creating a need for self-calibrating sensors. “We were developing what we called a self-calibrating resistance temperature detector (RTD), eliminating the need for human intervention in the calibration process."
RTDs are precision temperature sensors that exploit the predictable increase in the electrical resistance of the sensing element as temperature rises. However, these sensors require periodic recalibration to maintain accuracy, which is particularly challenging in radioactive and other remote environments.
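To make the physics and the drift problem concrete, here is a brief worked sketch using the standard Callendar-Van Dusen relation for a PT100 element above 0 C, R(T) = R0(1 + AT + BT^2), with the IEC 60751 coefficients. The drift figure is invented for illustration: a half-ohm uncorrected shift in the element's base resistance silently skews every reading by roughly two degrees, which is precisely the error that periodic recalibration, or a self-calibrating design, removes.

```python
# Worked sketch: PT100 resistance-to-temperature conversion via the
# Callendar-Van Dusen equation, R(T) = R0*(1 + A*T + B*T^2), valid T >= 0 C.
# Coefficients are the IEC 60751 standard values; the 0.5-ohm drift below
# is an invented illustration of uncompensated sensor aging.

import math

R0 = 100.0        # element resistance in ohms at 0 C (PT100)
A = 3.9083e-3     # 1/C
B = -5.775e-7     # 1/C^2

def resistance_to_temp(r: float, r0: float = R0) -> float:
    """Invert R = r0*(1 + A*T + B*T^2) for T via the quadratic formula (T >= 0 C)."""
    return (-A + math.sqrt(A * A - 4 * B * (1 - r / r0))) / (2 * B)

r_measured = 138.51   # ohms; corresponds to roughly 100 C on a healthy sensor

print(f"calibrated element:       {resistance_to_temp(r_measured):6.2f} C")               # ~100.0
print(f"element drifted +0.5 ohm: {resistance_to_temp(r_measured, r0=R0 + 0.5):6.2f} C")  # ~98.2
```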
Weiss provides more detail on this project: "I collaborated with Oak Ridge National Laboratory on this project in the 1990s. Though we successfully developed the technology, the nuclear industry ultimately did not implement it." The solution they developed, while not adopted by the nuclear industry, found a perfect application in satellite systems. "Guess who adopted this technology? NASA, for their satellite programs. In orbital platforms, you simply can't send technicians to perform recalibration."
However, this innovation came with an unaddressed vulnerability. "We never considered the cybersecurity implications of this technology. This was the early 1990s—cybersecurity wasn't yet on anyone's radar for these systems." This pattern—engineering innovation without security considerations—has repeated throughout the history of control systems development. For autonomous space systems today, the security approach must integrate the engineering and data security perspectives as well as physical security considerations from the beginning.
Weiss emphasizes that the terminology we use often blinds us to what should be obvious connections. "Space systems have always been autonomous by necessity. Consider submarine cables at the bottom of the ocean—they're autonomous because they must be. Humans can't survive in those environments. Yet we rarely frame these systems as 'autonomous' technologies."
This artificial distinction between traditional space systems and new "autonomous" technologies creates blind spots in security thinking. "The terminology we use fundamentally shapes how we approach problems and the solutions we develop."
For securing today's autonomous space systems, Weiss would likely recommend applying the same principles he advocates for all critical infrastructure: integrating security from the beginning, prioritizing engineers' expertise over network specialists', and recognizing that both data and physical systems need protection.
Most critically, security approaches must account for the reality that these systems cannot be physically accessed once deployed, making resilience, fault tolerance, and secure remote management essential from the start.
Space operations depend heavily on terrestrial control systems that often use legacy industrial technologies. What specific cyber vulnerabilities in these ground systems create the greatest risks to space missions, and how do these differ from vulnerabilities in other critical infrastructure you've assessed?
The ground-based infrastructure supporting space operations represents a critical vulnerability that Weiss believes receives far too little attention. While people focus on satellites and rockets in space, the terrestrial control systems that communicate with and control these space assets often rely on decades-old industrial technologies with minimal security protection.
"All of these ground-based stations—which transmit communications to space assets and perform tracking functions—rely on electro-mechanical components. These systems contain numerous sensors and actuators, which haven’t incorporated cybersecurity protections," Weiss explains.
Table 2: Space Ground Systems - Vulnerable Infrastructure
He draws a parallel to another critical but often overlooked infrastructure: "A colleague attended a conference in Hawaii focused on cybersecurity of undersea cables—an infrastructure at the opposite extreme from space systems. Interestingly, both rely heavily on land-based support systems. These terrestrial facilities generate substantial heat, requiring sophisticated cooling systems with temperature sensors, flow meters, and other instrumentation. Like space systems, these critical components for subsea cables did not contain inherent security protections, nor was that gap part of the conference discussions."
Even more concerning, this vulnerability was likely not addressed in their design specifications. "I haven't reviewed the specifications, but I strongly suspect there were no cybersecurity requirements incorporated into the design of these facilities or their equipment," he says. "There were likely cybersecurity requirements for the data itself—at least I hope so—but almost certainly none for the physical hardware components essential for maintaining the capability to transmit and receive that data."
These ground-based vulnerabilities create attack surfaces that could compromise entire space missions without ever touching the space-based components. Everything from cooling systems to power supplies to communication equipment relies on sensors and actuators that were designed for reliability, not security.
When asked about how an adversary like China might exploit these vulnerabilities, particularly given concerns about Chinese-manufactured power transformers in U.S. infrastructure, Weiss warns that the most dangerous attacks wouldn't simply shut systems down. "If adversaries merely wanted to switch systems off, they wouldn't be using their current tactics," he notes. "We have established protocols for handling outages and system failures. What we lack are procedures for responding when systems appear functional but are actually compromised—when they start 'misoperating.'"
This distinction between complete failure and malicious misoperation highlights a fundamental difference in how engineers and security professionals think about risks. While redundancy can address system failures, it may not protect against systems that appear to be functioning correctly but are actually compromised. "Our systems incorporate redundancy. When a tracking station fails, backup facilities take over," Weiss explains. "However, we haven't designed protections against tracking stations that appear operational but are actually transmitting corrupted data or executing malicious commands."
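Weiss doesn't prescribe a specific fix here, but the shape of the gap can be sketched. A liveness check catches a station that stops answering; it says nothing about a station that answers with corrupted data. One illustrative mitigation, not attributed to Weiss, is to cross-check each source against its redundant peers, for example by median voting with a deviation threshold. The station names and the threshold below are hypothetical.

```python
# Illustrative sketch (not Weiss's method): distinguishing a dead station,
# which failover already handles, from a live station reporting implausible
# data, which failover does not. Each reading is cross-checked against the
# median of its redundant peers. Station names and the threshold are hypothetical.

from statistics import median

DEVIATION_LIMIT = 0.5   # max tolerated disagreement, in the reading's units

def classify(readings: dict[str, float | None]) -> dict[str, str]:
    """Label each station FAILED (no data), SUSPECT (disagrees), or OK."""
    live = [v for v in readings.values() if v is not None]
    consensus = median(live)
    labels = {}
    for station, value in readings.items():
        if value is None:
            labels[station] = "FAILED"    # classic outage: redundancy covers this
        elif abs(value - consensus) > DEVIATION_LIMIT:
            labels[station] = "SUSPECT"   # alive and responsive, but misoperating
        else:
            labels[station] = "OK"
    return labels

# Station C answers every poll yet reports a skewed track angle.
print(classify({"station_a": 121.4, "station_b": 121.5, "station_c": 133.9}))
# -> {'station_a': 'OK', 'station_b': 'OK', 'station_c': 'SUSPECT'}
```

Such cross-checks help only while a majority of peers remain trustworthy, which is why Weiss's larger argument, pushing security down into the field devices themselves, still stands.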
Weiss shares a remarkable historical example that illustrates this vulnerability in space systems. "I was one of the two keynotes at a conference in Cleveland. It was in 2019. It turns out the other keynote speaker was Harrison Schmitt." Schmitt, an Apollo 17 astronaut, geologist, and the last person to step onto the lunar surface, shared a story that Weiss recognizes as the first identified control system cyber incident.
"In his keynote, Schmidt described being in the capsule preparing for launch. Minutes before scheduled takeoff, mission control suddenly lost visibility into the rocket's fueling status. They faced an immediate critical decision: scrub the mission, abort the launch, or proceed despite the information gap."
The rocket was, in fact, still fueling—evident from the steam visible to those outside—but this information wasn't reaching the computer room. "That was the first control system cyber incident, and this was in 1972," Weiss declares. While not malicious, this computer malfunction demonstrates how control system failures or compromises can create life-or-death situations in space operations.
The similarity between terrestrial control systems across different sectors creates additional vulnerabilities. As Weiss notes, "Food uses the same sensors and actuators and drives from the same companies" as other critical infrastructure. "There are only so many ways to measure and manipulate physical parameters. The same principles apply regardless of industry or application." This standardization creates systemic vulnerabilities across sectors. If an adversary discovers a vulnerability in process sensors from a particular manufacturer, that vulnerability likely exists across multiple critical infrastructure sectors, including space systems.
What makes the space sector particularly vulnerable is its isolation from mainstream cybersecurity conversations. At major security conferences like RSA, which drew 46,000 attendees, Weiss estimates "there weren't 25 engineers" among them. Similarly, at an American Petroleum Institute cybersecurity conference with 800-1,000 attendees, Weiss didn't think there were "25 engineers that attended. They were all network people." In another case, at the HOU.SEC.CON conference in Houston in September 2024, Weiss notes, "I participated in an OT cybersecurity session. There were approximately 90 attendees, and only one was identified as being an engineer."
Weiss offers a recent example illustrating how cybersecurity discussions often miss the most dangerous threats: "They had the head of information sharing from the Food and Agriculture ISAC (Information Sharing and Analysis Center) speaking on Thursday, and all he did was talk about ransomware—what a big problem that is, like when JBS Meats was hit by ransomware and shut down their operations in an abundance of caution."
"Ransomware is strictly an IT issue," Weiss continues. "Yet consider this incident from August 2023: a disgruntled employee of a sanitizing company compromised a poultry producer's production systems. He altered the chemical formulations used to sanitize the processing lines where chickens were prepared for consumer packaging. He manipulated the chemical constituents, disabled the safety alarms, and redirected notification emails to prevent detection. This was a genuine threat to our food safety. The FBI only revealed this incident when filing the indictment on April 16, 2025—keeping it confidential for a year and a half during their investigation." This cyberattack was not by a nation-state but could put the US food pipeline at risk. Yet, this was not discussed at the RSA Conference Weiss notes with evident frustration. "The Food and Agriculture ISAC focused entirely on ransomware—which poses financial risks but doesn't directly threaten lives. Meanwhile, they ignored a documented attack that could have resulted in widespread foodborne illness or deaths."
Weiss highlights another blind spot: "There were discussions about Chinese 'typhoon' cyberattacks but no discussions of hardware backdoors in large Chinese-made electric power transformers that bypassed OT networks or Chinese-made inverters communicating back to China." This concern is directly relevant to the earlier question about Chinese exploitation of infrastructure vulnerabilities, as well as to Chinese-made equipment such as inverters used in US space systems.
This disconnect between engineering realities and security practices creates blind spots that adversaries can exploit, particularly in complex systems like space infrastructure where both engineering and security expertise are essential. Weiss's presentation at the November 2024 API conference, titled "We have no cyber security in our sensors," was "new to most everybody there" despite describing a fundamental vulnerability, because the attendees were almost all network security specialists. The same situation occurred at the April 2025 RSA Conference, where there were no discussions of process sensor cybersecurity.
In a recent blog post following the RSA Conference, Weiss elaborated on this problem: "Control system field devices have no cyber security, authentication, or cyber forensics. These devices were orphaned from cybersecurity programs as OT networks became the focus of cybersecurity programs and the RSA Cybersecurity Conference. At RSA, there were numerous discussions about network cybersecurity threats from Russia, China, and Iran, as well as on the latest APTs. There were panel sessions and presentations on OT and critical infrastructures, but no discussions about control system field devices."
This ongoing blind spot in cybersecurity thinking means that many vulnerabilities in terrestrial control systems supporting space operations likely remain unaddressed and even unacknowledged. Weiss notes, "The vast majority [of incidents] were not identified as being cyber-related, but 'glitches' as they were not IP network-related incidents."
Even more concerning is that adversaries are well aware of this gap. As Weiss puts it, "Russia, China, and Iran have identified and are actively exploiting the cybersecurity gaps between OT networks and control system field devices. They're targeting these vulnerabilities precisely because they know cybersecurity defenders aren't monitoring them."
As commercial space operations continue to expand, with increasing dependencies on ground-based infrastructure, this gap between engineering and security perspectives represents a growing risk that could impact everything from satellite communications to deep space exploration.
Author's Analysis: The Unheeded Warnings of Control System Security
Throughout my conversation with Joe Weiss, I was struck by a profound sense of déjà vu. Here was a Cassandra-like figure (fittingly, he was featured in Richard Clarke and RP Eddy's book "Warning—Finding Cassandras to Stop Catastrophes") who has spent decades meticulously documenting vulnerabilities that few seem willing to acknowledge.
What makes Weiss's warnings particularly credible is his database of over 17 million actual control system cyber incidents across all sectors globally—incidents that have collectively claimed more than 30,000 lives. Yet most of these weren't labeled as cybersecurity events. Instead, they were dismissed as "glitches" because they didn't involve IP networks—the sole focus of mainstream cybersecurity.
Table 3: Control System Cyber Incidents - Hidden in Plain Sight
The engineering-cybersecurity divide Weiss describes isn't merely an academic turf war. It represents a fundamental blind spot in our approach to protecting critical infrastructures. By focusing exclusively on data protection while ignoring the physical components that generate and act on that data, we've created a false sense of security—one that adversaries can and are exploiting.
What's particularly alarming is how this blind spot extends to space systems, which face all the vulnerabilities of terrestrial control systems plus the added challenges of remoteness and inaccessibility. As commercial space activities accelerate and more nations develop space capabilities, the risks Weiss identifies become increasingly urgent.
His anecdote about the one control system supplier that built security from the ground up, only to be acquired and effectively disappear, raises troubling questions about market incentives. If truly secure systems can't survive commercially, how can we expect security-by-design to prevail in highly competitive sectors like commercial space?
The parallels between physical infrastructure and space systems are striking. Just as a compromised sensor in a power plant could cause catastrophic damage, a compromised sensor on a satellite could lead to collision, mission failure, or worse. And just as our terrestrial infrastructure relies on decades-old control technologies designed without security in mind, our space infrastructure often incorporates these same vulnerable components.
Perhaps most concerning is Weiss's observation that adversaries like Russia, China, and Iran are well aware of these gaps and are exploiting them, knowing that cyber defenders are looking elsewhere. This asymmetry creates a dangerous vulnerability in both critical infrastructure and space systems.
As space becomes increasingly militarized and contested, with nations developing capabilities like the Golden Dome and engaging in close-proximity maneuvers, securing the engineering side of cyber becomes not just a technical challenge but a matter of national security. Weiss's warnings, backed by decades of experience and millions of documented incidents, deserve far more attention than they've received.
The rapid commercialization of space raises urgent questions: How do we integrate security into systems that were designed without it? Who should have the authority to set security standards for space infrastructure—engineers who understand the operational requirements or cybersecurity specialists who understand threats? Can legacy systems be secured without complete redesigns? And perhaps most importantly, who will bear the cost of implementing robust security measures in an increasingly competitive commercial space sector?
These questions have no easy answers, but as our dependence on space infrastructure grows, finding solutions becomes increasingly critical. For now, Weiss's decades of warnings remain largely unheeded as we continue our orbital expansion with systems built on technologies that were never designed to withstand targeted attacks.
The question isn't whether Weiss is right—his extensive documentation of incidents speaks for itself. The question is whether we'll heed his warnings before experiencing a catastrophic failure in our orbital infrastructure that could have been prevented. What will it take for the industry to bridge this dangerous gap between the engineering and cybersecurity worlds? Will it require a catastrophic incident in orbit, or can we learn from the terrestrial incidents that Weiss has so meticulously documented? The answer may determine the security and resilience of our future in space.
About Joe Weiss
Joseph Weiss is an industry expert on control systems and electronic security of control systems. Mr. Weiss spent more than 14 years at the Electric Power Research Institute (EPRI), the first 5 years managing the Nuclear Instrumentation and Diagnostics Program. He was responsible for developing many utility industry security primers and implementation guidelines. Mr. Weiss serves as a member of numerous organizations related to control system security. He is also an invited speaker at many industry and vendor user group security conferences, has chaired numerous panel sessions on control system security, and is often quoted throughout the industry.
He has published over 100 papers on instrumentation, controls, and diagnostics including chapters on cyber security for Electric Power Substations Engineering, Securing Water and Wastewater Systems, and Data Center Handbook. He coauthored Cyber Security Policy Guidebook and authored Protecting Industrial Control Systems from Electronic Threats. He has provided numerous presentations and meetings with US, Canadian, and other international government and industry organizations.
In February 2016, Mr. Weiss gave the keynote to the National Academy of Science, Engineering, and Medicine on control system cyber security. Mr. Weiss gave the keynote at the Texas A&M Instrumentation and Automation Symposium. He has also given lectures on control system cybersecurity to the University of California-Berkeley, Stanford, University of Illinois, Missouri Science and Technology, the Air Force Cyber College, the Naval Postgraduate School, and the US Military Academy among others.
Mr. Weiss has conducted SCADA, substation, nuclear and fossil plant control systems, and water control systems vulnerability and risk assessments. He has conducted short courses on control system cybersecurity with members of the Idaho National Laboratory and the Pacific Northwest National Laboratory. He has amassed a database of more than 17 million actual control system cyber incidents. He was a member of the Transportation Safety Board Committee on Cyber Security for Mass Transit. He was a subject matter expert to the International Atomic Energy Agency on nuclear plant control system cyber security.
Mr. Weiss has received numerous industry awards, including the EPRI President's Award (2002), and is an ISA Life Fellow, Managing Director of ISA Fossil Plant Standards, ISA Nuclear Plant Standards, Managing Director of ISA Industrial Automation and Control System Security (ISA99) for 12 years, a Ponemon Institute Fellow, and an IEEE Life Senior Member. He has been identified as a Smart Grid Pioneer by Smart Grid Today. He is a Voting Member of the TC65 TAG and a US Expert to TC65 WG10, Security for industrial process measurement and control – network and system security, and IEC TC45A Nuclear Plant Cyber Security. Mr. Weiss was featured in Richard Clarke and RP Eddy's book, Warning—Finding Cassandras to Stop Catastrophes. He was a Visiting Research Associate in the Computer Science Department at the University of Missouri Science and Technology. He has patents on instrumentation, control systems, and OT networks. He is a registered professional engineer in the State of California, a Certified Information Security Manager (CISM), Certified in Risk and Information Systems Control (CRISC), and a member of Control's Process Automation Hall of Fame.
For more information, visit his blog at www.controlglobal.com/unfettered