Picture this: an IT security manager receives an alert about unusual network activity on a control system. She escalates it to the network team, who isolates the affected node — a standard IT incident response move. What she doesn’t know is that the node she just cut off was a PLC managing pressure regulation in a chemical process. The operator on the plant floor is now flying blind.
No one made a bad decision. Everyone followed their training. The problem was organizational: the people with the authority to act didn’t have the context to act safely, and the people with the context didn’t have the authority.
This scenario plays out in real facilities around the world, and it points to one of the most underappreciated challenges in industrial cybersecurity. The technology problem is difficult. The people and organizational problem is harder. According to NIST Special Publication 800-82, securing Industrial Control Systems (ICS) requires something that neither IT departments nor control engineering teams can deliver alone: a genuine cross-functional security team where both cultures share knowledge, authority, and accountability.
This article is a practical guide to building that team — who needs to be on it, what each role brings, where the organizational friction points are, and how to structure the team for lasting effectiveness.
Why Neither IT Nor OT Can Do This Alone
The instinct in many organizations is to assign ICS security to whoever already owns the adjacent domain. Sometimes that means IT security takes it on because “it’s a network.” Sometimes it means control engineers handle it because “it’s their equipment.” Both approaches fail, and for the same reason: each group only sees half the problem.
IT security professionals bring deep knowledge of threat landscapes, vulnerability management, network architecture, security frameworks, and incident response. What they often lack is an understanding of how industrial processes work, why availability matters more than confidentiality in OT environments, why you can’t just reboot a PLC the way you’d reboot a server, and why a network scan that’s harmless on a corporate LAN can crash a control system.
Control engineers and operators understand the process intimately — the timing requirements, the safety interlocks, the consequences of a valve opening at the wrong moment. What they often lack is awareness of how their systems look to an attacker, experience with cybersecurity concepts and tools, and familiarity with the evolving threat landscape targeting industrial systems.
NIST 800-82 is explicit on this point: assuming that the differences between IT and OT are not significant "can have disastrous consequences on system operations." The reverse is equally true — assuming that ICS security is just an engineering problem, with no need for IT security expertise, leaves organizations exposed to threats that control engineers have no training to recognize or counter.
The cross-functional team is not a compromise or a committee. It is the minimum viable structure for addressing a problem that genuinely spans two distinct professional domains.
Who Belongs on the Team
NIST 800-82 defines a minimum composition for the ICS cybersecurity team. Each role is there for a reason, and the absence of any one of them creates a blind spot.
IT Security Representative
The IT security professional brings the cybersecurity knowledge base that most control engineers don’t have: understanding of attack vectors, experience with firewalls and intrusion detection systems, familiarity with vulnerability assessment methodologies, and knowledge of security frameworks like NIST SP 800-53. They also bring institutional knowledge about what the corporate network looks like — critical for understanding where ICS and IT environments intersect and where traffic crosses boundaries.
Their challenge on this team is learning to apply security principles within OT constraints. Solutions that work perfectly in an IT context — aggressive patch cycles, endpoint detection agents, network scanning — may need to be adapted, tested extensively, or ruled out entirely in an ICS environment.
Control Engineer
The control engineer is the person who understands how the physical process works and what the control system is actually doing. They can tell you why a particular PLC cannot be taken offline for maintenance, what the consequences of a delayed sensor reading are, and whether a proposed security change will interfere with a critical timing sequence. Without this knowledge, security decisions made in the abstract can cause operational failures or safety incidents.
Control engineers are also often the people who have been managing ICS the longest and understand the legacy technology landscape — including equipment that is 15 or 20 years old, runs unsupported operating systems, and was never designed with network security in mind.
Control System Operator
The operator is the human closest to the process during normal operations. They are typically the first to notice that something is behaving unexpectedly — an alarm that doesn’t look right, a reading that seems off, a response that is slower than usual. This situational awareness is invaluable for early detection of security incidents that technical monitoring tools might miss.
Operators also need to be involved in security planning because they are directly affected by any security measure that changes how they interact with the system. Access controls, authentication requirements, and network changes all affect operator workflow. Security measures that create friction for operators in high-pressure situations are likely to be bypassed — which is a security failure, not a human one.
Network and Systems Security Expert
Beyond the general IT security representative, organizations with complex ICS environments benefit from having a specialist in network architecture and secure infrastructure design. This person focuses on the technical implementation of security controls: network segmentation, DMZ architecture, firewall rule sets, monitoring and logging infrastructure, and secure remote access design. They bridge the gap between security policy and technical reality.
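One way this specialist bridges policy and technical reality is to encode the agreed segmentation design as data, so that any proposed firewall rule can be checked against it before deployment. The sketch below is a minimal, hypothetical illustration — the zone names, ports, and permitted flows are invented examples, not a prescribed architecture:

```python
# Minimal sketch: a zone-based segmentation policy expressed as data, with a
# default-deny check for proposed firewall rules. All zones/ports are examples.

ALLOWED_FLOWS = {
    # (source zone, destination zone): set of permitted destination ports
    ("corporate", "dmz"): {443},   # corporate users read the DMZ historian over HTTPS
    ("ics", "dmz"): {1433},        # control network pushes process data to the historian
    # no ("corporate", "ics") entry -- direct corporate-to-ICS traffic is denied
}

def rule_is_allowed(src_zone: str, dst_zone: str, dst_port: int) -> bool:
    """Default-deny: a flow is permitted only if it is explicitly listed."""
    return dst_port in ALLOWED_FLOWS.get((src_zone, dst_zone), set())

print(rule_is_allowed("corporate", "dmz", 443))  # True: permitted historian access
print(rule_is_allowed("corporate", "ics", 502))  # False: no direct path to the PLC network
```

The design choice worth noting is the default-deny posture: anything not explicitly modeled is rejected, which mirrors how ICS/corporate boundary firewalls are typically configured.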
Enterprise Risk Management Representative
ICS security decisions are risk decisions, and they need to be made with visibility into the organization’s broader risk posture. The risk management representative ensures that ICS security priorities are calibrated against overall organizational risk tolerance, that costs and benefits are weighed appropriately, and that ICS security investments are communicated in terms that senior leadership can act on. Without this link to enterprise risk management, ICS security programs often struggle to secure sustained funding and executive sponsorship.
Management Representative
Someone with budget authority and organizational influence needs to be on the team — not just informed of its decisions, but actively participating. Management’s role is to remove obstacles, allocate resources, adjudicate conflicts between operational priorities and security requirements, and ensure that accountability for ICS security is clearly assigned. NIST 800-82 places ultimate responsibility for ICS cybersecurity with the CIO or CSO, who accepts “complete responsibility and accountability for the cybersecurity of the ICS, and for any safety incidents, reliability incidents, or equipment damage caused directly or indirectly by cyber incidents.”
That accountability has to be real and named. Organizations where ICS security is everyone’s responsibility tend to find that it becomes no one’s priority.
Physical Security Representative
Cyber and physical security are more intertwined in ICS environments than in typical IT settings. Physical access to a PLC, a network switch in a control cabinet, or an operator workstation can enable attacks that no software control can prevent. The physical security representative ensures that access controls, surveillance, and facility security measures are designed with ICS assets in mind — and that the team considers physical attack vectors alongside digital ones.
Control System Vendor and/or System Integrator
This is often overlooked, but NIST 800-82 recommends including the vendor for continuity and completeness. The vendor knows the architecture of their system better than anyone, is the authoritative source on what patches and security configurations are supported, and can advise on what third-party security tools are compatible (or incompatible) with the system. In many ICS environments, installing security software without vendor approval voids the support contract — making the vendor relationship not just useful but necessary.
Safety Expert
Contemporary thinking in ICS security increasingly recognizes that safety and security are deeply connected — both are emergent properties of connected systems with digital control. A safety expert helps the team understand how security measures interact with safety instrumented systems (SIS), identify security scenarios that could compromise safety functions, and ensure that the security program supports rather than undermines the organization’s safety management system.
The Cultural Challenge: IT and OT Don’t Naturally Speak the Same Language
Assembling the right people is necessary but not sufficient. The harder challenge is getting IT and OT cultures to work together effectively, because they approach problems from fundamentally different starting points.
IT security culture tends to assume that systems can be updated, patched, and reconfigured relatively quickly. Risk tolerance is calibrated with data confidentiality and integrity typically weighted above availability — a brief outage is usually an inconvenience, not a hazard. Incident response often means taking a system offline to investigate. Security is a core competency of the IT function.

OT culture tends to prioritize process continuity and safety above all else. Changes to running systems are made cautiously and only after extensive testing, because the cost of a failure is not just lost productivity — it can be equipment damage, environmental release, or worker injury. Security has historically been a secondary concern, because control systems were assumed to be air-gapped and the threat model was different.
These are not just different preferences. They reflect genuinely different operating realities. The integration of these two cultures — which NIST 800-82 identifies as “essential for the development of a collaborative security design and operation” — requires deliberate effort. A few practices that help:
Joint training. IT security staff benefit from process familiarization — actually spending time on the plant floor, understanding what operators do, and seeing what happens when a system behaves unexpectedly. Control engineers benefit from cybersecurity fundamentals training that covers threat landscapes, attack methodologies, and security concepts. Neither group should be expected to become experts in the other’s domain, but shared vocabulary and mutual respect are essential.
Shared incident response exercises. Tabletop exercises that simulate ICS security incidents — with IT, OT, and management all in the room — surface the assumptions and communication gaps that lead to the scenario described in this article’s opening. These exercises are most valuable when they are realistic and when they highlight failures as learning opportunities rather than assigning blame.
Translating security concepts into OT terms. Telling a control engineer that a system has a “high CVSS score vulnerability” may not register. Explaining that the same vulnerability could allow an attacker to modify control logic and cause a pressure spike in a specific vessel is actionable. Security communication within the cross-functional team needs to be grounded in the physical consequences that OT professionals are trained to think about.
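This translation can even be partially systematized by keeping asset context alongside vulnerability findings. The sketch below is purely illustrative — the asset record, field names, and consequence text are hypothetical assumptions, and "EXAMPLE-CVE" is a placeholder, not a real identifier:

```python
# Hedged sketch: pairing a vulnerability finding with asset context so it can
# be reported in operational terms. All records and wording are illustrative.

ASSETS = {
    "plc-reactor-3": {
        "process_role": "pressure regulation, reactor vessel 3",
        "consequence": "an attacker could modify control logic and cause a pressure excursion",
    },
}

def ot_translation(asset_id: str, cvss: float, vuln_id: str) -> str:
    """Turn 'high CVSS score' into language grounded in physical consequences."""
    asset = ASSETS[asset_id]
    urgency = "urgent" if cvss >= 7.0 else "routine"
    return (f"{vuln_id} ({urgency}, CVSS {cvss}) affects {asset_id}, which handles "
            f"{asset['process_role']}: {asset['consequence']}.")

print(ot_translation("plc-reactor-3", 9.8, "EXAMPLE-CVE"))
```

The point is not the code but the data: a finding only becomes actionable for OT staff once it is joined to what the affected asset does in the physical process.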
Establishing the Team’s Charter and Scope
A cross-functional team without a clear mandate tends to drift. NIST 800-82 recommends that the information security manager establish a formal charter that defines the team’s objectives, the systems and networks within scope, the division of responsibilities, the budget and resources available, and the reporting structure.
Some key decisions the charter needs to address:
What is in scope? The ICS security program should cover all industrial control systems — SCADA, DCS, PLCs, HMIs, historian servers, and the network infrastructure connecting them — as well as the boundary between ICS and corporate networks. It should explicitly include legacy systems, even those that cannot be updated, because understanding their vulnerabilities is essential for compensating control design.
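Keeping legacy systems in scope is easier when the inventory records, for each asset, whether it can be patched and which compensating controls cover it if not. A minimal sketch, with invented field names and entries:

```python
# Minimal sketch of an in-scope asset inventory. Recording unpatchable legacy
# systems together with their compensating controls keeps them visible to the
# team. Fields and entries are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class IcsAsset:
    name: str
    asset_type: str                 # e.g. "PLC", "HMI", "historian"
    zone: str                       # network zone from the segmentation design
    patchable: bool
    compensating_controls: list = field(default_factory=list)

inventory = [
    IcsAsset("plc-line-1", "PLC", "ics", patchable=False,
             compensating_controls=["network segmentation", "physical access control"]),
    IcsAsset("historian-1", "historian", "dmz", patchable=True),
]

# Flag any unpatchable asset with no compensating control at all.
unmitigated = [a.name for a in inventory
               if not a.patchable and not a.compensating_controls]
print(unmitigated)  # [] -- every unpatchable asset here has at least one control
```

A query like `unmitigated` is exactly the kind of blind-spot check the charter's scope statement is meant to enable: nothing in scope should be both unpatchable and uncompensated.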
How does this team relate to existing IT security functions? In most organizations, there is already an IT security program. The ICS security team should identify which existing practices can be leveraged and where ICS-specific approaches are required. Duplicating effort is wasteful; ignoring relevant expertise is dangerous. The goal is integration, not isolation.
Who has decision-making authority? Security decisions that affect operational systems require clear authority. The charter should specify who can authorize changes to ICS networks, who can approve security exceptions, and who has the authority to shut down a system in response to a security incident (and under what conditions).
How are conflicts between security and operations resolved? These conflicts will arise. A patch needs to be applied, but there is no maintenance window for six weeks. A security scan would help identify vulnerabilities, but running it on production equipment is risky. The charter should establish a process for escalating and resolving these conflicts before they become crises.
Developing ICS-Specific Security Policies
Generic IT security policies do not transfer cleanly to ICS environments. The cross-functional team’s early deliverables should include security policies tailored specifically to the control system environment. These policies need to be grounded in the operational reality of the ICS — realistic about what can be implemented, specific enough to be actionable, and integrated with existing operational procedures.
NIST 800-82 identifies the absence of ICS-specific security policy as one of the most significant vulnerabilities in industrial control system environments. Policies that exist on paper but conflict with operational realities tend to be ignored. Policies developed with control engineers and operators in the room are more likely to be workable — and therefore followed.
Key policy areas that require ICS-specific treatment include: change management and patch management procedures; remote access controls; portable media and removable device policies; incident response procedures for ICS-specific scenarios; vendor and third-party access controls; and physical security requirements for control system components in remote or distributed locations.
Reporting Structure and Accountability
For the cross-functional team to have lasting impact, it needs to be embedded in the organization’s governance structure, not exist as an ad hoc working group. NIST 800-82 recommends that the ICS security team report to the information security manager, who in turn reports to the CIO or CSO.
What matters most is that accountability for ICS security is explicitly assigned at the executive level — and that this accountability is taken seriously. ICS security programs that are funded, staffed, and visible to senior leadership survive organizational changes and budget cycles. Those that live entirely within the control engineering function or the IT security function tend to be deprioritized when either domain faces competing demands.
Conclusion: The Team Is the Strategy
It is tempting to think of ICS security primarily as a technical challenge — a matter of firewalls, segmentation, patching, and monitoring. These technical controls matter enormously. But the foundation of an effective ICS security program is organizational, not technical.
The cross-functional team is not a project deliverable or a compliance checkbox. It is the mechanism through which the organization develops the shared knowledge, shared language, and shared accountability needed to make good security decisions about systems where the consequences of getting it wrong extend beyond lost data into the physical world.
Building this team is difficult. The cultures are different, the domains are complex, and the organizational dynamics are often challenging. But as NIST 800-82 makes clear, no single discipline — not IT security, not control engineering, not physical security — can address the full scope of ICS risk on its own. The cross-functional team is not one approach among several. It is the approach.
