Overview
This 50-minute Birds-of-a-Feather session explores a focused question: which practices and metrics from the DevOps “Accelerate” research (Forsgren, Humble, and Kim) actually translate to OT environments, and which ones don’t? Rather than a traditional presentation, this is a structured, facilitator-led collaborative session in which participants do the thinking together, using an “Adopt, Adapt, or Avoid” framework to evaluate DevOps principles against OT realities.
The session produces a tangible deliverable: an OT DevOps self-assessment guide, co-created by participants, that attendees can take back to their organizations.
Why This Topic Matters
The Accelerate research identified four key metrics (Deployment Frequency, Lead Time for Changes, Change Failure Rate, and Mean Time to Recovery) as strong predictors of software delivery performance. These DORA metrics and associated practices (CI/CD, automated testing, trunk-based development) are well validated in IT. However, OT environments present fundamentally different constraints: safety-critical operations where unplanned downtime can cost millions or risk lives, air-gapped and legacy systems incompatible with cloud-centric tooling, strict change management regimes, and long equipment lifecycles. The question is not whether OT should modernize, but how to do so on its own terms.
As one framing we will use in the session: in IT, the risk is often in not changing fast enough. In OT, change itself is the risk. The question becomes not just “is it secure?” but “is it safe to change?”
Session Flow (50 minutes)
Introduction (10 min): The facilitator opens with a brief primer on the four DORA metrics and why they predict high performance in IT. We then frame the OT context: why these principles are not one-size-fits-all, what makes OT different (safety, uptime, air-gapped networks, proprietary systems), and how to think about translating rather than copying. The “Adopt, Adapt, or Avoid” framework is introduced as the lens for discussion. Participants are oriented to the session goal: collaboratively building a self-assessment guide.
Small Group Discussion (25 min): Attendees break into groups of 4-6 with printed discussion prompts and worksheets. Each group works through guided questions including:
– Which of the four DORA metrics are most meaningful for OT, and how would you define and measure them? (What counts as a “deployment” in OT: a PLC code change, a firmware update, a configuration change?)
– Which DevOps practices can OT teams adopt directly (e.g., version control for configurations, automated backup validation)?
– Which need adaptation (e.g., continuous deployment becomes scheduled deployment with rigorous pre-checks)?
– Which should be avoided in OT (e.g., deploying multiple times per day to a running production process)?
– What survey questions or checklist items would indicate OT DevOps maturity at your organization?
Facilitators circulate between groups, pushing deeper thinking and ensuring groups generate proposed self-assessment questions.
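One "adopt directly" candidate named in the prompts, automated backup validation, can be as simple as checksumming each backup against a recorded baseline. A minimal sketch, assuming a hypothetical manifest format of one `<sha256>  <filename>` entry per line; the paths and format are invented for illustration:

```python
# Illustrative sketch of automated backup validation: verify each backed-up
# controller configuration against a SHA-256 recorded in a manifest file.
# The manifest format and file names are assumptions, not a real standard.
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream the file in chunks so large backups don't load into memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def validate_backups(backup_dir: Path, manifest: Path) -> list[str]:
    """Return a list of problems; an empty list means all backups verified."""
    problems = []
    for line in manifest.read_text().splitlines():
        expected, name = line.split(maxsplit=1)
        target = backup_dir / name
        if not target.exists():
            problems.append(f"missing backup: {name}")
        elif sha256_of(target) != expected:
            problems.append(f"checksum mismatch: {name}")
    return problems
```

The design point for discussion: this validates without touching the running process, which is why it sits on the "adopt" side of the framework, unlike practices that require deploying to live equipment.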
Group Debrief and Synthesis (10 min): The room reconvenes. Each group shares one practice they would adopt, one they would adapt, and one they would avoid, plus their top 2-3 self-assessment questions. A facilitator captures inputs on a visible board, clustering common themes. This produces a composite view of where the room sees alignment and divergence between IT DevOps principles and OT reality.
Wrap-Up and Deliverable (5 min): The facilitator summarizes key themes, introduces the self-assessment guide (a polished one-pager informed by the session’s findings), and distributes it to participants. The guide includes core metric evaluation questions, a capabilities and practices checklist, and Adopt/Adapt/Avoid annotations. Participants are encouraged to use it as a discussion tool with their own teams and leadership to identify where to invest in improvement.
What Makes This Session Different
Most OT security sessions are lecture-driven and threat-focused. This session is collaborative and improvement-focused. It meets OT practitioners where they are, respects the real constraints of industrial environments, and avoids the trap of assuming IT practices can be applied wholesale. The deliverable gives attendees something concrete to act on after the conference, not just ideas but a tool.
Facilitator Approach
The session will be led by two facilitators to allow one to lead framing and synthesis while the other supports group work. We will provide printed prompts, worksheets, and the final self-assessment handout. No projector is strictly required, though a brief slide deck or poster with the four metrics and the Adopt/Adapt/Avoid framework will be used to anchor discussion.