Return on AI Pulse
We Asked. Nobody Knows Who Owns AI Governance.
Earlier this year, we ran what we're calling the AI Governance Trust Gap Index, a survey of about 70 organizations ranging from 20-person startups to companies with 500-plus employees. Respondents came from security, legal, compliance, product, and executive leadership. Mostly US-based, with some UK and EU representation. The sample is early, and we're not pretending it's definitive. But the patterns are consistent enough to be worth talking about.
When we asked people to rate their confidence in how their organization governs AI, the average landed at 3.5 out of 5. That sounds like a passing grade until you look underneath it. About a third of respondents scored themselves at a 1 or a 2. So you've got a meaningful chunk of organizations that know, right now, that they're not governing AI well. And they're sitting next to organizations that scored themselves a 5. The spread in self-assessment across this group is wide enough to drive a truck through.
The most telling finding wasn't a number. It was a theme. When we asked who owns AI governance, the most common answer amounted to "nobody, specifically." No clear ownership. No RACI. No defined accountability. It's like a farmhouse where everybody assumes somebody else is locking the back door at night. So nobody checks, and the door stays open.
Shadow AI came up as a top challenge regardless of org size. Engineering teams are consistently outpacing governance. That's not a surprise to anyone who's been paying attention, but seeing it confirmed across 70 organizations, from tiny teams to large enterprises, makes it harder to dismiss as a growing-pains problem. This is structural.
Leadership knows something is off. The majority of respondents said their leadership had specifically requested clarity on AI governance. But when we asked whether organizations had comprehensively updated their governance controls for AI, only a fraction said yes. So the ask is there and the follow-through isn't. The number one requested resource was some version of a virtual compliance officer or team, along with clear mapping of AI usage to SOC 2 and ISO controls. People aren't asking for another framework document. They're asking for someone to help them actually do the work.
And the tools aren't filling the gap. Respondents using existing governance automation platforms said repeatedly that those tools don't satisfy all their concerns. The tooling was built for yesterday's problem. The frameworks haven't caught up to today's.
What this data tells us is pretty simple. There is no shared system for AI governance right now. There's no shared language. There's no commonly accepted ownership model. Organizations are aware of the gap but don't have a way to measure it, benchmark it, or close it systematically.
This is the first signal from what we're building into a recurring measurement, the ROAI Pulse. We'll keep asking. We'll keep publishing. Because the first step to fixing a problem that nobody owns is proving, with data, that nobody owns it.
Sources
- Striv AI Governance Trust Gap Index (February 2026)