
The Jaws Effect: How bad storytelling is feeding AI's fear machine

  • March 30, 2026
  • By Archie Cobb
  • 6-minute read

In 1975, Steven Spielberg spent most of his shoot without a working shark. The mechanical beast kept breaking down, so he filmed around it, leaving the threat just out of frame and letting the audience's imagination fill the gap. The result was one of cinema's most enduring fears. Fifty years on, we're doing something very similar with AI. 

That was the provocation at the heart of a panel at Tech Show London, where Archie Cobb, AI and Data Lead at Sullivan & Stanley, joined journalist Serena Haththotuwa, CTO Mark Rodseth and moderator Hillary Gray from Propeller Group to examine whether the stories being told about AI are doing more harm than good. 

The shark is already swimming 

The conversation opened with a reframe. "AI is no longer a tech problem," Archie said. "It's pervasive in nature. We need to make the human the centre of it." While most AI debate fixates on what the technology can do, his argument centred on what it demands from us, and what happens when fear crowds out that conversation entirely. 

We didn't ask for the shark in our water, but it's there. Being cautious is a reasonable response. Letting the story stop there is where the problem starts. 

"All the way back to the start of Homo sapiens, we've always liked being in control," he said. "We're scared of the fear coming out from storytelling, that we are going to lose control. But fundamentally, we have got control here, and we can continue to gain control." 

In practice, that control starts with something simple: knowing what you're trying to solve before you reach for the technology. The organisations Archie works with that are navigating AI well aren't the ones with the biggest budgets or the most advanced tooling. They're the ones where leadership has taken the time to understand the problem first, involved their people in the conversation, and resisted the urge to hand the whole thing to a vendor. "Control doesn't mean locking AI down," he said. "It means being deliberate about where it shows up, who's shaping it, and what success looks like before you start." 

Beyond STEM 

The assumption has been that STEM is the answer: push people towards technical competency and the problem resolves itself. Archie argues that this framing is already out of date. AI literacy is becoming table stakes. "You can become STEM literate within a month or two," he said, pointing to how quickly the barriers to technical entry are collapsing. When the technology itself is no longer the differentiator, something else has to fill that gap. 

"So in the next five years, as humans, what needs to be different?" His answer was a return to what makes us useful in ways that AI cannot replicate: critical thinking, curiosity, the ability to challenge a problem rather than simply process it. 

The behaviours holding organisations back are rarely technical. More often, it's teams accepting AI outputs at face value without interrogating them, or leadership framing the whole conversation as a skills gap that can be closed with a training course. "The biggest risk I see isn't that people can't use the tools," Archie said. "It's that they're not asking whether the tools are solving the right problem. You don't need more people who can prompt an LLM. You need people who can look at the output and say, 'That's not good enough, and here's why.'" 

Fear and curiosity aren't opposites 

Jaws didn't only create fear. A proportion of the audience came out of the cinema wanting to know more about sharks, not less. Archie sees the same dynamic playing out with AI: "If we tell the stories and shape the narrative around it, it can actually make us more curious and more in control of this technology." 

His view is that better storytelling starts with being more curious and less certain about what AI is, what it isn't and what it means for the people working alongside it. The instinct for most organisations is to manage fear. The more productive move is to convert it into curiosity, and that requires telling different stories, with more specificity, from closer to the ground. 

Trust is the defining currency 

Trust is the currency that either compounds or gets spent, depending on how you show up. That was Archie's framing, and it shaped his position on disclosure: "You need to be authentic with your customers." Using AI openly is not a vulnerability. Being found out later is the far costlier option. 

The analogy he used was banking: if an AI agent gives you the right output, most customers are satisfied with the transaction. But if a brand presents AI-generated content as human and audiences discover it later, the relationship breaks. "Once you lose that trust, they're not coming back." 

Mark Rodseth reinforced this from the angle of AI slop, a term shortlisted for word of the year in 2024. Beyond quality, the concern is what lazy, undifferentiated AI output signals about a brand. "People are developing an allergy to AI slop," he said, "and immediately lose trust in individuals who just throw slop over the fence and expect people to read it."

For Archie, the antidote is intent. "Before you use AI to create something or put it in front of a customer, you need to be clear on what you're trying to achieve and who it's for," he said. "The brands getting this right aren't using AI to produce more. They're using it to produce something better, faster, or more specific, and they're being upfront about the role AI played in getting there. The ones getting it wrong are treating AI as a shortcut to volume, and their audiences can feel it." 

The practitioner difference 

The distinction Archie drew between consultancy models was blunt. On one side: firms that arrive with an army of people and pre-loaded platforms. On the other: practitioners who come in with real experience, a willingness to say they don't have all the answers, and a model built around the client's reality. 

"If I stood here and said we're on top of this and driving the right agenda, that's not going to be helpful," he said. "We stay curious, we stay authentic... we can come in and give you a realistic approach of how you can actually start looking at this." 

In the room, that looks like starting with the client's actual problem, not a demo of what the platform can do. "We'll spend the first sessions just listening," Archie said. "Where are the bottlenecks? What's frustrating your teams? What have you already tried? Then we build from there. Sometimes that means deploying a tool; sometimes it means helping a leadership team get aligned before any technology gets involved. The point is that the work fits the organisation, not the other way around."

Getting specific is the only antidote 

All three panellists came back to the same conclusion: specificity cuts through. Not AI as a grand amorphous force, but a named tool, solving a defined problem, with results you can point to. 

"The generalisations are where the problem lies," Mark said. "People can't talk about specifics because this technology has emerged so quickly. But that's where the value is: drawing out the detail and telling the stories around that detail." 

Spielberg didn't fix the shark. He changed what he was filming. The organisations navigating AI well right now are not the ones who've solved the technology. They're the ones who've learned to work with uncertainty, stay specific and keep the human in the frame. 


Archie Cobb

AI & Data Lead