Banned for a fake football match, you couldn't make it up (but AI did)
We’ve all heard the story by now of why Maccabi Tel Aviv fans were banned from attending a Europa League match in Birmingham in November 2025, under the justification of ‘public order concerns’.
At the time, it seemed a fair reason to ban fans from a football match, even if many saw it as an extreme decision. Who wants civil unrest if it can be avoided, after all? The ban was also based on police information and historical data that specifically referenced a previous fixture between Maccabi Tel Aviv and West Ham United - so the police knew what they were doing, we thought.
There was one problem though.
The game never happened.
The match had been invented entirely by Microsoft’s AI tool, Copilot, which hallucinated the game.
Somehow though, that fictional game made its way into a real police report. From there, it helped justify a ban that sparked political fallout, public outrage and a formal apology from the Chief Constable.
The Mistake Everyone’s Talking About For the Wrong Reason
Much of the media coverage has focused on what happened: a fabricated match, a botched report and an apology in Parliament. First the police blamed a Google search, then they admitted it came from an AI tool.
But let’s talk about what it actually means. This isn’t just an AI failure, this is human failure on a grand scale.
It’s a failure to:
· Understand how AI tools work
· Identify hallucinations - a basic, well-documented flaw in large language models
· Review, verify and validate evidence
· Take responsibility
The fact that an AI tool was used at all isn’t the scandal.
The scandal is that nobody knew how to use it properly and that nobody checked the output before using it to impact real lives, despite there being no reference to the game in official police databases. All of this is even more surprising given the sensitivities and the obvious fallout that would come from such a ban.
It also highlights something bigger. We know about this situation because of the fallout, but where else are mistakes like this being made that aren’t becoming national headlines?
There cannot be a clearer example of the need for AI Training within organisations.
What Are Hallucinations and Why Do They Matter?
AI hallucinations happen when a model generates content that’s plausible but false. It’s not a bug but a known limitation of how large language models predict language patterns.
Hallucinations aren’t rare, they’re not new, and any team using AI in a decision-making context should know about them before they ever touch the tools.
If your organisation is using AI without understanding:
· How it generates responses
· What the risks are
· Where verification is essential
Then you’re not innovating, you’re playing with fire.
Why AI Training Is Now a Governance Issue
This isn’t about catching up with tech trends. This is about protecting your organisation, your credibility and your people.
What happened in the West Midlands was avoidable.
If just one person had been trained to spot an AI hallucination, that ban would never have happened.
But here’s the bigger issue. If a public police force can fall for an AI hallucination, so can your team.
The only solution is AI training for everyone involved, not just your IT lead and not just your comms team. Everyone from the C-suite to front-line staff needs to understand:
· What AI is good at
· What it’s terrible at
· How to fact-check, cross-reference and act responsibly
Those are just the basics that need to be addressed in the new AI age. If you’re not covering them, you’re creating risk for your business, your team and yourself.
Don’t Let AI Become a Scapegoat or a Liability
AI is not the enemy here, blind faith in a tool people don’t understand is.
We wouldn’t let someone drive a vehicle without training, so why are we letting teams use powerful AI tools - tools that influence public policy, HR decisions, healthcare and more - with no training, no checks and no accountability?
At AI Expert, we train teams across the UK to use AI safely, strategically and responsibly. Not because it’s trendy but because it’s necessary, as the West Midlands Police example shows.
If you’re rolling out AI tools but haven’t rolled out training, you’re not transforming your business, you’re setting it up to fail.
If you want to avoid being the next headline like this, let’s talk about getting your team trained before it’s too late.

