A routine shopping trip turned into a nightmare for Danielle Horan, a woman from Greater Manchester, after she was wrongly flagged by a facial recognition system and accused of shoplifting toilet paper worth just £10. Her experience has reignited concerns over the unchecked use of surveillance technology in everyday life — particularly in public retail settings.
A Case of Mistaken Identity
Danielle Horan was asked to leave two Home Bargains stores in May and June without any explanation. She later learned that her image had been placed on a facial recognition “watchlist” used by Facewatch, a retail security firm. The system had incorrectly identified her as someone involved in a shoplifting incident, resulting in her being banned from stores.
Despite her protests and attempts to explain, she was repeatedly denied entry and treated with suspicion. It was only after checking her bank statements and sending multiple emails to both Facewatch and Home Bargains that she proved she had paid for her items and had been wrongly accused.
Public Humiliation and Emotional Toll
The emotional impact of the accusation was severe. Horan described the embarrassment of being escorted out of a store while other customers looked on. “Everyone was looking at me,” she said. “I was like, ‘for what?'” She was especially concerned when this happened in front of her 81-year-old mother, whom she was accompanying on a trip.
Horan said she felt anxious, sick, and humiliated, stating that the situation “really played with my mind.” Her anxiety grew worse after repeated brush-offs from Facewatch and Home Bargains when she sought an explanation and redress.
Tech Firm Responds, Retailer Stays Silent
Facewatch later acknowledged that the information came from a store report and not an official police record. In an email seen by the BBC, they told Horan the branches that reported her had since been suspended from using the system. The company added that they understood how distressing the experience must have been and had initiated additional staff training.
Home Bargains, on the other hand, declined to comment publicly on the incident.
A Bigger Problem: Guilty Until Proven Innocent?
Danielle Horan is not alone. Civil liberties group Big Brother Watch reported that over 35 people have contacted them with similar stories — wrongly flagged by facial recognition systems and denied entry into stores. Advocacy officer Madeleine Stone said people were being “kicked out of stores with no due process.”
She added, “When an algorithm, a camera, and a facial recognition system get involved, you are guilty until proven innocent.”
The UK government has stated that while such technologies are legal, they must be used fairly, transparently, and proportionately. Yet this case has sparked fresh calls to ban facial recognition in retail altogether, with critics warning that technological convenience should never come at the cost of basic rights.

Danielle Horan’s ordeal has laid bare the dangers of relying on artificial intelligence for security decisions without human oversight. Her story is a reminder that in the rush to adopt smart surveillance, companies may be sacrificing fairness, transparency, and individual dignity.