
Facial Recognition Error Sees Woman Wrongly Accused of Theft (bbc.com)
A chain of stores called Home Bargains installed facial recognition software to spot returning shoplifters. Unfortunately, "Facewatch" made a mistake.
"We acknowledge and understand how distressing this experience must have been," an anonymous Facewatch spokesperson tells the BBC, adding that the store using their technology "has since undertaken additional staff training."
A woman was accused by a store manager of stealing about £10 (about $13) worth of items ("Everyone was looking at me"). And then it happened again at another store when she was shopping with her 81-year-old mother on June 4th: "As soon as I stepped my foot over the threshold of the door, they were radioing each other and they all surrounded me and were like 'you need to leave the store'," she said. "My heart sunk and I was anxious and bothered for my mum as well because she was stressed...."
It was only after repeated emails to both Facewatch and Home Bargains that she eventually found there had been an allegation of theft of about £10 worth of toilet rolls on 8 May. Her picture had somehow been circulated to local stores alerting them that they should not allow her entry. Ms Horan said she checked her bank account to confirm she had indeed paid for the items before Facewatch eventually responded to say a review of the incident showed she had not stolen anything. "Because I was persistent I finally got somewhere but it wasn't easy, it was really stressful," she said. "My anxiety was really bad — it really played with my mind, questioning what I've done for days. I felt anxious and sick. My stomach was turning for a week."
In one email from Facewatch seen by the BBC, the firm told Ms Horan it "relies on information submitted by stores" and the Home Bargains branches involved had since been "suspended from using the Facewatch system". Madeleine Stone, senior advocacy officer at the civil liberties campaign group Big Brother Watch, said they had been contacted by more than 35 people who have complained of being wrongly placed on facial recognition watchlists.
"They're being wrongly flagged as criminals," Ms Stone said.
"They've given no due process, kicked out of stores," adds the senior advocacy officer. "This is having a really serious impact." The group is now calling for the technology to be banned. "Historically in Britain, we have a history that you are innocent until proven guilty but when an algorithm, a camera and a facial recognition system gets involved, you are guilty. The Department for Science, Innovation and Technology said: "While commercial facial recognition technology is legal in the UK, its use must comply with strict data protection laws. Organisations must process biometric data fairly, lawfully and transparently, ensuring usage is necessary and proportionate.
"No one should find themselves in this situation."
Thanks to alanw (Slashdot reader #1,822) for sharing the article.
"We acknowledge and understand how distressing this experience must have been," an anonymous Facewatch spokesperson tells the BBC, adding that the store using their technology "has since undertaken additional staff training."
A woman was accused by a store manager of stealing about £10 (about $13) worth of items ("Everyone was looking at me"). And then it happened again at another store when she was shopping with her 81-year-old mother on June 4th: "As soon as I stepped my foot over the threshold of the door, they were radioing each other and they all surrounded me and were like 'you need to leave the store'," she said. "My heart sunk and I was anxious and bothered for my mum as well because she was stressed...."
It was only after repeated emails to both Facewatch and Home Bargains that she eventually found there had been an allegation of theft of about £10 worth of toilet rolls on 8 May. Her picture had somehow been circulated to local stores alerting them that they should not allow her entry. Ms. Horan said she checked her bank account to confirm she had indeed paid for the items before Facewatch eventually responded to say a review of the incident showed she had not stolen anything. "Because I was persistent I finally got somewhere but it wasn't easy, it was really stressful," she said. "My anxiety was really bad — it really played with my mind, questioning what I've done for days. I felt anxious and sick. My stomach was turning for a week."
In one email from Facewatch seen by the BBC, the firm told Ms Horan it "relies on information submitted by stores" and the Home Bargains branches involved had since been "suspended from using the Facewatch system". Madeleine Stone, senior advocacy officer at the civil liberties campaign group Big Brother Watch, said they had been contacted by more than 35 people who have complained of being wrongly placed on facial recognition watchlists.
"They're being wrongly flagged as criminals," Ms Stone said.
"They've given no due process, kicked out of stores," adds the senior advocacy officer. "This is having a really serious impact." The group is now calling for the technology to be banned. "Historically in Britain, we have a history that you are innocent until proven guilty but when an algorithm, a camera and a facial recognition system gets involved, you are guilty. The Department for Science, Innovation and Technology said: "While commercial facial recognition technology is legal in the UK, its use must comply with strict data protection laws. Organisations must process biometric data fairly, lawfully and transparently, ensuring usage is necessary and proportionate.
"No one should find themselves in this situation."
Thanks to alanw (Slashdot reader #1,822) for sharing the article.
Sounds like a good lawsuit (Score:5, Interesting)
They have no reason to keep her from their stores, and the ban is specifically tied to her.
I wonder if a defamation suit would be in order?
Re: (Score:1)
Greetings fellow American, this happened outside of America.
You're right - there's no "America" anymore (Score:3)
outside of America.
Oh c'mon! Everybody knows there's no such thing!
You're right - it's now known as the "Shit Hole of America", because the rest of the world can rename things too!
Re: You're right - there's no "America" anymore (Score:3)
If I recall, one reason for renaming it was that it's more "inclusive", since "America" includes the US, Mexico, etc. Wouldn't that mean the name used by the US should revert back to "Gulf of Mexico", since renaming it was DEI?
Re: (Score:2)
> wouldn't that mean the name used by the US should revert back to "Gulf of Mexico" since renaming it was DEI?
Maybe not, here's the 1828 map of Mexico (United Mexican States) https://en.wikipedia.org/wiki/... [wikipedia.org]
Re: (Score:1)
Ever heard of America?
Re: Sounds like a good lawsuit (Score:2)
There are non-discrimination laws that apply even in the US in many cases. You can't just decide to arbitrarily ban certain people, or groups of people, without reason.
Re:Sounds like a good lawsuit (Score:4, Interesting)
This seems like exactly the kind of situation lawsuits are intended for. A Google search of "UK false accusation law" turns up multiple ambulance-chasing lawyer web pages that inform me that false accusations, as well as milder defamation, can be prosecuted both criminally and civilly, and that I should definitely contact them as soon as possible to make sure I know my rights.
Re: (Score:3)
What we read in TFS makes me think she is preparing for that civil lawsuit: "Everyone was looking at me", "My heart sunk and I was anxious", "my mum as well was stressed". She's telling anyone who will listen that she and her mum suffered psychological damage, which she can later claim in court.
Re: (Score:2)
On the other hand, at the very least, Facewatch actually followed up and looked into her claims of innocence and admitted their error -- though they could do better than an "anonymous spokesperson." Some (many? most?) companies might have simply blown her off or perpetually routed her through automated systems (like I've heard Google, Facebook, etc... do) until she gave up and perhaps actually sued. Not defending their tech or the store's proactive use of it, but their follow-up could have been worse.
Re:Sounds like a good lawsuit (Score:5, Interesting)
They didn't own up. They were asked about it by the media.
Trust me, if she hadn't gone to the media, she'd just have been turned away like 99% of the other people mistakenly identified as shoplifters.
She went to the media, now the company behind it has a huge PR problem that they need to control - because once the media starts announcing the company behind it, as well as the store, people start getting turned off. The store identifies the company responsible as an attempt to deflect blame ("It wasn't us, we use Facewatch!"). Facewatch needs to own up because other stores using their technology might see a similar backlash.
A customer has choices when it comes to stores, and an article like this can get them to choose alternatives to avoid accidentally being detained. Especially with the media spotlight on AI, this could turn what was an accidental misidentification into a full-blown boycott that does worse damage than the shoplifting ever did.
Companies admit to wrongdoing all the time if you read the news - some consumer gets shafted for months, contacts the media, the issue is magically resolved in 24 hours, and the company admits a mistake was made. Of course, there's no explanation of why it took so long to actually do something, or what happens to the other people going through the exact same issue.
Re:Sounds like a good lawsuit (Score:4, Insightful)
She has a good case under GDPR rules. They processed her personal data (her biometrics), relying on legitimate interest to avoid having to get permission. But that brings a lot of responsibility too, and clearly they have failed here.
The difficulty will be that Facewatch blames Home Bargains, and Home Bargains blames Facewatch. I'd say the liability is mostly with Facewatch, since they flagged her as a thief, and they clearly didn't vet the information they were given, or provide proper training to Home Bargains' staff. If it were me I'd go to the small claims court, no lawyer required, and request probably around £2,000 in compensation via a Letter Before Action, for distress, misuse of personal data, and the time and effort required to get it corrected. That would be for both people affected, since the mother was distressed by it too.
Re:Sounds like a good lawsuit (Score:4, Informative)
There is obviously a personal data angle here. There might also be a defamation angle if the system works as implied by TFS, since it appears that someone's reputation has been affected because someone else lied about them, and this has demonstrably caused harm. If there was more than one relevant incident then there might also be a harassment angle.
Please be careful with that advice about requesting compensation in a Letter Before Action, though. There are fairly specific rules for what you can and can't claim under our system and just going in with claiming some arbitrary figure of a few thousand pounds in "compensation" for vague damages is far from guaranteed to get the result you're hoping for. If someone were serious about challenging this kind of behaviour, they might do better to consult with a real lawyer initially to understand what they might realistically achieve and what kinds of costs and risks would be involved.
Re: (Score:2)
The ICO used to have examples of compensation awards for various types of DPA issues, but I can't find them now. My numbers were based on those, but IANAL and this is not legal advice. You are right, get legal advice, the cost can be passed on to them anyway.
Re: (Score:2)
You are right, get legal advice, the cost can be passed on to them anyway.
AIUI, your costs can't (or couldn't) generally be passed on when using the small claims system. Has that changed? It's been a while since I went through the process, so it's possible that my information here is out of date.
Re: (Score:3)
Exactly! The court slapping a company with a large judgement is what gets things fixed.
Finger of blame pointing in the wrong direction? (Score:5, Informative)
There's really only one screw-up here, and that was by the staff at the May 8th store who added her to the Facewatch DB; everyone and everything else seems to have done what they/it should have done under the circumstances. Still, on the "lessons learnt" front, users of systems like this *really* need to allow for the possibility of human error in the submission, or a mistaken ID by the system (not that this seems to have happened here), when challenging someone like this, and have a clear-cut audit trail and process of appeal. If Home Bargains had been able to say, right off the bat, that it was down to a presumed theft of toilet rolls on May 8th and undertake an on-the-spot review on May 24th, this could easily have been avoided.
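As a rough illustration of that audit-trail idea, here is a minimal Python sketch. Everything in it (WatchlistEntry, explain, the fields) is a hypothetical design, not Facewatch's actual system; the point is only that a challenged shopper could be given the who/when/why on the spot, and that unverified reports are labelled as such.

from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class WatchlistEntry:
    person_id: str
    reported_by: str        # store/staff member who submitted the report
    reason: str             # e.g. "suspected theft, ~£10 of toilet rolls"
    reported_at: datetime
    reviewed: bool = False  # has a second person verified the report?
    history: list = field(default_factory=list)  # append-only audit trail

def add_entry(db: dict, entry: WatchlistEntry) -> None:
    entry.history.append((datetime.now(), f"added by {entry.reported_by}"))
    db[entry.person_id] = entry

def explain(db: dict, person_id: str) -> str:
    """Answer 'why is this person flagged?' at the point of challenge."""
    e = db.get(person_id)
    if e is None:
        return "No entry: this person should not be challenged."
    status = "verified" if e.reviewed else "UNVERIFIED - treat as possible error"
    return f"{e.reason} (reported {e.reported_at:%d %b} by {e.reported_by}, {status})"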
Re:Finger of blame pointing in the wrong direction (Score:5, Insightful)
In other words, the problem isn't that the facial recognition system didn't work, it's that it did, with zero errors.
Re: (Score:3)
Does the facial recognition "system" include the people who entered incorrect data?
Re: (Score:2)
FFS, people aren't talking about the specifics of the algorithm in one part of the system; they are talking about how the facial recognition system incorrectly identified someone as a shoplifter. Which it did. Why is immaterial if you're being accused by a faceless company with little recourse.
Half of the pedantry on slashdot is people aggressively misunderstanding how people actually talk.
Re: (Score:2)
So an error?
Re: (Score:2)
An apparently human error, not a facial recognition one, unless you would call any other case where a chain sends out a picture of an innocent person and says "do not admit this person" a "facial recognition error".
The criticism is over the qualifier for the word "error", not the label as an error.
Re: (Score:2)
Sure, on the lessons-learnt front that would be great if it all took place, but these are all private enterprises, so what's gonna make them do it? Maybe demands from their customers, but I have to imagine the selling point of these companies to their customers is not having to do all that human work, so there are some perverse incentives at play that lead to this happening.
I mean, let's be real, it wasn't simple oversight that, it seems, precisely none of those things are in the system now. This probably should b
Re: (Score:3)
AFAICT, the actual sequence of events is that Ms. Horan bought and paid for some toilet rolls on May 8th, after which *human error* at the store resulted in her being added to the Facewatch programme.
Actually, the wording of that Facewatch email is weaselly enough - it implies this was the case, but does not actually state it.
Is all that Facewatch is doing just sharing info created/entered by the retailers?
Re: (Score:2)
I agree it's not clear which of those it is, and the distinction is very important.
Is all that Facewatch is doing just sharing info created/entered by the retailers?
Yes. With some auto-detection of that person. Basically a really advanced wanted poster.
Re:Finger of blame pointing in the wrong direction (Score:4, Insightful)
The issue is that they have a very powerful facial recognition system deployed at multiple stores - not just BM, other shops as well, so potentially huge consequences if they make a mistake. And apparently all it takes is one unverified report to get you on their list, at which point you can expect to be accosted at any number of venues, with no idea why.
It's lucky she paid with a card, or she might not have been able to prove she didn't steal anything, and wouldn't have been able to get the situation sorted out. Do they have an expiry date on these reports, after which they take your face out of the database, or are you barred from an ever increasing number of shops and venues for life?
GDPR requires this immense power to be wielded very, very carefully. There should be extensive checks in place to verify the accuracy of allegations, and there probably needs to be some kind of independent body you can appeal to. There will be legal consequences for this kind of mistake.
Given that it's essentially a parallel judicial system, where one company is judge and juror, it may also be in need of heavy regulation. For example, if you do actually get convicted of theft, eventually the conviction is "spent" and you no longer have to report it to employers, it doesn't show up on many background checks etc. Credit reference agencies and insurers are regulated such that even if they could retain a record of you having say a speeding conviction or an at-fault accident, there is a time limit after which it must be purged. The Right to be Forgotten, as it's sometimes known.
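As a concrete (and entirely hypothetical) sketch of that kind of time-limited retention, assuming each flag carries a timestamp; the one-year period is a made-up illustrative policy, not anything Facewatch documents:

from datetime import datetime, timedelta

RETENTION = timedelta(days=365)  # assumed policy for petty incidents; illustrative only

def purge_expired(flagged_at: dict[str, datetime], now: datetime | None = None) -> int:
    """Drop flags older than the retention period, so a report eventually
    becomes "spent" like a conviction; returns how many were purged."""
    now = now or datetime.now()
    expired = [pid for pid, t in flagged_at.items() if now - t > RETENTION]
    for pid in expired:
        del flagged_at[pid]
    return len(expired)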
The problem is the human reaction not the tech (Score:3)
The problem isn't the face recognition system; it's the excessive response of immediately barring her from the shop. They should have let her in normally and just paid extra attention through the cameras.
If the system displays "100% MATCH" in blinking red letters like we'd see in a movie, then that's the problem. The system should say it's merely a possible match, and that the matter must be treated with caution. The picture should be made available on paper or a tablet if anyone is sent to talk to the person. A human could easily identify any mistakes by the system.
Or the provider of the solution should make it clearer during user training what the appropriate response should be.
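A sketch of that cautious presentation, with made-up thresholds rather than numbers from any real deployment:

def describe_match(similarity: float) -> str:
    """Turn a raw similarity score into cautious, operator-facing language
    instead of a movie-style '100% MATCH' banner."""
    if similarity >= 0.90:
        return "Possible match - verify against the reference photo before acting."
    if similarity >= 0.75:
        return "Weak resemblance - do not approach on this basis alone."
    return "No actionable match."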
Re: (Score:3)
What stores would typically do is bar you for a few years, but not for life, at least for petty theft. Violent behavior or whatever would probably bring a lifetime ban. And no, stores do not have to serve you if you don't play by the rules.
Either way, what seems to have happened here is a complete breakdown of process. A woman was wrongly accused of theft by incompetent staff and publicly embarrassed through no fault of her own.
Meh (Score:1)
Historically in Britain, we have a history that you are innocent until proven guilty but when an algorithm, a camera and a facial recognition system gets involved, you are guilty.
Brits really love their surveillance cameras everywhere. Not sure what they expected to happen.
The tech is needed to prevent extremes (Score:3)
Just ban it from being used for petty stuff, under penalty of law. Anyone who uses the tech for catching petty thieves or something abusive like fucking with an ex should be charged with stalking or something like that. We do need facial and activity recognition to screen for actual stalkers, violent people, kidnappers, and psychos. Of course due process needs to be enforced in every criminal case, and during the case the tech must be open to scrutiny. The standard must be that a number of humans review the tape and agree -- and that too coupled with other evidence.
Not a problem with the tech (Score:2)
This is not a problem with facial recognition but rather a problem with how it is used.
Facial recognition gives you some "best matches" and then the user is supposed to look at the photos and decide if this is the same person or not.
(Systems used by police say "this doesn't constitute probable cause".)
But users can be lazy and let the machine do the thinking for them. That's on the user, not the tech.
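For concreteness, a sketch of that shortlist-then-human-review workflow. The embeddings are assumed to come from some upstream face-encoding model (not shown), and all names here are hypothetical:

import math

def cosine(a: list[float], b: list[float]) -> float:
    # Cosine similarity between two face embeddings.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def best_matches(probe: list[float], gallery: dict[str, list[float]],
                 k: int = 3) -> list[tuple[str, float]]:
    """Return the top-k candidates; a human is supposed to compare the
    photos and make the actual identification decision."""
    scored = [(pid, cosine(probe, emb)) for pid, emb in gallery.items()]
    return sorted(scored, key=lambda t: t[1], reverse=True)[:k]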
Guilty until proven innocent. (Score:2)
At least in the UK, there is a legal presumption that a computer cannot make mistakes. It is rebuttable in court, but the onus would be on the accused.
Which is, of course, a load of fetid dingo's kidneys. This type of presumption led to the Post Office scandal.
Time for this presumption to go the way of the dodo.
Of course, yeah, UK or no, or Australia, NZ, RSA or just about anywhere, but especially in the US, I'd go directly to an ambulance-chasing lawyer. Hopefully a good one. Who smells a fat fee for a re