AI Education

Student Handcuffed After School's AI System Mistakes a Bag of Chips for a Gun (theguardian.com) 144

An AI system "apparently mistook a high school student's bag of Doritos for a firearm," reports the Guardian, "and called local police to tell them the pupil was armed." Taki Allen was sitting with friends on Monday night outside Kenwood high school in Baltimore and eating a snack when police officers with guns approached him. "At first, I didn't know where they were going until they started walking toward me with guns, talking about, 'Get on the ground,' and I was like, 'What?'" Allen told the WBAL-TV 11 News television station.

Allen said they made him get on his knees, handcuffed and searched him — finding nothing. They then showed him a copy of the picture that had triggered the alert. "I was just holding a Doritos bag — it was two hands and one finger out, and they said it looked like a gun," Allen said.

Thanks to Slashdot reader Bruce66423 for sharing the article.
  • by HiThere ( 15173 ) <(ten.knilhtrae) (ta) (nsxihselrahc)> on Saturday October 25, 2025 @02:44PM (#65750080)

    That's really stupid human oversight.

    • by TheMiddleRoad ( 1153113 ) on Saturday October 25, 2025 @02:47PM (#65750094)

      That's how humans use AI.

      • by Sloppy ( 14984 ) on Saturday October 25, 2025 @07:51PM (#65750590) Homepage Journal

        That's how some humans use everything. I used to be shocked by stories where some fuckwit blindly followed Google Maps into rivers or airport runways (long before LLMs) but now I know if a dialog window asks "Should I kill you as painfully as possible?" it'll get a lot of Yes clicks.

        If people aren't stupid, then can we at least admit they hate themselves?

        • by AmiMoJo ( 196126 )

          It's how the cops use every new bit of tech. When DNA came in, they were arresting people on very flimsy DNA evidence that later turned out to either be flawed or easily and obviously explained away.

          Happened when IP addresses became their new toy. Still happens with fingerprints, which, despite what CSI may tell you, rarely present an exact match.

      • No this isn't AI. If police are looking at the picture and telling him he has a gun instead of a Doritos packet then AI isn't "in use here". It really should be though. It's time we go back to employing "Actually Intelligent" people and not just let every dumb mouth breathing cunt join the police force.

        • Okay maybe it was AI. TFS is poorly written and TFA is TFA and therefore unread. Great work David. No I refuse to call you "EditorDavid" you don't deserve that title.

    • by 93 Escort Wagon ( 326346 ) on Saturday October 25, 2025 @03:30PM (#65750162)

      Actually, it's both.

    • by sound+vision ( 884283 ) on Saturday October 25, 2025 @03:31PM (#65750164) Journal

      In other words, the AI failed, and the humans also failed.

      • by HiThere ( 15173 ) <(ten.knilhtrae) (ta) (nsxihselrahc)> on Saturday October 25, 2025 @03:50PM (#65750210)

        No. If you want to avoid false positives, you have to accept false negatives, and conversely. Set the recognition to be "super cautious" and it's going to make mistakes that say "Maybe a gun there". This is literally inevitable.
        What's really stupid is that the police looked at the picture of a Doritos bag and a couple of fingers and didn't realize it was a false positive. (Or more likely didn't even bother to look at the evidence before flying off the handle.)
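        The tradeoff GP describes can be shown with a toy threshold sketch. All scores and labels below are made up for illustration; nothing here comes from the actual detection system:

```python
# Toy illustration of the false-positive / false-negative tradeoff.
# Detector confidence scores are hypothetical, NOT from any real system.
detections = [
    ("real gun", 0.92),
    ("real gun", 0.55),
    ("chip bag held at an angle", 0.48),
    ("umbrella", 0.30),
    ("phone", 0.10),
]

def classify(threshold):
    """Count false positives and false negatives at a given alert threshold."""
    fp = sum(1 for label, s in detections if s >= threshold and "gun" not in label)
    fn = sum(1 for label, s in detections if s < threshold and "gun" in label)
    return fp, fn

# A cautious (low) threshold catches every gun but flags the chip bag;
# a strict (high) threshold spares the chip bag but misses a real gun.
print(classify(0.4))  # (1, 0): one false alarm, no missed guns
print(classify(0.6))  # (0, 1): no false alarms, one missed gun
```

        Whichever threshold the vendor picks, one error type is traded for the other; the human review step exists to absorb the false positives a cautious threshold inevitably produces.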

        • by The Grim Reefer ( 1162755 ) on Saturday October 25, 2025 @05:20PM (#65750372)

          What's really stupid is that the police looked at the picture of a Doritos bag and a couple of fingers and didn't realize it was a false positive. (Or more likely didn't even bother to look at the evidence before flying off the handle.)

          From TFS, it doesn't sound like the police saw a picture until after the fact. The way it reads, the AI flagged the frame in the video and called the police. The police were responding to a call for an armed person. If you're responding to a call for a possible shooter, it's unlikely you will be grilling the caller, on site, before trying to stop a potential mass shooter.

          It seems to me that the company who makes the AI should require a person to review the video once the call goes out. That way they can give the police an all clear before they arrive.

          • by evanh ( 627108 )

            So it was a Doxxing then. Do the company execs go to jail now for the burden they've put on society?

            • So it was a Doxxing then. Do the company execs go to jail now for the burden they've put on society?

              No, it's just that people shouldn't eat goddamned Doritos! This should be obvious, people!

            • Not a doxxing. A swatting.
            • More like a Dorritoxxing, look they were just following orders -CEO
              Reporters: You mean they were following the orders of the AI?!?
              If that’s what lets me off the hook, yes! -CEO
          • The way it reads, the AI flagged the frame in the video and called the police. The police were responding to a call for an armed person. If you're responding to a call for a possible shooter, it's unlikely you will be grilling the caller, on site, before trying to stop a potential mass shooter.

            Who did it pretend to be? Did it identify itself as a bot? 911 operators usually do have questions of their callers, so they can feed info to the police. I'd be very curious to hear that conversation.

            • by Rei ( 128717 )

              Nobody called anybody. It's a weapons detection system. They're deliberately running it. The article makes it sound like ChatGPT got bored, started watching webcams, and decided to narc on the Doritos Guy. That's not even close to what happened.

              • They're deliberately running it.

                Yes I know that. So you are saying nobody actually called the police? How did they know to come, or do they have some sort of setup like a silent alarm that just triggers a light somewhere on a web dashboard? If it is not via 911 you would presumably need some sort of special arrangement with the police. The article only says it "sends an alert" without being more specific.

          • by cusco ( 717999 )

            I can't see how that would actually work, because there's no way they're going to know which person in a group is the "offender" and these clowns apparently did. While the AI could probably identify the location based on the camera name, I have trouble believing it could even tell them which group of kids much less "the kid with the green shirt, blue jeans and blue backpack." This is almost certainly a moron of a security guard responding to an automatic alarm after only the most casual glance at the video

          • by Calydor ( 739835 )

            Then when did the cops get the picture to show the suspect? On the way to the scene? While approaching with guns drawn? While searching him after handcuffing him? Did the AI forward a description of the suspect instead of the picture?

          • by Rei ( 128717 )

            From TFS, there's no indication either way of whether they had seen the picture before, and if I had to argue either way from the wording, I'd go with "yes, they had".

            Also, when did we switch from calling weapons detections systems "weapons detections systems" to "artificial intelligence systems"? It's still true, but a much less useful choice of wording, and is probably going to make some readers think they were shoving video feeds through ChatGPT or something.

            Also, in the picture [metro.co.uk], it was clearly their ce

          • by AmiMoJo ( 196126 )

            Around here the police usually won't come out to surveillance alerts, unless a human has manually reviewed them first.

          • What's really stupid is that the police looked at the picture of a Doritos bag and a couple of fingers and didn't realize it was a false positive. (Or more likely didn't even bother to look at the evidence before flying off the handle.)

            From TFS, it doesn't sound like the police saw a picture until after the fact. The way it reads, the AI flagged the frame in the video and called the police. The police were responding to a call for an armed person. If you're responding to a call for a possible shooter, it's unlikely you will be grilling the caller, on site, before trying to stop a potential mass shooter.

            It seems to me that the company who makes the AI should require a person to review the video once the call goes out. That way they can give the police an all clear before they arrive.

            TFA says the cops had the picture, they showed it to the student while they were detaining him at gunpoint. Here's how the student described the photo when the cops showed it to him:

            “I was just holding a Doritos bag – it was two hands and one finger out, and they said it looked like a gun,” Allen said.

            Either they did a shitty job of looking at the photo or they didn't look at all.

    • That's really stupid human oversight.

      Agreed! Chips are junk food, that poor kid isn't being looked after properly!

    • by Iamthecheese ( 1264298 ) on Saturday October 25, 2025 @07:31PM (#65750566)
    The problem is panic culture: choosing safety over liberty, and zero-tolerance ass covering.
  • AI is dumber. Though if they were the Flamin' Hot Doritos the mistake may be understandable.

    • The job is boring as shit. If you score too high on an intelligence test they won't hire you because you're likely to quit after training.

      We don't pay well enough to counter that so there you go. Cops are dumb by design
      • I think the problem is more that cops have to do all sorts of evil things as part of enforcing capital and whatever bigotry is codified where you are, which means anyone who joins is either too stupid to notice, or too evil to care.
      • by sjames ( 1099 )

        Sure, but that's beyond dumb. Any moron knows the difference between a hand with a finger extended holding a bag of Doritos and a gun.

        As punishment, they should be required to report to the front of the cafeteria and be paddled by the student in front of the entire student body.

        The local news can make a day of it.

  • by memory_register ( 6248354 ) on Saturday October 25, 2025 @02:46PM (#65750088)
    You must - MUST - have a human double checking this. Dispatching any kind of response without human review invites catastrophe.
    • by TheMiddleRoad ( 1153113 ) on Saturday October 25, 2025 @02:47PM (#65750096)

      If you have humans double-checking your AI, then you don't save money. Never gonna happen.

      • by Keick ( 252453 )

        Do you then suppose that double-checking a potential false positive is somehow more expensive than sending a squad to a school?

        • Sending the squad to a school is already paid for. It doesn't hit the budget. Assigning review duties to AI alerts? Gotta pay somebody for that shit.

          • by StormReaver ( 59959 ) on Saturday October 25, 2025 @03:28PM (#65750160)

            Gotta pay somebody for that shit.

            Police are paid for by taxes, and public school is paid for by taxes. Everyone's already paid. It is pure criminal negligence that allows machine vision to automatically call police. The student's parents should sue the school, and the school administrators should be prosecuted for filing a false police report.

            • Comment removed (Score:4, Insightful)

              by account_deleted ( 4530225 ) on Saturday October 25, 2025 @03:57PM (#65750222)
              Comment removed based on user account deletion
              • by sjames ( 1099 )

                Or just have the leader of the squad that's about to go threaten a kid over a bag of Doritos give the photo a look first. It pays for itself by leaving the squad free to pursue something more relevant and worthwhile. The 30 seconds it would take to look at the picture isn't even enough time to get to their cars.

                • Comment removed based on user account deletion
                  • by sjames ( 1099 )

                    They had the picture with them when they went to arrest the kid. There was no active threat. Even a glance at the photo would have revealed the truth. Then there's the bonus of not having yet another law abiding citizen one day instructing his children "Don't EVER trust the cops".

                    Apparently the best adjective to describe current police operations is "Keystone".

          • Paying someone to review the AI's assessments will wind up being cheaper than paying out the settlements on the lawsuits for 'overenthusiastic' police officers forcing a child to their knees, cuffing them, and searching them after a false accusation. And, presumably, the number of times that the AI will trigger on such an event will be small, allowing the human tasked with reviewing AI positives to perform other duties when not actively fielding an AI system's misperceptions.
            • The police department is going to hire and train on-demand, 24-hour AI inspectors, but they'll have other responsibilities? I doubt that.

              The lawsuits may eventually cost more than hiring a checker, but that's a problem for somebody several years from now. They already have lawyers on retainer, but not AI checkers.

      • by thegarbz ( 1787294 ) on Sunday October 26, 2025 @04:37AM (#65751050)

        If you have humans double-checking your AI, then you don't save money. Never gonna happen.

        Of course you do, and literally every other AI enforcement system uses this approach already. AI reviews everything, a human only confirms the positives to ensure they aren't false.

        For example Australia is now using AI enforcement systems to detect people using mobile phones behind the steering wheel. The AI will flag a picture, send it to a human to review, and the human issues the fine. Employing humans to look at every picture of every vehicle would be many MANY orders of magnitude more expensive.

      • by allo ( 1728082 )

        How did you calculate this? Let's think of a purely human and a human-checks-AI system:

        1) Four persons watching 40 security cameras, each keeping their attention on ten cameras (sounds like a good estimate for human attention).
        2) One person checking an AI generated alert of a camera every few hours to every few days, depending on the number of (false) positives the AI system generates.
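        A rough cost sketch of those two setups, using the parent's 40-camera estimate. The wage and alert volume are assumed numbers, purely illustrative:

```python
# Back-of-the-envelope comparison of the two monitoring setups.
# All dollar figures and alert rates are assumptions, not real data.
cameras = 40
cameras_per_watcher = 10          # parent's estimate of human attention span
hourly_wage = 25.0                # assumed USD/hour for a reviewer

# Option 1: humans watch every feed, around the clock.
watchers = cameras / cameras_per_watcher            # 4 people at any moment
human_only_per_day = watchers * 24 * hourly_wage    # 4 * 24 * $25 = $2400/day

# Option 2: AI flags frames; a human reviews only the alerts.
alerts_per_day = 5                # assumed alert volume
minutes_per_review = 2            # assumed time to eyeball one flagged frame
review_hours = alerts_per_day * minutes_per_review / 60
ai_assisted_per_day = review_hours * hourly_wage    # about $4/day of labor

print(human_only_per_day, round(ai_assisted_per_day, 2))
```

        Even if the assumed alert volume is off by an order of magnitude, reviewing flagged frames stays far cheaper than staffing every feed around the clock, which is why "a human double-checks the AI" doesn't erase the savings.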

    • Well, it may take a case that makes it to the courtroom where the source code is put under question.

    • by Sique ( 173459 )
      That's what was supposed to happen, but the school's administration was not present to review the footage and cancel the alarm, hence the police came in force.
    • Cops might need a human double doing the checking but most people could do it themselves…

    • You must - MUST - have a human double checking this. Dispatching any kind of response without human review invites catastrophe.

      According to TFS, the cops on the scene had a copy of the photo, so this should already be the case.

      They then showed him a copy of the picture that had triggered the alert. "I was just holding a Doritos bag — it was two hands and one finger out, and they said it looked like a gun," Allen said.

      Hopefully, they had already surmised this was a false alarm, but guessing they had to cover their asses and check it out and follow through as though it was a real situation, since it was flagged by the system. Not defending this, but can see why it happened. Had the video simply been originally viewed by a person, who would have realized it was a finger, it would have ended there. On the downside, peopl

    • Enter the vibe checkers.

    • That's fine. Except that if it turns out it's really a gun, when they're going over all the missteps involved, the time spent waiting for somebody to go over the result manually will absolutely come up.

    • The term you're looking for is "human in the loop" aka HITL.

  • by Slashythenkilly ( 7027842 ) on Saturday October 25, 2025 @02:46PM (#65750090)
    It's "almost" like you need an officer to review the footage before issuing the ticket so we aren't humiliating people and wasting anybody's time with nonsense.
    • Reviewing footage costs a lot of money. There's no room in the budget for it.

      • They looked at the picture afterward...just not before bothering to go on site.

        If they had looked at it afterward BEFORE bothering to go on site, they could have saved everyone a lot of effort.

        • The AI already told them what they saw. When you remove the cognitive load of recognizing what something is and then you prime people with a set answer, they'll believe it much if not most of the time. If I told you that the lyrics to "Kiss the Sky" were "Kiss This Guy" and you'd never heard of the song before, you'd hear "Kiss This Guy" when you listened. I told my kids that "Smack My Bitch Up" was "Smack My Pitch Up" and they believed it for years.

          • I dunno about all that....but humorously enough...in the Billy Joel song, "Pressure", for DECADES, I thought the lyric was:

            But you will come to a place
            Where the only thing you feel
            Her lonely cunt in your face
            And you'll have to deal with
            Pressure

            And I always thought, "wow!! That's really some strong imagery for a Billy Joel song".

            Only in the last few years did I figure out the lyric is actually:

            But you will come to a place
            Where the only thing you feel
            Are loaded guns in your face
            And you'll have to deal with
            Pre

        • by PPH ( 736903 ) on Saturday October 25, 2025 @03:34PM (#65750170)

          Anecdote: I received a red-light camera ticket in the mail a few years back. Pre-AI, they were using some sort of automated license plate reading software, which confused an 'O' with a 'Q'. They dutifully attached a copy of the picture of a late model GMC pickup truck with my vehicle plate and description. A 45 year old Toyota Landcruiser. Not even close.

          My mistake was in calling the municipal court and having the error fixed clerically. Which saved me time but ended up not having me challenge the ticket in court. Which would have made the error a public record and a data point useful for others to challenge the systems accuracy.

            • The system's accuracy wouldn't be in question and precisely nothing would change. You literally demonstrated just now that you had a method to correct the error; that would be the accepted form of accuracy.

              Clerical errors happen all the time. I remember a story where someone was clocked doing 407km/h (impossible) on the highway. He was driving a Peugeot 407. Nothing was automated here, a human was doing the work. That doesn't mean the system was inaccurate, and the ticket also got thrown out following the normal proc

            • by PPH ( 736903 )

              Clerical errors happen all the time.

              There was no clerical review. The method offered for a ticket challenge is to request a court date. Which costs drivers time and money, a kind of penalty by itself. Some people just pay the ticket, since (as they advertise) the infraction doesn't go on your record. I just happen to know cops and how broken the court systems are. In a proper and just system, nobody should be able to get a charge dropped through unofficial channels.

              Had there been a cursory human review, this would never have passed muster. B

      • You don't need to review all footage, just the flagged positive. Given how this is currently news, and how the school isn't having a shooter every day, it doesn't sound like it would cost even a cent extra to have someone at the school review the identified picture and click a button *before* calling in the guns.

        Does your company have a dedicated budget for you typing in a 2FA code?? Of course not, that would be absurd.

    • Funnier than the other joke but I was looking for the obligatory joke about the dangers of potato chips to some folks...

      It's "almost" like you need an officer to review the footage before issuing the ticket so we aren't humiliating people and wasting anybody's time with nonsense.

      This is a system that called the police. The police are out of the loop beyond being told there was a shooter flagged. You need a human reviewing potential positives *before* calling the police.

  • by xanadu113 ( 657977 ) on Saturday October 25, 2025 @03:07PM (#65750124)
    Where is the picture? I want to see what AI thought was a gun, and police, who should KNOW the difference in appearance between a bag of Doritos and a gun, felt it was enough to go on to draw guns and arrest the student. Then I want to know what brand of AI cameras were in use.
    • If it could possibly be interpreted as a gun and the AI recognition primes the viewers, they will usually see it as a gun.

    • by sound+vision ( 884283 ) on Saturday October 25, 2025 @03:41PM (#65750192) Journal

      The police were acting on the assumption that "what they heard" was correct, which they always do, and always did, before AI.

      "What they heard" was some kind of school administrator placing an emergency call about a student with a gun.

      That administrator hasn't said anything publicly yet that I've seen, but the AI company is claiming their own human staff had reviewed the images, flagged it as a false positive, and communicated that false positive status to the school admins.

      The question is who knew what and when.

      • by cusco ( 717999 )

        Bungie had an oversized Master Chief manikin in its lobby when they were located in Kirkland, WA. They were in two buildings half a block from each other, one of them sharing a parking lot with the Kirkland Senior Center. One day Master Chief's five foot long gun was needed in the other building for something, so after the photos were done one of the Asian employees was walking down the hill to take it back to the lobby.

        The police got a panicked call from an old lady screaming that there was "an Arab with

    • and police, who should KNOW the difference

      Police saw the picture after the fact.

  • New rule: no outside food, but you can buy from vending with a 2X markup.

  • Kids can now sneak guns into school inside Doritos bags.

    Police: We're not falling for that one again!

  • These displays of extreme authority are not an accident, they're a feature of a system that intends to remind you of the presence of constant surveillance.
  • by 93 Escort Wagon ( 326346 ) on Saturday October 25, 2025 @04:14PM (#65750254)

    From https://www.wbaltv.com/article... [wbaltv.com]

    "I am writing to provide information on an incident that occurred last night on school property. At approximately 7 p.m., school administration received an alert that an individual on school grounds may have been in possession of a weapon. The Department of School Safety and Security quickly reviewed and canceled the initial alert after confirming there was no weapon. I contacted our school resource officer (SRO) and reported the matter to him, and he contacted the local precinct for additional support. Police officers responded to the school, searched the individual and quickly confirmed that they were not in possession of any weapons.

    So the school resource officer (SRO) called the cops after the school had already determined it was a false positive and cancelled the alert. So the cops showing up, and this kid having to get handcuffed and searched, is all thanks to the SRO - who really deserves to be fired and quite possibly sued.

    • An additional level of failure is the over-reaction (IMO) of the police. A school security officer misidentifying a gun by mistake is always possible. But as there was no indication of a threatening attitude in the initial report, nor visibly when the police approached, the officers should not have sent people to the ground in handcuffs as their first reaction.

      As I understand, the officers were following their protocols for such situations; which would mean there is a systemic issue in that police protocols are inad

      • All true but no-one gets the point: This is a tiger-repelling, I mean, terrorist-repelling, rock.

        The half-wit security guard did something, the trigger-happy police did something and terrorism didn't happen: The system works. The friendly-fire and the gun-shot, never-armed suspect, when it happens, is the unfortunate price of 'saving' other people. The policy is, assume (not prepare) for the worst and damn the (avoidable) corpses.

        Like so much of the USA, cruelty is built-in. Until voters demand equ

      • An additional level of failure is the over-reaction (IMO) of the police.

        This did not seem like an overreaction to me. Maybe if it were in Europe, but not in the USA where stories of random armed people are actually quite common. If you're called to a scene for someone who has a gun, it stands to reason that you would go in guns drawn and command the person to get on the ground, and cuff them.

        You don't know anything, maybe that kid sitting there had a gun in his pocket ready to draw.

        An overreaction is stories of shoot first and asking questions later, like the police who shot an

        • Unfortunately students with weapons (kitchen knives) are not uncommon, and murders have occurred, to the point that since March 2025 the French police have been conducting random searches at schools. But this does not change the expected behaviour from educators and police. In short, whatever happens with school students inside or immediately next to schools is the primary responsibility of the school principal, and police do not touch students until there is a certain threat. Here are some illustrative examples from Fra

          • I wanted to conclude that while these sad stories triggered a debate on how dangerous it is to be a school employee nowadays, or how low paid it is for the level of commitment it requires, this did absolutely not lead to any discussion that police should touch the kids at first suspicion.

  • this is a future we do not need to see. that student deserves a pay day for that fuckup.

    • Precisely. If you make mistakes like this expensive enough for the police station then the problem solves itself. The real problem is that someone promised the police and the school a magic new technology that would make their schoolyard safer. So far the system probably has zero wins, and one spectacular failure. If the political and economic fallout for the failure is high enough then the school turns off the crappy system, and it encourages other schools to do the same. Potential new buyers for the

  • by devslash0 ( 4203435 ) on Saturday October 25, 2025 @08:08PM (#65750608)

    Every time something like this happens, the victim should sue the hell out of the operator. Fighting back is the only way to make this stop. Without a legal and financial detriment, those cowboy software unicorns will do what they want without a drop of common sense.

    • those cowboy software unicorns

      This has nothing to do with software. It was the school resource officer who called the police even after it was reported as a false positive.

      This is why vigilante justice sucks, you're so ready to put the wrong person's head on a stake. The software unicorn developed software with flags, with human review, and the operator of the software correctly flagged the report as false.

  • by joshuark ( 6549270 ) on Saturday October 25, 2025 @08:57PM (#65750662)

    The school and SRO will play the "Wag the Dog" movie scene where a Tostitos sack of chips is changed into a cat.

    https://youtu.be/Bp79QTKqKyg?t... [youtu.be]

    Now they change the bag of Doritos into a gun, and blame the AI for hallucinating. D'oh! Then the school will issue the press release "We take our students' protection and civil rights very seriously..."

    JoshK.
