Meta Won't Release Its Multimodal Llama AI Model in the EU (theverge.com)

Meta says it won't be launching its upcoming multimodal AI model -- capable of handling video, audio, images, and text -- in the European Union, citing regulatory concerns. From a report: The decision will prevent European companies from using the multimodal model, despite it being released under an open license. Just last week, the EU finalized compliance deadlines for AI companies under its strict new AI Act. Tech companies operating in the EU will generally have until August 2026 to comply with rules around copyright, transparency, and AI uses like predictive policing. Meta's decision follows a similar move by Apple, which recently said it would likely exclude the EU from its Apple Intelligence rollout due to concerns surrounding the Digital Markets Act.
Comments Filter:
  • ... sounds a lot like arresting people before they commit an illegal act, based on the AI suggesting that they are about to commit one. If it were used to station police officers in likely high-crime areas, I would support that; however, I believe criminals aren't actually that stupid: they would see the increased police presence, rightly conclude that some area with a lower police presence must now exist, and seek that area out to commit crimes in.
    • If it was used to station police officers in likely high crime areas, I would support that

      I don't think anyone needs AI for that. While I'm not an expert on violent crime, I imagine it doesn't migrate very quickly. If an area was a hotbed of violent crime yesterday and the day before, there's a good chance that it will be a hotbed for crime today and tomorrow.

      determine that a lower police presence area should then exist and seek that area out to then commit crimes in

      I think you may be giving violent criminals too much credit.

      • Yeah, I just read an article about simple algorithms 'predicting' where crimes will occur, based upon historical data. No AI needed.

        Yeah, I have seen that MANY criminals aren't that smart about things, and as for the smart ones, it seems to me that if they had put the same level of effort into legitimate enterprises, they would have had the same level of success as they did in their criminal enterprises, but without the risk... for a very few that I have met, though, the 5-10 years of prison for a $50 million do

        • The problem is, criminality isn't a function of intelligence, it's a function of personality. The smarter criminals aren't running about stealing cars; they are running crypto scams or whatever. There are a number of personality disorders that correlate highly with criminal behavior: sociopathy and psychopathy, antisocial personality disorder, oppositional defiant disorder, and narcissism all come with a predisposition towards criminal behavior. Mix in external aggrav

          • Very cogent and interesting points, thank you. I am aware of the FBI statistics which show a general decline in violent crime over the last 30 or so years... I wish more people were!

            It is something that irks me no end when I hear people screaming about how the USA is overrun with violent crime, when clearly that is not the case.

    • If it was used to station police officers in likely high crime areas, I would support that

      Bye bye racial profiling, hello predictive policing. Putting an unaccountable machine which can't and won't explain its decisions in charge of law enforcement sounds like a plan that just fits right in with the goals of Trump's handlers.

      • Thanks for pointing that out. Yeah, I'm not a fan of racial profiling. My wife is black, and I have been with her when she has been pulled over for 'driving while black'...

        Storytime: we were driving to the mall one day when she got pulled over; she was driving, and we were in her car. The cop comes up to the window and says, "License and registration." She counters with, "What did I do wrong?" He says, "Just give me your license and registration." Seeing as how she has been pulled over before for driving while black, she

      • Bye bye racial profiling, hello predictive policing. Putting an unaccountable machine which can't and won't explain its decisions in charge of law enforcement sounds like a plan that just fits right in with the goals of Trump's handlers.

        Won't last long... just as soon as the machine, using logic, data, etc., starts telling us that we need extra policing in minority neighborhoods, it will be thrown out and declared a racist algorithm.

        Facts and numbers be damned....we can't have the truth out if it goes up agai

        • Can you imagine what it'd do if it analyzed the FBI crime statistics? It's going to take a lot of exceptions to make sure it doesn't let the cat outta the bag. Without someone to say "You can't say that! You can't even think it!" the thing is going to just summarize crime like a Twitter user: "It's them. It's almost always them."
    • They aren't accusing Meta of doing any of that, the law just explicitly forbids the practice, and TFA writers thought that sounded nice and sensational to mention when talking about a social media pariah, as it were.
    • ... sounds a lot like arresting people before they perform an illegal act based upon the AI suggesting that they are about to perform an illegal act. If it was used to station police officers in likely high crime areas, I would support that

      It turns out to be a self-fulfilling prophecy.

      The prediction of an area needing more police leads to more police in that area. More police there means more arrests, which validates the need for more police.

      • I would hope that more police in an area would lead to deterrence... But then again, I naively hope that the police are there to help.
    • If it was used to station police officers in likely high crime areas, I would support that,

      We already know what the high-crime areas are. But rather than fixing the circumstances that force people into crime, we punish them harder for their misery.

      It is entirely clear that these systems are already being used on the wrong problems.

      • good point, thank you for sharing that. I used to volunteer in prisons and try to mentor prisoners, so I have seen the desperation that so many have, believing that crime is their only way out of soul-crushing poverty.
  • by Anubis IV ( 1279820 ) on Thursday July 18, 2024 @12:59PM (#64635379)

    Did The Verge add something new to the story that we didn't already hear from Axios yesterday?

    https://meta.slashdot.org/stor... [slashdot.org]

  • Once again, impeccable work by the Slashdot editors. http://www.slashdot.org/story/... [slashdot.org]
    • by Anonymous Coward

      I like your sig, far classier than the "I don't read or respond to AC comments" norm. I've discovered that most of those posters WILL respond to AC comments that aren't trolling. So now you know. I'm thinking of writing an academic paper on the subject.

  • Good! (Score:5, Insightful)

    by Going_Digital ( 1485615 ) on Thursday July 18, 2024 @01:22PM (#64635423)
    I'm quite frankly fed up with having this AI rubbish foisted on me. AI has a place in some specific scenarios where the model can be well trained and tweaked, but this general AI rubbish bolted on to everything is just utter junk and frustrating; in most cases it just gets in the way of doing something you already know how to do. The world will be a better place when the AI hype dies down and AI gets put to work on the specialist tasks it is actually useful for.
  • And yeah, it's the usual "throwing their toys out of the pram" display that corporate executives often give to govt regulators. They'll be back, but I hope they take their time with it. These "AI" models & algorithms are far too new & untested to know what they'll actually be useful for, how useful they'll actually be, & what problems are likely to emerge from their use. We need pilot projects, independent reviews, & above all, transparency & honesty about them.

    Can you see Microsoft,
