AI Translations Are Adding 'Hallucinations' To Wikipedia Articles (404media.co)
An anonymous reader quotes a report from 404 Media: Wikipedia editors have implemented new policies and restricted a number of contributors who were paid to use AI to translate existing Wikipedia articles into other languages, after discovering that these AI translations introduced "hallucinations," or errors, into the resulting articles. The new restrictions show how Wikipedia editors continue to fight to keep the flood of generative AI across the internet from diminishing the reliability of the world's largest repository of knowledge. The incident also reveals how even well-intentioned efforts to expand Wikipedia are prone to errors when they rely on generative AI, and how those errors are remedied by Wikipedia's open governance model. The issue centers on a program run by the Open Knowledge Association (OKA), a nonprofit that was found to be "mostly relying on cheap labor from contractors in the Global South" to translate English Wikipedia articles into other languages. Some translators began using tools like Google Gemini and ChatGPT to speed up the process, but editors reviewing the work found numerous hallucinations, including factual errors, missing citations, and references to unrelated sources.
"Ultimately the editors decided to implement restrictions against OKA translators who make multiple errors, but not block OKA translation as a rule," reports 404 Media.
Treat all AI edits as vandalism (Score:5, Insightful)
Re: Treat all AI edits as vandalism (Score:1)
How does this help the AI company shareholders? (This is the purpose of all sentient life - get with the program.)
Re: (Score:2)
Or, maybe, we'd be better off without LLM-AI at all.
Re: (Score:3)
Wikipedia was about half hallucination anyway.
Let's check that. Here is the link that takes you to a random Wikipedia article: https://en.wikipedia.org/wiki/... [wikipedia.org]
How many times do you have to click that link to find an example of the hallucinations that you claim are "about half" of Wikipedia?
Re: (Score:1)
I came here to say the same thing. Unless you're dealing with an article on physics, geography, or engineering, almost everything on Wikipedia has a noticeable slant. It is not limited to politics, it encompasses pretty much everything except the hard sciences.
Apparently, we're supposed to fall in line with the bias and not point it out. For those of us who grew up with actual encyclopedias, Wiki always comes off as a little iff-tacular. Like you say, in the harder sciences you tend to only get editors with a lot of training in the field, but most other topics? It's randomized edit-wars of counter-biases.
Re: They're catching up to us! (Score:2)
Wait until you find out that encyclopedia articles are also written by humans with biases and change over time. Go look up Gaza in some new and old versions of the same encyclopedia some time.
Re: (Score:3)
Yes, well, reality has a well known liberal bias, so it can't be helped really.
Re: (Score:2)
Yes, well, reality has a well known liberal bias, so it can't be helped really.
You are 100% correct sir! As proof, let’s consult the early 2020s 100% liberal-approved fact check guide:
* The inflation is minimal and, regardless, is “temporary”.
* The border is secure, and cannot be closed any further without new congressional legislation.
* Biden is fully mentally competent.
* Judging by identity instead of character is democratic.
* Decriminalizing crime and defunding policing are wonderful ideas.
* Violent crime rates haven’t spiked.
* The working class is doing well.
WP editors are NOT known for using best practices (Score:5, Insightful)
Problem (Score:2)
It is crazy just HOW wrong and weird these "hallucinations" can be. Just last night I was asking Gemini about TV shows to watch and it literally made up a TV show that didn't exist as a recommendation.
Re: (Score:2)
"literally"
I think not.
Re: (Score:2)
Re: (Score:2)
Is "letterly" a cromulent word, though?
Re: (Score:2)
Yes, if you embiggen your vocabulary.
Re: (Score:2)
It metaphorically made up a show to a great extent??
What do you think is meant by "literally" here?
Re: Problem (Score:1)
My take is, in times of ever-present falsehood, 'literally' is used as a tag to indicate truth..."I'm not lying when I say..."
Hey, that's MY job (Score:3)
How dare AI steal my job of adding inaccurate information to Wikipedia articles for no reason.
Proof reading is a concept (Score:2)
Translators have been using various forms of tools, including "AI" tools, for many years. The point is: as a professional translator, you treat what the tools produce as a draft, and do the final corrections yourself.
Of course, Wikimedia probably didn't hire qualified, professional translators, because that would cost actual money.
We Need AI Control Laws (Score:2)
Re: (Score:2)
The text prediction engine that cell phones use is fine, and so is Bixby (it's a limited AI thing; I don't think it can generate code for programs).
Anything higher than that isn't really needed... OpenAI doesn't need ten factory-sized buildings full of their machines to power their stoned LLM-AI, and we absolutely shouldn't be using it at the Pentagon for anything... launching missiles or declaring war HAS to be a human-only thing based on human-only gathered information.