Facebook Admits Flaw in Image Moderation After BBC Report (bbc.com)
From a report on BBC: A Facebook executive has admitted to MPs that its moderation process "was not working" following a BBC investigation. BBC News reported 100 posts featuring sexualised images and comments about children, but 82 were deemed not to "breach community standards." Facebook UK director Simon Milner told MPs the problem was now fixed. He was speaking to the Commons Home Affairs Committee alongside bosses from Twitter and Google as part of an investigation into online hate crime. The BBC investigation reported dozens of posts through the site's reporting tool, including images from groups where users were discussing swapping what appeared to be child abuse material. When journalists went back to Facebook with the images that had not been taken down, the company reported them to the police and cancelled an interview, saying in a statement: "It is against the law for anyone to distribute images of child exploitation."
"Appeared" to be (Score:4, Insightful)
Re: (Score:2)
You appear to be a racist, terrorist, misogynist pedophile.
And if you appear to be one, then you are.
Re: (Score:2)
In other words, copyright violation.
Re:"Appeared" to be (Score:5, Insightful)
"Zero tolerance" laws are always, no exception, a bad idea.
It's sad when an e-mail service flags and blocks pictures I send to my brother of himself as a child in a bathtub. Who knows what lists you can get added to, because the laws are just plain wrong.
Re: (Score:2)
""Zero tolerance" laws are always, no exception, a bad idea."
Er, is that a "law"?
Sorry, I haven't had my coffee yet and am still feeling pedantic.
Re: (Score:3)
"Zero tolerance" laws are always, no exception, a bad idea.
I like to call them Zero Intelligence laws.
Re: (Score:3)
So, would you say you have zero tolerance for zero tolerance laws?
Re: (Score:2)
Laugh! No, I tolerate them, but they are still a bad idea. I support getting rid of them, no matter what the zero tolerance policy is; whether it's banning Erlenmeyer flasks, banning knives at school (prompting a crafts teacher to hand children chisels), or anything else.
(Which is why I'm also against common law and prefer civil law - in common law, stupid precedents become law, and courts can't use common sense to interpret the meaning of the law, and get stuck on the letter of the law. Zero tolerance p
Re: (Score:2)
I agree. I would throw "three strikes" laws in with this as a subcategory of zero tolerance-type laws.
Re: (Score:2)
I agree. I would throw "three strikes" laws in with this as a subcategory of zero tolerance-type laws.
Yes, sending someone to jail for life for stealing *food* a total of three times is not too helpful. And far more expensive than, e.g., creating work programs.
In my opinion, most knee-jerk good intention laws are flawed.
That also includes idiotic laws that make rape carry far harsher sentences than other acts of violence, which have unintended consequences like women not reporting rape because they don't want their husband or a family member to go to jail for life, or where the perpetrators may choose to kill their victims.
Re: (Score:2)
I'm reminded of the couple in the '90s who were charged with Child Pornography for putting a photo of their 2 year-old playing naked in the lawn sprinkler on their web page.
Re: (Score:3)
This makes you stop and think about all of the fuss over nakedness - is it somehow unnatural?
Re: (Score:3)
Zero tolerance laws seem to exist mostly to protect officials from accountability, so they can overreact in safety.
"Yes, I know it seems unreasonable to expel your child for pretending to shoot at a friend with a drinking straw, but my hands are tied. I'm sorry, but I have no choice in the matter. It's the law."
Facebook flaunts failure (Score:2)
More importantly... (Score:1)
...Is the capture and punishment of those creating these materials. The fact that these materials (photos, videos, etc.) exist is secondary to the more serious crime of child exploitation itself. Humanity often seems to be more concerned with treating the symptom instead of the root cause of the problem, so to speak.
Re: (Score:2)
The interesting thing is Facebook reporting the reporters to the police.
Of course AI cannot determine this. (Score:1)
It's like asking AI to define pornography. You'll 64% know it when you see it, maybe, if the image fits a certain general profile. That's not going to work. Hire eyeballs with your BILLIONS OF DOLLARS.
Re: (Score:2)
It really does traumatize workers, even in the high agencies.
Automation is fine, but not when it's equated with a verdict. It supplements. It does not substitute.
But hey, our military weapons make sure there's a human sanity check somewhere in the automation chain, so at least the important priorities are being supported.
Or so I'm told. inb4 drone autonuke mod gets hacked ggezpz
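For what it's worth, a minimal sketch (in Python) of what "supplement, not substitute" could look like in a moderation pipeline. The class, thresholds and queues here are hypothetical illustrations, not any real system: the classifier only prioritises what a human looks at, and nothing gets a verdict without a person.

    from dataclasses import dataclass

    @dataclass
    class Image:
        id: str
        abuse_score: float  # classifier confidence in [0.0, 1.0]

    def triage(image, review_queue, escalation_queue):
        # Route by score; nothing is removed without a human decision.
        if image.abuse_score >= 0.9:
            escalation_queue.append(image)   # reviewed first, by trained staff
        elif image.abuse_score >= 0.5:
            review_queue.append(image)       # ordinary human review
        # below 0.5: published, but still reportable by users

    review_q, escalate_q = [], []
    triage(Image("img_123", 0.97), review_q, escalate_q)
    triage(Image("img_456", 0.62), review_q, escalate_q)
    print(len(review_q), len(escalate_q))    # 1 1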
Well That Explains It (Score:3)
Re: (Score:1)
Why should they be taken down? Videos showing actual homicide aren't illegal. From the tone of the article, neither were the images BBC was reporting. Sounds more like it was teenager bikini selfies and stuff out of underwear catalogs. Just because someone might be offended isn't a good reason to go around censoring everything.
Re: (Score:1)
Why does it matter if someone masturbates to a swimsuit catalogue? Even if that catalogue was for younger girls? The point is the parents of those children signed releases for their images to be used in advertising material and distributed widely. In some cases the producer and artistic director of the shoot pushed the boundaries of what society considers decent, or maybe they did not: though at all times the figure behind the [entirely sanctioned and legal] camera was trying to make the shot "special" or t
Impossible to judge. (Score:1)
Without knowing exactly what was in the offending posts, how can we possibly know what to think? Somehow I doubt it's that easy to find hardcore child abuse images on Facebook, so these might be almost innocent - it wouldn't surprise me if most of them are just swimsuit images from someone's family holiday, or screencaps from Toddlers and Tiaras or some other TV program.
It reminds me of certain very socially-conservative news sites I've seen decrying something or other as an affront to all that is good in the world.
Re: (Score:3)
People who can't distinguish paedophilia from child abuse should be put behind bars, as they're a danger to society.
It would surprise me if the great majority of paedophiles and zoophiles weren't abstaining from acting on their desires. My guess is that we only hear about the exceptions.
Or put it this way, have you ever dreamt or thought about killing someone? Does that make you a murderer who should face the penalty for such?
As for preventative actions, judging from all statistics I've seen, children h
Re: (Score:3)
So speaks ignorance. Pedophiles have a lower recidivism rate than perpetrators of most other crimes, i.e. not only CAN they be reformed, if not cured, but we are better at doing it than for most other crimes.
People convicted of sex crimes against children and released have less than 4% chance of being arrested again for sex crimes against children - but over 40% chance of being arrested again for any crime. And yes, they check the computer when they arrest them for littering.
In other words, the cops go after them, but
Cannot be fixed, not really (Score:2)
And computer scientists know there is no automated way to screen these photos without generating false positives. Even an algorithm that was 99% accurate would still flag an enormous number of innocent images at Facebook's volume. Even if someone can review 1,000 pictures a day facebook would have to hire 300,000 people to ensure none of the pictures posted are 'kiddie porn'
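A rough back-of-the-envelope sketch of that base-rate problem, in Python. The upload volume and prevalence figures below are illustrative assumptions, not Facebook's real numbers; the point is simply that even 99% accuracy drowns the tiny number of genuinely abusive images in millions of false positives.

    # Why a "99% accurate" screener still buries reviewers in false positives.
    uploads_per_day = 300_000_000   # assumed daily photo uploads (illustrative)
    abusive_fraction = 1e-6         # assumed prevalence of abusive images (illustrative)
    false_positive_rate = 0.01      # "99% accurate" on innocent images
    true_positive_rate = 0.99       # "99% accurate" on abusive images

    abusive = uploads_per_day * abusive_fraction
    innocent = uploads_per_day - abusive

    true_positives = abusive * true_positive_rate       # ~297 real hits
    false_positives = innocent * false_positive_rate    # ~3,000,000 innocent photos flagged

    precision = true_positives / (true_positives + false_positives)
    print(f"correctly flagged:   {true_positives:,.0f}")
    print(f"incorrectly flagged: {false_positives:,.0f}")
    print(f"precision: {precision:.4%}")                 # ~0.01%: almost every flag is wrong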
Re: (Score:2)
Even if someone can review 1,000 pictures a day facebook would have to hire 300,000 people to ensure none of the pictures posted are 'kiddie porn'
Good. Facebook's stupid system should be unbearably expensive, even with content moderator farms in Morocco where your image doesn't get more than a second's consideration.
It's currently a kiddie-pool design, where people can get accounts semi-anonymously, people can report things that make them feel uncomfortable anonymously, and the images are taken down anonymously.
Re: (Score:2)
How 'bout this? You show some ID to get an account.
How 'bout I don't show some ID to get an account at a place that doesn't require all that shit?
Facebook is not a goddam governmental agency where membership is a requirement.
Facebook is a business and will spend money to fend off external efforts to get it to spend money.
Re: (Score:2)
Even if someone can review 1,000 pictures a day facebook would have to hire 300,000 people to ensure none of the pictures posted are 'kiddie porn'
Why did you pull such a bad number out of your ass?
1000 is basically only 2 images per minute for an 8 hour work shift. If you worked for me I'd be looking into your productivity right now.
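The arithmetic behind both numbers, for the record. The daily upload volume here is an assumed, illustrative figure; only the 1,000-a-day throughput comes from the thread.

    reviews_per_person_per_day = 1_000      # the figure quoted above
    shift_minutes = 8 * 60
    print(reviews_per_person_per_day / shift_minutes)     # ~2.1 images per minute

    uploads_per_day = 300_000_000           # assumed photos uploaded per day (illustrative)
    print(uploads_per_day / reviews_per_person_per_day)   # 300,000 reviewers to see every upload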
Large fines (Score:2)
We need to levy fines large enough that they act as a deterrent. FB would quickly find a solution to this problem.
Re: (Score:2)
Yes, and the solution would be to withdraw all management from the US and incorporate in Bermuda.
Oh please ... (Score:3)
'Content Moderation' == Censorship (Score:2)
However: So-called 'content moderation' should be called what it is: censorship. Doesn't matter if it's a non-governmental, privately-owned company, enforcing their own rules on their own website, it's still censorship. Stop with the 'newspeak' already.
Re: (Score:2)
Censorship is not inherently bad unless it affects me personally, then it's REALLY BAD
Fixed that for you. People like you typically don't care what happens to anyone else, but you get all righteously indignant and suddenly become 'activists' when it affects you.