Facebook Reports BBC To Police Following Publication's 'Sexualized Images' Investigation (bbc.com)
"Grave doubts" have emerged about the effectiveness of Facebook's moderation system after an investigation by the BBC last year revealed the social network was failing to remove sexualised images of children even after they were reported. Damian Collins, chair of the culture, media and sport committee, made the comments as he criticised Facebook's handling of the images, dozens of which were reported to the company by the BBC and fewer than 20% were removed. After the BBC sent evidence of the photos to Facebook, the social media company reported the BBC to the police for distributing the images, which had been shared on private Facebook groups intended for paedophiles. From a report on BBC: When provided with examples of the images, Facebook reported the BBC journalists involved to the police and cancelled plans for an interview. It subsequently issued a statement: "It is against the law for anyone to distribute images of child exploitation." Mr Collins said it was extraordinary that the BBC had been reported to the authorities when it was trying to "help clean up the network." [...] Information the BBC provided to the police led to one man being sent to prison for four years.
What is Facebook thinking? (Score:5, Insightful)
I take it FB is unfamiliar with the Streisand effect.
Re:What is Facebook thinking? (Score:5, Informative)
It is even worse than the summary suggests. The BBC did not originally send the evidence to FB. They did so when FB asked them to, ahead of an interview arranged with FB's director of policy Simon Milner. Reporting them to the police for providing exactly what they were asked to provide beggars belief.
Re: What is Facebook thinking? (Score:5, Insightful)
Police put child pornographers in prison after asking them for photos. That's how the law works. It's a dumb law, but THINK OF THE CHILDREN!!!!!!!!!!!!!!!
Re: (Score:2)
It's a dumb law, but THINK OF THE CHILDREN!!!!!!!!!!!!!!!
Isn't that what landed the child pornographers in prison in the first place? Hell no I'm not gonna do that!
Re:What is Facebook thinking? (Score:5, Insightful)
They did so when FB asked them to ahead of an interview arranged with FB's director of policy Simon Milner.
In many (most?) US states, this is legally required. That's how far the pedophile witch hunt has gone. It's a felony in most places, often with the same penalty as actually producing the material, to be aware that someone has child exploitation images without immediately reporting it to the police. The law doesn't mention anything about journalists.
This is something anyone who repairs PCs for a living knows. Sure, common sense might say "but this is an exception", but the law doesn't. There are some states in the US where no intent is required to be guilty of possession - doesn't matter why, unless you're a policeman investigating the particular crime. Heck, even the defense lawyer and jury may not be allowed to legally see the images except in tightly controlled circumstances, as if the witches might cast their evil magic unless the correct protective ritual is performed.
Re: (Score:2)
Well, see, the problem has an easy solution. If you come across dangerous images in a dangerous location, either delete them as fast as possible and forget about it, or do what you said: forward them to the authorities and then delete them as fast as possible, done and finished. No different to hacking tools, whether embedded in email or in any other format: either delete as fast as possible, or forward to the authorities as appropriate and then delete as fast as possible. Although the exposure to the first has been really lim
Re: (Score:2)
Thing is if you do this and report it, you may be guilty of destroying evidence (even if it's unintentional and recoverable). I think the formal term is "spoliation", and burden of proof/consequences vary greatly.
Not a lawyer, though, so I'm probably wrong.
Re: (Score:2)
That's what I said?
Re: (Score:1)
FB might have expected to receive hashes of the images. Most big companies (Google, Microsoft, Facebook) maintain hash lists of known images, which can be used for detecting, filtering and alerting.
https://www.theregister.co.uk/2015/11/17/iwf_hash_list/
So BBC technological incompetence combined with an imprecise request from FB could be the explanation.
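For illustration, here is a minimal sketch of how that kind of hash-list check could work, assuming a plain cryptographic hash comparison against a shared block list. Real deployments (e.g. Microsoft's PhotoDNA, or the IWF hash list in the link above) use perceptual hashes that survive resizing and re-encoding; the hash value, function names, and file paths below are hypothetical.

    import hashlib
    from pathlib import Path

    # Hypothetical block list of hex digests of known illegal images,
    # as would be supplied by a body such as the IWF. Placeholder value only.
    KNOWN_HASHES = {
        "0000000000000000000000000000000000000000000000000000000000000000",
    }

    def sha256_of_file(path: Path) -> str:
        """Return the SHA-256 hex digest of a file, read in chunks."""
        digest = hashlib.sha256()
        with path.open("rb") as handle:
            for chunk in iter(lambda: handle.read(65536), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def is_known_image(path: Path) -> bool:
        """Flag an upload whose hash appears on the shared block list."""
        return sha256_of_file(path) in KNOWN_HASHES

The point of such a scheme is that the person reporting only needs to transmit the digests, never the images themselves, which is presumably why the poster above expected FB to ask for hashes rather than copies.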
Re: (Score:2)
BBC tells FB that there are inappropriate child images on FB, and FB turns around and reports the BBC to the police for distributing these images?
I take it FB is unfamiliar with the Streisand effect.
You almost made it sound like somebody with Facebook's scale of influence gave 2 shits about the "Streisand effect".
Re: (Score:2)
They might give 2 shits about being prosecuted for wasting police time.
Report them! (Score:1)
FTA: "It is against the law for anyone to distribute images of child exploitation.
"When the BBC sent us such images we followed our industry's standard practice and reported them to Ceop [Child Exploitation & Online Protection Centre].
After which we dutifully reported the employee who sent us the material to Ceop for the same reason. This could mean the end of Facebook!
Comment removed (Score:5, Funny)
Re: (Score:1, Insightful)
> FTA: "It is against the law for anyone to distribute images of child exploitation.
Except they are *not* images of people exploiting children. They are pictures that young adults have taken of themselves and posted.
Standard teen facebook stuff that gets lots of likes and annoys their parents. Then the perverts copy the photos and add distasteful comments. In a hidden, private forum.
So what do you do? Where do you draw the line? Do we end up dressing our kids in burqas, or shrug and move on with our lives?
Re:Report them! (Score:5, Informative)
They are pictures that young adults have taken of themselves and posted.
FTA:
Images appeared to be stolen from newspapers, blogs and even clothing catalogues, while some were photographs taken secretly, and up close, in public places. One user had even posted a video of a children's dance show.
TFA is not about "young adults" nor pictures "taken of themselves and posted." The only ages I see cited are 10-11. It's about pictures people have taken of children that are being treated as sexual.
Re: (Score:1)
It's about pictures people have taken of children that are being treated as sexual.
Oh, so it's literally just thoughtcrime. Knowing this, I can't say I have much sympathy for the BBC.
Re: (Score:3)
It's about pictures people have taken of children that are being treated as sexual.
The problem with an outcry over this is that we would effectively have to ban all pictures of people under the age of 18. If you photograph a tree and post it online, somewhere someone will treat it as sexual.
We are approaching the point where thought crimes don't even need to be the thoughts of the accused.
Re: (Score:2)
How dare you post links to smut like that! /sarcasm
Should have sent links to the authorities, not FB (Score:1)
Why in the world did they send actual images?
Re:Should have sent links, to the authorities not (Score:5, Informative)
Because Facebook said that before they'd grant an interview, they wanted some examples of the material they'd failed to remove.
not the first time (Score:5, Interesting)
There have been journalists who tried to cover this beat before and been charged with child pornography and sent to jail. [cnet.com] Depending on who the prosecutor is, this is the untouchable story. There is no safe harbor when it comes to kids and sex.
Re: not the first time (Score:3)
There is no safe harbor when it comes to kids and sex.
History tells us there's no safe harbor when it comes to anything. Think of this like the Inquisition: a large, shady organization revealing its true colors while the rest of us attempt to derive what comfort we can by telling ourselves that "events" are bewildering and inexplicable... when, in fact, they're anything but.
Re: (Score:2)
Everything you said is correct, but the Electoral College is specified in the constitution. It's not based on the popular vote. Even the election of Senators was not originally based on the popular vote, because the founders didn't trust it.
FWIW, I believe that they were right not to trust the popular vote, but unwise in trusting the votes of the wealthy and powerful. There probably *isn't* a decent way to elect a government. A lottery might not be any better, but it probably wouldn't be any worse. It
Re: (Score:2)
A lottery is the best way, IMO, along with the provision that a background investigation needs to be done on the winner and if they have a history of really wanting to hold public office, they're not allowed to have the position.
Arthur C Clarke wrote a book called "Songs of Distant Earth" that had a society with exactly this system of choosing leaders.
You'd be likely to get people who were careless or unqualified, but look what we've been ending up with.
Exactly. Picking people at random simply cannot be wo
Re: (Score:2)
> You mean like how Jeff Sessions lied in front of Congress, about meeting the Russians?
He did *NOT* lie. He was asked if he *CO-ORDINATED THE 2016 ELECTION CAMPAIGN STRATEGY* with the Russian ambassador. He said "No". However, *IN HIS CAPACITY AS A MEMBER OF THE SENATE ARMED SERVICES COMMITTEE ON GOVERNMENT BUSINESS* he did meet with the Russian ambassador in 2016, as did other congresscritters.
While we're at it, Democrat Nancy Pelosi, minority house leader, met with the Russian ambassador http://www.polit [politico.com]
Re: (Score:2)
There is no safe harbor when it comes to kids and sex.
Due to the irrational hysteria surrounding child porn, the only safe harbor is to simply pretend you never saw it.
1) You can't report it to the company, because they will turn your attempt at fighting child porn into a potential life sentence in prison. So you can't trust the company to fight child porn.
2) You can't report it to the police, because they will arrest you for having viewed it, even (or maybe especially) if it was unwillingly. So you can't trust the police to fight child porn.
So basically, yo
Re: (Score:2)
But the U.S. District Attorney's Office in Maryland counters Matthews's defense that he was researching child porn--not collecting it--on grounds that the defense also could block the prosecution of pedophiles who have any journalism credentials.
God help us if any of these prosecutor types ever get access to nukes. They'd immediately launch them at their own cities to eliminate any chance of child porn or drug use there.
The BBC is not the FBI (Score:2)
The FBI is allowed to distribute child pornography. The BBC should have let the FBI handle this investigation.
Re: (Score:3)
I am also very sceptical of your claim that the "FBI is allowed to distribute child pornography". To what end? Entrapment?
Yes. A quick internet search will clearly show multiple incidents where the FBI has run dozens of child porn websites. Generally, this seems to have happened when they've taken control of an illegal site, and then they keep it running for months to try to catch users, but frankly it wouldn't surprise me at all if this were sometimes expanded to blatant distribution for entrapment purposes.
It brings up all sorts of questions, and I'm really not sure how one can justify it legally. In the U.S., the logic
Go to the police! (Score:3, Insightful)
Anytime you find kiddie images, you must immediately report them to the proper authorities or else you will face prosecution. Had the BBC not spent time trying to make someone look bad and instead reported these images to the police, the police would have then contacted Facebook who would have removed them in a timely manner.
Here's hoping the police imposed a lengthy penalty on said "journalist" for trying to manufacture outrage!
Re: (Score:2, Insightful)
(rtfa)
Re:Go to the police! (Score:5, Insightful)
I've been on the net for a very long time. Over a quarter century. I have never "come across" kiddie images. I have no idea how people seem to do this, or expect people to believe they just randomly came across child porn.
That said, if I did, I wouldn't report. I would format my system and disappear for a few years. If a teenage boy can go to prison and be labeled a sex offender because his teenage girlfriend sent him a boob shot then there is no way I could expect a fair trial should they decide to charge me with something.
Re: (Score:3)
I've stumbled upon them. Playing with freenet I followed a link that I wish I hadn't. In the early days of the file sharing networks I would get the occasional nasty download. Downloading binaries from usenet many years ago would occasionally yield an image that I would have liked to excise from my brain. All you can do is purge the file the best you can and hope you weren't being set up by some kind of a sting operation.
Re: (Score:3)
An acquaintance of mine has been sitting in County for five months so far for this with no trial, just a two minute pleading of not guilty in front of a judge.
He selected all of the top downloads on emule or something like that and one video happened to be a plant.
No other priors, no evidence of actively searching this out. He said he didn't even know he had downloaded it because he hadn't gone through the files yet.
I do believe him, and what we've heard from the public defender corroborates his explanatio
Re:Go to the police! (Score:5, Informative)
Far scarier than a letter from the copyright police. Those incidents all changed my behavior.

Like your acquaintance, I would just select all the files that matched a search term and download them without a thought, then come home from work and see what treasures I'd downloaded. Once I got some questionable pictures or videos, I was a lot more discerning.

I used to run a usenet picture extraction tool - it would comb through usenet looking for - ahem - good pictures and download them to your hard drive. I stopped using that tool once it dutifully filled a directory with horrible images. The scary thing about that one is that I didn't even notice the folder for a couple of weeks or months... sometimes not having backups is a good thing!

Freenet works by downloading the content on to your own PC. I get that you have a reasonable defense if you don't know what the encrypted files on your drive are... but once I visited the bad freenet site, I could no longer credibly say that I didn't know what was in the encrypted files. I stopped running freenet. Scary stuff, indeed.

We've already started teaching our kids not to take any pictures of themselves or their friends naked or in revealing outfits, not because it is morally wrong but because prosecutors have demonstrated a willingness to use "child protection" statutes against children.
Re: (Score:2)
People have emailed such images to me from disposable accounts. Didn't even notice until one day I was checking spam messages when I was expecting something important to arrive.
The same IP addresses tried to get into my Mediawiki install, presumably to post the same stuff on my web site. Fortunately I had 2 factor auth turned on.
Re: (Score:2)
I have, twice.
Once when someone thought it would be funny to send a link to one of the sales staff that flooded his screen with porn popups. He called for me, I walked up and yanked the power cord out of the back of his machine figuring that cleaning up after an unclean shutdown was better than having to look at the pictures.
And once when some kiddy porners were using my employer's top list system to evade website shutdowns and I found it completely by accident when browsing the lists..
Didn't take long f
Re: (Score:3)
I've been on the net for a very long time. Over a quarter century. I have never "come across" kiddie images. I have no idea how people seem to do this, or expect people to believe they just randomly came across child porn.
If you've been watching porn on the Internet, there's a high probability you've seen someone underage. Every so often there's a scandal about teens faking their way into adult porn, and there's probably more going under the radar; in amateur porn and sexting there are no age checks. It doesn't help that the adult industry hires "barely legal" 18yos who look more like 14yos, so if you see an actual 14yo it probably looks exactly like what you're used to. I remember there was a case here in Norway where the ex-bf o
Re: (Score:2)
> I have never "come across" kiddie images...That said, if I did, I wouldn't report.
This depends on where you live, and how much empathy you have vs. fear of persecution.
I have 'come across' kiddie images, in the early days of the Internet when I'm pretty sure cops didn't even know what the Internet was. Did I report? Hell no.
Today, however, I'm older, I have kids of my own (which really ups the empathy), and I am mature enough to admit I was looking for porn and found something illegal while doing so,
Re: (Score:1)
I know it's passe to actually RTFA, but they (1) pressed the report button on 100 images (of which it sounds like a lot were of the "xxx schoolgirls" sort, along with "a [presumably legal] image of a 16 year old with obscene comments beside it"), (2) only 18 were removed, and (3) the rest were (according to Facebook) acceptable by "community standards". So, when they got around to sending the imag
code words (Score:5, Informative)
"Sexualized images" is popo codeword for a teen selfie in a bikini, or in lacey panties with the butt facing the camera... probably with duckface. If it were real porn or nudity they would use different phrasing. "Sexualized images" simply means normal pics with a "sexual" connotation, something all teens love to do.
Re: (Score:2)
I do. I mean, of course a large country has a diverse population and no random person there is personally responsible for the worst products of that country, but collectively, countries need to own up to the evil things they've produced, and for the US that's things like the Trail of Tears, the WWII internments, the Kent State Massacre, and Facebook.
Excessive Censorship can't solve this problem (Score:2)
Many of the pictures, taken at face value, aren't sexually explicit. They are only being interpreted that way b/c certain pervs are sexually attracted to them. Short of creating a prison state, we will never get rid of this via criminalization. Trying to censor everything these people might use to satisfy their distorted sexuality is a fool's errand, and will likely drag a lot of other people down needlessly as well. Consider by analogy Muslim countries that have gone so far as to force all women to wear veils
Significant cultural differences between US and UK (Score:2, Insightful)
The common language makes it easy to overlook significant cultural differences between the UK and the US. I believe one of them may have played a role here. In many places there is a tendency for mere technical questions of process to make people lose sight of the original sense (purpose and intent) of rules. (This is not one-dimensional. As a German I keep being surprised by how the balance works in the UK. Sometimes the balance in the UK is much more on the procedure side than I am used to, and sometimes
Google teenager in bikini (Score:1)
pot meet kettle (Score:4, Insightful)
"It is against the law for anyone to distribute images of child exploitation." ...said the company responsible for hosting and making availble those images.
Re: (Score:2)
distribute images of child exploitation
Interestingly enough this story doesn't appear to be about any child exploitation.
But FB is good at removing images of nude statues (Score:4, Informative)
Images of fine art in museums get deleted, but....
Meanwhile, post a link to an Oglaf strip... (Score:4, Insightful)
My work internet blocks Oglaf, so I can't find and link the strip in question, but a couple weeks ago my partner put the strip up on her Facebook page. A day later, the strip had been taken down because it was 'offensive'. It's a cartoon, and the punchline was basically that a guy fucks lemons. Woo. It's NSFW, I guess, but it involves two adults and lemons. It's really no big deal, and it's pretty funny.
I have friends that are models. Heaven forbid they show even the barest bit of nipple. Sometimes it doesn't even take that much. They have pictures taken down and temp-bans put on them.
So my question is who are they employing to scan these images, and why do they find partially clothed women more offensive than pictures of exploited kids?
Re: (Score:1)
- Using the word "children" to refer to "teenagers"
- Using vague but suggestive phrases like "sexualized images" to refer to any sexually suggestive aspect of the image
Thus "teen girl puts on pushup bra, aims camera at her face and boobs, takes picture" becomes "sexualized images of children".
It was even more egregious (Score:2)
The whole story was on BBC Newshour this morning:
After finding these images were still on FB, the BBC asked for an interview with FB to get an explanation of why this was still going on. FB then asked them: please send us the images you want to talk about, we want to see them before we say anything.
The BBC complied with that request. And then got reported.
So FB ASKED them for the images, which were already present on FB. And instead of doing anything about the images, they went to the police with them.
FB, you
Re: (Score:1)
Slashdot, we get that you have to cycle through your daily Facebook and Uber jones, but do you think maybe the Vault 7 story might just get a little air time here?
It's on the front page, now. 0 comments. Go wild, sport.
Re: (Score:2)
Golly, you have issues.
The issues have been reported as Playboy, Hustler, and Penthouse.
Re: (Score:1)
Because Slashdot isn't Putin's bitch. Wikileaks pretty much destroyed all of its credibility when it decided to get a fucking Russian-controlled Fascist elected as US president, because of a rumor that his opponent made a joke about "droning" Julian Assange. Fucking sell-out snowflakes.
Re: (Score:1)
Your link does not contain a single mention of "America", "USA" or "US", they do not ask the question you posted or anything similar, and "many did want it (Sharia law) in the US" is completely unsupported and unrelated to the article you linked.
The "Facebook censored me" is just complete bullshit, repeated again and again, usually in Facebook posts that state that "Facebook banned my post", but somehow always include the supposed banned post.
Facebook may have banned it if you preceded it with a stream of r