Facebook's Sheryl Sandberg On 'Napalm Girl' Photo: 'We Don't Always Get it Right' (theguardian.com) 196

Facebook will learn from a mistake it made by deleting a historic Vietnam war photo of a naked girl fleeing a napalm attack, said Sheryl Sandberg, the company's chief operating officer. The photograph was removed from several accounts on Friday, including that of the Norwegian prime minister, Erna Solberg, on the grounds that it violated Facebook's restrictions on nudity. It was reinstated after Solberg accused Facebook of censorship and of editing history, The Guardian reports. From the article: "These are difficult decisions and we don't always get it right," Sandberg wrote in a letter to the prime minister, obtained by Reuters on Monday under Norway's freedom of information rules. "Even with clear standards, screening millions of posts on a case-by-case basis every week is challenging," Sandberg wrote. "Nonetheless, we intend to do better. We are committed to listening to our community and evolving. Thank you for helping us get this right," she wrote. She said the letter was a sign of "how seriously we take this matter and how we are handling it."
  • by wierd_w ( 1375923 ) on Tuesday September 13, 2016 @03:16PM (#52881239)

    Seriously-- You got caught red-handed being censorship-loving fuckwits who refuse to accept community feedback on policy decisions; naturally, you got your asses handed to you over it, and now you want to cuddle back into good graces so you can once again start dishing out your authoritarian horseshit once this blows over.

    Fuck you.

    (and for the people with the usual "Their service, their rules!" attitudes, fuck you idiots too. Facebook has maneuvered itself into the position of a major gatekeeper between the press and their readers. That is what caused this whole censorship issue to explode like this in the first place. Once you start acting like a monopoly, or at least the major stakeholder in a position society depends on, you stop being allowed to have authoritarian control, and need to be more civically minded.)

    • by The-Ixian ( 168184 ) on Tuesday September 13, 2016 @03:25PM (#52881317)

      Don't use FB... problems solved.

      FB is far from a monopoly and they are not a charity. They are a for-profit company and they can run their company any way that they see fit. If you don't like it, vote with your feet: uninstall the app, delete your account and walk away. If enough people do this, then FB will not have the power it has now. FB is only as powerful as you make it.

      • Re: (Score:2, Interesting)

        by OzPeter ( 195038 )

        Don't use FB... problems solved.

        FB is far from a monopoly and they are not a charity.

        FB has just shy of 24% of the entire Earth's human population connected to it. Do you want to revisit your idea that they are not a gatekeeper in social media?

        For comparison, Twitter only has accounts for about 4.3% of the Earth's population.

        • by tnk1 ( 899206 )

          That said... am I really going to lack for knowledge of certain historical photos or current events even if FB does censor them? I knew of that particular picture for decades before FB came out. The reason is that it was plastered all over the place in journalism, in art, and even in one or two music videos that I can recall. Not to mention, you know, history books.

          I agree that they tend to box you into your own little echo chamber if you let them, but I am frequently annoyed and even somewhat offended by what people sometimes post on my FB.

          • by Calydor ( 739835 ) on Tuesday September 13, 2016 @05:19PM (#52882145)

            You say you knew about the picture for decades before Facebook existed. That's great. That's how we learned about things BACK THEN.

            The world has changed. You may not like it, I certainly don't like it, but when a sizable portion of the population only really visits Facebook and relies on Facebook for news, events, all that sort of stuff, ANY kind of censorship is getting dangerously close to revisionism.

            There is not much difference between the main source of information, be it BREAKING NEWS! or cat videos, saying "There was no naked Vietnamese girl running from a napalm attack" or saying "There was no protesting student run over by a tank on Tiananmen Square".

            Be very careful what you allow.

          • Do what I did (if you still can): block everyone from posting to your wall, timeline, or whatever they call it this week. If someone wants to chat with you they will have to message you. If you want to talk to someone, send them your email, and then talk without having to worry about your conversation ending up in a Facebook ad. Use the connections Facebook provides without really using it.
          • by Askmum ( 1038780 )

            I agree that they tend to box you into your own little echo chamber if you let them, but I am frequently annoyed and even somewhat offended by what people sometimes post on my FB.

            Then don't let people post on your FB. Even though the principle is no different than sending you offensive letters, on FB I believe you can at least block people from posting on your page.
            But in principle: you don't get it, do you? This is not about recognizing that this is a historic picture, this is about the fact that if this were a current news picture, it would be censored by Facebook. Facebook would ignore it, deny it, say this never happened. And therefore American attacks on Vietnamese villages never happened.

        • Yeah, about that active user number.
          https://www.google.com/trends/... [google.com]
          Granted, some of that downwards trend might be due to people using the native app instead. However, so far, there are no examples of social networks with a double peak. They go up, and they come down. Facebook is bigger, but not a special snowflake in that regard. It will fade.
      • Think again, Potsy: "News organizations are uncomfortably reliant on Facebook to reach an online audience. According to a 2016 study by Pew Research Center, 44% of US adults get their news on Facebook." https://www.theguardian.com/te... [theguardian.com]
      • by swb ( 14022 ) on Tuesday September 13, 2016 @03:50PM (#52881495)

        These days when I go down to the public square to stand on my soap box and make my voice heard, the public square is empty.

        The public, which used to mostly be reachable via the public sphere, has all moved into spaces which are privately owned and publicly accessible for commerce, but not publicly accessible for free speech.

        This is the problem with the "go somewhere else" argument. There is nowhere else.

        • If the public is filled with idiots, they will get the government they deserve.

          Exhibit A: I give you Trump vs Clinton. Don't look at the speech issuing forth from the fans of either side and tell me they deserve something different.
        • by AthanasiusKircher ( 1333179 ) on Tuesday September 13, 2016 @08:09PM (#52882911)

          This is the problem with the "go somewhere else" argument. There is nowhere else.

          This is the disturbing nature of reality -- or perhaps SURreality? -- these days.

          I remember suffering through reading Jean Baudrillard's [wikipedia.org] musings about "simulacra" decades ago, when he famously published a set of essays including "The Gulf War Did Not Take Place" [wikipedia.org]. Of course, Baudrillard understood that the war DID take place, but he argued that the media portrayals and 24-hour news cycle that emerged had created an almost separate reality.

          Long before "The Matrix," Baudrillard talked about constructed reality and its ability to deceive and to woo humanity into complacency. But of course he is a horrendous writer and has rightly been ridiculed for willfully obscure nonsense, and at the time I dismissed what little sense I found as po-mo BS.

          Alas, now it feels it has all come to pass, and I think of good ole Baudrillard with each year's new trends into the depths of the simulacrum. Encyclopedias and reference works have ceded their authority to wikiality and truthiness, a la Stephen Colbert. Investigative journalism has been replaced by Facebook and Twitter posts. Most people live within the simulacrum, rarely bothering to try to dig deeper and see whether all of this mediated experience actually corresponds to the real world.

          And now we've delegated the authority once possessed by CNN and such to the mob of folks on Facebook. In some cases, this has undoubtedly been a good thing -- bringing a fresh democratic voice to things the "old" media would have never bothered with. But it's also a huge problem, since basic quality vetting, fact-checking, etc. are rarely done by the mob before they retweet, like, and repost.

          But that's the "reality" we live in now. Rather depressing. It would not surprise me one bit if this led to a new "dark age" as facts become less important than "likes."

        • by AmiMoJo ( 196126 )

          Before the internet it was TV and radio, which are heavily controlled and content restricted in most places. Before that it was newspapers, content decided by the editor/owner. Before that you had to compete with the other people on their soap boxes in the square, and if there were too many the police would tell you to come back tomorrow.

          No one owes you a platform.

          • by swb ( 14022 )

            TV & Radio were heavily regulated and we even had the Fairness Doctrine which *required* media companies to broadcast contrasting viewpoints. In the Fairness Doctrine era, the vast majority of top 25 TV markets had maybe 5 commercial television stations.

            So even though the television stations played games with timeslots and formats, they did give up air time to competing views, and it's kind of astonishing to think that in a given broadcast area a competing view being aired on one station literally rep

            • by dbIII ( 701233 )

              TV & Radio were heavily regulated and we even had the Fairness Doctrine which *required* media companies to broadcast contrasting viewpoints. In the Fairness Doctrine era, the vast majority of top 25 TV markets had maybe 5 commercial television stations.

              That was a reaction to earlier deliberately biased material like the Hearst newspapers. Fox tries to do something similar but the scale is vastly different - Murdoch doesn't seem to have ever had anyone framed or beaten up while Hearst had a reputation

            • by AmiMoJo ( 196126 )

              When you say "competing view", you mean "mainstream competing view", for the most part. We are far better off with the internet in that respect. I doubt things like the "manosphere" would exist if it was down to TV to air those views. Aside from not being mainstream, it would have been balanced by opposing views and thus unable to breed in an echo chamber like it does now.

              If you want the fairness doctrine for YouTube and Facebook, just say so. What we have now is better, though.

      • Don't use FB... problems solved.

        Because everything is just that simple. Never give feedback, never take feedback, and if you don't like the color of their webpages, go away.

        The problem with your simplistic view of life is that people often actually like feedback.

        And when we get to altering historic photographs, it gets a little into the area of politics.

        Part of the horror of that photograph is that a little kid gets napalmed, her clothes burnt off, and someone is worried that some folks want to fuck her. That's sick on so many levels.

        • by aevan ( 903814 )
          Not just that. Some games and products use Facebook as their contact/login system. If you want to report a bug, get support, or even post on their forums? You'd best have a Facebook account, even if it has fake details and is never used except for this purpose. Granted, that's a statement more about the companies involved than any malfeasance on Facebook's part, but still a point towards having to use it.
      • > Don't use FB... problems solved.

        Ignoring a problem doesn't make it go away.

        If FecesBook does dumb shit like censorship then, according to your "solution", they would never get feedback on their stupid policy, since apparently everyone would stop using it. Let's pretend that you're right and that everyone left FB. All you've done is moved the goalposts. With the _next_ social platform, everyone would eventually have the _exact_ same problem.

        The line must be drawn somewhere, eventually.
        Or as the colloquia

    • by 110010001000 ( 697113 ) on Tuesday September 13, 2016 @03:34PM (#52881371) Homepage Journal
      I think you are forgetting something: Facebook didn't do this on purpose. They don't hand-screen posts. It is an algorithm. It detected a naked child so it got flagged. No one should be using Facebook anyway.
      • by The-Ixian ( 168184 ) on Tuesday September 13, 2016 @03:42PM (#52881433)

        True, but in this case, it was manually removed by a FB drone after being flagged by another user as inappropriate.

        Which is fine. The drone, just like the algorithm, is just doing their pre-programmed job.

      • Another pointless post. RTFA:

        "In his open letter, Hansen points out that the types of decision Facebook makes about what kind of content is promoted, tolerated, or banned – whether it makes those decisions algorithmically or not – are functionally editorial.

        “The media have a responsibility to consider publication in every single case,” he wrote. “This right and duty, which all editors in the world have, should not be undermined by algorithms encoded in your office in California.”
        • by jratcliffe ( 208809 ) on Tuesday September 13, 2016 @03:51PM (#52881499)
          Except that it also has to be feasible. We're talking about tens, if not hundreds, of millions of pictures a day. Any screening process (and there has to be a screening process) is going to occasionally have a false positive.
          • by wierd_w ( 1375923 ) on Tuesday September 13, 2016 @05:45PM (#52882301)

            In which case, the appropriate action for Facebook to take is to have a human review the image once the poster disputes the takedown, and to act sensibly, rationally, and in a non-authoritarian manner.

            You know, NOT telling the journalist that the image is infringing without any room to contest. NOT taking down the journalist's open letter about the improper takedown, and NOT deleting the PRIME MINISTER'S post about it, while pretending that doing those things is all hunky-dory.

            You know, NOT the way Facebook chose to handle this, and is now trying hard to spin its way out of being caught red-handed doing, and being publicly shamed for.

        • I don't care what "Hansen" says. Facebook isn't going to hand edit anything. They have no responsibility at all. They are just a crap company spewing bshit.
        • by HiThere ( 15173 )

          OK, then make them criminally liable for criminal posts. They shouldn't be able to have it both ways. Either they don't censor, or they are liable for posted content. Since they are clearly censoring, they should be liable. Since they aren't being properly prosecuted, harsh public criticism is a lot less than they deserve.

    • Re: (Score:3, Insightful)

      by Kjella ( 173770 )

      I think it's a lot simpler than them being "censorship loving fuckwits": they're a private company trying to make money. You don't make money if you need an f...ing lawyer to check for rules and exceptions and exceptions to the exceptions and any applicable precedents as to whether or not a particular post is permitted. It used to be as simple as no nudity, not because there's anything wrong with tits and ass but that's not what it's about. Then they started having issues because people posted about brea

      • by wierd_w ( 1375923 ) on Tuesday September 13, 2016 @04:14PM (#52881673)

        There is a problem with your argument:

        The censorship did not stop at just the image. A public open letter to Facebook about this issue by a freaking prime minister was deleted.

        Censorship. The real deal.

        • by Kjella ( 173770 )

          There is a problem with your argument: The censorship did not stop at just the image. A public open letter to Facebook about this issue by a freaking prime minister was deleted.

          No, it stopped at the image: the public open letter included the same uncensored image, saying Facebook was wrong to censor this photo. Facebook removed that post; she reposted it with a censored image. I have Norwegian sources to back that up if that got lost in translation. It's part of the problem of complaining about Facebook on Facebook: showing what's being removed as a violation of the guidelines is in itself a violation of the guidelines.

      • This sounds like a long-winded way to say that no company of any size can have culpability, because they're all just doing their job trying to make money. The people affected by such actions are too simple to understand the nuance of why things occur.

        There is a fine line between making money and being a corporate sociopath.

        (And now, for the record, you can find a naked 9yo on Facebook.) Correction: you can find an iconic portrait of a 9 YEAR old who had just been napalmed and spent the next 17 months in a

      • by dryeo ( 100693 )

        The 1st amendment is very simple. It does not say that you should have endless free speech; it just says that Congress can't pass any laws limiting speech. Nothing about a town passing a noise by-law (something that seems fine as long as it treats everyone the same). Nothing about the courts not being able to limit speech, at a time when many things were illegal even without statutes being passed. Serious threats, slander, libel, fraud, perjury were all illegal under the common law and didn't need Congress

    • by guises ( 2423402 )
      I'm not sure that blaming Facebook is the right call for this particular censorship. The picture was removed for containing underage nudity, which is accurate. The fact that society and the law have become so very intolerant of anything even remotely related to sex and children is the root of this, and napalm girl isn't the first time [wikipedia.org] it has come up.
      • A situation that Facebook, in its majority stakeholder position, does precisely nothing to correct, and instead gives complicit support for.

        I definitely will blame them for that.

    • by jaa101 ( 627731 )

      Once you start acting like a monopoly, or at least the major stakeholder in a position society depends on, you stop being allowed to have authoritarian control, and need to be more civically minded.

      "Need to" and "legally required to" are two different things, or is there some law you're aware of enforcing this principle? Any jurisdiction will do.

      • Does law of the jungle count?

        When you make your customers leave, you don't get money, and you die.

        Facebook is aware of this, but does not wish to change its behavior. That is why it is smearing pablum all over this issue, and trying to kiss and make up, cheater style. (Cause they really love you. Honest.)

      • by HiThere ( 15173 )

        The argument is about whether they should be criticized, not about whether their actions were legal. If you want to talk about legal, then you need to consider "under the laws of which country". And would it be proper for a country to ban Facebook for doing this in favor of a local company which refrained? If not, why not?

        You don't get into such things if you are just considering whether they should be criticized, and I feel that they should be severely and repeatedly criticized. I'd be willing to consi

    • Services like Facebook are so huge that they constitute the de facto "public square" [wikipedia.org] of the Internet. As such, I think that all "public place" rules (such as the protections of the First Amendment) should be forced upon them. Non-participation in a site as huge as Facebook makes you a non-participant in modern society. [youtube.com] Unfortunately, this also means that participation in digital society is heavily controlled by whoever controls Facebook, and if they disagree with you or don't like you for any reason, they c
  • by JoeyRox ( 2711699 ) on Tuesday September 13, 2016 @03:25PM (#52881313)
    Stop trying to "get it right". You're not the arbiter of art or journalism. Just stick to what you do best - monetizing people's privacy.
    • No, but they are the arbiter of their own service. (You are not.)

      They are also forbidden by law from hosting child pornography.

    • "Getting it right" depends on what they're trying to get right. TheIntercept.com tells us "Facebook Is Collaborating With the Israeli Government to Determine What Should Be Censored [theintercept.com]" and Glenn Greenwald told us about these problems before as did Richard Stallman and Eben Moglen before Greenwald (the latter two rightly calling Facebook "a monstrous surveillance engine" and the like). As Moglen points out in every one of his speeches in the past few years (if not longer) that "Stallman was right". But back to

  • Phan Thị Kim Phúc (Score:5, Informative)

    by ClickOnThis ( 137803 ) on Tuesday September 13, 2016 @03:26PM (#52881321) Journal

    In case anyone wonders what happened to her, Phan Thị Kim Phúc (the girl in the photo) survived the napalm attack, albeit with injuries. She is now a Canadian citizen, living in Ajax, Ontario with her husband and two children. In 2015 she began getting laser treatments for her burn scars in Miami.

    https://en.wikipedia.org/wiki/... [wikipedia.org]

    • I wonder. If she had an issue with the photo and requested that FB take it down, would that be a different story?

      This is hypothetical, of course.

  • by cold fjord ( 826450 ) on Tuesday September 13, 2016 @03:27PM (#52881331)

    Her name is Kim Phuc and she now lives in Canada. She was fleeing a napalm strike by the South Vietnamese Air Force.

    How the Vietnam War's 'Napalm Girl' Is Finally Getting Her Scars Treated – 43 Years Later [people.com]
    The girl in the picture: Kim Phuc's journey from war to forgiveness [cnn.com]
    'Napalm Girl': An Iconic Image Of War Turns 40 [npr.org]

    The Kim Foundation International [kimfoundation.com]

  • Facebook is a private website dedicated to monetizing the information they have collected from the members of that private site. Facebook can do whatever they please on their private website.

    If there is a problem, it is those people who mistake Facebook for journalism or integrity.

  • by John Allsup ( 987 ) <slashdot&chalisque,net> on Tuesday September 13, 2016 @03:43PM (#52881441) Homepage Journal

    There need to be acceptable nudity policies. These should require users who upload photos with nudity to tag them as such, including whether sexual or non-sexual (the napalm girl is clearly non-sexual), and even pornographic (if there is a service that allows pornographic images). The rule then is that the uploader must apply certain tags if appropriate (e.g. non-sexual nudity), and so on. Then users have user settings on whether to block such tags, and if they see untagged images which should have been tagged, and which would have been blocked given their settings, then there is the 'inappropriate image' system. When it comes to sexual nudity stuff, if present at all, there should be checks on users. Then AI can flag possibly untagged images. This really ought to be well within what Facebook can do. In addition, with sensitive stuff (like the revenge porn stuff), there should be terms and conditions under which blatant cases like the one that European lawsuit is about can lead to details of uploaders being sent either to police or the victim's lawyers.

    The problem comes from trying too hard to have an idiot-proof, one-size-fits-all acceptable-image policy.
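
    As a very rough illustration only (hypothetical Python, nothing Facebook actually runs; the tag names, settings and helper functions are all made up), the core of such a tag-and-filter policy is just a bit of set logic:

        # Hypothetical sketch of a tag-based nudity policy.
        # Uploaders must declare exactly one nudity tag; viewers choose which tags to block.
        NUDITY_TAGS = {"nudity:none", "nudity:non-sexual", "nudity:sexual", "nudity:pornographic"}

        def upload_ok(declared_tags):
            """Accept an upload only if it carries exactly one nudity tag."""
            return len(declared_tags & NUDITY_TAGS) == 1

        def visible_to(image_tags, viewer_blocked_tags):
            """Show the image only if none of its tags are on the viewer's block list."""
            return not (image_tags & viewer_blocked_tags)

        def needs_review(image_tags, reported_as):
            """A report goes to the 'inappropriate image' queue only if the image
            lacks the nudity tag the reporter says it should have carried."""
            return reported_as in NUDITY_TAGS and reported_as not in image_tags

        # Example: the napalm-girl photo tagged as non-sexual nudity,
        # viewed by someone who blocks only sexual content.
        photo = {"nudity:non-sexual", "historic"}
        viewer_blocks = {"nudity:sexual", "nudity:pornographic"}
        assert upload_ok(photo) and visible_to(photo, viewer_blocks)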

    • The same AI they used to recognize naked and frightened children could just as easily fingerprint famous images (actually it would be simpler). It would be pleasant if people had thicker skins.
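
      For what it's worth, here is a rough sketch of the kind of fingerprinting meant here: a simple perceptual "average hash" in Python (purely illustrative; it assumes the Pillow library is installed, and the iconic-image file name is made up):

          # Hypothetical sketch: whitelist known iconic images by perceptual hash.
          from PIL import Image

          def average_hash(path, size=8):
              """Shrink to a small grayscale thumbnail and record which pixels exceed the mean."""
              img = Image.open(path).convert("L").resize((size, size))
              pixels = list(img.getdata())
              mean = sum(pixels) / len(pixels)
              return sum(1 << i for i, p in enumerate(pixels) if p > mean)

          def hamming(a, b):
              """Number of differing bits between two hashes."""
              return bin(a ^ b).count("1")

          # Hashes of iconic photos (napalm girl, tank man, ...) precomputed offline.
          ICONIC_HASHES = [average_hash("napalm_girl_1972.jpg")]  # hypothetical file

          def is_iconic(path, threshold=5):
              """Near-duplicates of a whitelisted image differ by only a few bits."""
              return any(hamming(average_hash(path), known) <= threshold for known in ICONIC_HASHES)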

        • This wasn't an AI. It was a non-AI. Facebook uses an army of human moderators. If you click the button to report inappropriate content, they are the ones who inspect it.

        Some minimum-wage drone had this flash on their screen, checked it against their list of forbidden material, and ticked 'child with exposed genitalia' or something like that.

    • by HiThere ( 15173 )

      Why do there need to be "acceptable nudity policies"?

      I generally consider people who worry about nudity to be sick, and I wouldn't object if they got free medical treatment for their problem (except I don't think there is any accepted medical treatment).

      I also consider those who support most censorship to be sick, with a similar comment. There are a few cases where public safety does indicate that censorship is desirable. E.g., people's bank account numbers, instructions on how to weaponize anthrax, etc.

    • by AmiMoJo ( 196126 )

      Maybe we should just get away from nudity being an issue. The more we try to hide it, the more it gets fetishised.

  • by clovis ( 4684 ) on Tuesday September 13, 2016 @03:45PM (#52881459)

    What happened at Facebook was a mistake, but I would have made the same mistake.

    If I owned Facebook, I would have a censorship policy. No naked children would be near the top of the list. It might even be the only thing on it.
    I'm certain that most of the photos of naked children in existence are perfectly innocent. I have some of my kids and my parents have some of me.

    But I don't want to host child porn, child rape, or anything like that. It's a plain and simple fact that there are people who abuse children in horrible ways, and if I didn't censor that kind of thing it would be all over the place. I don't give a shit if the law says it's OK for me to host it; I don't want to be part of it.
    And you know what else? I don't want to have to examine photos of naked children to try to guess what's going on.
    So. No naked children.

    So all my minions would know this and censor publication of the Kim Phuc photo because they want to keep their jobs and perhaps because they agree with me.

    And then the world would come down on me over the Kim Phuc photo, pointing out that I'm being a dumbass and that this is so very clearly an important and historical photo, and I'd relent because in this case they're right and I'm wrong. But no way would I roll over for just anyone out there - it would take a lot of pressure for a specific case.

    • According to The Intercept, they are collaborating with the Israeli government to decide what should be censored. The next time the censorship subject comes up about Facebook, it won't be about naked children. Justifying censorship as an algorithmic decision absolves no blame.

  • I'm sure it will be fine after you control all the news: https://www.yahoo.com/news/fac... [yahoo.com]
  • Facebook hardly ever gets it right, whatever it does. Facebook cannot meaningfully be associated with "right" within any moral framework worthy of that name.

  • If they had kept their strong policy here, there would have been a general discussion about the policy. So they admit a small failure, allow the image, and life goes on. Nobody needs to discuss the policies any further, because the image is there, isn't it?
