Facebook's Success Was Built on Algorithms. Can They Also Fix It? (cnn.com)

Experts tell CNN that Facebook's algorithms could be improved. "It will, however, require something Facebook has so far appeared reluctant to offer (despite executive talking points): more transparency and control for users." Margaret Mitchell, who leads artificial intelligence ethics for AI model builder Hugging Face and formerly co-led Google's ethical AI team, thinks this could be done by allowing you to view details about why you're seeing what you're seeing on a social network, such as in response to the posts, ads, and other things you look at and interact with. "You can even imagine having some say in it. You might be able to select preferences for the kinds of things you want to be optimized for you," she said, such as how often you want to see content from your immediate family, high school friends, or baby pictures. All of those things may change over time. Why not let users control them? Transparency is key, she said, because it incentivizes good behavior from the social networks.
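To make Mitchell's suggestion concrete, here is a minimal sketch of what user-controlled feed ranking and "why am I seeing this?" explanations might look like. The category names, weights, and engagement scores below are purely illustrative assumptions; Facebook's actual ranking system is not public.

```python
# Illustrative sketch only: a toy feed ranker where the user, not the platform,
# sets how much each kind of content is boosted. Category names, weights, and
# the engagement score are hypothetical, not anything from Facebook.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    category: str            # e.g. "family", "high_school_friends", "ads"
    engagement_score: float  # platform's own prediction of how "engaging" the post is

def rank_feed(posts, user_weights, default_weight=0.5):
    """Order posts by user preference first, predicted engagement second."""
    def score(post):
        return user_weights.get(post.category, default_weight) * post.engagement_score
    return sorted(posts, key=score, reverse=True)

def explain(post, user_weights, default_weight=0.5):
    """The transparency piece: spell out why a post was shown."""
    w = user_weights.get(post.category, default_weight)
    return (f"Shown because you weight '{post.category}' at {w:.1f} "
            f"and the predicted engagement is {post.engagement_score:.2f}.")

# Example: a user who wants lots of family posts and almost no ads.
weights = {"family": 1.0, "high_school_friends": 0.6, "ads": 0.1}
feed = rank_feed(
    [Post("mom", "family", 0.4),
     Post("brand", "ads", 0.9),
     Post("old pal", "high_school_friends", 0.7)],
    weights,
)
for p in feed:
    print(explain(p, weights))
```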

Another way social networks could be pushed in the direction of increased transparency is by increasing independent auditing of their algorithmic practices, according to Sasha Costanza-Chock, director of research and design at the Algorithmic Justice League. They envision this as including fully independent researchers, investigative journalists, or people inside regulatory bodies — not social media companies themselves, or companies they hire — who have the knowledge, skills, and legal authority to demand access to algorithmic systems in order to ensure laws aren't violated and best practices are followed.

James Mickens, a computer science professor at Harvard and co-director of the Berkman Klein Center's Institute for Rebooting Social Media, suggests looking to the ways elections can be audited without revealing private information about voters (such as who each person voted for) for insights about how algorithms may be audited and reformed. He thinks that could give some insights for building an audit system that would allow people outside of Facebook to provide oversight while protecting sensitive data. A big hurdle, experts say, to making meaningful improvements is social networks' current focus on the importance of engagement, or the amount of time users spend scrolling, clicking, and otherwise interacting with social media posts and ads... Changing this is tricky, experts said, though several agreed that it may involve considering the feelings users have when using social media and not just the amount of time they spend using it.

"Engagement is not a synonym for good mental health," said Mickens.

  • No it wasn't (Score:5, Insightful)

    by ClueHammer ( 6261830 ) on Monday October 11, 2021 @07:37AM (#61879859)
    It was built on voyeurism! Everyone wanting to know what everyone else is up to all the time. No matter how trivial or boring!
    • Re: No it wasn't (Score:4, Insightful)

      by e3m4n ( 947977 ) on Monday October 11, 2021 @08:16AM (#61879933)
      Don't forget his first attempt, FaceMash, which was nothing but a site designed to rate people based on nothing but appearance. You know the age-old computing axiom: garbage in, garbage out.
      • Only facepalm turned out to be a giant garbage collection routine.
        • by Anonymous Coward
          Follow the money. Everything else is pointless teeth-gnashing that won't change a damned thing.
          • Gold mine.
            Facebook has found a goose that lays golden eggs; now it has to find a way to guard that goose even more.

    • Re:No it wasn't (Score:5, Insightful)

      by bigdavex ( 155746 ) on Monday October 11, 2021 @09:38AM (#61880113)

      I think it's more built on exhibitionism. Yes, people do LIKE other people's breakfast, but it's mostly in the hopes that their own breakfasts will be LIKED in return or to validate the other person, not because they want to see breakfasts.

    • by gweihir ( 88907 )

      It was built on voyeurism! Everyone wanting to know what everyone else is up to all the time. No matter how trivial or boring!

      Pretty much. And that _cannot_ be fixed.

    • Re: (Score:2, Insightful)

      by Joce640k ( 829181 )

      It was built on voyeurism! Everyone wanting to know what everyone else is up to all the time. No matter how trivial or boring!

      Nope. Completely wrong.

      It's built on people posting things and hoping that other people will like them for it, e.g. selfies.

    • It was built on voyeurism! Everyone wanting to know what everyone else is up to all the time. No matter how trivial or boring!

      I mostly agree, especially given some of Zuck's early comments about Facebook users. Nevertheless, I would instead say that its early success was based on people simply wanting to connect with others. I think people joined out of a desire to stay current with others when time and/or geography made in-person visits impractical. Then, its explosive growth was fueled largely by voyeurism - but I suspect even then the algorithms purposely encouraged that voyeurism.

      I love the idea of Facebook - I had an account

    • by hey! ( 33014 )

      To succeed, you have to outcompete others trying to do exactly the same thing. Facebook succeeded because its algorithms boosted site engagement, making it a more valuable [statista.com] property than, say, MySpace. So yes, algorithms are *a* cause of its success; but we have to bear in mind that in the real world, things have multiple causes.

      That pretty much leads to the answer to the question posed. In theory algorithms could fix what we are calling "Facebook's problems", but that won't happen until addressing those

    • You misspelt insecurity.

      A pseudo form of validation, because their own lives are too fucking boring, so they have to follow someone else's fake life in the hope that they will get noticed for 5 seconds.

      --
      Do you really want to use a website where the owner (Zuckerberg) calls his customers: They "trust" me. Dumb fucks.

  • Can They fix it? (Score:4, Insightful)

    by IdanceNmyCar ( 7335658 ) on Monday October 11, 2021 @07:41AM (#61879871)

    No. The first paragraph suggests they adopt a solution that destroys their bottom line. The second paragraph then proposes auditing, which isn't even them fixing it, but regulators stepping in.

    No, they won't fix it. As whistles blow and even leaders inside the corp ask for regulation to step in, it all says Facebook has no desire to fix it.

    • Of course things can be fixed, or at least massively improved upon. Let's open-source and regulate the algorithms in use. There's no justification for the status quo at this point. This has nothing to do with Section 230. Why not, unless you're invested in Mark Zuckerberg's camp and afraid of losing power and control?
      • by e3m4n ( 947977 ) on Monday October 11, 2021 @08:21AM (#61879945)
        You know how stocks work, right? Buy the rumor, sell the news. The minute anything you suggest happens, investors will start dumping their stock. There won't be much opportunity to experience 20% growth year over year. As stocks are sold off, the share price drops. It becomes a feedback loop until the shares are devalued to a fraction of what they are worth today. As a result, FB lays off 2/3 of their staffers. Thus begins the slow march toward oblivion. That's why they fight tooth and nail to avoid reform.
      • I am not trying to support any status quo. I am just saying they won't freely fix it.

        Open-sourcing the algorithm is pretty heavy-handed and is regulation, as I mentioned. Auditing is a milder approach, but proposing to confiscate their IP is rather extreme. Personally, abolition seems the best to me. "Nuke the entire site from orbit--it's the only way to be sure"

        • I am not trying to support any status quo. I am just saying they won't freely fix it.

          I generally work on the assumption that large corporations act selfishly -- they don't do things out of the kindness of their heart, because they don't have a heart, in the moral sense.

          Having said that, corporations can act for the good of society, when that benefits the corporation. If Facebook suffers a loss of business due to being perceived as evil by many people, then I can imagine they would put some effort into implementing the appropriate algorithms to make their service more appealing to users. How

      • Re:Can They fix it? (Score:4, Interesting)

        by DarkOx ( 621550 ) on Monday October 11, 2021 @09:22AM (#61880055) Journal

        This has nothing to do with Section 230.

        BS, it has EVERYTHING to do with 230. No, 230 did not make them pursue growth and engagement maximization through conflict generation, but it sure as hell enabled them to do so! 230 shields them from basically anyone harmed by their providing the megaphone to anyone wishing to make libelous statements.

        It might not be responsible for some of the other effects, like body image issues and envy and depression over seeing other people's 'apparently perfect lives', but not having to face the same legal liabilities and rules governing every other publisher is precisely how facebook got as big as it did without having the means, resources, and understanding needed to moderate its own platform and respond to the other issues.

      • by Zak3056 ( 69287 )

        Let's open-source and regulate the algorithms in-use

        Constitutionally, this would almost certainly be a "taking" that the owners would have to be justly compensated for. If you take the headline of this article for granted, "Facebook's success was built on algorithms", then it's a fair case that the value of said algorithms is a substantial portion of Facebook's value and annual revenue. Given that their market cap is almost a trillion dollars and annual revenue > $80 billion, I'd say "just compensation" probably has ten or eleven zeros in the number.

          • Constitutionally, this would almost certainly be a "taking" that the owners would have to be justly compensated for. If you take the headline of this article for granted, "Facebook's success was built on algorithms"

          Facebook claims to be first and foremost a way for you to keep in touch with other people but it is actually first and foremost a place to fuck with you in order to generate ad revenue. As such their whole business model is based on fraud and that makes it fundamentally illegal. Seizing and publicizing the tools they use for their illegal endeavor wouldn't require compensation, only proving that they acted with malicious intent.

    • IBM used to be in the business of making typewriters and other office tools. It was late to the game in the computer market. However, it quickly changed its strategy and business plan to become one of the classic computer companies that we all associate with computers. Even now, IBM is transitioning away from computer sales and support to rely more on its cloud and services business.

      If a company's bottom line is so fixed on one business model, it is doomed to fail.

      The algorithm was a good idea at first, but it

      • by Zak3056 ( 69287 )

        IBM used to be in the business of making typewriters and other office tools. It was late to the game in the computer market

        IBM was making computers in 1952.

          • And IBM was making computing tools before there were computers. The companies that would become IBM were making punch-card tabulators in 1884, and they were used in the 1890 US census.

      • I agree, companies need to diversify. Facebook's strategy has been more to eat up all competitors for social networks. This likely will fail in comparison to something like Amazon.

        As for Facebook being scummy, that's already happened. There just aren't better alternatives with critical mass...

    • by leptons ( 891340 )
      The same level of awareness that created a problem cannot be used to fix the problem. Facebook has a problem. Facebook is the problem.
    • by dan325 ( 1221648 )
      This is exactly right. I'm not sure people understand how fundamental engagement is to social networks. This is like asking Exxon Mobil to stop drilling for oil. They're an oil company. Facebook is an engagement company.
  • Pipe dream... (Score:4, Insightful)

    by bradley13 ( 1118935 ) on Monday October 11, 2021 @07:50AM (#61879881) Homepage

    "You can even imagine having some say in [what you see]"

    The fact that TFS says "you could even imagine" - as though this is something far-fetched? That's just sad. Of course users want to decide what they see, how it's ordered, etc. That's basic; it's not even a question.

    Equally of course: that's not what Facebook wants. Facebook makes billions by *not* doing what users want. So it's not going to happen.

    Eventually, Facebook will go the way of MySpace. There is an important role for government, though: Facebook (or any such behemoth) needs to be prohibited from buying up the competition. Above a certain size, a company should no longer be allowed to do any acquisitions or mergers.

    • There is an important role for government, though: Facebook (or any such behemoth) needs to be prohibited from buying up the competition.

      I have read in various places that economists believe that anti-trust laws have failed to prevent the creation of near-monopolies such as Facebook, Google, and Amazon. One problem seems to be that these companies do not engage in the usual monopoly behaviour, typified by greatly increasing prices, so superficially, they are not harming consumers. In the case of Facebook and Google, consumers do not pay anything directly, and goods from Amazon are generally very cheap.

      If Facebook is a monopoly, stifling comp

  • this could be done by allowing you to view details about why you're seeing what you're seeing on a social network, such as in response to the posts, ads, and other things you look at and interact with. "You can even imagine having some say in it. You might be able to select preferences for the kinds of things you want to be optimized for you,"

    Showing users details and letting them set preferences would require Facebook to either lose a lot of page views or show they have categories for stuff like white supr

  • by Rosco P. Coltrane ( 209368 ) on Monday October 11, 2021 @07:57AM (#61879899)

    on its users' inanity, credulity and inability to resist oh-shiny. None of that can be fixed. The only way for Facebook to get better is for Facebook to refrain from building a business that rests on people's stupidity and to stop addicting them to its slot-machine-like features - in other words, disappear.

    • What, Farmville ruined your life huh?

      Nah. Your points are spot on. Facebook gives a lot of people exactly what they want, which includes politically fueled dichotomic fits of rage, which is what set off this whole thing.

      I wonder what solution you would propose/support though?

      • Commercial social media has no solution, because ultimately, it always works against, not for, the society.

        Fits of rage get an order of magnitude more attention, and attention is what commercial social media is monetizing, one way or another. Remember how a bad review of a restaurant needs 10 good ones to counterbalance it? It's a similar mechanism.

        So whoever will feed more outrage to the masses will get to harness more eyeballs. And if we regulate that, whoever stretches the limits of regulation in that dire

      • Facebook gives a lot of people exactly what they want which includes politically fueled dichotomic fits of rage

        Never seen it put so well. But I think it's clear there are many reasons people use facebook and none of them are easily fixable: your point, voyeurism, egoism, the need for community and social contact in a modern society that offers little to none.

        Facebook is cancer and Twitter deserves its moniker as the "hellsite." But it's hard to see how to reform either unless society has a major awakening and Dorsey and Zuckerberg share a cell at the Hague.

    • The only way for Facebook to get better is for Facebook to refrain from building a business that rests on people's stupidity and to stop addicting them to its slot-machine-like features - in other words, disappear

      This would be similar to the situation of tobacco companies. We would all be better off if tobacco products had never existed.

      The trouble I have with that analogy is that Facebook can actually be quite useful, and I don't think I have been harmed by using it. By contrast, I am pretty sure I was harmed by smoking for years. The useful features of Facebook include being able to publish your ideas to a group of friends. It is considerably more convenient than email in this respect. My ideas tend to be philosop

  • by Dan East ( 318230 ) on Monday October 11, 2021 @08:00AM (#61879907) Journal

    As a fairly long-time FB user, the success most certainly was not due to algorithms. There was a time when most peoples' friends were actually people they knew outside of FB, and there was a more real and intimate connection between people. It really was used to re-connect and keep up with family, friends, schoolmates, past coworkers, etc. The majority of the content was actually created by regular human beings.

    Now everything has changed. Even the postings by real people are mostly regurgitated things they are simply re-sharing (please, no more memes). Instead of the majority of content comprising actual posts by friends, it is now a small minority. The majority of friends are now people one has never met outside of FB. That's because growth is paramount. FB wants you to network with more "friends", otherwise the platform has the appearance of stagnating.

    FB got its core following and its momentum in the early stages, and now it has been commercialized and monetized to the point that it is something else entirely. I suppose this is inevitable for any platform once it matures to a certain point, because the potential profits for turning the user base into a commodity to which it can show ads and promoted material are too vast.

    • FB got its core following and its momentum in the early stages

      ...when its purpose was to collect women's PII for the purpose of stalking them.

      • FB got its core following and its momentum in the early stages

        ...when its purpose was to collect women's PII for the purpose of stalking them.

        Hence the "Privacy Rapists" nickname I give Facebook.

    • For the most part, everyone hates the algorithm. And many people are annoyed at how Facebook keeps sorting posts with the most stupid on top instead of the newest on top. People jumped to Facebook from MySpace mostly because it gave a less crowded and more precise view of their friends and families, where they could keep tabs on what is new.

      Not because of the algorithm that gives us crap.

      • But FB has to keep on selling the idea of algorithms because that's what the advertisers buy. Admit that the algos are shit and there goes their revenue stream. Wait until the advertisers figure that out for themselves and the lawsuits start.

      • For the most part everyone hates the algorithm.

        The "everyone" you speak of are I would guess people of your social group, which is probably not the target of Facebook's algorithms. I don't think Facebook are so incompetent as to make their service annoying to users in general, but they can't please everybody. Pissing off a few nerds is I suppose a price worth paying, so all those ordinary folks can post pictures of their lunch.

    • by _xeno_ ( 155264 )

      As a fairly long-time FB user, the success most certainly was not due to algorithms. There was a time when most peoples' friends were actually people they knew outside of FB, and there was a more real and intimate connection between people. It really was used to re-connect and keep up with family, friends, schoolmates, past coworkers, etc. The majority of the content was actually created by regular human beings.

      Yep.

      FB got its core following and its momentum in the early stages, and now it has been commercialized and monetized to the point that it is something else entirely.

      I think people forget just how much regular users fought against this change too. There were entire browser mods designed to restore the original "newest first, posts from friends" view. And Facebook developed an entire web UI framework that randomly rearranges HTML and CSS to prevent them from working. People fought against Facebook turning their feed into algorithmic crap. No one really wanted it except Facebook.

      From the recent whistleblower stuff it sounds like Facebook is actually suffering from it.

    • These days Facebook has lifted a page from the spammers' playbook. Every new post of an identical meme now comes from a different entity. "Hide all" doesn't work any more when they repost under a new astroturfed user name and the same people keep sharing it mindlessly. The 'friends' that do this get snoozed for 30 days, especially with old political memes that are intended to stir up negative emotions about someone that thinks in a different direction than the poster. Snooze for 30 days, unfriend the actua
  • ... that when you're in a hole, the first thing to do is to stop digging?

    Facebook dug its hole by excessive trust in algorithms. They will not be the way out.

    An algorithm is essentially the same as a computer model. The weakness of each is that it codifies a set of assumptions and beliefs, usually with absolutely no way of learning from experience.

    That, you might say, is the fundamental difference between an algorithm or model and a nervous system - even the most primitive.

  • Can they fix it? I'm not entirely sure that they can.

    WILL they fix it? Only if it makes them more money than it will cost them to fix it.

    Given that the result of the fix is likely to be that people are less likely to maintain engagement, this is highly unlikely.

  • by jarkus4 ( 1627895 ) on Monday October 11, 2021 @09:05AM (#61880015)

    The core of the issue is that all kinds of social media connect people directly, bypassing any kind of social censorship. It works for both good and bad causes (classification often depends on your own position on a particular issue). Engagement-boosting algorithms mostly improve the discoverability of like-minded people and accelerate what would happen anyway. Then those people start reinforcing their beliefs, forming deeper convictions.

    The groups we now see as troublesome always existed on the fringe of society, but they didn't have the critical mass to really become noticeable. Now they can find each other regardless of geographical location and present their viewpoint to the masses, bypassing the societal censorship provided by traditional media.

  • ... for doing what people designed and configured the algorithms to do.
    • ... for doing what people designed and configured the algorithms to do.

      But aren't the algorithms "AI" and therefore responsible for their own actions? /sarc

  • by DarkOx ( 621550 ) on Monday October 11, 2021 @09:13AM (#61880023) Journal

    The fundamental problem here is facebook got better at being facebook, which is a tool to entertain people while you show them ads.

    It's literally no different than TV, movies, newspapers, etc., except in where and how they source content. Name one good movie that does not have an antagonist, even if the antagonist is some creation of the protagonist's imagination (man vs. self). Without conflict there is no story.

    Whether or not facebook execs understood it when they started down the path of adding more 'intelligence' to the algorithms that pick stuff for your news feed, they needed more conflict to drive more engagement. The old adage applies: be careful what you measure, because you'll get more of it. Facebook measured engagement and they got more engagement, period.

    I suspect the reality is facebook could tweak things to turn down the frequency with which it selects certain topics, news sources, memes, etc. - the result would be less conflict and less engagement. Turn it down as much as the 'chattering class' seems to think they should, and the result will be that it becomes a boring place where people post wedding, birth, obituary, and party announcements, and the occasional personal triumph like the rec baseball team winning the city championship, a new house, or a job promotion. People will check it about as often as they bother visiting linkedin.

    The reality is that for facebook to be successful they need to somehow get back to where they were in like 2012-14: lots of engagement, lots of ad revenue, lots of conflict, but not the bitter, alienating dog fight that gets too much of the wrong kind of attention. However, I am not sure they can get there now, because they have already garnered too much of the wrong attention. Even back then, the occasional maladjusted adolescent decided to get self-destructive and "social media" was cited as being partly responsible. The trouble they have at this point is that people don't say 'social media' when that happens anymore, they say 'facebook' or 'instagram', and people are sensitized to the story. When they hear it now, it's facebook's fault, because in their minds facebook is an evil corporation with a history of endangering children - response: "facebook bad!" Different from 10 years ago, when they did not have such a poor reputation: "facebook was the fun page they share vacation photos on" and that kid must have had some really bad parents/uncaring friends, etc. - response: "so sad, facebook was just the medium, though it's not at fault".

    • The fundamental problem here is facebook got better at being facebook, which is a tool to entertain people while you show them ads.

      Facebook is far more than an ad-slinger. As far as I can tell, what Facebook sell is detailed marketing data, because Facebook users freely give up personal information. Getting this quality and quantity of data by conventional means would involve large scale surveys, which ain't cheap.

      Though this is a bit off-topic, I do notice how targeted ads pop up all over the place. I am pretty sure one of the electronics distributors I use has sold my search data, because now I am flooded with ads for surface mount p

      • by DarkOx ( 621550 )

        Yes, they are more than an ad slinger, I agree. However, the need is the same: keep you on the site as much as possible so the ad impression numbers keep rolling over, and keep you interacting with facebook content, either on the facebook or instagram sites directly or through various web parts, so they can keep gathering data.

        It comes down to this: they need to keep you interacting as long and as much as possible. Their ML and data science have zeroed in on exactly how to do that. Which is to make you upset about stuff, eit

  • by StormReaver ( 59959 ) on Monday October 11, 2021 @09:15AM (#61880035)

    Of course Facebook's success was built on algorithms. All software is built from algorithms. An algorithm is simply a defined list of steps to take to get from Point A to Point B. Every piece of software ever written was built on algorithms. Even the standard Hello World starter program is built on an algorithm. The headline is a lot like saying that the bonfire's success was built on its heat. No kidding.

    Facebook's success was built on exploiting the knowledge that a great many people will actively harm themselves in exchange for knowing what other people are doing. And the need to know what other people are doing is driven by low self-worth (other people are better than I am). It takes a particular breed of sociopathy to exploit that widespread mental disorder and actively use it against those afflicted with it.

  • by Chris Mattern ( 191822 ) on Monday October 11, 2021 @09:24AM (#61880061)

    Because in Facebook's view, there's nothing to fix. It is enormously successful at its intended purpose: it makes them gobs of money.

  • by JBMcB ( 73720 ) on Monday October 11, 2021 @09:41AM (#61880119)

    I tend to post this every time Facebook's algorithms are questioned.

    They work fine. A few years ago I spent a few minutes every day clicking on "hide" and "show" on Facebook posts. Now Facebook gives me *exactly* what I want. Posts about architecture, old computers, and my friends' and family's kids being cute. That's what I want to see, that's what I told Facebook I want to see, and that's what I get. I don't get random political garbage. I don't get celebrity gossip. I don't even get ads for junk I don't care about. I have a couple of friends who post political junk. I unfollow them. Usually Facebook is smart enough to shove those posts down the list for me.

    The algorithm is fine. You just have to feed it properly. It took me, maybe, twenty minutes out of a single week.

    • by DarkOx ( 621550 )

      The thing is, you are not most people. There are a handful of folks who don't readily become chemically dependent on cigarettes either, and can go out on Friday nights, smoke a handful at the bar, and then not use them the rest of the week. It does not mean others are automatically capable of responding the same way you do, and it does not mean the product has not been engineered to trigger, in as many people as possible, the every-waking-hour habit many others experience.

      I am not suggesting btw that facebook use

  • Another way social networks could be pushed in the direction of increased transparency is by increasing independent auditing of their algorithmic practices...

    No. This would be too complex, too costly, and too prone to being corrupted or gamed. The solution is to either scrap social networks, or declare them part of the commons, where they can be administered by people who ultimately report to the voters. Yes, I know the current system either neuters or bypasses the voters much of the time. But even at that, Facebook would probably be better if it was taken out of the hands of its psychopath-in-chief and run by a branch of the government. Social media has gained th

    • ...has gained the status of infrastructure, only for morons. There, fixed that for you, now fix your silly belief.

    • Facebook would probably be better if it was taken out of the hands of its psychopath-in-chief and run by a branch of the government.

      You just invented Pravda. Do you really trust politicians of the current variety with a tool as powerful as Facebook?

  • They are working just fine. They are raking in the money.
    This latest story about Facebook's misdeeds is going to end up like the millions before it. Outrage for a few days and then nothing.
  • No (Score:4, Interesting)

    by dskoll ( 99328 ) on Monday October 11, 2021 @11:20AM (#61880469) Homepage

    No, they cannot fix it with algorithms. The problem is not the algorithm. The problem is that the goal given to the algorithm is: Maximize revenue.

    Until a primary goal of Do Not Harm is put in place, which is unlikely to happen without regulation or legislation, the algorithm will continue to maximize revenue and do harm as a side-effect.
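
    To illustrate the parent's point, here is a toy sketch with entirely made-up numbers and field names (nothing from Facebook): the same maximizer picks different content depending on whether harm appears in the objective it is given.

```python
# Toy illustration: the optimizer is unchanged; only the objective differs.
# All posts, scores, and weights below are hypothetical.

posts = [
    {"name": "outrage bait",  "expected_revenue": 9.0, "estimated_harm": 8.0},
    {"name": "family photos", "expected_revenue": 4.0, "estimated_harm": 0.5},
]

def revenue_only(post):
    # Objective: maximize revenue, nothing else.
    return post["expected_revenue"]

def revenue_with_harm_penalty(post, harm_weight=1.0):
    # "Do not harm" folded into the objective as a penalty term.
    return post["expected_revenue"] - harm_weight * post["estimated_harm"]

print(max(posts, key=revenue_only)["name"])               # -> outrage bait
print(max(posts, key=revenue_with_harm_penalty)["name"])  # -> family photos
```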

  • It's your friends that are the problem.

  • Any creation of man can be used for both good and evil. This goes with the fact that every human endeavor involves money somewhere along the way. It's inevitable that the Facebook algorithms would eventually be perverted (no pun intended) by some people for nefarious purposes. So, no, it can't be fixed. It can, however, be exposed as such.

    • So, no, it can't be fixed. It can, however, be exposed as such.

      That would no doubt go down well with the readers of The Washington Post, or the Guardian. But for people whose ideas come from populist rags and "news" websites, I don't suppose it will make the blindest bit of difference.

  • Like how Facebook was. Now it's nothing but censorship and its algorithms trying to put content I didn't ask for in front of me. And what is with the "sensitive" blocks? I'm not sensitive, so stop trying to block stuff from my non-sensitive eyes.

  • Facebook is working exactly as intended. It's optimally configured to generate the maximum revenue available. Any deviation from this current configuration is likely to reduce revenues. What we are proposing isn't to fix Facebook but rather to declare it anti-social & harmful to society. We don't want to fix Facebook; we want to stop it from harming people & societies around the world.
