
How Should Facebook Be Fixed? (vox.com)

The technology site Recode interviewed 12 "leading thinkers and leaders on Facebook today," including the Senator pushing tech-industry updates for U.S. antitrust law, an early researcher on viral misinformation, and a now-critical former Facebook executive. "[M]ost believe that Facebook can be fixed, or at least that some of its issues are possible to improve..." Their ideas are wide-ranging, with some more ambitious and unexpected than others. But common themes emerge in many of their answers that reveal a growing consensus about what Facebook needs to change and a few different paths that regulators and the company itself could take to make it happen:

- Antitrust enforcement. Facebook isn't just Facebook but, under the Meta umbrella, also Instagram, WhatsApp, Messenger, and Oculus. And several experts Recode interviewed believe that forcing Facebook to spin off these businesses would defang it of its concentrated power, allow smaller competitors to arise, and challenge the company to do better by offering customers alternatives for information and communication.

- Create a federal agency to oversee social media, like the Food and Drug Administration. The social media industry has no dedicated oversight agency in the U.S. the way that other industries do, despite its growing power and influence in society. That's why some people we interviewed advocated for making a new agency — or at least increasing funding for the existing FTC — so that it could regulate safety standards on the internet the same way the FDA does for food and pharmaceutical drugs.

- Change Facebook's leadership. Facebook's problems are almost synonymous with the leadership of Mark Zuckerberg, who has unilaterally controlled the company he started in his Harvard dorm room in 2004. Many interviewees believe that for any meaningful change to happen, Facebook needs an executive shake-up, starting from the very top... some experts Recode interviewed suggested that Facebook executives should be criminally prosecuted for either misleading business partners or downplaying human harms their company causes.

The experts also want reforms on the safety-from-prosecution shields of Section 230 "in a way that won't run into First Amendment challenges," as well as increased transparency from social media companies about problematic content.

"Some of the experts interviewed by Recode argued that Facebook and other social media companies should be legally required to share certain internal data with vetted researchers about what information is circulating on their platforms."
Comments Filter:
  • The usual procedure (Score:4, Informative)

    by quonset ( 4839537 ) on Saturday November 13, 2021 @09:38PM (#61985661)

    Is either snip-snip or dig-dig. Take your pick.

    • Comment removed (Score:5, Insightful)

      by account_deleted ( 4530225 ) on Saturday November 13, 2021 @10:01PM (#61985703)
      Comment removed based on user account deletion
    • Seems a rather disappointing FP. But you were in a hurry to FP, right? Care to explain your intention for either option?

      I really had high hopes for a constructive discussion when I saw the story. Since it's still early in the discussion, I'll go ahead and throw in my two solution ideas, one at the personal level and one at the global level, though with a focus on how it affects the personal level.

      At the personal level, my beef with Facebook was the waste of time. I'd check in and then get "engaged" for 30 m

      • At the global level, the fix I favor would involve vertical division into competing companies. Each DoF (Descendant of Facebook) would have the personal data from some fraction of the global users.

        So, given your fix were adopted, how would you force any particular user of FB to stay on the DoF you selected for them, if it turns out that the people they want to read about (natter with, whatever) have been put on a different DoF?

  • easy (Score:5, Insightful)

    by plate_o_shrimp ( 948271 ) on Saturday November 13, 2021 @09:39PM (#61985663)
    Shut it down
  • Treatment plan (Score:5, Insightful)

    by backslashdot ( 95548 ) on Saturday November 13, 2021 @09:41PM (#61985669)

    Like any cancer, it should be fixed by surgical removal or other means of extirpation.

    • Get users educated enough that they realise the "stranger danger" is actually a problem when exposure is scaled up to a massive pool of people outside of their local community. Then teach them that the solution to this problem is well-organised use of instant messaging, SMS and telephony. Zuckerbook's advantages dried up once cloud services became cheap for consumers, rather than just the businesses leveraging them.

      The future is people paying pennies for the services they use and advertising relegated t
  • Clear Malicious Bias (Score:3, Informative)

    by Kunedog ( 1033226 ) on Saturday November 13, 2021 @09:43PM (#61985671)
    Because videos of the incident were available almost immediately, it's been obvious since the beginning that Kyle Rittenhouse acted in self defense and there was zero case for a murder charge. It's a travesty that this farce made it to trial.

    I bring the trial up because, if you have any remaining doubt about the degree of Big Tech's censorship and bias (and worse), remember that GoFundMe banned Rittenhouse's defense fundraisers [washingtontimes.com] and Facebook outright called Rittenhouse a mass murderer [washingtontimes.com] and banned posts stating/arguing that he defended himself:

    https://twitter.com/brianfishm... [twitter.com]

    Mainstream media coverage of the trial has likewise been atrocious, complete with footage of that night precisely edited to remove the moments showing Rittenhouse was attacked first. Obviously YouTube (and Facebook too) shoves these "authoritative" and "trustworthy" news sources' videos in everyone's faces, on which everyone can also see the red flag of a lopsided dislike ratio (but not for long [slashdot.org] of course!).

    https://www.youtube.com/watch?... [youtube.com]

    None of these proposed measures will "fix" this problem with Facebook (or social media in general), and in fact are probably intended to make it worse.
    • Yeah, but those same accusations can be leveled at Fox or CNN or any number of media outlets, so what makes Facebook special in regards to regulation or increased scrutiny?

      Unless you reeaaally want to revive the ghost of the Fairness Doctrine with regard to the media, this seems more like market filtering (and if you are unaware that most tech is biased, there is no helping you at this point).

      People who believe in Bigfoot and balanced budgets also vote and voice opinions. *shrugs*

      There's something to be s

    • Great sources there, the Washington Times.

    • by LeeLynx ( 6219816 ) on Saturday November 13, 2021 @11:27PM (#61985837)

      it's been obvious since the beginning that Kyle Rittenhouse acted in self defense

      As a general rule, you don't get to cry 'self defense' when you provoke a dangerous situation. I get that it is "obvious" when you approach this with the reductive assessment that 'well, open carry is legal, so he wasn't doing anything wrong', but that's not how this works. Trials would be way easier if they just took the defendant's word for what their intent was. Unfortunately, they look at the surrounding circumstances, too:

      - KR traveled a considerable distance to a city he knew was suffering civil unrest;
      - KR traveled to Kenosha specifically because it was suffering civil unrest;
      - KR had no business in Kenosha unrelated to that civil unrest;
      - KR armed himself when traveling to that city;
      - KR can be seen, also on video, patrolling Kenosha armed;
      - KR openly expressed the intent to do so;
      - KR, at a minimum, made no effort to avoid protestors and, given his express intent to patrol the city, almost certainly sought them out;
      - KR knew or should have known that confronting protestors while armed would be perceived as a threat;
      - KR, having shot one of these protestors, represented to emergency operators that he had merely come to Kenosha to act as a 'medic'.

      These things may all be perfectly legal. However, taken together, they are a strong indication that KR went to Kenosha with the intent to seek out protestors, armed or unarmed, and, through force of arms, either bring them to heel without legal authority to do so or, in the alternative, violently subdue them; then, upon the occurrence of the latter, realized that he was engaging in less-than-noble behavior, and attempted to mislead authorities about his intent.

      Even under Wisconsin law, that could readily knock down a self-defense argument. Will they get a conviction? I don't think so, and they probably shouldn't - from an evidentiary standpoint, it's rather thin.

      I usually favor the defense in a criminal trial; just because I find a defendant's behavior abhorrent and obviously wrong does not mean that I'd want to see the legal standard thrown out in the name of convicting one asshole. Besides, this is a 17-year-old kid - this is the influence of years of the same Tea Party horseshit that has turned the GOP from the bulwark of sanity and reasonable positions to a goddamn clown show.

      • by russotto ( 537200 ) on Saturday November 13, 2021 @11:53PM (#61985905) Journal

        You appear to do a better job of putting together a case than the actual prosecution. Though

        1) That's a low bar and

        2) A lot of this wouldn't actually be allowed in court.

        Your first two items are not relevant to a self-defense claim. Your third is not true; Rittenhouse worked there and his father lived there. Your fourth through sixth are indisputable but also not harmful to a self defense claim. Your seventh is, again, not true -- we see him flee from Rosenbaum. Your eighth both assumes things which haven't been shown (Rittenhouse confronting protestors) and is not sufficient to harm a self-defense claim: It does not matter if protestors would have perceived him as a threat; he would have actually had to provoke them to spoil the self-defense claim. And your ninth is irrelevant even if true.

        • by LeeLynx ( 6219816 ) on Sunday November 14, 2021 @12:46AM (#61986003)
          Did you just read my list then stop, or...?

          Not really sure what "wouldn't actually be allowed in court" here, since *all* of this speaks to intent, but to quote my own post (emphasis added):

          However, taken together, they are a strong indication that KR went to Kenosha with the intent to seek out protestors, armed or unarmed, and, through force of arms, either bring them to heel without legal authority to do so or, in the alternative, violently subdue them; then, upon the occurrence of the latter, realized that he was engaging in less-than-noble behavior, and attempted to mislead authorities about his intent.

          Your first two items are not relevant to a self-defense claim.

          See above.

          Your third is not true; Rittenhouse worked there and his father lived there.

          He was neither going to work nor visiting his father.

          Your fourth through sixth are indisputable but also not harmful to a self defense claim.

          See above.

          Your seventh is, again, not true -- we see him flee from Rosenbaum.

          He found Rosenbaum how, again? It wasn't because Rosenbaum went out looking for armed kids from Illinois, and found KR hanging out in an alleyway minding his own business.

          Your eighth both assumes things which haven't been shown (Rittenhouse confronting protestors) and is not sufficient to harm a self-defense claim

          Seeking out protestors is a reasonable inference to draw from his stated intent. Not to mention, well, what he was actually seen doing. There's not a whole lot of other reasons to seek them out other than to confront them. He was carrying a rifle; he wasn't carrying a camera or some 'nocs for looter-watching.

          It does not matter if protestors would have perceived him as a threat; he would have actually had to provoke them to spoil the self-defense claim.

          This is actually the whole point of the eighth entry in my list: he intended to do something he knew or should have known would be perceived as a threat. It most definitely speaks to whether he intended to provoke them. The alternative is that he intended the behavior, but not the effect he knew would result from it. While a jury is free to believe that, it's a tough pill to swallow.

          And your ninth is irrelevant even if true.

          Hardly. It speaks to his mindset which, again, is what this is all about. It would also go to credibility since he testified, though that's another matter entirely.

          I said in my post that I don't think their case will fly. Try reading a post before responding to it, you'll get far less spittle on your keyboard in the long run. That said, when not held to the burden of proof in a criminal trial, a claim of self defense here is laughable. When you go out looking for trouble, I have no sympathy for you when you find it. The fact that this kid is 17 is his *only* excuse.

        • And, to the original point of the op: the very fact that some people care to pump the brakes on the public lynching of a 17 year old (cf. Nicholas Sandmann) leads to them and their conversations being silenced by a clearly-not-objective mediator.

          Unless you're a frothing zealot on one side or the other, I think most people would agree this case is worth discussing. That FB (and much other social media) puts their finger on the scales against specifically one side is the problem.

      • Re: (Score:2, Interesting)

        by imidan ( 559239 )
        Great list. One item I would add: KR, unable to legally purchase the rifle himself, induced a friend to commit a felony straw purchase, which is why he had the gun in the first place. (There is a dubious defense that KR gifted his friend the exact amount of money for the rifle, and arranged for a person-to-person transfer once he turned 18. This sounds to me like straw purchasing with a flimsy veneer to attempt to skirt the law.)
    • You got tagged +5 before the angry downvoted you. \o/

      The media is complete propaganda trash.
    • by lsllll ( 830002 )

      So, let me get on my soapbox again. From TFS:

      The experts also want reforms on the safety-from-prosecution shields of Section 230

      When you become the size of FB or Twitter, you're equivalent to the public square. I don't care that you're a private company. So, you have 2 choices: don't curate the content users post and don't face any liability, or curate the content, knowing that the content you leave behind can make you liable.

      To those who respond to the first scenario with "I don't want to see Nazi symbols in my content," I say "tough shit". Skip over it. Slashdot is very readable wi

      • When you become the size of FB or Twitter, you're equivalent to the public square.

        Size has never been a factor in such a thing, and there's no reason why it ought to be.

        they need to be treated with the level of transparency and non-intervention as common carriers.

        Common carriage has never been applied in such circumstances either.

        What you're really looking for is an end to free speech for entities you dislike (and given that you seem okay with swastikas, I can take a pretty good guess as to who you dislike). I don't like Facebook, but I'd rather have that than what you propose.

        • by lsllll ( 830002 )
          It's cute you can guess whom I dislike because you think I'm okay with swastikas. You misread my whole post. Obviously if "the experts also want reforms on the safety-from-prosecution shields of Section 230", they see something wrong with it. Size may not have been a factor up to now, but it may become one. Common carrier has never been applied in such circumstances, but it may now. And when I say "tough shit, get over it" for you cry babies who want to shield yourself from a world that contains swasti
    • by dohzer ( 867770 )

      AHAHAAHAAHAHAHAHahahahaahah
      Nice joke.

    • by Moryath ( 553296 )
      Let's see... links to the Moonie Times (white supremacist site), twitter of "Brian Fishman" (known white supremacist), and youtube channel of a neoconfederate white-supremacist named Matt Christiansen. Yeah, I'll totally believe the pack of dumb fucking lies you're tossing out here.
  • by Crashmarik ( 635988 ) on Saturday November 13, 2021 @09:52PM (#61985691)

    Section 230 of the Communications Act is what allows FB et al. to publish all the crap they do. It's contingent on them acting as a neutral platform and not a publisher. Actually enforcing that would solve most of the problems with social media.

    https://crsreports.congress.go... [congress.gov]

    Of course this would deprive the nose in the air crowd of their greatest joy in life, looking down on others, and screwing around with others in small petty ways.

    • Problem is, 230 has been interpreted by the courts as essentially a get out of jail free card. So unless SCOTUS gets involved and greatly limits that interpretation, which has yet to happen and I'm not seeing it happening anytime soon, the best bet is to can 230 and redo it with far stricter language to avoid these idiotic legal interpretations. Given their lobbying power though, I have little faith that any replacement would actually honestly address the issues and likely would instead just cement their
      • Problem is, 230 has been interpreted by the courts as essentially a get out of jail free card.

        That's not a problem, and it's not the force behind Facebook's success. That force is corporatism, not freedom from prosecution for the content posted by others on their platform. Facebook could still survive absent Section 230, and furthermore, it could still be toxic. Facts can be presented in misleading ways, for example based on the order in which they are presented to make it look like one thing leads to the next when they are unrelated, or vice versa.

        Without Section 230 providing immunity from prosecu

        • Ah, the good old '230 is god's gift to laws and perfect in every way' argument. Yeah, no. I may have little faith that we'd get something better, but I'm not stupid enough to think 230 is perfect like you're trying to argue, and yes, it is a major part of the problem because it allows FB, and many others, to effectively speak, via algorithms, slanted enforcement, modification of user posts, and opaque policies, without any of the responsibilities of such.
    • Section 230 of the Communications act, is what allows FB et al to publish all the crap they do. It's contingent on them acting as a neutral platform and not a publisher.

      No, in fact that is a lie. There is no requirement that a website be purely a platform for others and not a publisher in order to receive protection for things other people post on that platform, and there is no mechanism in Section 230 that will cause you to lose your protections under that section if you also publish content. Section 230 explicitly permits you to take down some content while not taking down other content, and still receive those protections. The facts are literally the exact opposite of w

      • Fine words for a liar.

        Yeah, it takes a certain special kind of reasoning to ignore the fact that 230 doesn't apply to the platform's own publications and that if they aren't neutral they have become de facto publishers.

        But as usual, you ad hominem anyone that disagrees with you.

    • Section 230 of the Communications act, is what allows FB et al to publish all the crap they do. It's contingent on them acting as a neutral platform and not a publisher. Actually enforcing that would solve most of the problems with social media.

      It has never been contingent on them being neutral. Anyone who thinks this has been misinformed. Section 230, simply put, says that it's the one who speaks that is liable for that speech, and that owners of websites actually retain their First Amendment rights.

      Also, the whole "neutral platform" thing, the person who came up with that is an idiot or perhaps very cunning - because it introduced a concept that has no basis in reality that has led to endless stupid discussions since there are people who thinks

  • by backslashdot ( 95548 ) on Saturday November 13, 2021 @09:54PM (#61985693)

    Would it be legal to drop a nuke from orbit?

  • by gurps_npc ( 621217 ) on Saturday November 13, 2021 @09:56PM (#61985695) Homepage

    1) End no-permission data collection. There is no reason for Facebook to be allowed to copy my information merely because I gave it to my friend. I gave my phone # to them; I did not give legal permission to pass it to anyone else, even if Facebook says they 'need it'. Require Facebook to delete any data they hold on people who do not have an account with them, punishable by a $1000 payment to anyone whose phone, address, name, and/or email has been retained by Facebook without their permission.

    2) End focused algorithms. There is no reason Facebook should be allowed to channel people into extreme views. The algorithms should be allowed to determine issues, but not the views. That is, if someone cares about gun politics, they need to be shown just as much pro-gun-control content as anti-gun-control content. Yes, this will piss off advertisers who have to advertise to both sides. So what? There is no legal right to advertise only to the people that you like.

    3) End default user-only pages. There is no reason Facebook should be allowed to force people to join, not even to view Facebook content. If their service is valuable, people will want to join. If not, they should not be forced to join merely to view some band's or other group's public information. If someone wants to limit viewing to just people with a Facebook account, let them. But the default should be open to anyone, even people who are not logged in/have no Facebook account.

  • by PhrostyMcByte ( 589271 ) <phrosty@gmail.com> on Saturday November 13, 2021 @09:58PM (#61985697) Homepage

    As a top three, I'd say:

    - The privacy invasion and the awesome power of all that data.
    - The amplification of fake news, conspiracy theories, and hate speech.
    - The addiction/depression it causes in everyone but especially young teens.

    How do you possibly fix Facebook, Instagram, and the others?

    Their business model depends on all of these things staying true. Look at how alarming and harmful the recent move to opt-in privacy invasion on Apple devices was to them.

    I don't think it's possible to fix. Just shut it down.

  • by pecosdave ( 536896 ) on Saturday November 13, 2021 @10:03PM (#61985705) Homepage Journal

    We avoid them.

    If no one goes there eventually it goes away.

    If you don't like my opinion on the matter I invite you to rip me apart publicly on my Friendster page.

  • "Privacy Rapists Inc." happens to also describe them exactly.

  • by mrsam ( 12205 ) on Saturday November 13, 2021 @10:06PM (#61985711) Homepage

    This is very simple: rediscover Usenet. This is the original, decentralized, peer-to-peer social media platform. Extra bonus: plenty of pr0n if you know where to look (or at least AFAIK, there was plenty back in the heyday of Usenet, maybe not as much now).

    I can post to Usenet anything that's on my mind, and I cannot be censored or deplatformed. It's simply not technically possible. And Usenet is still around.

    • So you discovered a.s.s - good for you.

    • I can post to Usenet anything that's on my mind, and I cannot be censored or deplatformed.

      That's why everyone left Usenet; spammers could post anything that was on their mind and could not practically be censored or deplatformed, and the whole thing turned into crap.

      • That can't be it. The same shit happens on Facebook and people still use it.

        • That can't be it. The same shit happens on Facebook and people still use it.

          It doesn't happen in the same way, even if it happens to the same degree, because Facebook users are not shown all content. Facebook filters the spam they don't think users will engage with out of their feeds, anyway. They also DO remove SOME of it from their platform entirely, when enough users report it. Then it gets posted again, but that is the Tao of spam.

          • A similar system could be employed for Usenet, where if enough people report it, it gets "downmodded" and you could set a threshold that you want to view, along with the reason like "I don't want to see spam" or "I only want to see postings others have considered sensible"...

            Where have I seen something like that before?

            • Yes, it could be done. But the system would still have to deal with the load. The problem with USENET is that the content isn't partitioned: every server node has to carry ALL of the content that any user might want to consume. This has obvious benefits in that the content is available without having to go to a remote server, but it has obvious drawbacks in the storage and transport departments.

              You could solve this by breaking USENET up into multiple smaller networks, of course, but this creates other problems. Maybe the

      • by mrsam ( 12205 )

        I have no idea what you're talking about.

        I was active on Usenet from the early 1990s, all the way through its peak and decline, and I do not recall spam being much of a problem. I saw very little of it, and whatever little it was, was quite manageable.

        The reason Usenet declined, and was superseded by myface, twatter, spacebook, et al., has nothing to do with spam, but with a simple human factor. It does take some minimal amount of effort to organize one's thoughts and type up a coherent message that's readabl

  • The quest for the almighty buck will make anything short of divestiture and forced competition between the successor companies useless.

    At the same time, it is important to understand that the real problem is in the advertisers, and without regulating their behavior they will still be the ones pulling the strings. After Citizens United, I won’t hold my breath

  • Most believe that Facebook can be fixed

    And most believe that Facebook is a cancer. The way you "fix" cancer is to remove the central tumor (in this case, Zuckerberg), then treat what cancerous cells might remain (Sandberg et al.) with chemotherapy until there are no more signs of cancer.

  • Why not all three? Criminal prosecution for any number of crimes, antitrust, and regulation. The biggest problem with regulation is that it runs afoul of free speech. Certainly making companies like Facebook responsible for what they publish would remove most of the objectionable media, but it would also make their websites unusable, since they would have to monitor every post before it went up. I would not miss any of them.
    • this is a waste of time and tax money.

      most of these companies want regulation, because it raises the barrier to entry for any small competitors, and eliminates or severely hampers smaller competitors that might be nipping at their heels. They lobby the government to get the terms they want and have the income to ensure it won't hurt them as much as smaller competitors that might overtake them at some point.
      what did Microsoft do when they had been accused of being a monopoly? that was when they put an office

  • They should purchase a rival site, but they should maintain the same name on their new purchase so people don't know it's Facebook 2.0. I'm thinking something like Instagram. Oh wait....

  • And sell off the competitors they kept buying and stop letting them buy competitors. Every time Facebook's dominance is challenged, they just buy up the competitor in question, and we let them do that because we keep electing politicians who don't enforce antitrust laws.
  • by rbgnr111 ( 324379 ) on Saturday November 13, 2021 @10:18PM (#61985745)

    Isn't Facebook dying, and changing its name to try to grab the attention of the younger crowd who couldn't care less about it, because Facebook is the platform grandma and grandpa are on?
    Much like Twitter, they are struggling to stay relevant, and given time will probably just fall to the same fate as Friendster or Myspace.

    • Facebook is just a facade for a tracking hive and the free access you have is just there to confirm your identity. They probably have enough on 50% of the population that uses the internet to commit identity fraud.

    • Facebook (or perhaps better said as Meta) realizes that Facebook's popularity will likely wane at some point, which is why they try to buy whatever platform people might migrate to like Instagram, and also the reason why they rebranded themselves as Meta.

  • to the ground

  • Their business model is to monetize the limbic hack.

    ISPs shouldn't be able to block "servers", and people need to be able to buy a $20 box that hosts their own content and plug it in.

    Centralized social media will always harm the user. But local franchise monopolies ensure that they're the only option.

  • by ikhider ( 2837593 ) on Saturday November 13, 2021 @10:39PM (#61985775)
    Facebook made the internet a worse place. Shun those that harm the internet and support those that make it great. I have not used Facebook in years and do not miss it one bit. Same with Twitter, Instagram, Google Plus, et al. Mainstream social media was never there for our benefit, we were the product all along.
  • How about all three? (Score:5, Interesting)

    by aerogems ( 339274 ) on Saturday November 13, 2021 @10:43PM (#61985779)

    Start by breaking the company in two (at least) with hardware products going to one company, software another, maybe a third company for advertising.

    Then maybe just put the CFPB in charge of regulating social media rather than create a new agency. Honestly, they should probably just be put in charge of regulating online advertising, period. Keep Section 230 largely intact, but make a few tweaks to get rid of the exemptions for politicians or other people with large numbers of followers, and add something that will at least make life more difficult for the QAnon morons. Make the fines for companies proportional to their gross revenues, not some fixed maximum like now, where the maximum fine is something like 1-2 seconds' worth of profit.

    Next up, do a clean sweep of the entire C-Suite and Board of Directors. Force Zuckerberg to convert all his shares to non-voting, give him the boot from the company, and all the sycophants, enablers, and hangers-on who have created the mess we find ourselves in. Bar the whole lot of them from serving as a C-Suite level executive or board member for any social media company for at least the next two presidential election cycles.

    Finally, and this applies across all industries, start fucking enforcing antitrust laws and don't allow companies to get this goddamn big in the first goddamn place. Stop allowing companies to merge with competitors or buy up potential insurgent competitors. The old AT&T has almost put itself back together between AT&T and Verizon. If those two ever merged, we would effectively have the old AT&T back. Banks are larger now than during the Great Recession when they were considered too big to fail, and naturally we've removed basically all the protections put in place to prevent a second Great Depression.

    Of course none of this will ever happen because it would require the kind of courage that no longer exists in American politics. There are too many people these days who are happy to burn everything to the ground as long as they get rich in the process. Not to mention the GOP has been steadily working towards rigging the system for decades. They've been beating the "voter fraud" drum since before a lot of us were probably born, and Jan 6 was just the natural conclusion to priming people to believe that there's massive fraud taking place all the time during elections, some of them for literally their entire lives. That's given certain Republican legislatures cover to start passing laws that will let them overturn any elections they don't like in the future, which is to say any election where the Republican loses. Surely the only reason a Republican candidate loses is because there's fraud. It couldn't possibly be that the voters chose someone else.

  • by PopeRatzo ( 965947 ) on Saturday November 13, 2021 @11:49PM (#61985893) Journal

    You don't have to wait for anyone to "fix" Facebook. F.B.Purity extension does it now and makes FB perfectly usable for staying in touch with groups of people. No ads, customize your feed, no "news". Free. Then, we wait until someone makes a better platform.

    https://www.fbpurity.com/ [fbpurity.com]

  • Current anti-trust court rulings are based on a change of thought from the 1970s. While I'm not that old, it's rather more modern than I would have expected. Prior to the 1970s, anti-trust was about preserving competition, even if that competition was worse. After the 1970s the courts changed their thinking and decided it was all about moving towards cheaper offerings. If something made offerings cheaper, that was considered better for society, and thus companies were allowed to do anything that made them cheaper.

    My thought is we keep

  • They've lost the entire 18-34 crowd. Yeah, a chunk of them moved to Instagram, which Facebook owns, but even Instagram is fading fast. WhatsApp is sort-of popular, but it is *also* fading out.

    This is the natural evolution of "social media", I think. The various apps/sites are essentially "fads". Facebook held on longer than its predecessors, but the writing was on the wall once smartphones became ubiquitous. There just isn't much need for social media when you can send SMS messages to anyone at any time.

    Tha

  • I say we kill it just like we killed AOL. In fact, I think it is already happening.

  • It's the only way to be sure
  • several experts Recode interviewed believe that forcing Facebook to spin off these businesses would defang it of its concentrated power

    Finally. Above a certain size, *any* business has too much power. Above a certain size, divestment should be mandatory.

  • Just let Facebook do whatever, and, like, don't use it because it's bad. "Oh, my family uses it and I have to!" No. You're weak. Just don't use it and stop bitching about it. "It's evil, I need the government to save me from its evil machinations." It's just a web site. Stop acting like an abused girlfriend that just can't leave.

  • ...can anyone else not look at Mark Zuckerberg without thinking "Lex Luthor"?
    If all his hair falls out I'm finally going to delete my FB account.

  • Ban personalization algorithms. Ban ML curation of news feeds. Ban any adtech that could be used for disinformation or psychological exploitation, which is all of it. Technology designed to get people to make bad financial decisions will inevitably be used to get them to make bad political and social decisions too.
  • The simplest solution is to force them to conform to an open standard for interaction so people have a real choice in platforms.

  • Cigarettes are required to have prominent textual warnings on their packaging. For FB, every third advertisement shown in someone's feed should be on one of the platform's deleterious effects on individuals and society.

  • by bb_matt ( 5705262 ) on Sunday November 14, 2021 @03:47AM (#61986223)

    If folk are willing to pay for streaming TV services, I'm sure they'll pay a smaller value for a social service that doesn't serve ads, doesn't slurp up their data, has trusted private areas for family and friends etc.

    The vast majority of users, I think - could be wrong? - are on Facebook for family and friends.
    Sadly, because of the way revenue is derived, they are bombarded with shit posts, many of which stir up emotion - purposefully.
    They end up sharing content which is clearly directed at swaying opinion.
    When I was on FaceBook, heck, over 2 years back, I ended up unfriending so many contacts when faced with their political comments.
    I think half the problem here is that people run out of things to say, so they share ... shit.

    A subscription based service could alleviate this, but I guess that leads to what sort of pricing tiers?
    It could never go completely subscription-only, as people would leave in droves (no bad thing).
    But a move should be made for people who do want a service where they can connect with family and friends, one that is secure and, above all else, private.

    If this can't be done, just shut the entire shit show down, because it is unworkable as it is and a threat to democracy and social stability.

  • ..an industry whose business model is inciting indignant outrage? You don't. You put it out of its misery.
  • by berj ( 754323 ) on Sunday November 14, 2021 @06:08AM (#61986445)
    cd /
    rm -rf *
  • by Slicker ( 102588 ) on Sunday November 14, 2021 @10:09AM (#61986869)

    Give each user up to 100% of their reputation to assign to the posts of others that they receive. Then allow them to assign a percentage of their own reputation to each post as it comes in. The amount of reputation one receives from others weighs how much their reputation adds to whomever they assign their reputation to. This way, the views of less reputable people will mean less. Also, make sure there is a decent rubric upon which to assign reputation. For example, allow them to assign to, say: Truthful, Intelligent, Humorous, Respectful. Each post will accumulate scores on each of those rubrics, and the sum of a user's reputation is the accumulation thereof. Reputation must also be maintained: as one assigns to a new post, all previous assignments will diminish proportionately, because you only have 100% to assign. This would create a balanced ecosystem enabling people to filter for the qualities of content they'd like to see... and to filter out crap.
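
    A rough sketch of how that budgeted, reputation-weighted scoring might look (Python; every name here is made up for illustration, this is just one possible reading of the scheme, and the separate rubrics are collapsed into a single number for brevity):

      from collections import defaultdict

      class ReputationLedger:
          def __init__(self):
              self.assignments = defaultdict(dict)   # voter -> {post_id: fraction of their 100%}
              self.post_author = {}                  # post_id -> author

          def add_post(self, post_id, author):
              self.post_author[post_id] = author

          def assign(self, voter, post_id, fraction):
              # Give `fraction` (0..1) of the voter's budget to a post; all earlier
              # assignments shrink proportionately so the total never exceeds 1.0.
              votes = self.assignments[voter]
              for pid in votes:
                  votes[pid] *= (1.0 - fraction)
              votes[post_id] = votes.get(post_id, 0.0) + fraction

          def reputations(self, iterations=3):
              # Reputable voters should count for more, so iterate a few times
              # from an equal starting weight (a PageRank-like fixed point).
              rep = defaultdict(lambda: 1.0)
              for _ in range(iterations):
                  new_rep = defaultdict(float)
                  for voter, votes in self.assignments.items():
                      for post_id, fraction in votes.items():
                          new_rep[self.post_author[post_id]] += rep[voter] * fraction
                  rep = defaultdict(lambda: 1.0, new_rep)
              return rep

          def post_score(self, post_id, rep):
              # A post's score is the reputation-weighted sum of what voters assigned to it.
              return sum(rep[voter] * votes.get(post_id, 0.0)
                         for voter, votes in self.assignments.items())

    Filtering "for the qualities you'd like to see" would then just be a threshold on post_score, one per rubric.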

    Another approach is Pro/Con Upvotes. That is, for each post, allow viewers to upvote its Pro or its Con, and show whichever of the two scores is higher first when comments are listed. It's either going to say, effectively, that this comment (relating to the post) is really good or really wrong. Either way, it tells you something vital. Wikipedia proved that the malicious actors in the world are far outnumbered by those with good intentions. So this will put wrong-doers in their place, hopefully leading them to correct their ways by means of social pressure.
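
    The Pro/Con listing rule is small enough to show in full (again purely illustrative):

      def sort_by_pro_con(comments):
          # comments: iterable of dicts like {"id": 1, "pro": 12, "con": 3}
          # Whichever counter is higher drives the ordering, so strongly endorsed
          # and strongly disputed comments both surface at the top of the list.
          return sorted(comments, key=lambda c: max(c["pro"], c["con"]), reverse=True)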

  • They completely ruined internet culture, shifting what matters most from what you do on the internet to who you are.

  • How to "fix" Facebook? Make it into a public utility with a reasonably priced subscription, no advertising, and no content moderation. Regulate it the way you would any other coffee klatsch, which is basically what it is in essence.

  • I'll see myself out.
  • It's what makes Facebook so damaging and disruptive. Without the motive to generate ever increasing profits, Facebook could be a pretty good platform for staying connected to distant friends and family.

  • Just kill Facebook entirely. It's cancerous.
  • Most of them involve strategically placed C4 charges.
