Social Media Platforms Leave 95 Percent of Reported Fake Accounts Up, Study Finds (arstechnica.com)
An anonymous reader quotes a report from Ars Technica: The report comes this week from researchers with the NATO Strategic Communication Centre of Excellence (StratCom). Through the four-month period between May and August of this year, the research team conducted an experiment to see just how easy it is to buy your way into a network of fake accounts and how hard it is to get social media platforms to do anything about it. The research team spent about $332 to purchase engagement on Facebook, Instagram, Twitter, and YouTube, the report (PDF) explains. That sum bought 3,520 comments, 25,750 likes, 20,000 views, and 5,100 followers. They then used those interactions to work backward to about 19,000 inauthentic accounts that were used for social media manipulation purposes.
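To put those numbers in perspective, the figures in the summary imply a blended cost of well under a cent per interaction. The report summary doesn't break out per-type pricing, so this sketch simply averages the $332 across all four engagement types:

```python
# Engagement purchased by the StratCom researchers for about $332
# (figures taken from the Ars Technica summary above).
purchased = {
    "comments": 3_520,
    "likes": 25_750,
    "views": 20_000,
    "followers": 5_100,
}
total_spend_usd = 332

total_interactions = sum(purchased.values())         # 54,370 interactions
blended_cost = total_spend_usd / total_interactions  # roughly $0.006 each

print(f"{total_interactions} interactions at ~${blended_cost:.4f} apiece")
```

Averaging across types understates the price of the expensive items (followers, comments) and overstates the cheap ones (views), but it shows the order of magnitude: manipulation at scale costs fractions of a cent per unit.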
About a month after buying all that engagement, the research team looked at the status of all those fake accounts and found that about 80 percent were still active. So they reported a sample selection of those accounts to the platforms as fraudulent. Then came the most damning statistic: three weeks after being reported as fake, 95 percent of the fake accounts were still active. "Based on this experiment and several other studies we have conducted over the last two years, we assess that Facebook, Instagram, Twitter, and YouTube are still failing to adequately counter inauthentic behavior on their platforms," the researchers concluded. "Self-regulation is not working."
This is correct (Score:3)
It's possible that some people report such fake accounts in large quantities, possibly for NATO, and that the networks are then observed, crawled, and mapped out, and then we take them all down at once when they start a major operation.
Not that we'd admit this.
But it's usually one of these places involved: Russia, China, Iran, or Saudi Arabia.
Key word: usually.
"Self-regulation is not working." (Score:1)
But the Libertarian's Magical Mystical Free Market Fairy says that's impossible!
Nope (Score:2)
The only financial incentive to self-regulate would be if the ADVERTISERS went away. They couldn't give two shits about the "users".
And if advertisers think they're not getting their money's worth, they will go away. Right now, IG is the best place to advertise in the US for most things, in my opinion (unfortunately).
Re: (Score:2)
The ugly truth is that these platforms don't want to get rid of the bots, because it would quickly show how little engagement or traction their platforms actually generate.
Re: (Score:2)
I'd add a link to that, but there's just some things that you're better off not knowing about.
I probably shouldn't have written that since it just makes people more curious, but then if that were my intent I probably wouldn't have written this either.
Re: (Score:2)
But the Libertarian's Magical Mystical Free Market Fairy says that's impossible!
No, libertarians don't say that.
We say that fake social media accounts are not an existential crisis threatening the future of humanity. Yes, people lie on social media, but that doesn't really matter very much.
Social media manipulation is not why the public has rejected the progressive world view, and censoring Facebook is not going to convince Trump/Brexit/LePen voters to switch to Bernie Sanders and Jeremy Corbyn.
You should focus on plausible policies and positive messages, rather than trying to silence
Re: "Self-regulation is not working." (Score:2)
The research is highly flawed and biased against these companies. We all know it is possible to buy your way into the ecosystem, that's how it is intended to work.
On the other hand, I wouldn't want anyone to be able to take down my account on one report of it being fake. These researchers think they are the arbiters of truth and can just decide who should be taken down based on criteria they set themselves.
Re: (Score:2)
Yeah, that's my first thought, too.
If you've been around for more than an eye blink, you've noticed that the "moderators" of most sites essentially just take any complaint at face value.
(Right to appeal, right to face your accuser, standards of evidence... all that jazz is for stuff that actually matters; this is just the internet...)
surprised (Score:3)
Re: (Score:1)
5% of accounts being closed does not equal 5% of sock-puppeteers unable to distribute fake news or manipulate Facebook subscribers. One problem being, a banned sock-puppeteer can create multiple new accounts and continue business as usual. Even better, new accounts are counted as market 'growth' even though revenue is the same.
Re: surprised (Score:2)
As long as it costs less than 5% of total revenue, why bother. The effectiveness of ads isn't at all close to 95%, so really who cares that an extra 5% is wasted.
The pull quote (Score:3)
"Self-regulation is not working."
The goal is to take away free speech.
Re: (Score:2)
You missed his point. Some people would like to see the government "step in" and regulate social media in order to combat people saying things they don't want said. Proclaiming that the companies involved can't regulate themselves is a means to that end.