Bluesky Has an Extortion Problem (tedium.co)
A cybersquatting scheme targeting prominent writers and entrepreneurs has exposed flaws in Bluesky's domain-based verification system, newsletter Tedium reports, citing users.
Bloomberg columnist Conor Sen reported receiving an extortion attempt this week when an anonymous user who had purchased his namesake domain demanded payment to transfer ownership. The episode has unraveled wider revelations of similar attacks targeting at least five other well-known users, including political blogger Matt Yglesias and The Hustle founder Sam Parr.
The platform's moderation team initially banned Parr's legitimate account while leaving the impersonator active, Sen told Tedium. The fake account was only removed after users escalated the issue to senior Bluesky staff.
Re: (Score:1)
Re: (Score:2, Flamebait)
I'm just saying that it's a pretty "yawn" topic.
Despite your qualifications preceding this comment, your final statement is never true. If it ever becomes true we fail as a society.
Re: (Score:3)
If it ever becomes true we fail as a society.
That's a fair assessment. To further qualify the statement: It's "yawn" because dealing with it is a solved problem. There are perverts and trolls in the world, and that's unfortunately not going to change. The only thing a service like Bluesky can do is delete the nonsense that they post, and from what I've read in the five minutes I spent "researching" the topic, they have a decent enough system in place to do so. [safer.io] But, again, you are right. Hand-waving it away isn't the right way to put it. It should be de
Re: (Score:2)
I got slightly off topic, but my point
Re:45000... (Score:4, Insightful)
I'm just saying that it's a pretty "yawn" topic.
Despite your qualifications preceding this comment, your final statement is never true. If it ever becomes true we fail as a society.
Nope. The GP is absolutely right. Bad people exist, therefore bad people exist on the Internet. Child porn exists, therefore child porn exists on the Internet. And so on. So it should be a pretty "yawn" topic that it exists on a popular Internet site. The mere fact that it exists is entirely expected, because the site's contents are demographically representative of the people who use it and the people who advertise on it and the world in which it exists.
This is not the same thing as saying that it would be a "yawn" topic if it starts showing up unexpectedly for people who aren't intentionally trying to find it, of course, because that would be an epic algorithmic fail. It should be taken down when someone reports it, and if the site is big enough, they should be actively trying to prevent it, but expecting it to not exist is like expecting drivers not to speed or the wind not to blow.
Similarly, the number of reports is indicative of the number of users, not indicative of anything particularly problematic. 45,000 reports across 25 million users means that at *most* 0.18% of their users posted that sort of content, assuming that every one of those reported posts came from a different account, and the real number is probably a tiny fraction of that.
And this assumes that all the CSAM takedowns really were CSAM, too. Just because they took it down doesn't mean there was any abuse. Companies are likely to take down anything reported that even plausibly could be CSAM out of an abundance of caution, whereas some portion of that content could really be generative AI content that "looked too young", pictures of somebody's kid's rash sent to the doctor (probably not on that site, but you get the point), young people sending borderline images to their friends/girlfriends/boyfriends, and so on. So I wouldn't automatically assume that all of those 45,000 reports were legitimate CSAM.
Now if you said that there were 45,000 new burner accounts created every day that were all posting CSAM, that might be indicative of a real weakness in their new account validation, but otherwise, that's not a particularly alarmingly large number, IMO. Google had 5 million takedowns in 2023 (source [debuglies.com]), which is about 14k per day, and Google doesn't even run a social media site anymore. Meta had 24 million, which is ~66k per day.
So yeah, 45,000 really is not hopelessly outside the expected range for a social network. It's maybe 20x higher than Facebook based on the number of users, but that assumes that Facebook's user count actually matches reality, which it probably does not, so I wouldn't read too much into it.
In other words, it's not a yawn because CSAM isn't important to prevent. It's a yawn because given the number of users and how active they presumably are, that's not a surprisingly high number compared with other similar sites.
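The back-of-the-envelope arithmetic in the comment above can be checked directly. This is just a sketch reproducing the figures quoted in the thread (45,000 reports, 25 million users, and the annual Google and Meta takedown counts); none of the numbers are independently verified here.

```python
# Figures quoted in the comment above.
reports = 45_000
users = 25_000_000

# Upper bound on the share of users involved: assumes every
# reported post came from a distinct account.
max_share = reports / users
print(f"{max_share:.2%}")  # 0.18%

# Annual takedown counts converted to a rough per-day rate.
google_per_day = 5_000_000 / 365    # ~13.7k/day
meta_per_day = 24_000_000 / 365     # ~65.8k/day
print(round(google_per_day), round(meta_per_day))
```

The 0.18% figure is an upper bound precisely because it double-counts nothing on the account side; if a handful of burner accounts generated most reports, the real share of users is far smaller, which is the commenter's point.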
Re: (Score:3)
Re: (Score:2)
White lists to the rescue (Score:2)
[Feeding the trolls never helps, but Slashdot makes it too easy to propagate vacuous Subjects.]
The bad guys will keep looking for ways to break any nice thing. The economic models need to be carefully designed. On one side that means to benefit ALL of the people using the system (or company or technology) and on another side that means to block the bad economic models. (But in the case of Slashdot the economic model can't support fixing ANY of the problems and annoyances.)
But I do have a constructive sugges
Re: (Score:1, Informative)
Sounds like you're intimately familiar with (other) pedos.
Re: (Score:3, Informative)
He shows up 67 times in pictures with Epstein, more than any other person. He also gave Epstein 14 different phone numbers he could be reached at.
Re: (Score:2)
https://mynbc15.com/news/natio... [mynbc15.com]
The former president continued, saying the release of the documents could pose a hazard to major figures such as former independent presidential candidate Robert F. Kennedy Jr., who openly admitted to having traveled with Epstein on his jet.
To quote Elon Musk: "concerning."
Re: (Score:3)
What an appropriate name (Score:2)
exposed flaws in Bluesky's domain-based verification system, newsletter Tedium reports
Indeed...
Re: (Score:2)
DNS based auth has DNS problems. (Score:4, Informative)
Why would anyone be surprised that DNS-based authentication might suffer from the same problems that DNS has been facing since the 1990s?
Domain squatting is a thing. It's always been a thing. And it will always be a thing until something is done to actually prevent it.
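For context on what "DNS-based authentication" means here: AT Protocol handle verification works by publishing a TXT record of the form did=did:plc:... at _atproto.&lt;handle&gt;, which is checked against the account's DID. The sketch below illustrates only the matching step; the DNS lookup itself is elided (it would need a resolver library such as dnspython), and the record values are hypothetical. The point the comment makes falls out immediately: whoever controls the domain controls the record.

```python
def verify_handle(expected_did: str, txt_records: list[str]) -> bool:
    """Return True if any TXT record asserts the expected DID.

    Minimal sketch of the matching step in AT Protocol handle
    verification: a record 'did=did:plc:...' published under
    _atproto.<handle> is compared against the account's DID.
    """
    for record in txt_records:
        if record.strip() == f"did={expected_did}":
            return True
    return False


# The legitimate owner's check succeeds...
print(verify_handle("did:plc:abc123", ["did=did:plc:abc123"]))  # True

# ...but a squatter who buys the domain can publish a record for
# their own DID instead, and that check succeeds just as well.
print(verify_handle("did:plc:squat99", ["did=did:plc:squat99"]))  # True
```

Nothing in the check distinguishes the person the domain name refers to from the person who happens to own the domain, which is exactly the squatting problem the thread describes.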
Re: (Score:2)
Domain registrar problem (Score:2)
Sounds more like a domain registrar problem than a Bluesky problem. Though in fairness, they are piggybacking their verification process onto a system that was designed with brand names/trademarks in mind rather than personal names.
Re:Domain registrar problem (Score:4, Insightful)
Why would anyone expect people to treat their usernames as brands for which they should own the .com? Especially when most people have different usernames on different sites. It's absolutely a Bluesky problem.
Re: (Score:2)