YouTube's Recommendation System is Criticized as Harmful. Mozilla Wants To Research It (cnet.com) 84
YouTube's video recommendation system has been repeatedly accused by critics of sending people down rabbit holes of disinformation and extremism. Now Mozilla, the nonprofit that makes the Firefox browser, wants YouTube's users to help it research how the controversial algorithms work. From a report: Mozilla on Thursday announced a project that asks people to download a software tool that gives Mozilla's researchers information on what video recommendations people are receiving on the Google-owned platform. YouTube's algorithms recommend videos in the "What's next" column along the right side of the screen, inside the video player after the content has ended, or on the site's homepage. Each recommendation is tailored to the person watching, taking into account things like their watch history, list of channel subscriptions or location. The recommendations can be benign, like another live performance from the band you're watching. But critics say YouTube's recommendations can also lead viewers to fringe content, like medical misinformation or conspiracy theories.
Re: (Score:2)
Say there are cheaper and better electronics and 3D printing videos that aren't on your feed, that you would be interested in, but that aren't considered a good choice.
Or perhaps your feed may be unnecessarily negative towards a particular product, method, or style. For example, after I did some searches on functional programming languages, my recommended feed included a set of anti-OO videos as well. Learning functional languages doesn't mean that you need to be opposed to object-oriented programming. Also if you are tr
Re:How is it harmful? (Score:5, Insightful)
The only point of the recommendation system is to get you to watch more videos. Of course they're going to recommend outrageous bullshit because you're more inclined to watch it.
Re: (Score:2)
Consider two options. One is an algorithm that gives only the most base, shallow, clickbait suggestions (like those awful "promoted stories" on mobile Slashdot). You may get the highest click-rate with it, but you'll quickly become disillusioned.
Consider a second algorithm that has a lower initial click-rate, but instead introduces you to genuinely interesting content when it does catch your interest, leading you to perhaps
Re: How is it harmful? (Score:5, Insightful)
Re: (Score:2)
I think you can do more than just ignore them. I think there's a button with an "X" in the upper right of recommended categories that tells YouTube you're "Not Interested". My YouTube Music subscription also includes YouTube Premium, so I'm not sure if that's something that comes with the upgrade or not.
Re: (Score:2)
What blows my mind is that YouTube provides absolutely no affordance for telling it that you don't want to see ANY videos from a particular channel. Unless I'm missing something, the best you can do (apart from user scripts) is to tell YouTube to remove any PARTICULAR videos from the channel that appear in the recommended list, then click "Tell us why" and say that you didn't like it.
Blocking on social media is a pretty important function, and YouTube is social media.
Re: (Score:3)
I get recommendations for videos from Fox and CNN all the time. I never watch any of them and I doubt it's anything sin
Re: (Score:2)
The ads have gotten so out of hand though
It's weird. On my desktop computer, I get loads of ads, both prerolls (often 2 in a row) and during longer videos. On my laptop - same browser, same OS, same YT account, same LAN - I get almost no ads.
The recommendations were mostly cat videos, "next level" puzzles and road rage videos. Until I clicked no on a few of them. Now it's a lot closer to what I am actually interested in (and hell, I do click on the odd cat video)
Re: (Score:3)
The number of ads seems to increase the more you consume. So you probably watch fewer videos on your laptop.
Re: (Score:2)
Recruiting terrorists. (Score:4, Informative)
The portion of the youtube ui that shows "top videos" are all videos I'm interested in. Electronics, 3d printing, futurism podcasts, video gaming.
Outside the YouTube recommendation system, I ran across a reference to a video on making strong, pure sulfuric acid from epsom salts by electrolysis (using a low-voltage power supply, some carbon rods, and a flowerpot). Being interested in renewable power, including reconditioning lead-acid batteries for energy storage, I watched the video.
When I watched it, and afterward for some weeks, the YouTube recommendation system presented me with a bunch of videos by Saudi Imams.
I'd heard a lot about how Islamic terrorists were recruiting impressionable kids in the US using videos on the internet. But I'd always wondered how the kids got exposed to this stuff if they weren't already looking for it. Now I know. B-b
Re: (Score:3)
I don't watch political/conspiracy stuff on YouTube, but their music recommendation system is about the best I've found. I'm always getting recommendations for new and old stuff I really like, even across a wide variety of genres. It's so good that I switched from Spotify to YouTube Music. Strangely, the recommendation system in YouTube Music is not nearly as good as the one in plain old YouTube. It's better than Spotify, but still needs improvement. I can't figure it out, beca
Re: (Score:2)
It has steadily stopped recommending anything of value to me.
It now only provides three-year-old video clips from the channels I subscribe to. I find it hilarious, and I have to seek out content myself.
Re: (Score:2)
How did this end up in muh recommended?
Re: (Score:2)
This explains the sudden surge in ads: YouTube [youtube.com]
Harmful (Score:5, Insightful)
Youtube looks at what you watch and recommends more videos like it. If you like videos about knitting, it gives you more videos about knitting. If you like videos about flat earth conspiracies, it gives you more videos on flat earth conspiracies. If you like videos about social justice, it gives you more videos about social justice.
The problem people have isn't the recommendation system, it's that they don't want certain things to be recommended. I think it works fine.
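The parent's mental model of the recommender ("more videos like what you watched") can be sketched as a toy content-similarity ranker. This is purely illustrative; the topic tags, titles, and scoring are invented and bear no relation to YouTube's actual system, which is not public:

```python
from collections import Counter
import math

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse topic-count vectors."""
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def recommend(watch_history, catalog, n=2):
    """Rank catalog videos by similarity to the user's aggregate watch history."""
    profile = Counter()
    for video in watch_history:
        profile.update(video["topics"])
    ranked = sorted(catalog,
                    key=lambda v: cosine(profile, Counter(v["topics"])),
                    reverse=True)
    return [v["title"] for v in ranked[:n]]

# Invented example data: a knitting watcher gets more knitting.
history = [{"title": "Cable knitting basics", "topics": ["knitting", "crafts"]}]
catalog = [
    {"title": "Advanced knitting stitches", "topics": ["knitting", "crafts"]},
    {"title": "Flat earth exposed", "topics": ["conspiracy"]},
]
print(recommend(history, catalog, n=1))  # ['Advanced knitting stitches']
```

Under this model the system is neutral: it amplifies whatever topics are already in the history, which is exactly the parent's point.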
Re: Harmful (Score:5, Interesting)
Show of hands, all: who has NOT had YouTube auto-play go from ok to what-the-hell in three videos?
Yah, now what were you saying?
Re: (Score:2)
Show of hands, all: who has NOT had YouTube auto-play go from ok to what-the-hell in three videos?
Yah, now what were you saying?
[Raises hand]
But, that's probably because I don't really utilize YouTube much at all, and when I do, I most certainly do not go to the next video as it's usually a reference video for how to do something specific.
Re: (Score:2)
The Youtube recommendation engine is a case study in radicalization. No matter what subject you pick, from crafts to politics to religion, the recommended videos always take a more radical or extreme position on it. This is done to drive 'engagement' and to show you more advertisements. While getting more involved in, for example, woodworking is not harmful (except maybe to your wallet, as you will be buying tools), getting more involved in a number of other areas is very dangerous, as you are running substan
Re: (Score:1)
Re: (Score:1)
And yet, many people might prefer to experiment with *different* recommendation algorithms, perhaps even one with some human curation in an attempt at raising quality as opposed to finding clickbait. But you champions of the status quo are denying these people the chance to even try such a system.
Re: (Score:2)
Youtube looks at what you watch and recommends more videos like it. If you like videos about knitting, it gives you more videos about knitting...
That's incorrect. Youtube looks at what you watch and recommends videos that it thinks will engage you. It does this because engagement = eyeballs = advertising revenue.
How is "engage" different from "like"? Well, you probably don't like watching videos that make you angry at other people, but the human brain being what it is, they appeal to primal instincts and draw you in and leave you wanting more.
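The parent's distinction between "like" and "engage" can be shown with toy numbers. The titles and scores below are entirely made up for illustration; the point is only that ranking by a predicted engagement signal (e.g. watch time) can surface different content than ranking by predicted enjoyment:

```python
# Hypothetical candidate videos with two invented prediction signals:
# how much the user would *like* each one, and how long it would hold them.
candidates = [
    {"title": "Calm knitting tutorial", "predicted_like": 0.9, "predicted_watch_minutes": 4.0},
    {"title": "Outrage bait rant",      "predicted_like": 0.3, "predicted_watch_minutes": 11.0},
]

def rank_by(videos, key):
    """Sort candidates descending by the given prediction signal."""
    return sorted(videos, key=lambda v: v[key], reverse=True)

by_like = rank_by(candidates, "predicted_like")
by_engagement = rank_by(candidates, "predicted_watch_minutes")

print(by_like[0]["title"])        # Calm knitting tutorial
print(by_engagement[0]["title"])  # Outrage bait rant
```

Same candidates, same user; only the objective changed. A revenue-driven platform has every incentive to optimize the second column.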
Re: (Score:1)
Oh, user likes video games, HAVE YOU HEARD OF MINECRAFT AND FORTNITE, HERE'S THE TOP TWEEN E-CELEBS
It wants to hype you into hype crap. Oh, you like Citizen Kane? You're into cinema? Here's the capeshit of the month! You gotta see some Kardashian's Infinity War opinion!
"Harmful" is trivial compared to how generally garbage it is at "finding". If you need help "finding" the highest-viewed video with the lowest signal:noise ratio on a subject, full of fluff and ego, then... I guess you're in the right place. Youtube is about
Re: (Score:2)
The problem is how it determines what videos are like the ones you watched. If you watch a video about coronavirus safety it thinks videos about 5g conspiracies and anti-mask bullshit are similar and gives you those.
People know this, so they try to get their conspiracy videos associated with all kinds of random things. In fact, a lot of the conspiracies are just them noticing that videos on X are popular and asking what crap they can make up about X to get those recommendations.
Re: (Score:2)
Nope, that's not the way it works for me at all. I'm interested in history, space, engineering, and science and tend to watch those sort of videos, interspersed with EDM concerts and Andean folk music. What are their recommendations? Ancient aliens, phantasmagorical Atlanteans building Inca and Egyptian megaliths, Christian and Muslim apologetics, and most bizarrely of all Trump videos. I've tried clicking through to down vote them hoping to clean up the feed, but to no avail.
Re: (Score:2)
Ads should be equipped with a button saying "apply 50kV to the people who posted this".
My grandmother always used to say "the adverts speak very highly of it,
Re: (Score:2)
But if you like videos about science, it sends you 50% unscientific codswallop. At the very least there needs to be a button labeled "I don't ever want to see this video again", and possibly one saying "I think this should be sent for psychological investigation". If more than 10,000 people press the button, it might be time for a human to investigate.
I mostly watch movie reviews and science/engineering videos on Youtube, and I hardly get any conspiracy or pseudoscientific nonsense videos. It recommended a couple to me once, and I clicked on the (X) button saying I don't like those videos, and I don't think I've seen one pop up in my feed in years. I'm even subscribed to a couple of channels that cover conspiracy theories (more from a critical angle) and those videos still don't pop up in my recommendations. Getting inundated with nonsense videos hasn't
Re:Harmful (Score:4, Interesting)
Re: (Score:2)
About 20% of their recommendations for me are videos I have already watched.
About 40% are similar to stuff I've already watched, most of which I only watched about 30 seconds of.
About 20% have nothing whatsoever to do with anything I have ever watched. Extremist wingnut stuff would be a nice change of pace, but I get cooking videos.
The remaining 20% are Gabriel Iglesias ("FLUFFY!"), all lined up in a row (most of which I've already watched.)
What users? (Score:2)
Mozilla wants its users to research YouTube's algorithms.
What users?
Re: (Score:2)
If they keep distracting themselves with irrelevant nonsense like this, they won't have to worry about those pesky users for much longer. This is insanity. What in the world could lead them to think that this should be a mission-critical task that deserves paid development time and attention?
YouTube and other platforms have also drawn blowback for helping to spread the QAnon conspiracy theory, which baselessly alleges that a group of "deep state" actors, including cannibals and pedophiles, are trying to bring down President Donald Trump. The stakes will continue to rise over the coming weeks, as Americans seek information online ahead of the US presidential election.
Ah, got it. It's political.
Well, good for you, Mozilla. Glad to see you're busy re-arranging those deck chairs while your flagship sinks from underneath you.
Re: (Score:2)
Mozilla wants ... users....
You have to read between the lines.
Fixing what. (Score:2, Insightful)
The cure is even more harmful than the disease.
Mozilla, please once again, stop being worried about politics and whatever keeps people busy. Focus on what you should do: build a browser. I understand you also hired a bunch of SJW's. I'm not suggesting to fire them, merely ignore them and their hobbies and tell them to go f*g back to work.
Re: (Score:1)
The cure is even more harmful than the disease.
Facts not in evidence.
Without knowing YT's algorithm, you do not know the severity of the disease.
Without knowing YT's algorithm, you do not know the consequences of the cure.
Yet you get modded insightful... bullshit. The opposite of that.
Re: (Score:3)
The cure is even more harmful than the disease.
Not necessarily. If the cure is to remove all objectionable content, then yes, it is worse than the disease. If the cure is to change how recommendations are done, to limit unintentional exposure to extreme content, then it could work.
The goal should be not to censor, say flat earth society, but to stop showing flat earth videos to people looking for sunset videos.
Re: (Score:3, Insightful)
to limit unintentional exposure to extreme content
There, you just defined a political goal. It doesn't matter whether the majority agrees on it; it's still a political goal. You want to decide for others what they can and cannot see, in this case unintentional exposure of people to extreme content. And it's up to (whoever) to define what "extreme" is. See where this is going?
Youtube can moderate their content as they see fit; after all, it's their content. Others might give them feedback on that, sure. But most companies stay away from politically sensitive subjects unless it's th
Re: (Score:2)
I want a recommendation system that gives me what I'm interested in, no matter what that may be. I don't want it to try to manipulate me away from my interests, which it now does. Currently it also has a memory problem with my subscriptions. It only shows recommendations related to what I've watched within the last few days. It never checks my subscription list and thinks, "Hm, he hasn't watched a video from this channel in a while and it does have new videos; I should recommend a new video from that channel." Recommen
Re: (Score:3)
The problem isn't removing objectionable content, it's who gets to decide what's objectionable.
Because that will always be people who want to decide what's objectionable for everyone else. And those are the last people who should be allowed to.
Re: (Score:3)
But, lately they've been turning more and more into politics and I don't like it. I might even stop donating over it. Mozilla should be able to fulfill its mission without going so much into politics.
Re: (Score:3)
Mozilla has been supportive of efforts to make the web better for users for a long time. From the early days of helping ad blockers be more efficient to being the first to block 3rd party cookies...
Making YouTube better seems like a good thing to be doing.
This has nothing to do with SJWs (Score:2)
Though it's funny that whenever anyone questions if bad things are happening on a platform the immediate assumption is that those bad things involve people being exposed to extreme right wing politics. Nobody ever says "Man, I don't want people watching those crazy Breadtube videos".
Re: This has nothing to do with SJWs (Score:1)
Working on the browser? (Score:2)
How about they do some work on the web browser instead of worrying about which youtube video is queued? Does everyone there have ADD?
Oh Really? (Score:1)
"But critics say YouTube's recommendations can also lead viewers to fringe content, like medical misinformation or conspiracy theories. "
Like posts from MSNBC, NBC, ABC, CBS, CNN, PBS, Fox News, etc.
I've seen more medical misinformation on these "news" casts than anywhere.
Re: (Score:2)
Any given source on any given day has a reasonable chance of containing misinformation. But in aggregate, across many news sources correcting themselves and each other, we get a fairly accurate news summary over time. The problem is that the YouTube algorithm tends to narrow rather than broaden the research scope of a given like. What I would want for myself is that if I like a video on flat earth, YouTube would give me more videos about geography, not just more videos about flat earth. It does me no good to
Re: Oh Really? (Score:2)
If you're interested in flat Earth, you probably aren't interested in geography.
Re: (Score:2)
That, I think, is exactly the flaw in the algorithm. It makes assumptions about what will be interesting qualitatively instead of just finding associated topics.
Re: Oh Really? (Score:1)
Re: (Score:2)
I strongly disagree. We have the First Amendment specifically to keep government out of the censorship business. I would far rather the situation we have in the USA today than the situation in China today. Disinformation can be fought without killing people. Government enforced silence cannot be fought except with civil war.
Re: (Score:2)
If you want depth, read your news. Video or audio cannot convey depth nearly that well because you're limited by time and you can't skim a video.
It's probably true that consumer news is dumbed down so much that it's inaccurate, especially medical news, because most new discoveries do not establish new facts; at best, maybe a new theory. In trying to distill the most interesting part, they leave out the uncertainty.
Re: Oh Really? (Score:1)
Re: (Score:2)
I can see why you'd like video or audio. You are completely incoherent in text form.
Ed Snowden says... (Score:2)
Check out the recent JRE. Ed describes these people who complain about the suggestions as being responsible for the culture of censorship and deplatforming that we so decry.
Mozilla has recently been flexing itself as a bastion of illiberal thought. It's so heart-wrenching for those who have been following them since UIUC.
I like that Ed has the time to think deeply about these issues during his exile. Too bad the modern Solzhenitsyn is in hiding from the USA rather than the USSR!
Re: (Score:3)
Check out the recent JRE.
8u261? I mean, yeah - it's interesting that they've had 261 releases on the same version number, but it's not particularly interesting aside from TLS 1.3 support.
Re: Ed Snowden says... (Score:1)
Waste of resources (Score:1)
Re:Waste of resources (Score:4, Insightful)
Yes, but the evidence is: if you give them more money, they will spend it on making the browser worse.
The Social Dilemma (Score:5, Insightful)
Re: (Score:3)
It's a tricky set of issues though, and doing something about it would put us all in very new territory, and Mozilla is one of the least competent organizations I can think of to do an investigation in a new field.
I hate to sound like one of our conservative friends-- championing the freedumb of corporations to do to us whatever they want-- but it would be kind of cool if Mozilla prioritized working on the browser, and stopped fucking around with bold new initiatives.
But just to be clear, "working on th
Re: (Score:2)
But just to be clear, "working on the browser" does not require changing the UI every two weeks.
Of COURSE it doesn't. Who'd want to wait 14 long, entire days just to not be able to locate another icon or function? Just treat it as a never-ending treasure search -- every update brings a brand new surprise!
UI "programmers" are the original universal basic income example. You might pay them initially to do a job but then you keep ON paying them to NOT do their job. Because getting fired is just so archaic, don't-ya'-know?
Egypt? (Score:4)
Re: (Score:3)
Re: regulatings all thoughts and speech (Score:1)
It's completely broken (Score:2)
I sometimes spend hours on certain days clicking 'don't recommend channel', but I always get the same crap.
I'm a 64-year-old male Green Socialist from Europe, and it thinks I'm a right-wing Republican who likes Fox News, Breitbart, manga, anime (I don't), French bulldog farts, guns, revolvers, silencers (I don't like those either), heavy makeup (sic), and lots of stuff in languages I don't comprehend, much less the alphabets they are written in, even though my VPN uses my home country as its exit point.
Intangibles and tech giant blindness (Score:4, Insightful)
There's nothing wrong with the crazy whacko recommendations on Youtube. They may well be linked to each other in ways users think are spooky or they may appear to be really off-the-wall sometimes, but as long as nobody is intentionally manipulating them they should remain. The people who find them upsetting and think people are being steered to "wrong" information have the same blindness that the tech giants have - and it's a widespread blindness in the tech community generally:
It's one thing to see a bunch of raw data, find links between it, sort it, categorize it, add up numbers about it, and so on, but it's an entirely different yet vital thing to know the WHYs of it. Joe can look at a conspiracy site because he's a moron who is easily sucked into it, but Sam might go because he's curious about the nonsense his neighbor seems to believe, and Susan can go there because she's had a bad day and needs a laugh. Edna might go to a neo-Nazi site or video because she has a fetish for guys in tall black boots and uniforms, George might because he is a loner who fancies himself a superior Aryan, but Mike might because he's a history buff who followed a link on an academic site that led him down a rabbit hole, and he's looking at the modern idiots involved in this stuff. Jeanette might look because she overheard something some kid said to her kid, and Erik might because he had grandparents in the Holocaust and is hyper-vigilant, yet Betty might look because she's writing a novel and looking for ideas to plug a plot hole. Mike might dive into Moon-hoax or 9/11 or chemtrail lunacy because he's fallen for it, or because he's looking to steer his teenage kid away from it, or because he thinks it's LOL funny.
Mel Brooks has said that the reason he played with the Nazi stuff in The Producers was that the best way he could oppose that stuff was to make fun of it and encourage people to laugh at it. Charlie Chaplin's The Great Dictator was similar. Both men's work has historically been misinterpreted by some people, and certainly if they had done their work in the internet era, both the work itself and any web browsing histories they might have created while doing their research and writing could have been completely misunderstood. Web browsing by any of their cast, crew, set decorators, wardrobe department, etc., along the way would be similarly fraught with hazard. This recent idea that a person's history of media consumption is up for analysis is a bad one. It was not too many years ago that the left in the USA was hyper-sensitive about the book checkout histories of library users; we actually had political fights about this stuff back then, and it's probably true that people on the right were not sufficiently concerned about the possible misinterpretations and abuses of it, but now the matter seems entirely upended.
As in many things in life, across many fields, the WHY of a thing is often far more important than the who, where, when, or what of it. The WHY is also the one part that is impossible to glean from data mining; it's simply unreachable if you cannot actually read a person's mind. A pattern of a person studying something he abhors, or laughing at something he thinks is insane, can be indistinguishable from the pattern of a person enthralled, particularly given the all-too-human tendency to multi-task. A person who is at one moment looking at something seriously can at another moment seek levity or satisfy a curiosity before returning to a serious bit. You might think you can get to a person's motives by looking at other stuff they have read or looked at, but that would be incorrect. There's simply no algorithm here that a quasi-cyborg like Zuckerberg and his army of minions can turn to and implement in code that will read a mind or analyze a soul. No amount of AI will solve this for big tech; it will always lead to bad analysis and wrong conclusions.
The opt-in[or out] problem (Score:2)
The problem with an option to "opt in" to wild conspiracy crap, or "explicit violence" or "white supremacist" junk is that you still leave it to the tech titans to decide what stuff is in those buckets. AFAIK Facebook has not yet fixed their censorship (they would say some bull like "content moderation") board which was initially filled with left wing Democrats - not a single Republican, a single "pro-life" person, or a single religious Christian or Jew. That fixed-in-concrete left-leaning bias is how abomi
What we really want is a good search system. (Score:1)
What we really need is a platform with
Yes... (Score:2)
YouTube has changed (Score:2)
I've broken the algorithms after using YouTube for years, liking thousands of videos and subscribing to hundreds of channels. At this point 50% of what YouTube recommends to me are videos that are 3 or more years old, and often from abandoned channels that have not posted new content in years.
If I log out and do private browsing with YouTube, I see a very different picture. Lots of misleading titles designed to get you to click. Lots of duplicate content from content farms. Videos with a thumbnail preview th
YouTube knows I am a 40 year old male virgin (Score:2)