YouTube's Recommendation System is Criticized as Harmful. Mozilla Wants To Research It (cnet.com)

YouTube's video recommendation system has been repeatedly accused by critics of sending people down rabbit holes of disinformation and extremism. Now Mozilla, the nonprofit that makes the Firefox browser, wants YouTube's users to help it research how the controversial algorithms work. From a report: Mozilla on Thursday announced a project that asks people to download a software tool that gives Mozilla's researchers information on what video recommendations people are receiving on the Google-owned platform. YouTube's algorithms recommend videos in the "What's next" column along the right side of the screen, inside the video player after the content has ended, or on the site's homepage. Each recommendation is tailored to the person watching, taking into account things like their watch history, list of channel subscriptions or location. The recommendations can be benign, like another live performance from the band you're watching. But critics say YouTube's recommendations can also lead viewers to fringe content, like medical misinformation or conspiracy theories.
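
To make those signals concrete, here is a minimal sketch of a personalized ranker in Python. The fields, weights, and scoring function are invented stand-ins for illustration only; YouTube's actual algorithm is not public.

```python
# Hypothetical sketch of a personalized ranker using the signals named above
# (watch history, channel subscriptions, location). All fields and weights
# are invented; this is not YouTube's real system.
from dataclasses import dataclass

@dataclass
class Viewer:
    watch_history: set[str]   # topic tags of previously watched videos
    subscriptions: set[str]   # subscribed channel ids
    location: str

@dataclass
class Video:
    video_id: str
    channel: str
    topics: set[str]
    region: str

def personalization_score(viewer: Viewer, video: Video) -> float:
    """Toy score: topic overlap, plus boosts for subscription and locality."""
    topic_overlap = len(viewer.watch_history & video.topics)
    subscribed = 2.0 if video.channel in viewer.subscriptions else 0.0
    local = 0.5 if video.region == viewer.location else 0.0
    return topic_overlap + subscribed + local

def whats_next(viewer: Viewer, candidates: list[Video], k: int = 5) -> list[Video]:
    """Fill a toy 'What's next' column with the top-k scored candidates."""
    return sorted(candidates,
                  key=lambda v: personalization_score(viewer, v),
                  reverse=True)[:k]
```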
  • Harmful (Score:5, Insightful)

    by JBMcB ( 73720 ) on Thursday September 17, 2020 @11:06AM (#60515534)

    Youtube looks at what you watch and recommends more videos like it. If you like videos about knitting, it gives you more videos about knitting. If you like videos about flat earth conspiracies, it gives you more videos on flat earth conspiracies. If you like videos about social justice, it gives you more videos about social justice.

    The problem people have isn't the recommendation system, it's that they don't want certain things to be recommended. I think it works fine.

    • Re: Harmful (Score:5, Interesting)

      by ToasterMonkey ( 467067 ) on Thursday September 17, 2020 @11:44AM (#60515732) Homepage

      Show of hands all, who has NOT had YouTube auto-play go from ok to what the hell in three videos.

      Yah, now what were you saying?

      • by aitikin ( 909209 )

        Show of hands all, who has NOT had YouTube auto-play go from ok to what the hell in three videos.

        Yah, now what were you saying?

        [Raises hand]

        But, that's probably because I don't really utilize YouTube much at all, and when I do, I most certainly do not go to the next video as it's usually a reference video for how to do something specific.

      • by sinij ( 911942 )
        Raises hand.

        The Youtube recommendation engine is a case study in radicalization. No matter what subject you pick, from crafts to politics to religion, the recommended videos always take a more radical or extreme position on it. This is done to drive 'engagement' and to show you more advertisements. While getting more involved in, for example, woodworking is not harmful (except maybe to your wallet, as you will be buying tools), getting more involved in a number of other areas is very dangerous, as you are running substan
        • The conspiratorial thinking is the thinking that says: Hmmm. This here video site is showing people links to stuff I know isn't true. This is dangerous. To combat the danger we should put the power over what videos can be seen in the hands of the same people who can send people with guns to kick down your front door. All in the name of protecting your freedom, of course.
    • by doom ( 14564 )

      And yet, many people might prefer to experiment with *different* recommendation algorithms, perhaps even one with some human curation, in an attempt at raising quality as opposed to chasing clickbait. But you champions of the status quo are denying these people the chance to even try such a system.

    • by ljw1004 ( 764174 )

      Youtube looks at what you watch and recommends more videos like it. If you like videos about knitting, it gives you more videos about knitting...

      That's incorrect. Youtube looks at what you watch and recommends videos that it thinks will engage you. It does this because engagement = eyeballs = advertising revenue.

      How is "engage" different from "like"? Well, you probably don't like watching videos that make you angry at other people, but the human brain being what it is, they appeal to primal instincts and draw you in and leave you wanting more.

      • by Anonymous Coward

        Oh, user likes video games, HAVE YOU HEARD OF MINECRAFT AND FORTNITE, HERE'S THE TOP TWEEN E-CELEBS

        It wants to hype you into hype crap. Oh, you like Citizen Kane? You're into cinema? Here's the capeshit of the month! You gotta see some Kardashian's Infinity War opinion!

        "Harmful" is trivial to how generally garbage it is at "finding". If you need help "finding" the highest-viewed video with the lowest signal:noise on a subject, full of fluff and ego, then... I guess you're in the right place. Youtube is about

    • by AmiMoJo ( 196126 )

      The problem is how it determines what videos are like the ones you watched. If you watch a video about coronavirus safety it thinks videos about 5g conspiracies and anti-mask bullshit are similar and gives you those.

      People know this, so they try to get their conspiracy videos associated with all kinds of random things. In fact, a lot of the conspiracies are just people noticing that videos on X are popular and asking what crap they can make up about X to get those recommendations.

    • by cusco ( 717999 )

      Nope, that's not the way it works for me at all. I'm interested in history, space, engineering, and science, and tend to watch those sorts of videos, interspersed with EDM concerts and Andean folk music. What are their recommendations? Ancient aliens, phantasmagorical Atlanteans building Inca and Egyptian megaliths, Christian and Muslim apologetics, and, most bizarrely of all, Trump videos. I've tried clicking through to downvote them, hoping to clean up the feed, but to no avail.

    • But if you like videos about science, it sends you 50% unscientific codswallop. At the very least there needs to be a button labeled "I don't ever want to see this video again", and possibly one saying "I think it should be sent for psychological investigation". If more than 10,000 people press the button, it might be time for a human to investigate.

      Ads should be equipped with a button saying "apply 50kV to the people who posted this".

      My grandmother always used to say "the adverts speak very highly of it,

      • by JBMcB ( 73720 )

        But if you like videos about science, it sends you 50% unscientific codswallop. At the very least there needs to be a button labeled "I don't ever want to see this video again", and possibly one saying "I think it should be sent for psychological investigation". If more than 10,000 people press the button, it might be time for a human to investigate.

        I mostly watch movie reviews and science/engineering videos on Youtube, and I hardly get any conspiracy or pseudoscientific nonsense videos. It recommended a couple to me once, and I clicked on the (X) button saying I don't like those videos, and I don't think I've seen one pop up in my feed in years. I'm even subscribed to a couple of channels that cover conspiracy theories (more from a critical angle) and those videos still don't pop up in my recommendations. Getting inundated with nonsense videos hasn't

    • Re:Harmful (Score:4, Interesting)

      by iampiti ( 1059688 ) on Thursday September 17, 2020 @01:01PM (#60516078)
      Agree. Youtube's recommendations are usually very good, and about things I'm interested in. I don't think I've ever been suggested a video about extremist politics or conspiracy theories, and I suspect that if that happened, such suggestions would go away if I just ignored them. Also, I'm an adult; I don't need babysitting.
    • by taustin ( 171655 )

      About 20% of their recommendations for me are videos I have already watched.

      About 40% are similar to stuff I've already watched, most of which I only watched about 30 seconds of.

      About 20% have nothing whatsoever to do with anything I have ever watched. Extremist wingnut stuff would be a nice change of pace, but I get cooking videos.

      The remaining 20% are Gabriel Iglesias ("FLUFFY!"), all lined up in a row (most of which I've already watched.)

  • Mozilla wants its users to research YouTube's algorithms.

    What users?

    • If they keep distracting themselves with irrelevant nonsense like this, they won't have to worry about those pesky users for much longer. This is insanity. What in the world could lead them to think that this should be a mission-critical task that deserves paid development time and attention?

      YouTube and other platforms have also drawn blowback for helping to spread the QAnon conspiracy theory, which baselessly alleges that a group of "deep state" actors, including cannibals and pedophiles, are trying to bring down President Donald Trump. The stakes will continue to rise over the coming weeks, as Americans seek information online ahead of the US presidential election.

      Ah, got it. It's political.

      Well, good for you, Mozilla. Glad to see you're busy re-arranging those deck chairs while your flagship sinks from underneath you.

    • Mozilla wants ... users....

      You have to read between the lines.

  • Fixing what. (Score:2, Insightful)

    by xonen ( 774419 )

    The cure is even more harmful than the disease.

    Mozilla, please, once again: stop worrying about politics and whatever keeps people busy. Focus on what you should do: build a browser. I understand you also hired a bunch of SJWs. I'm not suggesting firing them, merely ignoring them and their hobbies and telling them to get f*ing back to work.

    • The cure is even more harmful than the disease.

      Facts not in evidence.

      Without knowing YT's algorithm, you do not know the severity of the disease.
      Without knowing YT's algorithm, you do not know the consequences of the cure.

      Yet you get modded insightful... bullshit. The opposite of that.

    • by sinij ( 911942 )

      The cure is even more harmful than the disease.

      Not necessarily. If the cure is to remove all objectionable content, then yes, it is worse than the disease. If the cure is to change how recommendations are done, to limit unintentional exposure to extreme content, then it could work.

      The goal should be not to censor, say flat earth society, but to stop showing flat earth videos to people looking for sunset videos.
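
      A minimal sketch of what that could look like, assuming a hypothetical "fringe" topic tag: a recommendation carrying a fringe topic passes only when the viewer explicitly searched for that topic themselves. The tagging scheme and thresholds are invented policy choices, not anything YouTube does.

```python
# Hedged sketch of the proposal above: don't remove fringe content, just
# never *recommend* it to viewers who haven't sought it out. Topic names
# and the FRINGE_TOPICS set are hypothetical.
FRINGE_TOPICS = {"flat-earth"}

def may_recommend(video_topics: set[str], explicitly_searched: set[str]) -> bool:
    """Allow a recommendation unless it carries fringe topics the viewer
    never explicitly searched for."""
    fringe = video_topics & FRINGE_TOPICS
    return fringe <= explicitly_searched  # an empty fringe set always passes

# Someone looking for sunsets is never steered to flat-earth videos...
print(may_recommend({"sunsets", "flat-earth"}, {"sunsets"}))    # False
# ...but someone who deliberately searched for flat earth still finds them.
print(may_recommend({"flat-earth"}, {"flat-earth"}))            # True
```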

      • Re: (Score:3, Insightful)

        by xonen ( 774419 )

        to limit unintentional exposure to extreme content

        There, you just defined a political goal. It doesn't matter whether the majority agrees on it; it's a political goal. You want to decide for others what they can and cannot see, in this case people's unintentional exposure to extreme content. And it's up to (whoever) to define what extreme is... See where this is going?

        Youtube can moderate their content as they see fit; after all, it's their content. Others might give them feedback on that, sure. But most companies stay away from politically sensitive subjects unless it's th

        • I want a recommendation system that gives me what I'm interested in, no matter what that may be. I don't want it to try to manipulate me away from my interests, which it now does. Currently it also has a memory problem with my subscriptions. It only shows recommendations related to what I've watched within the last few days. It never checks my subscription list and thinks, "Hm, he hasn't watched a video from this channel in a while and it does have new videos; I should recommend a new video from that channel." Recommen

      • by taustin ( 171655 )

        The problem isn't removing objectionable content, it's who gets to decide what's objectionable.

        Because that will always be people who want to decide what's objectionable for everyone else. And those are the last people who should be allowed to.

    • I donate a small amount every year to Mozilla because I like Firefox and I think it's worth it to keep alive an open source browser that isn't controlled by Google, Microsoft, or Apple.
      But lately they've been turning more and more toward politics, and I don't like it. I might even stop donating over it. Mozilla should be able to fulfill its mission without going so much into politics.
    • by AmiMoJo ( 196126 )

      Mozilla has been supportive of efforts to make the web better for users for a long time. From the early days of helping ad blockers be more efficient to being the first to block 3rd party cookies...

      Making YouTube better seems like a good thing to be doing.

    • This has nothing to do with fixing anything, and everything to do with getting you to install a plugin so they can research video-watching habits.

      Though it's funny that whenever anyone questions if bad things are happening on a platform the immediate assumption is that those bad things involve people being exposed to extreme right wing politics. Nobody ever says "Man, I don't want people watching those crazy Breadtube videos".
      • Hi. Yeah, indeed, the sickness of much of Google, YouTube, Twitter, and Facebook is that they are mostly overpowered by SJWs, probably YouTube to the least extent. For how social media and big tech generally should be unbiased, see DuckDuckGo, BitChute, and Parler. If Twatter, Facepalm, and Giggle allowed as much conservatism as those do, and as much as the far-leftism they already allow, I'd be more than happy. But here we're riding slightly offtopic. As for whether YT recommendations are harmful? Hmm, I'm not 100% sure they are, as long as they didn't
  • How about they do some work on the web browser instead of worrying about which youtube video is queued? Does everyone there have ADD?

  • "But critics say YouTube's recommendations can also lead viewers to fringe content, like medical misinformation or conspiracy theories. "

    Like posts from MSNBC, NBC, ABC, CBS, CNN, PBS, Fox News, etc.

    I've seen more medical misinformation on these "news" casts than anywhere.

    • Any given source on any given day has a reasonable chance of containing misinformation. But in aggregate, across many news sources correcting themselves and each other, we get a fairly accurate news summary over time. The problem is that the YouTube algorithm tends to narrow rather than broaden the research scope of a given like. What I would want for myself is that if I like a video on flat earth, YouTube would give me more videos about geography, not just more videos about flat earth. It does me no good to

      • If you're interested in flat Earth, you probably aren't interested in geography.

        • That, I think, is exactly the flaw in the algorithm. It makes assumptions about what will be interesting qualitatively instead of just finding associated topics.
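
        Sketched below: a toy version of the "associated topics" idea from this sub-thread, where a watched topic maps to its siblings in a hypothetical topic taxonomy instead of to more of itself. The taxonomy and topic names are invented for illustration.

```python
# Toy sketch of "broaden, don't narrow": instead of recommending more of the
# same topic, look up the topic's siblings under the same parent category.
# The taxonomy here is invented, not a real YouTube data structure.
TAXONOMY = {
    "earth-science": ["flat-earth", "geography", "geology", "cartography"],
    "fiber-arts":    ["knitting", "crochet", "weaving"],
}

def associated_topics(watched_topic: str) -> list[str]:
    """Return the watched topic's siblings rather than the topic itself."""
    for children in TAXONOMY.values():
        if watched_topic in children:
            return [t for t in children if t != watched_topic]
    return []

print(associated_topics("flat-earth"))
# ['geography', 'geology', 'cartography'] -- geography instead of more flat earth
```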

      • Indeed, if we wanted less polarization of society, less far-leftism, and less dumbing down of the general public, that is what big tech would be doing. Yet the free market is at work again. They want your eyeballs and they want your happiness. The fukin planet might be burning up and about to explode, but they'll just keep you happy. What I can say with around 90% certainty is that leaving big tech to the free market was the biggest mistake of the last two decades or so. They could work better, but they would have to run the way DuckDuck
        • I strongly disagree. We have the First Amendment specifically to keep government out of the censorship business. I would far rather have the situation we have in the USA today than the situation in China today. Disinformation can be fought without killing people. Government-enforced silence cannot be fought except with civil war.

    • If you want depth, read your news. Video or audio cannot convey depth nearly that well because you're limited by time and you can't skim a video.

      It's probably true that consumer news is dumbed down so much that it's inaccurate, especially medical news, because most new discoveries do not establish new facts; at best, maybe a new theory. In trying to distill the most interesting part, they leave out the uncertainty.

        • I will refrain from commenting on the scientific merits of video and audio; I don't get your comment and don't want to argue if there's no need. However, on the expressiveness of video especially, but audio to some extent too, I do hold a different view. In times gone by, a lot of video content was created, both in my country and in the UK, and maybe in the US as well, whose expressive power, owing to the immersiveness of the medium, was much higher than that of the written medium. One 1-hour episode of a good video podcast can transfer much more knowledge than a few scient
  • Check out the recent JRE. Ed describes these people who complain about the suggestions as being responsible for the culture of censorship and deplatforming that we so decry.

    Mozilla has recently been flexing itself as a bastion of illiberal thought. It's so heart-wrenching for those who have been following them since UIUC.

    I like that Ed has the time to think deeply about these issues during his exile. Too bad the modern Solzhenitsyn is in hiding from the USA rather than the USSR!

    • Check out the recent JRE.

      8u261? I mean, yeah - it's interesting that they've had 261 releases on the same version number, but it's not particularly interesting aside from TLS 1.3 support.

    • Point me to what's JRE. Indeed, I agree Mozilla could do something useful by tracking conspiracy theories, extremism, and far-leftism on platforms. Even Slashdot could be useful this way. We'd comment on content that passes on Twatter, Facepalm, and Giggle, and we'd comment on content that got censored out. If I'm not mistaken, had such a tool been created before Nov 3, we could change the way things will probably unroll in the coming US election, as well as crush the unfixable cavity in big tech products. Already thinking
  • Shouldn't they be spending the donations they get from their users on something useful, like, you know, making their browser better?
  • The Social Dilemma (Score:5, Insightful)

    by 89cents ( 589228 ) on Thursday September 17, 2020 @11:30AM (#60515658)
    This exact problem is described in the documentary The Social Dilemma. https://www.imdb.com/title/tt1... [imdb.com] I'd recommend it. Basically, it shows how tech companies' algorithms, designed to keep your attention so they can profile, track, and monetize you for the highest bidder, are damaging to individuals and society.
    • by doom ( 14564 )

      It's a tricky set of issues though, and doing something about it would put us all in very new territory, and Mozilla is one of the least competent organizations I can think of to do an investigation in a new field.

      I hate to sound like one of our conservative friends-- championing the freedumb of corporations to do to us whatever they want-- but it would be kind of cool if Mozilla prioritized working on the browser, and stopped fucking around with bold new initiatives.

      But just to be clear, "working on th

      • But just to be clear, "working on the browser" does not require changing the UI every two weeks.

        Of COURSE it doesn't. Who'd want to wait 14 long, entire days just to not be able to locate another icon or function? Just treat it as a never-ending treasure search -- every update brings a brand new surprise!

        UI "programmers" are the original universal basic income example. You might pay them initially to do a job but then you keep ON paying them to NOT do their job. Because getting fired is just so archaic, don't-ya'-know?

  • by Arthur, KBE ( 6444066 ) on Thursday September 17, 2020 @11:52AM (#60515770)
    It seems no matter what type of content I'm watching, within two videos, nearly all the recommendations have something to do with ancient Egypt. This isn't something I've searched out, and it happens even without being logged in.
  • Some days I spend hours clicking 'don't recommend channel', but I always get the same crap.

    I'm a 64-year-old male Green Socialist from Europe, and it thinks I'm a right-wing Republican who likes Fox News, Breitbart, manga, anime (I don't), french bulldog farts, guns, revolvers, silencers (I don't like those either), heavy makeup (sic), and lots of stuff in languages I don't comprehend, much less the alphabets they are written in, even though my VPN uses my home country as its exit point.

  • by tiqui ( 1024021 ) on Thursday September 17, 2020 @12:40PM (#60516000)

    There's nothing wrong with the crazy whacko recommendations on Youtube. They may well be linked to each other in ways users think are spooky or they may appear to be really off-the-wall sometimes, but as long as nobody is intentionally manipulating them they should remain. The people who find them upsetting and think people are being steered to "wrong" information have the same blindness that the tech giants have - and it's a widespread blindness in the tech community generally:

    It's one thing to see a bunch of raw data, find links between it, sort it, categorize it, add up numbers about it, and so on, but it's an entirely different yet vital thing to know the WHYs of it. Joe can look at a conspiracy site because he's a moron who is easily sucked into it, but Sam might go because he's curious about the nonsense his neighbor seems to believe, and Susan can go there because she's had a bad day and needs a laugh. Edna might go to a neo-NAZI site or vid because she has a fetish for guys in tall black boots and uniforms, George might because he is a loner who fancies himself a superior Aryan, but Mike might because he's a history buff who followed a link on an academic site that led him down a rabbit hole and he's looking at the modern idiots involved in this stuff. Jeanette might look because she overheard something some kid said to her kid, and Erik might because he had grandparents in the Holocaust and is hyper-vigilant, yet Betty might look because she's writing a novel and looking for ideas to plug a plot hole. Mike might dive into Moon Hoaxerism or 9-11 or chemtrails lunacy because he's fallen for it, or because he's looking to steer his teenage kid away from it, or because he thinks it's LOL funny.

    Mel Brooks has said that the reason he played with the NAZI stuff in The Producers was that the best way he could oppose that stuff was to make fun of it and encourage people to laugh at it. Charlie Chaplin's little dictator was similar. Both men's work has historically been misinterpreted by some people, and certainly if they had done their work in the internet era, both the work itself and any web browsing histories they might have created while doing their research and writing could have been completely misunderstood. Web browsing by any of their cast, crew, set decorators, wardrobe department, etc., along the way would be similarly fraught with hazard. This recent idea that a person's history of media consumption is up for analysis is a bad one. It was not too many years ago that the left in the USA was hyper-sensitive about the book checkout histories of library patrons; we actually had political fights about this stuff back then, and it's probably true that people on the right were not sufficiently concerned about the possible misinterpretations and abuses of it, but now the matter seems entirely upended.

    As in many things in life, across many fields, the WHY of a thing is often far more important than the who, where, when, or what of a thing. The WHY is also the one part of this that is impossible to glean from data mining; it's simply impossible to get to if you cannot actually read a person's mind. A pattern of a person studying something he abhors, or laughing at something he thinks is insane, can be indistinguishable from the pattern of a person enthralled, particularly given the all-too-human tendency to multi-task. A person who is at one moment looking at one thing seriously can at another moment seek levity or satisfy a curiosity before returning to a serious bit. You might think you can get at a person's motives by looking at other stuff they have read or looked at, but that would be incorrect. There's simply no algorithm here that a quasi-cyborg like Zuckerberg and his army of minions can turn to and implement in code that will read a mind or analyze a soul. No amount of AI will solve this for big tech; it will always lead to bad analysis and wrong conclusions.

  • There is no possible scenario in which a recommendation system provides a good experience for the user. Recommendations are not about what you do or don't like; they're about amplification. It's all about what you're most likely to click on, so you get served ads and they make money. Simple as that. It is and always has been about profit. The recommendation models have been trained with this specific goal in mind. The silo effect is just a byproduct of profit-greedy algorithms.

    What we really need is a platform with
  • I rarely, if ever, want to watch whatever is next, simply because 99.999(not sure how many more 9s)% of videos on YouTube are garbage. When they do correctly predict a video I would have liked to watch, it's usually something I have already watched or it covers the exact same topic. The viewer stack then quickly diverts to something like kids dropping mentos into coke, which was exciting for me for about 1 hour 20 years ago.
  • I've broken the algorithms after using YouTube for years, liking thousands of videos and subscribing to hundreds of channels. At this point 50% of what YouTube recommends to me are videos that are 3 or more years old, and often from abandoned channels that have not posted new content in years.

    If I log out and do private browsing with YouTube, I see a very different picture. Lots of misleading titles designed to get you to click. Lots of duplicate content from content farms. Videos with a thumbnail preview th

  • I watch a lot of Jordan Peterson videos and lately I have been getting videos from Better Bachelor which is basically MGTOW RedPill kind of stuff. It's really not healthy to watch the redpill/MGTOW/incel stuff but it's like jelly donuts in the break room, they know my weaknesses and they are exploiting them.
