Very, Very Few People Are Falling Down the YouTube Rabbit Hole | The site’s crackdown on radicalization seems to have worked. But the world will never know what was happening before that.

  • TropicalDingdong@lemmy.world
    link
    fedilink
    English
    arrow-up
    108
    ·
    11 months ago

    Bro, people were eating Tide Pods and we saw a resurgence of Nazism and white nationalism.

    I think we at least know the effects of what was happening before.

  • masterofn001@lemmy.ca
    link
    fedilink
    English
    arrow-up
    104
    arrow-down
    3
    ·
    edit-2
    11 months ago

    Recently watched a documentary called ‘The YouTube Effect’ by Alex Winter (Bill of Bill & Ted), which goes into how YouTube was essential to the current global state of radicalized individuals.

    In the earlyish days of the internet (late 1990s / early 00s) I fell deep down the rabbit hole of right wing hate and conspiracy theories…

    One subject of the doc explains his descent. It is almost exactly mine. Only these days it is hyper-stimulated, laser-targeted, data-driven psychological warfare, wrapped in polished, billionaire-backed campaigns.

    It comes at you from wherever you are.

    Crypto bros. Health/hydro bros. incel bros. Christian bros. Muslim bros. Rogan bros. Peterson bros. Elmo bros. Tech bros. Anon bros. etc.

    By the time a lot of people realize what’s happened, if ever, they’re already in too deep.

    • mdm_@lemmy.ca
      link
      fedilink
      English
      arrow-up
      48
      arrow-down
      1
      ·
      11 months ago

      Crypto bros. Health/hydro bros. incel bros. Christian bros. Muslim bros. Rogan bros. Peterson bros. Elon bros. Tech bros. Anon bros. etc.

      Hmm I’m sensing a theme here…

      • ST5000@lemmy.world
        link
        fedilink
        English
        arrow-up
        11
        ·
        11 months ago

        There’s a recent documentary movie about that called Bros. I suggest you check it out.

      • NewNewAccount@lemmy.world
        link
        fedilink
        English
        arrow-up
        10
        arrow-down
        1
        ·
        11 months ago

        Leftists aren’t immune. My YouTube has a lot of Vaush, Hasan, Sam Seder, etc.

        Though I do also get Patrick Bet-David and PragerU thrown in, I think because I can’t help but watch for a window into their line of thinking.

        • Fisk400@feddit.nu
          cake
          link
          fedilink
          English
          arrow-up
          9
          arrow-down
          1
          ·
          11 months ago

          It is not quite the same. It recommends things you already watch. If you watch Hasan you tend to get more Hasan stuff, but only on rare occasions do you get Vaush stuff.

          Back in the day you could watch one non-political Thunderf00t video about some scam and the recommendations would be a rogues’ gallery of anti-SJWs, with no other recommendations.

          Now you can get radicalized because you want to be, and it’s a nice saunter down the hill. Then it was a sheer cliff you could accidentally fall off. If you didn’t experience it, you can’t really imagine how stupid it was.

      • kambusha@feddit.ch
        link
        fedilink
        English
        arrow-up
        4
        ·
        11 months ago

        Not OP, but I’m guessing the recommendation algorithms have gone down one direction, so you’re in an echo chamber where it seems like that is all there is. You’d never hear a counter-argument; only ever one side of the argument.

      • masterofn001@lemmy.ca
        link
        fedilink
        English
        arrow-up
        1
        ·
        edit-2
        11 months ago

        Whatever your interest or hobby, there is a psyop devoted to it. Wherever you are, whatever you do, whatever you’re curious about, you will find targeted propaganda.

        Because of the methods used, the conspiracies, wrapped in a cozy blanket of semi-truth and emotional manipulation, are easy to fall prey to.

        If you’re angry, it will make you angrier. Violent, even. If you’re happy, it can make you hate with the loving joy of false religious zeal. If you are confused and uncertain, it will provide the esoteric truths you seek, with the absolute certainty of a “final solution.”

        Etc.

        And it’s difficult to unwind.

  • Duamerthrax@lemmy.world
    link
    fedilink
    English
    arrow-up
    79
    ·
    11 months ago

    Weird. YouTube keeps recommending right-wing videos even though I’ve purged them from my watch history and always selected Not Interested. It got to the point that I installed a third-party channel blocker.

    I don’t even watch many left-leaning political videos, and even those are only tangentially political.

    • nutsack@lemmy.world
      link
      fedilink
      English
      arrow-up
      28
      arrow-down
      1
      ·
      edit-2
      11 months ago

      I think if you like economics or fast cars you will also get radical right-wing talk videos. If you like guns it’s even worse.

        • nutsack@lemmy.world
          link
          fedilink
          English
          arrow-up
          12
          ·
          edit-2
          11 months ago

          I made a fresh Google account specifically to watch daily streams from one stocks channel (the guy is a liberal), and I got cars, guns, and right-wing politics in the feed.

          My general-use account’s suggestion feed is mostly camera gear, leftist video essays, and debate-bro drama.

      • Edgelord_Of_Tomorrow@lemmy.world
        link
        fedilink
        English
        arrow-up
        3
        ·
        edit-2
        11 months ago

        Oh, you like WW2 documentaries about how liberal democracy crushed fascism strategically, industrially, scientifically and morally?

        Well you might enjoy these videos made by actual Nazis complaining about gender neutral bathrooms!

    • Kuya@lemmy.world
      link
      fedilink
      English
      arrow-up
      18
      arrow-down
      1
      ·
      11 months ago

      I’ve been watching tutorials on jump rope and kickboxing. I do watch YouTube Shorts, but lately I’m being shown Andrew Tate stuff. I didn’t skip it quickly enough, and now 10% of the things I see are right-leaning, bot-created content. Slowly, gun-related videos, self-defense, and Minecraft are taking over my YouTube Shorts.

      • DreadPirateShawn@lemmy.blahaj.zone
        link
        fedilink
        English
        arrow-up
        6
        ·
        11 months ago

        If you don’t already, you can view your watch history and delete things.

        I do that with anything not music related, and it keeps my recommendations extremely clean.

      • Ech@lemm.ee
        link
        fedilink
        English
        arrow-up
        2
        ·
        11 months ago

        I like a few Minecraft channels, but I only watch them in private tabs because I know YT will flood my account with Minecraft if I’m not careful. There is no middle ground with The Algorithm.

        • MrScottyTay@sh.itjust.works
          link
          fedilink
          English
          arrow-up
          2
          ·
          11 months ago

          Yeah, it’s skewed too much by recent viewing. Even if you’re subscribed to any number of channels about topic Y, if you just watched one video on topic Z, say goodbye to Y: you only like Z now.

    • spacebirb@lemmy.world
      link
      fedilink
      English
      arrow-up
      8
      arrow-down
      3
      ·
      11 months ago

      I know everyone likes to get conspiratorial about this, but it’s really just trying to get your attention any way possible. There are more popular right-wing political videos, so the algorithm is more likely to suggest them. These videos also get lots of views, so again, they’re more likely to be suggested.

      Just ignore them and watch what you like.

    • bob_wiley@lemmy.world
      link
      fedilink
      English
      arrow-up
      4
      ·
      11 months ago

      I noticed when I went to a hotel that the recommended videos for a logged-out user were drastically different from my own. For example, I always found it a bit odd that MrBeast is the #1 person on YouTube, yet I almost never get recommended his videos, but they were all over the TV in the hotel.

      I decided to try a hard reset: I deleted my entire watch history to start at zero again, and I deleted all but maybe five of my subscriptions. Almost nothing changed.

          • Asymptote@lemmy.dbzer0.com
            link
            fedilink
            English
            arrow-up
            1
            arrow-down
            1
            ·
            11 months ago

            I’m hopefully wrong because it might have GDPR implications, but on the other hand they’re probably using an LLM, and once they’re on the back of that tiger they can’t let go of the tail.

      • djmarcone@lemm.ee
        link
        fedilink
        English
        arrow-up
        1
        arrow-down
        1
        ·
        11 months ago

        Yeah I didn’t even know who he was until a few months ago. Yet he is the top channel.

        YouTube knows w

    • doggle@lemmy.dbzer0.com
      link
      fedilink
      English
      arrow-up
      3
      ·
      11 months ago

      I’m sure YouTube hangs on to that data even if you delete the history. I would guess that since you don’t watch left-wing videos much, their algorithm still thinks you are politically right of center? Although I would have expected it to just give up recommending political channels altogether at some point. I hardly ever get recommendations for political stuff, and right-wing content is the minority of that.

      • Duamerthrax@lemmy.world
        link
        fedilink
        English
        arrow-up
        4
        ·
        11 months ago

        I watch some left-wing stuff, but I prefer my politics to be in text form. There’s too much dramatic music and manipulative editing even in things I agree with. The algorithm should see me as center-left if anything, but because I watch some redneck engineering videos (which I ditch if they do get political), it seems to think I should also like transphobic videos.

  • there1snospoon@ttrpg.network
    link
    fedilink
    English
    arrow-up
    35
    ·
    11 months ago

    The article below:

    Around the time of the 2016 election, YouTube became known as a home to the rising alt-right and to massively popular conspiracy theorists. The Google-owned site had more than 1 billion users and was playing host to charismatic personalities who had developed intimate relationships with their audiences, potentially making it a powerful vector for political influence. At the time, Alex Jones’s channel, Infowars, had more than 2 million subscribers. And YouTube’s recommendation algorithm, which accounted for the majority of what people watched on the platform, looked to be pulling people deeper and deeper into dangerous delusions.

    The process of “falling down the rabbit hole” was memorably illustrated by personal accounts of people who had ended up on strange paths into the dark heart of the platform, where they were intrigued and then convinced by extremist rhetoric—an interest in critiques of feminism could lead to men’s rights and then white supremacy and then calls for violence. Most troubling is that a person who was not necessarily looking for extreme content could end up watching it because the algorithm noticed a whisper of something in their previous choices. It could exacerbate a person’s worst impulses and take them to a place they wouldn’t have chosen, but would have trouble getting out of.

    Just how big a rabbit-hole problem YouTube had wasn’t quite clear, and the company denied it had one at all even as it was making changes to address the criticisms. In early 2019, YouTube announced tweaks to its recommendation system with the goal of dramatically reducing the promotion of “harmful misinformation” and “borderline content” (the kinds of videos that were almost extreme enough to remove, but not quite). At the same time, it also went on a demonetizing spree, blocking shared-ad-revenue programs for YouTube creators who disobeyed its policies about hate speech. Whatever else YouTube continued to allow on its site, the idea was that the rabbit hole would be filled in.

    A new peer-reviewed study, published today in Science Advances, suggests that YouTube’s 2019 update worked. The research team was led by Brendan Nyhan, a government professor at Dartmouth who studies polarization in the context of the internet. Nyhan and his co-authors surveyed 1,181 people about their existing political attitudes and then used a custom browser extension to monitor all of their YouTube activity and recommendations for a period of several months at the end of 2020. It found that extremist videos were watched by only 6 percent of participants. Of those people, the majority had deliberately subscribed to at least one extremist channel, meaning that they hadn’t been pushed there by the algorithm. Further, these people were often coming to extremist videos from external links instead of from within YouTube.

    These viewing patterns showed no evidence of a rabbit-hole process as it’s typically imagined: Rather than naive users suddenly and unwittingly finding themselves funneled toward hateful content, “we see people with very high levels of gender and racial resentment seeking this content out,” Nyhan told me. That people are primarily viewing extremist content through subscriptions and external links is something “only [this team has] been able to capture, because of the method,” says Manoel Horta Ribeiro, a researcher at the Swiss Federal Institute of Technology Lausanne who wasn’t involved in the study. Whereas many previous studies of the YouTube rabbit hole have had to use bots to simulate the experience of navigating YouTube’s recommendations—by clicking mindlessly on the next suggested video over and over and over—this is the first that obtained such granular data on real, human behavior.

    The study does have an unavoidable flaw: It cannot account for anything that happened on YouTube before the data were collected, in 2020. “It may be the case that the susceptible population was already radicalized during YouTube’s pre-2019 era,” as Nyhan and his co-authors explain in the paper. Extremist content does still exist on YouTube, after all, and some people do still watch it. So there’s a chicken-and-egg dilemma: Which came first, the extremist who watches videos on YouTube, or the YouTuber who encounters extremist content there?

    Examining today’s YouTube to try to understand the YouTube of several years ago is, to deploy another metaphor, “a little bit ‘apples and oranges,’” Jonas Kaiser, a researcher at Harvard’s Berkman Klein Center for Internet and Society who wasn’t involved in the study, told me. Though he considers it a solid study, he said he also recognizes the difficulty of learning much about a platform’s past by looking at one sample of users from its present. This was also a significant issue with a collection of new studies about Facebook’s role in political polarization, which were published last month (Nyhan worked on one of them). Those studies demonstrated that, although echo chambers on Facebook do exist, they don’t have major effects on people’s political attitudes today. But they couldn’t demonstrate whether the echo chambers had already had those effects long before the study.

    The new research is still important, in part because it proposes a specific, technical definition of rabbit hole. The term has been used in different ways in common speech and even in academic research. Nyhan’s team defined a “rabbit hole event” as one in which a person follows a recommendation to get to a more extreme type of video than they were previously watching. They can’t have been subscribing to the channel they end up on, or to similarly extreme channels, before the recommendation pushed them. This mechanism wasn’t common in their findings at all. They saw it act on only 1 percent of participants, accounting for only 0.002 percent of all views of extremist-channel videos.
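    To make that definition concrete, here is a minimal sketch (in Python) of how such an event might be flagged in a log of views. The record fields, the three-level extremism rating, and the function itself are illustrative assumptions; the study’s actual channel classification and analysis code are not reproduced in the article.

```python
from dataclasses import dataclass

# Hypothetical view record; the field names and the 0/1/2 extremism rating are
# illustrative assumptions, not the study's actual coding scheme.
@dataclass
class View:
    channel: str
    extremism: int            # 0 = mainstream, 1 = alternative, 2 = extremist
    via_recommendation: bool  # True if reached through a YouTube recommendation

def is_rabbit_hole_event(view: View,
                         prior_views: list[View],
                         subscriptions: set[str],
                         extreme_channels: set[str]) -> bool:
    """Flag a view as a 'rabbit hole event' per the paper's stated criteria:
    the viewer followed a recommendation to a more extreme type of video than
    they had been watching, and was not already subscribed to that channel
    or to similarly extreme channels."""
    if not view.via_recommendation:
        return False  # by definition, a rabbit-hole event is recommendation-driven
    most_extreme_so_far = max((v.extremism for v in prior_views), default=0)
    if view.extremism <= most_extreme_so_far:
        return False  # not more extreme than the person's prior viewing
    already_subscribed = (
        view.channel in subscriptions
        or any(channel in subscriptions for channel in extreme_channels)
    )
    return not already_subscribed
```

    The point of a rule like this is that a rabbit hole is defined by the recommender pushing someone past their prior baseline, not by the mere presence of extremist viewing; under the paper’s version of it, such events were vanishingly rare.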

    This is great to know. But, again, it doesn’t mean that rabbit holes, as the team defined them, weren’t at one point a bigger problem. It’s just a good indication that they seem to be rare right now. Why did it take so long to go looking for the rabbit holes? “It’s a shame we didn’t catch them on both sides of the change,” Nyhan acknowledged. “That would have been ideal.” But it took time to build the browser extension (which is now open source, so it can be used by other researchers), and it also took time to come up with a whole bunch of money. Nyhan estimated that the study received about $100,000 in funding, but an additional National Science Foundation grant that went to a separate team that built the browser extension was huge—almost $500,000.

    Nyhan was careful not to say that this paper represents a total exoneration of YouTube. The platform hasn’t stopped letting its subscription feature drive traffic to extremists. It also continues to allow users to publish extremist videos. And learning that only a tiny percentage of users stumble across extremist content isn’t the same as learning that no one does; a tiny percentage of a gargantuan user base still represents a large number of people.

    This speaks to the broader problem with last month’s new Facebook research as well: Americans want to understand why the country is so dramatically polarized, and people have seen the huge changes in our technology use and information consumption in the years when that polarization became most obvious. But the web changes every day. Things that YouTube no longer wants to host could still find huge audiences, instead, on platforms such as Rumble; most young people now use TikTok, a platform that barely existed when we started talking about the effects of social media. As soon as we start to unravel one mystery about how the internet affects us, another one takes its place.

    • intensely_human@lemm.ee
      link
      fedilink
      English
      arrow-up
      16
      ·
      11 months ago

      Another way to put that study’s weakness, in scientific terms, is that there’s no control group against which the studied group is being compared. There’s zero indication that the 2019 changes had any effect at all, without some data from before those changes.

      • Nastybutler@lemmy.world
        link
        fedilink
        English
        arrow-up
        5
        arrow-down
        1
        ·
        11 months ago

        Always love when people try to hold social sciences to the same standard as physical sciences

    • gothicdecadence@lemm.ee
      link
      fedilink
      English
      arrow-up
      3
      ·
      11 months ago

      I’ve never heard of Rumble before; apparently it’s a video platform that provides the video infrastructure for Truth Social, so it’s very popular with the far right.

    • Cosmic Cleric@lemmy.world
      link
      fedilink
      English
      arrow-up
      1
      arrow-down
      1
      ·
      edit-2
      11 months ago

      The article below:

      Honestly don’t mean this as an attack, but couldn’t people just have clicked on the link if they really wanted to read the article?

  • Uriel238 [all pronouns]@lemmy.blahaj.zone
    link
    fedilink
    English
    arrow-up
    23
    arrow-down
    1
    ·
    11 months ago

    Weirdly, YouTube’s algo propelled me down the Pinko-commie anarcho-socialist boy-we-suck-at-democracy rabbit hole. I was an avid BreadTuber long before I ever heard the name BreadTube.

    • 31337@sh.itjust.works
      cake
      link
      fedilink
      English
      arrow-up
      5
      ·
      11 months ago

      Yeah, it started for me during Covid when I felt like I needed long-form podcasts/streamers in the background for noise while working from home. I think my progression was from The Worst Year Ever -> Chapo Trap House -> It Could Happen Here -> PhilosophyTube -> ContraPoints -> Vaush. TBF, I’ve been a leftist since before YouTube existed, probably starting with Chomsky, Einstein’s article, and random pirated documentaries.

  • qwamqwamqwam@sh.itjust.works
    link
    fedilink
    English
    arrow-up
    19
    arrow-down
    1
    ·
    edit-2
    11 months ago

    Wait what? Maybe I’m misunderstanding, but this is what I got out of the article:

    “We had anecdotes and preliminary evidence of a phenomenon. A robust scientific study showed no evidence of said phenomenon. Therefore, the phenomenon was previously real but has now stopped.”

    That seems like really, really bad science. Or at least, really, really bad science reporting. Like, if anecdotes are all it takes, here’s one from just a few weeks ago.

    I left some Andrew Tate-esque stuff running overnight by accident and ended up having to delete my watch history to get my homepage back to how it was before.

    • TimewornTraveler@lemm.ee
      link
      fedilink
      English
      arrow-up
      5
      arrow-down
      1
      ·
      11 months ago

      From the quoted bit it sounds like there was credible science that found nothing. That doesn’t mean there is nothing, but just that they found nothing.

    • Franzia@lemmy.blahaj.zone
      link
      fedilink
      English
      arrow-up
      18
      arrow-down
      1
      ·
      11 months ago

      We are not on a website. Therefore, we have a better solution than the top-down approach that Youtube uses.

    • ram@lemmy.ca
      link
      fedilink
      English
      arrow-up
      14
      ·
      edit-2
      11 months ago

      Can always go to one of the many instances that defederated them. Not like there’s account-wide upvote points to lose or anything. (genuine suggestion)

    • doggle@lemmy.dbzer0.com
      link
      fedilink
      English
      arrow-up
      9
      ·
      11 months ago

      There are many instances that have defederated them that you could join. Or if you’re really serious you could host your own.

    • o_d [he/him]@lemmygrad.ml
      link
      fedilink
      English
      arrow-up
      11
      arrow-down
      23
      ·
      11 months ago

      There’s plenty of instances that have de-federated from Hexbear. Go join them. Have fun in your echo-chamber. Loser. 👋

  • bob_wiley@lemmy.world
    link
    fedilink
    English
    arrow-up
    8
    arrow-down
    1
    ·
    11 months ago

    I still fall down YouTube rabbit holes all the time, just not ones for radicalization, which I don’t think I’ve ever actually seen.

  • skymtf@lemmy.blahaj.zone
    link
    fedilink
    English
    arrow-up
    5
    ·
    11 months ago

    Who did these stats? I’m getting more right-wing propaganda than ever. Also, Facebook is just as bad as ever. I really like stuff like the fediverse since I can control my feed.

    • Wolpertinger@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      2
      ·
      11 months ago

      Me, too. I’m always recommended Joe Rogan or Jordan Peterson videos, with a sprinkling of Ben Shapiro. I even got someone claiming the Holocaust was overblown (I reported them). All within the past few months.

      I don’t get recommended regular videos like that, but YouTube Shorts are full of that garbage. I suspect it’s a blind spot.

  • inspxtr@lemmy.world
    link
    fedilink
    English
    arrow-up
    4
    ·
    edit-2
    11 months ago

    I was aware of this study when they presented it virtually (can’t remember where), and while I don’t have an issue with their approach and results, I’m more concerned about the implications of these numbers. The few percent that were exposed to extremist content may seem small. But scaling that up to the population level is, personally, worrisome to me … The impact of the few very, very bad apples can still be catastrophic.

  • scarabic@lemmy.world
    link
    fedilink
    English
    arrow-up
    4
    ·
    11 months ago

    If it’s true that they have closed the radicalization rabbit hole, then that is a huge achievement and very, very good news.

  • Anders429@lemmy.world
    link
    fedilink
    English
    arrow-up
    1
    arrow-down
    3
    ·
    11 months ago

    Very, Very Few People Are Falling Down the YouTube Rabbit Hole | The site’s crackdown on radicalization seems to have worked. But the world will never know what was happening before that.