cross-posted from: https://lemmy.world/post/3320637

YouTube and Reddit are sued for allegedly enabling the racist mass shooting in Buffalo that left 10 dead

The complementary lawsuits claim that the massacre in 2022 was made possible by tech giants, a local gun shop, and the gunman's parents.

  • sabogato@lemmy.blahaj.zone · 1 year ago

    I sub primarily to leftist content, and the YouTube Shorts algorithm still insists on recommending the most vile far-right content on the planet. It's gotten to the point that I'm convinced YouTube is intentionally trying to shift people far right.

    • pachrist@lemmy.world · 1 year ago

      I primarily watch woodworking or baking content on YouTube, and I feel like the far-right content is super prevalent with Shorts. I'll watch something like a quick tool review, and the next video will be someone asking folks on the street if it's okay to be white. What color you are isn't your decision, but what you do every day is, and being some dumbass white kid accosting black tourists in Times Square for shitty reaction content is just gross.

      It doesn't matter how often I say I dislike the content, block channels, or whatever: YouTube has just decided it's going to check in from time to time and see if I want to let loose my inner Boomer and rage with Rogan.

      • Koboldschadenfroh@lemmy.world · 1 year ago

        Yes, they are. I mean, I do sometimes watch left-wing political content, and I think that's why it gives me a lot of that right-wing shit, but I also think it's both: YouTube is pushing this stuff too. And their algorithms just suck so much.

    • T156@lemmy.world · 1 year ago

      It could be that pushing videos from the other side of the political spectrum gets interactions in the form of people sharing or commenting on them. Even if you disagree, going "Why does YouTube recommend this? This is awful" is still a share.

      The algorithm prioritises interactions above all else, and few things get people interacting more than something they think is wrong or vehemently disagree with.
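
      As a toy illustration of that dynamic, here is a minimal sketch of an engagement-first ranker. The Video shape, field names, and scoring below are invented for illustration; none of it is YouTube's actual API or algorithm. The point is simply that a rage-share and an earnest share are indistinguishable to a counter.

      ```typescript
      // Toy engagement-first ranker. All names (Video, interactionScore, rank)
      // are hypothetical; this is not YouTube's real system.
      interface Video {
        id: string;
        likes: number;
        dislikes: number;
        comments: number;
        shares: number;
      }

      // Every interaction adds to the score. Nothing records whether a share
      // meant "watch this" or "why is YouTube recommending this? This is awful".
      function interactionScore(v: Video): number {
        return v.likes + v.dislikes + v.comments + v.shares;
      }

      // Surface the highest-interaction videos first, sentiment be damned.
      function rank(videos: Video[]): Video[] {
        return [...videos].sort((a, b) => interactionScore(b) - interactionScore(a));
      }
      ```

      Under that scoring, a video that outrages half its audience into commenting outranks one that quietly satisfies everyone.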

      • Koboldschadenfroh@lemmy.world · 1 year ago

        I deleted my whole history because of the weird recommendations, but for some reason they won't let you delete likes on comments anymore. If I liked a negative or critical comment on a video I don't like, I'm basically fucked and will always get recommended THAT kind of video, even though I don't want to see it. It drove me kind of mad that they won't let you delete those anymore.

        So I deleted my whole channel and use another one now, where I won't like ANY comments. I also installed a browser extension that lets you block certain keywords. No more fucking Jordan Peterson, Joe Rogan, or alpha-male bullshit for me! Woohoo!

        I do wonder why nobody talks about this, though, because I don't think it's actually legal for Google to keep data you want deleted, is it? Has nobody but me noticed? Is it a bug? You can click to delete, but those likes always show back up when you refresh the page.
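
        For context, the keyword-blocking trick described above is simple enough that a browser extension's content script can do it. The sketch below is a hypothetical illustration, not that commenter's actual extension, and the DOM selectors are assumptions about YouTube's markup that may well be out of date.

        ```typescript
        // Hypothetical content script that hides recommendations whose titles
        // contain blocked keywords. The selectors are assumptions about
        // YouTube's markup, not any real extension's code.
        const BLOCKED_KEYWORDS = ["jordan peterson", "joe rogan", "alpha male"];

        function hideBlockedVideos(): void {
          document.querySelectorAll<HTMLAnchorElement>("a#video-title").forEach((title) => {
            const text = (title.textContent ?? "").toLowerCase();
            if (BLOCKED_KEYWORDS.some((kw) => text.includes(kw))) {
              // Remove the whole recommendation tile, not just the title link.
              (title.closest("ytd-rich-item-renderer") ?? title).remove();
            }
          });
        }

        // Recommendations load lazily, so re-filter whenever the page changes.
        new MutationObserver(hideBlockedVideos).observe(document.body, {
          childList: true,
          subtree: true,
        });
        hideBlockedVideos();
        ```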

    • Evie @lemmy.world · 1 year ago

      This is happening on my FB video feed too. I watch a funny chick called Charlotte Dobre who does funny reaction videos. I honestly love her, but all my algorithm shows me for recommendations are these cop-brutality videos with comments praising the cops, and right-wing crap that praises Abbott's wall and DeSantis's dictatorship. It drives me nuts, and no matter how many pages I block, I always get more right-wing crap recommended.

    • bitwolf@lemmy.one · 1 year ago

      Wow, I am so surprised by this. I watch mainly tech and gardening YouTube, and my Shorts have been extremely relevant to me.

      Even when I use a new computer, like at work, the Shorts are mostly pop culture.

      Doesn't make Shorts any less annoying, though.

    • Tilgare@lemmy.world · 1 year ago

      I literally only get Marvel Snap and general gaming, CollegeHumor, tech, educational stuff, stand-up comedy, and drones. That's it. I don't mean to victim-blame, but it learns what you click on and what you stay to watch.

      • ChewTiger@lemmy.world · 1 year ago

        Mine acted similarly to yours. I recently started watching a few more short videos, and now it's showing me an unfortunate amount of that far-right nonsense.

      • SirStumps@lemmy.world · 1 year ago

        I am pretty sure it is just showing politically charged content to people who watch other politically charged content. I feel the blame is misdirected at something that only serves content it predicts people will like based on their history.

      • VonCesaw@lemmy.world · 1 year ago

        It depends entirely on what you're subscribed to; if you have multiple linked YouTube accounts (such as on the Premium family plan), it depends on what THEY'RE subscribed to; and it depends on location (my recommendations at home and my recommendations at work are wildly different).