• hexabs@lemmy.world · 8 months ago

    Are you being intentionally daft? You realise there is no algorithm behind Lemmy, right? You aren’t being shoved controversial polarizing content subliminally here.

    The worst of Lemmy is a certain instance… That I have never heard from after defederation.

    • adriator@lemm.ee · 8 months ago

      The worst of Lemmy is a certain instance… That I have never heard from after defederation.

      Yeah, defederating from Beehaw was definitely a great decision. I’m so glad I don’t have to see those guys’ posts anymore.

    • Terrasque@infosec.pub · 8 months ago

      You realise there is no algorithm behind Lemmy, right?

      Of course there is. Even “sort by newest” is an algorithm, and the default view is more complicated than that.
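      [Editor's illustration — a toy sketch of the point being made, not Lemmy's actual formula; the constants and exponents below are invented. "Newest" is a trivial sorting algorithm, while a gravity-style "hot" ranking weighs a post's score against its age, so the two produce different front pages:]

```python
import math

def hot_rank(score: int, age_hours: float) -> float:
    """Gravity-style ranking: votes push a post up, age drags it down.
    Illustrative only -- these constants are invented, not Lemmy's."""
    return math.log10(max(1, score + 3)) / (age_hours + 2) ** 1.8

posts = [
    {"title": "popular, 3h old", "score": 500, "age_hours": 3.0},
    {"title": "brand new, no votes", "score": 0, "age_hours": 1.0},
]

# "Newest" sort: purely chronological.
by_new = sorted(posts, key=lambda p: p["age_hours"])

# "Hot" sort: score and age both matter.
by_hot = sorted(posts, key=lambda p: hot_rank(p["score"], p["age_hours"]),
                reverse=True)

# by_new puts "brand new, no votes" first;
# by_hot puts "popular, 3h old" first.
```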

      You aren’t being shoved controversial polarizing content subliminally here.

      Neither are you on TikTok, unless you actively go looking for it

      • Korne127@lemmy.world · 8 months ago

        Neither are you on TikTok, unless you actively go looking for it

        That’s just genuine nonsense. The whole point of platforms like TikTok is their modern recommender systems, which (simplified) lead to algorithmic radicalisation. Because these systems heavily optimise for user engagement, they naturally spread misinformation and controversial content.
        And because this kind of content statistically gets more engagement, as people comment on it and spend more time with it, it spreads faster. This has been confirmed by, for example, a leaked internal Facebook memo.

        And additionally, these systems are personalised: once you start interacting with them, you get more and more similar content. This creates a radicalisation pipeline in which the platform normalises those positions to you inside an echo chamber.
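        [Editor's illustration — the engagement-optimisation and personalisation loop described above, as a deterministic toy model; all topics, rates, and constants are invented for the sketch. Each round, topics are shown in proportion to their current weight, and a topic's weight grows with how much engagement its showings earn, so a small engagement edge compounds into feed dominance:]

```python
# Toy feedback loop: engagement feeds back into ranking weight.
# All numbers below are invented for illustration.
engagement_rate = {"cats": 0.40, "sports": 0.40, "politics": 0.45}
weights = {topic: 1.0 for topic in engagement_rate}

for _ in range(200):
    total = sum(weights.values())
    for topic in weights:
        share_of_feed = weights[topic] / total       # how often it is shown
        # Weight grows with (exposure x engagement): the personalisation loop.
        weights[topic] *= 1 + 0.5 * share_of_feed * engagement_rate[topic]

total = sum(weights.values())
shares = {topic: weights[topic] / total for topic in weights}
# "politics" started with only a 5-point engagement edge, yet it ends up
# dominating the feed, while the two symmetric topics stay tied.
```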