• Foofighter@discuss.tchncs.de
    2 days ago

    So you expect an AI to provide a morally framed view on current events that meets your own morally framed point of view?

    The answer provides a concise overview of the topic. It contains a legal definition and the different positions on the matter. It does not at any point imply a stance. It’s not the job of AI (or of the news) to form an opinion, but to provide facts that allow consumers to form their own. The issue here isn’t AI. It’s the inability of consumers to form opinions, and their expectation that others can hand them a right-or-wrong opinion to assimilate.

    • Anahkiasen@lemmy.blahaj.zone
      2 days ago

      I agree, and that’s sad, but that’s also how I’ve seen people use AI: as a search engine, as Wikipedia, as a news anchor. And in all three of those situations, I feel these kinds of “both sides”, strictly surface-level factual answers do more harm than good. Maybe ChatGPT is more subtle, but it breaks my heart to see people running to DeepSeek when the vision of the world it explains to you is so obviously excised from so many realities. Some people need morals and actual “human” answers hammered into them, because unfortunately they lack the empathy to arrive at them on their own.