• 0 Posts
  • 157 Comments
Joined 3 years ago
Cake day: September 2nd, 2023

  • Wtf is this?

    The most foreseeable event of the last 20 years.

    Massive, out-of-this-world investment + no demand = prices so cheap they were operating at a huge loss

    Operating at a huge loss + time = huge enshittification

    Raising prices is the easiest form of enshittification. Ads are coming too. Lastly, features will degrade: more features no one wants, bundled with other services no one wants.

  • Well yes, the LLMs are not the ones that actually generate the images. They basically act as a translator between the human text input and the image generator. Well, probably just the tokenizer. But that’s beside the point. Both LLMs and image generators are generative AI, and they have similar mechanisms. Both can create never-before-seen content by mixing things they have “seen”.

    I’m not claiming that they didn’t use CSAM to train their models. I’m just saying that this is not definitive proof of it.

    It’s like claiming that you’re a good mathematician because you can calculate 2+2. Good mathematicians can do that, but so can bad mathematicians.

  • The wine thing could prove me wrong if someone could answer my question.

    But I don’t think my theory is that wild. These models can interpolate, and that is a fact. You can ask one to make a bear with duck hands and it will do it. I’ve seen images of things like that on the internet, generated by these models.

    Who is to say interpolating nude children from regular children+nude adults is too wild?
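    The interpolation idea above can be sketched with a toy example. This is a minimal illustration with made-up vectors, not a real model: generative image models represent concepts as points in a latent space, and blending two embeddings yields a point "between" concepts that may never have appeared together in the training data.

    ```python
    import numpy as np

    # Hypothetical stand-in embeddings for two concepts (a real model
    # would use high-dimensional vectors learned from data).
    bear = np.array([1.0, 0.0, 0.0])
    duck_hands = np.array([0.0, 1.0, 0.0])

    # Linear interpolation between the two concept embeddings.
    alpha = 0.5
    blend = (1 - alpha) * bear + alpha * duck_hands
    print(blend)  # a point midway between the two concepts: [0.5 0.5 0. ]
    ```

    A real pipeline interpolates in a learned latent space rather than on raw one-hot vectors, but the mechanism, combining representations it has already "seen", is the same in spirit.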

    Furthermore, you don’t need CSAM for photos of nude children.

    Children are nude at beaches all the time; there are probably many photos on the internet with nude children in the background of beach shots. That would probably help the model.