I can only speak for myself, but I've always had bad luck with Linux on the desktop. Something always breaks, isn't compatible, or requires a lengthy installation process involving compiling multiple libraries because no .deb or .rpm is available.
On servers, it’s fantastic. If you count VMs, I have far more Linux installations than Windows. In general, I use Win10 LTSC for anything that requires a GUI and Ubuntu Server for anything that only needs CLI or hosts a web interface.
At least for me, the whole "made by devs, for devs" thing isn't really the major downfall. It's that it can't be trusted to remain functional in a dynamic environment. I like using the command line, but sometimes that's just not enough.
If I need a specific software package, I can download the source and compile it, along with the hundred-odd libraries they chose not to include in the .tar.gz, and eventually get it running.
However, when I run an "apt upgrade" and enough changes underneath, the binary I compiled earlier stops working. Then I spend hours trying to recompile it along with its dependencies, only to find that it doesn't support some obscure point release of a package that got pulled in with the latest security updates.
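For what it's worth, this is the kind of breakage I mean: after an update, a binary you built earlier can stop resolving its shared libraries. A rough sketch of how I'd check for that (the binary path is just a placeholder, not anything specific):

```python
#!/usr/bin/env python3
"""Check whether a locally compiled binary still resolves all of its
shared libraries after a system update. Just a sketch."""
import subprocess
import sys


def missing_libs(binary_path: str) -> list[str]:
    """Run ldd on the binary and return any libraries reported as 'not found'."""
    result = subprocess.run(["ldd", binary_path], capture_output=True, text=True)
    missing = []
    for line in result.stdout.splitlines():
        # ldd prints lines like "libfoo.so.3 => not found" for broken links
        if "not found" in line:
            missing.append(line.split("=>")[0].strip())
    return missing


if __name__ == "__main__":
    # Placeholder path; point it at whatever you compiled before the update.
    path = sys.argv[1] if len(sys.argv) > 1 else "./my_compiled_binary"
    broken = missing_libs(path)
    if broken:
        print(f"{path} is missing: {', '.join(broken)}")
    else:
        print(f"{path} still resolves all of its shared libraries.")
```

Of course, knowing *which* library vanished is the easy part; the hours go into rebuilding it against whatever versions the update left behind.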
In a static environment where I'll never change settings or install software (like my NAS), it's perfect. On my desktop PC, I just want it to work well enough that I can tinker with other things. I don't want to troubleshoot why GNOME or KDE isn't working with my video drivers when all I want to do is launch a remote desktop session so I can tinker with stuff on a server that I actually want to tinker with.