• 0 Posts
  • 20 Comments
Joined 7 months ago
Cake day: April 3rd, 2024



  • These days ROCm support is more common than it was a few years ago, so you’re no longer entirely dependent on CUDA for machine learning. (Although I wish fewer tools required non-CUDA users to manually install Torch in their venv because the auto-installer assumes CUDA. At least take a parameter or something if you don’t want to implement autodetection.)
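
    A minimal sketch of the kind of autodetection I mean (Python; the vendor-tool heuristic and the exact ROCm wheel index version below are assumptions, check pytorch.org for the current URLs):

    ```python
    # Hypothetical installer helper: pick a PyTorch wheel index based on
    # which vendor tools are on PATH instead of blindly assuming CUDA.
    import shutil
    import subprocess
    import sys

    def torch_index_url() -> str | None:
        if shutil.which("nvidia-smi"):
            return None  # the default index ships CUDA builds on Linux
        if shutil.which("rocminfo") or shutil.which("rocm-smi"):
            return "https://download.pytorch.org/whl/rocm6.2"  # version assumed
        return "https://download.pytorch.org/whl/cpu"

    cmd = [sys.executable, "-m", "pip", "install", "torch"]
    if (url := torch_index_url()) is not None:
        cmd += ["--index-url", url]
    subprocess.run(cmd, check=True)
    ```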

    Nvidia’s Linux drivers are generally a bit behind AMD’s; e.g., driver versions before 555 tended not to play well with Wayland.

    Also, Nvidia’s drivers tend not to give any meaningful information when something goes wrong. There’s typically just an error code for “the driver has crashed”, regardless of what actually caused the crash.

    Personal anecdote for the last one: I had a wonky 4080, and tracing the problem to the card took months because the log (on both Linux and Windows) contained no error information beyond “something bad happened”, while the behavior had dozens of possible causes, ranging from “the 4080 is unstable with XMP on some mainboards” to “some BIOS setting might need to be changed” to “sometimes the card doesn’t like a specific CPU/PSU/RAM/mainboard” to “it’s a manufacturing defect”.

    Sure, manufacturing defects can happen to anyone; I can’t fault Nvidia for that. But the combination of useless logs and 4000-series cards having so many things they can possibly (but rarely) get hung up on made error diagnosis incredibly painful. I finally just bought a 7900 XTX instead. It’s slower but I like the driver better.


  • They ran PR campaigns against Linux and OpenOffice for quite some time – until cloud computing took off and it turned out they could earn more money by supporting Linux than by fighting it.

    In fact, Microsoft weren’t happy about FOSS in general. I can still remember when they tried to make “shared source” a thing: They made their own ersatz OSI with its own set of licenses, some of which didn’t grant proper reuse rights – like only allowing you to use the source code to write Windows applications.



  • Like every time there’s an AI bubble. And like every time, chances are that in a few years public interest will wane and current generative AI will fade into the background as a technology that everyone uses but nobody cares about, just like machine translation, speech recognition, fuzzy logic, expert systems…

    Even when these technologies get better with time (and machine translation certainly has gotten a lot better since the sixties), they fail to recapture their previous levels of excitement and funding.

    We’re currently overcoming what popped the last AI bubbles by throwing an absurd amount of resources at the problem. But at some point we’ll have to admit that doubling the USA’s energy consumption for a year to train the next generation of LLMs, in hopes of actually turning a profit this time, isn’t sustainable.



  • NTFS feels rock solid if you only use Windows and extremely janky if you dual-boot. Linux currently can’t really repair NTFS volumes and thus won’t mount them if they’re marked inconsistent.

    As it happens, they’re marked inconsistent all the time. I’ve had an NTFS volume come up dirty after nothing more than booting into Windows and shutting down. That’s not a problem for Windows, but Linux wouldn’t touch the volume until I’d booted into Windows at least once more.
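
    For what it’s worth, ntfs-3g ships ntfsfix, which can at least clear the dirty flag from Linux (booting Windows and shutting down cleanly remains the safer fix). A minimal sketch, assuming ntfsfix is on PATH and using a hypothetical device path:

    ```python
    # Sketch: diagnose an NTFS volume without writing to it, then
    # (commented out) clear the dirty flag so Linux will mount it again.
    # Both steps typically need root.
    import subprocess

    DEVICE = "/dev/sdb1"  # hypothetical NTFS partition

    # --no-action reports problems, such as a set dirty flag, without changes
    report = subprocess.run(["ntfsfix", "--no-action", DEVICE],
                            capture_output=True, text=True)
    print(report.stdout)

    # --clear-dirty clears the flag if the volume is otherwise fixable:
    # subprocess.run(["ntfsfix", "--clear-dirty", DEVICE], check=True)
    ```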

    I finally used a storage upgrade as an occasion to move most drives to Btrfs, save for the Windows system volume and a shared data partition that’s now on exFAT because that’s good enough for it.





  • Re: “KDE Plasma needs stability” (Linux@lemmy.ml, 6 months ago)

    Mind you, the real winner is of course Android. It has a consistent, easy-to-learn interface and a wide range of applications that integrate nicely.

    And we don’t need to speculate; it has already won and is the true face of Linux for the masses. Plenty of young people don’t even own traditional computers anymore and do everything on their smartphones or tablets.

    And that’s why this entire discussion is really just a form of fan wank; we don’t need to find a unified UI for Linux because it has already been found and has a massive market share. You may not like it but this is what peak performance looks like.

    Everything else can be as complicated, janky, or exotic as it wants because it doesn’t matter.


  • Honestly, if you want one simple DE for everyone, it should probably be Xfce. Dead simple to use, feels vaguely familiar to Windows users, not overly complicated.

    KDE is heavily customizable, GNOME is very opinionated, and tiling WMs don’t adhere to orthodox UI patterns. All of those are suboptimal if you want something usable by the widest possible range of users.






  • That’s why I dropped them when my mid-2013 MBP got a bit long in the tooth. Mac OS X, I mean OS X, I mean macOS is a nice enough OS but it’s not worth the extortionate prices for hardware that’s locked down even by ultralight laptop standards. Not even the impressive energy efficiency can save the value proposition for me.

    Sometimes I wish Apple hadn’t turned all of their notebook lines into MacBook Air variants. The unibody MBP line was amazing.



  • Your comment is a good example of why these tools have no place in the courtroom: the things you describe are imagination.

    They’re image generation tools that will generate a new, unrelated image that happens to look similar to the source image. They don’t reconstruct anything, and they have no understanding of what the image contains. All they know is which colors the output pixels would plausibly have, given the input pixels.
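
    A toy sketch (Python/NumPy, purely illustrative) of why such “recovered” detail can’t be evidence: the same input yields different output depending on nothing but the random seed.

    ```python
    # Toy "generative upscaler": nearest-neighbor upscale plus seeded noise
    # standing in for invented detail. The detail depends on the seed, not
    # on any real information recovered from the source image.
    import numpy as np

    def toy_upscale(lowres: np.ndarray, seed: int) -> np.ndarray:
        rng = np.random.default_rng(seed)
        hires = lowres.repeat(4, axis=0).repeat(4, axis=1).astype(float)
        return hires + rng.normal(scale=8.0, size=hires.shape)

    lowres = np.zeros((8, 8))
    a = toy_upscale(lowres, seed=1)
    b = toy_upscale(lowres, seed=2)
    print(np.abs(a - b).mean())  # nonzero: same input, different "details"
    ```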

    It’s no different from giving a description of a scene to an author, asking them to come up with any event that might have happened in such a location, and then trying to use the resulting short story to convict someone.