• 1 Post
  • 332 Comments
Joined 5 months ago
Cake day: June 9th, 2024


  • And it doesn’t mean they can take away anything.

    Not if they’re able to monetize your small bugfix

    The problem is that they can, and that’s not even the point - I don’t care if you make money off something I spent my time on willingly; I care that you’re forcing me to say you’re the full and sole owner of my contributions and can do whatever you want with them at any point in the future.

    Signing a CLA puts full ownership of the code in the hands of whomever you’ve signed the CLA with, which means they have the full ability and legal right to do any damn thing they want - which often includes telling you to fuck yourself, changing the license, and running off to make a commercial product while killing the AGPLed version and fucking over everyone who spent any time on it.

    If you have a CLA, I don’t care if your project gives out free handjobs: I don’t want it anywhere near anything I’m going to either be using or have to maintain.

    And sure, you can fork from before the license change, but I’m unwilling to put a major piece of software into my workflows and just hope that, if something happens, someone will come along and continue working on it.

    Frankly, I’m of the opinion that if you’re setting up a project and make the very, very involved decision to go with a CLA and spend the time implementing one, you’re spending that time because you’ve already determined it’s probably in your interests later to do a rugpull. If you’re not going to screw everyone, you don’t go to the store and buy a gallon of baby oil.

    I’ve turned into the person who doesn’t really care about new shit until it’s been around a decade, has no CLAs, is under a standard GPL/AGPL license (none of this source-available business-license nonsense), and has a proven track record of the developers not being shitheads.



  • Quickest peak and then utter vanishing of any interest in a project I’ve had in a while.

    Wouldn’t mind something a little more open than SearXNG, in that it owns its own database, but requiring that they be the sole owner of anything anyone contributes AND giving them the ability to yank the rug at any time they feel like it pretty much puts it in the meh-who-cares category.

    Had enough stupid shit yanked over the past few years that I really just don’t care about, or have time to deal with, anything that’s already prepping for its eventual enshittification.


  • contrast to their desktop offerings

    That’s because server offerings are real money, which is why Intel isn’t fucking those up.

    AMD is in the same boat: they make pennies on client and gaming (including gpu), but dumptrucks of cash from selling Epycs.

    IMO, the Zen 5(%) and Arrow Lake bad-for-gaming results are because uarch development at Intel and AMD is entirely focused on the customers that pay them: datacenter and enterprise.

    Both of those CPU families clearly show that efficiency and a focus on extremely threaded workloads were the priorities, and what do you know, that’s enterprise workloads!

    end of the x86 era

    I think it’s less that the era of x86 has ended and more that the era of the x86 duopoly putting consumer/gaming workloads first has ended, because, well, there’s just no money there relative to other things they could invest their time and design resources in.

    I also expect this to happen with GPUs: AMD has already given up, and Intel is absolutely going to do the same as soon as they possibly can without it being a catastrophic self-inflicted wound (since they want an iGPU to use). nVidia has also clearly stopped giving a shit about gaming - gamers get a GPU a year or two after enterprise has cards based on the same chip, and now they charge $2000* for them - and they’re often crippled in firmware/software so that they won’t compete with the enterprise cards, on top of the driver license not letting you legally use them in that kind of deployment anyway.

    ARM is probably the consumer future, but we’ll see who gets there and with what: I desperately hope that nVidia and MediaTek end up competitive so we don’t end up in a Qualcomm oops-your-cpu-is-two-years-old-no-more-support-for-you hellscape, but, well, nVidia has made ARM SoCs for, like, decades, and at no point would I call any of the ones they’ve ever shipped high-performance desktop replacements.

    • Yes, I know there’s a down-stack option that shows up later, but that’s also kinda the point: the ones you can afford will show up for you… eventually. Very much designed to push purchasers into the top end.



  • sudo smartctl -a /dev/yourssd

    You’re looking for the Media_Wearout_Indicator, which is a percentage starting at 100% and counting down to 0%, with 0% meaning there are no more spare sectors available and the drive is thus “failed”. A very important note here, though, is that a drive at 0% isn’t always going to result in data loss.

    Unless you have the shittiest SSD I’ve ever heard of or seen, it’ll almost certainly just go read-only, and all your data will still be there; you just won’t be able to write more data to the drive.

    Also, you’ll probably be interested in the Total_LBAs_Written attribute, whose raw value (usually) needs to be multiplied by the sector size and converted to gigabytes; it tells you how much data has been written to the drive.
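    As a rough sketch of that conversion - assuming a 512-byte logical sector size and the usual ten-column smartctl attribute table (check sudo smartctl -i /dev/yourssd if your drive reports something different):

    sudo smartctl -A /dev/yourssd | awk '/Total_LBAs_Written/ {printf "%.2f TB written\n", $10 * 512 / 1e12}'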



  • As a FunFact™, you’re more likely to have the SSD controller die than the flash wear out at this point.

    Even really cheap SSDs will do hundreds and hundreds of TB written these days, and on a normal consumer workload we’re talking years and years and years and years of expected lifespan.

    Even the cheap SSDs in my home server have been fine: they’re pushing 5 years on this specific build and about 200 TBW on the drives, and they’re still claiming 90% life left.

    At that rate, I’ll be dead well before those drives fail, lol.
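    Rough back-of-the-envelope on that claim, assuming wear scales roughly linearly: 200 TBW consumed about 10% of the rated life, so call it ~2,000 TBW total endurance; at ~40 TB/year (200 TB over 5 years), that’s on the order of another 45 years of headroom.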


  • Hell, I almost got snagged by one recently, and a goodly portion of my last job was dealing with phishing sites all day.

    They’ve gotten good at making things look like a proper email from a business that would be sending that kind of email, and if you’re distracted and expecting something, you can have at least a moment of ‘oh, this is probably legitimate’.

    The giveaway was, hilariously, the use of ‘please kindly’ and ‘needful’, which, uh, isn’t phraseology this particular company would actually have used in an email - so I was saved by the scammers not realizing that Americans, at least, don’t actually use those two phrases in conversation.





  • too high TDP, using above the MAX rate of 250 Watt

    Agreed. Intel’s design philosophy seems to be ‘space heater that does math’ for some reason. That’s been true since at least 10th gen, if not earlier. I don’t know if it’s just chasing benchmark wins at any cost, or if they’re firmly of the opinion that hot and loud is fine as long as it’s fast and no customers will care - which I don’t really think is true anymore - or what, but they’ve certainly invested heavily in CPUs that push the literal limits of physics when it comes to cooling them.

    Intel always had the advantage of superior production

    That really stopped being true in the Skylake era, when TSMC leapfrogged them and Intel was doing their 14nm++++++++ dance. I mean, they did a shockingly good job of keeping that node relevant and competitive, but they were really only relevant and competitive on it until AMD caught up and exceeded their IPC with Ryzen 3000.

    about the same price

    Yeah, if gaming is your use case there’s exactly zero Intel products you should even be considering. There’s nothing that’s remotely competitive with a 7800x3d, and hell, for most people and games, even a 5800x3d is overkill.

    And of course, those are both last-gen parts, so that’s about to get even worse with the 9800x3d.

    For productivity, I guess if you’re mandated to use Intel, or Intel CPUs are the only validated ones, it’s a choice. But ‘at the same price’ is the problem: there’s no case where I’d want to buy Intel over AMD if they cost the same and perform similarly, if for no other reason than I won’t need something stupid like a 360mm AIO to cool the damn thing.






  • It’s now almost 8 years since AMD revealed Ryzen, and Intel still can’t beat it.

    That feels a slight bit unfair.

    For non-gaming workloads, they’re basically sitting on par or better because of the giant pile of e-cores, and for single-threaded performance (on p-cores) they’re also on par to slightly ahead.

    Sure, the x3d chips are the gaming kings, no argument here, but that’s not moving volume - even AMD is all-in on the datacenter side because their gaming/consumer sales have evaporated into nothingness.

    Intel’s problem isn’t an inability to design CPUs that are competitive, it’s an inability to create production-ready processes that are competitive with TSMC.

    At some point they’re going to have to decide if spending endless billions on processes that aren’t competitive is the best use of their resources. Owning the ability to make your product is super important, but for certain market segments (client desktop and laptop) maybe going ‘fuck it’ and fabbing on the best process you can find so that your CPUs come out competitive is probably the way to go - and, honestly, is pretty much already what they’ve done with ARL.

    I’d also maybe agree that the pricing is an issue: they’re not industry-leading anymore, but they’ve kept industry-leading pricing, which almost immediately makes them less appealing than AMD if you don’t need something Intel is offering you (like the accelerators in scalable Xeons or whatever). ARL immediately made me go ‘How much? What the bleep?’ when they announced pricing, because, uh, they’re way off on what they really should be asking.