  • To be fair, it had its moments. Windows 95 was a pretty big step forward, and the alternative was OS/2 Warp, which had some nice features but came from IBM, who were still dreaming of replacing the PC with a vertically integrated home computer again.

    Windows 2000 (or XP starting with SP2) was also solid. 7 was alright. None of those had too much bullshit bundled with them.

    Everything since Windows 8 has been some flavor of shitty, though.









  • These days ROCm support is more common than it was a few years ago, so you’re no longer entirely dependent on CUDA for machine learning. (Although I wish fewer tools required non-CUDA users to manually install Torch in their venv because the auto-installer assumes CUDA. At least take a parameter or something if you don’t want to implement autodetection; see the sketch at the end of this comment.)

    Nvidia’s Linux drivers are generally a bit behind AMD’s; driver versions before 555, for example, tended not to play well with Wayland.

    Also, Nvidia’s drivers tend not to give any meaningful information when something goes wrong. There’s typically just an error code for “the driver has crashed”, regardless of why it crashed.

    Personal anecdote for that last point: I had a wonky 4080, and tracing the problem to the card took months because the log (on both Linux and Windows) contained no error information beyond “something bad happened”, and the behavior had dozens of possible causes, ranging from “the 4080 is unstable with XMP enabled on some mainboards” through “some BIOS setting might need to be changed” and “sometimes the card doesn’t like a specific CPU/PSU/RAM/mainboard” to “it’s a manufacturing defect”.

    Sure, manufacturing defects can happen to anyone; I can’t fault Nvidia for that. But the combination of useless logs and 4000-series cards having so many things they can possibly (but rarely) get hung up on made error diagnosis incredibly painful. I finally just bought a 7900 XTX instead. It’s slower but I like the driver better.
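
    Something like this is all I’m asking for: a minimal sketch of the kind of autodetection I mean, assuming PyTorch’s usual per-backend wheel indexes. The version tags (cu124, rocm6.2) are examples that go stale quickly, and detect_backend is my own hypothetical helper, not anything these tools actually ship.

    ```python
    # Hypothetical installer helper: pick a PyTorch wheel index based on
    # which GPU stack is actually present, instead of assuming CUDA.
    import shutil
    import subprocess
    import sys

    def detect_backend() -> str:
        if shutil.which("nvidia-smi"):          # NVIDIA driver tools present
            return "cu124"                      # example CUDA tag; check current releases
        if shutil.which("rocminfo") or shutil.which("rocm-smi"):
            return "rocm6.2"                    # example ROCm tag; check current releases
        return "cpu"                            # no GPU stack found: CPU wheels

    def install_torch(backend: str) -> None:
        index = f"https://download.pytorch.org/whl/{backend}"
        subprocess.check_call(
            [sys.executable, "-m", "pip", "install", "torch", "--index-url", index]
        )

    if __name__ == "__main__":
        backend = detect_backend()
        print(f"Installing torch wheels for backend: {backend}")
        install_torch(backend)
    ```

    Even just exposing that choice as a flag instead of hardcoding CUDA would save non-CUDA users from having to rip Torch out of the venv and reinstall it by hand.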



  • Anduril is way overengineered. I like this UI that some of my lights have:

    While off:

    • One push: Turn on at the last used brightness.
    • Two pushes: Turn on at maximum brightness.
    • Three pushes: That strobe mode that you don’t need but seems to be obligatory.
    • Hold: Turn on at the lowest brightness (or moonlight mode if the light has one).

    While on:

    • One push: Turn off.
    • Two pushes: Toggle between maximum brightness and the last used “regular” brightness.
    • Three pushes: That strobe mode that someone has to have some use for.
    • Hold: Alternately increase or decrease the brightness.

    That’s pretty easy to learn and gives you all the functions you’d reasonably need (plus that strobe) without a lot of clutter.
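
    For the curious, the whole UI fits in a trivial state machine. Here’s a sketch in Python for readability; it’s my own hypothetical reconstruction (real lights implement this in C on a tiny MCU, and a real hold handler ramps continuously rather than stepping once per hold):

    ```python
    # Hypothetical sketch of the click logic described above.
    MOON, LOW, MED, HIGH = 0, 1, 2, 3   # example brightness steps

    class SimpleUI:
        def __init__(self) -> None:
            self.on = False
            self.level = MED      # last used brightness
            self.regular = MED    # last non-maximum brightness
            self.ramp_up = True   # hold direction alternates

        def click(self, presses: int) -> None:
            if not self.on:
                if presses == 1:
                    self.on = True                    # restore last brightness
                elif presses == 2:
                    self.on, self.level = True, HIGH  # straight to maximum
                elif presses == 3:
                    self.strobe()                     # the obligatory strobe
            else:
                if presses == 1:
                    self.on = False
                elif presses == 2:
                    # Toggle between maximum and the last "regular" level.
                    self.level = self.regular if self.level == HIGH else HIGH
                elif presses == 3:
                    self.strobe()
            if self.on and self.level != HIGH:
                self.regular = self.level             # remember for the toggle

        def hold(self) -> None:
            if not self.on:
                self.on, self.level = True, MOON      # lowest / moonlight mode
            else:
                step = 1 if self.ramp_up else -1      # alternate ramp direction
                self.level = max(MOON, min(HIGH, self.level + step))
                self.ramp_up = not self.ramp_up

        def strobe(self) -> None:
            pass  # left as an exercise for whoever actually uses strobe
    ```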




  • They ran PR campaigns against Linux and OpenOffice for quite some time, until cloud computing took off and it turned out they could earn more money by supporting Linux than by fighting it.

    In fact, Microsoft weren’t happy about FOSS in general. I can still remember when they tried to make “shared source” a thing: they set up their own ersatz OSI with its own set of licenses, some of which didn’t grant proper reuse rights, like only allowing you to use the source code to write Windows applications.



  • Nope, they just become less predictable. Which is why in some parts of Germany you can’t build so much as a garden shed without having EOD check the land first. In the more heavily bombed areas it’s not unusual to hear on the radio that you’re to avoid downtown today between 10 and 12 because they’re disarming a 500-pound bomb they found during roadwork.

    And yes, the fact that an unstable bomb capable of trashing a city block counts as mundane news nicely illustrates war’s potential to fuck things up for generations.

    Japan might want to get the land under and around that airport checked. There might be some other surprises hidden down there.