Good point. It’s more of an “allowing them to build nukes” thing than a “teach them to build nukes” one – although the existing nuclear-armed nations could certainly accelerate the heck out of a Ukrainian nuclear program if they wanted.
Neutrality could work if we give Ukraine nuclear weapons and the ability to make more. That’s the only thing other than a military alliance that’ll reliably keep Russia out of the country.
I get the feeling that nuclear proliferation would not be seen as a desirable solution, though.
To be fair, it had its moments. Windows 95 was a pretty big step forward, and the alternative was OS/2 Warp, which had some nice features but was from IBM, who were still dreaming of replacing the PC with a vertically-integrated home computer again.
Windows 2000 (or XP starting with SP2) was also solid. 7 was alright. None of those had too much bullshit bundled with them.
Everything since Windows 8 has been some flavor of shitty, though.
Milky oolong. It has just the right amount of sweetness and just evokes feelings of coziness for me. Sometimes I add a little bit of jasmine as well.
Especially if you didn’t have a lot of spare time. With an active community you can just dip into discussions when you have the time. With a community you’re trying to establish yourself, you absolutely have to provide a steady stream of content until it (hopefully) takes off.
“You finished a computer game, Atticus.”
The truth was a burning green crack through my brain.
Credits scrolling by, a reminder of the talent behind a just-finished journey. The feeling of triumph, slowly replaced by the creeping grayness of ordinary life.
I had finished a computer game. Funny as hell, it was the most horrible thing I could think of.
I’d argue that unfun design elements can be useful in games if used with care and purpose. For instance, “suddenly all of the characters you’re attached to are dead” is not exactly fun but one of the Fire Emblem games used it to great dramatic effect at the midway point.
Of course the line between an event or mechanic that players love to hate and one they just hate is thin.
Hoo boy, you weren’t kidding. I find it amazing how quickly this went from “the kernel team is enforcing sanctions” to an unfriendly, abstract debate about the definition of liberalism. I shouldn’t, really, but I still am.
Aw, c’mon! Who doesn’t enjoy piping ten megabytes of JavaScript through Webpack to achieve those crucial on-scroll effects on an otherwise static page?
Oh yeah, the equation completely changes for the cloud. I’m only familiar with local usage where you can’t easily scale out of your resource constraints (and into budgetary ones). It’s certainly easier to pivot to a different vendor/ecosystem locally.
By the way, AMD does have one additional edge locally: They tend to put more RAM into consumer GPUs at a comparable price point – for example, the 7900 XTX competes with the 4080 on price but has as much memory as a 4090. In systems with one or few GPUs (like a hobbyist mixed-use machine) those few extra gigabytes can make a real difference. Of course this leads to a trade-off between Nvidia’s superior speed and AMD’s superior capacity.
These days ROCm support is more common than a few years ago so you’re no longer entirely dependent on CUDA for machine learning. (Although I wish fewer tools required non-CUDA users to manually install Torch in their venv because the auto-installer assumes CUDA. At least take a parameter or something if you don’t want to implement autodetection.)
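(Rough sketch of what that autodetection could look like, in case anyone’s curious. The wheel index URLs and version tags below are placeholders that change with every PyTorch release, and sniffing for nvidia-smi/rocminfo on PATH is only a crude heuristic, but it would already cover the common cases:)

```python
# Sketch: pick a PyTorch wheel index based on which GPU tooling is present,
# instead of hard-coding the CUDA build. URLs/version tags are illustrative
# and will be stale by the time you read this.
import shutil
import subprocess
import sys

def pick_torch_index() -> str:
    """Guess the right PyTorch wheel index from the GPU tools on PATH."""
    if shutil.which("nvidia-smi"):
        return "https://download.pytorch.org/whl/cu121"   # CUDA build
    if shutil.which("rocminfo") or shutil.which("rocm-smi"):
        return "https://download.pytorch.org/whl/rocm6.0"  # ROCm build
    return "https://download.pytorch.org/whl/cpu"          # CPU-only fallback

def install_torch() -> None:
    index = pick_torch_index()
    subprocess.check_call([
        sys.executable, "-m", "pip", "install",
        "torch", "--index-url", index,
    ])

if __name__ == "__main__":
    install_torch()
```

Add a flag to override the guess and you’ve covered the rest.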
Nvidia’s Linux drivers generally are a bit behind AMD’s; e.g. driver versions before 555 tended not to play well with Wayland.
Also, Nvidia’s drivers tend not to give any meaningful information in case of a problem. There’s typically just an error code for “the driver has crashed”, no matter what reason it crashed for.
Personal anecdote for the last one: I had a wonky 4080 and tracing the problem to the card took months because the log (both on Linux and Windows) didn’t contain error information beyond “something bad happened” and the behavior had dozens of possible causes, ranging from “the 4080 is unstable if you use XMP on some mainboards” through “some BIOS setting might need to be changed” and “sometimes the card doesn’t like a specific CPU/PSU/RAM/mainboard” to “it’s a manufacturing defect”.
Sure, manufacturing defects can happen to anyone; I can’t fault Nvidia for that. But the combination of useless logs and 4000-series cards having so many things they can possibly (but rarely) get hung up on made error diagnosis incredibly painful. I finally just bought a 7900 XTX instead. It’s slower but I like the driver better.
I also only used v2 but it’s the extra stuff in it that slightly annoys me. Like how turbo mode (brighter than the usual maximum but usually time-limited to avoid overheating) is only available when the full UI is unlocked. Or how there’s a stepped ramp mode that I have to remember to disable whenever I swap out the battery. Or how I can accidentally enter one of the more exotic modes if for some reason I press the button too often.
Anduril is way overengineered. I like this UI that some of my lights have:
While off:
While on:
That’s pretty easy to learn and gives you all the functions you’d reasonably need (plus that strobe) without a lot of clutter.
Pretty much business as usual. Those things pop up every once in a while, which is why Germany has no shortage of experienced bomb disposal experts.
And yes, the fact that this is perfectly mundane is chilling if you think about it. That’s what intensely bombed areas are like for decades afterwards. Battlefields don’t just turn back to normal when the war is over. And we casually do this all over the world for a large number of usually stupid reasons.
I fix that problem by forgetting to turn off Do Not Disturb mode. Also works wonders for blocking all of the calls you do want to take.
They did PR campaigns against Linux and OpenOffice for quite some time – until cloud computing took off and it turned out they could earn more money by supporting Linux than by fighting it.
In fact, Microsoft weren’t happy about FOSS in general. I can still remember when they tried to make “shared source” a thing: They made their own ersatz OSI with its own set of licenses, some of which didn’t grant proper reuse rights – like only allowing you to use the source code to write Windows applications.
And this is why stuff like this should be defined in terms of days’ earnings to provide scaling. If an ultra-rich person gets jailed and has to post the equivalent of 20 billion dollars in bail, they can’t treat jail as a minor inconvenience.
Nope, they just become less predictable. Which is why in some parts of Germany you can’t build so much as a garden shed without having EOD check the land first. In the more heavily-bombed areas it’s not unusual to hear on the radio that you’re to avoid downtown today between 10 and 12 because they’re disarming a 500-pound bomb they found during roadwork.
And yes, the fact that an unstable bomb capable of trashing a city block is mundane nicely illustrates war’s potential to fuck things up for generations.
Japan might want to get that land under and around the airport checked. There might be some other surprises hidden down there.
You forgot the degaussing sound for those screens that had that feature. Like turning them on but louder.
*KLONK*
There’s less than thirty of these in the wild so seeing one end up as bycatch is a sobering reminder of the consequences of overfishing. If we don’t start taking ocean preservation seriously we might at some point find that not just the Virginia-class but all nuclear-powered cruise missile fast attack submarines have gone extinct.
And you can’t even safely eat them; they’re full of heavy metals like uranium.