  • You can always perform integer operations in smaller chunks and combine the results to simulate values too big to fit in a register. Python even does this transparently for you, so your integers can be as big as you want.
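
    For illustration, here’s a minimal Python sketch of that chunking trick: adding two numbers stored as 32-bit “limbs” with a carry, the same way big-integer libraries do it under the hood (the function name and limb layout are just this sketch’s conventions).

        MASK32 = 0xFFFFFFFF  # everything an imaginary 32-bit register can hold

        def add_multiword(a_limbs, b_limbs):
            """Add two equal-length lists of 32-bit limbs, least significant limb first."""
            result, carry = [], 0
            for x, y in zip(a_limbs, b_limbs):
                total = x + y + carry
                result.append(total & MASK32)  # keep only the low 32 bits
                carry = total >> 32            # carry spills into the next limb
            if carry:
                result.append(carry)
            return result

        # 0xFFFFFFFF + 1 overflows one 32-bit "register" into a second limb:
        print(add_multiword([0xFFFFFFFF], [0x00000001]))  # [0, 1], i.e. 2**32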

    The fundamental problem that pushed us to 64-bit was the need to address more than 4 GB of RAM. It’s similar to the Internet’s problem, where 4 billion unique IP addresses fall rather short of what we need. IPv6 has a host of improvements, but the massively expanded address space is what gets talked about the most, since that’s what is desperately needed.
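
    For anyone who wants the exact numbers, the arithmetic behind both ceilings:

        # 32-bit addresses, one byte per address:
        print(2**32)    # 4294967296 bytes = 4 GiB of addressable RAM
        # IPv4 vs IPv6 address space:
        print(2**32)    # ~4.3 billion IPv4 addresses
        print(2**128)   # ~3.4 * 10**38 IPv6 addresses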

    Going back to RAM though, it’s sort of interesting that at the lowest levels, memory is accessed in chunks larger than 8 bits, and that’s been the case for a long time now. CPUs have to preserve the illusion that an 8-bit byte is the smallest addressable unit of memory, since software would break badly were this not the case, but it’s somewhat amusing to me that we still shouldn’t really need more than 32 bits to address RAM at the lowest levels, even with the 16 GB I have in my laptop right now. I’ve worked with 32-bit microcontrollers where the byte size is > 8 bits, and yeah, you can have plenty of addressable memory that way if you want.
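
    To put numbers on that, a quick back-of-the-envelope in Python, assuming hypothetical machines whose smallest addressable unit is wider than 8 bits:

        UNITS = 2**32  # distinct locations a 32-bit address can name

        print(UNITS * 1 // 2**30)  # 8-bit unit:  4 GiB reachable (the classic limit)
        print(UNITS * 2 // 2**30)  # 16-bit unit: 8 GiB with the same 32-bit address
        print(UNITS * 4 // 2**30)  # 32-bit unit: 16 GiB, a whole laptop's worth of RAM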








  • I’m in a band that performs on occasion at CFBs (Canadian Forces Bases). We typically eat there and spend the night either in barracks or guest housing.

    I have noticed that when we play for officers, dinner is like steak and lobster. When we play for enlisted, it’s more like a high school cafeteria. The one and only time I had to excuse myself towards the end of a concert and miss the closing number was after eating at the enlisted mess and getting explosive diarrhea.




  • Falsehoods About Time

    Having a background in astronomy, I knew going into programming that time would be an absolute bitch.

    Most recently, I thought I could write a script to project when Easter would land each year and mark it on office timesheets. After spending an embarrassing amount of…er…time on it, I gave up and downloaded a table of pre-calculated dates. I suppose at some point, assuming the code survives that long, it will have a Y2K-style moment when the table runs out, but I didn’t trust my own algorithm over the table. I do think it is healthy, if not essential, not to trust your own code.
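
    For reference, this is actually a solved (if inscrutable) problem: the Anonymous Gregorian Computus, often attributed to Meeus/Jones/Butcher, computes Easter in pure integer arithmetic. A Python rendering, where the single-letter names are the algorithm’s traditional ones, not mine:

        def gregorian_easter(year):
            """Anonymous Gregorian Computus (Meeus/Jones/Butcher algorithm)."""
            a = year % 19                       # year's place in the 19-year Metonic cycle
            b, c = divmod(year, 100)
            d, e = divmod(b, 4)
            f = (b + 8) // 25
            g = (b - f + 1) // 3
            h = (19 * a + b - d - g + 15) % 30  # "epact", roughly the moon's age
            i, k = divmod(c, 4)
            l = (32 + 2 * e + 2 * i - h - k) % 7
            m = (a + 11 * h + 22 * l) // 451
            month, day = divmod(h + l - 7 * m + 114, 31)
            return year, month, day + 1

        print(gregorian_easter(2024))  # (2024, 3, 31), i.e. March 31st, which is correct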

    Falsehoods About Text

    I’d like to add “Splitting at a code-point boundary is safe” to your list. Man, was I ever naive!
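
    For anyone who hasn’t been burned by this one yet, a quick Python demonstration: slicing at code-point boundaries happily cuts a single visible character in half.

        # One on-screen character is not necessarily one code point.
        flag = "\U0001F1E8\U0001F1A6"[:0] or "\U0001F1E8\U0001F1E6"  # Canadian flag
        flag = "\U0001F1E8\U0001F1E6"  # two regional-indicator code points
        print(len(flag))               # 2
        print(flag[:1])                # a lone regional indicator; the flag is gone

        accented = "e\u0301"           # 'e' + combining acute accent, renders as one glyph
        print(accented[:1])            # bare "e"; the accent got split off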





  • I think I could get very nervous coding for the military, depending on what sort of application I was working on. If it were some sort of administrative database, that doesn’t sound so bad. If it were a missile guidance system, oh man! A single bug and there goes a village full of civilians. Even something without direct human casualties could be nerve-wracking, like if it were your code that bricked a billion-dollar military satellite.

    Speaking of missile guidance systems, I once met someone who had worked a stint for a military contractor. He told me a story about a junior dev who discovered an egregious memory leak in a cruise missile’s software. The senior dev told him, “Yeah, I know about that one. But the leak takes an hour to bring the system down, and the missile’s maximum flight time is less than that, so no problem!” I think coding like that would drive me into some OCD hell.