Depending on your needs, you can also break it into a columnar format with some standard compression on top. This allows you to search individual fields without looking at the rest.
It also compresses exceptionally well: "rare" fields will be null in most records, so run-length encoding will compress them to near zero.
See e.g. Parquet.
That we stop fawning over tech CEOs
Thank you for saying this. Sometimes I feel like I am the only one thinking like this 🙇♥️
Pet peeve: don’t use string keys. It invites key-collision errors. Use the fact that Go supports structs as keys in maps. Safer and more efficient.
You should probably change the page content entirely, server-side, based on the user agent and request IP.
Using CSS to change layout based on the request has long since been “fixed” by smart crawlers. Even hacks that use JS to show/hide content are mostly handled by crawlers.
I am super excited for this release. I think varargs min/max() built-ins are my favorite feature. Closely followed by clear() and improved type-param inference
The context package is such a big mistake. But at this point we just have to live with it and accept our fate because it’s used everywhere
It adds boilerplate everywhere, is easily misused, can cause resource leaks, and has highly ambiguous connotations for methods that take a ctx: does the function do IO? Is it cancellable? What are the transactional semantics if you cancel the context during method execution?
Almost all devs just blindly throw it around without thinking about these things
And don’t get me started on all the ctx.Value() calls that traverse a linked list.