• 0 Posts
  • 14 Comments
Joined 1 year ago
Cake day: June 21st, 2023

  • Oil and gas products account for 4.2% of Sweden’s exports. The gas exports alone almost rival those of dairy and eggs! Truly a petrostate if I ever saw one

    Well, the largest categories are:

    • Machinery, nuclear reactors and boilers: about 14%. The nuclear part of this is quite small in Sweden, so machinery is the big part.
    • Vehicles other than railway/tramway: also about 14%. This is largely the big car and truck manufacturers, Volvo, Volvo Cars and Scania.
    • Electrical and electronic equipment: 8.7%, with large companies like Ericsson.
    • In fourth place, mineral fuels, oils and distillation products: 7.4%. There are no domestic sources, though, so this is mostly refining of imported goods.

    https://tradingeconomics.com/sweden/exports-by-category


  • balp@lemmy.world to Linux@lemmy.ml · XZ backdoor in a nutshell · 7 months ago

    I would say yes and no. Yes, the clone command can do it, but branching and CI get a bit more complicated, and pushing and reviewing changes makes it harder to keep an overview. If the functionality, and especially the release cycle, is different, submodules still have great value. As always, your product and repo structure is a mix of different considerations and always a compromise. I think the additions to git in recent years have made the previously really bad pain points with bigger repos less annoying, so I now see more situations where it works well.
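
    As a rough sketch of that trade-off (the repository URL and paths are made up for illustration, not taken from this thread):

    ```
    # One clone can still fetch everything, submodules included:
    git clone --recurse-submodules https://example.com/product/main-repo.git
    cd main-repo
    git submodule update --init --recursive   # make sure nested submodules are present

    # But changing a submodule means a commit in the submodule *and* a commit
    # in the superproject that records the new pointer, reviewed separately:
    cd libs/some-lib
    git switch -c fix-bug
    # ...commit and push the fix, run that repo's own CI and review...
    cd ../..
    git add libs/some-lib
    git commit -m "Bump some-lib to pick up the bug fix"
    ```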

    I always recommend keeping all testing in the same repo as the code that affects the tests. It makes tracking changes in functionality easier; needing to coordinate commits, merges, and branches across more than one repo is a bigger cognitive load.


  • balp@lemmy.world to Linux@lemmy.ml · XZ backdoor in a nutshell · 7 months ago

    It’s also easier to work when one simple git command can get everything you need, so there is a good case for a bigger mono-repo. It should be easy to debug tests at all levels, otherwise it’s hard to fix the issues that the bigger tests find. Many recent changes in git make the downsides of a bigger repo less painful, and the gains now start to outweigh the losses.
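
    The git changes I have in mind are things like partial clone and sparse-checkout, which let you keep one big repo without every checkout paying for all of it. A minimal sketch (the URL and paths are just examples):

    ```
    # Partial clone: skip downloading file contents until they are needed.
    git clone --filter=blob:none --sparse https://example.com/big/mono-repo.git
    cd mono-repo

    # Only materialise the parts of the tree you actually work on,
    # e.g. a component and the tests that exercise it.
    git sparse-checkout set services/payment services/payment-tests
    ```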





  • I’d say losing its protection is an over-simplification. Yes, when a significant military target is misusing the protected signs, they can be ignored. You are still not allowed to break the first rules around minimising civilian losses, so the civilian people in the area are still protected; they don’t lose their protection. Injuries to civilians should always be kept minimal. If the hospital is cleared out and is only used for military operations (treating wounded soldiers is not a military operation here), then the symbols of the hospital don’t have any protection.

    Now, if you attack from within anything that is protected, you should expect to be attacked. At the same time, any attack on the enemy has to be proportional and should always strive to minimise civilian losses, including civilian material losses. An example is a sniper hiding in a civilian building: it’s probably a war crime to bomb the house, even when there are no civilians in it. If there is a full platoon using the same place for defence, and the terrain is hard, the same bombing is probably acceptable. If the building also holds a few hundred civilians, it is back in probable war-crime territory to bomb it out, and some other way of taking the objective has to be considered. And this is looking at a house without any special protection.

    The same goes for an ambulance: if there are enemy combatants in the ambulance attacking you, of course you can take it out. If there are enemy combatants in the ambulance who are not attacking you, you can take them prisoner, but you can’t shoot first to make sure there are no combatants in it. For me, if there has been a bombing of a hospital building with masses of patients inside, it’s a clear war crime, even if there were a few military in the same building.


  • There are protected symbols, such as the red cross and red crescent, that you are not allowed to misuse. But it is also clear that everyone is responsible for not committing any war crimes, so even if your enemy commits a war crime, you are still not allowed to. Normally the only protected places among the ones you have listed are hospitals. Everyone has to do their best to protect civilians; you are not allowed to engage an enemy if there is a big risk of harming non-combatants. E.g. if a small group of enemy combatants is travelling through an area filled with civilians, you are not allowed to target that area just because of those few targets, and military targets have to travel in areas with civilians all the time. Targeting civilians is always a war crime. All actions have to be proportional.

    There is never an excuse to commit a war crime. Even if the enemy targets hospitals, as Russia is accused of doing in Ukraine, that does not give Ukrainian forces any permission to attack Russian hospitals. If Hamas shoots civilians, it doesn’t excuse the IDF doing the same. Even if the IDF shoots civilians, Hamas is not allowed to do the same.

    In technical terms, you could discuss the theoretical coverage of the Geneva Conventions as such for the Gaza strip. The conventions and the amendments around them may or may not regulate how Israel and the IDF are allowed to operate; they were written for regular wars between nations, and if, when, and how they apply to civil wars is a bit more unclear. There are, however, regulations about occupied territories, which Gaza falls under.




  • I blame the rise of frameworks, libraries, and IDEs

    Without good libraries and frameworks, we can hardly get any software working in today’s environment. We get stuck with a slow development cycle and end up with software that doesn’t do what the users want of it. A few years ago, I was at a customer using an old Linux distribution at their customer’s site. For contractual reasons it was not being upgraded to the latest version, and they had skipped keeping up to date with changes as they came. Every step of development became a hassle, and the good programmers there were not able to deliver features at any predictable rate. There were issues with HTTPS: most webservers today mandate at least TLS 1.2, but when the OS only supports SSLv2, SSLv3, and TLS 1.1, connecting to the internet, well, gets hard.
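
    For what it’s worth, the mismatch is easy to demonstrate from the command line; something like this (the host is just an example) simply fails to handshake on a stack that cannot speak TLS 1.2:

    ```
    # Force a TLS 1.2 handshake; an OS whose TLS library stops at TLS 1.1
    # (or only speaks SSLv2/SSLv3) cannot complete it.
    openssl s_client -connect example.com:443 -tls1_2

    # Same check with curl: refuse anything older than TLS 1.2.
    curl --tlsv1.2 https://example.com/
    ```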

    Having to develop all functionality from the ground up means the features the customers need never get released. With most developers I have worked with, using good libraries also makes the implementations less prone to serious bugs.


  • balp@lemmy.world to Programming@programming.dev · Is software getting worse? · 1 year ago

    I assume you never worked in testing. Back in the day, we used to cram testing into a weekend because the developers were late with their coding. There was no test automation, so that weekend we spent all the time on the most basic functionality, barely getting through testing that the app started and that some of the most basic functions worked. There was almost never any time for regression testing, and old functions broke all the time. It wasn’t uncommon that we shipped a bug fix in one version, just to reintroduce the same bug in the next release.


  • It seems to me that the author doesn’t remember all the struggles we had back then, with bugs, features not working, and masses of needed functionality that never got shipped into the hands of users. It also strikes me that maybe there is a bit of nostalgia, just a bit of reluctance to change his ways. He found a workflow around the missing functionality that might be blocking for others, and he has a harder time adjusting to the new functionality.

    A bit like my father, who refused to change his workflow. To make images for webpages (all static) he used several different Amiga programs: one could scale the images, one could edit them and add lines and such, one helped him make image maps, and one converted them to jpg/png, since the anim files used by everything else on the Amiga didn’t work well on the internet.

    Bug testing back then was awful; we never had time to catch any issues but the biggest. The time plan for the release was fixed years ahead, and so was the functionality that was needed. All the time needed for testing was eaten up by the developers working right up to the final ship to customers, trying to make the software actually run.

    It wasn’t uncommon for test teams to try to cram months of testing into a weekend so the software could ship on Monday morning, including masses of needed bug fixes during that weekend, with no one knowing which code each issue had actually been tested on. Remember that version control systems were hardly used, there was no CI build system, and all software was built on some random developer’s workstation, maybe with some additional changes for his or her convenience. No, software development has come a long way since the 90s. A very long way!