The US Air Force wants $5.8 billion to build 1,000 AI-driven unmanned combat aircraft, possibly more, as part of its Next Generation Air Dominance initiative.
The unmanned aircraft are ideal for suicide missions, the Air Force says. Human rights advocates call the autonomous lethal weapons “slaughterbots.”
You’re right on all counts here.
Computer algorithms (such as AI) can’t replace organic, judgement-based decision making, but they vastly outperform humans when there is a well-defined cost function to optimize against, such as “hit this target in the minimum possible time”.
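To make that concrete, here’s a rough sketch of what a “well-defined cost function” looks like in practice. The scenario, speeds, and capture radius are all invented for illustration; the point is just that once “minimum possible time” is a number, a dumb numerical search can grind it down with no judgement involved.

```python
# Toy pursuit problem: pick the heading that intercepts a constant-velocity
# target in the least time. Every number below is made up for the example.

import math

TARGET_START   = (1000.0, 500.0)   # target's initial position relative to us, metres
TARGET_VEL     = (-50.0, 20.0)     # target's constant velocity, m/s
PURSUER_SPEED  = 300.0             # our constant speed, m/s
CAPTURE_RADIUS = 5.0               # "hit" = coming within 5 m
NO_INTERCEPT   = float("inf")      # cost assigned to headings that never intercept

def time_to_intercept(heading_rad: float) -> float:
    """Cost function: earliest time at which a straight-line run on this heading
    brings us within CAPTURE_RADIUS of the target, or infinity if it never does.
    Both craft fly straight at constant speed, so the relative motion is linear
    and the crossing time is the root of a quadratic."""
    rx, ry = TARGET_START
    vx = TARGET_VEL[0] - PURSUER_SPEED * math.cos(heading_rad)
    vy = TARGET_VEL[1] - PURSUER_SPEED * math.sin(heading_rad)
    a = vx * vx + vy * vy
    b = 2.0 * (rx * vx + ry * vy)
    c = rx * rx + ry * ry - CAPTURE_RADIUS ** 2
    disc = b * b - 4.0 * a * c
    if a == 0.0 or disc < 0.0:
        return NO_INTERCEPT
    t = (-b - math.sqrt(disc)) / (2.0 * a)   # earlier of the two crossing times
    return t if t >= 0.0 else NO_INTERCEPT

# "Optimisation" here is a brute-force sweep over headings in 0.1-degree steps;
# the machine needs no judgement, only a number to minimise.
best_heading = min(
    (math.radians(d / 10.0) for d in range(3600)),
    key=time_to_intercept,
)
print(f"best heading: {math.degrees(best_heading):5.1f} deg, "
      f"intercept in {time_to_intercept(best_heading):.2f} s")
```

Swap the brute-force sweep for a proper optimiser or a trained policy and the principle is the same: the machine only excels because someone has already reduced the objective to a number.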
I think you can compare it to autonomous cars: they can drive from point to point while avoiding hazards along the way, but they still need the passenger to tell them what the destination is.
You’re missing the point. These drones can pull Gs that would kill a human pilot.
You don’t think that was implied when I said they vastly outperform human pilots?
There are numerous advantages to letting a flight computer do the piloting. Higher allowable G limits are one of them, albeit far from the most important.