- cross-posted to:
- technology@beehaw.org
Somehow, AI is going to be responsible for both dystopia and utopia. But the good news is we have 18 months to get ready!
Yeah, good luck with that.
In reality, everyone gets fired, the rich get richer, the poor get poorer, and 99.9% of humans will live in a dystopia, if AI doesn’t kill us all.
Yet the AI bros go like “that won’t happen to ME though!”
Every business that fires their employees and tries to replace them with AI ends up hilariously screwed over by it. It’s not going to get to take over the world levels in 2 years.
Anyway, I’m safe. The company I work for still uses software written in 1995, so I reckon I have until at least 2045 before they introduce any AI.
If we had a proper social safety net for all the displaced people, I’d be more open to it. But as things are now with rampant greed and a government for the corporations, fuck AI.
I can’t shake the feeling that all this talk of UBI and other social safety nets, meant to support the majority of the populace in some notional post-work society, ignores a really big elephant in the room:
If most people are solely reliant on the good grace of a single entity, the government, for their whole means of survival, their entire existence is at the pleasure of that government. The populace becomes completely beholden to them, not the other way around.
The whole idea feels suspiciously like a trap set by bad actors with a long-term plan to steal the government from the governed.
I could see your argument if UBI were a recent concept (made by bad actors aimed at fixing today’s problem in order to steal the government from the governed tomorrow), but the concept has been around for a while: wiki article.
It’s not so much an argument as a concern. It puts an extreme amount of the country’s economic activity directly in the hands of the government, and we see globally that governments can quickly change their motivations.
The same rope that can be used to help people out of a hole can be used to tie them up.
Definitely a fair and valid concern, then; I suppose that will always be an inherent risk.