  • There’s an idea about “autistic AI” or something, where you give an AI an objective like “get a person from point A to point B as fast as you can,” and the AI goes so fast the g-force kills the person — but the AI counts it as a success, because you never told it to keep the person alive.

    Though I suppose that’s more of a human error: an assumption we take as a given, but a machine will not.
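
    The scenario above can be sketched as a toy optimizer. This is just an illustration I’m adding, with made-up numbers (the distance, the candidate accelerations, and the ~9 g survivability limit are all assumptions): an optimizer told only to minimize travel time picks the lethal option, and only behaves sensibly once the implicit “keep the passenger alive” constraint is spelled out.

    ```python
    import math

    DISTANCE_M = 1000.0      # point A to point B (arbitrary)
    MAX_SURVIVABLE_G = 9.0   # rough human tolerance, an assumption
    G = 9.81                 # m/s^2 per g

    def travel_time(accel_g: float) -> float:
        """Time to cover DISTANCE_M from rest at constant acceleration."""
        a = accel_g * G
        return math.sqrt(2 * DISTANCE_M / a)

    def fastest(candidates, require_survivable=False):
        """Pick the acceleration with the lowest travel time.

        With require_survivable=False, the optimizer does exactly what it
        was told and nothing more -- the unstated constraint is ignored.
        """
        if require_survivable:
            candidates = [g for g in candidates if g <= MAX_SURVIVABLE_G]
        return min(candidates, key=travel_time)

    accels = [1, 5, 9, 50, 1000]  # candidate accelerations in g

    # Literal objective: fastest wins, passenger dies.
    print(fastest(accels))                           # -> 1000
    # Same optimizer, implicit constraint made explicit.
    print(fastest(accels, require_survivable=True))  # -> 9
    ```

    The point being: the “bug” isn’t in the optimizer, which worked perfectly — it’s in the objective, which left out something a human would never think to state.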