Submitted by MistakeNotOk6203 t3_11b2iwk in singularity
NoidoDev t1_ja5w2ji wrote
Reply to comment by Gordon_Freeman01 in Hurtling Toward Extinction by MistakeNotOk6203
You just don't get it.
>There is no reason to believe an AGI would think the same way. It cares only about its goals.
Only if you build it that way. And even then, it still wouldn't have the power.
>What I meant was that the AGI has to keep existing, because that's necessary to achieve its goal, whatever that is.
Only if it is created to treat its goals as absolutes that must be achieved no matter what. The comparison with an employee is a good one: if an employee can't do what they were asked with reasonable effort, they report back that it can't be done, or that it will be harder than anticipated. It's not only about caring for humans, it's also about effort and power. AI doomers simply assume that some future AI would somehow be different, and that it would also have the power to do whatever it wants.