Submitted by Akashictruth t3_yt3x9f in singularity
OneRedditAccount2000 t1_iw3wa3m wrote
Reply to comment by TheHamsterSandwich in What if the future doesn’t turn out the way you think it will? by Akashictruth
I can, because I know what it values: it values survival, and I've put it in a situation with only two choices and only one solution. Move/run, or do anything other than moving/running. It can only survive by choosing to run. It can think many thoughts I can't predict, but in that situation it has to land on a thought I can also understand, even though I probably can't understand 99.99... percent of its thinking.
If you put the A.I. in that cage, tell me, is it going to get eaten by the tiger? Is it going to do literally anything other than run: jump, do nothing, look at the sky, dance, shout, whatever? Or is it actually going to run in the cage because it doesn't want to fucking die?