Hurtling Toward Extinction Submitted by MistakeNotOk6203 t3_11b2iwk on February 24, 2023 at 9:24 PM in singularity 26 comments 11
Artanthos t1_j9z1p6b wrote on February 25, 2023 at 4:50 PM You assume AGI will be sentient, possess free will, be hostile, and have access to the tools and resources to act on that hostility. That’s a lot of assumptions. I would be far more worried about an alignment issue and having everything converted into paperclips. 2