
Artanthos t1_j9z1p6b wrote

You assume AGI will be sentient, possess free will, be hostile, and have access to the tools and resources to act on that hostility.

That’s a lot of assumptions.

I would be far more worried about an alignment issue and having everything converted into paperclips.
