Submitted by [deleted] t3_10xoxh6 in Futurology
TheSecretAgenda t1_j7ye8jr wrote
Reply to comment by Sasuke_1738 in Can't we just control the development of AI? by [deleted]
That would be the worst possible outcome. A superpowerful intellect paired with human emotions would be very dangerous, in my opinion.
Sasuke_1738 t1_j7yf0l0 wrote
Wouldn't we technically be able to control them better if we were smarter?
TheSecretAgenda t1_j7yfnr7 wrote
I don't think they will be dangerous. Decisions based on pure logic, detached from emotion, will be better for humanity.
Sasuke_1738 t1_j7yfx2k wrote
Lol, we'd be Vulcans 🖖
TheSecretAgenda t1_j7yg9qm wrote
Would that be a bad thing?
Sasuke_1738 t1_j7ygh5a wrote
I mean, wouldn't we start to lose connection to our cultures and basically just evolve into artificial beings, without what makes us human?
TheSecretAgenda t1_j7ygt5l wrote
No, humans would go about their business for the most part. But if you wanted to create a product that was dangerous or polluting the AI would stop you. If you wanted to hoard resources, the AI would stop you. If you wanted to go to war with another country the AI would stop you. The AIs would be like children caring for a senile parent.
Sasuke_1738 t1_j7yh8n6 wrote
But the AI would be merged with you, meaning the AI isn't making those decisions; the person is.