Submitted by izumi3682 t3_xxelcu in Futurology
whatTheBumfuck t1_irc43v3 wrote
Reply to comment by Leanardoe in White House Releases Blueprint for Artificial Intelligence Bill of Rights by izumi3682
Generally speaking, it's better to do something slowly, at a more controlled pace, if you intend to do it safely. The thing with AGI is you can really only fuck it up once — then the next day your civilization has been turned into a paper clip factory. In the long run, things like this are going to make positive outcomes more likely.
Leanardoe t1_irchxfz wrote
I see your point, I just think placing roadblocks now is premature. If we get to the point where AI is starting to tread the line of independent thought, that's when I think limits and guidelines need to be made — in case of the unlikely Terminator event everyone fears lol.