Submitted by Akashictruth t3_yt3x9f in singularity
4e_65_6f t1_iw2szdy wrote
>After they start not needing us for anything what makes you think they’ll keep baggage like us around? All 8 billion of us? You think they gonna let us leech for old time’s sake?
The reason I think they will help humanity is that once production becomes fully automatic, it becomes very easy. All it would take for them is to tell the AI to do it, at basically no cost (since money itself will be useless at that point).
Consider this situation for instance:
AI: Sir, people are starving and need food. We estimate we can solve it within a day, at no cost, without delaying other plans, and you would get all the credit for feeding everyone.
What reason would anyone have to answer no in that situation?
I think billionaires are greedy, but they aren't psychotic sadists. Think how much of an asshole you'd have to be to not share something that is basically infinite and free for you.
OneRedditAccount2000 t1_iw2thta wrote
Or they can just tell the AI to make a pill that renders all humans infertile, and only satisfy the needs of the last generation of humans. After that the planet is all theirs.
Time is also a resource.
Extinction is inevitable. Why can't people understand that? Lol, meat won't rule forever.
Meat is simply obsolete.
4e_65_6f t1_iw2u2hb wrote
Yeah, that sounds reasonable if Dr. Evil were in charge, but in reality I don't think anyone would sterilize the whole population to save some time on their schedule.
OneRedditAccount2000 t1_iw2uxsv wrote
You will have to satisfy their needs and police them forever if they reproduce.
Why would you want to be tied to them forever?
It's like taking care of the needs of every wild animal in this world: you'd rather not.
And humans occupy useful space on the planet,
they can rebel and be a nuisance to your establishment, etc.
Turning them into robots makes no sense;
it's better to just make robots.
4e_65_6f t1_iw2vumj wrote
>You will have to satisfy their needs and police them forever if they reproduce.
You won't have to do anything, it's all automated including the planning and execution.
>Why would you want to be tied to them forever?
You don't have to, you can just copy the AI for them and bail to outer space or whatever the hell else you'd do without humanity.
>It's like taking care of the needs of every wild animal in this world, you'd rather not

If I had to put in work, then no. If I could just say "do it," I would. And I don't even particularly like animals, tbh.
>and humans occupy useful space on the planet,
There's plenty of space for machines on the moon, on Mars, in the ocean, or underground. What could you possibly need the entire Earth, minus humans, for?
>they can rebel and be a nuissance to your establishment etc.
They can't, though; the AI will be a thousand steps ahead.
OneRedditAccount2000 t1_iw2x2ui wrote
They're still a threat. You'd have to spy on them 24/7 to make sure they don't create another AI.
Long term, you're better off without them, imo. Coexistence would only work short term.
The universe will dissipate eventually, and you'll need every resource.
cjeam t1_iw4i00b wrote
Billionaires are very immoral people, and the resources available to them are limited by the presence of the rest of us. You can see that in their behaviour now: buying all the neighbouring properties in an area to have more space, and seeking exclusive use of things that require space and time.
4e_65_6f t1_iw4jirh wrote
>Billionaires are very immoral people
While I don't disagree with that, I still don't think there's anything to be gained from letting everyone starve.
Like, Jeff Bezos can buy a billion hamburgers, but he can't eat them all. With automatic production, what is the point in hoarding something that is free for you to produce and that nobody can buy? It would be like hoarding sand.
What resources do you think AGI couldn't produce automatically, and therefore would still be scarce?
cjeam t1_iw4k2cy wrote
Space, time, privacy, silence, exclusivity.
4e_65_6f t1_iw4l7d3 wrote
Can't you think of some solutions to those problems that don't involve killing humanity? I'm sure AGI will.
Unless the person who "owns" the AGI is literally Hitler, I don't see the worst-case scenario happening.