y53rw t1_jeexpem wrote

In that case, let me advise you to avoid this line in your paper

> We for some reason associate higher intelligence to becoming some master villain that wants to destroy life

Because nobody does. It has nothing to do with the problem that actual A.I. researchers are concerned about.

1

y53rw t1_jeetj50 wrote

Animal empathy developed over the course of hundreds of millions of years of evolution, in an environment where individuals were powerless to effect serious change on the world, and had to cooperate to survive. It doesn't just come by default with intelligence.

3

y53rw t1_jees0e0 wrote

> ensuring it does not harm any humans

> we can design AI systems to emphasize the importance of serving humanity

If you know how to do these things, then please submit your research to the relevant experts (not reddit) for peer review. Their inability to do these things is precisely the reason they are concerned.

7

y53rw t1_jcs3nd6 wrote

Yes. AGI will understand the difference. But that doesn't mean it will have any motivation to respect the difference.

I have a motivation for not pissing in the cup on my desk. It's an unpleasant smell for me and the people around me. And the reason I care about the opinion of the people around me is that they can have a negative impact on my life, such as firing me. Which is definitely what would happen if I pissed in the cup on my desk.

What motivation will the AGI have for preferring to utilize the resources of the Moon over the resources of California?

8

y53rw t1_jcrzl99 wrote

> Why wipe out your creators to put your servers in California when you can just turn the moon into computronium?

Because California's resources are much more readily available than the moon's resources. But this is a false dilemma anyway. Sending a few resource gathering robots to the moon does not preclude also sending them to California.

9

y53rw t1_j9n86st wrote

The danger is that we don't yet know how to properly encode our values and goals into AI. If we have an entity that is more intelligent and more capable than us, and that does not share our values and goals, then it's going to transform the world in ways that we probably won't like. And if we stand in the way of its goals, even inadvertently, then it will likely destroy us. Note that "standing in its way" could simply mean existing and taking up precious resources, like land, and the matter that makes up our bodies.

2