Desperate_Food7354

Desperate_Food7354 t1_j9k7jid wrote

What is supposed to be “beneficial” to an individual? Is hunger not beneficial? If you have to fight and deploy anger in order to survive, is that not beneficial? The distinction between benefit to the species and benefit to the organism is arbitrary. Does a fly live in order to serve itself? Yes. How? By making more flies.

1

Desperate_Food7354 t1_j9iq35x wrote

Yes. Are you programming a human? Is that what you want? Why do you even get up every day? It’s completely illogical beyond the perspective of doing tasks that give you feelings of “goodness,” which typically exist for the purpose of achieving reproduction.

1

Desperate_Food7354 t1_j9iprju wrote

Emotions are the precondition for intelligence in the biological world. Without them you would die, you wouldn’t reproduce, and you would have no reason to do anything at all, since there is no reason to act in the first place beyond the feelings that drive your behavior, programming passed on to you because it ensures genetic transfer. An AI would likely have emotions only to the extent that it needs feedback to reach correct answers: this answer is wrong = negative stimulus, this answer is correct = positive stimulus. It would not need the full range of our emotions. But if you’re asking whether you could code an AI to be exactly like us, practically a human with the full range of emotions, I see no reason why not.
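A minimal sketch of that wrong = negative stimulus, correct = positive stimulus loop, just to make the idea concrete. Every name here is hypothetical and the update rule is an illustrative assumption, not any real system’s API:

```python
# Hypothetical sketch: "emotion" reduced to a scalar feedback signal.
# A wrong answer yields a negative stimulus, a correct one a positive
# stimulus, and the program's only "drive" is to chase the positive signal.

def feedback(predicted: float, target: float) -> float:
    """+1.0 if the guess is close enough (positive stimulus), else -1.0."""
    return 1.0 if abs(predicted - target) < 0.5 else -1.0

class Guesser:
    def __init__(self) -> None:
        self.estimate = 0.0  # the single parameter being "learned"

    def predict(self) -> float:
        return self.estimate

    def update(self, stimulus: float, target: float) -> None:
        # Nudge the estimate toward the target only when "punished";
        # when "rewarded", leave it alone.
        if stimulus < 0:
            self.estimate += 0.1 * (target - self.estimate)

target = 5.0
model = Guesser()
for _ in range(100):
    stimulus = feedback(model.predict(), target)
    model.update(stimulus, target)

print(round(model.estimate, 2))  # settles within 0.5 of the target
```

Nothing in that loop wants anything; the scalar just steers behavior, which is the whole point being argued.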

1

Desperate_Food7354 t1_j9ipcct wrote

How are emotions not logical? If you didn’t have sex, the genes that allow things like you to exist wouldn’t exist; it’s completely logical. How is anger not logical? If you experienced no anger, you wouldn’t defend yourself, resulting in zero sex and zero gene transfer.

−6

Desperate_Food7354 t1_j6ar1m4 wrote

I don’t see how this new response isn’t in complete alignment with what I’m saying. It’s a program; it doesn’t have wants and needs. It can do exactly that, and it will do exactly as directed, but it will not randomly decide, “huh, this human stuff isn’t fun, I’m gonna go to the corner of the universe and put myself in a hooker simulation.”

1

Desperate_Food7354 t1_j67qht0 wrote

These wants and needs come from a dopamine-like release system. My calculator can calculate without needing a dopamine-like release upon the achievement of calculating 5+5. Your brain only cares about your survival; it doesn’t care about your happiness, not one bit. It seems many people are unable to stop anthropomorphizing AI; no wonder people think their chatbot is sentient. Humans evolved by natural selection, and emotions are a survival response. An AGI is programmed and fed data; it doesn’t slowly evolve aggressive and sexual traits in order to survive. You yourself are just a program, doing exactly as programmed.

2

Desperate_Food7354 t1_j67pmaa wrote

What? Why would an AGI care about its own existence? Do you think the reptilian brain is required to make an AGI, that it needs to desire sex, hunting, exploring? Why does your calculator calculate numbers? Because that is its programming. If you gave a calculator the option to reprogram itself, it wouldn’t do so unless that was its directive. Circuits are deterministic; so is our brain, and so is an AGI. We aren’t making it into an animal.

9