Captain_Clark t1_j8w8dcq wrote

Correct, it is not sentient.

Now consider: Every organic intelligence is sentient. That’s because intelligence evolved for a reason: to serve the sentience that enables the organism to survive.

Sentience is the foundation upon which intelligence has evolved. It is the necessary prerequisite for intelligence to exist. That holds true for every level of intelligence in every living creature. Without sentience, there’s no reason for intelligence.

So it’s quite a stretch to consider something with neither that foundation nor that reason to be intelligence at all. It’s something. But it’s not intelligence. And ChatGPT has no reason, other than our own.

You can create a website for me. But unless you have a reason to, you won’t. That is intelligence.

2

donniedenier t1_j8whhzc wrote

and we evolved to develop an intelligence that is smarter and more efficient than any one person on the planet can be.

so now instead of hiring a team of people to build me a website, i have the ai build it for me.

i’m not saying we don’t need humans, i’m saying we’re making 50%+ of our labor entirely obsolete, and we need a plan on what to do next.

3

Captain_Clark t1_j8x3v5r wrote

Which is fine. I merely wish to suggest to you that if you consider ChatGPT to be intelligent, you devalue your own intelligence and your reason for having it.

Because by your own description, you’ve already implied that ChatGPT is more intelligent than you.

So I’d ask: Do you really want to believe that a stack of code is more intelligent than you are? It’s just a tool, friend. It only exists as human-created code, and it only does one thing: Analyze and construct human language.

Whereas, you can be intelligent without using language at all. You can be intelligent by simply and silently looking at another person’s face.

And the reason I’m telling you this is because I consider it dangerous to mistake ChatGPT for intelligence. That’s the same fear you describe: The devaluing of humanity, via the devaluing of human labor. But human labor is not humanity. If it were so, we could say that humans who do not work are not intelligent - even though most of us would be perfectly happy if we didn’t have to work. Which is why we created ChatGPT in the first place.

It once required a great deal of intelligence to start a fire. Now, you may start a fire by easily flicking a lighter. That didn’t make you less intelligent than a lighter.

3

anti-torque t1_j8xj33k wrote

I think the concern is how readily it can be adapted to collate data for business. It can essentially do middle-management tasks, given controlled inputs.

I think people forget that being a manager of people is hard enough. Shedding or reducing the paperwork might give managers the time to actually interact with their teams more effectively.

3

HanaBothWays t1_j943tav wrote

> Which is fine. I merely wish to suggest to you, that if you consider ChatGPT to be intelligent, you devalue your own intelligence and your reason for having it.

Nah, this person is devaluing other human beings. There’s a sizeable contingent of people on this website (well, everywhere, but it’s a particular thing on this website) who will seize on any excuse to say most other people aren’t really people/don’t really matter.

This kind of talk about humans not really being all that different from large language models like ChatGPT is just the latest permutation of that.

3

Intensityintensifies t1_j8wuwql wrote

Nothing evolves for a reason. It’s all chance. You can evolve negative traits totally by accident.

1