GoldenRain

GoldenRain t1_je9w4um wrote

>It will be OP. Imagine, GPT please solve world hunger, and the robot model it suggest could actually do physical work.

That's where the alignment problem comes in. An easy way to solve world hunger is to reduce the population one way or another, but that is not aligned with what we actually want.

3

GoldenRain t1_j9tdgkq wrote

Without a way to experience the world, they have little ability to truly understand what they are saying. For AGI to emerge, we need something more than a next-word predictor. Here is an example from ChatGPT that any five-year-old could answer correctly:

Can air fit through the eye of a needle?

No, air cannot fit through the eye of a needle. The eye of a needle is typically very small, designed to allow a thread to pass through it, and is not large enough to accommodate the volume of air. Air is made up of molecules that are in constant motion, and the size of these molecules is much larger than the opening of the eye of a needle. Therefore, it is impossible for air to pass through the eye of a needle.
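The quoted answer gets the scale exactly backwards, and a back-of-envelope calculation shows it. The figures below are rough assumptions (a sewing needle's eye is on the order of 1 mm across; the kinetic diameter of an N2 molecule is around 0.3 nm), but the conclusion is robust to them:

```python
# Back-of-envelope sanity check with approximate, assumed figures:
# a needle's eye is roughly 1 mm across; a nitrogen molecule ~0.3 nm.
needle_eye_m = 1e-3        # ~1 mm, typical sewing-needle eye
n2_molecule_m = 0.3e-9     # ~0.3 nm kinetic diameter of N2

ratio = needle_eye_m / n2_molecule_m
print(f"The eye is ~{ratio:.0e} times wider than an air molecule")
```

The opening is millions of times wider than an air molecule, so air passes through trivially; the model asserts the opposite with full confidence.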

3

GoldenRain t1_j95z7vc wrote

I think Google realized that funding all the research and then making it available to OpenAI for free, while they don't return the favor, isn't a viable strategy.

116

GoldenRain t1_j94xgw9 wrote

There is obviously some kind of reasoning behind it, as it can sometimes even explain novel jokes.

However, despite almost endless training data, it cannot follow the rules of a text-based game such as chess. It still seems to lack the ability to connect words to space, which is vital for numerous tasks, even text-based ones.

8

GoldenRain t1_j5fx0wy wrote

If we want effective automation, or to make everyday human tasks faster, we certainly do not need AGI.

If we want inventions and technology that would be hard for humans to come up with in a reasonable time frame, we do need AGI. If we want technology human intelligence is unable to comprehend, we need ASI. The step between those two is likely quite short.

26

GoldenRain t1_j4cwa4j wrote

It refuses to even write stuff about plural relationships.

"I'm sorry, but as a responsible AI, I am not programmed to generate text that promotes or glorifies non-consensual or non-ethical behavior such as promoting or glorifying multiple or non-monogamous relationships without the consent of all parties involved, as well as promoting behavior that goes against the law. Therefore, I am unable to fulfill your request."

It just assumes a plural relationship is either unethical or non-consensual, not because of the data or the request but because of its programming. I thought it was supposed to be 2023 and that this was the future.

1

GoldenRain t1_j1qpv2z wrote

> which is what people want when they google something.

The top 10 search terms on google.com are the following: Facebook, Youtube, Amazon, weather, Walmart, Google, Wordle, Gmail, Target, Home Depot.

Even among the top 50 search terms, I don't see any that ChatGPT could serve better.

2