Submitted by [deleted] t3_zzhy3k in singularity
[removed]
Hey hey still 1 day left in December
[deleted]
Possibly, but I'm not aware of such rumors.
Other AIs just in general, or are there other specific yet-to-be-released projects we should have our eyes on?
Source for these rumors?
There will be no GPT-4. Microsoft did not pay OpenAI one billion dollars without a reason. The next GPT will be called GPT 2000™.
GPT Vista
GPT-10 would be the last version
GPT ME.
Once again, the Giga-Parameter Terminator says: "Hasta la vista, baby!"
And I thought that "The machines are gonna kill us all! We must stop machine learning research!" had been replaced by "The machines will become conscious and suffer! We must stop machine learning research!"
Let me lay out my GPT-4 speculations:
If they go to 1T parameters, the model would be hard to use; even a demo might be impractical. I think they would prefer to keep it at the same size. In fact, it would be desirable to have a 10-30B model as good as GPT-3, to cut deployment costs. It's bloody expensive.
Most of the good training data has already been scraped, but maybe there is still some left to surprise us. Maybe they transcribed all of YouTube to generate a massive text dataset.
This is feasible: recent papers showed how you can bootstrap task-plus-solution data by clever prompting. Such self-generated task data is more diverse than human-generated data.
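A minimal sketch of that bootstrapping loop, in the spirit of self-instruct-style methods: prompt a model with a few seed tasks, then keep only candidates that are sufficiently novel. Here `fake_model` is a hypothetical stub standing in for a real LLM call, and the word-overlap filter is a crude stand-in for the dedup step such papers use.

```python
import random

def fake_model(prompt_tasks):
    # A real system would sample a new instruction from an LLM,
    # conditioned on the seed tasks. Stubbed for illustration.
    return "Task: summarize the plot of a novel in one sentence"

def word_overlap(a: str, b: str) -> float:
    """Jaccard overlap on word sets, as a cheap novelty measure."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / max(len(wa | wb), 1)

def grow_pool(pool, rounds=5, max_overlap=0.7):
    """Prompt with sampled seeds; keep only sufficiently novel tasks."""
    for _ in range(rounds):
        seeds = random.sample(pool, k=min(3, len(pool)))
        candidate = fake_model(seeds)
        if all(word_overlap(candidate, t) < max_overlap for t in pool):
            pool.append(candidate)
    return pool

seed_pool = [
    "Task: translate a sentence into French",
    "Task: write a haiku about winter",
]
print(len(grow_pool(seed_pool)))  # pool grows with novel tasks only
```

Because the stub always returns the same candidate, the pool grows by exactly one; with a real model, each round can contribute a genuinely new task.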
Maybe they are solving millions of coding and math problems, where it is possible to filter out garbage outputs by exact verification or code execution. This could bootstrap a model to surpass human level, because it is learning not from us but from execution feedback.
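The filtering step described above can be sketched as follows. This is a toy illustration, not anyone's actual pipeline: candidate solutions (which a real system would sample from a code model) are executed against a test, and only the ones that pass survive.

```python
def run_candidate(code: str, test: str) -> bool:
    """Execute a candidate solution plus its test; True if it passes."""
    namespace = {}
    try:
        exec(code, namespace)  # define the candidate function
        exec(test, namespace)  # assertions raise on failure
        return True
    except Exception:
        return False

def filter_by_execution(candidates, test):
    """Keep only candidates whose code actually passes the test."""
    return [c for c in candidates if run_candidate(c, test)]

# Toy example: two candidate solutions for "add two numbers".
candidates = [
    "def add(a, b):\n    return a - b",  # buggy sample
    "def add(a, b):\n    return a + b",  # correct sample
]
test = "assert add(2, 3) == 5"
good = filter_by_execution(candidates, test)
print(len(good))  # only the correct candidate survives
```

The key property is exactly what the comment points out: the filter is grounded in execution, not in human judgment, so correct-but-unusual solutions also survive.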
Probably not; if they had that, they would have used it in ChatGPT.
This could be the biggest change. It would open usage of language models in robotics and UI automation, with huge implications for the job market. No longer will these models be limited to a text box. But it is hard to do efficiently.
Burning all that trivia into the weights of a model is inefficient. Instead, why not use a search engine to tell us the height of Everest? A search engine could be a great addition to a language model, along with a calculator and even code execution. Armed with these "toys", a language model could check factuality and ensure correct computations.
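A minimal sketch of that tool-dispatch idea, assuming (hypothetically) that the model emits outputs like `CALC: <expr>` or `SEARCH: <query>` when it wants a tool; the tools here are stubs, and a real search backend would replace the canned lookup table.

```python
def calculator(expression: str) -> str:
    # Restricted eval as a stand-in for a real math tool.
    return str(eval(expression, {"__builtins__": {}}, {}))

def search(query: str) -> str:
    # Canned answer standing in for a search-engine call.
    return {"height of Everest": "8,849 m"}.get(query, "no result")

TOOLS = {"CALC": calculator, "SEARCH": search}

def answer(model_output: str) -> str:
    """If the model emits 'TOOL: arg', run the tool; else pass through."""
    tool, _, arg = model_output.partition(":")
    if tool in TOOLS:
        return TOOLS[tool](arg.strip())
    return model_output

print(answer("CALC: 1234 * 5678"))          # exact arithmetic
print(answer("SEARCH: height of Everest"))  # grounded factual lookup
```

The point of the design is that arithmetic and facts come from the tool, not from weights, so they can be verified and updated without retraining.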
As for the date? Probably not in the next 2-3 months, as they just released ChatGPT to great acclaim and have to milk the moment for all the PR it's worth. It sounds like the rumours about GPT-4 are pretty bullish; I hope they're true.
What will gpt4 do that's so impressive?
wack everybody's dick off in fdvr, obviously
Holy shit I think gpt4 will drastically reduce mean jerk time and maximize DAAT (dicks at a time) - how didn't these guys see that an LLM was the solution to all their woes https://youtu.be/P-hUV9yhqgY
There is a study that looked at how you can teach a neural network to give the best head by analyzing thousands of blowjob videos.
Turn water into wine
& wine into beer, beer into domestic assault & domestic assault to arrest.
Rumoured to be Q1 2023. Not sure whether that will be a public or a closed beta test. It is not a given that it will immediately be part of ChatGPT.
I wonder if Microsoft planned this from the start? It's so perfect: GitHub, VS Code, Codex, Copilot.
i thought g3 didnt even come out
You're way behind. The G3 Free Trade Agreement came out in 1995.
Think op was referring to the Gulfstream G300 private jet which came out in 2002.
ChatGPT is based on GPT-3.5.
[deleted]
Sashinii t1_j2bp5mo wrote
GPT-4 is rumored to release in December (nope), January, or February. We don't know for sure, but it seems likely it'll release in 2023, and if it doesn't, there are other AIs to look forward to.