Submitted by flowday t3_10gxy2t in singularity
genshiryoku t1_j57j6s1 wrote
Reply to comment by Gohoyo in AGI by 2024, the hard part is now done ? by flowday
It would be lower-quality data, but still usable if significantly altered. The question is: why would you do this instead of just generating real data?
GPT is trained on human language; it needs real interactions to learn from, like the one we're having right now.
I'm also not saying that this isn't possible. We are AGI-level intelligences, and we consumed far less data over our lifetimes than GPT-3 did, so we know it's possible to reach AGI with relatively little data.
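A rough back-of-envelope comparison makes the point concrete. The ~300 billion tokens figure comes from the GPT-3 paper; the human numbers are loose, generous estimates, not measurements:

```python
# Back-of-envelope: data consumed by GPT-3 vs. one human lifetime.
# The human figures are rough assumptions chosen to be generous.
gpt3_training_tokens = 300e9      # ~300B tokens, per the GPT-3 paper
words_heard_per_day = 30_000      # generous estimate for a human
years = 30

human_words = words_heard_per_day * 365 * years
print(f"human words: {human_words:.2e}")                        # ~3.3e8
print(f"ratio: {gpt3_training_tokens / human_words:.0f}x")      # ~900x
```

Even with generous assumptions, a human reaches general intelligence on roughly three orders of magnitude less language data than GPT-3 saw.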
My original argument was merely that this is impossible with current transformer models like GPT, and that we need another breakthrough in AI architecture to solve problems like this rather than merely scaling up current transformers, because the training data is going to run out over the next couple of years as all of the internet gets used up.
Gohoyo t1_j57jyq4 wrote
> Why would you do this instead of just generating real data?
The idea would be that harnessing the AI's ability to quickly regurgitate massive amounts of old data, and then somehow transmuting it into "new data", is faster than acquiring real data.
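A minimal sketch of that loop, as I understand it: take existing data, generate altered copies, filter them, and feed them back in as "new" training text. Everything here is a hypothetical toy (`transmute`, `grow_corpus`, the synonym table), with simple word substitution standing in for an actual generative model:

```python
def transmute(sample: str, synonyms: dict) -> str:
    """Toy 'transmutation' step: rewrite a sample by swapping words
    for synonyms. A stand-in for a real paraphrasing model."""
    return " ".join(synonyms.get(w, w) for w in sample.split())

def grow_corpus(real_corpus: list, synonyms: dict, rounds: int = 2) -> list:
    """Each round, generate altered copies of the current corpus and
    keep only those that differ from what we already have
    (a crude novelty filter)."""
    corpus = list(real_corpus)
    for _ in range(rounds):
        synthetic = [transmute(s, synonyms) for s in corpus]
        corpus += [s for s in synthetic if s not in corpus]
    return corpus

seed = ["the model needs more data", "real interaction is scarce"]
syn = {"data": "examples", "scarce": "limited", "model": "network"}
print(len(grow_corpus(seed, syn)))  # prints 4
```

Notice that the corpus stops growing after one round: a deterministic transmutation quickly runs out of novel outputs. That plateau is a toy version of the worry in this thread, that recycled data can't substitute for fresh real data indefinitely.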
I mean, I believe you. I'm not in this field, nor a genius, so if the top AI people see it as a problem, I have to assume it really is; I just don't understand it fully.