Snoo58061 t1_jdjvybp wrote
Reply to comment by E_Snap in [D] "Sparks of Artificial General Intelligence: Early experiments with GPT-4" contained unredacted comments by QQII
I'm saying it's not the same kind of development, and the results are different. A human works for a long time just to grasp letters and words at all, but then extracts far more information from data sets many orders of magnitude smaller, with weaker specific recall but much faster convergence in a given domain.
To be clear I think AGI is possible and that we've made a ton of progress, but I just don't think that scale is the only missing piece here.
Snoo58061 t1_jdjdy56 wrote
Reply to comment by stimulatedecho in [D] "Sparks of Artificial General Intelligence: Early experiments with GPT-4" contained unredacted comments by QQII
I like to call this positive agnosticism. I don't know and I'm positive nobody else does either.
Though I lean towards the theory-of-mind camp. General intelligence shouldn't have to read the whole internet to be able to hold a conversation. The book in Searle's Chinese Room is getting bigger.
Snoo58061 t1_jcdtg08 wrote
Reply to comment by nopainnogain5 in [D] To those of you who quit machine learning, what do you do now? by nopainnogain5
Well, I started off doing my time in the Data Warehouse. I was hoping I could retire to the Data Lakehouse. Now it's being drained by a Data Pipeline and the rest is slowly floating off into The Cloud.
Amusingly they recently changed my team name to Data Integration and Engineering. The DIE team.
Snoo58061 t1_jcdrz12 wrote
Never quite made it to working on ML professionally. This week I'm a 'Data Engineer'.
Snoo58061 t1_jdjxmti wrote
Reply to comment by E_Snap in [D] "Sparks of Artificial General Intelligence: Early experiments with GPT-4" contained unredacted comments by QQII
The brain almost certainly doesn't use backpropagation. Liquid nets are a bit more like neurons than the current state of the art. Most of this stuff is old theory refined with more compute and data.
These systems are hardly biologically plausible. Not that biological plausibility is a requirement for general intelligence.
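For anyone wondering what "liquid nets" refers to: here's a minimal sketch of one liquid time-constant (LTC) cell update, assuming the fused-Euler formulation from Hasani et al.'s "Liquid Time-constant Networks" (2021). The parameter names and sizes are my own illustration, not anything from the paper's code.

```python
# Illustrative sketch of a liquid time-constant (LTC) cell, not a
# faithful reimplementation. Assumes the fused-Euler update from
# Hasani et al. 2021; all names and sizes here are made up.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def ltc_step(x, I, W, U, b, tau, A, dt=0.05):
    """One fused-Euler update of an LTC hidden state.

    x   : (n,) hidden state
    I   : (m,) input at this time step
    W, U, b : parameters of the gating nonlinearity f
    tau : (n,) base time constants
    A   : (n,) learned bias the state relaxes toward
    """
    # f gates the dynamics as a function of both state and input;
    # sigmoid keeps it positive so the effective time constant stays valid.
    f = sigmoid(W @ x + U @ I + b)
    # Fused Euler step for dx/dt = -(1/tau + f) * x + f * A.
    return (x + dt * f * A) / (1.0 + dt * (1.0 / tau + f))

# Tiny usage example with random parameters (illustrative only).
rng = np.random.default_rng(0)
n, m = 8, 3
W, U = rng.normal(size=(n, n)) * 0.1, rng.normal(size=(n, m)) * 0.1
b, tau, A = np.zeros(n), np.ones(n), rng.normal(size=n)
x = np.zeros(n)
for _ in range(100):
    x = ltc_step(x, rng.normal(size=m), W, U, b, tau, A)
```

The point of the design is that the effective time constant 1/(1/tau + f) changes with the input, so the cell's response speed adapts to what it sees. That's the "liquid" part, and it's closer to how neurons behave than a static weighted sum, even if it's still nowhere near biology.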