Snoo58061 t1_jdjxmti wrote

The brain almost certainly doesn't use backpropagation. Liquid nets are a bit more like neurons than the current state of the art. Most of this stuff is old theory refined with more compute and data.

These systems are hardly biologically plausible. Not that biological plausibility is a requirement for general intelligence.
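For context on why liquid nets get called "more neuron-like": each unit is modeled as a continuous-time ODE whose effective time constant depends on the input, rather than a static weighted sum. Here's a rough Euler-step sketch of a liquid time-constant update; the tanh gate, toy dimensions, and parameter names are my own illustration, not anything from this thread.

```python
import numpy as np

def ltc_step(x, inputs, W_in, W_rec, bias, tau, A, dt=0.01):
    """One Euler step of a liquid time-constant (LTC) style neuron.

    The gate f depends on the current state and input, so the decay rate
    (1/tau + f) varies with the input -- the "liquid" time constant.
    """
    f = np.tanh(W_in @ inputs + W_rec @ x + bias)
    dxdt = -(1.0 / tau + f) * x + f * A  # decay toward 0, drive toward A
    return x + dt * dxdt

# Toy usage with random fixed weights and random inputs
rng = np.random.default_rng(0)
n, m = 4, 3
W_in = rng.normal(size=(n, m)) * 0.1
W_rec = rng.normal(size=(n, n)) * 0.1
bias, A = np.zeros(n), np.ones(n)
x = np.zeros(n)
for _ in range(100):
    x = ltc_step(x, rng.normal(size=m), W_in, W_rec, bias, tau=1.0, A=A)
```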

3

Snoo58061 t1_jdjvybp wrote

I'm saying it's not the same kind of development, and the results are different. A human works for a long time just to grasp letters and words at all, but then extracts far more information from data sets many orders of magnitude smaller, with weaker specific recall and much faster convergence in a given domain.

To be clear, I think AGI is possible and that we've made a ton of progress; I just don't think scale is the only missing piece here.

1

Snoo58061 t1_jdjdy56 wrote

I like to call this positive agnosticism. I don't know and I'm positive nobody else does either.

Though I lean towards the theory-of-mind camp. General intelligence shouldn't have to read the whole internet to be able to hold a conversation. The book in Searle's Chinese Room is getting bigger.

8