Metacognitor

Metacognitor t1_j94ois4 wrote

Reply to comment by KPTN25 in [D] Please stop by [deleted]

That's a fair enough point, and I can see where you're coming from. My perspective, though, is that as models become increasingly large, to the point of being almost entirely a "black box" from a dev perspective, something resembling sentience could emerge spontaneously as a function of some type of self-referential or evaluative model within the primary one. It would obviously be a more limited form of sentience (not human-level), but perhaps.

0

Metacognitor t1_j941yl1 wrote

Reply to comment by KPTN25 in [D] Please stop by [deleted]

My question was more rhetorical, as in: what would be capable of producing sentience? Because I don't believe anyone actually knows, which makes any definitive statements of that nature (like yours above) come across as presumptuous. Just my opinion.

1