TemetN t1_j0vezph wrote
I agree with the starting premise, but the implicit assumption that it'll be able to rapidly and recursively self-improve is dubious in my view. An intelligence explosion honestly seems like the least likely way to reach the singularity.
That said, yes, people are getting wild about what AGI is and will mean, when in practice both the more operationalized definitions and some of the broader ones will most likely be met within a year or two.
Ace_Snowlight OP t1_j0vgwrc wrote
👍✨