ReadSeparate t1_iyo883j wrote
Reply to comment by EntireContext in Have you updated your timelines following ChatGPT? by EntireContext
I do agree with this comment. It's feasible that long-term memory isn't required for AGI (though I think it probably is), or that hacks like reading from and writing to a database could simulate long-term memory.
I think it may take longer than 2025 to replace transformers, though. They've been around since 2017 and we haven't seen any really promising replacement candidates yet.
I can definitely see a scenario where GPT-5 or 6 has prompts built into its training data that are designed to teach it to utilize database reads/writes.
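Purely as an illustration — the `<db_write>`/`<db_read>` token names and the whole format here are made up by me, not anything from an actual model — training examples teaching that behavior might look something like:

```python
# Hypothetical training examples teaching a model to emit special
# database tokens. The <db_write>/<db_read> syntax is invented for
# illustration; nothing like this is confirmed for any GPT model.
training_examples = [
    {
        "prompt": "User: Hi, my name is Alex.",
        "completion": '<db_write key="user_name" value="Alex"/> '
                      "Assistant: Nice to meet you, Alex!",
    },
    {
        # Months later: the model learns to fetch rather than guess.
        "prompt": "User: Hey, I'm back!",
        "completion": 'Assistant: Welcome back, <db_read key="user_name"/>!',
    },
]
```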
Imagine it greets you by name after seeing your name only once, six months ago. It could have a database-read token with sub-input tokens that fetch your name from a database based on some sort of identifier.
It could probably get really good at doing this too if it’s actually in the training data.
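And the runtime side wouldn't need to be fancy. A minimal sketch, assuming a wrapper intercepts those invented tokens before the user sees the output (the regexes and the toy key-value store are my own assumptions, not a real API):

```python
import re

# Toy long-term memory store; in practice this could be any database.
memory_db = {"user_name": "Alex"}

READ_TOKEN = re.compile(r'<db_read key="([^"]+)"/>')
WRITE_TOKEN = re.compile(r'<db_write key="([^"]+)" value="([^"]+)"/>')

def resolve_memory_tokens(model_output: str) -> str:
    """Intercept the model's special tokens: persist writes to the
    store, then substitute reads with the values fetched from it."""
    for key, value in WRITE_TOKEN.findall(model_output):
        memory_db[key] = value
    model_output = WRITE_TOKEN.sub("", model_output)

    def fetch(match: re.Match) -> str:
        return memory_db.get(match.group(1), "")

    return READ_TOKEN.sub(fetch, model_output).strip()

# The model emits a read token instead of guessing the name:
raw = 'Welcome back, <db_read key="user_name"/>!'
print(resolve_memory_tokens(raw))  # -> Welcome back, Alex!
```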
Eventually, I could see the model using its coding knowledge to design the database/prompting system on its own.