Submitted by Vegetable-Skill-9700 t3_121a8p4 in MachineLearning
Puzzleheaded_Acadia1 t1_jdn6ugl wrote
Reply to comment by soggy_mattress in [D] Do we really need 100B+ parameters in a large language model? by Vegetable-Skill-9700
I see a future where LLMs, multimodal LLaMA-style models, or some other new kind of artificial intelligence run on ESP32-level hardware. I don't know how that will work, but I'm pretty sure we're heading there.