wordholes t1_jed6wd9 wrote
Oh my god they're using approximate data from a probabilistic model to train another even more approximate probabilistic model.
What level of generational loss is this??
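The "generational loss" worry can be sketched with a toy experiment: repeatedly fit a distribution to samples drawn from the previous fit, and rare outcomes tend to disappear. This is a minimal illustrative sketch (stdlib only), not anyone's actual training pipeline:

```python
import random
from collections import Counter

def resample_distribution(dist, n_samples, rng):
    # Draw samples from the current "model", then refit it by counting.
    tokens, probs = zip(*dist.items())
    samples = rng.choices(tokens, weights=probs, k=n_samples)
    counts = Counter(samples)
    return {t: counts[t] / n_samples for t in counts}

rng = random.Random(0)
# Toy "language": a few common tokens plus many rare ones (probs sum to 1).
dist = {f"t{i}": p for i, p in enumerate([0.4, 0.3, 0.1] + [0.02] * 10)}

for generation in range(10):
    dist = resample_distribution(dist, 100, rng)

# Once a rare token fails to appear in one generation's samples, it is
# gone from every later generation, so the support typically shrinks.
print(len(dist))
```

Each generation can only keep tokens it actually sampled, so the tails of the distribution erode, which is one concrete sense in which training a model on another model's outputs loses information.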
kaenneth t1_jed7exg wrote
wordholes t1_jed7nco wrote
The future of AI: https://www.youtube.com/watch?v=QEzhxP-pdos
z57 t1_jedfhgf wrote
Wasn't Stanford's Alpaca trained using GPT?
Yes, I think it was: researchers trained a language model from Meta on text generated by OpenAI's GPT-3.5 for less than $600.
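The Alpaca-style recipe is roughly: take seed instructions, have the teacher model (GPT-3.5) generate responses, and use those pairs as supervised fine-tuning data for the smaller student model. A minimal sketch, where `query_teacher` is a hypothetical stub standing in for the real teacher API call:

```python
# Sketch of teacher-generated fine-tuning data, NOT the actual Alpaca code.

def query_teacher(instruction: str) -> str:
    # Hypothetical stand-in; the real pipeline queried GPT-3.5 via API.
    return f"Teacher-generated answer for: {instruction}"

def build_finetune_dataset(seed_instructions):
    # Each (instruction, teacher output) pair becomes one training example.
    return [
        {"instruction": inst, "output": query_teacher(inst)}
        for inst in seed_instructions
    ]

seeds = ["Summarize photosynthesis.", "Translate 'hello' to French."]
dataset = build_finetune_dataset(seeds)
# `dataset` would then be used to fine-tune the student model.
```

The cheapness comes from the fact that generating the dataset costs only teacher API calls, while fine-tuning the smaller student is far cheaper than training it from scratch.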
Orqee t1_jedubqo wrote
It’s called meta-probabilistic recursion. Because I just named it.