
cas13f t1_jdcclbf wrote

It's a language model. It's not meant to be a source of truth, or even to be right about whatever it is prompted to write about. It's meant to generate text blurbs in response to a prompt using word association and snippets from its ma-hoo-sive training set.

That's why it just makes up citations on many occasions: all the model cares about is that a citation is formatted a certain way, and that its contents are associated with the prompt.
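To make the "word association" point concrete, here's a toy bigram Markov chain (a deliberately tiny illustration, not how a real LLM works internally, but the same idea scaled way down): it learns which words tend to follow which, then generates fluent-looking text with zero notion of truth.

```python
import random

# A tiny "training set". The corpus and its contents are made up for illustration.
corpus = (
    "the model generates text from word association . "
    "the model does not check facts . "
    "the citation looks real but the citation is invented ."
).split()

# Build a table: word -> list of words that followed it in the corpus.
followers = {}
for a, b in zip(corpus, corpus[1:]):
    followers.setdefault(a, []).append(b)

# Generate by repeatedly sampling a plausible next word.
random.seed(0)
word, out = "the", ["the"]
for _ in range(12):
    word = random.choice(followers.get(word, ["."]))
    out.append(word)

# The output is grammatical-ish and locally plausible, but the chain has no
# idea whether any of it is true -- only which words co-occur.
print(" ".join(out))
```

Every word in the output is a word the chain has seen follow the previous one; "truth" never enters the picture at any step.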

That's also why it can't do math. It's a language model.

What people need to do is stop fucking using it as Google. It is not a search engine, and it does not give a single fuck about the veracity of the generated text, only that the text looks the way it was trained to look.
