Eh... What is a more or less normal question for you? What color is the sky?
It's easy not to be wrong when small talking...
Instead, just try to ask some very detailed questions... Politely, rationally, with no intention of driving it off track... just ask for the most specific details you may be legitimately interested in... Then, after each answer, ask it to double-check them...
You will be surprised to see how many times it will try to auto-correct the wrong information it just provided... with an even more nested error...
Are you asking specific questions, or just asking it to generate fluff and "small talk"? In my experience, it gets a lot of things wrong when you ask specific stuff.
For example: some time ago Twitch had a problem with the website and app where the chat of all streams stopped working if you refreshed the page. I went to ChatGPT and asked it to give me a script, in Python, to send comments to any chat using the Twitch API.
It gave me a normal-looking script that seemed mostly alright (I wish I could post it here, but sadly ChatGPT is unavailable right now). There was only one problem: it used a package that didn't exist, which basically makes the entire answer useless. That's probably because there are a multitude of tutorials that use packages to do similar things, and those were likely part of the training data. Since ChatGPT doesn't really know anything, it generated similar-looking fluff with no real substance.
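For what it's worth, you don't actually need any third-party package for this: Twitch chat is plain IRC, so the standard library is enough. A minimal sketch (the `TOKEN`, nick, and channel values are placeholders, not real credentials; the server/port are Twitch's documented IRC gateway):

```python
# Minimal sketch: send a Twitch chat message over the IRC gateway
# using only the Python standard library. TOKEN, NICK, and CHANNEL
# below are placeholders that must be replaced with real values.
import socket

SERVER = "irc.chat.twitch.tv"  # Twitch's IRC chat endpoint
PORT = 6667                    # plaintext port (6697 is the TLS port)

def build_login(token: str, nick: str) -> bytes:
    # Twitch expects a PASS line carrying an OAuth token prefixed
    # with "oauth:", followed by a NICK line.
    return f"PASS oauth:{token}\r\nNICK {nick}\r\n".encode()

def build_message(channel: str, text: str) -> bytes:
    # Chat messages are ordinary IRC PRIVMSGs to the "#channel" target.
    return f"PRIVMSG #{channel} :{text}\r\n".encode()

def send_chat(token: str, nick: str, channel: str, text: str) -> None:
    # Open a socket, authenticate, send one message, and disconnect.
    with socket.create_connection((SERVER, PORT)) as sock:
        sock.sendall(build_login(token, nick))
        sock.sendall(build_message(channel, text))
```

Call it like `send_chat("my_oauth_token", "mybot", "somechannel", "hello chat")`. No idea what package ChatGPT invented, but this is the kind of thing those tutorials wrap.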
I've had similar experiences when asking it to refactor code and to simplify equations.
am I the only one that's extremely annoyed by everyone saying 'chatGTP/chatbotTGP/conversationGDP/cahTpG/etcederatah'?
seriously this is like the 5000th time I've seen this, WTF