I’ll start with the simple fact: ChatGPT is not a reliable source of answers to questions.
Explaining why from scratch would be a heavy lift, but fortunately, Stephen Wolfram has already done the heavy lifting for us in his article, “What is ChatGPT Doing… and Why Does It Work?” [1] In a PF thread discussing that article, I tried to summarize its key message as briefly as I could. Here is what I said in my post there [2]:
ChatGPT does not make use of the meanings of words at all. All it is doing is generating text word by word based on relative word frequencies in its training data. It is using correlations between words, but that is not the same as correlations in the underlying information that the words represent (much less causation). ChatGPT literally has no idea that the words it strings together represent anything.
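To make the “generating text word by word based on relative word frequencies” idea concrete, here is a minimal sketch in Python. It is deliberately a toy: a bigram table built from a few sentences, not the transformer network ChatGPT actually uses, and the `training_text`, `next_word`, and `generate` names are mine for illustration, not anything from Wolfram’s article. The mechanism it shows, though, is the one described above: pick each next word in proportion to how often it followed the previous word in the training data, with no reference to meaning.

```python
import random
from collections import Counter, defaultdict

# Toy illustration only (my example, not Wolfram's): ChatGPT is a large
# transformer network, not a bigram table, but the core move -- choose the
# next word from relative frequencies observed in training text -- can be
# shown at bigram scale.

training_text = (
    "the cat sat on the mat the cat ate the fish "
    "the dog sat on the rug the dog ate the bone"
)

# Count how often each word follows each other word in the training text.
follows = defaultdict(Counter)
words = training_text.split()
for current, nxt in zip(words, words[1:]):
    follows[current][nxt] += 1

def next_word(current: str) -> str:
    """Sample the next word in proportion to how often it followed
    `current` in the training text. Meaning is never consulted."""
    choices, counts = zip(*follows[current].items())
    return random.choices(choices, weights=counts, k=1)[0]

def generate(start: str, length: int = 8) -> str:
    out = [start]
    for _ in range(length):
        if not follows[out[-1]]:  # dead end: word never seen mid-text
            break
        out.append(next_word(out[-1]))
    return " ".join(out)

print(generate("the"))
```

Run a few times, `generate("the")` produces fluent-looking but meaning-free strings such as “the cat sat on the rug the dog ate”. That is the point: nothing in the procedure ever consults what any word refers to, only how often words co-occur.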
In other words, ChatGPT is not designed to answer questions or provide information. It is explicitly designed not to do those things, because...