Anachronist
Gold Member
rocknrollkieran said: "I asked Chat GPT this but couldn't get a satisfactory answer."
Why? Why do people do this?
You never get a satisfactory answer when asking for factual information that normally requires some research or study to find. ChatGPT gives you hallucinations instead.
I asked it to give examples of books published by university presses that discuss fringe views, and it gave me either books that didn't fit my criteria or books that don't exist.
I asked it for a simple bit of information: which door number is nearest to United Airlines baggage claim carousel #6 at the San Francisco airport? I can look it up on a map, but ChatGPT hemmed and hawed and gave me canned responses containing only general information, not what I specifically asked for.
I asked it to produce some OpenSCAD code that makes a specific 3D shape, and it gave me non-functional code with syntax mixed together from multiple languages.
And on and on. Whenever I ask ChatGPT something factual, I deliberately pick something I can verify myself, and the answer is almost always factually incorrect.
What ChatGPT excels at is making stuff up: writing poetry, drafting a paragraph, inventing a compelling title for an essay on some subject, composing a paragraph that conveys particular meanings, and so on. But that sort of inventiveness is exactly what you don't want when looking for scientific facts.