Olaf Schlüter
1 min read · Apr 29, 2023

Are you a physicist? Ever heard of "twilight-zone gravity" or a paper called "Gravity in the twilight-zone"? Both are things ChatGPT recently hallucinated out of the blue when asked about the latest paper of a certain physicist, complete with a coauthor she has never published anything with. It wasn't even able to place her at the right university (a fate she shared with some other colleagues ChatGPT mentioned in that dialogue).

Did you know that Safari needs special configuration to handle sites offering Kerberos authentication? (It does not.) And when you tell ChatGPT that the suggested configuration option simply isn't there, it invents a new method, and you can repeat that cycle as long as you like (I tried four times).

ChatGPT seems to be good at understanding questions, but the answers are crap. Given that ChatGPT is basically a text transformer, the impression that it understands you is an illusion. To reproduce the typical language patterns of an answer from the patterns of the question, no understanding is necessary, just a really big neural network.
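For illustration only, here is a minimal sketch of that mechanism: an autoregressive language model is trained to predict a plausible next token given the tokens so far, nothing more. The snippet assumes Python with the Hugging Face transformers package (and PyTorch) and the small public GPT-2 checkpoint, not ChatGPT itself, and the physicist's name in the prompt is made up. It produces a fluent continuation that is in no way fact-checked.

```python
# Sketch: an autoregressive language model only picks statistically likely
# next tokens; nothing in this loop checks whether the output is true.
# Assumes: pip install transformers torch. GPT-2 stands in for ChatGPT here.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Hypothetical prompt with a made-up physicist's name.
prompt = "The last paper of physicist Jane Doe was titled"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a continuation token by token; the model generates plausible words,
# it does not look anything up.
output = model.generate(
    **inputs,
    max_new_tokens=20,
    do_sample=True,
    top_p=0.9,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The model in this sketch is tiny compared to ChatGPT, but the underlying idea is the same: generate likely words, not verified facts.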



Written by Olaf Schlüter

IT security specialist, physicist by education, believing in God because of the exceptional harmony of the laws of nature in creating and supporting life.

Responses (1)


Thanks for more counterexamples.
But in my experience, this is the 1%.

My research via GPT, which I verify, I find to be around 99% accurate.

I know this is on OpenAI’s radar. They’re measuring hallucination and reduced it by 40% with GPT-4…
