AI chatbots can ‘hallucinate’ and make things up

When you hear the word "hallucination," you may think of hearing sounds no one else seems to hear or imagining that your coworker has suddenly grown a second head while you're talking to them. But when it comes to artificial intelligence, hallucination means something a bit different.

When an AI model "hallucinates," it generates fabricated information in response to a user's prompt but presents it as if it's factual and correct.

Say you asked an AI chatbot to write an essay on the Statue of Liberty.