“Hallucinating” AI models help coin Cambridge Dictionary’s word of the year

by Benj Edwards, Ars Technica

A screenshot of the Cambridge Dictionary website where it announced its 2023 word of the year, "hallucinate." (credit: Cambridge Dictionary)

On Wednesday, Cambridge Dictionary announced that its 2023 word of the year is "hallucinate," owing to the popularity of large language models (LLMs) like ChatGPT, which sometimes produce erroneous information. The Dictionary also published an illustrated site explaining the term, saying, "When an artificial intelligence hallucinates, it produces false information."

"The Cambridge Dictionary team chose hallucinate as its Word of the Year 2023 as it recognized that the new meaning gets to the heart of why people are talking about AI," the dictionary writes. "Generative AI is a powerful tool but one we're all still learning how to interact with safely and effectively; this means being aware of both its potential strengths and its current weaknesses."

As we've previously covered in various articles, "hallucination" in relation to AI originated as a term of art in the machine-learning space. As LLMs entered mainstream use through applications like ChatGPT late last year, the term spilled over into general use and began to cause confusion among some, who saw it as unnecessary anthropomorphism. Cambridge Dictionary's first definition of "hallucinate" (for humans) is "to seem to see, hear, feel, or smell something that does not exist." It involves perception from a conscious mind, and some object to that association.

