Anthropic Launches Improved Version of Its Entry-Level LLM

by BeauHD from Slashdot
Anthropic, the AI startup co-founded by ex-OpenAI execs, has released an updated version of Claude Instant, its faster, cheaper text-generating model available through an API. TechCrunch reports: The updated Claude Instant, Claude Instant 1.2, incorporates the strengths of Anthropic's recently announced flagship model, Claude 2, showing "significant" gains in areas such as math, coding, reasoning and safety, according to Anthropic. In internal testing, Claude Instant 1.2 scored 58.7% on a coding benchmark, compared with 52.8% for Claude Instant 1.1, and 86.7% on a set of math questions, versus 80.9% for Claude Instant 1.1. "Claude Instant generates longer, more structured responses and follows formatting instructions better," Anthropic writes in a blog post. "Instant 1.2 also shows improvements in quote extraction, multilingual capabilities and question answering."

Claude Instant 1.2 is also less likely to hallucinate and more resistant to jailbreaking attempts, Anthropic claims. In the context of large language models like Claude, "hallucination" is when a model generates text that's incorrect or nonsensical, while jailbreaking is a technique that uses cleverly written prompts to bypass the safety features placed on large language models by their creators.

Claude Instant 1.2 also features a context window that's the same size as Claude 2's -- 100,000 tokens. The context window refers to the text the model considers before generating additional text, while tokens represent raw text (e.g. the word "fantastic" would be split into the tokens "fan," "tas" and "tic"). Claude Instant 1.2 and Claude 2 can each analyze roughly 75,000 words, about the length of "The Great Gatsby." Generally speaking, models with large context windows are less likely to "forget" the content of recent conversations.
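
For readers who want to try the updated model, the sketch below shows one way to call it from Python and to gauge whether a document fits in that 100,000-token window. It assumes the Anthropic Python SDK's text-completions interface and the model ID "claude-instant-1.2" as they existed at release; the tokens-per-word ratio is only a back-of-the-envelope estimate derived from the 75,000-word figure above, not Anthropic's actual tokenizer.

```python
# Minimal sketch of querying Claude Instant 1.2 through Anthropic's Python SDK.
# Assumes the 2023-era text-completions interface and the "claude-instant-1.2"
# model ID; parameter names and endpoints may have changed since.
from anthropic import AI_PROMPT, HUMAN_PROMPT, Anthropic

client = Anthropic()  # reads the ANTHROPIC_API_KEY environment variable

completion = client.completions.create(
    model="claude-instant-1.2",
    max_tokens_to_sample=300,
    prompt=f"{HUMAN_PROMPT} Summarize the plot of The Great Gatsby in "
           f"three sentences.{AI_PROMPT}",
)
print(completion.completion)


def fits_in_context(text: str, window_tokens: int = 100_000) -> bool:
    """Rough check against the context window.

    Uses the article's figure of ~75,000 words per 100,000 tokens,
    i.e. roughly 1.33 tokens per English word -- an estimate only.
    """
    estimated_tokens = len(text.split()) * (100_000 / 75_000)
    return estimated_tokens <= window_tokens
```

In practice, token counts vary with the text and the tokenizer, so anything close to the limit should be measured with the provider's own token-counting tools rather than a word-count heuristic.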


