
StarCoder 2 Is a Code-Generating AI That Runs On Most GPUs

by
BeauHD
from Slashdot on (#6JZJF)
An anonymous reader quotes a report from TechCrunch: Perceiving the demand for alternatives, AI startup Hugging Face teamed up several years ago with ServiceNow, the workflow automation platform, to create StarCoder, an open source code generator with a less restrictive license than some of the others out there. The original came online early last year, and work has been underway on a follow-up, StarCoder 2, ever since.

StarCoder 2 isn't a single code-generating model, but rather a family. Released today, it comes in three variants, the first two of which can run on most modern consumer GPUs:

- A 3-billion-parameter (3B) model trained by ServiceNow;
- A 7-billion-parameter (7B) model trained by Hugging Face; and
- A 15-billion-parameter (15B) model trained by Nvidia, the newest supporter of the StarCoder project.

(Note that "parameters" are the parts of a model learned from training data and essentially define the skill of the model on a problem, in this case generating code.)

Like most other code generators, StarCoder 2 can suggest ways to complete unfinished lines of code, as well as summarize and retrieve snippets of code when asked in natural language. Trained with 4x more data than the original StarCoder (67.5 terabytes versus 6.4 terabytes), StarCoder 2 delivers what Hugging Face, ServiceNow and Nvidia characterize as "significantly" improved performance at lower operating costs. StarCoder 2 can be fine-tuned "in a few hours" on a GPU like the Nvidia A100 with first- or third-party data to create apps such as chatbots and personal coding assistants. And because it was trained on a larger and more diverse data set than the original StarCoder (~619 programming languages), StarCoder 2 can make more accurate, context-aware predictions -- at least hypothetically.

[I]s StarCoder 2 really superior to the other code generators out there -- free or paid? Depending on the benchmark, it appears to be more efficient than one version of Code Llama, Code Llama 33B.
Hugging Face says that StarCoder 2 15B matches Code Llama 33B on a subset of code completion tasks at twice the speed. It's not clear which tasks; Hugging Face didn't specify. StarCoder 2, as an open source collection of models, also has the advantage of being deployable locally and "learning" a developer's source code or codebase -- an attractive prospect to devs and companies wary of exposing code to a cloud-hosted AI.

Hugging Face, ServiceNow and Nvidia also make the case that StarCoder 2 is more ethical -- and less legally fraught -- than its rivals. [...] As opposed to code generators trained using copyrighted code (GitHub Copilot, among others), StarCoder 2 was trained only on data under license from Software Heritage, the nonprofit organization that provides archival services for code. Ahead of StarCoder 2's training, BigCode, the cross-organizational team behind much of StarCoder 2's roadmap, gave code owners a chance to opt out of the training set if they wanted. As with the original StarCoder, StarCoder 2's training data is available for developers to fork, reproduce or audit as they please.

StarCoder 2's license may still be a roadblock for some. "StarCoder 2 is licensed under the BigCode Open RAIL-M 1.0, which aims to promote responsible use by imposing 'light touch' restrictions on both model licensees and downstream users," writes TechCrunch's Kyle Wiggers. "While less constraining than many other licenses, RAIL-M isn't truly 'open' in the sense that it doesn't permit developers to use StarCoder 2 for every conceivable application (medical advice-giving apps are strictly off limits, for example). Some commentators say RAIL-M's requirements may be too vague to comply with in any case -- and that RAIL-M could conflict with AI-related regulations like the EU AI Act."
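For readers curious what local use looks like in practice, the StarCoder 2 checkpoints are published on the Hugging Face Hub under the bigcode organization (e.g. bigcode/starcoder2-3b). The sketch below assembles a fill-in-the-middle (FIM) prompt in the StarCoder-style format; the `<fim_*>` special tokens and the exact prompt layout are assumptions based on the original StarCoder and should be verified against the tokenizer config shipped with the checkpoint. The model call itself is shown only as a comment, since it requires downloading the weights.

```python
# Minimal sketch of StarCoder-style fill-in-the-middle prompting.
# The <fim_*> special tokens are an assumption carried over from the
# original StarCoder tokenizer; confirm them in the tokenizer config
# of the bigcode/starcoder2-* checkpoint you use.

def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Ask the model to generate the code between `prefix` and `suffix`."""
    return f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"

prompt = build_fim_prompt(
    prefix="def average(xs):\n    return ",
    suffix="\n",
)
print(prompt)

# With the transformers library installed, the prompt would be fed to a
# locally downloaded checkpoint roughly like this (not run here):
#
#   from transformers import AutoTokenizer, AutoModelForCausalLM
#   tok = AutoTokenizer.from_pretrained("bigcode/starcoder2-3b")
#   model = AutoModelForCausalLM.from_pretrained("bigcode/starcoder2-3b")
#   out = model.generate(**tok(prompt, return_tensors="pt"), max_new_tokens=32)
#   print(tok.decode(out[0]))
```

Because the prompt is plain text, the same helper works with any serving stack (transformers, vLLM, llama.cpp) that exposes the model's special tokens.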


Read more of this story at Slashdot.

External Content
Source RSS or Atom Feed
Feed Location https://rss.slashdot.org/Slashdot/slashdotMain
Feed Title Slashdot
Feed Link https://slashdot.org/
Feed Copyright Copyright Slashdot Media. All Rights Reserved.