Supercomputer to train 176-billion-parameter open-source AI language model

from The Register (#5XG5G)
BigScience is a collaborative effort by developers volunteering to make ML research more accessible

GTC BigScience - a team of roughly a thousand developers around the world - has started training its 176-billion-parameter open-source AI language model in a bid to advance research into natural language processing (NLP). …
