Supercomputer to train 176-billion-parameter open-source AI language model

Source: www.theregister.com
Image caption: BigScience is a collaborative effort by developers volunteering to make ML research more accessible

GTC BigScience, a team of roughly a thousand developers around the world, has started training its 176-billion-parameter open-source AI language model in a bid to advance research into natural language processing (NLP)...
