Grok AI Goes Open Source
xAI has open sourced its large language model Grok. From a report: The move, which Musk had previously proclaimed would happen this week, now enables any entrepreneur, programmer, company, or individual to take Grok's weights -- the strengths of connections between the model's artificial "neurons," the software modules that allow the model to accept inputs, make decisions, and provide outputs in the form of text -- along with other associated documentation, and use a copy of the model for whatever they'd like, including commercial applications. "We are releasing the base model weights and network architecture of Grok-1, our large language model," the company announced in a blog post. "Grok-1 is a 314 billion parameter Mixture-of-Experts model trained from scratch by xAI." Those interested can download the code for Grok from its GitHub page or via a torrent link. "Parameters" refers to the weights and biases that govern the model -- generally, the more parameters, the more advanced, complex, and performant the model. At 314 billion parameters, Grok is well ahead of open source competitors such as Meta's Llama 2 (70 billion parameters) and Mistral's Mixtral 8x7B (46.7 billion parameters in total, with roughly 12.9 billion active per token). Grok was open sourced under the Apache License 2.0, which permits commercial use, modification, and distribution, though it grants no trademark rights and provides no warranty or liability protection. In addition, users must reproduce the original license and copyright notice and state the changes they've made.
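To make the Mixture-of-Experts idea concrete, here is a minimal sketch of MoE routing in plain Python. The expert count, router weights, and top-2 routing below are illustrative assumptions for a toy example, not Grok-1's actual configuration; the point is that only the few experts the router selects actually run for a given input, which is how a 314-billion-parameter model avoids using all of its parameters on every token.

```python
# Toy Mixture-of-Experts (MoE) routing sketch. All sizes and the top-2
# routing choice are illustrative assumptions, not Grok-1's real config.
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_forward(x, experts, router_weights, top_k=2):
    """Route input x to the top_k highest-scoring experts and mix their
    outputs, weighted by the renormalized router probabilities."""
    # Router score per expert: a simple dot product with the input.
    scores = [sum(wi * xi for wi, xi in zip(w, x)) for w in router_weights]
    probs = softmax(scores)
    # Keep only the top_k experts; the rest are skipped entirely, so only
    # a fraction of the model's parameters run for this input.
    chosen = sorted(range(len(experts)),
                    key=lambda i: probs[i], reverse=True)[:top_k]
    norm = sum(probs[i] for i in chosen)
    out = [0.0] * len(x)
    for i in chosen:
        y = experts[i](x)          # run only the selected experts
        w = probs[i] / norm        # renormalize over the chosen subset
        out = [o + w * yi for o, yi in zip(out, y)]
    return out, chosen

# Demo: four "experts," each just scaling the input by a different factor.
experts = [lambda x, s=s: [s * xi for xi in x] for s in (1.0, 2.0, 3.0, 4.0)]
router_weights = [[0.1, 0.0], [0.0, 0.1], [0.2, 0.2], [-0.1, 0.3]]
out, chosen = moe_forward([1.0, 2.0], experts, router_weights)
print(chosen)  # indices of the two experts that actually ran
```

Real MoE layers do this routing per token inside each transformer block, with learned router weights and full feed-forward networks as experts, but the select-then-mix structure is the same.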
Read more of this story at Slashdot.