Amazon Dedicates Team To Train Ambitious AI Model Codenamed 'Olympus'
Amazon is investing millions in training an ambitious large language model (LLM), hoping it could rival top models from OpenAI and Alphabet. From a report: The model, codenamed "Olympus," has 2 trillion parameters, people familiar with the matter said, which could make it one of the largest models in training. OpenAI's GPT-4, one of the best models available, is reported to have one trillion parameters. The team is spearheaded by Rohit Prasad, former head of Alexa, who now reports directly to CEO Andy Jassy. As head scientist of artificial general intelligence (AGI) at Amazon, Prasad brought in researchers from Alexa AI and the Amazon science team to work on training the models, uniting AI efforts across the company with dedicated resources.
Read more of this story at Slashdot.