Apple's AI Research Signals Ambition To Catch Up With Big Tech Rivals
Apple's latest research on running large language models on smartphones offers the clearest signal yet that the iPhone maker plans to catch up with its Silicon Valley rivals in generative artificial intelligence. From a report: The paper, entitled "LLM in a Flash," offers a "solution to a current computational bottleneck," its researchers write. Its approach "paves the way for effective inference of LLMs on devices with limited memory," they said. Inference refers to how large language models, the AI systems that power apps like ChatGPT, generate responses to users' queries. Chatbots and LLMs normally run in vast data centres with much greater computing power than an iPhone. The paper was published on December 12 but caught wider attention after Hugging Face, a popular site for AI researchers to showcase their work, highlighted it late on Wednesday. It is the second Apple paper on generative AI this month and follows earlier moves to enable image-generating models such as Stable Diffusion to run on its custom chips. Device manufacturers and chipmakers are hoping that new AI features will help revive the smartphone market, which has had its worst year in a decade, with shipments falling an estimated 5 per cent, according to Counterpoint Research.
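To give a rough sense of the idea behind the paper, the sketch below is a hypothetical illustration, not Apple's actual method: it keeps a large weight matrix in storage (standing in for flash), memory-maps it, and reads only the rows needed for a predicted set of active neurons, so RAM usage scales with the active set rather than the full model. All names and the sparsity scheme here are assumptions for illustration.

```python
import os
import tempfile
import numpy as np

# Illustrative sketch only: weights live on disk (our stand-in for
# flash), and we pull just the rows we need into RAM on demand.

def save_weights(path, rows=1024, cols=256, seed=0):
    """Write a random float32 weight matrix to disk; return its shape."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal((rows, cols)).astype(np.float32)
    w.tofile(path)
    return w.shape

def sparse_layer_from_flash(path, shape, x, active_rows):
    """Compute x @ W using only the rows listed in active_rows."""
    # Memory-map the file: no data is read until it is indexed.
    w = np.memmap(path, dtype=np.float32, mode="r", shape=shape)
    # Load only the "active" rows -- a small read from storage --
    # instead of materializing the whole matrix in RAM.
    w_active = np.asarray(w[active_rows])
    return x[active_rows] @ w_active

path = os.path.join(tempfile.mkdtemp(), "weights.bin")
shape = save_weights(path)
x = np.ones(shape[0], dtype=np.float32)
active = np.arange(0, shape[0], 4)  # pretend 25% of neurons fire
y = sparse_layer_from_flash(path, shape, x, active)
```

The payoff is that peak memory is proportional to the active rows (here a quarter of the matrix), which is the general shape of the trade the paper describes: trade cheap storage reads for scarce DRAM.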
Read more of this story at Slashdot.