
$120 Raspberry Pi 5 Can Run 14-Billion-Parameter LLM Models … Slowly

by
Brian Wang
from NextBigFuture.com on (#6TTX6)
It is possible to load and run 14-billion-parameter LLM AI models on a Raspberry Pi 5 with 16 GB of memory ($120). However, they are slow, generating about 0.6 tokens per second. A 13-billion-parameter model runs at 1.36 tokens per second. Updated firmware with better SDRAM timings improved these results.
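The 16 GB of RAM is the key enabler here. A rough sketch of the memory arithmetic, assuming the model weights are quantized to around 4 bits per parameter (a common way to fit large models on small devices; the article does not state which quantization was used):

```python
# Back-of-envelope estimate (not from the article): approximate RAM needed
# to hold a model's weights at different precisions.
def weight_memory_gib(params: float, bytes_per_param: float) -> float:
    """Approximate weight memory in GiB (ignores KV cache and runtime overhead)."""
    return params * bytes_per_param / 2**30

params = 14e9  # 14 billion parameters

# fp16 weights (2 bytes/param): ~26 GiB -- far too large for a 16 GB board
fp16 = weight_memory_gib(params, 2.0)

# ~4-bit quantization (~0.5 bytes/param): ~6.5 GiB -- fits in 16 GB
# with room left for the OS, KV cache, and inference runtime
q4 = weight_memory_gib(params, 0.5)

print(f"fp16: {fp16:.1f} GiB, 4-bit: {q4:.1f} GiB")
```

This illustrates why the 16 GB variant of the Pi 5 matters: at full fp16 precision the weights alone would exceed the board's memory, while quantized weights leave headroom for the rest of the system.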
External Content
Source RSS or Atom Feed
Feed Location http://feeds.feedburner.com/blogspot/advancednano
Feed Title NextBigFuture.com
Feed Link https://www.nextbigfuture.com/