Updates on the AGI Debate, Humanoid Bots and Karpathy Insight
by Brian Wang from NextBigFuture.com on (#7250J)
Tim Dettmers Argues AGI Will Not Happen
Dettmers claims GPU performance-per-cost peaked around 2018, and that post-2018 gains came from one-off architectural tricks that are now exhausted and facing diminishing returns. Transformers are near physically optimal for local computation versus global attention. Scaling laws demand exponential resources for linear gains, leaving only 1-2 years of viable scaling before costs outpace ...
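A minimal sketch of the "exponential resources for linear gains" point, assuming a generic power-law scaling curve rather than any figures Dettmers cites: if loss falls as a small power of compute, then each equal-sized drop in loss costs a multiplicatively larger compute budget. The constant and exponent below are illustrative assumptions only.

```python
# Illustrative only: generic power-law scaling, loss(C) = A * C**(-alpha).
# A and alpha are assumed values chosen for demonstration, not fitted numbers.

A = 10.0       # assumed scale constant (arbitrary units)
alpha = 0.05   # assumed small power-law exponent

def compute_for_loss(target_loss: float) -> float:
    """Invert loss(C) = A * C**(-alpha) to get compute needed for a target loss."""
    return (A / target_loss) ** (1.0 / alpha)

# Request the same absolute (linear) loss improvement repeatedly and
# watch the required compute grow multiplicatively (exponentially).
loss = 5.0
step = 0.5
for _ in range(5):
    c_now = compute_for_loss(loss)
    c_next = compute_for_loss(loss - step)
    print(f"loss {loss:.1f} -> {loss - step:.1f}: compute grows {c_next / c_now:.1f}x")
    loss -= step
```

Under these assumed parameters, each further half-point of loss reduction costs roughly an order of magnitude more compute than the last, which is the shape of the diminishing-returns argument summarized above.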