Risk of extinction by AI should be global priority, say experts
by Geneva Abdul, Technology | The Guardian
Hundreds of tech leaders call for world to treat AI as danger on par with pandemics and nuclear war
A group of leading technology experts from across the world has warned that artificial intelligence should be treated as a societal-scale risk, prioritised in the same class as pandemics and nuclear war.
The statement, signed by hundreds of executives and academics, was released by the Center for AI Safety on Tuesday amid growing concern over the regulation of the technology and the risks it poses to humanity.