'Deepfake' App Causes Fraud And Privacy Fears In China
Arthur T Knackerbracket has found the following story:
An artificial-intelligence app that allows users to insert their faces in place of those of film and TV characters has caused controversy in China. Zao has sparked privacy fears and suggestions that it could be used to defeat facial-recognition systems. It appeared in China on 29 August and has proven wildly popular. But it has led developer Momo to apologise for its end-user agreement, which stripped users of the rights to their images. And as the app went viral, Zao's owners aired fears that users were devouring its expensively purchased server capacity.
Its popularity has also led to assurances from Alipay, part of the Chinese web giant Alibaba, that it is impossible for so-called deepfake videos created by the app to be used to cheat its Smile to Pay facial-recognition system.
Zao is a face-swapping app that uses clips from films and TV shows, convincingly changing a character's face by using selfies from the user's phone.
But some users had noted the app's terms and conditions "gave the developers the global right to permanently use any image created on the app for free", Hong Kong's South China Morning Post reported. "Moreover, the developers had the right to transfer this authorisation to any third party without further permission from the user," the paper said, adding experts believed this broke Chinese law.
Momo subsequently deleted the controversial clause and issued an apology, saying its app would not store users' biometric information or "excessively collect user information", the Shanghai-based The Paper said.
But the popular social media platform WeChat quickly banned users from sharing Zao videos via its platform, citing "security risks".
[...] And lawyer Zhang Xinnian told The Paper that the laws governing phone apps' terms and conditions needed to be tightened.
Zao's initial terms "violated user privacy, and once personal information is leaked and abused it could lead to criminal incidents", he added.
Read more of this story at SoylentNews.