FBI Warns Imminent Deepfake Attacks "Almost Certain"
[Ed. note: As much as this goes against the norm here, I strongly encourage folks to read the entire linked article. We continue to witness dramatic advances in computer capabilities. Just consider what we already have today: AMD's Epyc and Threadripper processors, Apple Silicon (of which the M1 processor is only a taste), multi-terabyte DDR4 memories, and huge farms of SSD storage, all helping leverage the tremendous capabilities of the latest ray-tracing video cards. Consider this a PSA (Public Service Announcement): You've Been Warned. --martyb]
upstart writes in with an IRC submission for c0lo:
FBI Warns Imminent Deepfake Attacks "Almost Certain" - The Debrief:
The Federal Bureau of Investigation (FBI) has issued a unique Private Industry Notification (PIN) on deepfakes, warning companies that "malicious actors almost certainly will leverage synthetic content for cyber and foreign influence operations in the next 12-18 months."
[...] Creating or manipulating images and videos to depict events that never actually happened is hardly new. However, advances in machine learning and artificial intelligence have allowed for the creation of compelling and nearly indistinguishable fake videos and images.
Legacy photo editing software uses various graphic editing techniques to alter, change, or enhance images. Photo editing software such as Photoshop can manipulate pictures to include details or even people that weren't originally in a photo. However, creating convincing false images is highly dependent on a user's skill with the editing software.
In contrast, deepfakes use machine learning and a type of neural network called an autoencoder. An encoder reduces an image to a lower-dimensional latent space, allowing a decoder to reconstruct the image from that latent representation.
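To make the encode/decode idea concrete, here is a minimal, illustrative autoencoder sketch. It is not from the FBI notification or the linked article, and it assumes PyTorch with made-up sizes (64x64 RGB images, a 128-dimensional latent vector); real deepfake pipelines are far larger and train on curated face crops.

# Minimal autoencoder sketch (assumed PyTorch, hypothetical dimensions).
import torch
import torch.nn as nn

class AutoEncoder(nn.Module):
    def __init__(self, image_dim=64 * 64 * 3, latent_dim=128):
        super().__init__()
        # Encoder: compress a flattened image into a lower-dimensional latent vector.
        self.encoder = nn.Sequential(
            nn.Linear(image_dim, 1024), nn.ReLU(),
            nn.Linear(1024, latent_dim),
        )
        # Decoder: reconstruct the image from the latent representation.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 1024), nn.ReLU(),
            nn.Linear(1024, image_dim), nn.Sigmoid(),
        )

    def forward(self, x):
        latent = self.encoder(x)
        return self.decoder(latent)

# One training step sketch: minimize reconstruction error on (stand-in) face images.
model = AutoEncoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
batch = torch.rand(8, 64 * 64 * 3)   # placeholder batch; real training would load face images
loss = loss_fn(model(batch), batch)
loss.backward()
optimizer.step()

Face-swap deepfakes typically extend this idea by training one shared encoder with a separate decoder per identity, so an image of person A can be encoded and then decoded as person B; the sketch above only shows the basic compress-and-reconstruct mechanism.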
Read more of this story at SoylentNews.