Deepfake detection tools must work with dark skin tones, experts warn
by Hibaq Farah, Technology reporter, The Guardian
Fears that bias in training sets would mean minorities bearing brunt of scams, fraud and misinformation
Detection tools being developed to combat the growing threat of deepfakes (realistic-looking fabricated content) must be trained on datasets that include darker skin tones in order to avoid bias, experts have warned.
Most deepfake detectors rely on a machine-learning approach whose accuracy depends largely on the dataset used to train it. The detector then applies AI to spot signs of manipulation that may not be visible to the human eye.