Academic Publishers Turn To AI Software To Catch Bad Scientists Doctoring Data
Shady scientists trying to publish bad research may want to think twice, as academic publishers are increasingly using AI software to automatically spot signs of data tampering. The Register: Duplication of images, where the same picture of, say, a cluster of cells is copied, flipped, rotated, shifted, or cropped, is unfortunately quite common. In cases where the errors aren't accidental, the doctored images are created to make it look as if the researchers had more data and conducted more experiments than they really did. Image duplication was the top reason papers were retracted by the American Association for Cancer Research (AACR) from 2016 to 2020, according to Daniel Evanko, the association's Director of Journal Operations and Systems.

Having to retract a paper damages the reputations of both the authors and the publisher. It shows that the quality of the researchers' work was poor and that the editors' peer review process missed mistakes. To prevent embarrassment for both parties, academic publishers like the AACR have turned to AI software to detect image duplication before a paper is published in a journal. The AACR started trialling Proofig, an image-checking programme developed by an Israeli startup of the same name. Evanko presented results from the pilot study, showing how Proofig affected the AACR's operations, at the International Congress on Peer Review and Scientific Publication conference held in Chicago this week.

The AACR publishes ten research journals and reviews over 13,000 submissions every year. From January 2021 to May 2022, officials used Proofig to screen 1,367 manuscripts that had been provisionally accepted for publication, and contacted authors in 208 cases after reviewing image duplicates flagged by the software. In most cases, the duplication is a sloppy error that can be fixed easily: scientists may have accidentally mixed up their results, and the issue is often resolved by resubmitting new data.
On rare occasions, however, the dodgy images highlighted by the software are a sign of foul play.
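Proofig's actual detection method is proprietary and not described in the article, but the core idea of matching an image against flipped and rotated copies of another can be illustrated with a toy sketch. The snippet below (entirely hypothetical, not Proofig's algorithm) represents a grayscale image as a grid of pixel values and reduces it to a canonical form over all eight flip/rotation orientations, so that any two images related by those transforms compare equal:

```python
# Toy sketch of transform-invariant duplicate detection.
# An "image" here is a tuple of row tuples of pixel values.

def rotate90(img):
    # Rotate a 2D grid 90 degrees clockwise: reverse rows, then transpose.
    return tuple(zip(*img[::-1]))

def orientations(img):
    # All 8 dihedral orientations: 4 rotations, each with its mirror.
    out = []
    g = img
    for _ in range(4):
        out.append(g)
        out.append(tuple(row[::-1] for row in g))  # horizontal mirror
        g = rotate90(g)
    return out

def canonical(img):
    # The lexicographically smallest orientation is the same for every
    # flipped/rotated copy, so it serves as a canonical fingerprint.
    return min(orientations(img))

def is_duplicate(a, b):
    return canonical(a) == canonical(b)

original = ((1, 2), (3, 4))
flipped  = ((2, 1), (4, 3))   # horizontal mirror of original
other    = ((1, 2), (3, 5))

print(is_duplicate(original, flipped))  # True
print(is_duplicate(original, other))    # False
```

Exact pixel matching like this would miss the cropped, shifted, or recompressed copies the article also mentions; a production screening tool would need robust perceptual features rather than literal equality, which is presumably part of what makes software like Proofig non-trivial to build.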