Study shows AI program could verify Wikipedia citations, improving reliability

by Malak Saleh, Engadget

You can't trust everything on a Wikipedia page, which is why it's important to check the original sources cited in the footnotes. But sometimes even the primary sources can lead you astray. Researchers have developed an AI aimed at improving the reliability of Wikipedia references by training algorithms to flag questionable citations on the site.

The program, called SIDE, does two things: it checks whether a primary source actually supports the claim it's attached to, and it suggests new sources. However, the AI operates under the assumption that a Wikipedia claim is true. This means that, while it can check the validity of a source, it can't actually verify the claims made in an entry.
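To make that two-part design concrete, here is a minimal sketch of the verification half: ranking candidate sources by how strongly each one supports a claim that is assumed to be true. This is an illustration, not the researchers' actual pipeline; the natural language inference model, its label order, and the example passages are all assumptions.

```python
# Illustrative sketch only -- not the actual SIDE implementation.
# Assumes the sentence-transformers library and a public NLI cross-encoder;
# the model name and label indices below come from its model card.
from sentence_transformers import CrossEncoder

# An NLI model scores (premise, hypothesis) pairs as
# contradiction / entailment / neutral.
nli = CrossEncoder("cross-encoder/nli-deberta-v3-base")

ENTAILMENT = 1  # index of the "entailment" logit (assumed label order)

def support_score(source_passage: str, claim: str) -> float:
    """How strongly the cited passage entails (supports) the claim."""
    logits = nli.predict([(source_passage, claim)])[0]
    return float(logits[ENTAILMENT])

def rank_candidate_sources(claim: str, passages: list[str]) -> list[tuple[str, float]]:
    """Rank candidate citations by how well each supports the claim.

    Note the built-in assumption, the same one SIDE makes: the claim
    itself is taken to be true; we only judge whether a source backs it.
    """
    scored = [(p, support_score(p, claim)) for p in passages]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

if __name__ == "__main__":
    claim = "The Eiffel Tower was completed in 1889."
    candidates = [
        "The Eiffel Tower opened to the public in May 1889 after two years of construction.",
        "The Eiffel Tower is repainted every seven years.",
    ]
    for passage, score in rank_candidate_sources(claim, candidates):
        print(f"{score:+.2f}  {passage}")
```

A real system would pair a ranker like this with large-scale retrieval to find candidate web pages in the first place; the sketch only covers the scoring step.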

In a study, participants preferred the AI's suggested citations over the originals 70 percent of the time. The researchers found that in nearly 50 percent of cases, SIDE's top suggestion was a source Wikipedia was already using as the primary reference. And 21 percent of the time, SIDE was one step ahead, surfacing a recommendation that human annotators in the study had already deemed appropriate.

While the AI appears able to help an editor verify Wikipedia claims, the researchers admit that alternative programs could outperform their current design in both quality and speed. SIDE's capabilities are limited: the program only considers references that are web pages, whereas Wikipedia also cites books, scientific articles, and information presented in media other than text, such as images and video. Beyond those technical limits, there is the premise of Wikipedia itself, under which any writer anywhere can attach a reference to a topic. The researchers suggest that this could constrain the study, noting that the people who add citations to the site may introduce bias depending on the nature of the topics in question.

Meanwhile, any program, especially an AI that depends on training, can reflect the biases of the people who build it, and the data used to train and evaluate SIDE's models could be limited in that regard. Still, the benefits of using AI to streamline fact-checking, or at least to support it, could have applications well beyond Wikipedia. Wikipedia and social media companies alike must contend with bad actors and bots that flood digital town squares with false information. That is especially pressing now, in the wake of misinformation spreading around the Israel-Hamas war and ahead of the upcoming presidential election in the US. AI tools like SIDE, designed for exactly this purpose, could accelerate efforts to mitigate misinformation online, but some advances still need to be made before they can.

This article originally appeared on Engadget at https://www.engadget.com/study-shows-ai-program-could-verify-wikipedia-citations-improving-reliability-184543711.html?src=rss