Is Wikipedia just as good when the articles are written by a script?

in internet (#3QQ)
At its core, it's a question of quantity versus quality, or the right to access information. But it's also a question about the role humans should play in an ostensibly human-edited encyclopedia. Here not to provoke those questions but simply to add information to Wikipedia is a Swede by the name of Sverker Johansson. He is single-handedly responsible for 2.7 million articles on Wikipedia (8.5% of the entire site). But 'single-handedly' isn't quite right: he wrote and deploys a bot.
Mr. Johansson's program combs databases and other digital sources for information, then packages it into articles. On a good day, he says, his "Lsjbot" creates up to 10,000 new entries. On Wikipedia, any registered user can create an entry. Mr. Johansson finds a reliable database, creates a template for a given subject, and then launches his bot from his computer. The software program searches for information, then publishes it to Wikipedia.
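The workflow described above — pick a database, write a template for the subject area, let the bot fill the template once per record — can be sketched roughly as follows. This is a minimal illustration, not Johansson's actual code: the record fields, template text, and species names are all invented, and real bots would also handle wikitext escaping, deduplication, and publishing via the site's API.

```python
# Hypothetical sketch of a template-filling article bot.
# Field names and template wording are invented for illustration.

ARTICLE_TEMPLATE = (
    "'''{name}''' is a species of {group} in the family {family}. "
    "It was first described by {author} in {year}."
)

REQUIRED_FIELDS = ("name", "group", "family", "author", "year")

def make_stub(record):
    """Fill the article template from one database record."""
    missing = [k for k in REQUIRED_FIELDS if k not in record]
    if missing:
        raise ValueError(f"record missing fields: {missing}")
    return ARTICLE_TEMPLATE.format(**record)

def make_stubs(records):
    """Yield (title, wikitext) pairs, skipping incomplete records."""
    for record in records:
        try:
            yield record["name"], make_stub(record)
        except ValueError:
            continue  # incomplete source data: skip rather than publish

# Example run against a tiny in-memory "database" (fabricated entries):
db = [
    {"name": "Examplia prima", "group": "beetle", "family": "Examplidae",
     "author": "A. Author", "year": 1901},
    {"name": "Examplia secunda", "group": "beetle"},  # incomplete: skipped
]
articles = dict(make_stubs(db))
```

The design choice worth noting is the skip-on-incomplete behavior: a bot generating thousands of entries per day is better off silently dropping a bad record than publishing a stub with holes in it.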

Bots have long been used to author and edit entries on Wikipedia, and, more recently, an increasingly large amount of the site's new content is written by bots. Their use is regulated by Wikipedia users called the "Bot Approvals Group." While Mr. Johansson works to achieve consensus approval for his project, he and his bot-loving peers expect to continue facing resistance. "There is a vocal minority who don't like it," he said during a recent speech on his work. Still, he soldiers on.
Complex questions are at play here: is it better that Wikipedia lack articles that humans can't or won't write? Can robot-written articles be trusted? Should they be labeled and approved? What criteria would be applied, and who would fund and oversee that kind of review body? And lastly: is all this work even worth it in the first place? Do these bot-written articles add any value to everyone's favorite information site?

More coverage at Business Spectator (Australia) and Popular Science.

Re: My opinion (Score: 1)

by zafiro17@pipedot.org on 2014-07-17 11:57 (#2HJ)

I'm kind of with Quadrox on this one. If all this guy is doing is taking primary sources and banging them into Wikipedia's database, wouldn't it be more useful to get those primary sources into shape so they're accessible on their own? And if they're already available, maybe more effort should go into linking to them and letting people know they exist, rather than screen-scraping data just for the purpose of getting it into Wikipedia.

I haven't seen any of these articles, but if they're just stubs they're likely not that useful. Whenever I hit a stub article I avoid it entirely, figuring incomplete means probably never reviewed and therefore untrustworthy.

Maybe somebody should tell this geezer to take up another hobby, like building model trains or something.