Story 3QQ Is Wikipedia just as good when the articles are written by a script?

in internet on (#3QQ)
At its core, it's a question of quantity versus quality, and of the right to access information. But it's also a question about the role humans should play in an ostensibly human-edited encyclopedia. Here not to provoke those questions but simply to add information to Wikipedia is a Swede by the name of Sverker Johansson. He is single-handedly responsible for 2.7 million articles on Wikipedia (8.5% of the entire site). But 'single-handedly' isn't quite right: he wrote, and continues to deploy, a bot.
Mr. Johansson's program scours databases and other digital sources for information, then packages it into articles. On a good day, he says, his "Lsjbot" creates up to 10,000 new entries. On Wikipedia, any registered user can create an entry; Mr. Johansson need only find a reliable database, create a template for a given subject, and launch his bot from his computer. The software searches for information, then publishes it to Wikipedia.
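Lsjbot's actual code isn't shown anywhere in this story, but the workflow described above (pick a reliable database, fill a per-subject template, publish) is easy to sketch. Everything below is hypothetical: the template wording, the field names, and the example record are all made up for illustration, and the real bot would end by posting the wikitext through Wikipedia's edit API rather than printing it.

```python
# Hypothetical sketch of a template-filling article bot, in the spirit of the
# workflow described above. Not Lsjbot's real code; all names and data are invented.

# A per-subject wikitext template with named placeholders:
ARTICLE_TEMPLATE = (
    "'''{name}''' is a {kind} in {region}, {country}. "
    "As of {year}, it had a population of {population}.\n\n"
    "[[Category:{kind_plural} in {country}]]"
)

def make_stub(record: dict) -> str:
    """Fill the subject template from a single database record."""
    return ARTICLE_TEMPLATE.format(**record)

# One invented record standing in for a row from a geographic database:
record = {
    "name": "Exampleville",
    "kind": "fishing village",
    "kind_plural": "Fishing villages",
    "region": "Example Province",
    "country": "Freedonia",
    "year": 2014,
    "population": 1234,
}

# A real bot would submit this wikitext via the site's edit API; here we just print it.
print(make_stub(record))
```

Run over a database of thousands of rows, a loop around `make_stub` is how a single template can yield the "10,000 entries on a good day" figure: the marginal cost of each additional article is one database row.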

Bots have long been used to author and edit entries on Wikipedia, and, more recently, an increasingly large amount of the site's new content is written by bots. Their use is regulated by Wikipedia users called the "Bot Approvals Group." While Mr. Johansson works to achieve consensus approval for his project, he and his bot-loving peers expect to continue facing resistance. "There is a vocal minority who don't like it," he said during a recent speech on his work. Still, he soldiers on.
Complex questions are at play here: is it better that Wikipedia lack articles that humans can't or won't write? Can robot-written articles be trusted? Should they be labeled and approved? What criteria would apply, and who would fund and staff that kind of oversight body? And lastly: is all this work even worth it in the first place? Do these bot-written articles add any value to everyone's favorite information site?

More coverage at Business Spectator (Australia) and Popular Science.
3 comments

My opinion (Score: 1, Interesting)

by Anonymous Coward on 2014-07-15 21:51 (#2GZ)

is it better that Wikipedia lack articles that humans can't or won't write?
Yes. As long as it's properly cited (not that many people look anyway) it's easier access to information. A starting point for further research.
Can robot-written articles be trusted?
No more and no less than any other article. There are many reasons articles shouldn't be trusted: paid editors, people with biased views, people with out-of-date information, etc. You shouldn't take anything at face value; that's what citations and research are for.
Should they be labeled and approved?
Maybe the accounts should be labeled, like paid editors are now supposed to be. I don't see why they need more approval than a "real" person. If you find out a bot has recurring problems, ban it 'til it's fixed. Reverting and deleting isn't a major issue.
And lastly: is all this work even worth it in the first place? Do these bot-written articles even add any value to everyone's favorite information site?
The authors must believe it is worth it. You can't stop it. If people want to they will find a way to flood Wikipedia with whatever articles they want. By trying to limit them you will only inconvenience "real" users.

Re: My opinion (Score: 1)

by quadrox@pipedot.org on 2014-07-16 10:27 (#2H1)

I agree with most of what you are saying, but I would like to add another point.

While it is true that you can't stop people from doing this, I am somewhat concerned that resources are spent (disk space, processing time, etc.) on article stubs that hardly anyone is ever going to miss. Yes, the cited example of some obscure Philippine fishing village that nobody would have known about otherwise is a very fine anecdote, but it's no more than that.

How many of the other fishing-village stubs are ever going to be looked up by anyone, ever? Wikipedia is always asking for more money; is it really worthwhile to spend a lot of resources (8.5% of all articles is not nothing) on this stuff?

Re: My opinion (Score: 1)

by zafiro17@pipedot.org on 2014-07-17 11:57 (#2HJ)

I'm kind of with Quadrox on this one. If all this guy is doing is taking other primary sources and banging them into Wikipedia's database, wouldn't it be more useful to get those primary sources into shape where they are available? And if they're already available, maybe more effort should go into linking to them and letting people know they exist, rather than screen-scraping data just for the purpose of getting it into Wikipedia.

I haven't seen any of these articles, but if they're just stubs they're likely not that useful. Whenever I hit a stub article I avoid it entirely, figuring incomplete means probably never reviewed, and therefore untrustworthy.

Maybe somebody should tell this geezer to take up another hobby, like building model trains or something.