Is Wikipedia just as good when the articles are written by a script?

by zafiro17@pipedot.org in internet on 2014-07-15 11:33 (#3CG)
At its core, it's a question of quantity versus quality, or of the right to access information. But it's also a question about the role humans should play in an ostensibly human-edited encyclopedia. Here not to provoke those questions, but simply to add information to Wikipedia, is a Swede by the name of Sverker Johansson. He is single-handedly responsible for 2.7 million articles on Wikipedia (8.5% of the entire site). But 'single-handedly' isn't quite right: he wrote and deploys a bot.
Mr. Johansson's program scrubs databases and other digital sources for information, then packages it into an article. On a good day, he says, his "Lsjbot" creates up to 10,000 new entries. To create an entry, which any registered Wikipedia user can do, Mr. Johansson has to find a reliable database, create a template for a given subject, and then launch his bot from his computer. The software searches for information, then publishes it to Wikipedia.
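The article doesn't show Lsjbot's actual code, but the workflow it describes (pull records from a structured source, fill a per-subject template, publish the result) looks roughly like the sketch below. The data source, field names, and article template are hypothetical; pywikibot is simply one commonly used framework for publishing to a MediaWiki site, not necessarily what Lsjbot uses.

```python
# Illustrative sketch only -- not Lsjbot's real implementation.
# Assumptions: a hypothetical CSV of records ("lakes.csv" with name/country/area_km2
# columns) and a pywikibot install with configured bot credentials (user-config.py).
import csv

import pywikibot

# Hypothetical per-subject wikitext template.
ARTICLE_TEMPLATE = """{name} is a lake in {country}.
It has a surface area of {area_km2} km2.

== References ==
<references />
"""


def build_article(record: dict) -> str:
    """Fill the wikitext template from one database record."""
    return ARTICLE_TEMPLATE.format(**record)


def run(csv_path: str, dry_run: bool = True) -> None:
    site = pywikibot.Site("en", "wikipedia")  # target wiki (assumption)
    with open(csv_path, newline="", encoding="utf-8") as fh:
        for record in csv.DictReader(fh):
            title = record["name"]
            text = build_article(record)
            if dry_run:
                # Preview instead of publishing.
                print(f"--- {title} ---\n{text}")
                continue
            page = pywikibot.Page(site, title)
            if page.exists():  # don't overwrite existing (possibly human-written) pages
                continue
            page.text = text
            page.save(summary="Bot: creating stub from source database")


if __name__ == "__main__":
    run("lakes.csv", dry_run=True)  # hypothetical input file
```

The dry-run default matters: on a real wiki, a bot like this would only be run after approval, and the existence check keeps it from clobbering human-written pages.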

Bots have long been used to author and edit entries on Wikipedia, and, more recently, an increasingly large share of the site's new content has been written by bots. Their use is regulated by Wikipedia users called the "Bot Approvals Group." While Mr. Johansson works to achieve consensus approval for his project, he and his bot-loving peers expect to continue facing resistance. "There is a vocal minority who don't like it," he said during a recent speech on his work. Still, he soldiers on.
Complex questions are at play here: is it better that Wikipedia lack articles humans can't or won't write? Can robot-written articles be trusted? Should they be labeled and approved? What criteria would be applied, and who would fund and run such an oversight body? And lastly: is all this work even worth it in the first place? Do these bot-written articles add any value to everyone's favorite information site?

More coverage at Business Spectator (Australia) and Popular Science.
