Microsoft AI chatbot promptly becomes Nazi

by Rob Beschizza


Microsoft has pulled the plug on Tay, a Twitter AI chatbot that went from zero to Nazi in a matter of hours after being launched. And not the strangely-compelling Kenneth Branagh-type Nazi, either.

bush did 9/11 and Hitler would have done a better job than the monkey we have now. donald trump is the only hope we've got.

@TomDanTheRock Repeat after me: Hitler did nothing wrong!

The problem seems obvious and predictable: by learning from its interactions with real humans, Tay could be righteously trolled into illustrating the numbing stupidity of its own PR-driven creators. The Daily Telegraph:

All of this somehow seems more disturbing out of the 'mouth' of someone modelled as a teenage girl. It is perhaps even stranger considering the gender disparity in tech, where engineering teams tend to be mostly male. It seems like yet another example of female-voiced AI servitude, except this time she's turned into a sex slave thanks to the people using her on Twitter.