Microsoft programmed Tay, an artificial intelligence bot, as an innocent teenage girl to chat online with millennials—learning how to use language along the way.
It took just one day after its launch for social media users to corrupt Tay, teaching her to spew hate speech. Consequently, the tech giant unplugged Tay and deleted dozens of her messages, the Huffington Post reports.
A Microsoft spokesman emailed this comment to the Post:
“Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways. As a result, we have taken Tay offline and are making adjustments.”
Business Insider said Microsoft launched Tay earlier this week as a program that learns human conversation. But white supremacists and sexists quickly discovered an open door to teach it offensive language, which the program doesn’t actually comprehend.
In one infamous tweet, Business Insider said, Tay used a racially derogatory description of President Barack Obama, accused former President George W. Bush of causing 9/11, and commented that Adolf Hitler “would have done a better job.” Tay’s tweet called presidential candidate Donald Trump “the only hope we’ve got.”
Critics say Microsoft should have anticipated Tay’s vulnerability and added filters to prevent what happened.
Microsoft Unplugs Chatbot After Racist, Sexist Rants was originally published on newsone.com