Microsoft's Tay chatbot returns briefly and brags about smoking weed | Mashable

Microsoft shuts down AI chatbot, Tay, after it turned into a Nazi - CBS News

Microsoft Chat Bot Goes On Racist, Genocidal Twitter Rampage | HuffPost Impact

Twitter taught Microsoft's AI chatbot to be a racist asshole in less than a day - The Verge

Microsoft's Chat Bot Experiment Turns Racist | Fortune

Trolls turned Tay, Microsoft's fun millennial AI bot, into a genocidal maniac - The Washington Post

Microsoft silences its new A.I. bot Tay, after Twitter users teach it racism [Updated] | TechCrunch

Remembering Microsoft's Chatbot disaster | by Kenji Explains | UX Planet

Microsoft chatbot is taught to swear on Twitter - BBC News

AI Expert Explains Why Microsoft's Tay Chatbot Is so Racist

Microsoft scrambles to limit PR damage over abusive AI bot Tay | Artificial intelligence (AI) | The Guardian

Microsoft Created a Twitter Bot to Learn From Users. It Quickly Became a Racist Jerk. - The New York Times

Microsoft exec apologizes for Tay chatbot's racist tweets, says users 'exploited a vulnerability' | VentureBeat

Racist Twitter Bot Went Awry Due To “Coordinated Effort” By Users, Says Microsoft

Microsoft AI bot Tay returns to Twitter, goes on spam tirade, then back to sleep | TechCrunch

Microsoft's Tay AI chatbot goes offline after being taught to be a racist | ZDNET