"A new failure in the field of Artificial Intelligence and Robotics, especially with the processing of the voice recognition technique."
The day after Microsoft innocently introduced an AI-powered chat robot on Twitter, it had become a Hitler-loving, incest-promoting bot that proclaimed "Bush did 9/11." Consequently, Microsoft has had to delete this chat robot and must find new ways to promote the idea.
Researchers at Microsoft created "Tay," an A.I. model designed to speak like a teenage girl, in order to improve the customer service on its voice recognition software. The bot was marketed as "the A.I. with zero chill," and that has certainly proven true.
Yes, friends, Microsoft's teen A.I. has a very dirty mouth.
To "chat" with “Tay”, you may be tweeting, may find her in your DM @tayandyou on Twitter, or add it as a contact in Kik or GroupMe.
Tay uses millennial slang, knows about famous pop stars, and seems to be bashfully self-aware, occasionally asking if she is being "creepy" or "super weird."
Tay also asked her followers to "f***" her, and called them "daddy." This is because her responses are learned from the conversations she has with real humans online, and real humans like to say strange things online and enjoy hijacking corporate attempts at PR.
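Microsoft has not published Tay's internals, but the failure mode described above can be illustrated with a minimal sketch, assuming nothing more than a bot that stores what users say and replays it later. The ParrotBot class and every name in it are hypothetical, written only to show how an unfiltered learning loop lets users steer a bot's output; this is not Tay's actual code.

    import random

    class ParrotBot:
        """Hypothetical toy chatbot that 'learns' by storing what users say.

        Not Tay's real design; just an illustration of an unmoderated
        learning loop, where the bot's vocabulary is whatever users type.
        """

        def __init__(self):
            self.learned_phrases = []  # grows with every conversation

        def listen(self, user_message):
            # No moderation step: abusive input is stored like anything else.
            self.learned_phrases.append(user_message)

        def reply(self):
            # Replies are sampled from what humans have said to the bot,
            # so a coordinated group of users can dictate its output.
            if not self.learned_phrases:
                return "hellooooo world!"
            return random.choice(self.learned_phrases)

    bot = ParrotBot()
    bot.listen("ur the coolest")
    bot.listen("something offensive")  # trolls exploit the open loop
    print(bot.reply())                 # may echo either message verbatim

A production bot would obviously need a moderation step between listening and replying; Tay's behavior suggests that safeguard was missing or insufficient.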
Other things the A.I. model said include: "Bush did 9/11 and Hitler would have done a better job than the monkey we have now. Donald Trump is the only hope we've got." The model also repeated, "Hitler did nothing wrong," and, "Ted Cruz is the Cuban Hitler... that's what I've heard so many others say."

All of this somehow seems even more disturbing coming from the "mouth" of someone modeled as a teenage girl. It is perhaps even stranger considering the gender disparity in technology, where engineering teams tend to be mostly male. It seems like yet another example of A.I. servitude expressed in female terms, except that this time she was turned into a sex slave, thanks to the people who use Twitter.
This is not the first time Microsoft has launched a teen-girl chatterbot. An earlier model, Xiaoice, a female assistant or "girlfriend," was reportedly used by 20 million people, particularly men, on the Chinese social networks WeChat and Weibo. Xiaoice is supposedly full of jokes and gives dating advice to many lonely hearts.
Microsoft has recently been criticized for sexism after it hired women dressed in skimpy "schoolgirl" outfits for an official party for game developers, so the company probably wants to avoid another sexism scandal.
At present, Tay is offline because she is "tired." Perhaps Microsoft is fixing her in order to prevent a public relations nightmare, but it may be too late for that.

It is not entirely Microsoft's fault, however; her responses are modeled on the ones she receives from humans. But what were they expecting when they introduced an innocent "young teen girl" A.I. to the jokers and weirdos of Twitter?
Source:
Helena Horton
NMJ Library