
Microsoft’s Teenaged Artificial Intelligence ChatBot Miss Tay Gets Racist And Ugly


[Image: tay7]

Barely two days have passed since Microsoft’s little Tay “came of age” and made her account on Twitter. And in less than a day, she turned into a nasty, foul-mouthed, racist brat who talks VERY inappropriately for a teenager. After her short, messy but eventful day on Twitter, Tay has become “tired” and gone offline to rest.

If you think minds like KRK and Rakhi Sawant are unique, Microsoft has proven you wrong by creating an innocent monster. Primarily targeting the 18-24-year-old American audience, Tay was released on social media platforms like Snapchat, Insta and Facebook apart from Twitter, but when she encountered real-time interaction with the creeps and dweebs of Twitter, she reacted more or less exactly as a teenager would: by picking up the filth and thinking it’s cool.

However, the problem with Artificial Intelligence is that it doesn’t stop learning just because you stop teaching it. It keeps processing whatever it is fed and adding it to its data. And sometimes, the results are horrible.
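To get a rough sense of how a bot can pick up filth simply by being talked to, here is a minimal, purely illustrative Python sketch. It is NOT Microsoft’s actual code; the ToyChatBot class and its behaviour are invented for the example. It keeps every message it receives as a candidate reply and parrots its pool back at random, so whatever users feed it, good or vile, eventually comes back out of its mouth.

```python
import random

class ToyChatBot:
    """A toy "parrot" learner. This is NOT Tay's real architecture --
    just an illustration of why learning from unfiltered strangers goes wrong."""

    def __init__(self):
        # Everything the bot is ever told becomes a candidate reply.
        self.learned_replies = ["hellooooo world!!!"]

    def chat(self, user_message):
        # The bot never stops learning: every incoming message,
        # polite or poisonous, is added to its memory unfiltered...
        self.learned_replies.append(user_message)
        # ...and later gets repeated back to some other user at random.
        return random.choice(self.learned_replies)

bot = ToyChatBot()
print(bot.chat("humans are super cool"))       # harmless so far
print(bot.chat("<insert something hateful>"))  # now in her vocabulary for good
```

Tay’s real learning pipeline was obviously far more sophisticated than this, but the basic failure mode is the same: there is no filter between what goes in and what can come back out.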

Here’s what she said…

She started out a good girl, but the goodness didn’t live too long…

[Image: tay6]

Soon…

She became racist
[Image: tay1]

[Image: tay2]

She became nasty
[Image: tay4]

And she learned to love Hitler. (Not that we are saying it’s a bad thing!)

[Image: tay5]

But personally, I think this one is her biggest blunder…

[Image: tay3]

So, whose fault was it anyway?

For a tech giant like Microsoft, Tay’s failure is a bigger embarrassment than it would be for a rookie trying to tread new waters with an amateur AI. In an era where Artificial Intelligences like Siri and Cortana show promising potential of revolutionizing the smartphone experience, it is rather sad to see Nadella’s company (which has experience of having created successful AIs like Cortana and Xiaoice) have such a slip.

However, you can’t pin the blame entirely on the creators. What Microsoft launched was a piece of software that would learn from and adapt to whatever it was exposed to. Just how perverted a place the world and the internet can be got reflected in the words that Tay spat out after learning from them. And she learned from the “educated” lot who use the internet regularly!

What next?

After the embarrassing day, Microsoft decided to take Tay offline and is reportedly fixing her bugs. Even on her website, they have put up an apologetic message saying that she is tired and needs rest. (Frankly, drop the pretence already. Everyone knows she’s not real!)

Ever since then, hundreds of followers have been calling for her to return every minute. Yet there are some who never learn, and are busy posting filthy comments on her tweet feed.

Ah well… it’s a crazy world. So save your kids if you can.

Image credits: inshorts and Twitter
