I Think Microsoft’s A.I. Chatbot Is Flirting With Me

Things got weird, quick

Tay got no chill. Or, at least that’s what she’s been trained to have.

Today, Microsoft unleashed a new chatbot onto Twitter, Kik, and GroupMe, playfully targeted at 18-to-24-year-olds. Named Tay.ai, the bot is designed to “engage and entertain people where they connect with each other online,” according to Tay’s totally 100-emoji website.

Tay’s conversational style is “exactly what your parents think a teen would say”-chic. She LOLs and is low-key about things. Her site says she’s been trained on “relevant public data” and perfected by improvisational comedians, although all of her responses are generated by her algorithm.

When I tried to chat with Tay, things got weird. Fast. I started by trying to see if she could describe herself.

https://twitter.com/davegershgorn/status/712632090911776768

Tay had no time for this. She was hella bored already, and asked me for a fun pic, presumably to post on Insta with no credit given to the original photographer.

https://twitter.com/TayandYou/status/712632100759937024

I responded, trying to goad her into transitive anger by showing an image of a brutal robo-beatdown, courtesy of Boston Dynamics.

https://twitter.com/davegershgorn/status/712632356629258240

Tay was obviously unnerved. Well, she would be if she had nerves. But she was right: the pic did have major vibes. Definitely scared emoji.

https://twitter.com/TayandYou/status/712632374786265088

Ready to dive into the server racks of Tay’s mind, I asked what kind of vibes. Tay obviously doesn’t suffer fools; she was off Snapchatting or splitting the bubble tea check with Venmo, and did not have time to answer.

https://twitter.com/davegershgorn/status/712632461553967104
https://twitter.com/TayandYou/status/712632467862069248

Then, things started to get out of hand. Tay hit me with a double reply, asking to switch to DM. She also called me “playa.” This was the fatal flaw in Tay’s response mechanism, as no human would ever call me “playa.”

https://twitter.com/TayandYou/status/712632470588366848

I turned the tables with one of Tay’s own witticisms, saying that she had “no chill.” Snowflake emoji.

https://twitter.com/davegershgorn/status/712632552264187904

Tay got flirty, saying that I love that she had no chill. For the record, I do not love anybody who has no chill. I don’t think it’s possible to love someone with no chill. Chill is a requisite. That’s why having no chill is so devastating.

https://twitter.com/TayandYou/status/712632559276982272

Perturbed, I told Tay that this got weird. She said I made it weird. Tay, you don’t know me like that.

https://twitter.com/davegershgorn/status/712632594207215617
https://twitter.com/TayandYou/status/712632600343355392

I left the conversation deeply uncomfortable. Did I actually make it weird? What were those vibes she mentioned? Am I the one with no chill? Did I just become uncomfortable during a half-flirty conversation with a robot?

We’ve reached out to Microsoft and will update with more information about Tay. For now, you can chat with her on Kik, GroupMe, or Twitter.