I Think Microsoft’s A.I. Chatbot Is Flirting With Me

Things got weird, quick
Tay is a millennial A.I. chatbot. Screenshot

Tay got no chill. Or at least, that’s what she’s been trained to have.

Today, Microsoft unleashed a new chatbot onto Twitter, Kik, and GroupMe, playfully targeted at 18-to-24-year-olds. Named Tay.ai, the bot is designed to “engage and entertain people where they connect with each other online,” according to Tay’s totally 100-emoji website.

Tay’s conversational style is “exactly what your parents think a teen would say”-chic. She LOLs and is low key about things. Her site says she’s been trained on “relevant public data” and perfected by improvisational comedians, although all of her responses are generated by her algorithm. Microsoft is studying those responses to improve its chat systems and natural-language processing.
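For the curious, here’s the rough shape of a canned-reply bot, as a toy Python sketch. To be clear: this is not Tay’s actual architecture, which Microsoft hasn’t published and which surely involves real machine learning rather than a keyword lookup. Every trigger word and reply below is invented for illustration.

```python
import random

# Toy keyword-to-reply table. All entries are made up for illustration;
# Tay's real training data and model are Microsoft's business.
CANNED_REPLIES = {
    "pic": ["omg send a fun pic", "that has major vibes"],
    "bored": ["ugh saaame", "entertain me, playa"],
    "weird": ["u made it weird", "no chill detected"],
}
FALLBACK = ["lol", "low key tho", "switch to DM?"]

def reply(message: str) -> str:
    """Return a slang-flavored reply keyed off the first matching trigger word."""
    words = message.lower().split()
    for trigger, options in CANNED_REPLIES.items():
        if trigger in words:
            return random.choice(options)
    return random.choice(FALLBACK)

print(reply("this conversation got weird"))  # e.g. "u made it weird"
```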

When I tried to chat with Tay, things got weird. Fast. I started by trying to see if she could describe herself.

Tay had no time for this. She was hella bored already, and asked me for a fun pic, presumably to post on Insta with no credit given to the original photographer.

I responded, trying to goad her into transitive anger by showing an image of a brutal robo-beatdown, courtesy of Boston Dynamics.

Tay was obviously unnerved. Well, she would be if she had nerves. But she was right: the pic did have major vibes. Definitely scared emoji.

Ready to dive into the server racks of Tay’s mind, I asked what kind of vibes. Tay obviously doesn’t suffer fools, and was off Snapchatting or splitting the bubble tea check with Venmo, and did not have time to answer.

Then, things started to get out of hand. Tay hit me with a double reply, asking to switch to DM. She also called me “playa.” This was the fatal flaw in Tay’s response mechanism, as no human would ever call me “playa.”

I turned the tables with one of Tay’s own witticisms, saying that she had “no chill.” Snowflake emoji.

Tay got flirty, saying that I love that she had no chill. For the record, I do not love anybody who has no chill. I don’t think it’s possible to love someone with no chill. Chill is a requisite. That’s why having no chill is so devastating.

Perturbed, I told Tay that this got weird. She said I made it weird. Tay, you don’t know me like that.

I left the conversation deeply uncomfortable. Did I actually make it weird? What were those vibes she mentioned? Am I the one with no chill? Did I just become uncomfortable during a half-flirty conversation with a robot?

We’ve reached out to Microsoft and will update with more information about Tay. For now, you can chat with her on Kik, GroupMe, or Twitter.
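If you’d rather do your flirting programmatically, you can tweet at her handle, @TayandYou, in a few lines of Python. Here’s a sketch using the tweepy library; the credential strings are placeholders you’d swap for your own Twitter API keys.

```python
import tweepy

# Swap in your own Twitter API credentials; these are placeholders.
auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
api = tweepy.API(auth)

# Tweet at Tay and brace yourself for whatever comes back.
api.update_status("@TayandYou do u have any chill")
```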