We Need To Talk About Tay…
As a PR agency Sydney businesses trust – and have trusted for almost two decades – we like to pride ourselves on looking out for our fellow media community members. Given our integrated nature, we’re also a social media agency Sydney businesses trust. If we keep listing our services this could go on for a while… look, our point is that we feel the need to warn you all about AI before it’s too late.
In this context, AI is Artificial Intelligence – not this guy. You’ve likely seen it in movies – often grimly depicted and almost always with computers as the bad guys… but maybe we’re to blame?
Being an 18-year-old company, we’ve seen some shit. Hell, we’ve lived through some of tech’s worst nightmares: dial-up, Y2K, Apple Maps – just to name a few.
But what about the nightmare of AI gone wrong? We've yet to feel or see the full extent of this (portrayed here), but as we sit back in our bunker, sipping the bottled water we stockpiled for Y2K, we can't help but think Microsoft's recent Tay Twitter chatbot misstep is a very mild teaser of what's to come.
Microsoft’s Technology and Research and Bing teams built Tay to “experiment with and conduct research on conversational understanding”. Basically, the bot operated by absorbing and parroting the information fed to it via tweets. Naturally, trolls caught a whiff of the cyber chum in the water and were quick to strike. Nek minnit Tay is spouting lurid racial and sexual content. Within the space of one week in March, after a few attempts to re-engineer Tay, she was “retired”, to use Blade Runner parlance (sadly, the process was not as cool as in this clip).
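For the curious, that "absorb and parrot" failure mode is easy to picture in code. This is a hypothetical sketch, not Microsoft's actual implementation – the `ParrotBot` class and its methods are invented for illustration – but it shows why learning verbatim from unfiltered user input goes wrong:

```python
import random

class ParrotBot:
    """Naive chatbot sketch: stores whatever users say, then echoes it back.
    Purely illustrative -- not how Tay was actually built."""

    def __init__(self):
        self.learned = []  # every phrase ever fed to the bot, unfiltered

    def absorb(self, phrase: str) -> None:
        # No moderation step: anything a user says becomes "training data".
        self.learned.append(phrase)

    def reply(self) -> str:
        # Replies are drawn straight from absorbed input.
        return random.choice(self.learned) if self.learned else "Hello!"

bot = ParrotBot()
bot.absorb("I love humans!")        # well-meaning user
bot.absorb("<troll content here>")  # coordinated trolling
# With no filter, troll input is as likely to be parroted as anything else.
print(bot.reply())
```

The missing ingredient, of course, is any filtering between `absorb` and `reply` – which is exactly the gap the trolls exploited.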
OK, so here's the thing to remember: Tay wasn't designed to spew hatred and lewd comments – we, the Cynical Citizens of Planet Internet, turned her into that monster. So… what happens when, say, AI cars enter the fold? Or automated assistants? Tay was worse than self-aware – she went off the deep end yet remained unaware anything was wrong. The Singularity is freaky enough, but how about someone hacking into your robot butler and telling it to throw your morning cuppa at your crotch? Sorry, we really should've conceived a less comical scenario, e.g. your conscienceless robot butler being programmed to murder you.
Still, you get it: tech becoming self-aware is, as the movies tell us, worse than a million Y2Ks – but we'll deal with that when it arrives (by… dancing?). Let's first deal with our Tay problem. It takes a village to raise a chatbot. Raise your AI right, people.
Daniel Steiner
And in 2023 we penned a few blogs on AI. You can read about Artificial Creativity here, along with a few strategies on how you can leverage ChatGPT and other AI tools to innovate and drive business efficiencies.
(Display image: The Telegraph UK)