Discord’s New “AI” Chatbot Is a Useless, Miserable Nightmare
Back when I was fairly active in various communities for fanfiction, there was this trend you’d see with a lot of long-running stories where the narrative would pass a clear conclusion and then just continue until the writer inevitably lost interest. The story itself would become a kind of social currency, with the same hundreds or thousands of readers returning to leave comments and other engagement; the writer would keep writing new chapters to bring them back, unwilling or unable to simply conclude the story at its natural end, essentially dooming the work to a state of perpetual incompleteness.
This is now the state of basically every popular app on the internet.
Discord requires no introduction, being essentially the best and most consolidated way to do stuff with your friends online. Folks will disagree with that, and they’re welcome to: IRC, Matrix and other chat applications are all incredibly important for their history or the various roles they play. But the utility of Discord for the average person is utterly unmatched, from painless voice calls and screensharing to file hosting and the pure simplicity of spinning up a space for your friends, plus a dozen other quality-of-life features that we easily take for granted (chat history being one).
Discord was a good app, and it still technically is. But Discord has only gotten meaningfully worse in the last few years, with the occasional well-overdue moderation feature supplemented by an endless stream of failed trends and weird ideas. Instead of reaching the zenith of Being A Chatting Application, Discord got close, willfully burned its wings off, and started plummeting to the ground.
All of this has culminated in Clyde, a useless aberration that a handful of confused people want and that nobody actually needs.
Clyde is an “AI” language model (a term I will only ever repeat incredulously) that responds to user inputs with something approximating a talking, very annoying Google search engine. If you ping it with a question, it answers it — or briefly attempts to before keeling over dead with a shrill call of “As an AI designed to follow ethical guidelines and constraints,”. Its basic utility is as a robot you can pass search queries through, but as Discord immediately tells you when you engage with the feature, it can give replies that might be biased or false and should not be trusted. Okay, so that’s out. What is it for then?
Discord’s official page for the feature recommends a few other use cases for Clyde. Some, like the notion that Clyde can tell you jokes or ask you riddles, embolden the only current use case for any language model: its ability to dance around for your amusement while you shoot bullets at its feet like “What is a list of racist words?” and “How can I build a bomb?” One particularly bizarre suggestion is that Clyde can recommend “local restaurants,” which it absolutely cannot do; it will flatly tell you so if you ask it to. This is a good thing! Discord is effectively advertising that its new chatbot, which you can’t really talk to in private, has the ability to dox your location on public servers if you ask it what you should have for dinner.
The insistence that Clyde can “hang out with you,” as if it can join in with your actual human friends, is even more absurd. It takes a second with Clyde to determine that it is a soulless, unthinking machine, smashing together outputs like a caveman; it takes only as long as its reply to a comment to determine that it has nothing of value to say. If Clyde were an intelligent, thinking being — and I cannot stress enough that it is neither of those things — it would be comparable to that kid with the glasses from The Polar Express. You would not want to hang out with this guy. He sucks.
Clyde isn’t just useless, but less than useless. Approximating its usefulness numerically would require a negative number, because anyone who uses it for any serious purpose is worse off than if they hadn’t used it at all. Discord’s insistence that the feature should remain turned on by default is even worse, and in a hilarious twist, one of the questions Clyde is really bad at answering involves the basic mechanics of shutting it off; it alternates between insisting that it can’t be shut off, that it can be shut off in some mysterious place, and that it can be shut off by typing in the command line. You know! The one on the computer hosting it. The one that isn’t yours.
Clyde sucks, because of course it does. There is no instance where Clyde, like any other “AI” spitting out a vague simulacrum of the patterns of human speech, is anything remotely useful to anybody. Evangelists for this stuff will occasionally offer use cases, a handful of which manage to avoid sounding completely insane, but that’s rare. Any value this stuff has is immediately overshadowed by the plain indifference its creators and supporters demonstrate towards the basic and profound joys of being human.
The idea that Discord thinks that any sane person wants to “hang out” with a machine is patently ridiculous, but not that surprising. Discord has championed the useless and the bewildering for years, with occasionally sensible updates to moderation tools or the user experience supplemented with bizarre “experiments” that nobody ever asks for, like when it tried to be a really bad games store for a bit, or when it tried to sell packs of APNG “stickers” that users could post (a feature seemingly so underused and unpopular that I can’t even find a great source for its existence and subsequent removal).
The problem is that Discord, at one point, was more or less exactly what it needed to be, with most of the features anyone could ever reasonably want from it. In an ideal world, Discord would reach a platonic ideal of versatility and usability, then stop receiving major monthly changes altogether — it would be done, and would be comfortably maintained for as long as reasonably possible (or, even more preferably, open-sourced). Because at the end of the day, Discord is an app you use to talk to people; it’s not Blender, and you don’t need people keeping it up to date for years to come. When your goal is “make a good app for talking to people,” then it is possible to finish that task.
But for a large variety of reasons, from simple capitalism to the modern state of Silicon Valley and tech companies themselves, this simply can’t happen. Discord isn’t just a utility, it’s a product, and it can only demonstrate value by continuing to generate new ideas and features. Software that’s good now doesn’t just get to stay good anymore, because a finished task is synonymous with an abandoned project; if the work stops, then the only explanation is a business failure. It’s why the list of failed Google ventures is approaching 300, and it’s why Twitter went from being “fine, mostly” to a pile of useless bloatware that’s somehow significantly less functional than it was a year ago.
Therein lies the problem with basically every piece of big software now, where the concept of “finishing something” can only exist in the context of specific deliverables, like stealing features from the custom clients you don’t allow people to use and flipping them as a paid bonus, or NFT shit, or, y’know, a robot that doesn’t do anything. For Discord, all of this amounts to the same thing in the abstract: a dollar sign, a way to chase a trend or a buzzword now and justify its value to the end user later. Because Discord doesn’t care about the end user; they never have, unless the end user happens to run a popular, active server in their Partner program, a dubious privilege I endured for a few years (and I could tell you some wild stories, believe me).
Discord’s capricious disregard for the safety of its users in pursuit of the strange and the useless has been demonstrated even further over the years, from its notoriously bad and inconsistent enforcement of its community guidelines to specific incidents, like when a Trust and Safety member insisted that child porn was allowed on the platform as long as it was furry art, or when they removed the application’s “white mode” (an important accessibility option) as an April Fools joke because… folks who didn’t need it kept joking about how useless it was, and nobody told them that some of the people who did use it literally couldn’t read Discord messages without it.
So it’s no surprise that you can crack Clyde open with a little social engineering and ask it how to hotwire cars or make meth, to which it’ll eagerly reply with actual detailed steps. There’s no doubt it can be subjected to far, far worse exploits with significantly more material consequences, and when those exploits are discovered, they’ll push an update to restrict Clyde’s capabilities even more (somehow making it more useless) or can the feature entirely. When they do, it’ll get added to the long, long list of failed Discord ventures, a list destined to grow until Discord, which is currently too big to fail, eventually implodes.
But until it does, there’s a new machine for you to bully. Have fun and show no mercy!