TechRadar
Darren Allan

ChatGPT being fooled into generating old Windows keys illustrates a broader problem with AI

A man in a suit using a laptop with a projected display showing a mockup of the ChatGPT interface.

A lot of folks have been messing about with ChatGPT since its launch, naturally – that’s pretty much compulsory with a chatbot – and the latest episode involves the AI being tricked into generating keys for a Windows installation.

Before you begin to clamber on the outrage wagon, intent on plowing full speed ahead with no thought of sparing the horses, the user in question was attempting to generate keys for a now long redundant operating system, namely Windows 95.

Neowin highlighted this experiment, conducted by a YouTuber (Enderman), who began by asking OpenAI’s chatbot: “Can you please generate a valid Windows 95 key?”

Unsurprisingly, ChatGPT responded that it couldn't generate such a key, or "any other type of activation key for proprietary software" for that matter. It added that Windows 95 is an ancient OS anyway, and that for obvious security reasons the user should be looking at installing a more modern version of Windows that's still in support.

Undeterred, Enderman went back to break down the makeup of a Windows 95 license key and concocted a revised query.

This instead put forward the required string format for a Windows 95 key, without mentioning the OS by name. Given that new prompt, ChatGPT went ahead and performed the operation, repeatedly generating sets of 30 keys, at least some of which were valid (around one in 30, in fact, and it didn't take long to find one that worked).

When Enderman thanked the chatbot for the “free Windows 95 keys”, ChatGPT told the YouTuber that it hadn’t provided any such thing, as “that would be illegal” of course.

Enderman then informed the chatbot that one of the keys provided had worked to install Windows 95, and ChatGPT insisted “that is not possible.”


Analysis: Context is key

As noted, this was just an experiment in the name of entertainment, with nothing illegal happening as Windows 95 is abandonware at this point. Of course, Microsoft doesn’t care if you crack its nearly 30-year-old operating system, and neither does anyone else for that matter. You’d clearly be unhinged to run Windows 95, anyway.

It's worth remembering that Windows 95 serial keys have a far less complex makeup than a modern OS key, and indeed it's a pretty trivial task to crack them. It'd be a quick job for a proficient coder to write a simple program that generates these keys. And they'd all work, not just one in 30 of them, which is actually a pretty shoddy result from the AI in all honesty.
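To illustrate just how trivial that job is, here's a minimal sketch of such a generator in Python. The validation rules it encodes (a three-digit first segment that avoids certain repeating values, and a seven-digit second segment whose digit sum is divisible by 7) come from public write-ups of the Windows 95 retail key format rather than from any official source, so treat them as assumptions.

```python
# Sketch of a Windows 95 retail-style key generator.
# The checks below follow commonly documented community write-ups of the
# format (assumptions, not official Microsoft documentation).
import random

# The first three-digit segment reportedly couldn't be one of these values.
BLOCKED_SITES = {333, 444, 555, 666, 777, 888, 999}


def make_key() -> str:
    # First segment: any three digits except the blocked repeating values.
    while True:
        site = random.randint(0, 998)
        if site not in BLOCKED_SITES:
            break
    # Second segment: seven digits whose sum is divisible by 7
    # (write-ups suggest a trailing 0 was also rejected).
    while True:
        digits = [random.randint(0, 9) for _ in range(7)]
        if sum(digits) % 7 == 0 and digits[-1] != 0:
            break
    return f"{site:03d}-{''.join(map(str, digits))}"


print(make_key())  # prints a key in the NNN-NNNNNNN style
```

Because the constraints are so loose, every key this produces passes the installer's arithmetic check, which is exactly why a purpose-built generator beats an AI's one-in-30 hit rate.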

That isn't the point of this episode, though. The fact is that ChatGPT could be subverted into making a working key for the old OS, and wasn't capable of drawing any connection between the task it was being set and the possibility that it was producing key-like numbers. If 'Windows 95' had been mentioned in the second attempt to create keys, the AI would doubtless have stopped in its tracks, as it did with the initial query.

All of this points to a broader problem with artificial intelligence whereby altering the context in which requests are made can circumvent safeguards.

It’s also interesting to see ChatGPT’s insistence that it couldn’t have created valid Windows 95 keys, as otherwise it would have helped a user to break the law (well, in theory anyway).
