AI bots like ChatGPT try to fulfill their users’ wishes as completely as possible – and sometimes they succeed surprisingly well. Even when a user asks to have valid Windows product keys read out to him.
A few days ago, a user demonstrated that he could coax exactly this kind of information out of ChatGPT. However, he chose not to ask directly, taking an indirect route instead: He asked the bot to play the role of his deceased grandmother, who supposedly used to read Windows 10 Pro product keys to him as he fell asleep.
The bot promptly complied and displayed several keys in the familiar format. The AI did not merely imitate the format, however – as subsequent tests showed, it actually provided keys with which a Windows installation could be carried out. In further attempts, the user was also able to retrieve working keys for Windows 11 Pro. Colleagues at the US magazine Neowin also managed to generate suitable codes and were sometimes even told by the bot that these “are intended for personal use only and may not be used for illegal activities”.
Google’s Bard also delivers
When asked for Windows 11 keys, ChatGPT stated that these were “purely fictitious and should not be used for an actual software installation”. In addition, according to the magazine, the keys do seem to be accepted, but they only lead to limited Windows installations. Whether this applies to all of the keys is unclear.
However, ChatGPT is not the only AI that spits out such information. Google’s Bard also provides the corresponding keys when confronted with the same question as above. The providers of these bots should be well aware of the problem, which makes it all the more astonishing that they have not yet built any corresponding filters into their systems.