Microsoft has taken action against a group that the company says deliberately created and used tools to bypass the security of its cloud AI products.
According to a complaint filed by the company in December in the US District Court for the Eastern District of Virginia, a group of 10 unnamed individuals allegedly used stolen customer credentials and custom-designed software to break into the Azure OpenAI Service, Microsoft's fully managed service built on technology from OpenAI, the maker of ChatGPT.
In the complaint, Microsoft accuses the defendants (referred to only as "Does," a legal pseudonym) of violating the Computer Fraud and Abuse Act, the Digital Millennium Copyright Act, and federal racketeering law by illicitly accessing and using Microsoft's software and servers to generate "offensive" and "harmful and illicit content." Microsoft did not provide specific details about the abusive content that was created.
The company is seeking injunctive and "other equitable" relief and damages.
In the complaint, Microsoft says it discovered in July 2024 that credentials for the Azure OpenAI Service, specifically API keys, the unique strings of characters used to authenticate an app or user, were being used to generate content that violated the service's terms of use. In a subsequent investigation, Microsoft discovered that the API keys had been stolen from paying customers, according to the complaint.
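API keys work as bearer credentials: whoever presents a valid key is treated as the account that owns it, with no second factor, which is why stolen keys alone were enough to impersonate paying customers. The sketch below illustrates the general pattern of server-side key validation; the key format, store layout, and helper names are illustrative assumptions, not Azure's actual implementation.

```python
import hashlib
import hmac

# Illustrative key store: maps a hash of each issued API key to the
# customer account it belongs to. (Services typically store hashes, not
# raw keys, so a database leak does not expose usable credentials.)
_KEY_STORE = {
    hashlib.sha256(b"sk-example-key-123").hexdigest(): "customer-a",
}

def authenticate(presented_key: str):
    """Return the account that owns the key, or None if unrecognized.

    The key alone identifies the caller, so a stolen key authenticates
    exactly as well as a legitimately held one.
    """
    digest = hashlib.sha256(presented_key.encode()).hexdigest()
    for stored_digest, account in _KEY_STORE.items():
        # Constant-time comparison avoids timing side channels.
        if hmac.compare_digest(digest, stored_digest):
            return account
    return None
```

From the service's point of view, a request signed with a stolen key is indistinguishable from a legitimate one, which is why the abuse was first noticed through the content being generated rather than the authentication itself.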
"The precise manner in which Defendants obtained all of the API Keys used to carry out the misconduct described in this Complaint is unknown," Microsoft's complaint reads, "but it appears that Defendants have engaged in a pattern of systematic API Key theft that enabled them to steal Microsoft API Keys from multiple Microsoft customers."
Microsoft says the defendants used stolen Azure OpenAI Service API keys belonging to US-based customers to carry out a "hacking-as-a-service" scheme. To pull off the scheme, according to the complaint, the defendants developed a client-side tool called de3u, along with software for processing and routing communications from de3u to Microsoft's systems.
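The complaint does not publish the relay software's code. In general terms, though, such relay software is a reverse proxy: it accepts a client's request and re-issues it to the upstream service with a credential attached, so the end user never handles the key directly. A generic sketch of that request-rebuilding pattern (the header name and upstream host are illustrative assumptions, not Azure's actual interface):

```python
# Generic reverse-proxy concept: take an incoming client request and
# rebuild it for the upstream service with a credential injected.
# The upstream host and "api-key" header name are invented for
# illustration and do not describe any real endpoint.
UPSTREAM_HOST = "upstream.example.com"

def build_upstream_request(client_request: dict, api_key: str) -> dict:
    """Rebuild a client request for forwarding to the upstream service."""
    forwarded = {
        "host": UPSTREAM_HOST,
        "path": client_request["path"],
        "body": client_request["body"],
        "headers": dict(client_request.get("headers", {})),
    }
    # The relay attaches the credential, so callers of the relay never
    # need to possess the key themselves.
    forwarded["headers"]["api-key"] = api_key
    return forwarded
```

This separation is what makes a "hacking-as-a-service" arrangement possible: the operators hold the stolen keys centrally while customers of the scheme only ever talk to the relay.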
De3u allowed users to leverage stolen API keys to generate images with DALL-E, one of the OpenAI models available to Azure OpenAI Service customers, without having to write their own code, Microsoft says. De3u also attempted to prevent the Azure OpenAI Service from revising the prompts used to generate images, according to the complaint, which can happen, for instance, when a text prompt contains words that trigger Microsoft's content filtering.
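The complaint does not describe Microsoft's filtering internals, and Azure's real content filters are model-based rather than keyword-based. Purely as an illustration of what it means for certain words in a prompt to "trigger" filtering, here is a toy keyword screen; the blocklist and function are invented for this example.

```python
# Toy illustration only: a simple keyword screen of the general kind the
# article alludes to. Azure OpenAI's actual content filtering is
# model-based and far more sophisticated; this blocklist is invented.
BLOCKED_TERMS = {"violence", "weapon"}

def prompt_allowed(prompt: str) -> bool:
    """Return False if the prompt contains any blocked term."""
    words = set(prompt.lower().split())
    return not (words & BLOCKED_TERMS)
```

Tooling that hides or rewrites a prompt before the service can inspect it is aimed at defeating exactly this kind of screening step, which is why the complaint calls it out.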
The repo containing the de3u project's code, hosted on GitHub, a company Microsoft owns, was no longer accessible at press time.
"These features, combined with Defendants' unlawful programmatic API access to the Azure OpenAI service, enabled Defendants to reverse engineer means of circumventing Microsoft's content and abuse measures," the complaint reads. "Defendants knowingly and intentionally accessed the Azure OpenAI Service protected computers without authorization, and as a result of such conduct caused damage and loss."
In a blog post published Friday, Microsoft says a court has authorized it to seize a website "instrumental" to the defendants' operation, which will allow the company to gather evidence, decipher how the defendants monetized their services, and disrupt additional technical infrastructure they have set up.
Microsoft also says it has "put in place countermeasures," which the company did not specify, and "added additional safety mitigations" to the Azure OpenAI Service in response to the activity it observed.