Microsoft exposes 38TB in AI data leak


Key Highlights:

  1. The SAS token provided unrestricted access to an entire Azure storage account, extending far beyond its intended purpose of sharing open-source AI datasets.

  2. Wiz Research’s investigation revealed computer backups within the exposed data, containing sensitive material such as passwords, encryption keys, and more than 30,000 internal Teams messages.

  3. Microsoft addressed the incident in a blog post, affirming that no customer data was compromised and that no customer action is required.


While the incident did not result in widespread repercussions, it serves as a vital reminder for organizations to bolster their defenses against AI-related vulnerabilities. As the deployment of AI increasingly relies on vast datasets, the security measures guarding this data must also evolve and expand in scope.


Microsoft AI researchers mistakenly leaked 38TB of company data

Microsoft’s AI researchers inadvertently disclosed 38TB of company data while attempting to provide open-source code and AI models for image recognition to fellow researchers via GitHub. The breach occurred through a Shared Access Signature (SAS) link that granted access to an entire Azure storage account. Within the exposed data, cybersecurity firm Wiz identified backups of Microsoft employees’ computers, containing passwords for Microsoft services, confidential keys, and more than 30,000 internal Teams messages from numerous Microsoft employees.

Microsoft, in its incident report, states that “no customer data was exposed, and no other internal services were compromised.” The link was included alongside the files deliberately, so that interested researchers could download the pre-trained models.

Microsoft’s researchers leveraged an Azure feature known as “SAS tokens,” enabling the creation of shareable links that extend access to data within their Azure Storage account. Users have the flexibility to specify which data can be accessed through SAS links, whether it’s an individual file, an entire container, or their entire storage. In this case, the researchers shared a link providing access to the entire storage account.
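A SAS token is simply a signed query string appended to a storage URL, so its scope, permissions, and lifetime can be read straight from its parameters. A minimal sketch, using only the Python standard library (the URL and signature below are invented for illustration, not the leaked values):

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical SAS URL for a single blob. Key query parameters:
#   sr = resource scope ("b" = one blob, "c" = a whole container)
#   sp = granted permissions ("r" = read, "w" = write, "l" = list, ...)
#   se = expiry time of the signature
sas_url = (
    "https://examplestorage.blob.core.windows.net/models/model.ckpt"
    "?sv=2021-08-06&sr=b&sp=r&se=2023-10-01T00:00:00Z&sig=FAKE"
)

params = parse_qs(urlparse(sas_url).query)
scope = params["sr"][0]        # "b": this link covers one blob only
permissions = params["sp"][0]  # "r": read-only
expiry = params["se"][0]       # when the link stops working

print(f"scope={scope} permissions={permissions} expiry={expiry}")
```

Account-level SAS tokens use different parameters (`ss` for services, `srt` for resource types) instead of `sr`, and grant access across the whole storage account. A token of that shape, rather than the narrowly scoped one sketched above, is what the researchers’ link carried.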

On June 22, Wiz identified and promptly reported the security concern to Microsoft, prompting the company to revoke the SAS token by June 23. Microsoft clarified that it conducts routine scans of all its public repositories, but in this instance, its system erroneously categorized the specific link as a “false positive.” Subsequently, the company has rectified this issue, enhancing its system’s ability to identify overly permissive SAS tokens in the future.

While the specific link flagged by Wiz has been addressed, misconfigured SAS tokens remain a potential source of data breaches and serious privacy problems. Microsoft acknowledges the importance of “properly creating and managing SAS tokens” and has published a set of best practices for their use.
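The kind of over-permissioning that such scans are meant to catch can be approximated with simple heuristics on a token’s query parameters. A rough sketch, assuming the standard SAS query-string conventions; the thresholds are illustrative choices, not Microsoft’s actual detection rules:

```python
from urllib.parse import parse_qs
from datetime import datetime, timezone

def sas_warnings(query: str) -> list[str]:
    """Flag signs of an overly permissive SAS token (heuristics only)."""
    p = {k: v[0] for k, v in parse_qs(query).items()}
    warnings = []
    # Account-level tokens ("ss"/"srt") expose far more than one blob or container.
    if "ss" in p or "srt" in p:
        warnings.append("account-level SAS: grants access beyond a single blob/container")
    # Write, delete, add, or create permissions are rarely needed for sharing datasets.
    if any(c in p.get("sp", "") for c in "wdac"):
        warnings.append("token grants more than read/list access")
    # Very long expiry windows keep a leaked link usable for years.
    expiry = p.get("se")
    if expiry:
        se = datetime.fromisoformat(expiry.replace("Z", "+00:00"))
        if (se - datetime.now(timezone.utc)).days > 90:
            warnings.append("expiry more than 90 days out")
    return warnings

# Example: an account-level, full-permission, long-lived token (values invented),
# roughly the shape of token that scoped the leaked link to the whole account.
print(sas_warnings("sv=2021-08-06&ss=bfqt&srt=sco&sp=rwdlac&se=2051-01-01T00:00:00Z&sig=FAKE"))
```

A narrowly scoped read-only link, such as `sr=b&sp=r` with a short expiry, passes this check with no warnings, which is the shape the researchers’ sharing link should have had.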

