A Microsoft employee accidentally exposed 38 TB of sensitive data on GitHub • The Register

A Microsoft employee accidentally exposed 38 terabytes of private data while publishing a large amount of open source AI training data on GitHub, according to Wiz security researchers, who spotted the leaky account and reported it to the Windows giant.

Redmond played down the error in a post on Monday, saying its disclosure was merely "knowledge sharing" to help customers avoid similar mistakes. That's despite Wiz's finding that the leaked data included private keys, passwords, and more than 30,000 internal Microsoft Teams messages, as well as backup data from two employees' workstations.

"No customer data was exposed and no other internal services were compromised due to this issue," the Microsoft Security Response Center (MSRC) team said. "No customer action is required in response to this issue."

In a post published Monday, Wiz researchers Hillai Ben-Sasson and Ronny Greenberg detailed what happened. While scanning for misconfigured storage containers, they came across a GitHub repository belonging to Microsoft's AI research team that provides open source code and machine-learning models for image recognition.

This repository contained a URL with an overly permissive Shared Access Signature (SAS) token for an internal Microsoft-owned Azure Storage account holding private data.

A SAS token is a signed URL that grants some level of access to Azure Storage resources. The user can tune that access anywhere from read-only to full control, and in this case the SAS token was misconfigured with full-control permissions.
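The general mechanism can be sketched in a few lines: a secret account key signs a string binding a resource to a permission set and an expiry, and the signature rides along in the URL's query string. This is a simplified, stdlib-only illustration of the idea, not Azure's exact canonical string-to-sign or query-parameter format.

```python
import base64
import hashlib
import hmac
from urllib.parse import urlencode

def make_signed_url(base_url: str, permissions: str, expiry: str, key: bytes) -> str:
    """Build a simplified SAS-style signed URL.

    Illustrative only: real Azure SAS tokens sign a richer canonical
    string and carry more query parameters than shown here.
    """
    # The string-to-sign binds the grant to a resource, a permission
    # set (e.g. "r" for read-only, "racwdl" for full control) and an
    # expiry, so none of them can be altered without breaking the MAC.
    string_to_sign = "\n".join([permissions, expiry, base_url])
    sig = base64.b64encode(
        hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    ).decode("utf-8")
    query = urlencode({"sp": permissions, "se": expiry, "sig": sig})
    return f"{base_url}?{query}"

# A read-only grant and a full-control grant over the same blob differ
# only in the signed permission string; the server honors whatever the
# signature covers. (Hypothetical account and key for illustration.)
read_only = make_signed_url(
    "https://example.blob.core.windows.net/models/weights.bin",
    "r", "2023-09-30T00:00:00Z", b"account-key")
full_control = make_signed_url(
    "https://example.blob.core.windows.net/models/weights.bin",
    "racwdl", "2023-09-30T00:00:00Z", b"account-key")
```

Because the signature lives entirely in the URL, anyone who sees the URL holds the grant, which is why publishing one in a public repo is equivalent to publishing the access itself.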

Not only did this give the Wiz team – and potentially more malicious snoops – the ability to view everything in the storage account; they could also delete or change existing files.

“Our scan shows that this account contained 38 TB of additional data — including backups of Microsoft employees’ personal computers,” Ben-Sasson and Greenberg said. “The backups contained sensitive personal data, including passwords to Microsoft services, secret keys, and more than 30,000 internal Microsoft Teams messages from 359 Microsoft employees.”

Microsoft, for its part, says the personal computer backups belonged to two former employees. After being alerted on June 22, Redmond says it revoked the SAS token to cut off external access to the storage account, plugging the leak on June 24.

“Further investigation was then conducted to understand any potential impact on our customers and/or business continuity,” the MSRC report said. “Our investigation has concluded that there is no risk to customers as a result of this exposure.”

Redmond also recommended a number of SAS best practices in its write-up to minimize the risk of overly permissive tokens. These include scoping SAS URLs to the smallest set of required resources, and limiting permissions to only those the application actually needs.

There is also a setting for expiration time, and Microsoft recommends one hour or less for SAS URLs. That's all good advice; it's just a shame Redmond didn't eat its own dog food in this case.
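The one-hour-or-less advice above can be made concrete with a short sketch of setting and enforcing a signed expiry (`se`). The helper names are hypothetical and this is not Azure's SDK; it only shows the shape of the check the storage service performs.

```python
from datetime import datetime, timedelta, timezone

ISO = "%Y-%m-%dT%H:%M:%SZ"

def one_hour_expiry(now):
    """Return an ISO-8601 `se` value one hour out, matching
    Microsoft's recommendation of an hour or less for SAS URLs."""
    return (now + timedelta(hours=1)).strftime(ISO)

def is_expired(expiry, now):
    # The storage service refuses any request presented after `se`,
    # so a leaked short-lived token goes stale on its own.
    return now >= datetime.strptime(expiry, ISO).replace(tzinfo=timezone.utc)

issued = datetime(2023, 9, 18, 12, 0, tzinfo=timezone.utc)
se = one_hour_expiry(issued)   # "2023-09-18T13:00:00Z"
```

With a short expiry, a token accidentally committed to a public repo expires within the hour; the token Wiz found, by contrast, stayed live for years.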

Finally, Redmond promises to do better: “Microsoft is also continuously improving our suite of detection and scanning tools to proactively identify such instances of overprovisioned SAS URLs and strengthen our security posture by default.”

Of course, this isn't Microsoft's only key-related security stumble in recent months.

In July, Chinese spies stole a secret Microsoft key and used it to break into US government email accounts. Wiz researchers looked into that security issue, too. ®
