What came to light?
The URL exposed 38 terabytes of confidential internal Microsoft data, including secret keys, passwords, and internal Microsoft Teams messages from hundreds of Microsoft employees, according to Wiz, the cloud security firm that found the vulnerability. The exposed data also included disk backups of two Microsoft employees' workstations.
How did it take place?
The data was exposed because a Microsoft employee created the URL using a Shared Access Signature (SAS) token. Azure SAS tokens let users build shareable links that grant access to Azure Storage resources. Instead of a service SAS token, which grants access only to specific resources, the employee used an account SAS token, which grants access to every resource in the storage account.
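To illustrate the difference, here is a minimal Python sketch using the azure-storage-blob SDK that contrasts the two token types; the account name, key, container, and blob names are placeholders invented for the example, not details from the incident.

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import (
    AccountSasPermissions,
    BlobSasPermissions,
    ResourceTypes,
    generate_account_sas,
    generate_blob_sas,
)

ACCOUNT_NAME = "examplestorageacct"  # placeholder storage account
ACCOUNT_KEY = "<account-key>"        # placeholder account key
expiry = datetime.now(timezone.utc) + timedelta(hours=1)

# Account SAS: a single token grants access to every container and blob
# in the storage account -- the overly broad kind of token at issue here.
account_sas = generate_account_sas(
    account_name=ACCOUNT_NAME,
    account_key=ACCOUNT_KEY,
    resource_types=ResourceTypes(service=True, container=True, object=True),
    permission=AccountSasPermissions(read=True, list=True),
    expiry=expiry,
)

# Service SAS: the token is scoped to one blob with one permission.
blob_sas = generate_blob_sas(
    account_name=ACCOUNT_NAME,
    container_name="ai-training-data",  # placeholder container
    blob_name="dataset.zip",            # placeholder blob
    account_key=ACCOUNT_KEY,
    permission=BlobSasPermissions(read=True),
    expiry=expiry,
)

print(f"https://{ACCOUNT_NAME}.blob.core.windows.net/ai-training-data/dataset.zip?{blob_sas}")
```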
According to Wiz, the URL was first made public in 2020 and was discovered on June 22, 2023. On June 24, two days after Wiz reported its findings to Microsoft, Microsoft revoked the SAS token. On August 16, Microsoft announced that it had completed its investigation into the potential organizational impact.
What effect did it have?
According to Microsoft, the issue did not expose any customer data or jeopardize any other internal systems. Wiz cautioned, however, that the exposure could have had serious consequences, including supply chain attacks, ransomware, and data theft. Wiz also noted that the exposure could have put the validity and integrity of Microsoft's AI models and research at risk.
What have we learned from this?
The incident highlights the challenges and risks of securing large data sets, particularly in the fast-moving field of artificial intelligence. It also underscores the importance of creating and distributing SAS tokens in line with security guidelines and best practices. For external sharing, Wiz recommended using service SAS tokens with a stored access policy, or user delegation SAS tokens, instead of account SAS tokens; a sketch of the latter approach follows.
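As a rough sketch of that recommendation, the Python snippet below (azure-storage-blob and azure-identity) generates a user delegation SAS, which is signed with Microsoft Entra credentials rather than the storage account key and expires with the short-lived delegation key. The account, container, and blob names are placeholders for illustration only.

```python
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobSasPermissions, BlobServiceClient, generate_blob_sas

ACCOUNT_NAME = "examplestorageacct"  # placeholder storage account
ACCOUNT_URL = f"https://{ACCOUNT_NAME}.blob.core.windows.net"

# Authenticate with Microsoft Entra ID instead of the account key.
service_client = BlobServiceClient(account_url=ACCOUNT_URL, credential=DefaultAzureCredential())

# Request a short-lived user delegation key; any SAS signed with it
# becomes invalid when the key expires or is revoked.
start = datetime.now(timezone.utc)
expiry = start + timedelta(hours=1)
delegation_key = service_client.get_user_delegation_key(
    key_start_time=start,
    key_expiry_time=expiry,
)

# User delegation SAS scoped to a single blob with read-only permission.
sas_token = generate_blob_sas(
    account_name=ACCOUNT_NAME,
    container_name="ai-training-data",  # placeholder container
    blob_name="dataset.zip",            # placeholder blob
    user_delegation_key=delegation_key,
    permission=BlobSasPermissions(read=True),
    expiry=expiry,
)

print(f"{ACCOUNT_URL}/ai-training-data/dataset.zip?{sas_token}")
```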