
Microsoft AI researchers accidentally leaked 38TB of internal data, including private keys and password information

PHPz
2023-09-21 17:29:01

Source: IT Home

Wiz Research announced today that it had discovered a data leak in Microsoft AI's GitHub repository, caused by a misconfigured SAS (IT Home note: Shared Access Signature) token.


Specifically, Microsoft's artificial intelligence research team published open-source training data on GitHub but accidentally exposed 38TB of additional internal data, including disk backups of the personal computers of several Microsoft employees. These backups contained confidential information, private keys, passwords, and more than 30,000 internal Microsoft Teams messages.


The GitHub repository provides open-source code and AI models for image recognition, and visitors are directed to download the models from an Azure Storage URL. However, Wiz found that the URL's permissions were misconfigured: instead of granting access only to the intended files, it granted access to the entire storage account, thereby exposing additional private data.
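The report does not reproduce the actual URL or token, so the sketch below is illustrative only: it uses a made-up account name, container, and (redacted) signature to show how an account-scoped SAS URL encodes its scope, permissions, and expiry in the query string, and how a few lines of Python can inspect them.

```python
# Illustrative only: the real URL and token from the incident are not public.
# This is a hypothetical SAS URL in the shape the article describes, with the
# query-string fields that encode its scope, permissions, and expiry.
from urllib.parse import urlparse, parse_qs

sas_url = (
    "https://exampleaccount.blob.core.windows.net/models/model.ckpt"
    "?sv=2020-08-04"             # storage service version
    "&ss=b"                      # services covered: b = blob
    "&srt=sco"                   # resource types: service, container, object (account-wide)
    "&sp=rwdlac"                 # permissions: read, write, delete, list, add, create
    "&se=2051-10-06T00:00:00Z"   # expiry date
    "&sig=REDACTED"              # HMAC signature derived from the account key
)

params = parse_qs(urlparse(sas_url).query)

# srt=sco means the token covers the whole storage account, not just one blob,
# and an sp value containing w/d means holders can overwrite or delete data --
# the kind of over-broad exposure the article describes.
print("scope (srt):", params.get("srt"))
print("permissions (sp):", params.get("sp"))
print("expiry (se):", params.get("se"))
```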

According to the report, the URL had been exposing this data since 2020. Furthermore, it was configured to grant "Full Control" rather than "Read-Only" permissions, meaning that anyone who knew the URL could potentially delete, replace, or inject malicious content.
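The report does not say how the token was generated, so the following is a minimal sketch, assuming the azure-storage-blob Python SDK and placeholder account names and keys. It contrasts an over-broad, long-lived account SAS (roughly the kind of token described above) with a read-only, short-lived SAS scoped to a single blob.

```python
# Hedged sketch using the azure-storage-blob SDK (pip install azure-storage-blob).
# The account name, key, container, and blob names are placeholders, not values
# from the incident.
from datetime import datetime, timedelta, timezone

from azure.storage.blob import (
    AccountSasPermissions,
    BlobSasPermissions,
    ResourceTypes,
    generate_account_sas,
    generate_blob_sas,
)

ACCOUNT_NAME = "exampleaccount"
ACCOUNT_KEY = "<storage-account-key>"

# Risky: an account-level SAS with write/delete rights and a decades-long
# expiry -- comparable in scope to the exposure the article describes.
over_broad_sas = generate_account_sas(
    account_name=ACCOUNT_NAME,
    account_key=ACCOUNT_KEY,
    resource_types=ResourceTypes(service=True, container=True, object=True),
    permission=AccountSasPermissions(read=True, write=True, delete=True, list=True),
    expiry=datetime(2051, 10, 6, tzinfo=timezone.utc),
)

# Safer: a read-only SAS scoped to a single blob with a short lifetime.
read_only_sas = generate_blob_sas(
    account_name=ACCOUNT_NAME,
    container_name="models",
    blob_name="model.ckpt",
    account_key=ACCOUNT_KEY,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=24),
)

download_url = (
    f"https://{ACCOUNT_NAME}.blob.core.windows.net/models/model.ckpt?{read_only_sas}"
)
```

Scoping a shared token to a single blob with read-only permissions and a short expiry limits the blast radius if the URL ever leaks, which is the general lesson of this incident.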

Wiz said it reported the issue to Microsoft on June 22, 2023; two days later, on June 24, Microsoft revoked the SAS token. Microsoft said it completed its investigation into the potential organizational impact on August 16.

The full timeline of the incident is as follows:

July 20, 2020: SAS token first committed to GitHub; expiry set to October 5, 2021

October 6, 2021: SAS token expiry extended to October 6, 2051

June 22, 2023: Wiz Research discovered the issue and reported it to Microsoft

June 24, 2023: Microsoft invalidated the SAS token

July 7, 2023: SAS token replaced on GitHub

August 16, 2023: Microsoft completed its internal investigation into the potential impact

September 18, 2023: Wiz Research publicly disclosed the incident


Statement: This article is reproduced from sohu.com.