
Massive data leak! Microsoft AI team accidentally leaked more than 30,000 pieces of internal information

王林 (forwarded)
2023-09-19 17:33:01

According to a research report released on September 18 local time, cloud security startup Wiz Research found that Microsoft's artificial intelligence (AI) research team had accidentally exposed a large cache of private data on GitHub. The exposure was caused by a misconfigured SAS (Shared Access Signature) token. While publishing open-source training data on GitHub, the AI research team inadvertently exposed 38TB of additional internal data, including disk backups of the workstations of two Microsoft employees. The backups contained confidential information, private keys, passwords, and more than 30,000 internal Microsoft Teams messages. Microsoft responded the same day that it had revoked the SAS token and completed an internal investigation into the potential impact, stating that no customer data was exposed and no other Microsoft services were put at risk as a result of the issue.
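For context on the root cause: an Azure SAS token grants anyone holding the URL direct access to storage resources, so an over-broad token (account-wide scope, broad permissions, far-future expiry) effectively publishes everything behind it. The sketch below, using the azure-storage-blob Python SDK, shows how a narrowly scoped token can be issued instead: read-only, limited to a single container, and short-lived. The account, container, and key names are placeholders for illustration, not Microsoft's actual configuration.

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import ContainerSasPermissions, generate_container_sas

# Placeholder values -- illustrative only, not the configuration involved in the incident.
ACCOUNT_NAME = "exampledatasets"
CONTAINER_NAME = "public-training-data"
ACCOUNT_KEY = "<storage-account-key>"

# Issue a SAS token scoped to one container, read/list only, expiring in 48 hours.
# The leaked token, by contrast, reportedly covered far more of the storage
# account, with broad permissions and a distant expiry date.
sas_token = generate_container_sas(
    account_name=ACCOUNT_NAME,
    container_name=CONTAINER_NAME,
    account_key=ACCOUNT_KEY,
    permission=ContainerSasPermissions(read=True, list=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=48),
)

# The resulting URL grants access only to blobs under this one container,
# and stops working once the expiry passes.
share_url = f"https://{ACCOUNT_NAME}.blob.core.windows.net/{CONTAINER_NAME}?{sas_token}"
print(share_url)
```

Scoping the token this way limits the blast radius of an accidental share: even if the URL leaks, it exposes only the intended container, with read-only access, for a bounded window.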


Screenshot source: Wiz Research official website


Screenshot source: Microsoft official website


Statement:
This article is reproduced from sohu.com. If there is any infringement, please contact admin@php.cn for deletion.