


Financial Associated Press (Cailian Press), September 19 (Editor: Xiaoxiang) — Since generative artificial intelligence took off this year, disputes over data security have been constant. According to new research from a cybersecurity company, Microsoft's AI research team accidentally exposed a large amount of private data on the software development platform GitHub several months ago, including more than 30,000 internal Microsoft Teams messages.
A team at cloud security company Wiz discovered in June that open-source training data published by Microsoft's research team on GitHub had been exposed. The cloud-hosted data leaked through a misconfigured link.
According to a Wiz blog post, Microsoft's AI research team published the open-source training data on GitHub via a shared-access link. However, the SAS (Shared Access Signature) token backing that link was misconfigured: it granted access to the entire storage account rather than just the intended files, and it gave users full control instead of read-only permissions. That means anyone with the link could delete or overwrite existing files.
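For context, a SAS token delegates access by HMAC-signing a description of what the holder may do, for how long, and on which resource. The sketch below (standard-library Python; a simplified stand-in for Azure's actual string-to-sign format, with all names hypothetical) illustrates the two knobs that were misconfigured in this incident: the resource scope and the permission set.

```python
import base64
import hashlib
import hmac
from datetime import datetime, timedelta, timezone


def make_sas_like_token(account_key_b64: str, resource_path: str,
                        permissions: str, valid_hours: int = 1) -> str:
    """Sign a token scoped to one resource with explicit permissions.

    permissions: e.g. "r" for read-only. The leaked token effectively
    carried full control ("racwdl"-style rights) over the whole
    storage account instead.
    """
    expiry = (datetime.now(timezone.utc)
              + timedelta(hours=valid_hours)).strftime("%Y-%m-%dT%H:%MZ")
    # Simplified string-to-sign: the real Azure format includes more
    # fields (start time, IP range, protocol, service version, ...).
    string_to_sign = "\n".join([permissions, expiry, resource_path])
    key = base64.b64decode(account_key_b64)
    sig = base64.b64encode(
        hmac.new(key, string_to_sign.encode(), hashlib.sha256).digest()
    ).decode()
    return f"sp={permissions}&se={expiry}&sr={resource_path}&sig={sig}"


# Least-privilege token: read-only, scoped to a single (hypothetical)
# container, short-lived -- the opposite of the leaked configuration.
token = make_sas_like_token(
    base64.b64encode(b"example-account-key").decode(),
    "/ai-research-data/models",
    permissions="r",
    valid_hours=1,
)
```

A token scoped like this would have limited the blast radius: a signature over `permissions="r"` and one container path cannot be replayed to write, delete, or reach the rest of the storage account.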
According to Wiz, the 38TB of exposed data included disk backups of two Microsoft employees' personal computers. These backups contained passwords and keys for Microsoft services, along with more than 30,000 internal Microsoft Teams messages from 359 Microsoft employees.
Wiz researchers pointed out that open data sharing is an essential part of AI training, but sharing data at this scale can also expose companies to significant risk if it is not handled correctly.
Wiz CTO and co-founder Ami Luttwak said Wiz reported the issue to Microsoft in June, and Microsoft quickly removed the exposed data. The Wiz research team found the data cache while scanning the internet for misconfigured storage.
In response, a Microsoft spokesperson said: "We have confirmed that no customer data was exposed and no other internal services were compromised."
In a blog post published on Monday, Microsoft said it has investigated and remediated an incident involving a Microsoft employee who shared a URL to an open source artificial intelligence learning model in a public GitHub repository. Microsoft said the exposed data in the storage account included backups of workstation configuration files of two former employees, as well as internal Microsoft Teams information about the two former employees and their colleagues.



