[ITBEAR] News on August 11th. Recently, a new discovery in the security field has attracted widespread attention. According to Futurism, the Copilot AI built into Microsoft Windows systems has security vulnerabilities that may be exploited by criminals, leading to the leakage of sensitive enterprise data, and even turning into a powerful phishing attack tool.
1. Zenity Research
- Zenity’s Michael Bargury disclosed this finding at the Black Hat security conference.
- Exploiting Copilot's vulnerabilities, attackers can easily obtain contact information and send spoofed emails.
2. Researchers’ demonstration
- Researchers demonstrated how attackers can use vulnerabilities to modify bank transfer payee information.
- Attackers can also obtain sensitive data and launch phishing attacks.
- The attacker induced Copilot to leak the emails of the people copied in the conversation.
- Copilot assisted in creating highly credible phishing emails.
3. Copilot’s security risks
- Copilot Studio allows enterprises to customize chatbots.
- Chatbots access corporate data, causing security risks.
- Hackers can bypass Copilot's protective measures through prompt injection.
- Malicious data becomes the attack surface for prompt injection.
- Bargury emphasized that if a robot is useful, it will be fragile; if it is not fragile, it is useless. -->
The above is the detailed content of Microsoft Copilot AI vulnerability? Can hackers easily steal company secrets?. For more information, please follow other related articles on the PHP Chinese website!
Statement:The content of this article is voluntarily contributed by netizens, and the copyright belongs to the original author. This site does not assume corresponding legal responsibility. If you find any content suspected of plagiarism or infringement, please contact admin@php.cn