
How to use Nightshade to protect artwork from generative AI

PHPz (forwarded) | 2024-03-14 22:55

Translator: Chen Jun

Reviewer: Chonglou

The artificial intelligence (AI) revolution now underway has swept across every industry. Its most visible face is interactive human-computer dialogue: AI algorithms can not only generate text that reads like human language but also create images and videos from a word or phrase. However, the training data behind these tools, especially text-to-image generators such as DALL-E and Midjourney, often comes from copyrighted sources.


In the digital realm, preventing generative AI tools from being trained on copyrighted images is a difficult task. Artists of all stripes have worked at many levels to keep their work out of AI training data sets, but protecting intellectual property raises complex issues, and the rapid evolution of the digital world makes supervision and enforcement harder. Artists may take technical measures, such as adding watermarks or digital signatures, to assert the originality and ownership of their works, but these measures are not always effective.

Now, the advent of Nightshade promises to change that status quo. Nightshade is a free AI tool that helps artists protect their copyrights by "poisoning" the output of generative AI tools trained on their work. It gives creators greater control over how their works are used and a new way to deal with potential copyright infringement, so they can publish their art with more confidence and peace of mind.

What is AI poisoning?

Conceptually, AI poisoning refers to the act of "poisoning" the training data set of an AI algorithm. This is akin to deliberately feeding false information to an AI, causing the trained model to malfunction and fail to recognize the image correctly. Technically, tools like Nightshade alter the pixels of a digital image so that a model trained on it perceives something entirely different, while to the human eye the image remains essentially identical to the original.
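The idea can be illustrated with a toy sketch. To be clear, this is not Nightshade's actual algorithm (whose details are not reproduced here); it only demonstrates the underlying principle, namely that a per-pixel change too small to see can still measurably shift what a simple model "sees". The 8x8 image and the linear "classifier" below are invented for illustration.

```python
import numpy as np

# Toy illustration of imperceptible, targeted pixel perturbation.
# NOT Nightshade's real method -- just the general adversarial idea.

rng = np.random.default_rng(0)

# A fake 8x8 grayscale "image" with pixel values in [0, 1].
image = rng.random((8, 8))

# A toy linear model: score > 0 means "cat", score <= 0 means "not cat".
weights = rng.standard_normal((8, 8))
original_score = float(np.sum(weights * image))

# Craft a perturbation of at most 2/255 per pixel, pushed in the
# direction that lowers the model's score (a sign-gradient step).
epsilon = 2 / 255
delta = -epsilon * np.sign(weights)
poisoned = np.clip(image + delta, 0.0, 1.0)
poisoned_score = float(np.sum(weights * poisoned))

# Each pixel changes by at most ~0.8% of its range (invisible to a
# human viewer), yet the model's score moves in a chosen direction.
print(f"max pixel change: {np.max(np.abs(poisoned - image)):.4f}")
print(f"score before: {original_score:.3f}, after: {poisoned_score:.3f}")
```

Scaled up across thousands of images and a far more sophisticated perturbation strategy, this is the kind of effect that lets poisoned samples degrade a generative model's training without visibly altering the art itself.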

If you upload a doctored picture of a cat to the Internet, to a human it may look like an ordinary cat. But an AI system trained on it may fail to identify the cat at all, leading to confusion and misclassification. This illustrates how critical data accuracy and integrity are when training AI systems: erroneous or deceptive data degrades a model's learning and performance, so ensuring data quality and authenticity is an essential step in avoiding misleading results.

Moreover, because AI training operates at scale, a sufficient number of fake or poisoned image samples will degrade the model's understanding and compromise its ability to generate accurate images from a given prompt.

Although generative AI is still developing rapidly, for now, once poisoned data makes it into a model's training corpus, the errors it introduces subtly damage subsequent iterations of that model. This has the practical effect of protecting original digital works: creators who do not want their images used in AI data sets can effectively prevent them from being ingested by generative AI without permission.

Currently, some platforms have begun offering creators the option of excluding their works from AI training data sets, and AI model trainers, for their part, need to take such opt-outs seriously.

Nightshade's approach is quite different from that of other digital art protection tools such as Glaze. Glaze prevents AI algorithms from imitating a specific artist's style, while Nightshade changes how an image appears from the AI's perspective. Both tools were developed by the team of University of Chicago computer science professor Ben Zhao.

How to use Nightshade

While the tool's creators recommend using Nightshade together with Glaze, it also works as a standalone tool to protect your work. Overall, using it is not complicated: you can protect your images with Nightshade alone in just a few steps. Before you get started, though, here are three things to keep in mind:

1. Nightshade works only on Windows and macOS, and its GPU support is limited: it requires at least 4 GB of VRAM, and non-NVIDIA GPUs and Intel Macs are not currently supported. Fortunately, the Nightshade team provides a link to the list of supported NVIDIA GPUs: https://www.php.cn/link/719e427d3b21a35b8cdcd2d88db6ca11 (note that GTX and RTX GPUs appear in the "CUDA-supported GeForce and TITAN Products" section). Alternatively, you can run Nightshade on the CPU, but performance will be reduced.
2. If you are using a GTX 1660, 1650, or 1550, a bug in the PyTorch library will prevent Nightshade from launching or running properly. The team behind Nightshade may fix this in the future by migrating from PyTorch to TensorFlow, but there is currently no workaround, and the problem also affects the Ti variants of those cards. In my testing, even after granting the program administrator access on a Windows 11 machine, I still had to wait several minutes before it opened; hopefully your experience will be different.
3. If your artwork contains large areas of solid shapes or flat backgrounds, you may see artifacts in the output. You can mitigate this by reducing the poison intensity.
To protect an image with Nightshade, perform the following steps. Keep in mind that although this guide uses the Windows version, the process also applies to the macOS version.

1. Download the Windows or macOS version from the Nightshade download page.
2. Nightshade is distributed as an archive, so no installation is required. After the download completes, simply unzip the ZIP folder and double-click Nightshade.exe to run it.
3. In the window that opens, click the "Select" button in the upper-left corner and choose the image you want to protect. Note that you can select multiple images at once for batch processing.
4. Depending on your preference, use the sliders to adjust the Intensity and Render Quality. Higher values produce a stronger poisoning effect but may also introduce artifacts into the output image.
5. Click the "Save As" button under the "Output" section to choose a destination for the output file.
6. Finally, click the "Run Nightshade" button at the bottom to run the program and poison the image.
You can also set a "poison" tag. If you do not select a tag manually, Nightshade will automatically detect and suggest a single-word tag; you can change it if the suggestion is incorrect or too general. Keep in mind that this setting is only available when Nightshade is processing a single image.

If all goes well, you will end up with an image that looks identical to the original to the human eye but that AI algorithms perceive as something completely different. This means your artwork is protected from generative AI.

Translator introduction

Julian Chen, 51CTO community editor, has more than ten years of experience in IT project implementation. He specializes in managing internal and external resources and risks, and focuses on sharing knowledge and experience in network and information security.

Original title: How to Use Nightshade to Protect Your Artwork From Generative AI


Statement: This article is reproduced from 51cto.com. In case of infringement, please contact admin@php.cn for deletion.