Undeleted Files

Nightshade: Legal Poison Disguised as Protection for Artists

As I stated in my previous article, generative AI has continued to be a contentious subject for many artists, and various schemes have appeared to ward off model training. That article, however, only discussed Glaze and how an artist could be negatively affected by using it. Now, I shall discuss the potential legal and ethical risks of a relatively new protection scheme known as Nightshade. It’s worth noting that there is no official release of the Nightshade software yet; however, the paper has already been published, and there’s a nice write-up on it here.

What is Nightshade?

In short, Nightshade is an algorithm that modifies an image so that an AI model trained on it will falsely associate the caption paired with the image with a concept that is not actually in the image. For example, a normal-looking image of a dog could be modified to teach a model to produce an image of a car even when a dog is requested. The key difference between Nightshade and older schemes such as Glaze is that Nightshade is offensive rather than defensive: instead of merely preventing a model from mimicking an artist’s style, it actively corrupts any model trained on the protected images.
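To make the mechanism concrete, here is a minimal sketch of the kind of optimization such a poisoning scheme performs: nudge an image toward a different concept in a model’s feature space while keeping it visually unchanged. This is my own illustrative reconstruction based on the published description, not the Nightshade team’s code; the feature extractor, the perturbation budget, and the optimizer choice are all assumptions.

    # A hypothetical sketch of feature-space poisoning (not Nightshade's actual code).
    import torch
    import torch.nn.functional as F

    def poison_image(image, anchor, feature_extractor,
                     epsilon=8 / 255, steps=200, lr=0.01):
        """Perturb `image` (e.g., a photo of a dog) so its features match
        `anchor` (e.g., a photo of a car), while keeping the change small.
        Both inputs are CHW tensors with values in [0, 1];
        `feature_extractor` is any differentiable image encoder.
        """
        delta = torch.zeros_like(image, requires_grad=True)
        optimizer = torch.optim.Adam([delta], lr=lr)
        # Features of the anchor concept the model should be tricked into learning.
        target = feature_extractor(anchor.unsqueeze(0)).detach()

        for _ in range(steps):
            optimizer.zero_grad()
            poisoned = (image + delta).clamp(0.0, 1.0)
            # Pull the poisoned image toward the anchor in feature space.
            loss = F.mse_loss(feature_extractor(poisoned.unsqueeze(0)), target)
            loss.backward()
            optimizer.step()
            # Project the perturbation back into a small L-infinity ball so the
            # image still looks like the original to a human viewer.
            with torch.no_grad():
                delta.clamp_(-epsilon, epsilon)

        return (image + delta).detach().clamp(0.0, 1.0)

A model then trained on many such images, each still captioned “dog,” would gradually learn to associate that caption with car-like features, which is exactly the misbehavior described above.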

Of course, the name Nightshade is designed to evoke images of the family of poisonous plants.

These protection schemes were born out of a desire by artists to protect their work from being used for training, since copyright does not seem effective in this case. Nightshade is the latest work from the same team that produced Glaze, but its offensive design makes its usage much more questionable.

As the purpose of Nightshade is to cause damage to AI models, its usage may conflict with various laws against abuse of computer systems. For example, the United States Computer Fraud and Abuse Act [1] covers intentionally causing damage to a computer system:

(a) Whoever

[. . .]

  (5)
    (A) knowingly causes the transmission of a program, information, code, or command, and as a result of such conduct, intentionally causes damage without authorization, to a protected computer;

Note that the statute defines “damage” as “any impairment to the integrity or availability of data, a program, a system, or information.” Under that definition, it is clear that the only purpose of Nightshade is to cause damage to generative AI systems. The statute does require intent, however: someone who accidentally submitted a Nightshade-protected image to a model they don’t run would not have broken this law.

In addition to legal statutes, the terms of service of many websites prohibit uploading or sending content designed to damage systems, whether their own or others’. For example, many artists upload to DeviantArt, whose terms of service [2] state as follows:

You agree not to use the service:

[. . .]

  1. to upload, post, or otherwise transmit any material which is likely to cause harm to DeviantArt or anyone else’s computer systems, including but not limited to that which contains any virus, code, worm, data or other files or programs designed to damage or allow unauthorized access to the Service which may cause any defect, error, malfunction or corruption to the Service;

Notably, the terms expressly prohibit posting “any [. . .] other files [. . .] designed to damage [. . .] the Service which may cause any defect, error, malfunction or corruption to the Service.” Since DeviantArt has launched its own AI generation tool, it’s reasonable to assume that uploading a Nightshade-protected image would be prohibited. Even if not, the terms still generally prohibit uploading material that could harm anyone else’s computer systems.

At this point, you may think: the scraping bots aren’t allowed to be scraping, so it’s their fault if they retrieve poisoned images. That excuse is a shaky one. If you get hacked by a dumb script kiddie, it still isn’t legal for you to hack them back. Serving ZIP bombs to annoying, abusive bots may be funny, but it’s technically illegal, even if you likely won’t get in trouble for it.

There’s also the fact that you agreed to the terms of service when you signed up for the site. If you agree that the site is allowed to use your content, and your uploaded images poison an internal model the site uses, the owners may not be very pleased. This can lead to getting banned from the site at best or costly litigation at worst.

DeviantArt isn’t alone; most sites’ terms of service include similar provisions banning uploads of content designed to disrupt systems. The reason should be obvious: legality aside, it’s just bad manners.

Then you might ask: what if I host my own site? This is a much murkier case that I can’t really comment on, beyond noting that there could still be trouble if you don’t let people know you’re using Nightshade.

Where This Leaves Nightshade’s Audience

Because of the above, it is plausible that artists could face both civil lawsuits and criminal prosecution in a variety of situations, depending on how and where they use Nightshade. I am personally a fan of offensive security tools and prefer that they be available to the public, but it is important to use them responsibly and ethically. While it may feel good to “strike back” and cause damage to large generative AI models, it is certainly not worth the legal risk.

While courts have generally been favorable toward DRM and similar protection schemes, those schemes usually have to be disclosed and must not cause undue damage. When these constraints are ignored, you end up with situations like the Sony rootkit scandal of the mid-2000s. That scandal is particularly relevant here because it cemented the idea that “No, you can’t destroy other systems for the sake of ‘copyright protection.’”

I respect the Nightshade team for making their research available and for working to produce a user-friendly tool; however, I have a few criticisms:

  1. The legal and ethical risks described above are not disclosed to the artists the tool is aimed at.
  2. The situation is framed from a one-sided perspective, as if striking back at generative AI models carried no consequences for the artists doing so.

I am not asking them to stop working on tools like Nightshade, or to never make them available, but I would encourage them to be more upfront about these risks. While it is understandable that artists are upset about having their works trained on, examining these situations from a one-sided perspective is not OK, regardless of whether you are for or against generative AI.

Nightshade is probably fine to use in most cases, if it’s disclosed and a site doesn’t ban content protected by it. In other cases, there could be trouble. I don’t fully know.

Disclaimer: I am not a lawyer, and this post does not constitute legal advice. The above is only my opinion, formed from a variety of sources.

#AI #Machine Learning #Image Generation #No to AI Art #GLAZE #Adversarial Attacks #Nightshade