Prevent Data Theft: Poison it!

Nan Nelson, Upward Mobility Signals Team

With AI’s future still unwritten, Nightshade, a project from the University of Chicago, gives artists recourse against AI data theft by subtly altering their images so the data becomes useless, or even disruptive, to AI model training. Ben Zhao, the professor who led the project, compared it to “putting hot sauce in your lunch so it doesn’t get stolen from the workplace fridge.” Read the story here.