Glaze is a new software tool developed by researchers at the University of Chicago to protect artwork from being used in artificial intelligence (AI) art without consent, by preventing AI models from learning an artist's style.
The software uses style-transfer algorithms, which take an existing image and recreate it in a particular style, to identify the unique features of the original art that change when the image is transformed into another style. Glaze then perturbs those features just enough to fool AI models while leaving the original image almost unchanged to the human eye.
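The core idea, a small, bounded perturbation that is nearly invisible to humans but shifts the features a model learns from, can be illustrated with a toy sketch. This is not Glaze's actual algorithm; the `style_gradient` input is a hypothetical stand-in for the direction in pixel space that most affects a model's notion of style, and the random arrays stand in for real image data.

```python
import numpy as np

def cloak(image, style_gradient, epsilon=4 / 255):
    """Toy sketch of an imperceptible 'cloaking' perturbation.

    Nudges each pixel a tiny step along a hypothetical style-feature
    gradient, with the step capped at `epsilon` so the change stays
    nearly invisible to a human viewer. Illustrative only; not the
    method Glaze actually uses.
    """
    # signed step of fixed magnitude, bounded per pixel by epsilon
    perturbation = np.sign(style_gradient) * epsilon
    # keep the result a valid image in [0, 1]
    return np.clip(image + perturbation, 0.0, 1.0)

# demo with random data standing in for an image and a gradient
rng = np.random.default_rng(0)
img = rng.random((8, 8, 3))
grad = rng.standard_normal((8, 8, 3))
out = cloak(img, grad)
print(np.abs(out - img).max() <= 4 / 255 + 1e-9)
```

The key property is the hard per-pixel bound: no matter how the gradient points, the cloaked image never differs from the original by more than `epsilon` at any pixel, which is why the edit stays below the threshold of human perception while still perturbing the features a model would extract.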
According to Michael Osterrieder, the CEO of vAIsual, AI algorithms are capable of recognizing patterns and replicating them highly efficiently, meaning that digital collections of artwork can be used to train AI to reproduce styles and compositions that resemble the original work without the artist's consent.
Glaze turns that same capability around, using AI to work for artists instead of against them: by identifying the specific features that change when an image is transformed into another style, the software can mislead art-mimicking AI models while leaving the original image almost unchanged to the human eye.
The tool is part of a growing effort to keep AI art from incorporating the work of human artists without their consent, which has become a significant concern in the art world.
Some artists add digital watermarks, post only low-resolution images of their work online, create versions of their artwork that an AI system cannot easily reproduce, or register copyrights to protect their work. While legal cases are underway to determine how much protection is available to human artists whose work is used in AI artwork, Glaze is a significant step toward protecting artists' works.