Photojournalists Sign Open Letter Urging Meta Not to Use Their Photos for AI Training

A forest scene with smoke in the background, displaying the text: "OPT OUT: Meta – don’t train your AI on real images of war, conflict, and crisis." Photo by Ingmar Björn Nolting.

Scores of photographers have signed an open letter rallying against Meta’s plan to use public photos on Instagram and Facebook to train its AI tools.

The letter was organized by Pulitzer Prize-winning photographer Daniel Etter and signed by dozens of photographers, journalists, and curators.

It comes after Meta made clear that it has an advantage in the generative AI space because of all the “public” photos available to it. The company’s chief product officer Chris Cox said that Meta’s AI image model Emu can make “really amazing quality images” thanks to “Instagram being the data set that was used to train it,” which he described as “one of the great repositories of incredible imagery.”

The Letter Signed by Photographers Against Meta’s AI Policy

Titled “Opt Out — An Open Letter to Meta,” the letter acknowledges that Instagram has been a “crucial tool for photojournalists distributing their work” but expresses grave concern over Meta’s plan to train its AI models on documentary photography content.

For more than a decade, Instagram has been a crucial tool for photojournalists distributing their work. They have reached millions from some of the most dangerous places in the world. Many have paid with their lives. They have also been crucial in the initial growth of the platform.

We are deeply troubled by Meta Platforms, Inc.’s plan to train their artificial intelligence (AI) models on photojournalistic content. In times of disinformation and misinformation, in a time where democracy is in decline and the common denominator of what is true and what is fake is eroding, it is more important than ever to have trustworthy sources. Meta’s announced AI policy further undermines that.

We ask Meta to reverse course on their plan to train their AI on Instagram without the option to opt-out for most users. We further ask Meta to not use any journalistic or documentary photography and videography in their AI. It is not only a threat to our profession, but to democracy itself.

Etter tells PetaPixel he is “disturbed” by AI models being built off the work of photojournalists — many of whom have died while covering global news events.

“Over recent years, there has been a race amongst technology companies to release potentially disrupting software. No one has seriously considered the risks,” Etter says.

“Now they have thrown out AI, which in its current iteration is a huge statistical model scraping the work of creatives, gobbling up gigantic amounts of energy. That in itself is a problem.”

Etter says that photojournalists create work depicting human suffering and ecological destruction, and that AI flies in the face of truthful storytelling.

“I feel seriously disturbed by the idea that we fictionalize it by means of soulless algorithms and create a world in which no one knows what’s true and what’s fake,” he says.

“It’s not only a danger to our profession, it’s a danger to the foundations of a functioning society and to democracy.”

The Berlin-based photographer adds that he believes AI robs humans of art and creative expression.

“The process, the struggle, the medium, the joy, sweat and tears, the envy, the hurt and the love. A future in which we hand this off to machines is anything but bright.”

The letter, published on Medium, can be found here.


Image credits: Photograph by Ingmar Björn Nolting.
