Demand Progress

Tell Adobe's CEO to stop selling AI-generated images of war

Sign the petition:



    Petition to Adobe CEO Shantanu Narayen:
    We write to express our deep concern and disappointment regarding Adobe Stock's sale of AI-generated images depicting the Israel-Gaza war. We urge Adobe to end the sale of AI-generated images of war immediately.

    Disinformation about the ongoing Israel-Gaza war is bad enough — AI-generated images are making it even worse.

    Adobe is selling AI-generated images of the Israel-Gaza war, and it’s not just irresponsible — it’s downright dangerous. Business Insider recently reported that images of “war-torn streets, explosions, soldiers, tanks, and buildings on fire, and children standing in rubble” from Adobe Stock — all fake — are being used across the internet. [1]

    The CEO of Adobe must take a stand and ban the sale of such dangerous images.

    Sign the petition: Send Adobe CEO Shantanu Narayen a message demanding that Adobe stop selling AI-generated images of war immediately.

    It’s getting harder and harder to tell what’s accurate online, especially about an ongoing armed conflict — and thanks to Adobe, we now have to question the validity of photos too. In a conflict that has claimed over 20,000 lives, trustworthy information is vital, and AI-generated images are only adding to the disinformation.

    Adobe’s willingness to profit from such misleading content is disgraceful. No matter how it tweaks its guidelines, Adobe can never control how these images are used after the point of sale. The good news? There’s one clear way Adobe can fully avoid complicity in ANY disinformation — and that’s to stop selling AI-generated images of war now.

    Add your name: Stop selling AI-generated images of war!

    Sources:

    1. Business Insider, “Adobe is selling AI-generated images of the Israel-Hamas war, and some websites are using them without marking that they’re fake,” November 8, 2023.