Top 12 AI-Generated Images Fact-Checked by Misbar in 2025
AI-generated content has flooded the internet over the past few years, heavily influencing news and information around the world. As AI technology improves rapidly, distinguishing authentic content from synthetic media has become increasingly difficult, especially for the untrained eye.
Throughout 2025, Misbar fact-checked a wide range of AI-generated content that circulated widely on social media platforms, much of it tied to sensitive political, humanitarian, and security-related events in conflict zones.
This roundup highlights 12 prominent AI-generated images Misbar debunked in 2025.
Image of Gaza Mother Carrying Her Son’s Bones
Amid the Israeli war on the Gaza Strip and the accompanying humanitarian crisis, an image circulated claiming to show a Gazan mother embracing her son's remains after finding his decomposed body. Misbar’s team investigated the image and traced it to a visual artist known for creating images with artificial intelligence, who confirmed it was AI-generated.

Image of Sudanese Mother Protecting Her Child From RSF
Amid reports of mass atrocities in Sudan’s el-Fasher after the paramilitary Rapid Support Forces (RSF) seized the city in October, an image circulated purporting to show a Sudanese mother protecting her child from two armed men. Using reverse image search, Misbar found that the image was a still from an AI-generated video uploaded by Al Jazeera digital video producer and AI specialist Khoubaib Ben Ziou, who confirmed it was AI-generated.

Image of Iranian Rocket Strike on Tel Aviv
Amid the 12-day conflict between Israel and Iran in June, an image purporting to show a massive barrage of Iranian rockets striking Tel Aviv went viral on social media platforms. Misbar’s team investigated the viral image using reverse image search and traced it to an Egyptian graphic designer, Mhamad Yusif, who confirmed that he generated the image using artificial intelligence and described it as a cinematic depiction rather than a real scene.

Images Showing Attack on U.S. Base in Qatar
Following reports that Iran launched missiles toward a U.S. military base in Qatar, two images claiming to show the aftermath of the attack on the Al-Udeid airbase in Doha circulated on social media. Misbar’s team analyzed the images using AI detection tools, including Hive Moderation, and determined that both images had a high likelihood of being AI-generated and that no credible media outlets had published the scenes.
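For readers who want to replicate this kind of check programmatically, here is a minimal sketch of submitting an image to an AI-detection service such as Hive Moderation. The endpoint, headers, and response layout below are assumptions based on Hive's public documentation and may differ from the current API; consult Hive's docs before relying on it.

```python
# Hedged sketch: querying an AI-detection API (Hive Moderation assumed).
# Endpoint, auth scheme, and response shape are assumptions, not verified.
import requests

API_KEY = "YOUR_HIVE_API_KEY"  # placeholder credential

def ai_likelihood(image_path: str) -> float:
    """Return the model's AI-generated score (0.0 to 1.0) for an image."""
    with open(image_path, "rb") as f:
        resp = requests.post(
            "https://api.thehive.ai/api/v2/task/sync",   # assumed endpoint
            headers={"authorization": f"token {API_KEY}"},
            files={"media": f},
            timeout=60,
        )
    resp.raise_for_status()
    data = resp.json()
    # Assumed response layout: a list of class/score pairs per output.
    classes = data["status"][0]["response"]["output"][0]["classes"]
    scores = {c["class"]: c["score"] for c in classes}
    return scores.get("ai_generated", 0.0)

if __name__ == "__main__":
    print(f"AI-generated likelihood: {ai_likelihood('viral_image.jpg'):.1%}")
```

As with any automated detector, a high score is a lead for further verification, such as source tracing and checking credible media coverage, not proof on its own.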

Image Purporting to Show Recent Israel Wildfires
When Israel experienced the largest wildfires in the country's history in late April, an image claiming to show scenes from the fires circulated on social media platforms in early May. Using AI detection tools and source tracing, Misbar determined that the image originated from an AI-generated video previously shared on Instagram, where it was presented as depicting wildfires in Los Angeles and labeled with hashtags indicating it was generated using artificial intelligence.

Images Showing Aftermath of Pahalgam Attack
Following the Pahalgam attack in Indian-administered Kashmir in late April, social media users circulated two images claiming to show the aftermath of the incident, with bodies lying on the ground. Using AI-detection tools and visual analysis, Misbar’s team determined that both images showed clear signs of digital generation and did not depict real scenes from the attack. One of the images carried a “Meta AI” label, further confirming it was AI-generated.

Image Showing the Aftermath of Hurricane Melissa
When Hurricane Melissa, the strongest tropical cyclone of 2025, hit Jamaica in October, an image claiming to show Black River Hospital in Jamaica severely damaged by the hurricane circulated on social media. Misbar’s team traced the image to an AI content creator who labeled the visual as artificially generated and clarified that it did not depict real scenes from the hurricane.

Image of Man Applying Makeup Before Bondi Attack
After the Bondi Beach stabbing attack in December, an image circulated on social media claiming to show a man having special effects makeup applied before the incident, suggesting the attack was staged. Misbar’s team analyzed the photo and identified multiple signs of AI generation: gibberish text on the man’s shirt, distorted and extra fingers on both the man and the girl applying the makeup, unrealistic makeup texture, repeated and blended elements in the background, and an implausibly rendered camera held by a bystander.

Image of Bondi Beach Shooter With India’s Defense Attaché
Another image that circulated widely following the Bondi Beach attack was claimed to show shooter Naveed Akram sitting with India’s Defense Attaché to the Philippines, Capt. Kant Kothari. Misbar’s team analyzed the photo using Google Gemini and detected a Google SynthID watermark, confirming the image was created with Google AI tools. Misbar also found signs of AI manipulation, such as unreadable text on a chicken bucket. Verification with the Hive AI detector indicated a 98.9% likelihood that the image was AI-generated or altered, and the account that shared it had previously been identified as spreading fabricated content targeting India.
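SynthID is Google's invisible watermark embedded in images produced by its generative models, and the Gemini app can report whether an uploaded image carries it. The sketch below shows how one might script a similar query with Google's google-generativeai Python SDK; the model name and prompt are illustrative, and whether the API-served model performs the same SynthID verification as the consumer Gemini app is an assumption.

```python
# Hedged sketch: asking a Gemini model about an image via Google's
# google-generativeai SDK (pip install google-generativeai pillow).
# Whether the API runs the same SynthID watermark check as the consumer
# Gemini app is an assumption; treat any answer as a lead, not proof.
import google.generativeai as genai
from PIL import Image

genai.configure(api_key="YOUR_GOOGLE_API_KEY")  # placeholder credential

model = genai.GenerativeModel("gemini-1.5-flash")  # illustrative model name
image = Image.open("viral_image.jpg")

response = model.generate_content(
    [image, "Was this image created or edited with Google AI? "
            "Does it carry a SynthID watermark?"]
)
print(response.text)
```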

Image of Delta Crash Plane
Following the Delta Air Lines crash at Toronto Pearson International Airport in February, an image claiming to show the overturned plane went viral on social media. Misbar’s team analyzed the image and found multiple signs of AI generation, including distorted windows, oddly merged firefighters, and misshapen ambulance tires. AI detection tools indicated a 98.7% likelihood that the image was artificially generated.

Image Claiming Gaza Map Resembles California Wildfire Map
In January, California faced destructive wildfires that affected thousands of structures. During the crisis, an image circulated on social media claiming that the Gaza Strip map resembles California’s wildfire map. Misbar’s team traced the image and found multiple signs of AI generation, including incomprehensible text on the Gaza map, particularly in the area of northern Gaza, and visual inconsistencies in both maps. Comparisons with accurate maps of Gaza and California confirmed that the viral image was misleading, and AI detection tools indicated a 99.1% likelihood that the photo was artificially generated.

Image of Double-Amputee Israeli Soldier's Wedding
Following reports of high injury tolls among Israeli soldiers in the Gaza war, an image claiming to show a double-amputee Israeli soldier getting married circulated on social media. Misbar’s team analyzed the image and found several anomalies typical of AI-generated visuals: the woman’s right hand having only four fingers, the soldier’s left hand blending unnaturally with hers, the bride holding the bouquet by the thorny stems instead of the ribbon, unusual and misplaced badges and pins, and a belt that does not match standard Israeli military uniforms.
