Pro-Israel Trolls Cast Doubt on Photo of Double-Amputee Gaza Girl
Recently, pro-Israel accounts on X circulated claims that a viral photo showing a Gazan girl with amputated legs was fake and AI-generated. They backed these claims by sharing responses from ChatGPT and X’s Grok, both of which labeled the image as AI-made. Misbar’s team debunked these false claims.
Pro-Israel Accounts Question Photo of Gaza Girl with Double Amputation
On May 22, “@Awesome_Jew_,” an X user known for spreading fake news about Gaza, replied to a photo of a Gazan girl with amputated legs, claiming the image was fake. The user supported this claim by sharing two screenshots of responses from ChatGPT and X’s Grok.
ChatGPT stated, “The image appears to be digitally altered or AI-generated,” and Grok gave a similar answer. Both AI tools offered reasons why the girl’s photo might be altered or AI-generated.
The user tagged X’s Community Notes in an attempt to have the girl’s photo labeled as AI-generated. Alongside the screenshots, the user commented, “All they do is spread lies—and the truly unfortunate part is how many gullible people end up believing them. @CommunityNotes.”
Misbar Debunks Disinformation Targeting 3-Year-Old Rahaf
Misbar’s team investigated the circulating claim and found the photo to be real. It shows 3-year-old Rahaf from Gaza, who lost both her legs in an Israeli airstrike in September 2024.
On September 8, 2024, Al Jazeera uploaded a video of Rahaf Saad, reporting that she lost her legs during an Israeli strike on her home in Al-Nuseirat camp in central Gaza.
Palestinian photographer Hassan Esliah, whom Israel killed in a recent airstrike on a hospital, captured the video and posted it on Instagram on September 8, 2024, reporting that Rahaf Saad lost her limbs in the Israeli attack.
Media outlets also covered Rahaf Saad’s case. Rahaf’s father told the Palestinian Information Center that the Israeli military bombed their home despite its location in the Israeli-designated “safe zone.”
Israeli warplanes struck the house without warning, causing Rahaf to lose her limbs instantly and killing her grandfather.
Only Rahaf, her late grandfather, and her brother, who suffered multiple injuries, were inside the house at the time. Since the attack, Rahaf has suffered psychological trauma from the loss of her lower limbs.
Children’s Educator Ms. Rachel Meets Injured Girl Rahaf
A few days ago, renowned American children’s educator Ms. Rachel released a video of her meeting with Rahaf, who was recently evacuated from war-torn Gaza to the U.S. for treatment.
Rachel Griffin Accurso, known to millions of families worldwide as Ms. Rachel, posted the video on Wednesday evening. In it, she meets three-year-old Rahaf and sheds light on Israel’s brutal bombardment of Gaza and its devastating impact on children.
In the video, Ms. Rachel sings to Rahaf as the girl dances, struggling slightly with her prosthetic legs.
In the caption, Ms. Rachel states that Rahaf’s young brothers and father remain in Gaza, which endures relentless daily bombardment from Israel. She also highlights the catastrophic humanitarian crisis in the Gaza enclave, which has been under total siege for over two months now.
International media outlets shared Ms. Rachel’s video, and CNN interviewed Rahaf’s mother, who confirmed that Rahaf arrived in the U.S. for treatment months after her injury.
Ms. Rachel has previously spoken out about the dire conditions in Gaza, including the soaring rates of child malnutrition in the enclave.
Last month, a pro-Israel organization targeted her for raising awareness about the starvation of Palestinian children. StopAntisemitism, a U.S.-based pro-Israel group, accused her of being a “mouthpiece of Hamas” and “spreading Hamas propaganda” for expressing concern over the suffering of children in Gaza.
In the CNN interview, Ms. Rachel firmly rejected the escalating pro-Israel hate speech against her, saying, “I care about all kids,” whether they are “Israelis or Palestinians.”
Are Grok and ChatGPT Reliable for Detecting AI-Generated Photos?
Misbar’s investigative team asked the two AI chat models that claimed Rahaf’s photo was fake whether they can reliably detect AI-generated images.
OpenAI’s ChatGPT responded that the system itself “cannot reliably detect AI-generated (fake) photos” because it is a text-based model.
Similarly, when Misbar’s team asked X’s Grok the same question, it answered, “Grok, including my own capabilities, isn’t specifically designed as a tool for detecting AI-generated photos.”
Moreover, Misbar’s team has previously warned about the misinformation that AI chat models can produce. Back in March 2023, Misbar’s team asked ChatGPT several questions, which the system answered incorrectly.
In a recent blog, Misbar’s team also revealed that X users have been relying on X’s Grok to fact-check information, photos, and videos on the platform, but the AI model often generates completely false answers.
Read More
FTC Investigates OpenAI's ChatGPT Regarding Consumer Protection
AI-Operated Accounts Promote the Israeli Narrative During the Gaza War