Seeing things
Developments
- As millions watch fake images of the war in Iran, many are dismissing real footage as fake, while the White House keeps posting Grand Theft Auto clips and video-game cheat codes. What’s going on?
- Iran’s unseen new leader has spoken—through a news anchor. … The new American market for Russian oil. … + A rapper from Kathmandu’s showdown with Nepal’s political order.
Features
- How did a criminal industry worth tens of billions entrench itself in Cambodia? Jacob Sims on the nexus of organized crime and autocracy in Southeast Asia.
Books
- What turned Libya from a fledgling democracy into a civil war? Inga K. Trauthig’s Ruining Revolution: How International Islamists and Salafi Forces Have Held Libya Hostage Since 2011.
Music
- Who are Tomorrow’s Warriors?
- & New tracks from Shabaka, Turnstile, Peter Gabriel, The New Pornographers, & Bill Callahan.
+ Weather report
- Spring isn’t winning everywhere …
Developments
The war on screen
An AI-generated video showing Iranian missiles striking Tel Aviv appeared in more than 300 social media posts over the past two weeks, accumulating tens of millions of views. The skyline shudders; buildings erupt. An Israeli flag hangs in the foreground—a detail the AI tools insert automatically when prompted for “missile strike on Israel.” Of course, the attack never happened. Another fabricated video showed captured American special forces held at gunpoint by Iranian troops. Another depicted the aircraft carrier USS Abraham Lincoln on fire—footage Iran’s Islamic Revolutionary Guard Corps itself initially cited as evidence of a successful strike. (The Lincoln was fine.) The New York Times has catalogued more than 110 unique AI-generated images and videos about the war in two weeks. BBC Verify’s Shayan Sardarizadeh says this conflict may have broken the record for viral AI-generated content during wartime.
Meanwhile, as we saw last week, the White House has been splicing real strike footage with Grand Theft Auto, Call of Duty, and SpongeBob SquarePants—its communications director tweeting video-game cheat codes for unlimited ammunition. All while the administration still hasn’t entirely defined what victory looks like.
Genuine footage, it turns out, looks “wrong.” Real video of missile strikes tends to be distant, shaky, shot at night—missiles as bright specks, explosions as plumes of smoke. The AI versions look like cinema: mushroom clouds, sonic booms rippling across cities, hypersonic missiles trailing light. The fabricated war is more vivid, more emotionally legible, more shareable than the actual one. And authentic footage of an Israeli rocket hitting a busy Tehran street—cars sent flying—circulated online and was widely dismissed as AI-generated. The fakes aren’t just building confusion about the conflict. They’re eroding your capacity to trust mediated information about anything.
This is a war whose objectives have multiplied by the week. The official social-media accounts of those prosecuting it have been treating killing as content. And its visual record is now the most contested of any war, ever.
How can you make sense of it?
- The scale. The social-media-intelligence company Cyabra traced identical videos, synchronized posting windows, and fixed hashtag clusters to centralized Iranian content production across TikTok, X, Facebook, and Instagram. TikTok alone accounted for 72 percent of total views and more than seven million engagements. The content arrived simultaneously in Persian, Arabic, Urdu, and English. This wasn’t improvised. It effectively launched alongside the missiles.
- The satellite front. A new category of fabrication: AI-generated satellite imagery designed to mimic the visual authority of open-source intelligence. Iran’s state-aligned Tehran Times posted a “before and after” image claiming to show a destroyed American radar facility in Qatar. Researchers found it was a doctored Google Earth image of a U.S. base in Bahrain, taken in February 2025—same cars, same parking spots, a year apart. AFP detected Google’s invisible SynthID watermark, confirming AI manipulation. Imposter OSINT—open-source intelligence—accounts have appeared on X, mimicking the format of legitimate digital investigators. The verification method that arose to circumvent government censorship during conflicts is now a target.
- Not your grandad’s propaganda. The old wartime information model—two sides, each broadcasting a version of events, from Lord Haw-Haw to the Gulf War—assumed a shared visual record that competing sides interpreted differently. The Iran war doesn’t have one. It’s a simultaneous, multilateral information environment in which Iranian state campaigns, White House meme videos, Israeli AI-processed intelligence, Russian targeting data, commercial engagement farmers, and automated bot networks all occupy the same feed. Cyabra caught Iranian bot networks that had previously posed as British users stoking Scottish separatism—1,300 fake profiles, 224 million potential views—pivoting overnight to pro-Iranian war narratives when Iran’s internet came back on. A NATO Strategic Communications Centre of Excellence report, published weeks before the war, concluded that AI-enabled influence operations are now faster, more persuasive, and harder to detect than anything that preceded them.
- The engagement economy. Not all of this is statecraft. BBC Verify, the broadcaster’s fact-checking unit, found that a significant share of AI-generated war content comes from creators farming views and revenue—cutting fabricated clips to trending audio, captioning them in multiple languages, designed to perform rather than persuade. State propaganda and commercial grift now use the same tools, the same platforms, and the same incentive structure. The conflict’s visual record is, in no small part, a commercial product.
- The closed loop. When users on X turned to Elon Musk’s AI chatbot Grok to check whether footage was real, Grok repeatedly confirmed fabricated videos as genuine—citing Newsweek and Reuters to support conclusions those outlets’ reporting didn’t warrant. AI generates the fakes. AI confirms them. The architecture of X—which rewards engagement over accuracy—ensures they circulate. The verification layer is producing its own misinformation.
- The inversion. Factnameh, an independent Persian-language fact-checking outlet run from Toronto that has tracked Iranian-regime disinformation since 2017, observed a pattern accelerating after last June’s 12-Day War—the U.S.-Israeli strikes on Iran: As AI fakes multiply, people dismiss real evidence that contradicts their expectations. They reject authentic footage; they believe the fabrications. The volume of fakes has produced a permission structure for disbelieving anything. The technology doesn’t just manufacture false images. It’s eroding the authority of genuine ones.
- The accountability gap. Congress voted not to require the administration to seek authorization for continued military operations—less a debate about the war’s purpose than a test of party loyalty. The administration’s stated objectives have expanded from four to at least five without anyone in a position of authority requiring a definition of success. The visual record on which any public reckoning would depend—what was hit, what was destroyed, who was killed, what the war looked like—is itself unreliable. And on Saturday, Brendan Carr, the chair of the U.S. Federal Communications Commission—the agency that regulates American broadcasters—threatened to cancel the broadcast spectrum licenses of news outlets he accused of “hoaxes and news distortions” about the war. U.S. Defense Secretary Pete Hegseth suggested CNN’s coverage would improve once David Ellison—who reportedly assured the White House that his overhaul of the network would result in its coverage being more favorable toward the administration—completed his acquisition of its parent company. While hundreds of AI-generated fakes circulate unchallenged, the administration’s regulatory pressure appears to be aimed at the reporters trying to sort real from fabricated.
Every prior American conflict had a visual record that anchored public understanding—however contested, however partial. Saigon had its photographs. The Gulf War had its night-vision video. Iraq had footage of Abu Ghraib. Those images could be argued about, but they existed as shared points of reference: This happened, and here’s what it looked like.
The Iran war, two weeks in, is producing something different—and profoundly disorienting. Its most widely seen images are fabrications. Its official communications treat airstrikes as memes. People dismiss its authentic footage as inauthentic. And the AI tools that generate the fakes are improving faster than the tools built to detect them—BBC Verify’s Sardarizadeh, one of the most prominent conflict-disinformation analysts working, says new fakes appear faster than anyone can flag them.
Millions of us are watching this war. A lot of what we’re seeing—produced by governments, bot networks, and engagement farmers, often indistinguishably—isn’t real. Since the invention of the camera, at least, people have had to try to understand conflicts based on incomplete and sometimes misleading visual evidence. Now the evidence is abundant, vivid, and arrives in hours—and the false material looks more like what many of us expect reality to look like than the real footage does. The immediate, practical struggle in this informational environment isn’t to understand what’s happened, though that’s what democratic accountability requires of us. It’s to figure out how many of the 50 things we just watched didn’t happen.

Meanwhile
- The photograph on screen. Mojtaba Khamenei broke his silence on Thursday—or someone did. Iran’s new supreme leader’s first public statement arrived as text, read aloud by a state television anchor over a still image. No video of Khamenei. No recording of his voice. The regime has offered no direct evidence of his condition since February 28, when the strike that killed his predecessor also reportedly wounded him. The statement promised the closure of the Strait of Hormuz and attacks on American bases across the region, and hinted at the opening of additional fronts. A day earlier, Iran’s President Masoud Pezeshkian floated the possibility of ending the war under certain conditions—a position the supreme leader’s office has not acknowledged. … See “Heir apparent.”
- What the drones couldn’t do. Iran’s navy is largely destroyed. Its answer: laying naval mines in the Strait of Hormuz from small boats. Where Iran’s drone attacks on tankers were episodic, mines sit on the seabed and destroy whatever passes over them—and clearing them takes months, not days. The International Energy Agency—the body that coordinates emergency oil reserves among major consuming nations—responded with the largest emergency oil-stockpile release in its history: 400 million barrels, enough to cover roughly 26 days of disruption at current rates. Crude crossed US$100. The U.S. Navy may start escorting commercial ships by month’s end, per U.S. Energy Secretary Chris Wright—though an escort fleet doesn’t sweep a minefield. In the meantime, the U.S. Treasury has authorized American firms to purchase Russian crude already at sea, connecting the European and Persian Gulf wars in the most improbable way yet. … See “Mining the strait.”
- MC, mayor, prime minister. A decade ago, before becoming mayor of Kathmandu, Balendra Shah was performing underground rap about broken infrastructure and graft. On March 5, the 35-year-old civil engineer led his Rastriya Swatantra Party to 182 of 275 parliamentary seats—the largest majority Nepal has seen in 60 years—and is now set to become the country’s youngest prime minister. Shah unseated K.P. Sharma Oli, a four-time premier, in his own constituency, capturing three-quarters of the vote. The election was Nepal’s first since last year’s Gen Z–led uprising, which left more than 50 dead and brought down Oli’s government. Nepal has yet to see an elected government complete a full term.

Features
Code black
How did a criminal industry worth tens of billions entrench itself in Cambodia? Jacob Sims on the nexus of organized crime and autocracy in Southeast Asia.