How artificial intelligence and synthetic reality shaped Bangladesh’s 2026 election

Global Voices

This post is part of Global Voices’ May 2026 Spotlight series, “Human perspectives on AI.” This series will offer insight into how AI is being used in global majority countries, how its use and implementation are affecting individual communities, what this AI experiment might mean for future generations, and more.

A photograph that began circulating across Bangladeshi social media on December 14, 2025, would profoundly influence the national election on February 12, the first election since the July 2024 student-mass uprising toppled former Prime Minister Sheikh Hasina’s government. The high-stakes vote was considered a pivotal test of whether Bangladesh could rebuild and realize its hopes for a free and fair democracy following the uprising.

The image showed Shadik Kayem, 27th Vice-President of Dhaka University Central Students’ Union, sitting across a small table from another man, apparently sharing tea in what looked like a casual meeting between acquaintances. The timing made the photograph explosive: just three days earlier, Osman Hadi, the coordinator of Inqilab Mancha (Revolution Platform), a cultural organization formed by people associated with the July 2024 student-mass uprising, had been shot in Dhaka, triggering immediate political controversy about who orchestrated the attack.

The man sitting with Kayem in the photograph was allegedly Hadi’s shooter.

However, all was not as it appeared. The fact-checking organization FactWatch determined that this photograph was fake and had been generated using artificial intelligence (AI).

Welcome to Bangladesh’s first AI-saturated election, where seeing was no longer believing.

The scale of synthetic reality

Between December 2025 and February 2026, as Bangladesh prepared for its national elections on February 12, a study identified 72 cases in which AI-manipulated content, specifically designed to manufacture false narratives and shape electoral outcomes, gained significant momentum online. This study reviewed fact-checks on AI-generated content targeting political parties, political figures, and the electoral process to understand how these false narratives were being leveraged.

Nearly half (49 percent) of the AI manipulations involved claims of false activities and statements, creating entirely fictional realities about what political leaders were doing, saying, or experiencing during the campaign.

Twenty-eight percent of cases deployed AI to create false statement attributions — putting specific words into the mouths of political figures through AI-edited photocards mimicking trustworthy media outlets.

The dominant AI tactic involved generating synthetic images of political figures in false contexts. AI-generated photographs showed newly elected Prime Minister Tarique Rahman digging in a field with a spade alongside a constituent during supposed campaign activities. Synthetic images showed him shaking hands with children at events he never attended. Each manipulation created false impressions about his campaign, his priorities, and his public persona.

AI-generated images and videos also circulated showing his mother, Khaleda Zia (the former Prime Minister of Bangladesh from 1991 to 1996 and 2001 to 2006) walking, claiming the seventy-six-year-old BNP chairperson was moving freely despite known serious health conditions. Additional AI content about Khaleda Zia addressed strategic questions about her political capability, potentially affecting BNP’s electoral prospects.

Meanwhile, despite being banned from participating in the election, the previous ruling party, Awami League, remained a target for AI manipulation. Deepfake videos showed former Prime Minister Sheikh Hasina making statements from her exile in India that she never actually made. Additional AI video manipulations attempted to create impressions that she maintained governmental authority despite her exile, serving narratives that the election lacked legitimacy without Awami League participation.

Perhaps most audacious was an AI-generated video that appeared to show former Malaysian Prime Minister Mahathir Mohamad endorsing Hasina as Bangladesh’s legitimate leader. The deepfake leveraged international authority to support the Awami League’s exile position. Mahathir never made such statements. The video was entirely synthetic.

Falsified photocards

Political actors exploited this fragmented media landscape by creating or editing photocards (graphic news cards) that appeared to originate from established outlets like Somoy TV, Channel i, Jugantor, or Kaler Kantho.

False quotes attributed to the Bangladesh Jamaat-e-Islami party candidate Syed Abdullah Muhammad Taher appeared in fabricated Somoy TV photocards. Channel i photocards surfaced with similarly altered quotes from the same Jamaat leader.

When voters saw inflammatory quotes appearing across several trusted media brands, the apparent multi-source corroboration made the fabricated quotes far more credible than isolated false claims.

Edited photocards supposedly from the news site Barta Bazar about the Jamaat in Mirpur propagated hyper-specific, false narratives targeting particular constituencies.

Fact checkers encountered twenty separate cases involving AI-edited fake quotes. When photocards showed National Citizen Party convener Nahid Islam making incendiary comments, the manipulation used media brand logos to add credibility to the fabrications.

Additional modified photocards targeting political personalities demonstrated the systematic nature of this strategy across several parties and candidates.

The Osman Hadi falsehood

Three days after Inqilab Mancha coordinator Osman Hadi was shot on December 11, political actors began weaponizing his tragedy. AI-generated images claimed to show Hadi opening his eyes in his hospital bed, manipulating the public’s desperate hope for his recovery. The fabricated photographs spread rapidly across social media, each share intensifying a manufactured moment that his family and doctors could only contradict with denials — denials that were never as compelling as emotional, visual “proof.”

The exploitation didn’t end there. As February’s election neared, altered photocards using Barta Bazar’s name emerged, falsely claiming that the Jamaat-e-Islami party had sponsored the attack. Opponents fabricated evidence of Jamaat’s involvement, potentially swaying uncertain voters. This manipulation continued for months after the shooting.

A rise in conspiracy theories

Conspiracy theories that once circulated as whispered speculation gained sudden credibility when an onslaught of AI-manufactured proof emerged to support them. An image appeared, claiming to document a secret meeting in Delhi between Bangladeshi activist Pinaki Bhattacharya, Indian National Security Advisor Ajit Doval, and politician Krishna Nandi. The synthetic image transformed unproven theories about foreign interference into apparent visual documentation. No such meeting happened. The image was entirely artificial.

Meanwhile, AI-generated images claimed to show Awami League protest marches in Bhola, suggesting the banned party could still gather supporters despite legal prohibitions. Fabricated images of Jamaat assemblies served dual purposes — either demonstrating overwhelming popular support or raising alarms about dangerous Islamist mobilization, depending on which narrative benefited the circulator. Additional synthetic images depicting massive political support and gatherings were released to reinforce these narratives.

The war between parties

Examining who attacked whom reveals the election’s underlying dynamics. The Bangladesh Nationalist Party (BNP) absorbed the heaviest assault, with forty-seven cases targeting the party that would ultimately secure a landslide victory. Opponents clearly identified BNP as the threat requiring maximum firepower.

Jamaat-e-Islami confronted thirteen separate AI attacks. Edited photocards spread fabricated quotes from Jamaat figures, often questioning the party’s Islamic credentials or exploiting religious themes. When hackers compromised a Jamaat leader’s account and attributed false comments to him, the attack merged traditional cybersecurity breaches with AI-amplified distribution.

The Awami League, operating from exile, remained relevant enough to warrant six counter-campaigns. Deepfake videos featuring Sheikh Hasina and fabricated photocards sought to challenge whatever legitimacy claims the banned party maintained.

The National Citizen Party faced attacks through false statements attributed to its convener, Nahid Islam. Opponents released heavily edited images with inflammatory language attributed to him, seeking to alienate the younger voters who had participated in the July uprising, the very constituency that elevated the NCP into relevance.

Another AI-generated video circulated, claiming to show police releasing election survey results, fabricating official statements about electoral prospects. The manipulation attempted to shape voter expectations. Edited photocards showed fabricated interactions between political figures, creating entirely fictional political relationships designed to confuse voters about actual coalition dynamics. AI-generated videos even purported to show Tarique Rahman asking for money during campaigning, suggesting vote-buying or financial corruption. The volume alone overwhelmed fact-checking capacity.

Blueprint for regional democracy

What happened in Bangladesh’s 2026 election represents something categorically different from earlier misinformation campaigns: AI remade the political battlefield itself.

Bangladesh’s experience provides the first comprehensive documentation of AI weaponization in South Asian electoral democracy. India faces elections regularly. Pakistan’s political landscape remains volatile. Nepal and Sri Lanka conduct their own democratic processes. All now face the prospect of similar AI-driven misinformation campaigns.

The patterns documented here, from temporal escalation toward election day to the deployment of synthetic images and edited photocards and the exploitation of sensitive political events, represent tactical knowledge that political operatives throughout the region are surely studying.