Official’s Clumsy Handling of ChatGPT Exposes Beijing’s Intimidation Campaign

  • Writer: By The Financial District
  • Mar 4
  • 1 min read

A sprawling Chinese influence operation — accidentally revealed through a Chinese law enforcement official’s use of ChatGPT — focused on intimidating Chinese dissidents abroad, including by impersonating US immigration officials, according to ChatGPT-maker OpenAI, Sean Lyngaas reported for CNN.


The report provides one of the most vivid examples yet of how authoritarian regimes may use AI tools in documenting censorship efforts.

OpenAI said the official used ChatGPT like a diary to document the alleged covert suppression campaign.


In one instance, Chinese operators allegedly disguised themselves as US immigration officials to warn a US-based Chinese dissident that their public statements had broken the law.


In another, they described using forged documents from a US county court in an attempt to have a dissident’s social media account taken down.



The influence operation appeared to involve hundreds of operators and thousands of fake online accounts across multiple social media platforms, OpenAI said.


“This is what modern Chinese transnational repression looks like,” Ben Nimmo, principal investigator at OpenAI, told reporters ahead of the report’s release.



“It’s not just digital. It’s not just about trolling. It’s industrialized. It’s about trying to hit critics of the CCP [Chinese Communist Party] with everything, everywhere, all at once.”



The Financial District®  2023