OpenAI blocks Iranian group’s ChatGPT accounts for targeting US election

OpenAI recently shut down accounts belonging to an Iranian group that were using its ChatGPT chatbot to generate content aimed at influencing the U.S. presidential election and other global issues. The group, identified as Storm-2035, used ChatGPT to produce both long-form articles and social media posts on topics including the U.S. election candidates, the conflict in Gaza, and Israel’s participation in the Olympic Games. Despite these efforts, the content generated by the accounts failed to gain significant audience engagement.

The posts addressed subjects including claims that former President Donald Trump had been blocked on social media and Vice President Kamala Harris’s selection of Tim Walz as her running mate. The content was crafted to appear to come from both liberal and conservative perspectives. Yet even with AI behind it, the operation failed to reach its target audience: likes, shares, and comments on most posts were scarce to nonexistent.

OpenAI’s analysis uncovered the activity and points to a broader pattern of foreign organizations attempting to use cutting-edge AI tools to sway political debate. The disclosure follows similar findings from companies such as Microsoft and Meta, which have reported foreign influence operations originating from China, Russia, and Iran, some of which also used AI-generated content.

The incident also underscores the ongoing difficulty of combating false information online, even though the accounts involved in this operation have been barred from using OpenAI’s services. While U.S. intelligence agencies remain on guard against foreign attempts to manipulate public opinion ahead of the upcoming presidential election, OpenAI says it is continuing to monitor for further violations.
