The United States, Israel and Iran are currently locked in an intense conflict. Amid this, ChatGPT has come under scrutiny, with events escalating to calls for a boycott and a surge of almost 300% in users uninstalling the app.
Within just days, around 2.5 million users reportedly boycotted the platform, with many feeling that the controversial deal crossed the line into privacy violations and data misuse.
Why Are People Deleting ChatGPT?
ChatGPT, OpenAI’s flagship product, became a household name almost overnight, amassing hundreds of millions of users across the globe. Amid rapid advances in artificial intelligence (AI) tools, practically every country is now trying to build its own ChatGPT.
For every use reports claim this AI chatbot has, just as many cautionary points have been raised about it and about its rapid normalisation without proper rules and regulations in place.
The recent controversy began when reports revealed that OpenAI had signed an agreement with the U.S. Department of Defence (DoD), now renamed the Department of War, allowing its AI models to be used inside the Pentagon’s classified systems.
While OpenAI said the partnership was aimed at defensive and cybersecurity applications, the announcement triggered immediate backlash from some users.
The backlash translated into a 295% surge in uninstalls of the ChatGPT app. According to data from market intelligence firm Sensor Tower, this far exceeded the app’s usual day-over-day uninstall rate of around 9%.
Read More: Is ChatGPT Biased Towards The Rich, The West, And White People?
It wasn’t just existing users abandoning the app; new downloads also declined, falling by 13% and 5% on Saturday and Sunday respectively, soon after reports of the controversial development emerged on February 28, 2026.
Sam Altman, CEO of OpenAI, the company behind ChatGPT, announced the agreement with the DoD in a tweet, writing, “Tonight, we reached an agreement with the Department of War to deploy our models in their classified network. In all of our interactions, the DoW displayed a deep respect for safety and a desire to partner to achieve the best possible outcome.”
However, Altman too was at the receiving end of criticism, with the public raising concerns about what the deal would mean.
Several users took to review-bombing the app: 5-star ratings fell by 50%, while 1-star ratings rose by 775% and 100% over the weekend.
This comes on the heels of OpenAI’s rival Anthropic, maker of Claude, publicly declining a similar deal, reportedly announcing that it would not partner with the US defence department.
The company said it refused the agreement over concerns that “AI-driven mass surveillance presents serious, novel risks to our fundamental liberties”, and pointed to fully autonomous weapons, noting that “today, frontier AI systems are simply not reliable enough to power fully autonomous weapons.”
Anthropic CEO Dario Amodei, in a statement, said that he “cannot in good conscience accede to [the Pentagon’s] request” and that “Anthropic understands that the Department of War, not private companies, makes military decisions… However, in a narrow set of cases, we believe AI can undermine, rather than defend, democratic values. Some uses are also simply outside the bounds of what today’s technology can safely and reliably do.”
Image Credits: Google Images
Sources: The Indian Express, The Economic Times, Tech Crunch
Find the blogger: @chirali_08
This post is tagged under: ChatGPT, ChatGPT app, ChatGPT delete, ChatGPT uninstall, ChatGPT openai, ChatGPT news, israel, iran, iran israel war, war, world politics, technology, ChatGPT users
Disclaimer: We do not own any rights or copyrights to the images used; these images have been sourced from Google. If you require credits or wish to request removal, please contact us via email.
Other Recommendations:
Residents Appreciate Dubai Leadership For Handling War Crisis Situation Well