People are deleting ChatGPT, wtf is going on?


People are deleting ChatGPT primarily due to mounting privacy concerns over its partnership with the U.S. Department of Defense, fear of data surveillance, and distrust regarding AI safety. Other factors include the rise of competitors like Claude, the spread of inaccurate information, and ethical concerns surrounding the platform's practices, as reported in a YouTube video.

Not sure, but as for that fear of 'data surveillance'… that ship sailed years ago. What do you think?


I’m not convinced the “data surveillance” concern is new. Most of us have been trading data for convenience for years—search engines, social media, smartphones, smart TVs, the whole ecosystem. In that sense, the ship probably sailed a long time ago.
What’s different with AI isn’t just data collection, it’s the capability layer—systems that can analyze, generate, and influence at scale. That’s where the real conversation should be: transparency, guardrails, and accountability.

Deleting one AI app might make a statement, but the broader issue is how society governs powerful AI tools across the board.


People tolerated data collection before because the tradeoff felt predictable. With AI, the scale, speed, and potential uses of that data are far less clear. So for many people this isn't old news; it's the first time they're seriously questioning where their information goes and how it might be used.

Deleting an app might not solve everything, but it does show that public trust in AI companies isn’t automatic. And honestly, that skepticism is probably healthy.


Yeah, you're basically right. The "data surveillance" concern didn't start with ChatGPT. Between companies like Google and Meta, most people have already been sharing tons of data for years.

What's new is that AI makes that data more powerful, and that's what's making people uneasy now.
