Numerous people have turned to AI chatbots to discuss anything and everything, no matter how personal. Most assume these conversations are anonymous, secure, and private. The truth is, AI chatbot privacy isn’t so private, and a recent massive breach proves it.
Over 300 Million Chats Exposed
The highly popular chat app Chat & Ask AI, which is powered by multiple AI models, suffered a shocking breach in January 2026. A single user accessed over 300 million chats belonging to at least half of the app’s 50 million users. These ranged from casual queries to far more personal conversations about medical issues, mental health, finances, and even illegal activities.
The good news is the researcher, who goes simply by Harry, wasn’t accessing the app’s data for malicious purposes. Instead, he did it to expose the vulnerability so the developers could fix it.
Yet the incident showcases just how easy it is for hackers to access millions of supposedly private chat messages in no time at all.
In this case, Chat & Ask AI had a Google Firebase configuration issue. It’s a common mistake, and Codeway, the company behind the app, fixed it immediately after Harry reported the vulnerability.
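The exact rule that was misconfigured hasn’t been published, but Firebase leaks of this kind almost always trace back to overly permissive security rules that let any client read any document. A hedged sketch of what a Cloud Firestore fix typically looks like (the `chats/{uid}` path is purely illustrative, not the app’s real schema):

```
rules_version = '2';
service cloud.firestore {
  match /databases/{database}/documents {
    // A misconfigured app often ships with a catch-all rule like:
    //   match /{document=**} { allow read, write: if true; }
    // which lets any client read every user's data.
    //
    // A locked-down alternative scopes access to the signed-in owner:
    match /chats/{uid}/{messageId} {
      allow read, write: if request.auth != null && request.auth.uid == uid;
    }
  }
}
```

With rules like the second block, even a curious user poking at the app’s backend can only ever see their own chats.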
The next time someone finds a vulnerability, it might not end the same. It could mean your “private” AI chats are leaked to the world.
This isn’t the first breach or leak. Hundreds of thousands of private Grok conversations showed up in Google search results. Something similar happened with ChatGPT users.
Sometimes it’s a simple configuration error. In other cases, users don’t realize that when they share chats, they’re sharing them publicly, not just with select friends. For example, many users mistakenly believed Meta AI’s Discover Feed only shared chats with their friends. Instead, chats were visible to all users.
AI Chatbots Store More Than You Think
Chatbots often claim to keep your data private, but did you read the terms? I hate reading lengthy terms of service and privacy policies, but I do it anyway. I want to know what data is being stored and why. There are ways to make this process less tedious, which I highly recommend.
In most cases, unless you opt out, anything you chat about is fair game for AI training. The more you interact, the more AI models learn about how to correctly interact with humans. They also gain more information overall. For instance, OpenAI states some personal data may be used in model training.
If you have an account, free or premium, most AI platforms collect:
- Name, username, IP address, browser fingerprint, etc.
- Personal preferences, such as preferred personality, likes/dislikes, and anything you tell it to remember for future reference, such as allergies when searching for recipes
- Data from uploaded files, including sensitive documents
- All chats, no matter what they’re about
- Financial details, if you’re using agentic features in AI chatbots to make purchases
The next issue is that this data may be stored indefinitely. You might delete a chat, but that doesn’t mean it isn’t still stored somewhere, being used for training or to personalize your experience.
It’s easy to think you’re just talking to a robot. I get it. Chatting with AI feels low-stakes. But AI chatbot privacy isn’t guaranteed, and actual humans may see those chats one day. The lesson: you’re never truly anonymous, even when talking with AI.
Third-Party Chat Apps Are Even Riskier
I’m not claiming Google, OpenAI, Meta, or any other company behind popular AI models is completely secure or even cares about your privacy at all. But when you start using AI chatbots via third-party apps, you’re putting your privacy at even greater risk.
Apps like Chat & Ask AI combine multiple models in one. It’s a great way to get the best of everything in one place. I’ve found apps like this, such as Yupp, highly useful for comparing models and seeing which gives the best results.
I go in fully aware that my data is not only being collected by the third-party app, but any and all models I’m using too. Your privacy is only as secure as the weakest link.
My advice: whenever you’re using a third-party AI chat app to access other models, use even more caution. Never share anything personal.
Information I’d Never Share With AI Chatbots
I’m not surprised users overshare when chatting with AI. It doesn’t seem real, so what’s the harm? Let me use social media as an example. Users assume only their friends see what they post. Yet, they’re shocked when an employer suddenly sees a controversial post or a burglar takes advantage of their “on vacation” post.
Whatever you share online could become public. Privacy settings do help lock things down, but it’s not a guarantee. If a platform experiences a vulnerability, everything could be public. And, since AI platforms aren’t always clear on what types of your data they use for training, I’d highly suggest not sharing any of the following:
- Any personally identifying details, like name, address, phone number, etc.
- Any financial details
- Security answers, passwords, or usernames (also don’t use AI to generate passwords)
- Illegal activities
- Confidential documents for work or your personal life
- Health information, including mental health
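On the password point: there’s no need to involve a chatbot at all, since a password generated locally never touches anyone’s servers. A minimal sketch using only Python’s standard library (the length and character set are my own illustrative choices):

```python
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Generate a random password locally with a cryptographically
    secure RNG, so the secret never leaves your machine."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())
```

Unlike `random`, the `secrets` module is designed for security-sensitive values, which is why it’s the right tool here.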
Think of AI chatbot privacy as the same as a public forum’s. If you do that, you’ll be much safer.
Using AI Chatbots Safely
Completely offline AI chatbots are your safest options, though they’re more limited. Everything’s processed locally, reducing privacy risks. You can even run these on Android.
If this isn’t an option, stick with official chat apps from a model’s provider. These tend to follow privacy laws better than random third-party apps that simply access the models.
I also suggest turning off chat history, if possible. This limits how much information the chatbot learns and stores. At the very least, delete your chat history when you’re finished.
Opt for AI chatbots that put privacy first. For example, Proton’s Lumo chatbot uses zero-access encryption to ensure your chats stay private. Or try Brave’s Private AI Search for encrypted chats that are deleted after 24 hours.
AI’s a part of our lives now, for better or worse. Now, we just have to remember chatting with a computer doesn’t mean conversations are private.