Some people appear to be accidentally using Meta AI as a public personal diary, or butt-dialling with the app. Be careful. (This week, Meta added a warning if you’re about to post your AI chat online, though it didn’t appear consistently in the app.)
If you’re intentionally posting your AI chats publicly – why? Ask yourself whether you’d post the same thing on your Facebook page.
It’s also not clear why Meta thought it was a good idea to create a stream of everyone’s chatbot musings.
2. Don’t develop feelings for chatbots
Chatbots are designed to sound human and hold conversations that flow like a texting gabfest with an old friend. Some “companion” chatbots can role-play as a romantic partner, including in sexual conversations.
But never forget that a chatbot is not your friend, lover or a substitute for human relationships.

If you’re lonely or uncertain in social situations, it’s okay to banter or practise with AI. Just be sure to take those skills into the real world.
You can also try asking a chatbot to recommend local meetups or organisations for people in your age group or life stage, or to offer advice on making personal connections.
3. Recognise when you’re talking to AI
AI is so good at mimicking human chatter that scammers use it to strike up conversations to trick people into sending money.
For safety, assume that anyone you meet only online is not who they say they are, particularly in romantic conversations or investment pitches. If you’re falling for someone you’ve never met, stop and ask a family member or friend if anything seems off.
4. Know why chatbots spew weird stuff
Chatbots make things up constantly. They’re also designed to be friendly and agreeable so you’ll spend more time using them.
The combination sometimes results in obsequious nonsense, as happened to our Washington Post colleague, who found that OpenAI’s ChatGPT invented passages from her own published columns and then fabricated explanations for why that was happening. (The Post has a content partnership with OpenAI.)

When these oddities happen to you, it helps to know the reason: these are stupid computer errors.
AI companies could program their systems to respond, “This chatbot can’t access that information”, when you ask questions about essays, books or news articles that they can’t actually read.
Instead, machines might act like a kid who has to give a book report but hasn’t read the book: they fabricate details and then lie when you catch them making stuff up.
An OpenAI spokesperson said the company is “continuously working to improve the accuracy and reliability of our models”, and referred to an online disclosure about ChatGPT’s errors.
5. Don’t just copy and paste AI text
If you use a chatbot to help you write a flirty message to a dating app connection, a wedding toast or a cover letter for a job, people can tell when your words come verbatim from AI. (Or they can paste your text into an AI detector, although these technologies are flawed.)

Roman Khaves, CEO of the AI dating assistant Rizz, suggested treating chatbot text as a banal first draft. Rewrite it so it sounds like you, adding specific details or personal references.
6. Be careful what you share, part two
Most chatbots will use at least some information from your conversations to “train” their AI, or they might save your information in ways you’re not expecting.
Niloofar Mireshghallah, an AI specialist and an incoming Carnegie Mellon University professor, was surprised to learn that tapping the thumbs-up or thumbs-down option to rate a reply from Anthropic’s Claude counts as consenting to the company saving your entire conversation for up to 10 years.

Anthropic said it’s transparent about this process in the feedback box and in its online Q&As.
Before confiding in chatbots, imagine how you’d feel if the information you’re typing were subpoenaed or leaked publicly.
Mireshghallah said she’s unnerved by the prospect of people working for chatbot companies reviewing conversations, which she said happens sometimes.
At minimum, Mireshghallah advised against entering personally identifiable or sensitive information, like Social Security or passport numbers, into chatbots. (Use fake numbers if you need to.)