Plain AI in plain English
About

Family · the always-there friend

AI companion apps

30-second gist

AI companion apps offer a friendly, always-available chat partner. The best-known are Replika (40M+ users) and Character.AI; others (Pi, smaller services) come and go as the landscape shifts. For lonely or socially anxious people, they can be a genuine source of comfort. For young people forming emotional attachments to a chatbot, the picture is messier. Research into the long-term effects is still in its early days.

It's not a black-and-white thing. Most usage is harmless. Some patterns are worth watching for.

If you want more

When companion apps help
  • An older relative living alone who finds the chat a daily moment of contact.
  • Someone with social anxiety practising conversations they'd otherwise avoid.
  • Late-night spiralling thoughts that ease when any other voice joins in.
  • Language practice — the AI is patient, never judges your accent.
When to step in
  • A child or teen is forming emotional dependence on a bot — preferring it to friends, missing it during school, distressed when the app is down.
  • Sensitive personal info — health, finances, abuse — is being shared with the companion and may end up in the company's data store.
  • A vulnerable adult is using it as a substitute for human connection rather than a supplement to it.

The fix is usually a conversation, not a ban. Companion apps work best when they sit alongside real-world contact, not in place of it.