Family · the long con
The AI-aided romance scam
30-second gist
The classic romance scam used to need a patient writer with strong English. AI gave the same playbook to anyone in any language. The messages now feel right — warm, specific, attentive. The scam still ends the same way: a sudden financial ask, and then the person disappears.
The fastest-growing victim group is people over 60, often after their spouse has passed.
If you want more
The shape of the conversation
It usually starts on a regular platform — Facebook, a dating app, even Words With Friends. "Sorry, I sent that message to the wrong person." A friendly conversation grows. Photos exchanged. Sometimes phone calls. The AI helps the scammer write thoughtful, well-timed messages, remember details, and adapt their persona on the fly. They ask about your life. They listen. They feel real.
Weeks pass. Months. Then there's a problem: a stuck inheritance, a medical emergency, a customs fee on a parcel meant for you. Money will fix it. Just this once.
Five quiet warning signs
- They never video-call, or every attempted call has technical problems.
- Their job and location conveniently explain why you can't meet ("offshore engineer", "deployed soldier", "surgeon abroad").
- They push to move off the platform early, usually to WhatsApp or Telegram.
- The conversation is strangely well-timed — quick replies, on cue, around the clock.
- The first money ask is usually small. Then bigger. Then much bigger.
Scale check
The US Federal Trade Commission reported over US$1.14 billion in romance-scam losses in 2023, with median losses highest among adults over 60. Reported losses have trended upward as AI tools have made the scam easier to industrialise. New Zealand, Australia, the UK, and Canada all report similar trends.