Beware of the ‘fake relative’ scams ahead of the holidays
As the holiday season approaches, families are preparing to reconnect, but so are scammers. Experts are warning about a surge in “fake relative” scams, where criminals use advanced AI voice cloning to impersonate loved ones in distress. The goal? To tug at your heartstrings, and then your wallet.
With more people traveling, shopping online, and sending money to family this time of year, the emotional chaos of the holidays can make anyone vulnerable, PeopleFinders reports. And the newest generation of scams feels frighteningly personal.
What Are “Fake Relative” Scams?
“Fake relative” scams, sometimes called AI impersonation or voice cloning scams, involve fraudsters using artificial intelligence to mimic the voice or identity of someone you know. They might call, claiming to be your child, grandchild, or sibling, and say they’re in trouble and need money right away.
The voice often sounds eerily real. Scammers can now replicate a person’s tone, pitch, and even emotional inflection using just a few seconds of online audio pulled from a social media clip, a YouTube video, or even a voicemail.
These scams are an evolution of the old “grandparent scam,” but with AI technology raising the stakes—and the believability.
Why They’re Spiking Ahead of the Holidays
Scam reports typically spike between November and January, when people are preoccupied with travel plans, financial pressures, and the hectic holiday season. Scammers capitalize on surges in online shopping, charitable giving, and other seasonal activity. And, as always, it pays to be careful: in 2024 alone, Americans lost more than $12 billion to fraud, a 25% increase over the previous year.
The holiday season creates the perfect storm:
- Families are spread out and communicating more by phone and video.
- People are making quick financial decisions, such as sending last-minute gifts or covering travel emergencies.
- Scammers know people are more likely to act quickly out of love or guilt.
Scammers often exploit our most human instincts, like care, concern, and a sense of urgency. During the holidays, these vulnerabilities are heightened, as people are more likely to react quickly to a family emergency or last-minute request without taking the time to verify the details.
Five Signs a “Fake Relative” Scam May Be Targeting You
These scams rely on quick emotional reactions, which is why they're so effective during the holidays. Paying attention to certain warning signs can help you catch a scam before it escalates. Here are some key indicators that a call or message may not be coming from the relative you think it is.
1. An urgent request for money or gift cards.
The caller claims they’re in immediate trouble. Perhaps they’ve been in an accident, lost their wallet, or are stranded somewhere. Scammers often request payment via wire transfer, prepaid gift cards, or cryptocurrency, which can be difficult to trace.
Remember: Real relatives rarely demand instant payment without any discussion or verification.
2. Pressure to keep it secret.
Fraudsters frequently insist that you don’t tell anyone else. Phrases like “Don’t tell Mom, she’ll panic” or “I don’t want anyone else to know” are red flags. This tactic isolates you from others who could confirm the story, allowing the scammer to act quickly without detection.
3. Payment requests through nontraditional methods.
Requests for payment via gift cards, wire transfers, or digital currency are classic signs of a scam. Legitimate relatives are more likely to suggest standard payment methods or discuss the situation with you first. Always be cautious when asked to use unfamiliar payment channels.
4. Slight inconsistencies in the voice or story.
Even AI voice clones can stumble on personal details. The caller may mispronounce names, mix up locations, or get facts about recent events wrong. If something feels “off” about the story, no matter how real the voice sounds, take a step back.
5. Refusal to verify identity.
A genuine family member will understand your caution and welcome verification. Scammers, however, often resist when you suggest calling them back on a known number, contacting another relative, or asking a personal question only the real person would know. Their pushback is a major warning sign.
How to Protect Yourself from AI Impersonation Scams
The good news? You can stay ahead of these scams with a few simple precautions. Most “fake relative” schemes rely on panic and emotional reaction, not complex tech. By taking a moment to slow down and verify what’s happening, you can shut down the scam before it starts.
- Pause before reacting. Even if the voice sounds familiar, take a deep breath and step back. Panic is what scammers rely on most.
- Verify the story. Call your relative directly using a known number—or contact another family member to confirm the situation.
- Set up family “safe words.” A shared code phrase can be a quick way to verify authenticity in emergencies.
- Be cautious with what you share online. Audio and video clips can easily be scraped to build voice clones.
- Use a trusted people finder to confirm identities. A reverse phone lookup can reveal who is behind a phone number, giving you details that help you verify the caller and check whether the number is tied to known scam activity.
The Bottom Line
Scammers have always tried to exploit emotions, but AI voice cloning has made their tactics far more convincing (and dangerous). As we head into the holidays, awareness and skepticism are your best defenses against fraud.
If you receive a call that doesn’t feel quite right, trust your instincts. Verify first, act later. The few minutes you take to confirm could save you hundreds, or even thousands, of dollars. And before you send money or share personal information, consider running a quick search to confirm the caller’s identity. Because this season, the best gift you can give yourself is peace of mind.
This story was produced by PeopleFinders and reviewed and distributed by Stacker.