Not so many moons ago, seeing was believing. Today, however, that assumption no longer holds. Advances in artificial intelligence now allow highly convincing fake videos, images and even voice recordings – known as deepfakes – to be created with alarming ease. While the technology has legitimate uses in film and entertainment, it’s increasingly being exploited by scammers targeting everyday Australians, including retirees.
Deepfakes work by training AI systems on real images, video or audio of a person. The result can be a fabricated clip that appears to show someone saying or doing something they never actually did. In some cases, scammers are cloning voices from just a few seconds of audio, often scraped from social media or voicemail recordings.
Why it matters
For retirees, the risk is not theoretical. A growing number of scams involve deepfake phone calls or videos impersonating family members, bank staff or even government officials. Imagine receiving a distressed call that sounds exactly like your grandchild asking for urgent financial help. The emotional pressure, combined with the realism of the voice, can override even the most cautious instincts.
Spotting the signs
While deepfakes are improving, they’re not flawless. Look for subtle inconsistencies:
- Unnatural facial movements – lips slightly out of sync with speech or odd blinking patterns
- Visual glitches – blurring around the face, especially when the person turns their head
- Strange lighting or shadows – mismatched with the environment
- Audio irregularities – flat tone, unusual pacing or slight digital distortion
Equally important is context. Ask yourself: Does this message make sense? Is the request urgent, secretive or out of character?
Practical safeguards
The most effective defence against deepfakes is a combination of awareness and simple habits:
- Pause before acting – urgency is a key tactic in scams
- Verify independently – call the person back on a known number
- Use a family ‘code word’ – a simple agreed phrase to confirm identity in emergencies
- Limit what you share online – especially clear audio or video clips
Banks and authorities, including Scamwatch, consistently emphasise that legitimate organisations will never pressure you into immediate financial decisions over the phone or via video.
Staying confident, not fearful
Technology will continue to evolve, and so can your confidence in using it. The goal is not to distrust everything you see or hear, but to apply a healthy level of verification when something feels off.
In a world where reality can be convincingly simulated, a moment of pause – and a quick double-check – remains your most powerful tool.