
What to do about deepfakes: opportunities and problems as AI tech makes leaps and bounds

UNSW

Telltale signs to spot deepfakes are disappearing at an alarming rate, but a UNSW academic says these advances in AI also provide opportunities for a better, more accessible internet.

The boom in generative AI, with programs for creating ‘deepfakes’ now freely available to regular people, is causing just as many problems as it’s creating opportunities.

AI generators used to be awful at drawing hands (ironic, given how many humans struggle with them). They’re getting better at that. That weird, plasticky sheen over generated photos? That’s going away too. In one survey of more than 1000 people, 60 per cent of respondents thought a video made by OpenAI’s Sora program was real. See for yourself.

Cybersecurity and AI expert Professor Sanjay Jha says it’s fast becoming impossible for the average person to tell what’s real and what isn’t.

“We couldn't foresee what was coming out of AI 10 years back. If you interviewed anyone like me, they wouldn't be able to tell you.”

And while there’s danger in that, there’s also the potential for the technology to make the internet more accessible for everyone. Prof. Jha says he’s fully on board, pressing ahead with his own technology, patent pending, that could break down barriers for people with disability.

Sanjay’s technology

During COVID, Prof. Jha developed lower back nerve pain and couldn’t teach students to the best of his abilities.

“I was thinking…is there an alternate way that I can generate content so that it can be automated, and then I could probably use it for teaching by not spending hours on recording the videos.”

He started working on a prototype program that could make a digital version of himself without using lots of content (hours of footage, audio files) to save on time and energy usage.

I volunteered as a guinea pig for this article. Prof. Jha’s student, Wenbin Wang, took a minute of footage of me speaking into a webcam and created a digital clone of me teaching one of the professor's subjects.


A video comparing footage of the real me and the AI clone version. The clone is presenting lecture slides for a computer science class.

In the video above, you’ll see a side-by-side comparison of me and the AI version. Up close on the clone, you can see the unnatural sheen over my skin, the eyes flickering in and out, and mouth movements that don’t match.

The voice, while capturing my natural way of speaking, does not have the same highs and lows that come with human speech.

Those things may be obvious, but when the video shrinks to fit alongside the lecture slides and the audio is compressed by the recording, it becomes harder to tell. The clone took only three minutes to process.

“If we had hours of your recording, then we can definitely do a lot better,” Prof. Jha says, “but I would like to highlight that it was your voice and you were listening…an average person meeting you in casual settings would not be able to pick it up.”

Prof. Jha sees no problem with this tech if people are honest about it.

Musician FKA twigs recently told a US congressional inquiry that she had created an “AI twigs” that will interact with fans while she focuses on her art, but her written statement doesn’t say anything about whether people will know when they’re talking to the real her.

“You should tell your fans, for example, that this is my persona created by an AI agent,” Prof. Jha says, “and I'm not sure whether that's going to be terribly popular, but time will tell.”

The bright side of AI clones

Prof. Jha’s technology was made with accessibility in mind. He sees having a virtual clone as a game-changer for people who struggle with public speaking and presenting.

“There are people I know personally who have speech stutters. They're fantastic researchers and they could be great teachers if they don't have to speak for an hour or two in front of the class because they don't feel that confident. By using our technology they could produce automated lecturing.

“I think with disability also there are possibilities of using this tech for, say, sign language. There are numerous opportunities for multilingual capabilities. It could be doing translation of my speech in Mandarin or Hindi or German. And when they ask questions it could translate back to me in English.”

This idea was tested out in this year’s election in India. Prime Minister Narendra Modi’s party used AI to translate his speeches into several languages in a country that has more than 800 official and unofficial dialects. 

Defending against the dark arts of deepfakes

When it comes to tips for spotting malicious deepfakes so you don’t get tricked or scammed, Prof. Jha says there’s no longer much of a point.

“We need tools and techniques to detect that rather than relying on people.”

Australians are being scammed by fake TV shows. UK engineering company Arup lost millions to hackers in an elaborate video-conference scam. Last year, a fake photo of an explosion at the Pentagon went viral and caused a stock market dip.

Detection tools are available and improving but they’re scattered.

A recent study by the Reuters Institute found you can fool some of them by editing the content, doing things like lowering the resolution or cropping the image before submitting it for analysis (much like how my video became more believable when it shrank to fit the lecture slides).
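To make that concrete, here is a minimal sketch, assuming only the Pillow imaging library and hypothetical file names, of the kind of simple edit the study describes: lowering an image’s resolution and cropping its borders before it ever reaches a detector. It is an illustration of the idea, not any tool used in the study.

from PIL import Image

def degrade_before_analysis(in_path, out_path, scale=0.25, border=0.1):
    # Hypothetical helper: applies the two edits mentioned above.
    img = Image.open(in_path)
    w, h = img.size

    # Downscale then upscale back: fine detail is lost,
    # but the image keeps its original dimensions.
    small = img.resize((max(1, int(w * scale)), max(1, int(h * scale))))
    softened = small.resize((w, h))

    # Crop a border off each side (10 per cent by default).
    mw, mh = int(w * border), int(h * border)
    cropped = softened.crop((mw, mh, w - mw, h - mh))

    cropped.save(out_path)

# Example, with made-up file names:
# degrade_before_analysis("clone_frame.png", "clone_frame_edited.png")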

An internet industry body called the Content Authenticity Initiative has developed a watermark system called Content Credentials. They are voluntary tags that show details of how content was made and its edit history. TikTok will use them to label videos made with AI on its platform. Instagram now asks people to label things made with AI before they post.

From the legal side, the Australian government is hoping to pass legislation that punishes the sharing of non-consensual AI pornography with up to seven years in prison.

But audio might be the toughest nut to crack of them all. Deepfake audio is cheap to produce, and the free tools for detecting it aren’t great right now.

On top of that, a deepfake phone call scam would have the element of surprise.

“Your reaction time [in this situation] is impulsive, you're not going to search [for answers] and find out when you're panicking,” Prof. Jha says.

“We are in an era of active research in this kind of area and it's a cat and mouse game as usual.”

So what can you do? Prof. Jha’s best advice is to be wary online and to remember that a sense of urgency, in anything you’re asked to do on the internet, is a good sign something is off.

“Say your boss is calling you and asking you to transfer $200,000 into some account and you are the accountant in charge of the money…ask some questions and so forth to make sure that you get more context of it.

“Be vigilant. I would never ask people not to pay attention, always be suspicious, and if you have any doubts, do due diligence.

“Like any powerful tool, AI can be used for construction or destruction. The excitement of innovation must be paired with critical thinking.”


Key Facts:

A cybersecurity expert says deepfakes are evolving so fast that human detection is becoming obsolete and telltale signs are disappearing.

Professor Sanjay Jha says the technology used to create AI clones can also make the internet easier and more accessible for everyone.


Contact details:

For enquiries about this story or to arrange interviews, please contact Jacob Gillard:

Email: [email protected]

Phone: +61 2 9348 2511

