
Deep fakes and artificial media are coming for your business, researchers warn

Executives are increasingly likely to become victims of synthetic media attacks, or deep fakes, including extortion, blackmail and theft of intellectual property, according to research from the Queensland University of Technology.

Jun 02, 2023, updated Jun 02, 2023
QUT researchers have warned of the threat from deep fakes (Image: QUT)

According to PhD researcher Lucas Whittaker, artificial intelligence-generated content, such as deep fakes, was now so realistic that humans could not distinguish it from authentic media up to 50 per cent of the time.

He and his co-researchers analysed those risks and produced a playbook to help organisations prepare for synthetic media attacks.

“Inauthentic, yet realistic, content such as voices, images, and videos generated by artificial intelligence, which can be indistinguishable from reality, enable malicious actors to use synthetic media attacks to extract money, harm brand reputation and shake customer confidence,” he said.

“Artificial intelligence can now use deep neural networks to generate entirely new content because these networks act in some ways like a brain and, when trained on existing data, can learn to generate new data.

“We are getting to the point where photorealistic images and videos can be easily generated from scratch just by typing a text description of what the person looks like or says.”
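As an illustration of how low that barrier has become, the short Python sketch below generates an image from a one-line text description using the open-source Hugging Face diffusers library. The model name, prompt and output file are illustrative assumptions and are not drawn from the QUT research.

```python
# Minimal text-to-image sketch using the open-source Hugging Face "diffusers" library.
# The model name, prompt and output file name are illustrative assumptions only.
from diffusers import StableDiffusionPipeline

# Download and load a pretrained text-to-image diffusion model.
pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
pipe = pipe.to("cuda")  # move to a GPU if available; omit to run (slowly) on CPU

# A one-line, plain-English description is enough to produce a photorealistic image.
prompt = "a photorealistic headshot of a company executive giving a press statement"
image = pipe(prompt).images[0]
image.save("synthetic_headshot.png")
```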

Mr Whittaker said bad actors could collect voice, image or video data of a person from various sources and feed it into neural networks to mimic, for example, a CEO’s voice over the phone to commit fraud.
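To give a sense of how accessible that capability is, the sketch below uses the open-source Coqui TTS library, whose publicly released XTTS model can condition synthesised speech on a short recording of a target speaker. The file names and spoken sentence are hypothetical placeholders, not material from the research.

```python
# Minimal voice-cloning sketch using the open-source Coqui TTS library and its
# publicly released XTTS v2 model. File names and the spoken sentence are
# hypothetical placeholders, not material from the QUT research.
from TTS.api import TTS

# Load a multilingual text-to-speech model that supports voice cloning.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# A short clip of the target speaker (for example, scraped from a public
# interview) is enough to condition the synthesised speech on that voice.
tts.tts_to_file(
    text="This sentence is spoken in a cloned voice for illustration only.",
    speaker_wav="public_interview_clip.wav",
    language="en",
    file_path="cloned_voice_sample.wav",
)
```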

Co-researcher Professor Rebekah Russell-Bennett said every CEO or brand representative who has a media presence online, or has been featured in video or audio interviews, must brace themselves for impersonation by synthetic media.

“Their likeness could be hijacked in a deepfake that might, for example, put the person in a compromising situation or their voice synthesised to make a false statement with potentially catastrophic effects, such as consumer boycotts or losses to their company’s share value,” she said.

The guide, Brace Yourself? Why managers should adopt a synthetic media incident response playbook in an age of falsity and synthetic media, has been published in the journal Business Horizons.

 
