In March, the FBI released a report declaring that malicious actors "almost certainly" will leverage "synthetic content" for cyber and foreign influence operations in the next 12 to 18 months. Synthetic content includes deepfakes: audio or video either created or altered by artificial intelligence or machine learning to convincingly misrepresent someone as doing or saying something they never actually did or said. Have you seen more synthetic content recently?

Author’s note: This is not a sponsored post. I am the author of this article and it expresses my own opinions. I am not, nor is my company, receiving compensation for it.

About Shelly Palmer

Shelly Palmer is the Professor of Advanced Media in Residence at Syracuse University's S.I. Newhouse School of Public Communications and the CEO of The Palmer Group, a consulting practice that helps Fortune 500 companies with technology, media, and marketing. Named LinkedIn's "Top Voice in Technology," he covers tech and business for Good Day New York, writes a weekly column for Adweek, is a regular commentator on CNN and CNBC, and writes a popular daily business blog. He's the co-host of the award-winning podcast Techstream with Shelly Palmer & Seth Everett, and he hosts the Shelly Palmer #CryptoWednesday Livestream. Follow @shellypalmer or visit shellypalmer.com.

