![NOVA](https://image.pbs.org/contentchannels/iAn87U1-white-logo-41-7WCUoLi.png?format=webp&resize=200x)
How Deepfakes Manipulate Reality
Clip: Season 51 Episode 5 | 4m 22s | Video has Closed Captions
Deepfakes are getting more convincing and easier to make.
Correspondent Miles O’Brien investigates the world of A.I.-enabled deepfake imagery.
National Corporate funding for NOVA is provided by Carlisle Companies. Major funding for NOVA is provided by the NOVA Science Trust, the Corporation for Public Broadcasting, and PBS viewers.
(dramatic electronic music)
- [Miles] Fakes are about as old as photography itself.
Mussolini, Hitler, and Stalin all ordered that pictures be doctored or redacted, erasing those who fell out of favor, consolidating power, manipulating their followers through images.
- Images have always been manipulated throughout history, but literally, you could count on one hand the number of people in the world who could do this.
But now, you need almost no skill.
And we said, "Give us an image of a middle-aged woman newscaster sitting at her desk reading the news."
- [Miles] Hany Farid is a professor of computer science at UC Berkeley.
- And this is your daily dose of Future Flash.
- [Miles] He and his team are trying to navigate the house of mirrors that is the world of AI-enabled deepfake imagery.
- Not perfect.
She's not blinking,
- Pretty good.
- but it's pretty good.
And by the way,
- For a quick half-day job, it's pretty good, yeah.
- he did this in a day and a half.
It's the classic automation story.
We have lowered barriers to entry to manipulate reality.
And when you do that, more and more people will do it.
Some good people will do it, lots of bad people will do it, there'll be some interesting use cases and there'll be a lot of nefarious use cases.
- Okay, so glasses off.
How's the framing, everything okay?
- [Miles] About a week before I got on a plane to see him--
- Hold on.
- [Miles] He asked me to meet him on Zoom so he could get a good recording of my voice and mannerisms.
- And I assume you're recording, Miles.
- [Miles] And he turned the table on me a little bit, asking me a lot of questions to get a good sampling.
- How are you feeling about the role of AI as it enters into our world on a daily basis?
- I think it's very important, first of all, to calibrate the concern level.
Let's take it away from the "Terminator" scenario.
(fire roaring) The "Terminator" scenario.
- Come with me if you want to live.
(dramatic cinematic music) - You know, a malevolent neural network, hell-bent on exterminating humanity.
A week later, I showed up at the UC Berkeley School of Information, ironically located in the oldest building on campus.
So you had me do this strange thing on Zoom.
Here I am.
What did you do with me?
- Yeah, well, it's gonna teach you to let me record your Zoom call, isn't it?
So yeah, by the way,
- [Miles] I did this with some trepidation.
- let me just say, I--
- [Miles] I was excited to see what tricks were up his sleeve.
- I uploaded 90 seconds of audio and I clicked the box saying, Miles has given me permission to use his voice, which I don't actually think you did.
(laughs) And I waited about, eh, maybe 20 seconds, and it said, okay, what would you like for Miles to say?
And I started typing, and I generated an audio of you saying whatever I wanted you to say.
We are synthesizing at much, much lower resolution.
This is--
- [Miles] You could have knocked me over with a feather when I watched this.
- Terminators were science fiction back then, but if you follow the recent AI media coverage, you might think that terminators are just around the corner.
The reality is that much--
- [Miles] The eyes and the mouth need some work, but it sure does sound like me.
- That doesn't mean that there isn't--
- [Miles] And consider what happened in May of 2023.
Someone posted this AI-generated image of what appeared to be a terrorist bombing at the Pentagon.
- Today, we may have witnessed one of the first drops in the feared flood of AI-created disinformation.
- [Miles] It was shared on Twitter via what seemed to be a verified account from Bloomberg News.
- [Newscaster] It only took seconds to spread fast.
- With the Dow now down about 200 points, the--
- Two minutes later, the stock market dropped a half a trillion dollars from a single fake image.
Anybody could have made that image, whether it was intentionally manipulating the market or unintentionally, in some ways it doesn't really matter.
- [Miles] How are we to know what's real anymore?
- How you look at the world, how you interact with people in it, and where you look for threats, all of that changes.
- [Miles] Generative AI is now part of a larger ecosystem that is built on mistrust.
- We're gonna live in a world where we don't know what's real.
- There is distrust of governments, there is distrust of media, there is distrust of academics.
And now, throw on top of that video evidence, (chuckles) so-called video evidence.
I think this is the very definition of throwing jet fuel onto a dumpster fire.
And it's already happening, and I imagine we will see more of it.
(minimal electronic music)
Explore the promise and perils of new A.I. technologies. (26s)
The Birth of Artificial Intelligence
The origins of modern AI can be traced back to World War II. (3m 24s)
How A.I. Is Fighting Wildfires
Firefighters are using AI to find wildfires before they explode. (1m 34s)
How Large Language Models Like ChatGPT Work
It’s not always easy to tell the difference between writing by a human and ChatGPT. (1m 44s)