FIRST AIRED: April 8, 2019

Transcript

00:00:00
We're entering an era in which our enemies can make it look like anyone is saying anything at any point in time. They're called deepfakes: digitally altered videos and images that are so lifelike they're virtually indistinguishable from the real thing. It's true. Take this one, made by researchers at Carnegie Mellon University, showing comedy show host John Oliver's facial expressions on the left seamlessly transferred onto a video of CBS host Stephen Colbert.

Images like these, increasingly easy to make, are planting deep fears in Washington. They use technology to create, essentially, a false reality. After a 2016 election plagued by alleged Russian interference, US officials are warning that deepfakes pose a new and potentially more dangerous threat as a fresh crop of candidates barrels toward 2020. Are we organized in a way that we could possibly respond fast enough to a catastrophic deepfake attack? We certainly recognize the threat of emerging technologies and the speed at which that threat increases. 2020 may be the point at which all but a real expert, or a technologically enhanced reviewing capability, would be fooled.

Former Homeland Security Secretary Michael Chertoff, who served under President George W. Bush, is co-chair of the Transatlantic Commission on Election Integrity and says deepfakes could disrupt future elections in myriad ways. Even right now, you can tell just by reading the news what kinds of things are surfacing from people's past, even from twenty or thirty years back, that are now viewed as disqualifying for office. Imagine if these aren't just historical relics but completely fabricated video images, and you can see the havoc that could wreak in our political process.

Deepfakes are a quantum leap from simpler techniques such as Photoshopping. The technology draws on artificial intelligence (AI) to create a fabricated image of a real person, with the potential to make them say or do anything. For instance, they could have me say things like... Consider this video of former President Barack Obama. It looks like him and sounds like him. Keep watching: President Trump is a total and complete... It's actually the voice of comedian and director Jordan Peele, his lips superimposed onto an old video of Obama. Moving forward, we need to be more vigilant with what we trust from the internet. Peele produced the deepfake last year along with BuzzFeed to warn the public that seeing is no longer necessarily believing.

It sounds like the stuff of Hollywood special effects artists, but cyber experts warn that the technology to make deepfakes is becoming more accessible to average users through easy-to-use programs, and some have already paid a price. I would say that about ten percent of the individuals who contact me are victims of deepfakes. Charlotte Laws, an anti-revenge-porn activist in California, says deepfakes originated on pornography sites a few years ago, the perpetrators targeting famous actresses and even non-public figures, ripping off images from their social media accounts. If you can get them to go viral, then they can just pop up everywhere and everybody sees it, and then it becomes very hard for the victim to be able to say, no, no, this isn't true, and you can just feel like you're overwhelmed and that you can never get the truth out there.

As the threat spreads to the area of national security, it's driving research in the private sector, academia, and even at the Defense Department to detect and root out deepfakes.
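The segment describes the underlying technique only in broad strokes. As a rough sketch of how many consumer face-swap tools work, the PyTorch example below shows the shared-encoder, two-decoder autoencoder idea: one encoder learns a person-agnostic representation of pose and expression, and each decoder learns to render one specific face from it, so encoding a frame of person A and decoding it with person B's decoder produces the swap. The architecture, image size, and names here are illustrative assumptions, not the Carnegie Mellon researchers' actual method; real systems add face detection, alignment, blending, and far larger networks.

```python
# Minimal sketch of the shared-encoder / two-decoder autoencoder behind many
# consumer face-swap tools (illustrative only; not the CMU researchers' method).
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, latent=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64x64 -> 32x32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32x32 -> 16x16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16x16 -> 8x8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self, latent=256):
        super().__init__()
        self.fc = nn.Linear(latent, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8x8 -> 16x16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16x16 -> 32x32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32x32 -> 64x64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

# One shared encoder learns "a face with this pose and expression"; each decoder
# learns to render one specific person from that shared representation.
encoder = Encoder()
decoder_a = Decoder()  # trained to reconstruct person A (the driving performer)
decoder_b = Decoder()  # trained to reconstruct person B (the target)

# After training each (encoder, decoder) pair on its own person's footage,
# the swap is: encode a frame of person A, decode it with B's decoder.
frame_of_a = torch.rand(1, 3, 64, 64)    # stand-in for a cropped, aligned face
fake_b = decoder_b(encoder(frame_of_a))  # B's face, driven by A's expression
print(fake_b.shape)                      # torch.Size([1, 3, 64, 64])
```

Training alternates between the two people's footage with a simple reconstruction loss, which forces the shared encoder to represent expression and pose in a way both decoders can use.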
It was pretty incredible, the changes that were happening. David Doermann launched the deepfakes program at DARPA, the Defense Advanced Research Projects Agency, in 2014. He's now a professor at the University at Buffalo and says both artificial intelligence and human analysis can do a lot to sniff out fakes, looking at everything from color modification to compression rates, to whether the subject is blinking in a natural way, even to whether the person would ever say those things at all. But in a world of rapidly changing technology, it's a constant game of catch-up. It's just like spam email. It's not clear that we're ever going to solve the problem completely, but we're going to get to the point where we make it expensive enough, expensive in many definitions of the word, that it's not worth it for people to be able to do this.

But Chertoff says technology can't be the only protection against a potential deepfake attack. A big part of this is educating people and teaching them to use their common sense and critical thinking, and if something doesn't make sense, even if you see it with your own eyes, maybe you should hit pause and do a little bit of research. It's a message he hopes voters especially will heed heading into 2020.
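Doermann's list of cues lends itself to simple automated checks. The sketch below illustrates just one of them, unnatural blinking, using the eye-aspect-ratio measure common in the facial-landmark literature; the landmark input, thresholds, and "normal" blink-rate range are assumptions for illustration, not DARPA's actual detectors, and a low blink rate on its own is a weak signal rather than proof of a fake.

```python
# Rough sketch of one detection cue mentioned above: checking whether the subject
# blinks at a plausible rate. Assumes per-frame eye landmarks have already been
# extracted (e.g., with an off-the-shelf landmark detector); only NumPy is used here.
import numpy as np

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks around one eye in the usual p1..p6 ordering.
    The ratio drops toward zero as the eyelid closes."""
    eye = np.asarray(eye, dtype=float)
    vertical = np.linalg.norm(eye[1] - eye[5]) + np.linalg.norm(eye[2] - eye[4])
    horizontal = np.linalg.norm(eye[0] - eye[3])
    return vertical / (2.0 * horizontal)

def blink_rate(ear_per_frame, fps, closed_thresh=0.21, min_closed_frames=2):
    """Count blinks as runs of consecutive frames with EAR below the threshold,
    then convert to blinks per minute."""
    blinks, run = 0, 0
    for ear in ear_per_frame:
        if ear < closed_thresh:
            run += 1
        else:
            if run >= min_closed_frames:
                blinks += 1
            run = 0
    if run >= min_closed_frames:
        blinks += 1
    minutes = len(ear_per_frame) / (fps * 60.0)
    return blinks / minutes if minutes > 0 else 0.0

if __name__ == "__main__":
    # People typically blink roughly 15-20 times a minute; a subject who almost
    # never blinks is worth a closer look, not an automatic verdict of "fake".
    open_eye = [(0, 0), (2, 1), (4, 1), (6, 0), (4, -1), (2, -1)]  # crude open-eye landmarks
    print(f"EAR of a sample open eye: {eye_aspect_ratio(open_eye):.2f}")  # ~0.33

    fps = 30
    ears = [0.30] * (fps * 60)           # one simulated minute with eyes open
    for start in (300, 900, 1500):       # three simulated blinks, four frames each
        for i in range(start, start + 4):
            ears[i] = 0.10
    rate = blink_rate(ears, fps)
    print(f"blinks per minute: {rate:.1f}", "(suspiciously low)" if rate < 8 else "")
```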