31/07/2023

Transcript
0:00
You blame me for interfering with your
0:02
democracy. It's not hard for democracy to
0:05
collapse.
0:06
These days it's not totally clear what
0:08
to believe and what not to. These videos
0:11
are fake. Liars have been around for
0:13
thousands of years; deepfakes are new, but
0:16
they're just lies in a slightly different
0:18
form. Ground control laws have the risk
0:21
of non-compliance. Deepfakes are
0:23
becoming increasingly easy to make and
0:25
hard to detect, and that's mainly because
0:27
of advances in generative AI. In theory,
0:31
anyone with a computer and the internet
0:33
can make a deepfake. There's this
0:36
potential for videos to deceive people.
0:39
There is also an example of a deepfake
0:42
of President Zelensky, and that was to
0:45
admit defeat to the Russians, about a
0:48
year and a couple of months ago.
0:52
a new one
0:54
Yet it's actually pretty hard for even
0:57
experts to work out whether a video is
1:00
real or not. It's a problem that Intel
1:03
claims to have solved by detecting blood
1:05
under the skin, so we've come here to
1:08
check it out.
1:09
[Music]
1:14
Deepfake videos are everywhere now; you
1:17
have probably already seen them.
1:19
Fortunately, Intel is developing several
1:22
solutions to detect deepfakes in real time,
1:25
like the one you are seeing right now,
1:27
because you are not the real academic, I
1:31
am.
1:32
Alright, you got me.
1:35
The real-life Ilke agreed to sit down
1:38
with the BBC and explain the unusual way
1:42
it works. If you only try to find the
1:45
wrong things,
1:46
sometimes they can be fixed and you
1:49
can no longer find the wrong things.
1:50
So we twist that question and we ask:
1:52
what is real about authentic videos,
1:55
what is real about us? So FakeCatcher
1:58
looks at that question in the sense of
2:01
looking at your heart.
2:03
When your heart pumps blood, our veins
2:06
are changing colors, and that color
2:08
change is called photoplethysmography, PPG
2:11
for short. We take those PPG signals from
2:14
many places on your face and convert
2:16
them into PPG maps, and then we develop a
2:18
deep learning approach on top of that to
2:21
classify videos as fake or real. In
2:24
short, FakeCatcher looks for minuscule
2:27
signs of blood flow in your face,
2:28
something a deepfake wouldn't have. It
2:31
also analyzes videos for authentic human
2:34
eye movement. Normally, when humans look
2:37
at a point, when I look at you, my eyes
2:40
are
2:42
converging on you, but for deepfakes
2:44
it's like googly eyes, they are like
2:46
everywhere.
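Ilke's description, PPG signals sampled from many places on the face, converted into PPG maps, and a verdict that builds up over the clip, can be sketched roughly as follows. This is a toy illustration under assumptions of my own (the region layout, window size, and scoring are hypothetical); Intel has not published FakeCatcher's implementation.

```python
import statistics

# Toy sketch of a PPG-style pipeline: sample pulse-like signals from several
# face regions, slice them into windowed "PPG maps", and accumulate
# per-window scores into one verdict. All parameters here are illustrative
# assumptions, not Intel's actual method.

def ppg_signal(frames, region):
    """Mean green-channel intensity of one face region, per frame.
    frames: list of 2-D grids of (r, g, b) pixels; region: (y0, y1, x0, x1)."""
    y0, y1, x0, x1 = region
    return [
        statistics.mean(px[1] for row in frame[y0:y1] for px in row[x0:x1])
        for frame in frames
    ]

def ppg_maps(signals, win=8, hop=4):
    """Slice the per-region signals into overlapping windows. Each window
    (one 'map') is what a learned classifier would consume."""
    n = len(signals[0])
    return [
        [sig[start:start + win] for sig in signals]
        for start in range(0, n - win + 1, hop)
    ]

def verdict(window_scores, threshold=0.5):
    """Mimic the accumulation seen later in the demo: early windows may look
    real, but the final call averages over the whole clip."""
    score = statistics.mean(window_scores)
    return score, ("fake" if score > threshold else "real")
```

For example, `verdict([0.2, 0.9, 0.9])` averages the per-window scores and returns a "fake" label, matching the demo's behaviour where a clip "looks real" at first and only accumulates to fake by the end.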
2:47
Intel claims the system is 96% accurate
2:51
and can work on all kinds of deepfakes,
2:54
so we decided to give it a go with
2:57
deepfakes generated by MIT. I had the answers
3:00
on my phone.
3:02
Liability protections for companies are
3:05
more important than individual financial
3:07
relief for teachers or sanitation
3:09
workers. So in the beginning, when it sees
3:12
very little, it may say that, okay,
3:14
this PPG, it looks real, and
3:17
then it accumulates and assesses it as
3:19
fake. See, it turns into fake. Yeah, it
3:23
finds it at the end: 84%
3:26
accuracy, fake. Okay, interesting. So
3:29
that was correct. Today,
3:31
as the sitting president in the White
3:33
House, I still believe that marriage
3:35
should be between a man and a woman. I am
3:39
a traditionalist. Okay, really? It's just
3:41
that I'm a traditionalist. What does
3:44
FakeCatcher say? Fake, with 66
3:46
percent accuracy this is fake. Yeah, yeah,
3:52
yay! Okay, next one. I'm enjoying this, are you
3:55
enjoying this? Yeah, this is good. The
3:58
system was good at finding fakes but not
4:00
so good at working out that a video was
4:02
real. I'd be working with the leaders of
4:04
Congress now today. Yeah, okay, I think
4:09
fake. It's saying fake. So what is it
4:11
again? Three, uh, three BCD.
4:14
That is actually real. Really? That's a
4:17
real one. Okay, so
4:19
then, um, that means the PPG signals
4:21
are broken at some point, so, okay. Here's
4:24
a problem with the system: it doesn't
4:26
analyze for audio yet, so often videos
4:29
that seemed fairly obviously real were
4:32
still labeled as fake. ...and
4:37
grow the country.
4:39
Real. Yep, that one's real. I thought so
4:42
too.
4:43
Because, just thinking about it, I guess the
4:45
worry would be that if it's in real time,
4:46
you say something's fake and
4:49
it's actually real, and you
4:50
could actually be the one spreading fake
4:51
news. Yeah, that's also true, but,
4:53
like,
4:54
um, verifying something as fake versus "be
4:57
careful, this is, like, with 50 percent
4:59
accuracy, maybe fake" is like a different,
5:03
treated differently. Yeah. Deepfakes are
5:06
going to become more and more of a
5:08
problem. Perhaps FakeCatcher will be
5:11
part of the mix of tools people use to
5:13
catch them, but it's still by no means
5:16
the finished article. I beat China all
5:19
the time, and of course they're mad.
5:21
They're very, very mad.