What is Your Digital Evidence Worth? Photoshopped Contracts, Doctored Recordings and Deepfakes

26/03/2021

"In a digital world, everything can be manipulated..."

In this episode (available here), Amber Boothe and Kelsey Farish discuss how the rise of Photoshopped contracts, doctored recordings, and deepfakes can pose serious challenges for individuals and businesses. Although this type of tech is often used in amazing and inspiring ways, we are entering a world where your words, your voice, and even your face can be edited and emulated.

Kelsey is a Media and Technology Lawyer at DAC Beachcroft and is a leading voice on the legal implications of 'synthetic media'. She recently appeared on Channel 4's 'Dispatches' ("Deep Fakes: Can you trust your eyes?") and has written a chapter on Ghost Acting and Deepfakes in The European Audiovisual Observatory's special report.

Kelsey's blog features a wide range of fascinating articles and material - she even has a dedicated section for law students: https://kelseyfarish.com

You can also follow Kelsey's groundbreaking work on Twitter @KelseyFarish.


This episode was produced and edited by Amber Boothe.

The Legal Bytes Podcast is a Queen Mary University of London Project produced by Postgraduate Law Students in collaboration with the Technology, Media and Telecommunications Law Institute and in association with qLegal.

Resources:

Recording Software: https://cleanfeed.net/

Editing Software: https://www.audacityteam.org/

Text to Speech Sound FX: https://www.lovo.ai/

Music: Wondering in Space by Dicotomica from https://pixabay.com/

Transcription: https://otter.ai  


Full Transcription

Introduction 0:00

In a digital world, everything can be manipulated... Say you run a business. You're careful with your receipts, and on top of your contracts, you do everything right, so that if the day comes that you have to enforce your legal rights, you have the paperwork to back them up. And that day does come. When you pull out that contract, the person on the other side of the courtroom produces a different one. Same dates, same electronic signatures, but different insurance terms. They have a recording as well, of a phone call you know never happened. But it's unmistakable: it's your voice, agreeing to their terms. When your words, your voice, and even your face can be edited and emulated, how can anyone tell the truth from a lie?

Amber 0:59

On today's episode of Legal Bytes, we'll be talking about Photoshopped contracts, doctored recordings, and deepfakes. Completely synthetic recordings like the one you just listened to are cheap, easy to produce, and getting more realistic by the day. So what does this all mean when it comes to your commercial and your personal rights? We've got some practical steps you can take as well as some bigger questions to consider. I'm your host, Amber, and my guest today is a leading commercial lawyer and an expert on synthetic media. I think it's fair to describe her as prolific when it comes to publishing thought-provoking work. Without further ado, I'm thrilled to welcome Kelsey Farish to the show.

Kelsey 1:40

Thank you so much for having me. Hi.

Amber 1:42

It's absolutely fantastic to have you. And after such a dramatic introduction, I think my first question will definitely have to be: what do you think the threat level is? If I'm a small business in small claims court, how likely am I to come up against these kinds of new issues, these doctored recordings and Photoshopped contracts?

Kelsey 2:01

Yeah, it's a really interesting question, and it's something that I think is going to become much more of an issue in years to come. One of the easiest ways to break down the risk and the threat is to look at it from three different perspectives. The first is societal risk. That's when we're talking about things like manipulated videos and contracts that pose harm to society: they distort our notions of truth and justice, the court system, elections, and so on. The second is risk to businesses. There I'm thinking about things like falsified records and contracts put before insurance companies, for example, or businesses where someone is manipulated into transferring funds when they shouldn't. And the third is risk to the individual. That's really more of what you described in the intro: someone saying, hey look, I have a doctored contract, or I have a photograph of you doing something that you didn't do. There can also be risk to your reputation and to your emotional well-being. So if we step back and look again at societal risk, business risk, and risk to the individual, I'm pretty confident in saying that the possibility is very much there. The probability at the moment, the likelihood of you being tricked into thinking that you've signed something when you didn't, or vice versa, I think is relatively small. But I do think it's going to grow in the years to come.

Amber 3:46

I think you already touched a little bit in the intro on that risk to the person. And obviously, in this wider conversation, we've definitely got to touch on those more insidious forms of trickery that go beyond Photoshopping contracts to emulating people's faces and voices. So it'd be really great if you could give an overview of what deepfakes are, and how they're created, for anyone who hasn't really come across that term yet.

Kelsey 4:16

Sure. So when we say deepfake, what we're talking about is a synthetic or AI-generated video in which someone's face is essentially superimposed on top of someone else's body. What happens is you look at this image, this video, and it's so realistic that there's really no obvious way to tell that it's fake. The term deepfake comes from deep learning, which is a type of machine learning or artificial intelligence, and fake, well, because it's a fake video. What happens is you basically have a machine learning algorithm that is trained over time. It's shown millions and millions of pictures of human faces, and it learns the shape and outline of the face, how shadows appear on it, and so on. It learns a certain sequence or pattern. And so in due course, all you have to do is essentially show it a selfie, or a couple of selfies, or a video clip of someone, and using its previously acquired knowledge, its learning, it can then generate a really realistic video using the face that you showed it.

Amber 5:27

Yeah, and I think a point that is important to touch upon here is that you gave a great explanation of how it's created, and that does lead to a lot of questions of how it can be abused. But at the same time, this technology can actually be really fun and really helpful. I mean, personally, at least on Snapchat, those filters are a hell of a good time.

Kelsey 5:49

Absolutely, they are fun. And I'd be lying if I said that I haven't tried them myself, and I say that as someone who's been studying this for, gosh, almost three years now. Yeah, they're a lot of fun, you get a laugh out of it, and usually you can use them for very innocent purposes. But you're absolutely right, they can be used for helpful and scientific purposes as well. For example, some museums have used them to bring people who have passed away back to life. And the reason why they do this is they found that when visitors interact and engage with historical figures, people who died in the Holocaust, for example, that's a notable example, the learning and the impact that you take away with you is really profound. There's just something special about the human face and hearing the voice. We as humans have evolved to be able to identify people by their faces very easily; there's just something very powerful and very emotive there. And when you use that, you can do things like helping people with Alzheimer's. It's easy to imagine a situation whereby a person in their late seventies doesn't recognise the wife who's before them, but once you create a deepfake of what their wife looked like 30 or 40 years ago, suddenly their memory comes back for a few short moments. That can be really special and really important, and certainly useful for medical research purposes and other educational purposes.

Amber 7:27

Yeah, I think it is always really important to touch upon how it's not the technology that is evil, or that's going to be defrauding people; it is other people. I think that definitely leads on to the question of: what are those legal problems, even if maybe we're not facing them right now? What is that future going to look like?

Kelsey 7:51

Yeah, so one of the things that I find really fascinating is just looking at the last 10 to 15 years or so and the ways in which we are increasingly living online. And I don't just mean online shopping and online banking; I mean forging these communities with other people across boundaries and across time zones. So much of that comes down to how we present ourselves and this digital persona that we're creating. Now, we've seen the press pick up a little bit on certain troll farms, where you have these actors in other countries creating fake profiles of individuals, who then go on Twitter and stir up mischief. That's a pretty simple form of faking a personality in order to get a desired outcome, whether it's people voting a certain way, or people sending bank details, or what have you. But let's look at this a couple of years from now, where it's not just a Twitter profile that can be faked, but a Zoom call. You can imagine more people working from home and engaging in things over the internet. Do you really have to go into the office anymore? And if you don't, well, let's think about having a Microsoft Teams meeting or a Zoom call with your boss: what if someone just recorded it the day before, knowing roughly what he was going to say, or operated it remotely? There are so many different crazy ways that this could stir up a lot of problems. My particular area of expertise is image rights. That basically means the rights that every person has over how their likeness or their appearance in photos and videos is used in a commercial context. That's a really fancy way of saying publicity and privacy law.
So that's my area of specialism, but in a criminal context, for example, if you had someone impersonating another person for fraudulent reasons, trying to get money through extortion, it's safe to say from my perspective that a lot of the traditional legal tools we have at our disposal just haven't really matured together with the technology. You can take good old-fashioned criminal fraud law, I guess, and apply it to fake videos, but then you come to the bigger question of, well, how do you prove it's fake? What is the judge going to think? We're so used to trusting CCTV, for example, that if you see a video of something, gosh, do you really want to spend hours and hours trying to verify it? I'm not sure, and I'm not sure that we're ready to do that at scale.

Amber 10:42

I completely agree with your point that the law is lagging behind the technology. And I guess it's that lack of legal protection that raises the big question: what can people do to protect themselves? Now, I promised some top tips when it comes to fake contracts and recordings. But before I sum that all up and round off the episode, it'd be great to get your advice on what people can specifically do about deepfakes, because I know that a lot of those same measures apply.

Kelsey 11:07

I think the first piece of advice or recommendation I would give is actually the hardest one, and it's this: just try to keep the photos of yourself, the selfies, and especially pictures of your kids behind a private account. Really query whether or not you need 15,000 images of yourself on your public Instagram profile. The reason for that is that good deepfakes are made accurate and convincing when they have quality source images. If you only have a couple of selfies here or there, the likelihood is someone is not going to be able to make a fantastic deepfake out of you. But if you have a lot of really good images of yourself, then it's very easy to plug those into the algorithm I described earlier and to generate something. You can also try watermarking images, putting emojis on top of things, making them a little less attractive to use. And there are also some third-party solutions, whereby you can purchase a subscription and have someone keeping an eye on your digital assets for you, whether that's running checks to see if images of you appear out on the web, or other things like that. But definitely, prevention is the best cure.

Amber 12:36

Each of those pieces of advice you just gave really captures the three take-home messages when it comes to protecting digital assets, whether that's contracts, phone calls, or photos. One: keep as much of it private as possible. Voice recordings, videos, pictures, signatures: if you possibly can, try to make sure those aren't up on Facebook or Instagram where people might be able to copy them. Two: watermarks are a useful measure but an imperfect solution overall; putting them on your documents, your photos, your contracts can be a really useful first line of defence. Third and finally: third-party solutions. Have a look into these if you feel like you or your business are at high risk. Now that's all from me, so all that's left to say is an absolutely massive thank you to Kelsey for joining me today.

Kelsey 13:22

Thank you so much for inviting me on today.

Amber 13:24

Before we end, for anyone looking to know more about deepfakes, how can they keep up with you and what you're working on?

Kelsey 13:30

Probably the best way is just on Twitter; Kelsey Farish is my username. I also have a blog where I talk about deepfakes and anything related to media and entertainment law, and that's just KelseyFarish.com.

Amber 13:43

Fantastic, and I will link all those things up in the show notes. To our listeners, thank you for listening.