Technology has blurred the line between truth and fiction; and with so much misinformation circulating in today’s world, the deepfake is the ultimate lie, one that keeps getting frighteningly better. Dr Matthew Beard, a philosopher at The Ethics Centre in Sydney and author of Ethical Design: Principles of Good Technology, discusses the rights and wrongs of deepfakes in an interview with Siddhartha Mishra. Excerpts:
There are deepfake videos of Indian actors in pornographic content. The porn industry embraces and incorporates technology faster than any other. Political figures in the West have been faked. In India, forget deepfake videos, even rumours on WhatsApp or old, selective footage circulated as new lead to mob lynching. Where do you see this kind of misinformation spreading next?
Technologies become even more complicated to manage today because they’re released into an already complex ecosystem. Deepfakes are hard to get a handle on on their own; so are encrypted platforms like WhatsApp. Put the two together and you’ve multiplied the ethical complexities. This makes it really hard for regulators, designers and the general public to do a good ethical cost-benefit analysis. It’s hard to know what the tradeoffs are when the uses for a technology can be so widespread. Personally, I think video manipulation, whether deepfake or not, will be the next wave of fake news. So the question is: what are we going to do about it?
An argument being made is that “people believe what they want to believe” anyway and that deepfake is just content that thrives along with the likes of a Breitbart in the US, for example. What are your views?
I don’t have much time for that kind of thinking. It’s a red herring. Of course, people’s instinct is to prefer evidence that supports their views. That doesn’t make it OK to lie to them or manipulate them on that basis. We should be concerned that people are taught to make good, reasonable decisions on the basis of evidence and we should also try to stop them from being manipulated. It’s not an either-or situation; we can do both.
What could be the way forward to tackle deepfake aside from relying on tech? When a ‘big story’ breaks, even journalists go with the flow, so to speak, sometimes.
There’s no easy answer here. Technology is going to form part of the solution, but it can’t be the only weapon in our arsenal because technological fixes can only come after the fact. We need to wrestle with the fact that deepfakes were enabled by open-source technology. When should tech be made open for everyone to use? How can we manage the risks? Do the benefits outweigh the costs?
Would you put deepfake in the same genre of misinformation as fake news?
It’s different on the level of complexity and in terms of how accurate it can be. It’s also different in its potential impact, as we know more people engage with video than with other media, like text. But in general, the risks, and many of the causes, are going to be the same as with fake news.
Speaking of deepfakes, you say, “We need to ask why someone thought it would be a good idea to begin with.” Is it because we fiddle with new tech when we come across it, while tech advances too fast for us to keep pace or let things even out?
Again, there’s no easy answer. I think part of it is because technology invites us to think about a world where our choices and powers are limitless. We’re trying to overcome the ordinary barriers life throws at us. But one of the barriers life presents to us is ethics, the things we shouldn’t do because they’re wrong. It’s easy for those who are keen to innovate to be sucked into seeing ethics as just another unnecessary hurdle. Another problem is that tech designers can sometimes fail to properly understand the human effects of what they’re building. Good design is centred on the people who will be affected by the technology. The people in their homes making fake porn videos of celebrities weren’t considering how that would affect those women; they were just doing it because they could.