It’s the crack of dawn on a Sunday morning and phones around the nation, soon to be around the world, are lighting up. A video of a Premier League footballer caught in an extremely compromising position has been leaked on social media. The player has already been on Twitter and swears it isn’t them. You’ve seen the footage, you’ve been here before, it’s clearly your player and you need to make a call. But then the footballer’s management get in touch: they say the player wasn’t there; they say it’s a deepfake. 

You may have heard the term “deepfake” bandied around over the last few years, often framed as a serious threat to our democracy. But could deepfakes also be a threat to those in the sports sector and is there any light at the end of a (potentially very dark) tunnel?

What are deepfakes? 

In a nutshell, the term “deepfake” refers to a media-manipulation technique that can be used to stitch a person’s likeness and voice into pre-existing video in which they never actually appeared, making it seem that they are doing and saying things they haven’t. One example is this deepfake video of “Boris Johnson” endorsing Jeremy Corbyn for Prime Minister.

The “deep” bit comes from its use of a subset of artificial intelligence (AI) called “deep learning”: a machine learning technique in which a computer system learns from large amounts of data using layered neural networks that loosely mimic the way the human brain processes information.

The more images or video clips of a person’s face and voice that are fed into the system, the more realistic and seamless the resulting deepfake. With reams of photos and videos of athletes in all sports readily available, there is a real risk across the sector, with the most famous sports stars particularly vulnerable.
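That learn-from-examples dynamic can be illustrated with a toy sketch in plain Python (NumPy). This is emphatically not a real deepfake pipeline: it trains a tiny linear “autoencoder” to compress and reconstruct synthetic 16-dimensional “face” vectors, and shows its reconstruction error falling with training. All names and numbers are illustrative assumptions, but the principle is the miniature version of what makes heavily photographed athletes easier to fake convincingly.

```python
import numpy as np

# Toy sketch only: a tiny linear "autoencoder" trained by gradient
# descent on synthetic "face" vectors. Real deepfake systems use deep
# neural networks and vastly more data; the underlying principle is the
# same: the model learns to reconstruct faces from examples it has seen.

rng = np.random.default_rng(0)

DIM, LATENT = 16, 4  # "face" vector size and hidden code size (assumed)
basis = rng.standard_normal((LATENT, DIM)) / np.sqrt(DIM)

def make_faces(n):
    # Synthetic faces: random mixes of a few shared underlying features.
    return rng.standard_normal((n, LATENT)) @ basis

def recon_error(data, enc, dec):
    # Mean squared error between the faces and their reconstructions.
    recon = (data @ enc) @ dec
    return float(np.mean((recon - data) ** 2))

faces = make_faces(500)
enc = rng.standard_normal((DIM, LATENT)) * 0.1   # encoder weights
dec = rng.standard_normal((LATENT, DIM)) * 0.1   # decoder weights

err_before = recon_error(faces, enc, dec)

lr = 0.1
for _ in range(500):
    code = faces @ enc                   # compress each face
    err = code @ dec - faces             # reconstruction residual
    # Gradient descent on the mean squared reconstruction error.
    grad_dec = code.T @ err / len(faces)
    grad_enc = faces.T @ (err @ dec.T) / len(faces)
    dec -= lr * grad_dec
    enc -= lr * grad_enc

err_after = recon_error(faces, enc, dec)
print(f"error before training: {err_before:.4f}")
print(f"error after training:  {err_after:.4f}")
```

A real system swaps these linear maps for deep convolutional networks trained on thousands of frames, which is why the volume of publicly available footage of an athlete matters so much.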

Potential for foul play

Today’s deepfakes are still relatively easy to detect by the human eye. In the scenario with the footballer above, once you had watched the video it’s highly likely that you would have been able to tell that something wasn’t quite right. But by that point, with the video going viral and becoming a topic of national conversation, much of the reputational damage to the player and club may already have been done. As the technology improves, it is likely to become much harder to distinguish between what is real and what is fake, adding a time-consuming and expensive layer to any investigation into player misconduct should the player cry “deepfake”.

However, deepfakes have the potential to cause much more harm than reputational damage. University College London recently ranked deepfakes as the most worrying use of AI in its potential applications for crime or terrorism. As we reported last month, the sports sector has been identified as a target for cyber attackers, and there are two key ways in which deepfakes could become the newest and shiniest tool in the cyber attacker’s arsenal:

Blackmail
The flip side of our football star alleging that a compromising video is a deepfake is a scenario in which the video is indeed fake and the player is approached prior to its release with the threat that it will be published unless they hand over large sums of money. While extremely wealthy sports stars are more likely to be targeted in this way, athletes at all levels are potentially at risk. For example, it is feasible that a professional cyclist in one of the lower tiers could be threatened with a deepfake video in which they “admit” to doping before a race, and asked to hand over a smaller amount to avoid the risk of being disqualified from competing pending investigation.

Spear phishing
It’s not just individual athletes who can be targeted: clubs, federations and businesses across the sports sector are also at risk. Last year, deepfake technology was alleged to be behind a spear phishing scam in which a UK CEO in the energy industry was tricked into transferring £200,000. The CEO received a phone call, apparently from his boss at a parent company. The voice at the other end of the line sounded identical to that of his boss and told him to make an urgent transfer of money. However, the voice was not his boss at all, but potentially a deepfake created to defraud the company.

With working from home having become the norm for obvious reasons this year, the potential for deepfakes being used in this way has rocketed and all levels of employees in the sports sector, already a target for cyber attackers, need to be aware of the threat.  

Potential for fair play, or even for good sport

As mentioned above, much of the coverage of deepfakes frames the technique as a major threat to society. However, as the technology improves and becomes better understood, we are beginning to see ways in which it can be used commercially, and even for moral good. In fact, this has been particularly the case in the sports sector.

Beginning with the good, deepfake technology was used in this charity video by Malaria No More to enable David Beckham to “speak” in nine different languages. In a behind the scenes interview Beckham reportedly said (in his own voice): “It’s great to be involved in something where the tech side of our lives and our world get involved, to be one voice of many different people.” 

There’s clearly huge potential for deepfakes to be used in endorsement and sponsorship deals. Not only will it be possible for sports stars and coaches to “speak” in different languages without the need for subtitles or interpreters, there’s also a theory that celebrity athletes could license their “personal deep network models” so that deepfake footage can be created without the need to travel to a video shoot. In fact, Hulu recently launched a clever new National Football League campaign using deepfake technology to circumvent, and poke fun at, the limitations placed on production by Covid-19, stitching the heads of American football players onto less athletic bodies.

Of course, image rights are not formally recognised in the UK, and the copyright in any image or video taken of an athlete will be owned by the person who created it, not by the athlete. The question of how “personal deep network models” would be licensed, and by whom, is therefore a complex one. Although under English and Welsh law “passing off” may assist athletes in cases where deepfakes are used to falsely suggest they endorse a product, we may yet require the development of new law to protect an individual’s right not to be deepfaked without their consent.