Dean Russell: Thank you for your evidence today. I will be frank. Last week, we interviewed Facebook. I came away from that session feeling that it was not really committed to the safety levels that we have heard about in our sessions. Call me cynical, but the timing of its Meta announcement was quite interesting. It happened within probably minutes of our session, where we were asking it about safety. It then put out what I thought was a rather toe-curling and cringeworthy video talking about safety, just after we had heard much evidence about the fact that the safety is not there. I would like to get your sense of how much actual accountability you think the platforms will take. I know you talked about it just now, but do we need to really toughen up this Bill so that individuals within Facebook and the other platforms feel the pain if they cause harm to other people?
Dame Melanie Dawes: There are a couple of areas where I would slightly toughen the Bill. One of them is something I mentioned earlier, which is requiring them to engage with external researchers. There is an opportunity for the regulator to set some terms for that, with accredited researchers. There is a slight risk that the European Union will make that a requirement in its Bill, and therefore British research groups will not get the same cut of what could be a growing market. It would disadvantage UK users if we do not have the same powers as are going to come in the EU. That would be a toughening, and I think it would go to the heart of some of the engagement that needs to change. We see some platforms opening up—Google, for example, with its centre in Dublin, where people can go in and engage with it on its algorithms and how they are designed. We see Twitter publishing research from when it has had external researchers looking at whether or not it is recommending content from different sides of the political spectrum. That sort of engagement with external researchers and others, while it still has further to go, is what we want to see from all the platforms. As I say, it is cultural as much as anything.
Dean Russell: Within that, we have heard evidence where even in a public body like this, with MPs who are looking to change legislation around online safety and online harms, platforms still have not taken down certain content. I will not give examples that have been given before by colleagues, but even in this very public and visible platform they still have not removed content. My concern is about the immediacy that will come off the back of the Bill and the powers that you have. Do you think there will be immediacy of change? The football example was used earlier. There is also the more damaging one of flashing images being shown, or sent directly, to people with epilepsy. Do you think this Bill will address that, or is it going to be two or three years down the line before we start to see any actual change?
Dame Melanie Dawes: What we would like to try to achieve, in the years before the Bill comes into force, is a deepening of our conversation with the platforms that we are not already regulating. We are already working in a formal way with quite a lot of the platforms, particularly those where young people spend a lot of their time—TikTok, Snapchat, Twitch and so on—so we are well placed. In fact, we will be writing to them all by the end of this year to confirm the steps we are looking for from them in the coming 12 months. That is already happening with some of the competitors to the bigger platforms, to be honest, those such as TikTok, which is already biting away at their audience share, particularly the younger audience share. I think that will have an impact.
What we want to do beyond that in relation to this Bill is to begin to have a deeper conversation—we have not really started the regulatory conversation yet—during 2022 and 2023 about how we are all going to get ready. In relation specifically to the sending of flashing images to people with epilepsy, the Law Commission has made recommendations to make that illegal. It seems extraordinary that it is not already, or that anyone would do it. That has the potential to come clearly under the scope of the Bill. It is the sort of specific example that we could begin a conversation on very quickly indeed.

Dean Russell: In the practicalities of this, in three years’ time this Bill comes in, in its current form. In a real-world example, if somebody sends flashing images to a child with epilepsy, what would the process be at that point? Would someone report it to Ofcom? Would they report it to the platform? How would you make sure that that bit of content—flashing images as an example—comes off that platform immediately?
Dame Melanie Dawes: First, I hope we would know about it. There are a number of ways in which that can happen. People can tell us. We may not be able to deal with individual complaints, but they can certainly tell us if they have a concern. We can follow that up straightaway with the platform. It is possible that we might find, in this case, maybe the Epilepsy Society or a similar organisation ready to make a super-complaint if things move fast. If it is illegal content or clearly harmful, particularly if it is in relation to children, the Bill is very clear that that sort of thing is not within what is acceptable, and I think we have the ability to get on the phone very quickly about that. That may sound a bit unspecific, but that is the starting point: “What is going on? What have you done? What have you changed? What are your terms and conditions? Why is this still happening?” We would expect action.
Dean Russell: If they do not take that action, what then?
Dame Melanie Dawes: Then we can go down an enforcement route. As the Bill is coming in, obviously there are a number of bits of it that need to be put in place. One of the reasons why we are very pleased that we have already had a lot of resourcing from the Government to begin our preparation is that we can do as much as possible in advance of Royal Assent and the various stages after that. Until we actually have the formal powers we cannot use them, but we can certainly anticipate them in the way we engage with the platforms on issues that are so problematic—pretty quickly, I hope.
Dean Russell: Is there anything we need to add to the Bill to help that process at your end? What would it look like as it stands versus what we could do to make it better?
Richard Wronka: In the example you are giving, there is a premium on the platform addressing the issue before someone is harmed by the dangerous content—in this case, flashing imagery that has been sent maliciously. To generalise a little bit, we think there is a really important role, as indeed we have seen already, for platforms to use technology to proactively identify harmful content. In some cases, they have done that broadly successfully, and in some cases less successfully. First, there is a point about how the use of technology powers is currently constructed in the Bill. We think there is a case for extra clarity to describe the situations in which Ofcom might be able to recommend or require the use of technology to identify illegal content with the intention of it being removed. In this instance, there is a separate question about what is technically feasible. In this particular situation, it is probably the only way to avoid the harm happening in the first place, which is what we all ideally want. That is one specific thing that we think could be addressed through the Bill.
In addition, recognising that some of this is technology at the cutting edge, we think there is an important role for Ofcom in researching, potentially with the industry, what is viable today and what the state of the art might be in five or 10 years’ time, and trying to speed up some of the processes and channel investment in the industry into developing technologies that are more effective than the ones that are currently at the disposal of platforms, as well as thinking about how they can be deployed by a wide diversity of services, not just the very biggest services.
Dean Russell: Thank you.