Interview: Chloe Colliver, Head of Digital Policy & Strategy at the Institute for Strategic Dialogue
With disinformation and “fake news” continuing to impact global events such as Brexit and the Covid-19 pandemic, we sat down with Chloe Colliver to hear her insight. As the Head of Digital Policy and Strategy at the Institute for Strategic Dialogue, she fights for honesty in the face of online deception.
Chloe’s role at the ISD is to research disinformation, misinformation and digital extremism. Together with her team, she works to tackle online extremism and equip organisations with the knowledge needed to see beyond misleading statements online.
Discover what the future holds for “fake news” in our latest interview:
Q: How do you predict fake news and disinformation will evolve in the future?
“Lots of people have tried to predict the next big thing in disinformation and often the answers you'll hear will be deepfakes or people's audio being placed on different bodies and all sorts of quite snazzy technological developments and advances.
“However, from what we've seen over the last few years, I would say that the biggest development is going to be the accessibility for people who are using disinformation tactics to try and effect different kinds of change.
“So, it will actually be not necessarily high quality or professional content, but using disinformation to target issues like climate change, to target issues like migrant rights, all sorts of areas of policy and discussion that can become targets of this kind of tactic online.
“I worry that actually, we'll see a broadening and a flourishing of disinformation toolkits and tactics across a much wider range of issues, rather than necessarily a new tool or technique that is used to promote false information.
“I do think we need to look out for particularly effective image and video editing targeting women, which is something that we're keeping an eye on in terms of how those sorts of tools are used to harass, attack and defame female public figures across the world.”
Q: Why is it important to regulate the technology industry, and how must governing bodies do so?
“The technology industry has given us enormous benefits and wonderful gifts in terms of our ability to communicate around the globe. But like any other corporate industry, there has to be some kind of oversight of the effects that these kinds of products could or might be having.
“At the moment, the tech industry is pretty much a Wild West in terms of the effects it might be having on children and adults, on our democracies. So really, it's up to democratic governments to retake the baton and to formulate and design responsible and proportional regulation that can ensure that those systems, those products, those companies are doing the jobs they set out to do and not doing things that are counterproductive to human rights or fundamental rights.
“It seems like a controversial idea, but actually, regulation of corporations is a very, very accepted status quo. This kind of regulation should really look to curb any excesses and negative effects of these products, rather than to curtail their ability to grow or to provide really useful and interesting services for their customers.”
Q: You talked about disinformation and the Covid-19 vaccine. What was the biggest cause of disinformation?
“When we talk about disinformation, we're specifically thinking about intentional false information. So, that's separate from the general false information you might share with your friends and family, or you might see on your social media from people in your social networks.
“With disinformation, it's quite hard to compare the reach of disinformation from, say, foreign states like China, with that of politicians, celebrities or influencers who might be shouting it from their own accounts or profiles. There's some research this year that shows, particularly around the Covid-19 pandemic, that disinformation was disproportionately shared by celebrities online, which is a very interesting finding.
“It shows the importance of public figures in disseminating false narratives and conspiracy theories around really important issues like public health.”
Q: Why are unfounded conspiracy theories so quickly believed? Is there a social or psychological element to it?
“Conspiracy theories are as old as time, really, and they've always thrived in times of crisis and social upheaval. And that is partly due to the way that our brains work, showing curiosity around conspiracy theories.
“There are a few reasons for this. People are drawn to conspiracy theories in part to try and satisfy a psychological motive. So, the need for knowledge or certainty – we sometimes see that low education rates can correlate with people who are susceptible to conspiracy theories.
“That's not because those people are silly. It's because they're actually in search of knowledge, often in places that aren't reliable, and they don't have the tools or the people around them to help them know where reliable and trusted sources of knowledge might come from.
“But then there's also a psychological need to feel safe and secure in the world. That's also part of why people seek out and believe conspiracy theories. [Covid-19] is a really good example of this one because a genuine external threat can affect how people interpret information.
“And then finally, the other psychological aspect to this is the science suggests people have a desire to feel good about themselves as individuals, but also the groups that they're part of.
“So there's a little bit of an in-group dynamic that conspiracy theorists often promote to enhance the sense of belonging to a community or opposition to a supposed ‘bad guy’. And we see that a lot with conspiracy theories that overlap with extremist propaganda, for example.”
Q: How can organisations ensure clear and concise communication with their consumers to avoid disinformation?
“So, our team at ISD have done quite a lot of work, both with businesses but also in schools and youth communities, to think about building resilience against disinformation and other kinds of online harms. Transparency and clarity of communication with peers and networks are critical to that.
“We're really thinking about getting ahead of the curve on these issues, building resilience, helping people understand critical thinking about information, rather than debunking information after the fact, which can often be counterproductive or really difficult to achieve success with.
“The advice that I would give to businesses or organisations working with large audiences or consumers is to always consider transparency and clarity in your messaging and to make sure you're directing people to sources of information about your products or your organisation that are clear and trustworthy.
“That's really the first step we can take to make sure we're all taking part in a much more open information system that doesn't promote these kinds of disinformation or conspiracy theories.”
Q: How has the digital revolution increased hate speech, and what more must social media sites do to clamp down on online bullying?
“It's difficult to tell whether social media has created more hate or more hate speech, but what it's certainly done is make it much more accessible to many more people. Visibility and accessibility of hate speech mean that the victims of this kind of content are manifold, and they're receiving [hate] not just in the streets, but also in their bedrooms, on their phones and all around them. So, we really need to be able to apply existing laws better when it comes to hate and harassment in the online space.
“That's one aspect of this. We're not really set up very well to deal with existing legal parameters in a very fast-paced informational world. But we also need to adjust those laws and those expectations better, given that we have a whole new way of communicating with one another.
“There are a number of developments looking at whether online platforms should take responsibility for some of the content that is on their sites, including hate speech, terrorist content and disinformation. There’s a fine line between expecting these platforms to censor content and keeping people safe and secure at the same time.
“What we can see is that platforms need to enforce their own existing terms of service much more effectively to protect people from hate speech and targeted harassment.”
Having completed a Masters degree in Creative Writing, Megan joined Champions Speakers in 2019. She regularly contributes to the company blog, where she discusses everything from the latest news to self-improvement.