
Social media and the knowledge illusion

Written by Gareth Morrell, Head of Insights

Donald Rumsfeld once made an infamous statement at a press briefing about things that we know and things that we don’t. He was widely ridiculed at the time, perhaps as much for his associations and what he was perceived to stand for as for any careful critique of the speech itself. While convoluted, the passage culminating in the concept of “unknown unknowns – things we don’t know we don’t know” not only inspired the title of his memoir but is actually a useful way to think about our own capacity for knowledge.

In their book The Knowledge Illusion, Steven Sloman and Philip Fernbach provide a more articulate and rigorously argued case for spending time considering the things we don’t know – and the areas in which we may be unaware of our own ignorance. The recent revelations about how personal details given to social media providers are used by third parties make it timely to consider what their book is getting at.

Sloman and Fernbach are cognitive scientists, interested in what the human mind is capable of and, importantly, how it is fallible. Underpinning their central argument is the assertion that we can’t possibly hold in our minds all the knowledge we need for everything we do on a day-to-day basis. The problem is that we often think we can: we typically believe we know more about how things work than we really do in practice.

The authors cite a simple experiment to back up this claim. Graduate students at an Ivy League university in the US were asked to rate, out of ten, their ability to explain how the zip on a jacket works. They were then asked to write down a detailed explanation. The task proved challenging and something of an eye-opener: when asked a second time to rate their ability to explain how a zip works, the students were far more modest, scoring themselves much lower than the first time.

The zip experiment has been repeated many times with alternatives to the zip, always with the same outcome. There are two important implications here for consumers and users of social media. First, we often think we know how even the most basic things work when we don’t; in reality, we rely on experts, the media, friends, colleagues and others to help us understand things when we need to. Some of the commentary around the Cambridge Analytica scandal has tried to open our eyes to how social media platforms work and the data they hold on us, yet a recent survey exposes the unknown unknowns around social media advertising: nearly half of those surveyed did not know that the ads targeted at them were based on information they had entered about themselves on a website or social media channel.

Following on from this, the second implication of the zip experiment is that no single person – or even collection of people – can completely know how all of Facebook’s targeting algorithms work. They are complex, ever-evolving, self-learning calculations. This is not to absolve Facebook of responsibility, but significant and deliberate effort would be required to understand the full impact of its targeting algorithms on individuals and social structures.

If we need others to provide us with information to make simple decisions, we certainly need others to help us with more complex, important decisions – democratic decisions in particular. Relying on the likes of Facebook to provide news and commentary presents two challenges. First, as the content we see is increasingly targeted, we can’t be sure that we’re being exposed to alternative arguments, and it may be difficult for Facebook to ensure that we are; we may even have stopped caring about alternative arguments altogether. Second, even if we do find a plurality of voices, it becomes much less natural for us to judge the different arguments on their content rather than on who is delivering them.

In The Knowledge Illusion, our mind is described as a problem solver, not a hard drive: it needs to be fed the right information at the right time to spring into action. Companies like Facebook and Google have a responsibility to be more transparent about how information reaches us and who is behind it. They make money from these adverts and control how they work; they are not simply a disinterested medium through which the adverts reach us. But, as consumers and citizens, we also need to acknowledge the fallibility of our own knowledge and ask more questions – just because the information we receive is immediate and accessible doesn’t mean it should be spared interrogation. And when users start turning away from these platforms or blocking more content, and ad revenue falls, it will be difficult for Facebook and Google not to change.