At the 2019 conference Gemma Milne was in conversation on stage with our keynote, anthropologist Simon Roberts, so we are delighted to be in conversation with *her* about her new book, *Smoke & Mirrors: How Hype Obscures the Future and How to See Past It*, which is published on 23 April. A technology and science writer whose work appears in publications like the BBC and Forbes, Gemma talks to our Founder, Dawn Walter, about her book, how hype can fundamentally misdirect or even derail crucial progress, and how we can combat it.

Gemma, firstly congratulations on your first book. While it must be frustrating to have your book published when you can’t do events at bookshops and meet your readers due to the pandemic, I think your book is one of the books for the time we are living in. We’re being forced to confront many things we have taken for granted – our food system, for example – so I think Feeding the World is a really relevant chapter. And the chapter on the cure for cancer, too, at a time when we’re hoping a vaccine for COVID-19 will come soon. It’s terrifying to think that hype might funnel resources (people, money, time) into approaches that might not work. Has that thought crossed your mind as well?

Absolutely. One of the main points I make in the book is that hype is a tool – a tool that is needed when communicating complex, important ideas, but one that can be wielded irresponsibly when loud messages are taken out of context. We’re already seeing instances of hype showing its dark side, with premature excitement around certain drugs and ideas misinterpreted out of context from epidemiology papers. In times like these, when people are hungry for information and there is a lot of complexity to work through, hype can be both a blessing and a curse.

We’ve seen with the coronavirus that misinformation – which is essentially what hype at its worst is – can literally kill. What prompted you to write a book about hype and why?

It came from a place of frustration – I was frustrated at the five-minute startup pitches at tech conferences misleading audiences; I was frustrated by over-simplified headlines routinely being published in mainstream trusted media, such as ‘robots are going to steal your jobs’; and I was frustrated that more people weren’t talking about the importance of nuance when discussing, explaining and putting forward solutions for some of the world’s biggest problems.

So I wanted to write something that played three roles: one, set the record straight on some of the most timely issues; two, showcase how we can all improve our critical thinking no matter our level of intellect or qualification; and three, make the case that critical thinking isn’t just something that makes science and tech more interesting and understandable, but that it’s also our collective and individual responsibility to think more critically and engage in the future of society.

I was frustrated that more people weren’t talking about the importance of nuance when discussing, explaining and putting forward solutions for some of the world’s biggest problems.

As I was reading your book, two of my scribbled thoughts were (1) who controls the narrative and (2) who seeks to gain. Would you agree those are key questions to be asking when we read about science and tech in newspapers, company websites, online magazines, etc.? And how can non-experts — even those with critical thinking skills — see past the hype when they are confronted with a barrage of information on any given topic?

Absolutely. Those are two questions which, when asked, can really open up your understanding of an area and give you so much more context and rationale for the narratives you’re reading and the actions you’re seeing being taken. It also helps you make up your own mind about what’s going on.

In terms of how to see past hype, the book has nine chapters – one ‘lesson’ of sorts per chapter – but my biggest piece of advice is to pause when you read something, and ask yourself: “depending on what?” It means you start to consider different scenarios that play out in different fields and forces you to remove your blinkers to build a more thorough mind-map around the information in front of you.

My biggest piece of advice is to pause when you read something, and ask yourself: “depending on what?”

Your book reminded me of Naomi Oreskes and Erik Conway’s Merchants of Doubt, which revealed that well-established scientific research on, say, lung cancer or climate change was being called into doubt by a select but powerful group of people to further their own ideological, financial, and political interests. The book ultimately argues that if there is overwhelming consensus from the expert scientific community on something, we should trust them rather than dismiss the consensus. The issue of trust comes up in your book – that hype can damage people’s trust. Trust in experts seems to be at an all-time low. What are your thoughts on this?

First of all, I’m not actually sure trust in experts is at an all-time low – I wonder if this narrative is itself hype! Hard one to measure, methinks! But yes, on the issue of trust, there are essentially two questions: 1) if we encourage people to think more critically, are we also essentially telling them not to trust expertise, and 2) is airing the dirty laundry of an industry an irresponsible act if it means it lowers overall trust?

Being critical doesn’t mean being dismissive. Not believing everything you read is just as absolutist as believing everything you read.

For the first question, I’d answer no – being critical doesn’t mean being dismissive. Not believing everything you read is just as absolutist as believing everything you read! Being critical means considering context, asking questions and allowing yourself to explore complex topics – most issues in the world are complicated and require this kind of mindset in order to be understood, even at a surface level. More critical thinking means more trust, in my opinion, as I believe it would encourage more people to understand where experts are coming from in their information, and contextualise it more easily.

For the second question, the issue of trust is a little more nuanced. If you tell people that science is fallible and that there are issues with fraud in academia – a truthful statement – do you run the risk of having them not trust science any more? In my mind, I’d argue that being upfront is a better way of gaining trust, as it means that when something does get exposed, people can contextualise it. But if it comes as a shock, because people are not primed for this information or don’t even know that it’s a thing, that surprise can lessen trust. Trust is definitely a delicate and complex topic though and it was one of my favourite things to explore in Smoke & Mirrors.

Trust is definitely a delicate and complex topic though and it was one of my favourite things to explore in Smoke & Mirrors.

The chapter that’s most relevant to the Anthropology + Technology Conference is, of course, the one on Artificial Intelligence. You make the point earlier in your book that hype is needed “to keep the money coming”, and that’s certainly true of AI, isn’t it? Funding is necessary otherwise research may stall. The concern though is that investment or funding ends up pushing research in one particular direction at the expense of others. What do you think about this conundrum?

For sure – hype is a double-edged sword. It’s a tool which can be used in many different ways, some responsible, some not. Hype is absolutely required if you remember that hype is – simply – publicity. We need to get messages heard; we need to simplify the complex; we need to convey excitement and human stories to get time-poor decision-makers on board. On the other hand, hype absolutely can result in an opportunity cost: something getting funded or supported over something else, which can of course cause grave issues if and when that funding isn’t infinite.

I think the conundrum is where I kind of start the book, but over time I switch the question away from ‘is hype good or bad?’ and focus instead on something more pertinent, in my view: how can we use hype responsibly? I argue that the answer is in contextualising it, as opposed to banishing it, and empowering everyone to spot hype through critical thinking.

Hype has lots of emotions attached to it; learned people tend to see it as a nuisance, something they wouldn’t get caught up in. Believing you’ll never get ‘caught up’ is also a fallacy – we all need to be more accepting of not knowing answers and more willing to dive into the nuance and complexity. It’s the only way to navigate these conundrums.

How can we use hype responsibly? The answer is in contextualising it, as opposed to banishing it, and empowering everyone to spot hype through critical thinking.

One of the many important points you make in the AI chapter is about responsibility. As you say, it’s easy to forget that humans are behind the creation of robots and AI, and you put it so nicely: “if we don’t know who we’re meant to congratulate [about the achievements made in this field]…how are we going to know who to blame?” Scientists like Joanna Bryson and Alan Winfield have also argued that ultimately, if AI goes wrong, someone – not something – has to take responsibility and be held accountable. Can you summarise this important issue from your perspective?

Absolutely. If you take the narrative ‘robots are stealing your jobs’, the thing responsible is the robot. Whereas if you were to reframe this to read ‘corporate execs are making active decisions around which tasks will now be automated, resulting in human labour being replaced’, you’re arguably saying the same thing (that automation is here) but you put the person at the centre and don’t ‘other’ the responsibility to the robot.

It might sound like an unimportant (and less fun) distinction, but when you use the first narrative, you open up conversations about the singularity and rights of robots and ‘what is creativity?’, whereas when you use the second, you get right into discussions about corporate power, reskilling and universal basic income. It’s important to have conversations about tech based in the present, as opposed to predominantly the future, if we want to get the masses to understand where responsibility lies – so that those who have the power cannot hide. It’s not about saying automation or AI is bad – it’s about being more realistic and less fearmongering in the way we talk about it, and paving the way for safer technology to be built.

It’s not about saying automation or AI is bad – it’s about being more realistic and less fearmongering in the way we talk about it, and paving the way for safer technology to be built.

Do science and technology writers, in particular, have a responsibility not to promote hype around the ‘robots are stealing our jobs’ narrative, a narrative that’s just plain wrong?

I do think that science and tech writers have a particular responsibility – they have a lot of influence at the end of the day. Unfortunately it’s not always something science and tech writers are encouraged to do – not that they are discouraged, it’s just not really front of mind. And this is where a lot of my first thinking for the book came from; I was thinking about my own responsibility. I write for Forbes, and I was on the tube one day and saw a startup ad which had only one line of writing on it: a short quote from a Forbes piece. I was instantly worried that one of the startups I cover would do the same thing – take one line out of context and use it for advertising, regardless of whether it was reflective of my piece or not.

So it got me thinking much more about what I put out there, why I put things out, and how I do it too. Lots of science and tech writers do think about this and do care – but many do not. And also many people who cover the space are enthusiasts too, so can fall victim to their own idealism. I hope that writers and journalists read my book too – we are absolutely not exempt from the message of Smoke & Mirrors; in fact maybe it’s one of the most important audiences.

Many people who cover the space are enthusiasts too, so can fall victim to their own idealism.

At the end of your book, you discuss our individual roles and responsibility in all of this – we should “actively engage” with science and tech topics such as AI. I think you’re suggesting – and I agree with you – that if we play along with the hype – “blindly believe” – or don’t recognise the role that (negative and positive) hype plays in society, then we are in a way complicit. Can you speak to that a bit more?

I thought long and hard about how to get this message across in the book without it feeling too blameful! But yes, you’re right, I argue that not engaging and thinking critically can be a form of complicity. At the end of the day, society shifts due to the people within it – we’re crew, not passengers, of ‘spaceship Earth’, as they say. If you read Smoke & Mirrors or listen to one of my talks, or even read an interview like this, where I’m making the case for each of us to check our own behaviour and responsibility, and you then don’t make any changes – even just a little – in full knowledge that your choices make an impact, surely that’s complicity. It’s knowing, right? Of course, engaging can mean many different things – from diving into the literature to simply pausing before retweeting – and that’s the key thing here that I want to drive home. I’m not advocating for a complete overhaul of behaviour, just a little more care around information!

Is there anything you’d like to leave us with to think about before we part ways?

The final point I make in the book is about what I see as the biggest hyped-up false narrative of all: the idea that science and tech ‘isn’t for you’ if you’re not an expert in it. That’s total rubbish. Every single person has the ability, the right and the responsibility to get involved! It’s easier than you think – I promise. And science and tech NEEDS voices from outside their silos. It’s not just about engaging more and enjoying it, it’s about having a say. I hope that Smoke & Mirrors empowers more people to do this, and to love it.

The idea that science and tech ‘isn’t for you’ if you’re not an expert in it is total rubbish.

Thank you so much, Gemma, for taking the time to talk with us about your book.

You can get your signed copy of Gemma’s book through Simon at Big Green Books (@biggreenbooks). Bristolians can also support our local indie, StorySmith, who had a bookstall at the 2019 conference. And if you’re not in Bristol, two of Gemma’s favourite independents who have been very supportive of Smoke & Mirrors are Lighthouse Books in Edinburgh and Libreria off Brick Lane in London. All indie bookshops need our support right now!

https://www.gemmamilne.co.uk/book