Researcher, broadcaster, author. We had the pleasure of meeting Stephanie Hare on her book’s one-year anniversary. ‘Technology is Not Neutral: A Short Guide to Technology Ethics’ addresses one of the key challenges our societies face now and will continue to face in the coming years.
Technology has already changed many industries in the past 20 years and will continue to do so with the rise of Artificial Intelligence (AI) and Machine Learning (ML), as well as the continuing digitalisation of our industries. This is transforming the nature of our jobs, the skills required to do them, and our interactions with and perception of the world around us. Equally, it is creating new questions that we collectively need to answer: Is technology neutral? How do we maximise the benefits of technology and minimise the risks? Where do we draw the line? Who is responsible for the technology that we design, create, implement and use in various areas of our lives?
These are all questions Stephanie tries to answer in her book, with the caution and passion of a woman who has spent much of her career working in the technology space.
The writing style of your book is friendly and approachable. Who’s the target audience?
Thank you! From the outset I was very conscious of trying to create a resource for as wide an audience as possible, from tech workers to investors, lawyers to journalists, lawmakers and regulators, designers and academics, and — most of all — parents, teachers, and children. The goal was to create something that everyone could use according to their needs and wishes. I wanted it to feel like a conversation, as though we were spending time together trying to figure out how to create and use technology that makes the world a better place and does as little harm as possible. And that stems from the way I wrote it, which involved having precisely these conversations with as wide a group of people as I could find.
What responses have you received since its publication?
The response has been really rewarding. It has led to a lot of conversations with people in government and the civil service, in companies, law firms, and NGOs, in schools with children, teachers, and parents, and also from other members of the public who get in touch to share their feedback or to raise questions about how technology is being used in their community. There are so many people working on trying to make technology more beneficial and less harmful (two separate but related challenges). My book is just one of many that aim to help with that, but it has been wonderful to see how it ‘clicks’ with people and then they start to use it in their own work and lives.
You explain in the introduction that you “could have used a book like this” as a student but also as a professional. Can you tell us about your motivations for writing this book?
Writing this book was very cathartic. I had only been an independent researcher for a few months, after years of working in different companies, when the editor got in touch to ask if I had any ideas for a book. As it happens, I did — about twenty years’ worth of ideas! The challenge was in deciding how to structure them, and that meant asking not just who the book was for but what it was for. My long-suffering family and friends endured endless discussions and drafts. At one point, one of them asked me if I wanted to write the kind of book that people read once and then put back on the shelf, or the kind of book that people read and keep close so that they can use again and again. That was so helpful: I knew instantly that I wanted to create a book that people could use as an easy-to-use tool to help them think through their relationship with technology, whether at work or in life.
Responsibility is a key topic when talking about ethics. The title of your book already gives us a clue but based on your work, research, and experience: who should bear the responsibility for ethics?
All of us who have agency and can take responsibility for our actions (e.g. not young children) can take responsibility for ethics. That said, some people have more responsibility than others, such as when they have power over others. That is why technology ethics is so critical — people who create technology can have an impact over millions of other people. Anyone who is creating or investing in technology needs to have ethics at the core of their thinking, and our laws and regulations should reflect it, too.
When should ethics be considered when designing new technologies? At which stage is the impact the greatest?
Ethics should be considered from the very outset and throughout the whole process, from idea to execution, and revisited continually. It’s hard to give a general rule regarding at which stage the impact is the greatest because that will vary from case to case. One example though might be deciding not to create something or to launch something because the risks are just too great. “Move fast and break things” is not always a good thing when you think through what you might break. We might want to start from the premise of what we want to protect — “move slow and save things”.
How do you prevent technology ethics from becoming a tickbox or an afterthought exercise?
By accepting a very annoying fact: that it is not a tickbox or a once-a-year/quarter/month exercise, but a continuous process! It’s like staying healthy: we don’t do that once; it is a process that we engage with every day, across multiple decisions: whether to get enough sleep, eat nutritious food, hydrate, exercise, get outdoors, or socialise. If we do these things only once a year, we will not enjoy good health overall. It has to be every day, and consistent over time.
You explain in the book that technology is not just about tools, products or services. It is also about power. How does this fundamentally influence our relationship with technology and the process of creating, selling, buying and using technology?
The mere act of identifying where power is in our relationship with technology can be transformative because once we’ve done it, we can’t unsee it. From that moment onwards, whether we choose to act differently or not, we are doing so with a different mindset. We become aware of the choices and decisions that shape each stage and influence each person. It is key to improving our technology.
What advice would you give to someone interested in working in the field of technology ethics?
There are specific roles, such as AI ethics researchers or designers who use value-sensitive design principles in their creations, or algorithmic auditing or ethical tech investing, but another way to think about it is to integrate ethics into existing roles. So whatever job you are already doing, or thinking about doing, there will be ways that ethics can strengthen it and improve it.
As I wrote in my book:
“A new role – that of technology ethicist – is emerging in our economy, but its contours are still being shaped. Is it a technologist who works in ethics? An ethicist who works in technology? Can anyone call themselves a technology ethicist or is it an anointed position?
Rather than focus on what technology ethicists are, let us consider what they do. They might be trained in law, data science or philosophy, or they might be artists or designers. They might be employed by universities (and not just in the philosophy and computer science departments) or they might work in think tanks, in non-governmental organisations (NGOs), in companies or in any part of government. They might do open-source intelligence investigations into crime, terrorism and human rights abuses. They might be lawyers who take on cases relating to privacy, civil liberties, data protection, human rights and competition. Perhaps they infuse new meaning into existing roles – professor, researcher, data protection officer – or they may reflect new and more specific responsibilities, such as responsible AI lead, algorithmic reporter, or AI ethicist. They might sit in a team that is explicitly dedicated to ethics, or they might work in teams such as the ‘responsible technology’ team, the ‘trustworthy AI’ team or in teams that deal with legal affairs, risk, compliance or cybersecurity. They could have junior, middle management or senior responsibilities. They might sit on internal ethics panels or boards. Finally, they may be on the board as a Chief Privacy Officer, Chief Ethics Officer or Chief Responsible AI Officer.”
On top of being an author, researcher and broadcaster, you are also a woman in Tech! What has been your experience working in the sector?
On balance, and over the long term, my experience as a woman in tech has been positive. I’ve been lucky to receive an education and training and professional opportunities that keep me learning and growing every day. That said, I have also seen and experienced some things in the technology sector that were really tough at the time. This made me even more determined to call out bad behaviour and structural inequalities, and to do what I can to help improve working conditions and to cultivate the best talent pipeline possible. These experiences have certainly influenced my work, both in the focus of my research (technology ethics) but also in bringing that research to the public, whether by going on TV and radio, or writing for newspapers, or public speaking, or teaching in schools and universities. I want to help empower people in their relationship with technology, and to welcome them to contribute to technology in whatever way they feel fits best.
To conclude our interview I have one remaining question and as you can maybe guess it is from an Equality, Diversity and Inclusion perspective. In your opinion, can the development of technology ethics contribute to the creation of more diverse and inclusive workplaces and societies?
That’s my hypothesis and my hope, at any rate! I am encouraged by the fact that we are talking about things that we simply did not think about when I began working in technology over two decades ago. Academics were talking about them before, of course, but today these topics and questions are increasingly mainstream. They are on the agenda of the C-suite, investors, lawmakers, and regulators, and all over the media. That’s a good thing, as we will have to fight for a better society — it won’t improve on its own — but the good news is that we are making progress and there’s so much more we can achieve.
To learn more about Stephanie Hare: https://www.harebrain.co/
To order her book: https://londonpublishingpartnership.co.uk/technology-is-not-neutral/