Huge advances in technology, an increasingly connected digital world, political and corporate scandals involving social media and personal data: if there was ever a time to think about ethics in tech, it’s now.
Recent global events have certainly made users of technology more aware, both of the implications their interactions could have and of the global trade of convenience in return for personal data. However, there are still huge knowledge gaps in the public’s understanding of technology: only a third of people are aware that data they have not actively chosen to share has been collected, and a quarter have no idea how internet companies make their money.
And then there are the unintended consequences, where ethics are cast aside in order to meet some other business goal, as in this heartfelt story from a once junior coder.
So surely a large responsibility lies with the makers. How can we, as a tech community, be sure that the technology we continue to advance has a lasting positive impact on the end user, on society as a whole and on the planet?
Fortunately, it’s a topic that is being taken more seriously. Stanford, the university synonymous with Silicon Valley, and Harvard have this year developed courses focusing on computer and data science ethics. Also coming out of Silicon Valley this year is the EthicalOS Toolkit, a partnership between the Institute for the Future and Omidyar Network, which aims to make businesses and makers more aware of the risks that may emerge and so allow them to apply future-proofing strategies.
Leaps forward are being made in the UK too. The government has released plans for its Centre for Data Ethics and Innovation, and the work of the Alan Turing Institute is shaping the future of data science and artificial intelligence. Meanwhile, Doteveryone, a think tank founded by philanthropist and Lastminute.com co-founder Martha Lane Fox, is championing responsible technology for a fairer future.
Here at Simpleweb, we work with founders every day who have a vision to change the world in some awe-inspiring way. However, we understand that in the real world, growing their business, getting their product to market or raising investment is the number one priority.
So what can startup founders and businesses do to ensure the business and/or the technology they build is developed in a responsible and mindful way, not only so that it withstands the challenges of the future but also so that it has a positive impact upon its users and society as a whole?
We spoke to those championing ‘responsible tech’ to see what they think…
Samantha Brown – Programme lead for Responsible Tech at Doteveryone
“The current ‘techlash’ against the tech giants of our time represents a crucial turning point in the public’s trust in technology. Our research into the public’s digital attitudes and understanding has shown people are concerned about the impacts of the internet on society and also feel disempowered in the face of changing technologies. Only 12% of Brits say technology has had a very positive impact on society. There is also a strong appetite for greater accountability from organisations that create and use these technologies.
Only 12% of Brits say technology has had a very positive impact on society
At Doteveryone, we believe that the industry needs a change in mindset, processes and networks within the conception and design of technology to ensure it is developed in a more responsible and mindful way that can withstand the challenges of the future.
But adopting a responsible technology approach isn’t straightforward. There’s currently no roadmap, or even a common language, about how to embed responsible technology practices in practical and tangible ways.
We’ve identified three core concepts that will help technology owners and creators better consider how to build responsible technology:
Context: rather than focusing solely on individuals as consumers, technology creators need to understand and respect the different contexts in which users interact with their technology, and the communities their product can affect
Consequence: technology owners must learn to anticipate the potential impact of a technology product or service from the outset by identifying the main drivers of unintended consequences
Contribution: it’s crucial that businesses examine holistically the contributions of informal labour or information that a technology receives from its users and determine if these are known and understood, optional, and fair. Further, a business should consider what their technology gives back to the world; whether that be a contribution of knowledge to better the industry, insights or data to governments or community leaders, or the betterment of public services
Ethics and responsibility take time, budget and hard work, and often involve weighing trade-offs. Being responsible, however, is often basic common sense: thinking through risks and planning sensibly.
In this time of rapid change, it’s more important than ever for industry, government, civil society, technology users and the wider public to come together in order to define the collective values our societies are based on and ensure our technologies reflect these. It’s vital that startup founders, business leaders and those who code engage in these conversations and help to shape a strategic future vision for the role technology will play in society.”
Jane McGonigal – Director of Game Development at the Institute for the Future and co-creator of EthicalOS
“Tech companies are under extraordinary pressure to become better ethical actors. Nobody wants to find out what the next version of “fake news” or “propaganda bots” or “smartphone addiction” will be. We’ve had enough of unpleasant surprises. The general public, government regulators, investors and people who work in tech companies all want to feel more hopeful that we can identify and prevent big risks to truth, democracy, privacy, security and mental health before they catch us collectively off guard again.
we tend to focus on the problems we’re trying to solve, rather than new problems we might accidentally create
The challenge, however, is that making ethical decisions about the technology you build or decide to use means you have to essentially be able to predict the future. How do you know what the right, more ethical action is? You have to anticipate the long-term consequences of your choices. Otherwise, you’re guessing blindly. Is this good for humanity? Is this good for society? You can’t answer those questions if you’re not thinking like a futurist.
So how do you anticipate future consequences? How do you imagine something that hasn’t actually happened yet? That’s the big problem we’re trying to solve with the Ethical Operating System.
It’s not that we, at the Institute for the Future, think these technologies are inherently bad for society. It’s just that everyone in tech, from start-up founders to investors to engineers, tends to focus on the problems they’re trying to solve, rather than new problems they might accidentally create.
If your company is a drone tech company, you are probably already thinking about risks in the areas of privacy and weaponization. But what about long-term impacts of living in a society where drones are ubiquitous and open to surveillance and potential authoritarian uses?”
James Hand – Co-founder of Giki
“Right up front, writing down what you intend the culture of your business to be is really important. It’s an old quote, but ‘culture eats strategy for breakfast every day’, and it’s almost a truism. So, work out what you want to achieve and how you want to achieve it, and then spend plenty of time on it. It’s never going to be top of the to-do list when there’s exciting stuff to do like products to launch, new tech to look at and people to hire, but do it and then come back to it on a regular basis with one question: am I living up to that? It’s almost like a pledge that you make to yourself, and then you know when you’re deviating away from it.
We went through the whole ‘jobs to be done’ process and the customer value proposition, MVPs and all that, which is a great process to go through, but none of it covers things like culture, so it’s very easily forgotten. But it’s crucial.
The best thing is when other people start holding you to account on the culture of the business, and then you really know you’ve embedded it properly.”
Mahyad Ghassemibouyaghchi – President, KCL Tech, King’s College London
we need to discuss how we want to live in this new reality
“As a millennial, I’ve seen first-hand the power of technology – its huge potential but also how it can disconnect people.
We are a different society to even 10 years ago so we need to discuss how we want to live in this new reality. To me it’s less about the technology, and more about society. I see tech as a mirror of society and there will always be good and bad people. It’s about how we understand and educate people about the implications and responsibility of this technology.
As the next generation we want to work for socially responsible businesses. Growing up with the internet means we can see and are more aware of the challenges facing the world. We want to found companies and work for businesses that have real meaning and can make a positive impact.
My advice is that ethics should be written into companies’ mission and vision statements from the start, and that they should make every effort to put community before KPIs and people before profit.”