Jon Gosier is a data scientist, investor and entrepreneur. In 2008, Jon founded Appfrica, an accelerator investing in Africa’s technology economy. Since then he’s founded an investment fund, joined a number of early stage startups (most recently medical artificial intelligence company Wounds AI) and invested in several interesting startups.
Back in 2014, Jon gave a TED Talk called Trickle Down Techonomics discussing how new technologies can be a bad thing if they’re not equally distributed. If you haven’t seen it, it’s an absolute must-watch for anyone working in tech, and it’s only 6 minutes long.
We spoke to Jon about the inspiration behind the talk, and the responsibilities of tech entrepreneurs to be as inclusive as possible…
Trickle Down Techonomics
Jon’s worked on a number of civic tech and tech-for-good projects over the years, “working on systems to keep people connected in times of communication blackouts, oppressive regimes, or trying to connect people with each other in times of disaster.”
While he was working on such projects, Jon was aware that “we were creating these technologies for one purpose [but] every tool ever created has potentially other uses – the way you intended and ways you didn’t.”
As an example, Jon says “I would see people using technologies for tracking activists that were essentially the same as the technologies we were using to [find people] following a disaster… it reminded me that we should be as thoughtful about the alternative uses of the technologies as we are innovating.”
In the private sector, Jon says tech companies “try to maintain neutrality. They’ll say well, we’re just trying to make money and not trying to actively harm people so that’s good enough. But just in the creation of certain technologies you almost incur the responsibility, whether you want it or not, to think about safety, privacy, security and efficacy.”
The other aspect of tech that caused Jon concern was “global access”.
It’s not always possible, or even necessary says Jon, to “design technologies for people who don’t share the same living standards or economic level… but it’s always something that I feel should be on the mind of creators. Just be aware of who you’re serving. Don’t make a technology that you say is supposed to be for everyone, if it really is only for people who are middle [class] and above.”
Ethics as risk mitigation
How we can help “depends upon the company or the situation or circumstances” says Jon.
“I think awareness is where we should start right now. Most private companies start from a place of not really being self-aware until a PR incident or scandal happens and the negative press forces them to think about it.
“For instance all the stuff that is going on with Uber right now, many are incidents that Uber could have mitigated in advance [by] being prepared or just being more proactive instead of reactive.”
“As startup founders let’s be more proactive about problem-solving as opposed to waiting for [issues] to happen and dealing with them after. I find that kind of a lazy and reckless way to operate as a business leader and an innovator and a creator.”
Putting himself in the shoes of an Uber Executive (albeit perhaps a more empathetic one than Kalanick), or an investor, Jon asks “would I not want the leaders of that company to go through the mental exercise of all the scenarios that could go wrong so we aren’t blindsided later? To me that seems like it’s protecting company value.”
Jon says it doesn’t make sense for issues like diversity to not be on founders’ minds. “Even if you don’t care about doing anything about diversity, other people do. It’s going to become something you’ll have to deal with down the road. So to me, at the very least, it’s risk mitigation.”
There have been cases where Uber users have been put into unsafe situations, creating massive safety and legal risks for the company, Jon says. Uber could have mitigated these risks by planning for these possibilities before they happened.
“It’s just as important for innovation to think of these unlikely scenarios in advance.”
When Jon speaks to people about identifying risky situations, he often gets asked the same question: where do I find the time to consider every single possibility? For those people, Jon has a “design thinking exercise” he calls ‘outcome design (ODX)’.
“[I tell them to] start with all the different types of users who their technology or services touch. This could be the direct users. In the case of Uber this might be the passengers, but it could be people in the community, or pedestrians (other people on the road), it could be the drivers, it could be the people at the destination. Map out all the stakeholders in our ecosystem.
“The next step is to think about all the ways that our service affects them positively. The ways that we intend to affect them. Then let’s actively think about how these things could go wrong. All the potential negatives. From there you can start to map out solutions.”
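The ODX steps above — map stakeholders, list intended positives, list potential negatives, then map solutions — can be sketched as a simple data structure. This is a minimal illustration, not anything from Jon’s own materials; the stakeholder names echo his Uber example, and the specific risks and mitigations are hypothetical placeholders.

```python
# A sketch of the 'outcome design (ODX)' exercise as data: each stakeholder
# has intended positives, and potential negatives mapped to a mitigation
# (or None when no solution has been brainstormed yet).
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Stakeholder:
    name: str
    positives: list[str] = field(default_factory=list)
    # each potential negative maps to a mitigation, or None if unaddressed
    negatives: dict[str, Optional[str]] = field(default_factory=dict)

def open_risks(stakeholders: list[Stakeholder]) -> list[tuple[str, str]]:
    """Return (stakeholder, negative) pairs that still lack a mitigation."""
    return [(s.name, risk)
            for s in stakeholders
            for risk, fix in s.negatives.items()
            if fix is None]

# Hypothetical ride-sharing ecosystem, echoing the Uber example above
stakeholders = [
    Stakeholder("passengers",
                positives=["convenient rides"],
                negatives={"unsafe driver": "background checks",
                           "location data leak": None}),
    Stakeholder("drivers",
                positives=["flexible income"],
                negatives={"unfair deactivation": None}),
    Stakeholder("pedestrians",
                positives=["fewer impaired drivers"],
                negatives={"distracted driving": "hands-free app flow"}),
]

for who, risk in open_risks(stakeholders):
    print(f"unmitigated: {who} -> {risk}")
```

The value of the exercise is exactly the list this prints: every stakeholder/negative pair with no mapped solution is an open item for the brainstorming session.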
At the end of the day it’s just brainstorming, says Jon. “If you can’t spend a day with your staff doing a brainstorming exercise that helps you mitigate risk, then whatever happens later you’re kind of asking for.”
The future of ethics in technology
For Jon, the technology industry today is, in a way, comparable to the financial sector in the 80s. “People just ran wild for two decades and then they were forced to adopt a code of ethics. Lawyers have to follow a code of ethics and I don’t see why, with the sheer power and pervasiveness of technology, that same philosophy wouldn’t apply here.”
Jon points out that ethics needs to be more than just a code or a culture – “what does it mean to have a culture of ethics if it’s not going to be enforced? When there are dramatic shifts in the way Government thinks about ethics and regulation, that dramatically changes a lot of this. If it can’t be enforced what are we really talking about?”
To be more specific, Jon says “we can have a social construct between the public and companies and each other, or between technologists and each other, but if the government is the type of government that just walks all over ethics or doesn’t care at all, then from the top down there’s no way to hold anyone accountable. It becomes really more of a social contract than anything, and there’s a real risk there. Companies that have the power to do so simply won’t follow the rules. They don’t have to. Who would make them?”