Just a few years ago, “ethics” and “technology” were two words rarely heard together. But after a string of very public scandals, data leaks, Cambridge Analytica, and embarrassing oversights in features, tech companies need to be held accountable. We used to believe that tech was doing good, and that this good would lead us to a better society. Now we wonder who is actually doing good, and what kind of society tech companies really want to create.
My background is in engineering, and with that comes a (sometimes) unbearable optimism about what we can accomplish with technology.
Because of this optimism, I’ve always thought that problems with tech companies must be one-off “bugs” rather than built-in “features”: mistakes, not deliberate design decisions. With the exception of a few companies and products, I still believe that most tech creators simply lack the information and tools to build more ethical products. So, without further ado, here are three principles that I use in my work, shared here for others to use:
Understand your user’s fundamental need
User-centered design is a tool sent from heaven to engineers. It allows us to understand that engineering is not just technical, but deeply human. If you are creating a solution for people, you must consider people’s needs. In creating your solution, you need to pinpoint the fundamental problem you are looking to solve.
Often, users have deeply human intentions in using technology, like treasuring memories, getting home before 5 PM, or spending more time with their kids.
Think about the unintended consequences
A popular Mexican saying goes, “El camino del infierno está tapizado de buenas intenciones”, which translates to “The road to hell is paved with good intentions”. This old saying reminds us that we all seek to do good. However, it’s the decisions we make along the way that define whether we are actually improving lives. It is no longer acceptable to leak personal data and then claim not to have known better.
It may take several team meetings to think through the unintended consequences of your features, but it needs to be done.
Break down the societal issue you are trying to solve into infrastructure and technical issues
Challenges in life are not one-dimensional, and neither are user issues. Three dimensions I focus on within technology challenges are:
- Technical: These are related to the feature. If you’re skilled in creating technology, you should have the solutions to these nailed down.
- Infrastructure: These relate to technology and skills. Even if you build the greatest technology, do your users have the right tools to access your product?
- Society: These relate to access and laws. With the right technology and infrastructure in place, you then consider the obstacles your users face in accessing your tools: Are they getting it at the right price and at the right time? Are laws in place for the feature to be used? Do your users have help at hand in case they need it?
The ethics of your technology are the moral principles that govern its actions and consequences.
Ethics are not a trivial side note, and those who look to use technology to impact lives positively should not overlook its importance.
In a perfect world, technology would be agnostic. It would have no further consequence than helping humans be more efficient. However, this is not the world we live in. In our world, an application’s ethics can have even bigger consequences than its functionality.
There are challenges inherent in building technology for good, and it’s true that we’re all late to figuring them out. However, we’re not alone. The good news is that everyone in the civic technology sector is working on how to make ethics a true feature and not an afterthought. Let’s work together.