3 Big Questions for Building More Ethical Systems

As local government leaders are well aware, our systems reflect the biases and blind spots of the original designers. In the best circumstances, we have full visibility and understanding of the original intent and subsequent design. Too often, we don’t.

In his latest book, We the People: Human Purpose in a Digital Age, my good friend Dr. Peter Temes lays out the argument that we are called to take ethical ownership of the systems we design and operate.

Take the following story, for example…

Peter was driving through a wooded Brooklyn park late one night when he stopped at a red light. Minutes went by and the light didn’t change, even though there was no traffic in any direction. As time dragged on, Peter began to wonder if the light was operating at all, and whether he should simply proceed through the intersection.

Being the law-abiding gentleman that he is, he stuck it out and waited for the light, which eventually turned green. However, the event stuck with him, and he sought out the local official in charge of monitoring Brooklyn’s traffic light network to inquire about it.

The response was, “We know when a light needs to be re-timed because we get complaints. When people are complaining, it means something’s wrong and we check it out. When no one complains, nothing’s wrong.”

Peter describes this as “instance-based ethics.” In the absence of organizing principles and values, we default to an ad hoc system of ethics. We don’t bother to define what we value, or how we see the public good, ahead of time because we don’t have to; we know it when we see it.

Of course, this reliance on defining the good in the moment leads to wildly inconsistent interactions and outcomes within a system. The person who has the privilege of access can directly impact the system and bend it to their own needs. Those who don’t are left to simply take what the world dishes out and deal with the consequences as best they can.

The more desirable approach, principle-based ethics, allows us to define our values in advance. While there is no limit to the questions we can ask ourselves in defining those values, Dr. Temes has identified what I think of as the Big Three themes for our times – Human Life, Privacy, and Equity.

Human life. How valuable is human life? Which human lives might be more valuable than others? While it is tempting to simply say “all life is valuable,” this doesn’t help guide difficult decisions. What do we expect of the autonomous car that needs to choose between protecting its passenger(s) or protecting a child in the street? How do we adjudicate a conflict between public health and economic development? Do we value citizens over non-citizens?

Privacy. Who gets to know what about whom? Local government collects, stores, and disseminates staggering quantities of data about its citizens. What are the guideposts for this work? What are the limits? Does a person own their own data, and should they be compensated for its transfer and use? How does someone opt in or opt out, and which is the default? What special treatment do we demand for children?

Equity. As digital systems help us create more material wealth and power, who will benefit, and how equally should those benefits be spread? What amount of disparity is desirable as an inspiration and reward? When is the system out of balance and in need of correction? What are the baseline promises of equity we hold in covenant with each other?

There may be a tendency among us to shrug and think that it isn’t our job to think about these things, to wrestle with these systemic questions.

My counter is that it is everyone’s job to do exactly that.

If we do not rise to the occasion and set very specific expectations for the ethics of our systems, then we allow others to do it for us. These “hidden designers” may make decisions antithetical to our values, or they may simply make no decision at all. Either way we lose, as we are put in the position of serving our systems, rather than our systems serving us.

Joel Carnes is a GovLoop Featured Contributor. He has spent his career in innovation, and has experience at every level — with products, teams, organizations and entire communities. While he’s always been a technologist, he’s much more interested in the impact on real human lives than the technology itself — a passion that inspires him to work with local governments and communities across North America. Joel has held senior executive roles at XPRIZE, Activision, SecondMuse, and Disney Imagineering, and has become a thought leader in connected innovation — where individuals, teams, organizations and businesses come together to solve a problem, going beyond what any single entity could accomplish on its own. With this strategy, Joel has built relationships across sectors, industries and political boundaries, creating entire innovation networks that continually produce solutions to real-world problems.

One Comment

Catherine Andrews

Yes! Way too many technology companies have not asked themselves about the ethics and potential outcomes of what they are building. I love that you are bringing this up in terms of designing systems for state and local tech and IT.