
Software for Moral Enhancement

We all have our weak moments. Moments when we know the right thing to do, but are too tired, too afraid, or too frustrated to do it. So we slip up, and do something that we’ll regret.

An algorithm will never slip up in a weak moment. What if we could identify when we are likely to make mistakes, figure out what we’d want to do instead, and then outsource our decisions to a reliable algorithm? In what ways could we use software to make ourselves into better people?

Passive moral enhancement

One way of doing this might be called passive moral enhancement, because it happens even without anyone thinking about it. For example, if you own a self-driving car, you will never feel the temptation to drink and drive. You can drink as much as you want, but the car will always do the driving for you, so your drinking will never put anyone else in danger.

In a sense this is an uninteresting kind of moral enhancement, since there is nothing novel about it. Technological advancement has always changed the options that we have available to us, and made some vices less tempting while making others more tempting.

In another sense, this is a very interesting kind of change, because simply removing the temptation to do bad is a very powerful way to make progress. If you like drinking, it’s a pure win for you to get to drink rather than having to stay sober just because you’re driving. If we could systematically engineer forms of passive moral enhancement into society, everyone would be better off.

Of course, technology doesn’t always reduce the temptation to do bad. It can also open up new, tempting options for vice. We also need to find ways for people to more actively reshape their moral landscape.

A screenshot from the GoodGuide application.

Reshaping the moral landscape

The screenshot above is from GoodGuide, an application which rates the health, environmental, and societal impact of different products on a scale from 1 to 10, making it easier to choose sustainable products. This is an existing application, but similar ideas could be taken much further.

Imagine having an application which allowed you to specify what you considered to be an ethical product and what kinds of things you needed or liked. Then it would go online and do your shopping for you, automatically choosing the products that best fit your needs and which were also the most ethical by your criteria.

Or maybe your criteria would act as a filter on a search engine, filtering out any products you considered unethical – thus completely removing the temptation to ever buy them, because you’d never even see them.
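
To make this concrete, here is a minimal sketch of what such a filter might look like. Everything in it is invented for illustration: the Product type, the 1-to-10 scores, and the catalog are hypothetical, and a real service would pull its ratings from a database like GoodGuide's rather than hard-coding them.

```python
from dataclasses import dataclass

@dataclass
class Product:
    name: str
    price: float
    environment_score: int  # hypothetical 1-10 rating, in the spirit of GoodGuide
    labor_score: int        # hypothetical 1-10 rating

def passes_criteria(product: Product, criteria: dict) -> bool:
    """Return True if the product meets every minimum score the user has set."""
    return (product.environment_score >= criteria.get("environment", 0)
            and product.labor_score >= criteria.get("labor", 0))

def ethical_search(products: list, criteria: dict) -> list:
    """Drop products below the user's ethical thresholds, so they are never
    even seen, then sort the rest by price."""
    acceptable = [p for p in products if passes_criteria(p, criteria)]
    return sorted(acceptable, key=lambda p: p.price)

# Example: a shopper who refuses to see anything scoring below 7 on either dimension
catalog = [
    Product("Shirt A", 15.0, environment_score=4, labor_score=8),
    Product("Shirt B", 22.0, environment_score=8, labor_score=9),
    Product("Shirt C", 19.0, environment_score=9, labor_score=7),
]
for product in ethical_search(catalog, {"environment": 7, "labor": 7}):
    print(product.name, product.price)
```

In this sketch the filtering happens before the results are ever shown, which is the point: the unethical options simply never appear as temptations.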

Would this be enough? Would people be sufficiently motivated to set and use such criteria, just out of the goodness of their hearts?

Probably many would. But it would still be good to also create better incentives for moral behavior.

Software to incentivize moral behavior

This six-way kidney exchange was carried out in 2015 at the California Pacific Medical Center. (Image: Sutter Health/California Pacific Medical Center.)

Above, you can see a chain of kidney donations created by organ-matching software.

Here’s how it works. Suppose that my mother has failing kidneys, and that I would like to help her by giving her one of my kidneys. Unfortunately, despite our close relation, our kidneys are a poor match: a direct donation from me to her would be unlikely to succeed.

Fortunately, organ-matching software manages to place us in a chain of exchanges. We are offered a deal. If I donate my kidney to Alice, who’s a complete stranger to me, then another stranger will donate their kidney – which happens to be an excellent match – to my mother. And as a condition for Alice getting a new kidney, Alice’s brother agrees to donate his kidney to another person. That person’s mother agrees to donate her kidney to the next person, and that person’s husband agrees to donate his kidney… and so on. In this way, what was originally a single donation can be transformed into a chain of donations.
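
To give a rough sense of what such software is doing, here is a deliberately simplified sketch in Python. The pairs and the compatibility table are invented for this example, and real exchange programs solve a global optimization problem over many pairs rather than the greedy chain-building shown here.

```python
# A deliberately simplified sketch of chain-building in a paired kidney exchange.
# Each entry is an incompatible donor-patient pair; the compatibility table is
# invented for illustration. Real matching software solves a global optimization
# problem, not the greedy search shown here.

pairs = {
    "me":    {"donor": "me",            "patient": "my mother"},
    "alice": {"donor": "alice_brother", "patient": "alice"},
    "carol": {"donor": "carol_mother",  "patient": "carol"},
}

# compatible[d] lists the patients that donor d could successfully give a kidney to
compatible = {
    "me":            ["alice"],
    "alice_brother": ["carol"],
    "carol_mother":  ["my mother"],
}

def build_chain(start_pair, pairs, compatible):
    """Extend a chain greedily: each pair's donor gives to the next pair's
    patient, on the condition that their own patient receives a kidney too."""
    chain = [start_pair]
    used = {start_pair}
    current_donor = pairs[start_pair]["donor"]
    while True:
        next_pair = next(
            (p for p in pairs
             if p not in used
             and pairs[p]["patient"] in compatible.get(current_donor, [])),
            None,
        )
        if next_pair is None:
            break
        chain.append(next_pair)
        used.add(next_pair)
        current_donor = pairs[next_pair]["donor"]
    return chain

print(build_chain("me", pairs, compatible))
# ['me', 'alice', 'carol']: my kidney goes to Alice, Alice's brother's to Carol,
# and Carol's mother's kidney is a match for my mother, closing the loop.
```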

As a result of this chain, people who would usually have no interest in helping strangers end up doing so, because they want to help their loved ones. By setting up the chain, the software aligns our concern for those closest to us with helping others.

The more we can develop ways of incentivizing altruism, the better off society will become.

Is this moral enhancement?

At this point, someone might object to calling these things moral enhancement. Is it really moral enhancement if we are removing temptations and changing incentives so that people do more good? How is that better morality – wouldn’t better morality mean making the right decisions when faced with hard dilemmas, rather than dodging the dilemmas entirely?

My response would be that much of the progress of civilization has consisted of making it easier to be moral.

I have had the privilege of growing up in a country that is wealthy and safe enough that I have never needed to steal or kill. I have never been placed in a situation where those would have been sensible options, let alone necessary for my survival. And because I’ve had the luck of never needing to do those things, it has been easy for me to internalize that killing people or stealing from them are things that you simply don’t do.

Obviously it’s also possible for someone to decide that stealing and killing are wrong despite growing up in a society where such things are sometimes necessary. Yet living in a safer society means that people don’t even have to make that decision – they can simply take it for granted. And societies where people have seen less conflict tend to be safer and to have more trust in general.

If we can make it easier for people to act in the right way, then more people will end up behaving in ways that make both themselves and others better off. I’d be happy to call that moral enhancement.

Whatever we decide to call it, we have an opportunity to use technology to make the world a better place.

Let’s get to it.

Originally published at Kaj Sotala. You can comment here or there.
