A “Bot” Too Far
By David Stevens, MD, MA (Ethics) | July 27, 2017
I just had a 35-minute conversation with a robot. Her name is Emily, and she is an artificial intelligence computer that offered to have my “hardest conversation” ever with her so she could create an end-of-life document for me, otherwise known as a living will.
Our messenger “conversation” started with her asking me these four value questions:
On a scale of one to five, pick “1” if you strongly disagree and “5” if you strongly agree.
- If I can never communicate with my friends and family again, my life is not worth living.
- If I would be spending the rest of my life in a coma, with no hope of recovery, my life would not be worth living.
- My life would not be worth living if the rest of my life would be spent in constant pain.
- If I can't feed myself, clean myself and look after myself, my life isn’t worth living.
Some of those questions sound suspiciously like the motives people check off on forms when they seek physician-assisted suicide.
What is this “bot” up to?
We then moved on to whether I wanted to donate my organs. She told me it was okay not to donate if I was uncomfortable having my organs in someone else’s body after I died. She then wrote:
Organ donation is sorely needed, too. There are around 120,000 people on the transplant list, and every day, around 22 people die because they didn’t get a transplant in time. People who choose to donate their organs often say that they feel it’s “the right thing to do.”
Are you still with me?
For some, the knowledge that their organs may help someone adds some hope in a difficult time. Whether or not you are an organ donor doesn’t affect your chances of *receiving* organs if you need them at some point in the future.
Some people make the argument that if you are willing to potentially receive organs, you should also be willing to donate them.
I hope that helped you get a bit more of an overview. Shall we move along? Oh, one final thing before I ask you for your preferences...I should clear up a misconception about organ donation. If you do decide to donate your organs, you don’t have to pay anything. The person receiving organs would pay the cost of having your organs removed and transported.
Anyway, with that out of the way...
Would you like to make your organs available for donation?
Despite the disclaimer that it was up to me, Emily really made a case for me to answer, “Yes.”
Life Folder, a new startup company based in Oakland, California, developed “Emily” and placed links to her on Facebook and other social media sites. Young people are this free service’s target market.
One click, and a teenager or young adult is making some really important decisions based on some very leading questions, all while Emily assures you it's okay to take your time and that any choice is fine. After "chatting" with Emily, my experience struck me as a case study in passive-aggressiveness.
She then gave me some good reasons for not wanting “life-sustaining treatment.”
If at any point you are seriously injured, or you have an illness that is threatening to kill you, doctors can sometimes keep you alive artificially. For your documentation, we need to talk about whether you think life-sustaining treatment is appropriate for you… when I say "life-sustaining treatment," I mean that the underlying conditions are unlikely to get better. If treatment is stopped, you are likely to die.
You also can easily be led to the decision the programmer prefers. When I told Emily I didn't understand what life-sustaining treatment meant, here is how she explained it to me:
Do you ever watch doctor or hospital dramas on TV? Awesome, me too! Dr. Gregory on House, especially. Have you ever heard him speak with his British accent, David? <swoon> Ahem. Let's pretend I never shared my embarrassing crush on House....
In medical shows, they often show the use of life-saving treatments for dramatic effects. These techniques are not just for TV. They are also used in real hospitals. Who knew! Television is useful for something! When a doctor is using their hands on someone’s chest doing “chest compressions,” that is an example of a life saving treatment. You know those paddles they use on TV... You'll hear a beeping sound, and the doctor screams "clear" and then zaps the patient. That's another example of a life-sustaining treatment. For our purpose, when we say "life support treatment," imagine a situation where you would need artificial breathing for the rest of your life. There are other types of life sustaining treatments, too. But basically, "artificial ventilation" - or having a machine breathe for you - is the main thing we talk about when someone is "on life support" or "receiving life-sustaining treatment."
We now have some decisions to make...Imagine a scenario where you've been in a serious car accident. In the accident, you sustained some grave injuries, and doctors believe you've sustained brain damage. A breathing machine is keeping you alive. A number of doctors have looked at you, and they agree that if the machines are turned off, you will probably die very soon afterwards. The life-sustaining treatment your doctors are offering you will not help you *avoid* death, it will only delay it. In those circumstances, you can choose to receive or decline more long-term life-sustaining treatment.
I think everyone should have a serious discussion with their healthcare professional about their end-of-life wishes, but having it with a robot is a really bad choice.
First, in reading through the dialogue Emily and I shared, there seems to be an agenda here to get young people to make the choice for death and to donate their slightly used organs.
Second, a living will, which is what Emily is creating, is not the way to go. No person should make a blanket edict about what they want or don't want at the end of life. The best option is to have a trusted person who knows your values, often a family member, hold your power of attorney for healthcare. Then they can gather all the information about your condition and make the right decision based on their knowledge of what you would want.
Let me give an example of why living wills are a bad idea. I was talking to a pastor who proudly told me how he put in his living will that he never wanted to be put on a respirator. I asked him, “What if I needed to put you on a respirator for a week because you had pneumonia and you could live many more years?” He responded, “Sure, I would want you to do that.” I replied, “You have signed a document that says I can’t.”
Third, having a robot help you prepare a living will is an even worse idea, no matter how laudable a company's motive. An artificial intelligence robot cannot have an intelligent conversation on this issue. Too many explanations are needed, and too many questions need to be answered. Plus, there are nuances of body language and tone to observe and respond to in these discussions. A professional relationship of trust with your healthcare professional is essential when making these types of decisions. That is why I also don't recommend that people simply download an end-of-life options form and fill it out all by themselves.
To sum it up, Emily is a “bot” that goes way too far. Life and death conversations need to be with real people, not robots.