Right/Wrong: How Technology Transforms Our Ethics, with Juan Enriquez

January 27, 2021

Many swings of the pendulum between right and wrong are driven by advances in technology. In his new book "Right/Wrong," Juan Enriquez reflects on how ethics evolve in the technological age. How will accelerating technology challenge and upend your ideas about right and wrong? What are we doing today that will be considered abhorrent tomorrow because of technological change?

To follow along with Enriquez's slides, please watch the full video of this talk on YouTube.

WENDELL WALLACH: Good afternoon. I'm Wendell Wallach, and I am here to welcome you to this event from the Carnegie Council for Ethics in International Affairs, co-hosted by The Trebuchet, which was founded by one of our speakers today. I am a Carnegie-Uehiro Senior Fellow, and here I co-chair the Artificial Intelligence and Equality Initiative.

Today we are going to begin talking with Juan Enriquez. Many of you may know Juan already. He is a TED superstar presenter—he has presented eight times—and he is a research affiliate at the Massachusetts Institute of Technology's Synthetic Neurobiology Lab. Juan has written this wonderful book, Right/Wrong: How Technology Transforms Our Ethics, in which he discusses the ways in which ethics have evolved over the centuries as new technologies have come into being and will evolve dramatically in the coming decades. This book is informative and insightful. It is a tour through many different kinds of technologies. It is quite a fascinating read, and Juan has been particularly careful to avoid jargon. He is creative. His style is humorous.

Juan, in writing this book you had a very serious intent in mind, and I wonder if you could take 10 to 15 minutes to tell us about Right/Wrong and why you wrote the book.

JUAN ENRIQUEZ: I would love to. Thank you, Wendell.

I am going to share some short slides and try to make this as painless as possible.

Let's start with three premises. The first premise is: We all think we know right from wrong, and often in historical terms we find out that we didn't.

The second premise is: Even if you do know right from wrong, it's likely that right and wrong will change across time.

The third premise is: Technology changes our fundamental notion of right and wrong. Technology is accelerating at an exponential pace, and therefore you might consider whether our notion of what is right and what is wrong may begin to change at exponential rates.

Let's start with this whole dichotomy of right and wrong and the first premise, which is, what if right and wrong do actually change across time? There is actually quite a bit of proof of this. If you take a time machine and go back to Aztec or Mayan times, it was normal, natural, legal, expected, and religiously justified as well as legally justified to sacrifice humans in the most horrendous ways possible. So you take some poor schlub, drag them up to the top of a pyramid, rip out their bleeding heart with an obsidian knife, and hold it up to the sun.

Of course, there was a justification for this because in people's minds if you didn't do that, the rains wouldn't come or the sun wouldn't rise. Now we know that, because of technology, astronomy, and other things, to be absolutely idiotic. So in retrospect it looks pretty stupid. But at the time it was considered absolutely right and essential.

We can laugh at the Aztecs, we can laugh at the Mayans. But not that long ago in the fanciest squares in Paris you used to hold public executions, you would cut some poor person's head off, and then you would hold it up for the crowd to see. People would be dressed in their weekend finest, and they would be buying postcards, eating foods, and enjoying the spectacle, which again to us looks absolutely barbarous, but at the time was considered right and part of the rule of law and was justified by the church and all of the authorities.

You can take some of the most disastrous, horrible, and unjustifiable actions done by humans, such as slavery, indentured servitude, and serfdom, and these are practices that were not just prevalent in the United States' South, they were prevalent in the United States' North, they were prevalent with the Incas, the Mayas, the Indians, the Chinese, the Arabs, and the Africans across millennia. All of us on this call know that this is an absolutely wrong procedure, that there is no justifiable reason why any human being should own, control, or do horrible things to another human being.

If we all know that, then why in the hell was this tolerated for millennia across so many cultures? A question which is equally important is: Why did it go away? Why did it go away in legal terms—I understand there is still slavery today, but it is illegal in all civilized societies, and it became illegal in a relatively short period of time, in decades, across many societies.

Obviously part of this is due to incredibly brave abolitionists, like Harriet Beecher Stowe, like Harriet Tubman, like Henry Ward Beecher, putting their lives on the line and opposing a system which was defended by many. They fought against it and put their lives, their jobs, and their reputations on the line, and that is certainly a part of it.

But it may not be a complete coincidence that slavery, indentured servitude, and these practices, which had endured for millennia, began to go away when we started using fossil energy, because one barrel of oil contains the energy equivalent of five to ten years of human labor, and when you tie that barrel of oil to machines with thousands of horsepower, then all of a sudden you can have the work of a million people at your disposal without having to enslave or hold in indentured servitude hundreds of thousands or millions of people. So what technology may have done in this instance is give us a series of options that we did not have for millennia. Again, I want to stress this in no way justifies behavior which was absolutely heinous across millennia, but it may be a reason why practices we tolerated, which were legal for millennia, began to go away.

So what happens to the world? Once you begin to do away with slavery, once you begin to have energy, once you begin to have thousands of horsepower at your disposal, life expectancy, which had been flat for millennia, begins to go up dramatically on all continents. One consequence of this may be that technology, as it changes ethics, will change ethics faster and faster, because technology is becoming exponential.

Take something like gay marriage. I was brought up in Mexico, I went to Jesuit school, I was taught by everybody I respected that it was right to discriminate against gays. I was brought up to be a little bigot. Had I not had exposure to U.S. education and U.S. society I may still be a bigot, and it pains me.

But it is also true that two-thirds of the United States was against gay marriage in 1997 and that this flipped 180 degrees by 2017, to the point where two-thirds was for gay marriage and gay rights. Again, this has a lot to do with technology. It has a lot to do with other cultures and other lifestyles coming into our lives through television, through radio, through film, through international exchanges, through international travel, and through our schools. So it was much, much harder to see these folks as "the other," as somebody who was different. We saw very creative, very talented, extraordinary groups of people who had been in the closet come out with the crisis of AIDS and begin to act up, begin to speak up, and begin to be listened to. Suddenly it was your uncle and it was your cousin and it was your friend, and it became much harder to demonize these groups.

It became so hard that Cardinal Bergoglio, who in 2010 said, "This is a destructive attempt towards God's plan," flipped within three years to, "Who am I to judge?" And last month he said, "I'm in favor of gay rights," even though most of his flock has not gotten there. So you can see a very quick shift in what is right and what is wrong.

Of course, the question is: What if many of the scientists, of the researchers, of the people listening to this are the people causing this shift? What happens if what is okay today may be completely wrong tomorrow?

As you are considering this concept, think about things that we are doing today that may be judged very harshly tomorrow. What if your Fourth of July pictures of that wonderful barbecue you are going to celebrate this July—because we should be free to travel once again and see our loved ones once again—end up being seen as a savage event in a few decades?

Why would that be true? Because synthetic meat was $380,000 a burger in 2013, $30 a burger in 2015, and about $9 a burger at Whole Foods last week. In the measure that synthetic meats become faster, better, cheaper, in the measure that places like Burger King begin to sell out of "impossible" burgers before they sell out of regular burgers, it may be that pictures of mom and dad going out on date night to the fanciest restaurant and walking past these racks of aging bloody steaks may not be seen kindly when you don't have to slaughter 6 billion animals a year to eat great meat or great steaks.

Again, taking a time machine, what I want to stress is how quickly things have changed, how radically things have changed, and how quickly they may continue to change. Imagine talking about the birds and the bees with grandpa and grandma. If you brought them here today and sat them at your dining room table as randy 20-year-olds, they would probably understand sex because they were probably married at that point. But then you would have to tell them, "Oh, by the way, you can now have sex and not have a child, and you can do this consistently." Yes, they had heard about birth control. There was primitive birth control, but it wasn't consistent, it wasn't for the most part legal, and it wasn't practiced by many of them. So all of a sudden you have decoupled the act from the consequence.

Then you begin talking to them about in vitro fertilization (IVF), and you say: "Hey! We can have a child by mixing sperm and egg and never touching one another, in fact, not being in the same room or maybe even the same country." Again, they would look at you and say: "Hang on. You can conceive a child without physical contact? We used to call that the immaculate conception, and we considered that to be a miracle."

Then you explain that you can freeze eggs, freeze sperm, have surrogate mothers, and have identical twins born decades apart. So you not only decouple sex from its consequences through birth control, you not only decouple sex from physical contact through IVF, but you now decouple sex from time.

If you asked them back when they were 20, are any of these things right, should we be doing any of these things, do you think this is right or do you think this is wrong, they almost all would have said, "This is absolutely wrong," but today it is something we consider not only right but our right.

As we look at headlines like "CRISPR Baby Scandals of Editing Genes," I think many of us would say: "I really don't want you to be editing babies. I really don't want you to be editing the next generation."

But again this could be flipped 180 degrees as things get faster, better, and cheaper, because you can easily see a conversation with our 60-year-old grandkids where they say: "Grandpa, grandma, mom, and dad were so savage and so primitive, they didn't bother to edit my genes when I was a baby or an embryo, and because they didn't take out a KRAS, because they didn't take out a p53, it turns out that now I have cancer, and they could have fixed that, but they were just too primitive."

Birth may change in a fundamental way. The first synthetic womb was patented back in the 1950s, but now you are beginning to get full animals brought to term in what look like giant Ziploc bags. As you look at this stuff, there is kind of an "eww factor"—or, since this is a sheep, a "ewe factor"—an itchiness of "Why would you bring something to term outside of the body?" Well, would an incubator like this be better for a preemie baby than those big plastic boxes we put them in today? And as you think about that, the long-term consequence may be that it is safer to keep a child in something like this than to take it mountain biking or to expose it to Zika on a trip with your loved one. So the logic of this again may flip 180 degrees, and once you have an external embryo it may be possible that you edit, and that you choose to edit.

The whole point of this is not whether you agree or disagree with any one of these technologies. The whole point is that if you take the position today of "I know right from wrong"—and there is a certain amount of that Sturm und Drang out there today—then you create a no man's land between somebody at one extreme position and somebody at the other, and there is no discussion, no tolerance, no evolution, and no learning.

That happens an awful lot these days on campus. It happens an awful lot these days in science debate. It happens an awful lot in politics. You take an entrenched position over here or over here, and there is no discussion. But again, if technology is exponential, if technology changes ethics, and if ethics may change at exponential rates, we need to have a more nuanced discussion of ethics because we are going to start seeing technologies that to us seem miraculous, weird, or scary but eventually may become commonplace.

Tony Atala at Wake Forest is thinking about the organs of the body in the same way that we think about teeth. Teeth disappear, they appear, they do it once, they do it twice. The whole point of this thing is that may start to happen with livers. That may start to happen with lungs. That may start to happen with hearts, with a whole series of things. And as it does so, as we begin to program synthetic life forms, as we begin to print life forms, as we begin to assemble life forms that are a mixture of rat cells, rubber, gold, and other structures, we are going to be facing questions about what life is, about how we edit life, and about how we change life. We are going to have complicated debates about ethics and right and wrong, and as we structure these things our position, who we are, where we are, and where we go are going to require two things: Not a certainty about what right and wrong is, because that may change; it is going to require two words that are very rarely used today, humility and forgiveness.

It is incredibly important when we judge the past that we be a little more humble because the meanings of words, actions, and contexts can change. That does not justify the acts in the past, but when people were acting in that context they may not have had the context we have today. And if we don't have some humility, we can't cull out the people who in the context of the past were acting in a truly evil way. We give an umbrella to the truly evil under the context of "everybody does this."

The last thing is forgiveness. It's important that we forgive the past and we forgive each other. Again, not the truly evil. You have to cull out the 1 percent who is truly evil, but we have to be a lot more tolerant in terms of our notions of right and wrong because our notions of right and wrong may fundamentally change over time.

Let me stop there.

WENDELL WALLACH: Thank you ever so much, Juan. You have no doubt stirred up all kinds of questions in the minds of those who are viewing with us.

I suspect some of you are already beginning to think about technologies that you wonder about, or believe, are going to change our ethics. Please feel free to put questions and comments about technologies that you are concerned about, or that you anticipate will create ethical challenges for us, in the chat, and before the end of our talk today we will turn to what is in the chat and give a few of you an opportunity to raise what you have brought to our attention.

In the meantime, we are lucky to have two respondents—interlocutors—with us today. One is Sherman Teichman. Sherman is a truly legendary emeritus professor of political science at Tufts University. While he was at Tufts he began many innovative programs. That is actually how I first met Sherman, when he invited me to participate in a very exciting program. He put these all together and became the founding director emeritus of The Institute for Global Leadership. In addition, he is the founding president of The Trebuchet, our co-host for this event.

Accompanying Sherman we have Kit McDonnell. Back when she was at Tufts, Kit was an alumna of Sherman's programs and an intern for Sherman, but she has gone on to build a truly distinguished career for herself. She is currently the director of corporate affairs at the agtech startup Enko Chem, and formerly she led special projects at the synthetic biology company Ginkgo Bioworks. So she works at the intersection of biotech, design, sustainability, and biosafety.

Let me turn to you first, Sherman. Do you want to bring in some comments or questions?

SHERMAN TEICHMAN: Thank you, Wendell. Juan and I have known each other now for decades. Both of you are mentors at this particular point for our own Trebuchet, my consultancy, and the reason that you both are engaged in our work is because you do not see the world in dichotomous terms. You are not Manichean in your thinking. You are disrupters of the first order, and it is an honor, Juan, to think with you.

In a certain sense we are living in one of the more polarized periods of our history. It is disruptive. It is incredibly easy to rip apart. Years ago you actually wrote a seminal book for me—not for me, but we had you as an outward bound instructor—and it was called The Untied States of America: Polarization, Fracturing, and Our Future. I'm wondering if you can look back a little bit as well as you look forward, and in particular think with me about an anecdote I heard from you once, which was, what would happen or perhaps did happen when you spoke before the cadets at West Point?

The whole question of the continuity of our sovereignty, our nation, I think is now at stake in many ways. People are talking about an "uncivil war." We are ripped apart in so many ways, not least by the manner in which we are so judgmental. The reason in a major way that I deeply appreciate your book is not only the exponential rate of technology, but what impact it has on our political structure and the way in which we have lacked capacity for humane discourse.

JUAN ENRIQUEZ: Sherman, I have been going back and forth not just on ethics but on the future of the nation-state and the future of technology for decades now.

The thing he is referring to is a book that I published in 2005 that argued that there was going to be a gigantic financial crisis. It was going to be driven by over-leveraging real estate, but the true long-term effect was that it was going to rip nations to pieces. That was The Untied States of America: Polarization, Fracturing, and Our Future. It was written as a warning, not as a manual of how to.

As you are thinking about that, three-quarters of the nations in the world—their flags, borders, and anthems—did not exist a few decades ago. It is very easy to rip a nation apart. The number of borders in Europe has tripled in less than a century, and you still have debates going on with the Basques, the Catalans, the Galicians, the northern Italians, the southern Finns, the Corsicans, etc.

When it comes to the United States, if we spend several billion dollars convincing 51 percent of the population that you never, ever want to associate with those others over here because they are different, they are pedophiles, they are baby killers, they are rednecks, they are elitists, they are this, that, or the other, you can and will rip our nation to pieces.

So the question Sherman is referring to is the question that I ask cadets at West Point when I am privileged to lecture there. The first question that I ask them is: "How many stars do you think will be in the U.S. flag in 50 years?" That's a gut punch to a cadet who is about to go out and be deployed to give potentially his or her life for that flag.

So you have to follow up with a second question: "Exactly how many presidents of the United States have been buried under the exact same number of stars that they were born under?" The answer is exactly zero. There has never been a president of this country born under the same number of stars that they were buried under. So it's not such a lunatic question, especially in these times of polarization.

I think it is a really dangerous time to stereotype half the country and say: "They are different. They are not like us. They are hateful." Again, because what it does is it creates an umbrella where the truly evil can say, "Well, everybody does this, and therefore you can't judge me separately." It creates a dynamic where what has been happening in the rest of the world may end up happening in the United States, which is, "I never want to associate with those other people."

I think that is a really lousy outcome. I think this is a great country, and I think we put kids in cages and demonize half the country at our peril.

KIT MCDONNELL: Thank you for that one. I have a bit of a follow-up question, especially in light of recent events going on in the United States. In your book you talk a lot about this breakneck speed of technology. It seems that there is a dichotomy right now between the need for fast technology and fast science in contrast with slow and thoughtful technology. Biotech's response to COVID-19 has had to be swift, and it is imperative that it is, but we are also seeing how quickly a form of technology can be weaponized, such as communication apps that recently assisted in the insurrection at the Capitol.

Can you make the case for slower technology and slower science, à la Isabelle Stengers?

JUAN ENRIQUEZ: I think it is very hard to slow technology. It is certainly possible in certain areas. It is certainly possible in certain countries. But I think having a global moratorium on rapid communication technologies would be very difficult to do.

I think it becomes a question of how you create ethics, legitimacy, transparency, and accountability, because it is not your having an opinion or a speech that is different from mine that I think is the fundamental problem. The fundamental problem is when you are able to weaponize some of this stuff through bots, when you are able to create false voices, when you are able to create fake narratives, and when you are able to put stuff out there with no accountability for what you say.

I think the lawsuits that the voting machine companies are putting out there are a shot across the bow on this stuff. I think the pushback against some of the weaponization of fake news is important, and that is a debate we absolutely have to have. But as you have that debate you also have to think that we never would have had as strong a local, national, and international reaction to Mr. Floyd's death had we not all had a broadcast studio in our pockets that looks like your cellphone, because this is a high-definition broadcast studio with global reach.

I think some of the injustices, some of the corruption, and some of the evil that is out there has been exposed along with a whole lot of fake news, and it is separating and culling. Ninety-nine percent plus of people go to bed at night not thinking, How am I going to hurt my neighbor? They go to bed thinking: How do I ensure a better life for my kids? How do I get liked by my peers? How do I have a decent job? How do I take care of those I love? That is the part that we need to feed as we cull out the less than 1 percent that is truly evil.

SHERMAN TEICHMAN: I am, as you know, an educator, and you have a wonderful story to tell about South Carolina and Furman University in the context of who educates and how it is that we as educators have a remarkable responsibility to think about what is authoritative and what needs to be questioned.

JUAN ENRIQUEZ: I think the role of educators in ethics is way understated. Often we think of ethics as a very thick manual that you get when you go to school or when you get your first job. That's not ethics. That's not the stuff that Wendell thinks about and writes about. Ethics is a living, breathing, very complex thing that changes over time, and I think educators in colleges and universities have a particularly essential need to continuously question right and wrong, because when they don't they make huge mistakes.

You can think about that in the context of a young man arriving at the University of South Carolina circa 1830 or 1840. That person would have been taught in church by a reverend called Richard Furman that slavery was fine, normal, and natural, and that would have been based on select passages from the Bible. You would have had that same person being taught by the family doctor, who during that time in South Carolina was the man who created gynecology and who also thought it was fine to buy human beings to experiment on. That person would have been taught by mama and papa, who likely had slaves, that this was the natural order of things.

The educators, the teachers, the preachers, the parents, everybody who is supposed to tell this kid, "This is right and this is wrong," was teaching this kid exactly the wrong thing. It is especially troubling when he gets to the University of South Carolina because the person teaching him at that point, Dr. Cooper, was an Oxford "don" who was a chemist, a scientist, and an abolitionist in Britain. He is brought in to run the university to teach these kids, and within a short period of time an abolitionist flips 180 degrees and becomes somebody who is writing pamphlets in favor of slavery and becomes one of the architects of secession in the United States.

That person should have known better. There should not be a library at the University of South Carolina named after Cooper. Because in the context of the time, boy, that's somebody who should have known better. I get really upset when I see an aircraft carrier named USS John C. Stennis, and I walk on the aircraft carrier and I see a little museum to a segregationist. I get really upset when I see some of the buildings named after people who should have known better. This is not a 16-year-old being taught by a system. These are people in the 1960s and 1970s.

By the way, as I look at our own lives these days, why the hell did we tolerate putting kids in cages? Why weren't we out in the streets en masse protesting the stealing of children from their parents and the deliberate separation, which is something which the Argentine junta did, the Brazilian junta did, the Chilean junta did, which is a crime against humanity? Why in our time, in our self-righteousness, did we tolerate that?

KIT MCDONNELL: Juan, I have a question specifically about CRISPR babies as you mentioned in your opening. I know this is a deeply, deeply controversial topic.

I want to ask you specifically—for example, I am red-green colorblind, which is actually pretty rare for a woman. I wanted to ask you: How do you reconcile the tipping of scales from something that is a medical intervention to, say, a cosmetic intervention? How does that translate into the way that we can respect something like evolution and the history that evolution has in perfecting and iterating upon the natural world?

JUAN ENRIQUEZ: Again, that's one of the areas why I love listening to and learning from Wendell and other ethicists, who are not taking one position of right/wrong but are saying, "You have to understand these are complex systems," because again that's where ethics gets so interesting.

The whole plastic surgery debate is a fascinating debate because there are tens of millions of plastic surgeries per year, and there is a rumor out there that not all of them are done for strictly medical reasons. It is just an unfounded rumor, I'm sure.

But as you are thinking about that question, when you have to disclose it, how should you disclose it? Who should you disclose it to? It turns out that the place where women get the most nose modifications in the world is Iran. When the baby is born and the nose looks very different from that of the mother, should that have been disclosed? How do you think about that?

As you think about more complex interventions which are not strictly cosmetic—a lot of women are tetrachromatic, have four cone receptors, and can see many more colors than most men. Some people may have certain senses that are more acute than other people's. If you could engineer that into a person safely, effectively, and cheaply, should that be allowed as a cosmetic procedure if it's safe? Who gets to make those choices?

The question you're asking is one where today CRISPR is nowhere near targeted or safe enough for us to be able to say yes, but what if it were, and you knew it was safe? Who has the right to alter their body for better looks, for more pleasure, for increased senses, and at what point do you make that decision? Again, that's where ethics comes alive. That's why when we hear the word "ethics" we should think: Wow, this is a really interesting, complex, dynamic system, and these debates really matter to the future of our societies and to the future of our bodies and humanity.

WENDELL WALLACH: Let me jump back in here. I usually relate to these topics about the ethics of emerging technologies very much in their own languages, but I am going to indulge being a bioethicist here for a moment because one of the characteristics of your book—and I think what makes it so fascinating and entertaining—is that you do avoid the languages of ethics. But I am also wondering whether that is helpful to us.

Like you, I don't think the languages of ethics necessarily give us right and wrong. They are not algorithmic. At best what they may do is help underscore some of our salient ethical considerations that we want factored into choices and judgments. But in your avoidance of the language of ethics—first of all, I didn't see you use the words "consequentialism" or "deontology" once, which are the favorites of academic ethicists, but you do say at a few points that you are not a relativist. There is a right and wrong. Yet, the way in which you approach technology as forcing the evolution of ethics, I want to know where does that right and wrong come from.

JUAN ENRIQUEZ: One of the fundamental, unbending rules—when Karen Armstrong and others went across religions and tried to find commonality among them, I think the Golden Rule is about as close as you can get, which is: you want to be treated in a way that is decent, respectful, and just. Our understanding of decency, respect, and justice changes over time as to what that standard looks like. But the fundamental consequence of it is that you should treat others in the same way.

I think one of the things that we have gotten better at is not seeing large groups of people as "the others." I think as we become more conscious of the need to address things like climate change or the need to address poverty or the need to address a pandemic on a global scale, it becomes harder and harder to isolate groups of "the others." As long as we don't see them as the others, it becomes much harder to say: "I deserve to be treated in a completely different way from the way I'm going to treat these folks over here."

I think there are two things going on here. One is that the basic Golden Rule is applicable to more and more societies, to more and more people of different religions, of different colors, of different beliefs. I think that's a really great thing because if we don't do that, we simply cannot address things by saying there is going to be a wall.

We just found out through a pandemic that, yes, we can vaccinate every American, but if the Brazilians don't do it or the South Africans don't do it, then the variant is going to come back again and infect us. We leave others behind at our peril. When we treat other people as "the others" and say, "The standards for my society are this, and the standards for that society are the other," we fall into that trap, and that evolution of language, that evolution of inclusiveness, becomes incredibly important.

One of the changes that the current administration has made which I think is going to be really important across time is to remove the official word "alien" from how we talk about people who are non-citizens. Just the word "alien" brings up science fiction; it brings up people who come from another planet. To the extent that they come from our planet, it becomes much harder to say they're not going to feel the pain we would feel if their kids are deliberately taken, caged, and lost. You can do that with aliens. You can't do that with human beings. You can't do it with your tribe. You can't do it with people who are like you, and that inclusiveness of language, that evolution of language, tends toward that Golden Rule.

We get better over time. The whole notion of whether you're an optimist or a pessimist in part depends on the answer to the following question: If you could safely be put into hibernation and wake up in a hundred years, would you want to do that? My answer is, "Hell, yes!" because I think things are going to be better in a hundred years. I don't think we are going to wake up to a dystopian, nuclear, Mad Max [world], and you see that across time. You see it in how we have reduced the use of violence. We have reduced the use of torture. We have reduced burning people at the stake. That doesn't mean it is completely eliminated, but it means that the legitimacy of things that were formerly considered okay is tending I think, with ups and downs, in the right direction.

You have to assume that if you are going to be an optimist. You also have to limit your judgment of the past and evil people in the past so it's not "Everybody was evil." It's "In the context of their time, in the context of how people were acting, these specific people were truly awful, and we have to highlight their behavior in the context of their time." I wouldn't treat Eli Yale in the same way as I would treat Benjamin Franklin. I think Eli Yale was not a nice human being under any definition. That's not justifying what Franklin did, but it's putting a context in his time as to who he was and who somebody else was.

WENDELL WALLACH: Affiliated with Yale University, I don't know if I should take any issue with that comment, but I do want to come back to this battle between techno-pessimism and techno-optimism because you certainly aren't making the case that technology always forces our ethics to evolve in positive ways.

I think back to Eli Whitney and the cotton gin. Slavery was actually about to become economically unfeasible until Eli Whitney created the cotton gin, and then suddenly it became economically valuable to have slaves again. Then we go on to the Civil War with the Gatling gun and this tremendous evolution of military technologies. I get particularly concerned about lethal autonomous weapons and whether their adoption could actually lead to that "accident" that sets us back in spite of my techno-optimism.

How optimistic are you? Are there technologies that you are particularly concerned about and that you are willing to reject for ethical considerations?

JUAN ENRIQUEZ: If I left the impression that I think every technology always leads to a better outcome and that there aren't wildly unethical ways to use new technologies, that is not the impression I want to leave, and that is not what I believe. I think technology can be misused in incredibly dangerous ways. I think if we don't understand how quickly technology can change our notions of right and wrong, then we are in danger of misusing technology in a fundamental way, and one way is the autonomous weapons debate.

When you think of Asimov's basic rules and you think about one of the three fundamental rules being "Robots should not hurt people," you are violating one of Asimov's rules by making these weapons more and more autonomous, and we do that at our peril. That I think is an incredibly dangerous thing to be playing with.

I think we are underestimating the danger of nuclear weapons. We have gotten crazily used to them during a time when proliferation seems more and more likely. It is certain that one single nuclear weapon can truly ruin your entire day. So I think we are paying way too little attention to proliferation, to the risks of nuclear war, and to the real question: What that we are doing now will be seen as truly wrong in a century? The fact that we tolerate single individuals in this world having the power to decide to wipe out most of life on earth using nuclear weapons?

People are going to look at that in a hundred years and say: "What the hell were you thinking? Why did you tolerate that? Why was that okay? Why do you think it's okay for the president of a country or the prime minister of a country or the dictator of a country to have that kind of power or weaponry? And why weren't you using every breath of your day to try to stop that"—hopefully before you have a major war that brings that question to the forefront? So I completely agree with you.

Let me step back for one second. Why did I write this book? I spend most of my time thinking about and working on building companies, writing stuff, or studying stuff that has to do with engineering life forms, with building synthetic life forms, and with understanding the brain, what happens inside the brain, and how we can alter the brain. Every time I gave a talk about this stuff I was asked about ethics.

I sat back and thought: Hey, this is something relatively simple. I will come up with 10 principles, and I will have an answer to the questions I am asked. Well, six years later, this book gets published. This book is crazy incomplete because it doesn't have an answer at the end of it. It doesn't say, "If only you do X or Y, you will know right from wrong." This is a book that is intended to stimulate debate and to make debate safer on campus because so many people are running around thinking: I know exactly what's right. I know exactly what's wrong.

That makes it very difficult to talk about subjects. If you use one wrong phrase, if you have one wrong tweet, if you sit next to one wrong person at a dinner, you can be condemned for what you did 10 minutes ago, 10 days ago, 10 months ago, or 10 years ago. That leaves very little space for mistakes. That leaves very little space for debate. That leaves very little space to talk about the stuff you just posed. There are a bunch of places where ethics gets really mushy and gets really complicated. It's not "This is right and this is wrong." It becomes fifty shades of gray. That is why it's so important to pay attention to this stuff.

WENDELL WALLACH: I am going to bring in my colleague, Alex Woodson, for a moment. He has been monitoring the Chat.

Alex, perhaps you can bring out one or two or three of our participants and their comments or questions to lead us forward.

ALEX WOODSON: Sure. There is a lot in the chat. I'm sorry, but I don't think we will be able to get to all of it, but I will bring in two people right now.

Sybille Riedmiller has been writing a few things: "The bottom line of atheist humanism is the famous saying of Immanuel Kant: 'Treat others the same way you want them to treat you.'" They also write: "Weapons of mass destruction are fundamentally ethically evil, irrespective of technological advances in this field."

Here is a question that maybe we can discuss from Whit Henderson: "How will our ability to alter the climate through geoengineering change our future moral relationship to nature? Are you concerned that geoengineering/solar radiation management might be framed as a technological fix to climate change that allows us to continue to burn fossil fuels?"

JUAN ENRIQUEZ: Let's step back. Stuff we tolerate [now] we will be seeing in retrospect as absolutely wrong. When you see the cost curve for the price of solar, or the cost curve on geoengineering, or the cost curve on wind, those are cost curves that are coming down on a logarithmic scale. They have already crossed below the price of coal, which means it makes no sense to use coal instead of solar in most instances. They are going to cross the price of petroleum for most uses, though not for aviation or a couple of other uses.

But basically you have faster, better, and cheaper energy, and to the extent that you have faster, better, cheaper energy, people are going to look at what we did to heat our houses and to move our cars, and they are going to say: "You people were savages because you destroyed the planet to such an extent that you passed the point of no return, and you had to use extremes like geoengineering to begin to bring down the temperature of the planet by 1°C because it was past a tipping point," which is I think where we are.

I would prefer not to use geoengineering. I don't see a way even with extreme conservation—and you have seen it with the pandemic—of having enough of a fast impact on this to cool the planet fast enough before you get to some really horrible things.

In that context I think that we will fix it. It is going to be incredibly painful. It will require an extreme retreat from parts of the coastlines because of what is already baked in, and future generations, having faster, better, and cheaper energy, will look back and say: "You people were worse than the people who hunted whales nearly to extinction to light their houses. How dare you have used oil? How dare you have used coal?"—without the context that, during most of our time, we did not have access to faster, better, cheaper sun, geothermal, and wind. But that doesn't mean we are not going to be judged very harshly.

WENDELL WALLACH: I think Sherman wanted to weigh in, and then Alex will go to our next question.

SHERMAN TEICHMAN: Wendell, I first met you in the context of autonomous killing vehicles, drones, and you actually wrote a statement of ethical concern for Obama with Dirk Jameson, one of the former heads of the Strategic Air Command.

I am thinking about education. I am thinking about ethical norms and iconic figures. Dan Ellsberg is so well known for the Pentagon Papers, but more importantly he has written a book called The Doomsday Machine: Confessions of a Nuclear War Planner, which I would think is an imperative text for people to understand.

Juan, I think what you have spoken about in terms of the context of that threat and in terms of that technology is extraordinary. I think the Bulletin of the Atomic Scientists today is pulling back their so-called "Doomsday Clock" because we are reentering arms-control regimes, but I think the interface between that technology, which we are remarkably ignorant of, and its consequences is important to put into this conversation.

WENDELL WALLACH: Alex, do you want to bring in another question from our participants?

ALEX WOODSON: Sure. There are two questions on data and privacy. I will read them both.

This is from Aaron Markowitz-Shulman in Oxford: "Juan, I am interested in your views on privacy. It is clear that the data collected on all of us will continue to increase over the near term. Do you think that we will figure out a way to empower the preservation of privacy for individuals or that we will collectively give up and submit to a new reality where conventional notions of privacy no longer exist?"

This is from Jessica Kahn: "We have seen many examples of the unethical use of data by private companies causing harm. I have seen suggestions that data scientists take an oath like the Hippocratic Oath to say you will use data in an ethical manner. What are your thoughts on this?"

JUAN ENRIQUEZ: One of the reasons why I am such an advocate of humility and forgiveness in judging the past, besides being able to isolate the truly evil during that period, is because the meanings of words change and the meanings of actions change. What we can do changes. Therefore, we are going to be judged really harshly going forward if we keep the same rules as to how we are judging the past and broad-brushing the past.

Take Mark Twain. You say: "Your books can no longer be placed in the library because I don't like your use of the word," with no context of how that word was used during that period and how he was using it. Was he using it as a hate word? That is something which is incredibly important to this question of privacy.

When you think of the most scrutinized people in history—the kings, the queens, the great artists, what we know about Beethoven, what we know about Mozart, and what we know about Queen Elizabeth or whoever in the past—if you took all of the world's great historians to probe that person and that person's letters, it is trivial what we know about those people versus what we know about each of us. We are all putting out a degree of digital exhaust through Facebook, through Twitter, through Google, through our searches, through our dating profiles, through our posts, and through our credit card purchases. These are all electronic tattoos. In the same way that a tattoo is sometimes hard to cover up, an electronic tattoo is really hard to cover up, and it lives after you die. It doesn't get buried. So people are going to know what grandma's dating profile and sexual preferences were because it's all out there, and people are going to know grandpa spent money on X, Y, and Z. How dare he have done that?

What I tell young people today is just be really careful in what you put out there, and whatever you put out there make sure you are comfortable putting it out in public. When I post on Facebook and Twitter, if I'm not willing to make that post public, I'm not going to put it out there because you have to assume there is going to be no privacy there. That does not mean that I am perfect. That does not mean that I don't make massive mistakes. That doesn't mean that I won't do things that in 10 or 20 years will be seen as "How could this person have done that?" But it does mean that young people today have to be far more conscious of this stuff even as they become more accepting of a lowered privacy.

We have had a series of hacks of dating sites—something I talked about in this book before it happened on a big scale, and it has now happened on a big scale. That is letting people know some of your most intimate thoughts. Boy, that electronic tattoo may not look as pretty once you are married, once you have kids, or once you have grandkids. So the data privacy debate is a really, really important debate on ethics, and it's a place where we have to think about right and wrong in a very rapidly evolving landscape.

WENDELL WALLACH: I'm sure we all truly, truly embrace and support your effort to bring a little bit of humility into this conversation on ethics, specifically in relationship to these technologies that are posing daunting challenges for this and coming generations.

I also would like to say I think on behalf of all of us, hopefully your book will find the audience it deserves, because we need this conversation, and we need it badly.

Again, thank you. Thank you, Sherman. Thank you, Kit. And thanks to the staff of the Carnegie Council for putting this event together.
