Stephanie Sy and Rana Foroohar. CREDIT: Billy Pickett.

The Dangers of a Digital Democracy, with Rana Foroohar

Apr 3, 2018

The revelations about the misuse of Facebook data have started a pushback against the top five big tech companies: Facebook, Amazon, Apple, Netflix, and Google. How do approaches to privacy and data use differ in the U.S., Europe, and China? What kind of transparency should we demand? How will AI affect workers? All this and more in a lively and informative discussion with author and "Financial Times" columnist Rana Foroohar.

STEPHANIE SY: Hi, everyone. I'm Stephanie Sy. Thank you so much for joining us on Ethics Matter. I'm so pleased to be joined by Rana Foroohar today to talk about big tech.

Big tech lies at the center of a lot of the ethical issues, frankly, we face as a society today. How companies like Google and Facebook and Amazon use our data has long been a topic of controversy among academics. We have spoken to a lot of them on this program, but recent revelations about how a third party, Cambridge Analytica, allegedly exploited the data of 50 million Facebook users has really brought this debate into the mainstream.

Rana has been writing about these threats for a long time in her column in the Financial Times (FT). She is also the author, by the way, of the award-winning book Makers and Takers: The Rise of Finance and the Fall of American Business, and we're going to get into Makers and Takers and what that means in this conversation as well. She is also CNN's global economic analyst.

Rana, thank you so much for joining us.

RANA FOROOHAR: Thanks for having me.

STEPHANIE SY: Let's just hop right into the Facebook issue. On the one hand, I think when this story broke I was grateful that the amount of data Facebook has, how it uses it, and how third parties are involved was being exposed, because it had long been suspected. Now it seems to have brought the debate into the mainstream.

On the other hand, I think a lot of people are saying, "Wow."

RANA FOROOHAR: "What's going on?"

STEPHANIE SY: "How do I untangle myself from something I have become completely addicted to?"

RANA FOROOHAR: It's interesting. I think of the challenges facing not just Facebook but big tech in general. If you look at the top five tech companies—the FAANGs, as they are known: Facebook, Amazon, Apple, Netflix, and Google—all of them in recent days have taken major share price hits. There are big concerns about regulation. There are big concerns about the future growth trajectory of these firms. That, just as a side note, has a huge effect on the markets and the economy in general, because tech has led the markets for the last two years. The "asset bubble" that has perhaps been created has had a real effect on the economy.

But coming back to what all this means, I think of it in three ways: The big tech companies are facing an economic crisis, a political crisis, and a cognitive crisis. We are starting with the political—with the revelation that Facebook user data was leaked and that it played a part in election manipulation, and possibly in the election of Donald Trump in November 2016.

There are also other political issues here, about lobbying and about the way in which these companies have asserted their economic power. They are now the largest lobbying bloc in Washington, which is really interesting—my previous book covered the financial industry, and in some ways these tech firms have become the new systemically important institutions.

STEPHANIE SY: That's why we call it "big" tech.

RANA FOROOHAR: That's why it's called big tech.

STEPHANIE SY: You get the "big" moniker when you have lobbying power, and they have more than big pharma at this point.

RANA FOROOHAR: That's right, and they're using it, I can tell you. That is something that I think viewers should think about because we are seeing and we are going to hear a lot in the next few weeks and months about "Big tech's going to get regulated, there's going to be a pushback."

Well, that's complicated. Yes, there already is a pushback in certain ways, but it is not at all clear that there is going to be regulation that will really curb the power of these companies. I think we're facing a situation similar to the one we faced in 2008: You've got an industry that is so big, so complex, and that touches almost every single person in the world, that checking it, reining it in, and creating an ecosystem that really enriches everyone is going to be a process.

STEPHANIE SY: So, 2008—obviously you're talking about the financial crisis, "too big to fail," the power of Wall Street. Makers and Takers is very much about how the way these financial institutions and markets began working was not about creating and investing in jobs, job training, and things that would help grow the economy; it was more about enriching themselves with smoke and mirrors, which led to the financial crisis.

Let's dig a little deeper into your analogy here. How do you see that comparing with these big tech companies?

RANA FOROOHAR: I see a lot of parallels. It is interesting because for the first chapter in my book I was looking for the most Kafkaesque example of how the financial markets were being used or misused, and the example I came up with was Apple. Of all the FAANGs, it has probably had the least bad press in recent weeks, but it's actually at the heart of the economic problems associated with big tech, in the sense that these firms get most of their value from intellectual property (IP) and from data—Apple not so much data, but certainly intellectual property.

These things are very easy to offshore. It's just like capital; globalization favors capital over labor. Companies can move money, move people wherever they like, but individuals living in nation-states on the ground don't have those opportunities. If you look at the amount of money that has been offshored not just by Apple but by Google, Qualcomm, any number of other large tech firms, it's about $1 trillion in the last decade.

Actually, it's interesting. During the financial crisis, the financial sector tanked for a while and had to rein in risk, rein in leverage. Big tech was unleashed. So these companies were actually making a lot of money. About $1 trillion of it is offshore. At the same time, they are issuing debt on the U.S. public markets at very low interest rates, using that to pay back the wealthiest shareholders, the Carl Icahns of the world, and this seems very unfair to a lot of people who say, "Hey, these companies actually got rich in part because of government-funded research, and what are they monetizing?"

Again, Apple is in a somewhat separate category. It sells products. Google sells products as well—the Android phone—but a lot of what these companies are monetizing is us, it's data.

STEPHANIE SY: Our data. So that brings us back to the Facebook story—we'll get into regulation later and what makes sense there, so let's leave that aside—and I really want to get your take on this for a minute. I think people are aware that that's how ads are targeted on Facebook, that it uses your data, and we still willingly log on and share our most intimate life moments, because Facebook has this sort of brilliant psychological business plan where it plays on our psychology and our narcissism and all sorts of other things.

RANA FOROOHAR: That's right.

STEPHANIE SY: But where does it go from here? Is there a moment for Facebook now where it has to change its business model? Is that what we're talking about?

RANA FOROOHAR: I think so. I think there are a few different issues. First, you're right: The targeted-advertising business model is really problematic. Yes, we are willingly logging on; yes, we are giving our data. Although if you look at it—and some of this stuff is changing in real time—it used to be that there were very few caveats, very few visible buy-in options. You'd have a hard time using the product unless you gave your data.

Now you are starting to see warnings, you are starting to see more of an opt-in sort of situation, and I think we are going to move much more toward that, where there is a lot of transparency: You log in, and maybe you get a waiver the same way you do when you see a drug advertisement on television, where you hear about all the side effects at the same time that you're hearing about what the drug can do.

The other thing I think we're going to see a lot of pressure around is transparency about how these companies value data. One of the things that is really interesting to me as someone who covers economics is that these products and services aren't free. We think they're free, but we're paying with our data—and we don't know how much that data is worth.

That actually upends the economic laws of gravity, because if you think about how market capitalism works—Adam Smith, the father of market capitalism, would say that you need three things in order for markets to work: You need equal access to data; you need transparency within a transaction, you and I need to understand what is being bought and sold here; and you need, and this is particularly relevant for Carnegie, a "shared moral framework." I would argue that none of those three things are in play—

STEPHANIE SY: Exist in our relationship with Facebook.

RANA FOROOHAR: That's right, and I think that we're going to see a lot of pressure for these firms to be incredibly transparent about what are we giving up, what is the value of this information.

RANA FOROOHAR: Interestingly, there are also some new business models springing up. The founder of Mozilla, the open-source browser company, has actually come up with a new browser where users can pay a fee and get total ad blocking—no targeting, no ads. Or, if they do want targeted ads—because some people actually do: I like buying nice handbags, I get the ad for a nice handbag—they can be very specific about what they want and how, and they will be paid to receive advertising. That is just one interesting new business model, and I think there is going to be a lot more change in that direction.

STEPHANIE SY: It seems clear that Facebook is looking at all of this—how to give users more control over what data they're sharing. If they do want the targeted purse ads, they'll get them, and if they don't, maybe Facebook will come up with a model where you can opt out entirely. I think there is a trust issue there.

RANA FOROOHAR: Absolutely.

STEPHANIE SY: I think users knew that their data was being used. The lack of transparency that came with third-party use or purchase of that data wasn't surprising, I think, to a lot of people who were already cynical about how their data was used—1984—but I think it probably was surprising to a lot of Facebook users.

RANA FOROOHAR: I think that's right.

STEPHANIE SY: Let's talk about regulation versus self-regulation. It seems like Mark Zuckerberg is already making moves before any regulations hit the books. Is that going to be enough, or should we look toward the European model? On May 25 of this year, new regulations come into effect that will restrict the types of personal data tech companies can collect, store, and use across the European Union (EU), and of course one of the rules that will be enshrined is the right to be forgotten. Is that an approach that the United States needs to be looking at at this point?

RANA FOROOHAR: I think so. There are a few interesting things there. First of all, with this right to be forgotten, Europe is giving people the right to have their entire data trail—the "cookies" that follow you around—erased and permanently forgotten, and 2 million people have actually already opted in. So this is not a small thing.
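
[Editor's note: For readers who want a concrete picture, here is a minimal sketch, in Python, of what a "right to be forgotten" erasure routine could look like. Every store name and record layout is hypothetical; the point is only that a user's data trail spans several stores, and all of them must be purged on request.]

```python
# Minimal sketch of a "right to be forgotten" erasure routine.
# All stores and record layouts below are hypothetical.

user_profiles = {"u123": {"name": "Alice", "email": "alice@example.com"}}
tracking_cookies = [
    {"user_id": "u123", "site": "news.example", "visits": 42},
    {"user_id": "u456", "site": "shop.example", "visits": 7},
]
ad_segments = {"handbag-lovers": {"u123", "u789"}}

def erase_user(user_id: str) -> None:
    """Remove every trace of user_id from all stores."""
    user_profiles.pop(user_id, None)                 # profile record
    tracking_cookies[:] = [                          # browsing trail
        c for c in tracking_cookies if c["user_id"] != user_id
    ]
    for members in ad_segments.values():             # derived ad audiences
        members.discard(user_id)

erase_user("u123")
print(user_profiles)     # {}
print(tracking_cookies)  # only u456's cookie remains
print(ad_segments)       # {'handbag-lovers': {'u789'}}
```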

I see a couple of things happening on the regulatory front. I think the United States, Europe, and China seem to be going in very different directions.

We have had two models of big tech so far. The United States, home to most of the world's biggest platforms, has been extremely laissez-faire: Tech firms have been very lightly regulated. In fact, one of the core principles that tech firms have lobbied for over the past few years is an exemption granted under the Communications Decency Act (CDA) of 1996, in which tech firms were considered to be the "town square," so whatever you did in the town square was not their liability. That is very different from a media firm like the FT, for example—or even a nonprofit—where you are responsible to a certain extent for what happens in this office, in our pages at the FT. The tech firms have lobbied very carefully to keep that light-touch regulation.

Now you are starting to see that rolling back. A law actually passed through the Senate just a couple of weeks ago that provides a small carve-out in CDA Section 230, this loophole, for sex trafficking. This came up because of the really horrible case of Backpage.com, a website that was engaging in child sex trafficking, and there was a push to say: "Hey, if sex trafficking is happening on your website, or you are engaging with it in any way, have knowledge of it, and are supporting it in the online ecosystem, then guess what? You are liable." That may open the door to a lot more regulation. So that's what's happening in the United States.

China has a different system. China has a state-run system, obviously, so there is regulation, but there is also total, unlimited data collection. There are smart cities in China where people are monitored 24/7—in their homes, in their cars. That is important in this conversation because artificial intelligence (AI), the technology of the future that all of these firms, and really almost every industry, are counting on for growth, is built on two things: It's built on data, and it's built on quantum computing. So China is saying: "We're going to collect as much data as possible. That's going to help us move ahead in this industry."

Europe—to come back to your initial question—is saying: "Wait a minute. We want to find a middle ground here. We are social democrats, largely. We care about human rights and privacy, but we also don't want to fall behind in tech." So they're trying to find a way to protect citizens' privacy and yet not lose out on the industries of the future.

STEPHANIE SY: There is so much to unpack in what you just said. Let's just start with China. Who is vacuuming up all of that data—is it just the central government? My understanding is that the central government is doing it, but that it's also Tencent and some of the other large Chinese tech companies, which know they can benefit from data in the same way that Facebook and Google are benefiting.

RANA FOROOHAR: Absolutely. China has its BATs—Baidu, Alibaba, Tencent—like the FAANGs in the United States. These firms are doing just as much if not more. There is more mobile commerce in China, and in some ways the Chinese are more data-savvy and more willing to engage in the digital ecosystem.

But all of these firms have relationships with the state, and this is actually an important point in the trade debate going on right now, with the Trump administration slapping tariffs on China around tech. The tech ecosystem is a strategic sector for China. Any company that wants to work in this sector is going to have dealings with the government, and you can bet that the data is being shared.

STEPHANIE SY: What is interesting about China is that they are also—I am going to get back to ethics for a minute—coming out with this "social credit" score. The idea is that they are using people's online activities and the data they're sharing in all the different ways that are connected via technology to basically score citizens on their morality and determine whether they can be trusted—Orwellian, right? The question is: Is rating someone's credit based on ethics itself ethical?

RANA FOROOHAR: It's such a great question, and you're opening up literally the black box now of algorithms, which is a whole other—

STEPHANIE SY: We don't have to dwell too long on this, but it's interesting how data is being used in ways that, on the one hand, might gain a competitive edge but, on the other, are a huge surveillance tool.

RANA FOROOHAR: It's a great question. The fact is it goes both ways. What you're talking about could be an opportunity—if we look on the bright side for just one moment—for a rural person who might not have access to credit to say, "I can gain access to credit because we're going to use different metrics than have been used in the past."

That is happening in the United States. There are a number of think tanks in Washington, for example, that are looking at how big data and algorithms could be used in a way that would be inclusive—how we can make sure that low-income people and people of color, who typically have lower credit scores based on traditional metrics, can get access by having a whole bunch of other data points taken into account.
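
[Editor's note: A minimal sketch of the idea Foroohar describes. The features and weights are entirely invented for illustration; real credit models are far more complex. It shows how counting "alternative" data points, such as rent and utility payments, could open access for an applicant with a thin traditional credit file.]

```python
# Illustrative only: invented features and weights showing how a credit
# scorer might blend traditional data with "alternative" data points
# (rent and utility payments) to score thin-file applicants.

TRADITIONAL_WEIGHTS = {
    "credit_history_years": 20.0,   # rewards long credit files
    "on_time_loan_payments": 0.5,
}
ALTERNATIVE_WEIGHTS = {
    "on_time_rent_payments": 0.4,   # data traditional scores ignore
    "on_time_utility_payments": 0.3,
}

def score(applicant: dict, use_alternative_data: bool) -> float:
    base = 300.0
    weights = dict(TRADITIONAL_WEIGHTS)
    if use_alternative_data:
        weights.update(ALTERNATIVE_WEIGHTS)
    return base + sum(w * applicant.get(k, 0) for k, w in weights.items())

# A renter with no loan history scores at the floor on traditional
# metrics alone, but gains ground once rent and utilities count.
thin_file = {"on_time_rent_payments": 120, "on_time_utility_payments": 96}
print(score(thin_file, use_alternative_data=False))  # 300.0
print(score(thin_file, use_alternative_data=True))   # 376.8
```

[The same mechanism cuts both ways, as Foroohar notes next: whatever features the designer chooses can also act as proxies for race or income.]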

But those algorithms can be racist, and those algorithms can be used for government surveillance. In China it is very easy, and very worrisome, to think about what it means to get a bad morality score. Do you have to go to jail?

I think these things are very worrisome, and that is why transparency is huge. It is about trust.

STEPHANIE SY: Let's dig into that a little bit—the idea that AI is programmed by humans, and humans are deeply flawed and can be racist and discriminatory. Ideally, AI makes for better productivity and better efficiencies; that is the plus side of AI.

On the other hand, I have been hearing a lot of buzz about whether AI is sexist and racist. With such a lack of transparency—again, it goes back to whose responsibility it is to figure out how companies are utilizing AI: for example, in hiring practices, how police departments might eventually use it in criminal justice, or how family services might use it to determine whether somebody deserves to keep their child. There are so many applications of AI that I am hearing about.

RANA FOROOHAR: It's interesting. You're reminding me of a really terrific book written by a friend of mine, Cathy O'Neil, called Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, and she gets into algo-racism and all the examples you're talking about. Her take is that you need both the public and the private sectors to be involved in this.

Look at what, let's say, a company like IBM is doing. They have actually come out with a kind of AI bill of rights, a data bill of rights—real transparency, very easy and clear to understand: This is how your data is going to be used; you will own the data; we will own some of the insights; the customer can also have some insights in these particular ways. So it is a lot more transparent, and they are actually using privacy as a competitive tool.

But I do think this is the sort of area that is going to require government oversight. One thing Europe is actually looking at right now is the idea of data trusts. You know there are health trusts, like the Wellcome Trust or the Icelandic genomic bank, where biological or health data is kept in a place with public oversight. There is trust built into the system. A private company cannot simply hold and own this data and do whatever it wants with it.

There is a discussion now about whether we—whether Europe—could find a middle ground where, yes, citizens' data would be held and used for specific purposes—to increase efficiency in health services, to help avoid tax evasion, to create digital identities that could help people get access to social services—and yet that data would be protected, would be ring-fenced.
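
[Editor's note: A toy sketch of the data-trust idea under discussion. The purpose names and records are invented; the point is that access is gated by an allowed-purposes charter and every request is logged for oversight.]

```python
# A toy "data trust": records can only be read for purposes the trust's
# charter allows, and every request is logged for public oversight.
# Purpose names and records are invented for illustration.

ALLOWED_PURPOSES = {"health-service-planning", "tax-evasion-detection"}
audit_log = []

def request_access(requester: str, purpose: str, records: list) -> list:
    """Grant a ring-fenced view of the records, or refuse and log why."""
    if purpose not in ALLOWED_PURPOSES:
        audit_log.append((requester, purpose, "DENIED"))
        raise PermissionError(f"{purpose!r} is outside the trust's charter")
    audit_log.append((requester, purpose, "GRANTED"))
    return records  # a real trust would return a minimized view

citizen_records = [{"id": 1, "vaccinated": True}]
request_access("health-ministry", "health-service-planning", citizen_records)
try:
    request_access("ad-broker", "targeted-advertising", citizen_records)
except PermissionError as err:
    print(err)
print(audit_log)
```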

There is also another area we might want to talk about: how individuals could use technology to own and monetize their own data.

STEPHANIE SY: I'm hearing that that is happening in Europe—that already there are companies trying to capitalize on these new regulations by figuring out: Once your personal data has been valued, what if you decide you do want to sell it?

RANA FOROOHAR: I don't know if you've heard about blockchain. It gets associated with Bitcoin—which I think is a kind of bubble cryptocurrency and has a nefarious reputation—but blockchain itself is just a technology that decentralizes data that might otherwise be held in hubs.

If you think about what Google or Facebook do in order to offer that incredibly profitable targeted advertising, they take data from large pools of people. They then hold it in a kind of basket, on a centralized server, and then they decide, "Well, you seem like this other group of customers," and they sell that grouping to advertisers, who can then blast out information.
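
[Editor's note: A deliberately simplified caricature of the pooling Foroohar describes, with invented users and interest tags. Individuals' data is gathered centrally, users are bucketed into lookalike segments, and the segments, not the individuals, are what advertisers buy.]

```python
# Caricature of centralized ad pooling: gather users' interest data,
# bucket users into segments, and sell segments to advertisers.
# All users and tags here are invented.

from collections import defaultdict

user_interests = {
    "alice": {"handbags", "travel"},
    "bob": {"handbags", "cooking"},
    "carol": {"gardening"},
}

def build_segments(interests: dict) -> dict:
    """Group users under each interest tag they share."""
    segments = defaultdict(set)
    for user, tags in interests.items():
        for tag in tags:
            segments[tag].add(user)
    return segments

segments = build_segments(user_interests)
# An advertiser buys the "handbags" segment and blasts the same ad to
# everyone in it -- close to each person's tastes, but never exact.
print(segments["handbags"])  # {'alice', 'bob'}
```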

STEPHANIE SY: But it's a public ledger. Is that important?

RANA FOROOHAR: It is a public ledger, although the algorithms are not transparent, and the companies are very reluctant to say exactly how this targeting is working. But what we do know is that there is a grouping of data. That is why when you get an advertisement it may not be exactly what you want, it's not perfect for you, it's kind of an idea about who you are.

If you have a blockchain system where, say, a digital identity—me, Rana Foroohar: all the thoughts I've ever had, my book, my FT columns—lives on the chain as a digital identity known as Rana Foroohar, I can own that almost like a contract, and theoretically—not yet in practice—I could monetize myself. I could say to a publisher, "Yes, you can take the rights for my book in this particular way," and that would be secure. It would cut out a lot of intermediaries, and it's something that is already affecting, say, the financial industry, which makes a lot of money intermediating transactions.
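
[Editor's note: A toy, hash-chained ledger illustrating the self-owned digital identity idea. It omits real cryptographic signatures, consensus, and networking entirely; it only shows how each appended rights grant commits to the history before it, making the record tamper-evident.]

```python
# Toy hash-chained "identity ledger": the owner appends rights grants,
# and each block's hash commits to the previous block, so any later
# tampering with the history is detectable. Not a real blockchain.

import hashlib
import json

class IdentityLedger:
    def __init__(self, owner: str):
        self.owner = owner
        self.blocks = [{"data": f"identity:{owner}", "prev": "0" * 64}]

    def _hash(self, block: dict) -> str:
        raw = json.dumps(block, sort_keys=True).encode()
        return hashlib.sha256(raw).hexdigest()

    def grant(self, licensee: str, rights: str) -> None:
        """Append a rights grant chained to the latest block."""
        prev_hash = self._hash(self.blocks[-1])
        self.blocks.append({"data": f"grant:{licensee}:{rights}",
                            "prev": prev_hash})

ledger = IdentityLedger("rana-foroohar")
ledger.grant("publisher-x", "book rights, hardcover, North America")
print(ledger.blocks)
```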

STEPHANIE SY: Wouldn't that represent an existential threat to a company like Facebook? It would take the value of the data and put it back into the hands of the users.

RANA FOROOHAR: Yes, and that may be—to circle back around to one of your very first questions—how the business model changes. A lot of people feel that the business model should not be about monetizing your data and keeping you online as long as possible—sometimes doing stupid stuff, like watching cat videos, or maybe not stupid, but wasting a lot of time online—in order to monetize more data.

Let's make the business model about providing something that you need—information, a service—quickly and efficiently. Let's let these firms compete online, perhaps in online auctions, for the right to provide you with that service. That might be a new business model.
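
[Editor's note: A minimal sketch of the auction idea, with hypothetical bidders and bids. A second-price rule, where the winner pays the runner-up's bid, is a standard way to keep such bids honest; whether real services would adopt it is an open question.]

```python
# Sketch of firms bidding for the right to answer a user's request.
# Bidders and bid amounts are hypothetical.

def second_price_auction(bids: dict) -> tuple:
    """Highest bidder wins but pays the second-highest bid,
    which removes the incentive to shade bids below true value."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

bids = {"search-firm-a": 0.12, "search-firm-b": 0.09, "startup-c": 0.05}
winner, price = second_price_auction(bids)
print(f"{winner} serves the request and pays {price:.2f}")
# -> search-firm-a serves the request and pays 0.09
```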

STEPHANIE SY: I want to get back to a couple of themes you brought up earlier. You basically have written about these big tech companies as monopolies.

RANA FOROOHAR: Yes.

STEPHANIE SY: Why is it important to see them that way?

RANA FOROOHAR: When I joined the FT, my mandate was basically to write about the world's most important business issues, and I always follow the money. So the very first thing I did was look at where the money is going, and I came across a really interesting McKinsey Global Institute survey showing that about 80 percent of corporate wealth now lives in roughly 10 percent of firms—the very IP- and data-rich firms, the biggest of which are the FAANGs.

I started looking at that sector, and the more I looked, the more I thought: Oh, my goodness. All of the trends we've seen in the last 40 years—increasing inequality, job disruption, wage stagnation—are going to be put on rocket fuel by these firms, because technology speeds up all the problems of globalization. All the discontents of neoliberalism, which favors companies and capital over labor, get sped up exponentially, because these tech firms create a tremendous amount of wealth but not that many jobs.

STEPHANIE SY: In your Makers and Takers dichotomy, these are takers.

RANA FOROOHAR: All too often, yes. Certainly they have created some great innovations. I will caveat that by saying that much of the basis of those innovations—the Internet, the Global Positioning System (GPS), the touchscreen—

STEPHANIE SY: Government research.

RANA FOROOHAR: Government research. So yes, they're innovators, but they're also data monetizers.

The other reason that I got interested in this topic is I started hearing from people from all walks of life—biotech, small software companies, midsized chip suppliers—saying: "You know what? We're getting squashed by the biggest tech firms."

I started looking at why that is, and if you think about what these firms are all about, they make products that include 10,000, 20,000 bits of IP. They would like to pay as little as possible for that, so they want to squeeze their supply chains, and then they monetize data, and that doesn't really require anything except you and me. So they're not really contributing in the same way to the innovation ecosystem as some companies of the past might have done.

I think that is going to require an incredible rethink of policy on everything from anti-trust to patents to privacy to civil liberties. We're talking about an Industrial Revolution-type change in what the regulatory framework should look like.

STEPHANIE SY: Do you think the United States is way behind when it comes to making policies to control the outsized political and economic influence of big tech?

RANA FOROOHAR: I do. There are a number of people who are pushing on this issue. Look at anti-trust, for example. We had a big shift in this country in the 1980s. Robert Bork, the federal judge, put forward a new idea that was adopted during the Reagan era: "Well, as long as companies are lowering consumer prices, everything is good. They can get as big as they want to get."

But is pricing the only measure of welfare? Forget about tech, you can look at a firm like Walmart, for example, and say: "Well, they've lowered prices, but they've also put a lot of Main Street firms out of business. They've also offshored a lot of production and crunched the supply chain." Amazon is now putting pressure on Walmart. Amazon takes these trends to the next level.

I think that we need to start thinking about: One, what is the transaction here? It's not about price because we're getting things that are free, although as I said we're paying for them in other ways. But also, what is a broader definition of welfare? Can we think about how a company is enriching the overall economic ecosystem?

People say: "Well, that's too mushy, that's too complicated. Pricing is really discrete and easy to factor." But there are already institutions that work under a broader mandate. The Federal Reserve Bank has a mandate to support community economic development in the cities where it works, so does the Boston Fed, the Chicago Fed, the San Francisco Fed.

STEPHANIE SY: I wasn't aware of that part of the mandate. You hear about inflation.

RANA FOROOHAR: They do have that dual mandate, but there is another part: Under the 1977 Community Reinvestment Act, the Fed was given the task of furthering community development. So there are already institutions that do this. We need to think more creatively.

STEPHANIE SY: Speaking of Amazon potentially being the new Walmart: Its stock today, March 29, was down 2 percent at the open because the president tweeted about Amazon, saying: "They pay little or no taxes to state and local government. They use our postal system as their delivery boy, and they are putting thousands of retailers out of business."

The point I want to talk about a little bit is the effect of these big tech companies on workers. We can start with my question, which is: The top five companies in the world, all tech companies, don't employ as many workers as a manufacturing company or even a Walmart, frankly.

RANA FOROOHAR: That's right. There was a very interesting research study done recently looking at this latest generation of tech firms, the Facebooks and Googles: They create fewer jobs than the previous generation of tech firms, like Apple or Microsoft or IBM, and those firms created fewer than the previous generation of industrial giants, your General Motors (GM), your General Electrics. So there has been a trend of jobs going like this [down], but market cap continues to go like that [up].

Getting to the ethics in all this: That is great for the top 20 percent of the population that owns 80 percent of the stock assets in this country. All of our portfolios are going like this [up]. Jobs, not so much.

Here's the rub. I think this is going to become a huge political issue, not just in the midterm elections in 2018 but in 2020, because the disruption caused by these firms—not only what we can already see in the consumer Internet but the coming revolution in the industrial Internet, the Internet of Things, and AI—is going to disrupt jobs farther and farther up the white-collar food chain.

Frankly, you already see a certain amount of journalism being done by robots, and you see radiology reports being read by robots. This is a white-collar problem and a blue-collar problem, and that is going to create interesting new political alliances, I think.

STEPHANIE SY: Having said that—and notwithstanding the great political lobbying power of Google in Washington—do you think we are at a turning point, now that there is a mainstream understanding of how a company like Facebook is impacting democracy and how a company like Amazon is impacting the labor force and small business? Is there a turning point here that you see happening? Is this that moment?

RANA FOROOHAR: Is this a 2008 moment that we could actually capture and leverage in some way?

STEPHANIE SY: Yes.

RANA FOROOHAR: I think the jury is out, and here is where I think the big question marks are. You are seeing a push for regulation—a number of bills have been put forward in Congress—but a lot of them are either window dressing or at the margins.

The other thing to remember is that we are going through a 100-year shift in our economy. This is Industrial Revolution-type stuff. Moving from an economy made up mostly of tangibles—goods coming off an assembly line—to an intangible economy of data and IP is going to require a much broader sea change in regulation. That needs to be done with a lot of smart people in the room working together. If you start to do it piecemeal, you may actually end up in the same situation you did with Dodd-Frank, the financial regulation, which, by the way, was turned into Swiss cheese in part because of lobbying pressure from the financial firms. You had seven different regulatory bodies and a lot of firms piling in, saying, "We want this or that loophole," and you ended up with a system that a lot of people on both the right and the left are not happy with. I think that is the risk now.

I would love to see a stand-alone regulatory body for the digital economy. At the least, we could start with a stand-alone executive committee to study what the framework should be for the valuation of intellectual property in the digital age and how we can make sure that the digital economy is inclusive. I think a task force would be great.

STEPHANIE SY: The questions have to be asked, and you hope that at least that comes out of the morass we saw around the last election—that the conversation begins.

We have had a couple of guests here who have said that one of the things that should be looked at is a universal basic income (UBI), because these technologies, and AI in particular, have been pointed out over and over on this stage as being so disruptive that you are going to see people without work at all levels—like you say, up to the white-collar level—and there not being enough jobs. What do you think about that?

RANA FOROOHAR: I have mixed feelings about UBI. I do think we are going to need a stronger social safety net, absolutely. We're going to go through a period of 10 years of big-time tech-based job disruption at all levels. People are going to need re-training, they're going to need government assistance in some cases during that process of transition.

I'm not sure that UBI is the way to do that, and I'll tell you why: UBI doesn't address the meaning-of-work issue. Meaning is really important. There are a lot of economists who have studied this: If you don't have a job, you have higher rates of depression, and you have communities literally falling apart. Look at Detroit. They can't keep the lights on.

STEPHANIE SY: People want to work.

RANA FOROOHAR: People want to work. UBI, handing someone a check, does not solve the problem.

I must say that I'm a little concerned about the fact that a lot of the proponents of UBI tend to come out of Silicon Valley, and I feel like they see it as an easy way to make sure that the pitchforks don't come for them.

STEPHANIE SY: Interesting.

RANA FOROOHAR: I was at an event last year talking to an entrepreneur, a venture capitalist, who said: "Well, you know, it's great. We're going to have so much wealth created by these tech firms. They're going to be able to hand everyone a check for $25,000 a year, and coal miners can just become, you know, software designers or surfers or writers."

I thought: Oh, man. You haven't been to the Rust Belt. You don't know what it's like. Families are changing because a lot of people are out of work, and that kind of naivete is worrisome.

STEPHANIE SY: I just want to end by bringing it back to our title, "The Dangers of a Digital Democracy," and to what we've learned about the ways our data may have been exploited to influence how we think.

That is really hard for me to wrap my head around, because saying that in 2018 or 2020 we're going to start to see some of the impacts of these disruptive technologies on voters—there may be some job loss—starts with the premise that democracy is intact. I think there is concern that the power of these companies goes beyond the economic, that it goes deeply into our understanding of truth and fact. How do you have a democracy when there are these alternative realities being accelerated through these "town squares," as you call them?

RANA FOROOHAR: I think you've hit on the core existential issue here. A few facts: More Americans than not now get the majority of their news from social media, from tech platforms. That has coincided with a loss of trust in liberal democracy. If you look at the latest Edelman Trust Barometer—the PR firm Edelman puts out a trust barometer every year—in countries where social media and platform tech usage is rising, countries like ours, you see a fall-off in trust in liberal democracy.

Interestingly, you see rising trust in autocracies, like China or the United Arab Emirates (UAE).

STEPHANIE SY: Which is so counterintuitive. You would think the more information you have access to—

RANA FOROOHAR: That's right. But this goes to the dangers of the digital democracy. We don't want an autocracy, but how do we make sure that people are not living in silo bubbles? Fake news is 70 percent more likely to be spread on Twitter than real news. That is a serious problem. Starting with transparency is super-important.

Fortunately, I see a turn. It is well known that the president's election has really galvanized the news media: The FT has seen its subscriptions go to record levels, as have The New York Times, The Wall Street Journal, and The Washington Post. I think serious news is on the rebound. I think people realize that you may have to pay for content that is legitimate.

But that brings up serious issues, too, that we're going to have to grapple with, because there is a digital divide in terms of who can pay for the FT at over $400 a year.

STEPHANIE SY: Absolutely. There is even a digital divide when it comes to who has access to the Internet.

RANA FOROOHAR: Exactly.

STEPHANIE SY: The people who are most vulnerable to exploitation are those who are not necessarily going to pay to get past the FT paywall.

RANA FOROOHAR: That is absolutely right, and you might start to see ways in which a digital tax of some kind could be used to support broader public access.

Let's look back at what some of the titans of the past did—the Carnegies, the Rockefellers. They started libraries; they started public services. I would love to hear how the tech giants, as they go forward with their testimonies in the next few weeks and months, are going to approach this issue.

STEPHANIE SY: Rana Foroohar, thank you so much.

RANA FOROOHAR: Thank you.

STEPHANIE SY: Those are really interesting insights. Any one of those topics we could have gone on for another half-hour.

RANA FOROOHAR: Another time.

STEPHANIE SY: But I appreciate that you were able to hit every single ball out of the park that I pitched to you.

RANA FOROOHAR: Thank you.

STEPHANIE SY: It was a pleasure to meet you. Thank you.

RANA FOROOHAR: And you. Thank you.
