Andy Jassy joined Amazon in 1997 and currently serves as the CEO of Amazon Web Services.
Following are excerpts of an interview with Andy Jassy conducted by James Jacoby on September 20, 2019.
Editorial Note: In June 2020, in the aftermath of George Floyd’s killing, Amazon announced a one-year moratorium on police usage of its facial recognition technology. In May 2021, Amazon said it was extending that moratorium indefinitely.
In February 2021, Amazon announced that Jeff Bezos would step down as company CEO and Jassy would succeed him.
We’re living in this political moment where there’s a lot of people questioning the size and power of Amazon.… How do you and Jeff [Bezos] and others at the senior leadership level think about the call to break you guys up?
We don’t think about it very deeply. You know, at the end of the day, if I had to categorize what we spend most of our time thinking about, it’s really in each of our businesses, we have a very large number of customers who are very demanding, appropriately, who are not shy about telling us what they’d like to see us improve in our product and in our offering. And then in each of those categories, there is very deep competition with very well-funded and very capable companies. And so from a very early time—I’ve been at Amazon now for 22 and a half years, and I always remember one of the first things I heard Jeff Bezos say back when we could fit the whole company in just one conference room for an all-hands meeting. And it was actually when there were all of these rumors that Barnes & Noble was going to launch their retail site, and people were saying that was going to be the end of Amazon. And Jeff told all of us, he said: “I would not go to bed at night fearing your competitors or fearing any external issues. I would go to bed at night fearing whether you’re doing right by your customers.” And that really is a credo that we live here; it’s what we spend most of our time thinking about.
I guess my question in part is whether—do you guys think a little bit about where that call to break you up is coming from? I know that in several of the market segments, the lines of business that you’re in, you have a lot of competition. But taken as a whole, you have tremendous resources of data; you have tremendously deep pockets as a company, profitability; so that in some ways there is a chilling effect—that Amazon may stifle competition in the long run in each of the areas of business that you’re in if it simply looks in the direction of a certain market or a certain line of business.
Yeah, you know, I think that if you look at the reality—some of that sounds very good, but if you look at the reality and the facts, first of all, just because Amazon chooses to enter a business segment doesn’t mean we’re going to be successful. I mean, you probably know that we launched a phone, and that didn’t exactly set the world on fire, no pun intended. And, you know, then I think if you look at some of the successful business segments that we’re in, you know, in our retail business we are about 1% of the world’s retail business segment. If you look in cloud computing, or in the AWS [Amazon Web Services] business, 97% of the spend in the IT market is on premises, and only 3% of the spend in the entire IT market is in cloud computing, of which we’re a share. So we have a relatively tiny share of the overall market segments in the categories in which we operate. And then I think the other thing to remember is that consumers and customers have a choice on where they spend their money. Simply because Amazon decides to pursue a market segment doesn’t mean the customers are going to spend their money there, and so it means that we have to do an amazing job in providing a great customer experience that customers want. And I think that it’s been unusual that we’ve been able to do so in a few business segments. But customers get to choose, and it changes all the time, and we have a very small market segment share in each of the businesses in which we operate.
I think to the public, it may sound strange coming from Amazon—a company with basically a trillion-dollar market cap, whose CEO is the richest man in the world—that in some ways the strategy of what you’re telling me and what other executives have told us is that essentially you’re small. I mean, Jeff Wilke said to me that you’re kind of just a speck in the scheme of things.
Do you see how that could seem strange or incongruous to a viewer?
You know, Amazon as a whole has been successful across a few different business segments, but simply because the company’s been successful in a few different business segments doesn’t mean it’s somehow too big.
You know, I think you have to, in any problem—you know this—in any complicated problem, you have to look at the details, and you have to dissect it and do some analysis. And when you actually dissect what Amazon is, when you look at each of the business segments, the facts are we’re really small relative to the total market segment size in each of the businesses in which we operate, and they’re still all in their very infant, early stages.
Ethical Questions Raised by Facial Recognition
… In AWS specifically, it seems as though you’re kind of entering some issues that come with some thorny moral implications. For instance, I mean, there’s—you’ve sought to empower Immigration and Customs Enforcement at a time when the U.N. has called the situation at the border a violation of human rights; you’re selling facial recognition services to police departments. I’m curious where you draw the lines on who you’ll do business with and who you won’t do business with.
Yeah. Well, you know, first, I don’t think we sought to enforce—or to enable the Immigration and Customs Enforcement agency. I don’t think we’ve said that they’re a customer, so—
No, but you did seek their business, I believe, right?
You know, no, not directly. I mean, I think that what we have always said—and we feel very strongly about this—is that it is unbelievably important for the safety of the country and the safety of the world for the U.S. government to be able to have access to the most modern, sophisticated technology, of which the cloud is, and we believe AWS has the most capability in the cloud. So we believe that it’s important for us to be able to provide access to our government to be able to use that technology, and we’ve been consistent about that across government.
We’ve also been very consistent in saying that if we find that we have customers, regardless of who they are, who are misusing the technology and violating people’s civil liberties and violating the law, and we have documented proof of that, we will suspend their ability to use our platform.
That’s putting a lot of onus on the customers, though, and on their ability to get documented cases of abuse to you. And I’m just curious about, you know—there are other companies, other competitors of yours, that say that there’s more of a choice at hand here; that you as a company, in the absence of regulation, have choices to make about who you will empower and who you won’t empower.
First, I don’t actually think it’s too much of an onus to put on customers or on end users. We get a number of documented complaints of misuse of technology all the time, because we have millions of customers who use the platform, you know, all over the world in every imaginable use case. And people don’t seem to have any problems telling us when they think there may be a misuse of the technology, and it’s very easy to send it to us. So we get those types of complaints all the time. We haven’t had them about governments. And certainly when you talk about facial recognition technology, we’ve had zero reported misuses of the technology by law enforcement. So it’s not hard for people to do it. They do it all the time when they see it.
On the law enforcement, I just have a couple of questions on that. We spoke to a former principal scientist at AWS who told us that she didn’t think facial recognition is appropriately battle-tested and ready for prime time, and that she, along with a number of other scientists, has called for Amazon to stop selling facial recognition to police departments until there can be an auditing system or some public oversight and appropriate legislation.
Why is that not something that you’re doing, waiting until there’s public oversight and auditing?
I have a different view, and we’ve spent—we’ve had the facial recognition technology out for use for over two-and-a-half years now, and in those two-and-a-half years, we’ve never had any reported misuse of law enforcement using the facial recognition technology. We worry, by the way, about the civil liberties issues. If you know anything about what a lot of the senior leaders at Amazon do in their free time, they spend a lot of time on civil liberties. It’s something that’s very important to me and, I think, to a lot of my peers.
But I would say simply because the technology could be abused in some way doesn’t mean that you should ban it or condemn it or not use it. You know, if you think about computers and servers, and you think about all of the misdeeds that have been done with servers—breaking into people’s systems, stealing email addresses and data and things like that—imagine what our world would look like if we had banned or not used computers or servers. It would be a very different world. And so I think a lot of societal good is already being done with facial recognition technology. Already you’ve seen hundreds of missing kids reunited with their parents and hundreds of human trafficking victims saved, and all kinds of security and identity and education uses. There’s a lot of good that’s been done with it.
But I also understand that it could be misused, and it’s why we always say if there’s any kind of documented proof of people misusing the technology, we will suspend people’s ability not just to use the technology but to use AWS. We have very strong guidance for our law enforcement customers that if they’re going to use facial recognition technology in an investigation, they should only use it where they get answers that have at least a 99% confidence level, and then only as one piece of a broad set of pieces of evidence in a human investigation. And so we try to give very strong guidance and take action when we see any kind of misuse. But I also think it’s completely fair for people to say that’s not enough; we want to have some kind of regulation where there’s more prescriptive guidance in how the technology must be used. And we’ve been vocal about thinking that the federal government should do something about that. And I think at the end of the day with any technology, whether you’re talking about facial recognition technology or anything else, the people that use the technology have to be responsible for it, and if they use it irresponsibly, they have to be held accountable.
Brad Smith, who’s the president of Microsoft, a competitor of yours, with regard to facial recognition has said, “If you’re prepared to sell what you make to anyone that will buy it, regardless of how they’ll use it, you can tell yourself that you’re principled, but the only principle you’re following in practice is the pursuit of profit.” And he also basically said that the companies that create this technology must accept a greater responsibility for the future, because, for instance, Microsoft has a facial recognition software that it will not sell to police forces, since it feels the technology is too immature.
Yeah, their facial recognition technology may be immature, I don’t know. But we are constantly improving our algorithms, we have a lot of internal auditing and benchmarking that we do, and we will continue to make it better all the time.
Well, let me ask you this: Why allow police departments—which have historically, there’s been all sorts of problems with policing in this country—why allow them, who have such an imbalance of power, an asymmetry of power in this whole equation, to experiment with this? I understand letting organizations that work with missing kids experiment and improve your technology, but I’m just curious about the intersection with the corporate experimentation, which is basically a black box. We don’t know how these systems are designed exactly; there’s no auditing system. So why allow police departments to experiment?
But again, to me it’s similar to what we were talking about with respect to government, which is, we believe that governments and the organizations that are charged with keeping our communities safe have to have access to the most sophisticated, modern technology that exists. And, you know, if you look at investigations in police departments all over the country, they’re using lots of pieces of evidence, and there are laws around what evidence you need to be able to arrest somebody. And the same applies whether you’re using—whether you’re using facial recognition technology as elements of that entire piece of evidence or not. And so to me, to not allow our police departments to have access to the same modern, sophisticated technology that can help keep our communities safe is the wrong optimization.
Let’s see. I have not—you know, we don’t have a large number of police departments that are using our facial recognition technology, and as I said, we’ve never received any complaints of misuse. Let’s see if somehow they abuse the technology. They haven’t done that, and to assume that they’re going to do it, and therefore you shouldn’t allow them to have access to the most sophisticated technology out there, doesn’t feel like the right balance to me.
It’s been difficult to even know how many police departments are using the facial recognition technology, and there’s no public auditing to know whether there are complaints about abuse. There may be systems in place that we don’t know about through which you find out about those problems. But as the public, if we don’t have a window into even the number of departments that are using this, how would the public ever know?
Well —
And we have to take your word for it that you’d kick somebody off that’s abusing it if you get a report.And I’ve spent time with a police department, for instance, that’s not following your guidance on the 99% standard.They’re coming up with five faces that may match, which is not what your guidance says, but they’re still doing it.And yes, they’re using it as a piece of their investigations, and they’re not using it for probable cause.But it seems to me like the possibility of abuse as audited by you as the company that’s creating the technology is an imbalance here.
You know, again, I don’t think we know the total number of police departments that are using facial recognition technology. I mean, there’s—you can use any number. We have 165 services in our technology infrastructure platform, and you can use them in any combination that you want. We know of some, and the vast majority of those that are using it are using it according to the guidance that we’ve prescribed. And when they’re not, we have conversations, and if we find that they’re using it in some irresponsible way, we won’t allow them to use the service and the platform.
You know, I think there’s—the public probably doesn’t know every single technique, and how many police departments use every single technique, in all their investigations today, and hasn’t for the last number of years. Facial recognition technology is just one piece of the overall pie. As I said, it’s very early days in people using it, but we haven’t seen any abuse yet. I think that, believe me—we see almost everything in the media today, and I think that you can’t go a month without seeing some kind of issue where somebody feels like they’ve been unfairly accused of something of some sort. So I have a feeling that if you see police departments abusing facial recognition technology, that will come out. It’s not exactly kept in the dark when people feel like they’ve been accused wrongly.
Would you, for instance, sell facial recognition technology to a foreign government?
Yeah. I mean, again—there are a number of governments with which it’s against the law for U.S. companies to do business. We would not sell it to those people or those governments.
But if they’re—for instance, AWS has made a big push into the Middle East, and into Bahrain, for instance. It’s not illegal, of course, for American businesses to do business in Bahrain, but Bahrain is a country that has a history of cracking down on dissidents. What if your infrastructure, your services, were being used for that purpose? Which is, I think, a similar question to: What if your services or infrastructure were used by Immigration and Customs Enforcement in this country to basically empower the border policy here?
Yeah, again, if we have documented cases where customers of any sort are using the technology in a way that’s against the law or that we think is impinging on people’s civil liberties, then we won’t allow them to use the platform.