Marketplace Tech Blogs

Palantir may go public, but can it turn a profit?

Molly Wood Nov 16, 2018
Peter Thiel is the co-founder of Palantir, named for a crystal ball in the "Lord of the Rings" trilogy. Stephanie Keith/Getty Images

The data analytics company Palantir is reportedly considering going public. Palantir is the company co-founded by controversial Silicon Valley billionaire Peter Thiel, formerly of PayPal. It’s named after an all-seeing artifact in the “Lord of the Rings” trilogy. The company promises police departments, governments, even the IRS, that it can take in huge amounts of data and make artificial intelligence-informed guesses to help track down criminals and cheats, among other things. In a secret pilot program in New Orleans, Palantir tech even tried to predict when crime would happen or who might be a victim. But lately its huge $20 billion valuation is in doubt, and privacy activists are concerned about its tactics. Molly Wood talked about it with Mark Harris, a reporter who’s covered Palantir for Wired magazine. The following is an edited transcript of their conversation.

Mark Harris: Palantir’s selling point is that it can take these hundreds of data streams from all different kinds of systems that don’t normally work together and it can squeeze them all into this pot to come up with these insights. But that’s actually really tricky because you have to understand a lot of different types of hardware, a lot of different types of software and databases, and every company and every jurisdiction does things their own way.

Molly Wood: And one of the other things that’s been speculated is that Palantir may have actually promised cities like New Orleans or Los Angeles that it could do sort of pre-crime prediction, right? That it could use artificial intelligence to say, “Oh, crimes are more likely to happen here. You should direct resources that way.” Is that accurate?

Harris: Yeah, it’s certainly the case that police departments want to be smarter about the way they deploy their officers and deploy their resources. The question is can you do that in a way that respects people’s civil liberties, that doesn’t target populations, minorities perhaps, or poor neighborhoods that have got disproportionate amounts of policing. Even though there’s plenty of crime happening elsewhere, there’s a cycle of “There’s been crime here in the past, so let’s go there again.” And you can institutionalize a lot of biases and discrimination that we’ve seen historically in cities across America.

Wood: It reminds me a little bit of the Cambridge Analytica story where they were saying, “Look, we can use this data to micro-target to such an extent that it will be like we’re psychic, we can get right inside someone’s brain.” And I wonder if there is a possibility that no company can do as much with mountains of data as they promise.

Harris: Right. That’s really true. It’s nice that they have success stories, but we’re not seeing the day-to-day work. We’re not seeing how useful it is for officers. And some of the emails that I’ve seen suggest that some people using the system thought it was powerful, while others struggled with it and struggled to get useful results from it in their work. So you’re right. Palantir has this mystique of having worked with all the intelligence agencies. They have a very attractive background. It sounds interesting to have that sort of technology working in your organization. But can you actually see whether it pays for itself? I haven’t seen that data. It’s entirely possible that if they go public, which they’re planning to do in the next couple of years, then we’ll have a lot more visibility into what they’re doing.


And now for some related tech news:

  • The Wall Street Journal had an amazing story earlier this week on Palantir’s valuation and possible problems with its business plan.
  • Last month, several immigrant rights groups put out a report accusing Palantir, along with Amazon and a Bay Area company called Forensic Logic that does predictive crime analysis, of helping the Trump administration excessively target immigrants for jail or deportation. Amazon has been selling facial recognition technology to law enforcement agencies. It’s called Rekognition. In May, the ACLU ran a test of the system and found that it incorrectly matched 28 members of Congress with mug shots of other people. Amazon said it was miscalibrated. Last month, an anonymous Amazon employee wrote a Medium post saying that 450 employees inside the company had signed a letter to Jeff Bezos asking the company to stop selling the Rekognition software to law enforcement and to kick Palantir off of Amazon Web Services. What a tangled web of surveillance we weave.
