It’s a chilly March morning in the undisclosed mid-Atlantic hotel hosting Palantir’s developer conference. The defense contractors, military officers, and corporate executives in attendance are unprepared for the weather; they’d assumed the previous day’s mid-70s temperatures would hold. A cold rain turns to steady snowfall, and Palantir passes out heavy blankets. As people move between open-air pavilions wrapped in them, they look as if they’d been pulled from shipwrecks. Nonetheless, spirits are high. To this self-selecting crowd, Palantir is delivering on its promises. The company’s stock price is soaring. The gathering is infused with the giddy groupthink of a multilevel marketing event.
After securing an invite to the conference—a task made challenging by Palantir’s disapproval of WIRED’s recent coverage—I was eager to get an inside glimpse of the mysterious company. Founded in 2003 by Peter Thiel and his then-obscure former Stanford classmate Alex Karp, the company has become part of the Pentagon’s AI-based combat transformation. In the past few years, though, its biggest growth has been in the commercial sector. “The commercial business is growing at 120 percent year over year. We’re very proud of the 60 percent growth in government, but they’re not even on the same glide slope,” says Palantir’s CTO, Shyam Sankar, who is also part of a four-person contingent of tech execs serving as lieutenant colonels in the Army Reserve.
Generative AI has helped fuel Palantir’s rise, supercharging the hands-on support the company provides to its customers. Early in its evolution, Palantir would embed “forward deployed engineers” into companies, helping them weave Palantir’s software into their operations. Large language models allowed Palantir to build products with more power, and now the engineers concentrate on helping customers build their own tools with Palantir’s technology. “Every time those models got better it seemed like they were tailor-made exactly for us,” says Ted Mabrey, an early employee who now heads the commercial business. Sankar elaborates: “Our whole thesis has been that we’re building Iron Man suits for cognition,” he says. “We were rate-limited by the number of people, the creativity of the questions, all those sorts of things. And then [with Gen AI] that rate limiter was eliminated, and that changed the rate of growth.”
The morning’s keynotes include a US Navy vice admiral, the officer in charge of the Maven AI battlefield project, and executives from Accenture, GE Aerospace, SAP, and the Freedom Mortgage Corporation. The range reflects the company’s trajectory from defense work to the commercial sector. During the breakfast hour I watch a demo from a family-run fashion business with 450 employees. CEO Jordan Edwards of Mixology Clothing says that he found Palantir through an Instagram ad, and that the AI-powered system has transformed his business. He uses Palantir’s software to help make buying decisions and then has it send emails to negotiate prices. For one line he sells, “it drove a 17-point margin swing—from losing $9 a unit to gaining $9 a unit,” he claims. Edwards now describes himself as a “forward deployed CEO.”
Even though Palantir’s major growth is in the commercial sector, its soul remains in defense contracting. During its long struggle to become part of the defense establishment (at one point, it sued the Army to be considered for a contract), it adopted a focus on outcomes. Palantir likes to think that this experience forced it to adopt a level of rigor that has allowed it to eclipse its rivals in the commercial arena. One chapter of Sankar’s just-published book, Mobilize: How to Reboot the American Industrial Base and Stop World War III, is called “The Factory Is the Weapon.” Both Sankar and CEO Alex Karp believe that American industry, especially in Silicon Valley, has shown insufficient patriotism. Their hope is that Palantir’s example will inspire other corporations to produce national defense products in addition to their consumer work.
Karp’s introductory remarks at the conference emphasized how defense work defines the company, especially now that America is at war. Atypically garbed in a blazer (“This is to convince my family I have a job,” he jokes), he says that normally, he would be talking to commercial customers about how to make them wealthier and happier and help them destroy their competitors. (He refers to rivals as “noncompetition” because in his mind, they don’t rank in Palantir’s class.) But with an active battlefield in Iran, the company’s sole priority is now supporting the troops. “At Palantir we were built to give our warfighters … an unfair advantage,” he says. “It was, ‘Yeah, we’re going to really F- our enemies.’ And I take great pride in that.”
Karp claims that the Palantir culture is broad enough to allow disparate political views—with a key exception. “The one thing I tell Palantirians is you can be on any side of an issue, but if you’re expecting us not to support warfighters when they’re in battle, you’ve got the wrong company.” Now that the US is at war, he says, “We’re not interested in debating. We are very proud to have our role in having American men and women come home safe. That sometimes means that people on the other side don’t go home.” (The remark came after at least 175 Iranian civilians died when a girls’ school was hit by a missile. The incident is under investigation and Palantir won’t comment on whether its products were involved.) Karp implies that if his customers aren’t on the same page with Palantir on this issue, they’ve also got the wrong company. “You are engaging in proxy when you are engaging with us,” he says. His remarks are greeted with applause.
At no time does Karp mention Anthropic. Yet his remarks seem intentionally to contrast Palantir with the AI company now sanctioned by the Pentagon for attempting to set what it considers moral and practical limits on the use of AI in battle. To Palantir, that is immoral. When I mention to Sankar that I’m writing a lot about AI, he goes on a rant, telling me that the people who invent things are the last to understand them. The leaders of AI companies, he says, have holes in their hearts where God should be, and they are trying to fill them with AGI. Sankar and Karp clearly have little patience for the goo-goo scenarios Dario Amodei outlines in his ultra-optimistic essay, “Machines of Loving Grace.”
That’s Palantir’s differentiator: a jingoistic chip on its shoulder and a belief that both virtue and success lie in pushing AI technology to help America win. It attributes its corporate success to the mastery of that effort. “There’s a gravity to the defense mission,” says Sankar. “You could ask, would we have ever conceived of [forward] deployed engineering if we didn’t feel some sort of moral weight that our software has to fucking work?” Instead of being a hurdle to winning new customers, Mabrey says, the company’s notoriety acts as a useful filter, narrowing the field to those most culturally aligned with Palantir’s values. “We tend to have relatively fewer customers and relatively much deeper relationships with those customers,” he says. “We don’t come in and tell them what to do—and they don’t tell us what to do.”
It also resists passing judgment on its government customers when its software is put to questionable ends. When I ask Sankar why Palantir has continued to work with ICE after the agency’s violent surge in Minnesota, he says, “The specifics are a tragedy, but the ballot box and the courtrooms work. You have to make a very fundamental call—do you believe in the system or not?”
The snow was still falling when I left the Palantir conference site, returning to a world where the company is regarded with skepticism. Outside the conference bubble, a debate is raging about how AI should be used. Palantir has found energy and wealth in bypassing that conversation, instead devoting its complete attention to using AI to win. Loving Grace is for noncompetitors.