If you have never heard of the tech company Palantir, or its CEO, Alex Karp, then bear with me. Although “Orwellian” has become an overly dramatic cliché, no better word describes his company.
To get a sense of Palantir’s power and how American citizens can get caught up in its surveillance technology, check out some of these State of the Day Substack articles I previously wrote on the subject. Surveillance states have already been built in China and Gaza, and the United States is on track to adopt something similar. It’s chilling stuff, and Americans should be wary of anyone who pretends AI will never be used to chip away at our freedoms. (RELATED: The Long, Cold Totalitarian Winter Is Coming)
In particular, Americans should be wary of Karp. During an interview Tuesday at the World Economic Forum (WEF) in Davos, Karp claimed that his company’s AI technology will actually “bolster” civil liberties.
“Despite what people may want to believe, it also bolsters civil liberties, because now you can see, well, I mean, just simple questions — Was someone processed based on economic considerations, or were they processed based on their background? Like those things are impossible to see, unless you have, like, there’s a huge civil liberties betterment side of this that typically people don’t believe we care about or, but it’s actually exactly the opposite,” he said.
Karp has previously defended Palantir against allegations of civil rights abuses.
A view of the Palantir building is seen during the World Economic Forum Annual Meeting 2026 in Davos, Switzerland, on January 20, 2026. (Photo by Laurent Hou / Hans Lucas / AFP via Getty Images)
“People don’t believe it’s possible, but that’s wrong! It is generally believed that to solve the problem of terrorism, it is necessary to capture the entire amount of thousands of millions of pieces of disparate data. That is what an undemocratic society would do, but you would be smothered alive under this pile of data, and you wouldn’t learn much … And there are things you don’t have access to. Palantir only takes the relevant data subject to local laws, and puts additional rules in place to define who sees what. And that’s enough to allow our customers to fight against terrorism or improve their industrial processes without restricting civil liberties or violating personal data, etc.,” he said in a 2023 interview.
However, at the 2023 WEF, Karp admitted out loud that Palantir kept the European far-right from gaining a “position of dominance.”
“I am progressive, and I think the left is wrong to hate on us sometimes, because without Palantir, the far right would have been in a position of dominance, because Palantir, single-handedly with the police forces, stopped major terror attacks,” he said.
Maybe it’s true they did stop actual terror attacks. But maybe not. Maybe the threat of terror attacks was used as a pretext for censorship and surveillance. But let’s take a step back and look at the bigger picture.
CEO of Palantir Technologies Alex Karp speaks during the World Economic Forum (WEF) annual meeting in Davos on January 20, 2026. The World Economic Forum takes place in Davos from January 19 to January 23, 2026. (Photo by Fabrice COFFRINI / AFP via Getty Images)
If you do not think new technologies like AI will ever be used for nefarious ends, then you need to brush up on your history. History is littered with countless examples of new technology being used by evil people to murder and enslave populations. Since the first men roamed the Earth, humans have had a boundless appetite for killing, and technology has allowed us to indulge it.
Here’s a prime example: the Holocaust. The Holocaust was as catastrophic as it was because of technology. The Nazis used railroads, the telegraph, modern engineering, and poison gas to carry out their mass murder of Jews. Without these things, they would not have been able to kill at such a horrific scale.
The technologies themselves aren’t necessarily bad; it’s the humans who use them who can be evil. And human nature certainly hasn’t changed since the 1940s, nor will it ever.
AI is no different. It’s a tool that smart people will use to perhaps make the world a better place. Doctors will use it, and so will entrepreneurs. But bad actors who want more power and control over normal people will also abuse it.
Taking Karp at his word that AI will protect civil liberties rather than erode them would be a serious mistake. AI will be turned into a weapon, if it hasn’t been already. Technology has progressed a great deal since man first thought to strap a rock to a piece of wood. Human nature, however, has not.
Sign up for John Loftus’s weekly newsletter here! Follow John Loftus on X: @JohnCFLoftus1