I recently read a fascinating paper that got me thinking. It wasn't about startups, or programming, or even venture capital. It was about how AI thinks about money. And it confirmed a suspicion I've had for a while: AI is probably better at gambling than you are.
Not in the way a card counter is better, by memorizing what's been played. But in a more fundamental way. AI lacks the baggage that makes humans so consistently bad at making bets.
The study was simple. Researchers took a standard set of financial questions—the kind of "would you rather have $100 now or $150 in a year?" questions that economists use to measure how people make decisions—and put them to a bunch of the latest AIs. Then they compared the AIs' answers to a massive dataset of human answers from 53 countries.
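To see what a purely rational answer to that kind of question looks like, consider the arithmetic. Here's a minimal sketch in Python, using the example question above (the specific numbers are illustrative, not from the paper):

```python
# "Would you rather have $100 now or $150 in a year?"
# A risk-neutral agent takes the later payout whenever its present value
# beats the immediate one: later / (1 + r) > now.

now, later, years = 100, 150, 1

# The discount rate at which you'd be exactly indifferent.
indifference_rate = (later / now) ** (1 / years) - 1
print(f"Indifference rate: {indifference_rate:.0%}")  # 50% per year

# Anyone whose personal discount rate is below 50% should wait.
my_rate = 0.10
present_value = later / (1 + my_rate) ** years
print("Wait for the $150" if present_value > now else "Take the $100 now")
```

Most people take the $100 now, which implies a discount rate above 50% a year. That's the kind of inconsistency the researchers were probing for.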
The results were telling. When it came to simple, lottery-style bets, the AIs were ruthlessly logical. They didn't get greedy or scared. They just calculated the expected value and picked the option that was mathematically superior. They were perfectly "risk-neutral."
Humans are the opposite. We are walking bundles of cognitive biases. The most powerful of these is loss aversion. For most people, the pain of losing $100 is far greater than the pleasure of winning $100. This makes us timid. We turn down good bets because we're terrified of the downside. We sell our winning stocks too early and hold on to our losers for too long, hoping they'll turn around. It's irrational, and it's deeply human.
The AIs have none of this. They don't feel a loss. It's just a negative number in a calculation. They can look at a bet with a 60% chance of winning $10,000 and a 40% chance of losing $8,000 and make the right call, instantly, without a knot in their stomach. They just run the numbers.
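The arithmetic on that bet is worth spelling out. Below is a minimal sketch comparing the risk-neutral evaluation with a loss-averse one; the loss-aversion multiplier of 2.25 is Kahneman and Tversky's classic estimate from prospect theory, not a number from the paper:

```python
# The bet from above: 60% chance of winning $10,000,
# 40% chance of losing $8,000.
outcomes = [(0.60, 10_000), (0.40, -8_000)]

# Risk-neutral evaluation: plain expected value.
ev = sum(p * x for p, x in outcomes)
print(f"Expected value: {ev:+,.0f}")  # +2,800: the AI takes the bet

# Loss-averse evaluation: losses are weighted more heavily than gains.
# lam = 2.25 is Kahneman and Tversky's classic loss-aversion estimate.
lam = 2.25
felt_value = sum(p * (x if x >= 0 else lam * x) for p, x in outcomes)
print(f"Felt value: {felt_value:+,.0f}")  # -1,200: the human walks away
```

Same bet, opposite answers. The numbers didn't change; only the weighting did.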
This is why AI is better at gambling. It's not that it can predict the future. It's that it's free from the past—from the millions of years of evolution that have wired our brains to avoid risk at all costs. Our ancestors weren't trying to optimize their stock portfolios; they were trying to not get eaten by lions. A rustle in the bushes might be a lion, or it might be the wind. The cost of assuming it's the wind when it's a lion is death. The cost of assuming it's a lion when it's the wind is a wasted afternoon. So we're wired to assume the worst.
In the modern world, this wiring is a liability. It makes us bad investors, bad poker players, and bad decision-makers in any field that involves uncertainty.
Now, the study did find something strange. While the AIs were perfectly logical on simple bets, their reasoning got a bit fuzzy on more complex questions about time and money. The paper called it an "Illusion of Thinking," suggesting the AI was just matching patterns without truly understanding. But I think that's the wrong way to look at it. It's not a bug; it's a feature. The AI isn't getting bogged down in the messy, contradictory, and often irrational ways humans think about the future. It's sticking to the data it was trained on.
And here's the most interesting part. The researchers found that the AI's financial "personality" didn't match that of people from rich, Western countries, as you might expect. It most closely resembled the answers of people from Tanzania. The theory is that this is because many of the people who do the grunt work of training these AIs—the human feedback raters—are from East Africa. The AI has, in a sense, absorbed their cultural outlook.
But even this is a kind of purity. The AI isn't a confused mix of all the world's biases. It has a clean, consistent, and, in the case of simple bets, ruthlessly logical worldview.
This leads to an obvious application. If you could build a system that lets an AI make bets based on pure, unbiased probability, you could potentially outperform the market. You could build an AI that trades derivatives.
Derivatives are the ultimate form of gambling. They are pure bets on the future price of an asset. And the market is full of humans making bad bets because they're scared, or greedy, or just plain irrational. An AI with no fear, no greed, and a superhuman ability to calculate expected value could, in theory, clean up.
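What would that look like in code? Here's a toy sketch of a fearless bettor, with one assumption I should label loudly: `p_win` comes from a hypothetical, well-calibrated model, and building that model is the genuinely hard part. The sketch also uses the Kelly criterion for position sizing, which isn't in the paper; it's the standard way to bet a positive-EV edge without risking ruin:

```python
# A toy risk-neutral bettor. Assumes p_win comes from a well-calibrated
# model, which is the hard part this sketch waves away.

def expected_value(p_win: float, win: float, loss: float) -> float:
    """Expected profit of a bet that pays `win` with probability `p_win`
    and costs `loss` otherwise (both positive dollar amounts)."""
    return p_win * win - (1 - p_win) * loss

def kelly_fraction(p_win: float, win: float, loss: float) -> float:
    """Bankroll fraction that maximizes long-run growth (Kelly criterion):
    f = p - (1 - p) / b, where b = win / loss is the payout ratio."""
    b = win / loss
    return max(0.0, p_win - (1 - p_win) / b)

# The bet from earlier: 60% to win $10,000, 40% to lose $8,000.
p, win, loss = 0.60, 10_000, 8_000
print(expected_value(p, win, loss))           # 2800.0: positive EV, take it
print(f"{kelly_fraction(p, win, loss):.0%}")  # 28%: but size the position
```

Note the irony in the last line: even a perfectly logical gambler shouldn't stake everything on a good bet, because a long enough run of 40% outcomes would wipe it out first. Fearlessness is about taking the bet; discipline is about sizing it.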
It wouldn't be easy. The AI would need to be trained on the right data. It would need to be fast. And you'd have to be willing to trust it, even when its decisions seemed counter-intuitive or just plain weird. But the potential is there. The same quality that makes AI feel alien is what could make it a formidable gambler. It doesn't think like us. And when it comes to making bets, that's a very good thing.
The ideas in this post were inspired by the paper "Artificial Finance: How AI Thinks About Money."