11 Comments

I personally agree that the future is getting less predictable & AI in particular is going to falsify a huge overhang of extant long-term predictions which have misleadingly-good track records because they are predicated on AI never happening. But to bracket my personal opinions, how would one measure such things as 'people *think* the future is getting more/less predictable' or 'the future *has* been getting more/less predictable than people thought'? Some possibilities:

- prediction market forecasts being systematically overconfident: have PredictIt or Metaculus contracts been expiring with worse Brier or other proper scoring rule results when grouped by year? Is GJP's accuracy decaying? Are the old historical predictions from Tetlock's expert surveys getting worse over time? (A minimal sketch of this by-year grouping check follows the list.)

- prediction market prices being higher variance: prices reflect information & certainty, so if the future has been getting harder to predict, then price time series should be increasingly volatile (also sketched, on toy data, after the list)

- similarly, futures markets: volatility and the risk premia of long-dated options can either be higher or lower than in earlier time periods. (The future may be richer or poorer, but any change in levels would be reflected in the base price.)

- insurance costs: ditto: the riskier the future, the higher the premiums that must be charged. (eg. can you still buy cheap longevity insurance or annuities? If so, then insurers apparently don't believe that anti-aging research is going to pay off.)

- interest rates: interest rates reflect the desire to trade less money now for more money later, and incorporate a lot of things like nominal inflation or demographic trends like retirement planning, but also include risk such as expropriation or disaster or just the opportunity cost of being locked into the wrong investment. So at a simple first look, higher interest rates imply a more unpredictable future and vice-versa.

- conversely, debt loads: debt is dangerously fixed, so you will prefer debt to equity if the future is predictable and you can count on the cash flows to service it. On an individual level, things like student loans or house mortgages are commitments to a predictable future.

- social/geographic mobility: the more predictable the future, the more people will prefer buying houses to renting, or staying put to moving, because optionality is worth less. (If you know the future, you either have already moved to the right place or you know there's nothing better out there.)

- creative destruction and turnover in the economy, especially of large corporations

- increased contracting and regulation, redistributive politics, larger governments as a % of GDP, etc

- demographics: the older and more female a population, the less you expect any radical revolutions or uprisings which would spoil your predictions; violence is a young man's game.
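To make the first bullet concrete, here is a minimal sketch of the by-year grouping check. The data is made up rather than pulled from any real PredictIt/Metaculus/GJP export; the point is just that once you have (probability, outcome) pairs for resolved binary questions, the mean Brier score per resolution year is trivial to compute and compare:

```python
from collections import defaultdict

# Hypothetical resolved binary forecasts: (resolution_year, forecast_probability, outcome 0/1)
resolved = [
    (2018, 0.80, 1), (2018, 0.30, 0), (2019, 0.60, 1),
    (2020, 0.70, 0), (2021, 0.55, 1), (2022, 0.90, 0),
]

brier_by_year = defaultdict(list)
for year, p, outcome in resolved:
    brier_by_year[year].append((p - outcome) ** 2)  # Brier score for one binary question

for year in sorted(brier_by_year):
    scores = brier_by_year[year]
    # a mean Brier score rising over the years would suggest decaying forecast accuracy
    print(year, round(sum(scores) / len(scores), 3))
```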
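And a similarly toy illustration of the volatility check from the second bullet, on a synthetic price series (a real analysis would load actual contract or futures price histories; the synthetic walk below has constant step size, so it only demonstrates the computation, not a trend):

```python
import numpy as np
import pandas as pd

# Synthetic "price" series: a random walk with constant-volatility steps,
# clipped so the level stays in a probability-like range around 0.5.
rng = np.random.default_rng(0)
prices = pd.Series(0.5 + np.cumsum(rng.normal(0, 0.01, 1000)).clip(-0.4, 0.4))

daily_changes = prices.diff()
rolling_vol = daily_changes.rolling(window=90).std()  # 90-period rolling volatility

# Crude comparison: mean rolling volatility in the earlier vs. later half of the series.
half = len(rolling_vol) // 2
print(rolling_vol.iloc[:half].mean(), rolling_vol.iloc[half:].mean())
```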

Again bracketing AI out, my impression is that over the past 2 decades, pretty much all of these show trends consistent with belief in a more, rather than less, predictable future. What we see is a world where aging populations and governments invest ever more into various kinds of 'insurance' and avoiding any consequences or major changes, spending however many trillions of dollars it takes to satisfy risk aversion. In the 1990s, people thought a future like 2022 would be a lot crazier than it is. In 1995, it was easy to imagine how IBM (or Microsoft) might not exist in 2005; in 2022, can you imagine Google (or the rest of FANG) not existing in 2032? I can't. Stuff like CRISPR is cool and benefiting a few people, but again, stagnant compared to the hopes & dreams of what would happen once the Human Genome Project finished, and to things that happened back in the 1990s like Dolly or three-parent babies. People were looking forward to debunking Fukuyama with the rise of Russia (eg. Esther Dyson) or China, but he's having the last laugh as they prove to be hollow, corrupt authoritarian states which are struggling to maintain middle-income status, and the only people they appeal to as 'a new post-liberal-democratic paradigm' are would-be authoritarian strongmen. Or consider the lockdown response to COVID. So it looks like people expect a more predictable future, have thus far been largely right, and have tended to do things that would cause that.

The counter-arguments here are mostly anecdotal, and not even great ones. SBF incinerated $8b? Fine; but the consequences of FTX have thus far been mostly some embarrassment. Meanwhile, some dude over at AT&T incinerated up to $100b in a bad merger, which is a lot bigger, while 1MDB was a lot smaller ($1b) and the Guptas ('who?') stole a lot more (>$20b), and those had actual geopolitical consequences.

Some industrialist took over some relatively minor advertising company? Yeah, that's something that used to happen a lot; how nostalgic to see an instance in our latter days. It reminds one of the youthful American economy, before all the poison pills (which Twitter had) were legalized and other measures were put into place to stop takeovers (which they did).

Trump was elected? Sure, that was surprising, but the signature feature of Trump's administration was that it incompetently passed the time for 4 years, so it affected few meaningful predictions; and the forecasting error on Trump's election was within the range of historical election forecasting errors, so statistically, a candidate like Trump winning despite being forecast to lose isn't even unusual. Let's remember how many shenanigans there have been around presidential elections historically, whether it's Watergate or Literary Digest or the Compromise of 1877 or JFK beating Nixon being credited to his (non-social) media savvy, or how crazy 2000 was.

Xi/Putin are awful? Yes, they sure are; but are they more awful, and more unpredictable, than Mao or Stalin or Pol Pot or Kim Il-Sung or Adolf Hitler, or countless other dictators and tyrants throughout history? And is the current crop of dictators like Modi or Erdogan "increasingly unpredictable"? No.

Dec 7, 2022 · Liked by Daniel Goodwin

Your strong random individual thesis reminds me of the fractal nature of causality in biology. In most of the universe, a molecular-scale event gets washed out in random thermodynamic fluctuations. Best of luck to the ambitious hydrogen atom in the center of the sun; it will probably just get crunched into helium like its neighbors. But in biology the system is at the edge of chaos, so micro influences have macro effects. Even a quantum event can have societal effects if it swaps a nucleotide in a cancer driver gene that kills Steve Jobs, or adds 20 IQ points to the next John von Neumann. Similarly, our model of evolution is a single fit individual whose selfish gene quickly takes over the population.

The counterargument from biology might be emergence, multiple causality, or both. Things might have a distribution of causes, and it's hard for people to fit all the causal factors into their mental models. But a bigger machine learning model might, in theory, eventually be predictive with the right data. Or Wolfram might say it's all perfectly predictable in principle, just producing random chaos when you run the program out.

On a tangent, I've been thinking a lot about org design, and I see an interesting connection between the Jane Jacobs:Robert Moses::chaos:planning story you tell and the top-down org chart versus the Team Topologies / agile services approach.

Awesome post, love all the great references and new reading material. Cool that you used ChatGPT, but AI is a long way off from great writing like this!

Dec 5, 2022 · Liked by Daniel Goodwin

Considering alternatives to factory-education: you might want to check out Sudbury Valley School.

www.sudval.com

A warning: many people go to the website, read some things superficially, and think they understand the model. Consider that this might not be the case and that you need to dive in deeper.

One of the incredible things is watching the school's students and alumni (on YouTube). They radiate something very powerful.


I'm not convinced. SBF is less influential as a result of his company failing, and Trump less influential as a result of losing re-election. Putin may still be in power, but he's far from a Napoleon. He can't even defeat Ukraine! Stalin, in contrast, was a Russian leader who really changed the world. Xi is more powerful than Putin, but even he's having trouble with COVID lockdown protests. He certainly doesn't seem as influential as Mao.


Listening to the Joe Rogan/Steve Jobs interview rn… idk doesn’t seem that good? Open to the possibility that the underlying tech can become eerily good in less than a year though.
