AI Culture Will Be Weirder Than You Can Imagine

There are two radically different visions of our AI future, and which one we get depends on the cost of energy.

In one scenario, low energy prices lead to a lot of slack. At the margin, people don’t need to be so careful about how they deploy their AIs. Right now, for instance, I don’t pay extra for using my current LLMs more. So I am willing to play around with them a lot without worrying about whether any single use is going to achieve some concrete useful end. I just let things rip. The result is some silliness, some jokes and more indulgence of my random obsessions, in addition to help with my history and economics questions.

I call this the AI Future With Slack.

It is not clear how long the system can operate like this. As more institutions work with generative AI, demands on those services will increase. AI companies will have to invest more to meet the growing demand for computing power. AI services will also lose their initial venture-capital-funded runways and be forced to make a profit. Over the long term, each use of generative AI will cost a noticeable amount of money.

I call this the AI Future Without Slack.

Both AI usage and global economic growth will significantly boost the demand for energy, and thus energy prices. Tapping the vast computing power that AI requires could mean substantially higher energy costs.

Of course, many different variables figure into energy costs, ranging from the future of nuclear fusion and battery technologies to numerous regulatory decisions. I don’t have specific predictions other than to say that energy will continue to be relatively cheap for households (that is, voters) and will get relatively expensive for business-owned AIs.