Updating The Hype Cycle

Warren Buffett and Charlie Munger are two rich philosopher kings whose purchases of candy stores and Coca-Cola shares have spawned aphorisms on life, psychology, and investing, making them the closest thing we have to a modern-day Ben Franklin.

I've read Poor Charlie's Almanack, more than one Warren Buffett biography, and the Berkshire Hathaway annual letters. And yet, for some reason, I'm still not a billionaire? Is there another form I need to fill out?

But in all that corpus of thought, there is one quote that they've both used and re-used that has stuck with me. I think it's maybe one of the most important ideas out there, and the failure to internalize it is causing a lot of people a lot of pain for a lot of reasons:

“When my information changes, I change my mind. What do you do?”

The concept of being willing to change your mind goes hand-in-hand with the willingness to learn lessons. To evolve, and change behavior. Unfortunately, that is something most of us struggle to do.

The Latest Hype Cycle

A few months ago, I wrote a piece called The Hype Cycles of Venture Capital with a graphic I was quite proud of, given my deeply inadequate Canva skills.

While the world fell apart in 2022, with capital drying up, interest rates rising, and layoffs mounting, there was still one sliver of 2021 that managed to persist: a sudden explosion of interest in artificial intelligence in its many forms. You saw companies raising $100M+ Series A rounds at $1B+ valuations with effectively no revenue. Then there's OpenAI, valued at $20-30 billion with plans to go from ~$35M in revenue to $1B in a pretty short period of time.

Now, there have been plenty of people writing about AI and the technology and the implications, use cases, and downfalls. I'm not as focused on that in my writing today. I've been reflecting more on the fundamental questions around whether or not this fervor makes sense. What, if any, lessons are we applying to this current craze that we wish we'd known in 2021?

The Supposed Lessons Learned

There was a lot of hand-wringing and tough reflection in 2022 as people acknowledged that "mistakes were made (but not by me)." Laser eyes and .eth disappeared from Twitter profiles, and VC blogs became riddled with writing on the importance of profitability and being default alive. I kept thinking about how important it was for all of us to try and internalize the lessons from the hype cycle we had created.

While I thought, "we'll learn these lessons and hopefully spend years internalizing them, and then in 8-10 years, hopefully we'll have learned something and it can be different next time," I didn't expect us to put that to the test within ~6 months instead. So what were some of the lessons that people supposedly learned? Here are just a few that come to mind:

  • Hype doesn't automatically translate into value created (aka generated returns)
  • Valuations ought to be representative of future cash flows
  • Passing the bag only works for so long
  • An endless supply of money can hide a multitude of sins
  • Building long-term business value takes a long time, and requires extensible platforms, defensible moats, efficient investment of cash / resources, and effective distribution

So, let's apply some of these lessons to what's going on in AI. Did we learn anything? Is this different than what we've been steeped in for the last few years?

AI: Why The Hype?

There is no question that AI has reached an inflection point. In 2017, the famous Transformer paper was released, unlocking a new architecture for NLP. Since then, similar advances in reinforcement learning, deep learning, generative adversarial networks, and transfer learning have all contributed to a leap forward in the capabilities of different types of automation.

There is also a fascinating change that Sam Altman, Eric Schmidt, and others have talked about, where open-source advances have kept much closer pace with closed-source advances in AI. At Contrary, we did an event with Amjad Masad, the CEO of Replit, last December, and he had this great point that AI/ML hadn't really started to impact things like coding, writing, art, or music until basically last year.

There is a fundamental shift in technology happening that is powerful, fascinating, and will have wide-ranging effects; there's no doubt about that. But I agree with my friend Evan Armstrong in his hesitation about the implications of that innovation for the world of VC-backed startups:

"AI is probably the most exciting tech paradigm since the personal computer. But at the risk of sounding like the guy in Times Square yelling about the end of the world, I feel the need to scream, “Technological innovation does not equal investment opportunities."

Lessons Applied: The Definition of Value

VCs talk a lot about the difference between value creation and value capture. The TLDR is that just because you create value for someone doesn't mean they'll pay for it. One simple example is Facebook. There is no value capture from consumers. Only value creation. In the immortal words of Senator Orrin Hatch, "how do you sustain a business model in which users don't pay for your service?"

The value capture occurs, as Mark Zuckerberg explained on Capitol Hill, by "running ads." Advertising is the mechanism for capturing value after you've created it with a free product. Every business has these types of inputs and outputs. You accrue costs, resources, capital, and talent in order to create value. Then, you have to ask yourself whether the value you capture justifies the costs incurred to create that value in the first place.

But one of the ways investors trick themselves into ascribing value to a company comes from imagined value. Investors are pretty effective storytellers, but like Medusa turning herself to stone by looking in the mirror, those stories can sometimes turn on their tellers.

Some of the biggest investments in the world have revolved around someone's "value imagination": Quibi, Google Glass, IBM Watson. These things happen because people can imagine the kind of value that could be created. Let's take one of these as an example: IBM Watson. Will Manidis had a great thread about the impact of GPT on healthcare, in which he talked about the past promise of IBM Watson.

"The failure of Watson was staggering for the field-- MD Anderson, the cancer center at the University of Texas, spent $62 million dollars implementing the system in critical oncology workflows-- before mothballing it less then a year later."

In the world of AI, there is an ocean of imagined value. By definition, when companies raise at exorbitant valuations, they're literally being funded on the basis of imagined value. A number of AI companies, like Adept, Anthropic, and Runway ML, are raising at valuations of 100x-500x their revenue, if not an infinite multiple given zero revenue.
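
To make those multiples concrete, here is a minimal sketch of the arithmetic, assuming purely hypothetical revenue and valuation figures (nothing below is tied to the actual financials of any company named above):

```python
# Illustrative sketch only: the revenue figures and multiples below are
# hypothetical, not the actual financials of any company in this piece.
from typing import Optional


def implied_valuation(annual_revenue: float, revenue_multiple: float) -> float:
    """Valuation implied by pricing a company at a multiple of its revenue."""
    return annual_revenue * revenue_multiple


def implied_multiple(valuation: float, annual_revenue: float) -> Optional[float]:
    """Revenue multiple implied by a valuation; None when revenue is zero."""
    if annual_revenue == 0:
        return None  # the "infinite multiple" case described above
    return valuation / annual_revenue


# A company doing $10M in revenue priced at 300x is a $3B company on paper.
print(implied_valuation(10_000_000, 300))   # 3000000000

# A $1B valuation on zero revenue has no finite multiple at all.
print(implied_multiple(1_000_000_000, 0))   # None
```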

The biggest question for all of these companies is "how much value can you actually create? And then what is the business model to capture that value?" I see two obstacles that will likely impact the majority of companies entrenched in AI today:

(1) Creating Value

So much of the innovation in the space is happening as part of a complex interwoven web of players and research labs, all feeding off similar breakthroughs. I've worked with one of the best AI investors of the last decade, and his constant line was "your differentiation can never come from just having a better model." Creating value will be increasingly difficult as everyone works to push the field forward in similar ways.

(2) Capturing Value

While there may be some companies that capture value from foundational models (more on that later), the vast majority of AI companies aren't trying to build their own core foundation. They're building applications. They take open-source models, or the same APIs that everyone else is using, and try to build specific use cases on top of them. But because everyone is building off a very similar foundation, it's very difficult to differentiate your product.

Jasper is a good example; they grew to ~$80M in revenue as a first mover in generative AI for copywriting. But they were built on OpenAI, which led to a surprising revelation when OpenAI started releasing their own products:

"With a simple prompt, ChatGPT could craft a business proposal, write a resignation letter or explain the inner workings of quantum mechanics. In fact, it worked a lot like Jasper’s core product. But unlike Jasper, ChatGPT was free."

ChatGPT isn't the only threat. There is now an ocean of generative text products that could compete with Jasper, some powered by OpenAI, others not. This has left Jasper looking for ways to dramatically expand their product line and feature set. But when you have 100+ generative copywriting tools, do you think you can keep charging $500 a month?

Lessons Applied: The Value of Distribution

The most significant implication of the current wave of innovation in AI feels somewhat unique to this moment. If I were to broaden a lesson from 2021, it's that real long-term business value takes a long time and is difficult to create. AI is just illustrating that lesson by proving how much more valuable it can be for incumbents.

Several people have written about this idea that existing incumbents with established distribution will fare much better than newer startups:

Sam Lessin:

"I am very very very skeptical that the value of generative AI will be captured by startups vs. incumbents. Unlike a real platform shift, AI is just an accelerant to existing businesses and existing patterns of distribution - and is super-duper easy for incumbents to just slot in to what they already do."

Evan Armstrong:

"AI’s longest-term impact is that it would bring the cost of digital good creation close to zero, forcing companies to compete on distribution efficiency… Microsoft, Amazon, and Google will do quite well selling pure-play AI products because they invent or replicate the underlying techniques while simultaneously storing all of the fine-tuning data. As an added bonus, they already sell their products to every company on the planet, giving them a distribution advantage."

Tomasz Tunguz:

"Tech giants have amassed the largest data sets, leveraged massive balance sheets to finance the human and financial capital-intensive work of training models - which can cost millions of dollars for each run - and amassed years of experience working with large data sets.In addition to these the advantages, these incumbents have another edge: their distribution. Microsoft Office’s customer base spans more than 1m companies & 350m paid users. Microsoft will upsell new ML features across their product suite at a premium - just like Premium Microsoft Teams."

There are also dozens of examples of incumbents, both large and small, implementing the newest advances in AI to reinforce their existing business strengths. Canva launching image generation undercuts a lot of other standalone text-to-image models, and then Microsoft trumps that by integrating DALL-E 2 into its new Designer platform. Intercom has launched ChatGPT functionality in its chatbot, and BuzzFeed is using AI to automate some of its content.

So much of the value being created or captured will ultimately accrue to these incumbents. Even newer companies leading the charge, like OpenAI, have had to tie themselves to incumbents like Microsoft because of some of the inherent limitations that a startup would have, like the eye-watering costs of compute.

What are investors looking for in AI?

So much of what investors focus on is very similar to what they always look for at the earliest stages:

Talent

While the explosion of applications is everywhere, the pool of exceptional AI/ML founders is limited. Even if a person is building what feels like just a project, you want to be working with the sharpest people as this space unfolds.

Proximity

The combination of open-source models, shared best practices, requirements for training data, and compute resources will keep a lot of these companies tethered (e.g. OpenAI + Microsoft). So you want to invest in companies plugged into that zeitgeist.

Community

The latest wave of AI will largely be played out among the platforms. The most interesting tools at the early stages are those that are able to build intense communities around themselves, often around a specific use case. Companies like Replit have an existing product surface area and an existing base of loyal users. Generative AI can augment their product to make it better, but they're not at the mercy of AI as their only value proposition.

What Does This Mean For Venture?

The unfortunate reality is that I think a lot of investors didn't learn anything from the hype of 2021. And at the end of the day, this comes back to the structure of incentives and business models that many investors are employing.

For many of these firms, your job isn't actually to make the most pragmatic decisions about what can drive long-term business value. Your business model is to expose capital to assets with the most dramatic potential increases in value. That's what 2021 was all about: being exposed to the assets that had the makings of something that would continue to appreciate in value.

Does that appreciation always come with revenue growth, cash flow, or market share? Maybe, but not necessarily. This is another reason why I so often return to this idea of nonsensical valuations in private markets. The narrative, the imagined value, can play just as big a role as the fundamentals, if not a bigger one, in determining the valuation of a particular company. The most valuable companies will be those with the best stories to tell, and investors will continue to trip over themselves in pursuit of those stories.

For those of you who find yourselves feeling skeptical of this kind of "heat seeking," you should ask yourself: what kind of investor are you? What kind of founder are you? Are you more focused on building long-term business value? Or are you focused on getting as wealthy as possible, as quickly as possible? Because those are not necessarily the same games. And whether it's building long-term business value or getting rich, it's certainly possible (even most likely) that you'll lose one of those games. But if you don't know what game you're playing? Well, then you're sure to lose.