No time to read the update? Watch/listen for free:
Edited by Brian Birnbaum.
Meta’s Q2 2024 earnings report suggests we are not, in fact, in an AI bubble. The asymmetry that I pointed out in my original Meta deep dive is only starting to unfold, with the stock up 415% since then.
At present, Meta’s business primarily consists of driving and better monetizing engagement across its Family of Apps. In my last Meta update I explained how the company saw an 8-10% increase in Reels watch time by transitioning toward a single AI recommendation model architecture. Until recently, this shift was limited to Reels.
As of Q2 2024, Meta has transitioned to a unified AI model for all video players across its Family of Apps. AI is thus clearly driving incremental engagement across Meta’s network, proving that it is already a productive technology.
In the Q2 2024 earnings call Zuck said that Llama 4 will require ten times more compute than the current version, Llama 3. What surprised me is that a single iteration of the model already demands an order of magnitude more compute. The continued scaling of LLMs (large language models) will drive exponential growth in the chip sector. During a recent AMD event, Lisa Su revealed that Llama 3.1’s live traffic runs exclusively on AMD’s MI300 chips. Thus, Meta’s advancements also suggest, as I hypothesized in my original AMD deep dive, that the MI300 is set to take share from Nvidia in the AI accelerator space.
AI is enhancing Meta’s value because, for the past five years, while many thought the company was betting the house on the Metaverse, it has been building highly fungible datacenter infrastructure. This setup allows Meta to capitalize on the highest-RoI opportunities as they emerge: the company can tweak the entire AI stack, from the hardware to the models and end applications. Take, for example, the aforementioned shift toward a single-model AI recommender for video, which will unlock more value from AI over time. Meanwhile, the market’s Internet Bubble veterans, masquerading as AI Bubble Cassandras, have been singing the siren song of diminishing marginal returns.
Zuckerberg also noted that Meta’s new AI-powered apps built on Llama 3 carry notable long-term implications. Here’s what he said about this in the Q2 earnings call:
So Llama is the foundation model that people can shape into all kind of different products. So whether it's Meta AI for ourselves or the AI Studio or the business agents or like the assistant that's in the Ray-Ban glasses, like all these different things are basically products that have been built with Llama. And similarly, any developer out there is going to be able to take it and build a whole greater diversity of different things as well.
Fundamentally, Meta’s AI endeavors are becoming fractal: the number of sub-models that can emerge from the parent model is endless. Zuckerberg’s vision is for anyone to be able to create an AI model of their own, including businesses and content creators. These models will be able to automate any process within Meta’s network, such as conversations with customers, followers, and more. Most importantly, this fractal architecture will enable businesses to simply state operational goals and have Llama n create a model for them in a matter of seconds.
Long term, fractal AI promises to make the company more money than its current advertising operation. In the meantime, it was interesting to see Meta distinguish between core AI and generative AI in the Q2 earnings call. Certainly, much of the AI hype is yet to materialize, but it’s encouraging to see Meta separate the AI that’s productive today from the AI whose value will be unlocked by continued CapEx. Meta’s fungible datacenter infrastructure seems to let it prioritize capital allocation toward AI initiatives that drive value today, without missing out on the long-term opportunities:
Our ongoing investment in core AI capacity is informed by the strong returns we've seen and expect to deliver in the future, as we advance the relevance of recommended content and ads on our platform.
While we expect the returns from Generative AI to come in over a longer period of time, we’re mapping these investments against the significant monetization opportunities that we expect to be unlocked across customized ad creative, business messaging, a leading AI assistant and organic content generation.
As we scale generative AI training capacity to advance our foundation models, we’ll continue to build our infrastructure in a way that provides us with flexibility in how we use it over time.
This will allow us to direct training capacity to gen AI inference or to our core ranking and recommendation work, when we expect that doing so would be more valuable.
We will also continue our focus on improving the cost efficiency of our workloads over time.
- Susan Li, Meta CFO, during the Q2 2024 earnings call.
The market’s concern about Meta open-sourcing Llama is another instance of its failure to understand how value is generated in networks over the long term. The value is not in the models themselves, but in applying them to a winner-takes-all network from which participants can then extract value. In isolation, the model in question can do cool things, but it’s of little value if you can’t point it at a 100,000-follower audience, for example. Indeed, by open-sourcing Llama, Meta is actually increasing the odds that the model maximizes value.
It is also interesting to see Threads reach over 200 million monthly active users. Meta is yet again waiting patiently for 1 billion monthlies before monetizing. Meta’s playbook is an instance of the scale economies shared concept that I refer to in my last Hims update, one typical of today’s global juggernauts. Instead of monetizing straight away, Meta is willing to share the fruits of scale (compute, fundamentally) with users until it dominates the space. I’ve seen the market repeatedly misunderstand this concept when applied to networks, including in the Uber and Spotify cases, for example.
Here’s what Zuck said about this in the Q2 2024 earnings call:
So I think that's something that our investors and folks thinking about analyzing the business, if needed to always grapple with, is all these new products, we ship them and then there is a multiyear time horizon between scaling them and then scaling them into not just consumer experiences but very large businesses.
Meta continues to strike me as an asymmetry factory. Threads started off as an experiment, but it is now on course to become one of the major internet platforms worldwide. The AI datacenter investments Meta has been making over the past five years, while dampening the financials in the short term, have given rise to a fractal AI model setup that could yield revenue in excess of the company’s current total revenue. Incidentally, Reality Labs segment revenue was up 28% YoY, driven “primarily by Quest headset sales.” This part of the business seems to be gaining traction and, in combination with the AI infrastructure, could evolve into a new computing platform.
In Q2 2024, Family of Apps expenses came in at $19.4B, representing approximately 80% of overall expenses. When I wrote my original deep dive on Meta, the market had somehow come to believe that Meta was betting the house on the Metaverse. Not only was this sorely misguided, but Meta was in fact investing most of its capital to supercharge its Family of Apps via the rollout of its AI stack. Zuck’s philosophy of optimizing the investment of retained earnings in AI reveals an asymmetry that is still in its early innings. When I wrote the deep dive I pointed out that we were at T8; I now believe we are at T9.
What kept me from investing in this business in November 2022 was my concern that Meta may be tobacco 2.0. I broadly distinguish between screen time that adds value to your life and screen time that detracts from it, and I believe Meta is largely in the latter camp. Since I formed this view, however, Zuckerberg has proven to be an extraordinary capital allocator, and Meta has demonstrated the organizational ability to simply iterate faster than anyone else. Zuckerberg made special allusion to this component of Meta’s DNA in the Q2 2022 earnings call:
And culturally, we focus on moving and learning faster than everyone else. And I think that those are sustainable advantages.
But I think at the end of the day, what that really comes down to is just I try to push the company to be one that learns faster and just keeps iterating and moving faster than we did in the past and than others in the industry do. And I think if we can do that well, then we'll continue to succeed.
But I think the moment that we stop doing that, then we'll basically fall behind.
Focusing on the end customer first and iterating fastest has paid off once again, as has been the case with our other winners: Spotify, Netflix, Amazon, and, in my opinion, Hims. In November 2022 the market thought TikTok was going to kill Meta; many smart people were convinced of this. This case therefore serves as additional validation of the mental model I teach in my Tech Stock Goldmine course, which is also behind my successful Palantir (6X) and Spotify (5X) picks.
If you haven’t already, I recommend reading my original Meta deep dive. By doing so, you will clearly see the mental models I used to spot the investment opportunity. That way, you don’t have to buy the course to take home some very powerful tools for your pursuit of new investment opportunities.
Until next time!
⚡ If you enjoyed the post, please feel free to share with friends, drop a like and leave me a comment.
You can also reach me at:
Twitter: @alc2022
LinkedIn: antoniolinaresc