No time to read the update? Watch (or listen to) my video update for free on Spotify or YouTube:
Ad: My 2 Hour Deep Diver course, in which I teach you to analyze companies in depth on your own, has already transformed the research process of dozens of satisfied students.
In their respective reviews, Eric and Alexander explain how the course is a market asymmetry in itself and how it has helped them become better analysts:
The price of the course has already gone up from $150 to $199 and it will likely continue increasing over time, so now is an excellent time to lock in a great deal.
By completing the course, you will emerge as a self-sufficient analyst, able to do more research in less time, know when to buy or sell a stock, and more easily spot undervalued companies, among other valuable skills that will help you slowly compound wealth over your lifetime.
Edited by Brian Birnbaum.
1.0 Contextualizing AMD’s AI Endeavors
AMD’s efforts in the AI space are asymmetric, and AI PCs are currently being overlooked.
The market is looking at AI as if it were only about selling GPUs. But all of AMD’s business segments are effectively distribution channels via which it can repackage and sell its core AI tech.
I believe this distribution advantage will pay off in the years to come, by yielding better unit economics for AMD than otherwise.
AI won’t be confined to GPUs; it’ll be absorbed into all computation platforms over the next decade, from smartphones, desktops, and laptops to cars and fridges.
Expertise in chiplets uniquely positions AMD to connect disparate compute engines. By extension, this competence sets AMD up to infuse all of its products with AI capabilities.
Over the long run, this is a much better strategy than only going head to head with Nvidia in the game of selling GPUs.
By bringing chiplet-based GPUs to the market with a differentiated price/performance ratio and iterating on its ROCm software, AMD already has a great chance of taking GPU market share from Nvidia. By simultaneously re-purposing that tech across its various business segments, AMD increases its overall odds of success.
Incidentally, AMD seems to be making good progress on the software side:
For example, we are very pleased to see how quickly Microsoft was able to bring up GPT-4 on MI300X in their production environment [via the recently launched ROCm 6] and rollout Azure private previews of new MI300 instances aligned with the MI300X launch.
-Lisa Su, AMD CEO during the Q4 earnings call.
The potential upside in taking market share from Nvidia is huge, but so is the upside in becoming, to give just one example, the number one provider of AI PCs. Better yet, AMD can take on both endeavors at a marginal cost because the competitive advantage in both cases stems from its chiplet platform, which can generalize across the aforementioned product lineups and beyond.
AMD already has the distribution channel on the PC (CPU) side. This means that even if the company does not succeed in taking market share from Nvidia, it can still earn a strong return on its AI investment via PCs.
Hence the asymmetry of AMD’s move into the AI space.
In Q4, we saw both the datacenter and client segments grow rapidly YoY. Datacenter revenue grew 38% YoY to $2.3B, with datacenter GPU revenue exceeding the expected $400M. In turn, client revenue came in at $1.5B, up 62% YoY and flat sequentially.
Apart from the obvious technical complexity within the semiconductor business, much of the difficulty stems from the logistics. Manufacturers only want to partner with successful designers whose products are coveted in the market.
With AMD only just entering Nvidia’s turf, it naturally takes time for manufacturers like TSMC to commit capacity. In Q4, however, AMD seems to have made good progress on that front.
As a result, AMD has revised its datacenter GPU guidance upward, and, although growth is not as explosive as the market expected, I think the company is on the right track to take market share from Nvidia. Quarter after quarter, I believe both customers and manufacturers will adjust to AMD’s presence in the AI market:
Based on the strong customer pool and expanded engagements, we now expect Data Center GPU revenue to grow sequentially in the first quarter and exceed $3.5 billion in 2024.
We have also made significant progress with our supply chain partners and have secured additional capacity to support upside demand.
-Lisa Su, AMD CEO during the Q4 earnings call.
Further, Lisa Su made some insightful comments on AI PCs during the earnings call. Specifically, notice how she talks about integrating a CPU with an AI engine:
To further our leadership in AI PCs, we launched our Ryzen 8000 G-series processors earlier this month, which are the industry's first desktop CPUs with an integrated AI engine.
Millions of AI PCs powered by Ryzen processors have shipped to date and Ryzen CPUs power more than 90% of AI-enabled PCs currently in the market.
Our work with Microsoft and our PC ecosystem partners to enable the next-generation of AI PCs expanded significantly in the quarter.
We are aggressively driving our Ryzen AI CPU roadmap to extend our AI leadership, including our next-gen Strix processors that are expected to deliver more than three times the AI performance of our Ryzen 7040 series processors.
The embedded and gaming segments are down due to cyclicality. The above reasoning applies to them as well, however, because both will continue growing over the long term.
At the same time, we are rapidly driving leadership AI compute capabilities across the full breadth of our embedded product portfolio. This is an incredibly exciting time for the industry and even more exciting time for AMD as our leadership IP, broad product portfolio, and deep customer relationships position us well to deliver significant revenue growth and earnings expansion over the next several years.
-Lisa Su, AMD CEO during the Q4 earnings call.
2.0 AMD’s Actual Competitive Advantage in AI
AMD’s AI products have a memory advantage.
At a lower level, and as previously mentioned, AMD’s move into the AI space has been enabled by its chiplet expertise. At the product level, this structural advantage seems to be translating into superior memory capacity.
For an AI model to run inference (a forward pass, which is also half of each training step), you want the model hosted as close as possible to the GPU’s compute units to minimize latency. Ideally, the entire model fits inside a single accelerator’s memory, so that data travels the shortest possible distance.
AMD’s chiplet expertise gives the company more granularity when it comes to connecting different engines within one chip, which should translate into more efficient management of the memory/compute equation over time.
Although my conviction on this matter has been growing for around a year now, it was interesting to hear Lisa Su reference the above in the Q4 earnings call:
I think the traction that we're getting with the MI300 family is really strong.
I think what's benefited us here is our use of chiplet technologies […] and that's how we get our memory bandwidth and memory capacity advantages.
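To make the memory-capacity point concrete, here is a rough back-of-the-envelope sketch. The model sizes are illustrative; the 192 GB and 80 GB figures are the published HBM capacities of the MI300X and the H100 SXM, and the calculation counts weights only, ignoring activations and KV cache:

```python
# Back-of-the-envelope check: do a model's weights fit in a single
# accelerator's high-bandwidth memory (HBM)?

def weights_gb(params_billions: float, bytes_per_param: int = 2) -> float:
    """Memory needed for weights alone (fp16/bf16 = 2 bytes per parameter)."""
    return params_billions * 1e9 * bytes_per_param / 1e9

def fits(params_billions: float, hbm_gb: float) -> bool:
    # Real deployments also need memory for activations and the KV cache,
    # so this is an optimistic lower bound.
    return weights_gb(params_billions) <= hbm_gb

MI300X_HBM_GB = 192  # AMD Instinct MI300X
H100_HBM_GB = 80     # Nvidia H100 SXM

for size in (13, 70):  # illustrative model sizes, in billions of parameters
    print(f"{size}B params -> {weights_gb(size):.0f} GB of weights; "
          f"fits MI300X: {fits(size, MI300X_HBM_GB)}, "
          f"fits H100: {fits(size, H100_HBM_GB)}")
```

The takeaway of the sketch: a 70B-parameter model at 16-bit precision needs roughly 140 GB for weights alone, which fits on one MI300X but must be split across multiple 80 GB accelerators, adding inter-GPU communication and latency.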
3.0 Financials
Income and Cash Flow Statements
As mentioned, the progress in the datacenter and client segments is muted by the cyclical decline of the gaming and embedded segments. This is particularly visible at the cash flow level, with cash from operations and free cash flow trending down despite an uptick on the top line.
On a full-year basis, revenue is down 4% YoY to $22.7B due to the above dynamic, with Q4 revenue coming in at $6.2B. In Q1 2024, both the embedded and gaming segments are expected to continue declining by double-digit percentages, which means we should see a continuation of the above trend over the next quarter or two.
Balance Sheet
Meanwhile the balance sheet remains strong, with an ample net cash position.
4.0 Conclusion
It is frustrating to see the various declining segments muting the shinier ones this quarter. That being said, I believe AMD’s segments will converge to form a far greater computing platform over the next decade. This will be enabled in turn by AMD’s ability to connect computing units.
But I'd like to kind of reemphasize what I said earlier, even in the case of process parity, we feel very good about our architectural roadmap and all the other things that we add, as we look at our entire portfolio of CPUs, GPUs, DPUs, adaptive SoCs and kind of put them together to solve problems.
-Lisa Su, AMD CEO during the Q4 earnings call.
This stated approach will separate AMD from its competitors and increase the company’s free cash flow yield over time.
At the moment I do not have any specific advancements to share regarding this platform, but as I have seen with other companies in the past, AMD has both the technology roadmap and the organizational properties needed to make magic happen. Over the coming 4-6 quarters, I believe AMD will make great progress on this front.
Until next time!
⚡ If you enjoyed the post, please feel free to share with friends, drop a like and leave me a comment.
You can also reach me at:
Twitter: @alc2022
LinkedIn: antoniolinaresc