This is an update of my original Microsoft deep dive.
Edited by Brian Birnbaum.
Microsoft’s AI CapEx has a large asymmetric component to it.
The primary concern with Microsoft now is whether its exorbitant AI CapEx will make sense long term. In the latest quarterly report, CapEx (including finance leases) came in at $19B, just over 29% of revenue for the quarter. On the call, management explained that nearly all of this CapEx was directed towards cloud and AI initiatives, with half of the spend going to infrastructure and the other half to CPUs and GPUs.
Speaking of which, Satya mentioned both AMD and Nvidia as sources of compute engines, bestowing upon both a similar level of importance:
We added new AI accelerators from AMD and NVIDIA, as well as our own first party silicon Azure Maia. And we introduced new Cobalt 100, which provides best-in-class performance for customers like Elastic, MongoDB, Siemens, Snowflake, and Teradata.
As you may know, one of my largest positions is AMD. I believe the company will take market share from Nvidia in the AI accelerator space. Over time I see AMD leaning towards inference, a market that will grow exponentially over the coming decade. This isn't to say Nvidia will falter, but rather that the market is underpricing the imminent evolution of AMD's business. In my last Meta update, I also outlined how Llama 3.1 is now running exclusively on AMD's MI300 chip for live traffic. I see AMD's AI accelerators gaining meaningful traction. Meanwhile, the stock is down approximately 23% from all-time highs.
Back to Microsoft's CapEx concern: management seems to be thinking about deploying this capital much like Meta does. They have been architecting a highly flexible and fungible infrastructure that allows them to instantiate datacenters and hydrate components as demand signals come in. They build out the base infrastructure and then deploy components as AI usage increases. Microsoft intends to monetize datacenters over a fifteen-year period. The long duration, coupled with the aforementioned flexibility, in theory gives them the capacity to deploy capital efficiently.
Here’s what Microsoft CEO Satya Nadella said about this during the call:
The asset, as Amy said, is a long-term asset, which is land and the datacenter, which, by the way, we don't even construct things fully, we can even have things which are semi-constructed, we call [cold] (ph) shelves and so on. So we know how to manage our CapEx spend to build out a long-term asset and a lot of the hydration of the kit happens when we have the demand signal.
CFO Amy Hood followed up on Satya’s remarks with even more insightful information:
Being able to maybe share a little more about that when we talked about roughly half of FY2024's total capital expense as well as half of Q4's expense, it's really on land and build and finance leases, and those things really will be monetised over 15 years and beyond.
And they're incredibly flexible because we've built a consistent architecture first with the Commercial Cloud and second with the Azure stack for AI, regardless of whether the demand at the platform layer or at the app layer or through third parties and partners or, frankly, our first-party SaaS, it uses the same infrastructure. So it's long-lived flexible assets.
Satya placed particular emphasis on Microsoft closely monitoring AI demand signals and deploying incremental CapEx dollars proportionately. On the call, I saw some interesting signals representative of an underlying shift: Microsoft's productivity tools are enabling further productivity gains through AI. Across the entire Microsoft product and service suite, AI's impact is most visible within GitHub. As of Q4 FY2024, over 77,000 organizations have adopted GitHub Copilot, the AI service that enables developers to essentially automate code production. Copilot accounted for 70% of GitHub revenue growth YoY, and it now generates more revenue than all of GitHub did when Microsoft acquired it in October 2018.
GitHub is a relatively small corner of the Microsoft productivity ecosystem. Still, the mechanics are fundamentally applicable to any other productivity tool. In highly simplified terms, Copilot picks up data on user activity and learns to automate tasks that users perform repeatedly in varying contexts. I don't see why the Copilot-driven outperformance in GitHub won't apply to all other productivity surfaces in time. Management claims to be seeing signals that point to office tools heading in the same direction as GitHub. Here's what Satya said about this on the Q2 call:
So to me, look, at the end of the day, GenAI is just software. So it is really translating into fundamentally growth on what has been our M365 SaaS offering with a newer offering that is the Copilot SaaS offering, which today is on a growth rate that's faster than any other previous generation of software we launched as a suite in M365.
That's, I think, the best way to describe it.
I mean the numbers I think we shared even this quarter are indicative of this, Mark. So if you look at it, we have both the landing of the seats itself quarter-over-quarter that is growing 60%, right? That's a pretty good healthy sign.
I expressed a view similar to Satya's on AI when explaining why Palantir will be bigger than Tesla. I explained why compute will continue growing exponentially with or without AI, as evidenced by the graph below, first shared by futurist Ray Kurzweil. The graph shows that humanity has been increasing compute per dollar since the 1930s and that, whether or not the current incarnation of AI lives up to its hype, demand for compute will continue its exponential ascent.
AI confuses investors, who think of it as a mysterious entity detached from compute's historical growth. On the contrary, Microsoft management correctly sees it as software that increases productivity and simply requires more compute. That Microsoft will require exponentially more compute a decade from now is all but certain. Hence, the datacenter buildout is likely justified so long as the assets are depreciated over a sufficiently long time horizon. Within that context, the rate at which Microsoft hydrates new compute instances within each datacenter depends, as always, on demand signals.
Microsoft is ideally positioned to understand the types of compute that customers across the globe are demanding, as are the other hyperscalers. Within Azure (Microsoft's cloud platform), Microsoft offers Azure AI, a collection of artificial intelligence services and tools that enable businesses to build, deploy, and manage AI models and applications at scale. Other hyperscalers have similar offerings, which provide a real-time read on the extent to which businesses demand AI. In Q4 FY2024, Azure AI customers totaled 60,000, up 60% YoY. Within Azure AI, the number of paid models-as-a-service customers more than doubled QoQ, according to management.
Hyperscalers are unlikely to waste datacenter CapEx because demand for compute is rising rapidly. Another thing that caught my eye in the Q4 FY2024 report: Microsoft is now aiming to produce movies and shows from its gaming IP (intellectual property). You may remember that in my original Netflix deep dive, written in March 2023, I explained how Netflix could greatly increase its ARPU (average revenue per user) over time by increasing the surface area of its IP, namely by turning its original productions into games and distributing them to its audience. If you're a Netflix user, you'll know they're very much executing on this front. It will be interesting to track Microsoft's progress in the other direction, which should yield valuable lessons pertinent to Netflix.
Folks are rather worried that this is 1999 all over again. On the contrary, across the board I see companies paying close attention to whether AI is actually driving value. In my last Meta update I highlighted how AI is driving major engagement gains across its network. In Microsoft's case, it's obvious that copilots are going to make the company's existing and future productivity offerings more effective. Since the additional productivity stems from a unified hardware and software stack, from which Microsoft serves both cloud and end-application customers, I believe the company is likely to see enhanced margins over time as it continues to deploy more AI capacity.
⚡ If you enjoyed the post, please feel free to share with friends, drop a like and leave me a comment.
You can also reach me at:
Twitter: @alc2022
LinkedIn: antoniolinaresc