Edited by Brian Birnbaum.
SMCI is up 17X since early 2022, which raises the question: is the stock worth the hype? In my month-long research on SMCI, I found some answers.
1.0 Charles Liang’s Obsession
SMCI is far more focused than its two biggest competitors on decreasing time to value and total cost of ownership for AI server customers. This is why it’s winning and will likely continue to win.
However, SMCI’s exploding capital requirements pose a risk to investors, which I cover toward the end.
SMCI’s maniacal focus on increasing the performance and decreasing the cost of AI servers is unmatched. This has afforded the company an earning power that grants the stock vast upside over the long term as AI goes mainstream, and has already done so for savvier holders. I reached this conclusion after an in-depth analysis of the AI server businesses of its closest rivals, HPE (Hewlett Packard Enterprise) and Dell.
Although servers are complicated machines, much of the information required to make an investment decision can be inferred by closely analyzing the attitudes of those involved in their creation, refinement, and distribution. I have noticed stark differences between the attitudes of the SMCI, HPE, and Dell management teams. In turn, these differences correlate strongly with the deltas in their respective financials.
The attitudes of the management teams determine the pace at which the respective value propositions improve over time. The management team most focused on the core value drivers of the AI servers is likely to achieve greater market share and make more money over time. Ultimately, customers only care about obtaining more computation, in less time, for less money.
In the graph below you can see how, over the last four quarters, SMCI’s revenue from AI server sales has been considerably higher than HPE’s and Dell’s, accounting for 48.6% of all SMCI sales. Because the company has not disclosed the exact figure for each quarter, I have used this average as an approximation.
In their quarterly conference calls, the HPE and Dell management teams talk only about the potential for upsells and cross-sells. They also repeatedly discuss how GPU shortages are slowing the growth of their respective businesses. In their latest calls, neither team made a single mention of giving AI server customers more computation in less time and for less money.
On the other hand, SMCI’s founder and CEO, Charles Liang, talks only about decreasing time to value and total cost of ownership for customers. When an analyst asks a question, Charles is most likely to reply by simply re-emphasizing the above. Indeed, Charles and SMCI have been doing this for 30 years, while HPE and Dell joined the party only a few years ago.
This fast-growing quarter was driven by end users wanting to accelerate their deployment of the latest generation AI platforms. Through our Building Block Solutions, we provide optimized AI solutions at scale, offering a time-to-market advantage and shorter lead time over our competition.
-Charles Liang, CEO and Founder during the Q3 FY2024 earnings call.
Although Dell looks to be in better shape than HPE, both companies lack the focus on enhancing the value proposition of AI servers for customers that is required to compete with SMCI. AI servers are an existential priority for SMCI, yet seemingly only a strategic one for HPE and Dell. This qualitative impression is backed by the numbers, too.
In its latest quarter, Q3 FY2024, 50% of all of SMCI’s revenue came from the sale of AI GPU platforms. In turn, Dell’s AI-optimized server revenue came in at $800M in Q4 FY2024, just 3.8% of all sales. Further, AI server orders accounted for nearly 25% of all server orders for HPE in Q1 FY2024, down from 32% in the previous quarter.
SMCI is displaying yet another instance of the Innovation Stack: a small company successfully taking on two much larger companies by focusing far more on the task at hand. However, high-performance computing has historically offered difficult profits, with the exception of Nvidia, thanks to its software moat.
In the next section I analyze how SMCI is overcoming this challenge in the face of skyrocketing capital requirements. As you can see in the graph below, SMCI’s CapEx increased spectacularly in the last quarter and, per management’s comments during the Q3 FY2024 call, it will continue rising fast:
So, the way I would answer that is that I hope that I need more capital, Jon, because that means that we're growing revenues even faster.
-David Weigand, SMCI CFO.
2.0 Exploding Capital Requirements
SMCI’s capital requirements are exploding as the business grows, but shareholders will do well if SMCI finds a way to increase operating leverage regularly.
Increasing operating leverage with a regular cadence will enable SMCI to buffer the rising capital requirements, within reasonable limits. SMCI is already succeeding on this front thanks to its Server Building Block Solutions architecture, which allows it to more rapidly iterate on, personalize, and deploy its servers. This modularity is the source of the time-to-value and cost advantage that Charles Liang mentions so often.
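To make the operating-leverage idea concrete, here is a minimal sketch of how the degree of operating leverage is typically computed. The figures are hypothetical, chosen only for illustration, not SMCI's actual numbers:

```python
def operating_leverage(rev_growth_pct: float, op_income_growth_pct: float) -> float:
    """Degree of operating leverage: % change in operating income
    per % change in revenue. A value above 1 means profits grow
    faster than sales, i.e. the business is scaling efficiently."""
    return op_income_growth_pct / rev_growth_pct

# Hypothetical: revenue grows 40% while operating income grows 80%.
dol = operating_leverage(40.0, 80.0)
print(dol)  # 2.0 -> each 1% of revenue growth yields ~2% operating income growth
```

A company that keeps this ratio above 1 while revenue compounds is absorbing fixed costs (like new factories) faster than it is adding them, which is the buffering effect described above.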
Our in-house design competencies, design control over many of the components used within our server and storage systems, and our Server Building Block Solutions® (an innovative, modular and open architecture) enable us to rapidly develop, build and test our compute platforms along with our server and storage systems, sub-systems and accessories with unique configurations.
-SMCI’s FY2023 10-K.
SMCI’s margins are trending up even as conditions are far less favorable for its competitor Dell, which is itself in far better shape on AI servers than HPE. This tells me SMCI is successfully increasing its operating leverage. Additionally, net income is growing exponentially, which suggests that SMCI has no trouble scaling the source of that leverage.
Reading Dell’s latest earnings call transcript (Q4 FY2024), I noticed that the company guided for lower margins next quarter as AI servers become a bigger part of the mix. During the quarter, AI accounted for only 5% of overall revenue, yet management guided for margins to decline 100 basis points QoQ. An analyst remarked during the call that for such a small volume of business to drive gross margins down, AI margins had to be highly dilutive.
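A back-of-the-envelope check of the analyst's point, using only the two figures above (a ~5% AI revenue mix and a 100-basis-point guided margin decline). The implied margin gap is my own estimate from blended-margin arithmetic, not a figure Dell disclosed:

```python
# Blended gross margin: m_blended = (1 - w) * m_core + w * m_ai
# If adding a weight w of AI revenue moves the blend down by d, then:
#   w * (m_ai - m_core) = -d   =>   m_core - m_ai = d / w
ai_mix = 0.05           # AI servers' approximate share of Dell's revenue
margin_decline = 0.01   # 100 basis points of guided gross margin decline

implied_gap = margin_decline / ai_mix
print(implied_gap)  # 0.2 -> AI server margins ~20 points below the rest of the mix
```

That roughly 20-point implied gap is why such a small slice of revenue can move the whole blend, and it is the quantitative version of the analyst's remark.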
The CFO’s response was inconclusive and vague, which again points to SMCI’s competitive advantage. The modularity of SMCI’s architecture will continue to pay dividends so long as SMCI stays the course, with competitors seemingly far behind and distracted by other businesses. My impression is that HPE is most focused on networking, while Dell is keen on AI PCs.
Further, I notice a cyclicality in the progression of gross margins similar to Tesla’s. I therefore suspect that SMCI periodically decreases prices every time it attains a new efficiency threshold, passing efficiency gains back to customers in the form of lower prices. I have not seen management talk about this, but I will look for confirmation of this hypothesis going forward. Sharing economies of scale makes it even harder for competitors to gain market share, strengthening the moat.
This may also be due to competitive pressures, but I need more time to know for sure.
Over the long term, SMCI can exponentiate its operating leverage and the strength of its moat by creating network effects. Making money in the high-performance computing market is difficult because development costs are high and the market is relatively niche, which makes selling hard. Network effects can make selling easier for SMCI and harder for competitors.
SMCI can develop network effects by creating the default operating system for proprietary datacenters. Next-gen datacenters are stateful, meaning that they hold information about their own state. This data can be used to train neural nets that automate processes within the datacenter. If SMCI manages to create a software architecture analogous to its successful hardware architecture, much higher levels of operating leverage will ensue.
Nvidia is a prime example of how to exponentiate operating leverage in the semi space via software. By making it painless for developers to interact with Nvidia GPUs via CUDA, Nvidia endows its installed base of GPUs with network effects. CUDA effectively connects all of the deployed Nvidia GPUs and turns them into a network. Every additional GPU sold makes the network bigger, which funnels more capital toward Nvidia, which in turn helps Nvidia make CUDA better, and so on.
3.0 Financials
Assuming its organizational properties remain, SMCI’s financial health depends on the ability of AI models to generalize as we add more parameters to them. This relationship seems set to persist.
Income Statement
SMCI’s financial performance is primarily a result of folks wanting their own datacenters to run AI operations, coupled with the aforementioned operating leverage. This desire has been driven by the relationship between the number of parameters in a model and its ability to do things we didn’t train it for.
In the graphs below, you will notice that the inflection point in SMCI’s revenue coincides with the inflection point in the number of parameters for milestone models across the board, not just for language models. As the number of parameters in these models continues to grow exponentially, the data used to train them is gradually becoming the most important asset for organizations.
This is what is driving demand for SMCI’s servers and datacenters.
The industry knows how to keep exponentiating the number of parameters in these models: from here on, it is an engineering challenge. For this reason, I do not believe parameter growth will be a roadblock. So long as the relationship between the number of parameters and intelligence (the ability to generalize) holds, folks will likely continue demanding proprietary datacenters.
However, this does not mean that SMCI will not experience inventory corrections, or that the evolution of AI will be a straight line up. AI may well go through a dot-com phase, and my analysis is framed within a ten-year-plus investment horizon.
I believe the intelligence of these models will continue going up over time, with organizations taking more measures to protect the data that fuels them. Additionally, just over 50% of SMCI’s demand is driven by GPU servers, which companies use to train models. SMCI’s top line has considerable upside as AI inference gets even bigger than AI training.
The ultimate proof of SMCI’s advantage in the market is how both the top and bottom lines have scaled with the onset of these larger models while competitors’ margins falter. Operating and net income are rising in tandem with revenue, strong evidence that SMCI’s Building Block Solutions architecture is structurally superior to alternatives.
Cash Flow Statement
SMCI experienced a significant drop in cash from operations in the last quarter primarily due to increased working capital requirements, particularly a substantial rise in inventories and accounts receivable. In Q2 FY2024, the company reported a cash outflow from operations of $595 million. This is a great source of uncertainty in the thesis.
SMCI is expanding its manufacturing capabilities with a new factory in Malaysia. This facility is part of a broader strategy to enhance its global manufacturing footprint and increase production capacity. The cost of the Malaysia factory is estimated at $800 million, which adds to the company’s mounting capital requirements alongside the weight of inventories.
Balance Sheet
As of Q2 FY2024, SMCI had a robust net cash position. The company reported cash and cash equivalents of $2.1 billion following an issuance of convertible notes amounting to $1.7 billion.
The short-term debt decreased to $81.6 million from $276.3 million in the previous quarter, while long-term debt also decreased to $85.6 million from $99.3 million.
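A quick sanity check of the net cash position, using only the balance sheet figures above. This is my own arithmetic, not a company-disclosed figure:

```python
# Figures from SMCI's Q2 FY2024 balance sheet, in $ millions
cash = 2100.0            # cash and cash equivalents
short_term_debt = 81.6
long_term_debt = 85.6

# Net cash = cash minus total debt
net_cash = cash - (short_term_debt + long_term_debt)
print(net_cash)  # ~1932.8 -> roughly $1.9B of net cash
```

Roughly $1.9 billion of net cash against rapidly rising CapEx and working capital needs is what makes the position "robust" today but worth re-checking every quarter.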
4.0 Conclusion
Excellent engineering, uncertain execution.
I need more time to be certain, but I believe SMCI has a clear competitive advantage over HPE and Dell. This is visible in the management team’s remarkably superior focus on AI servers and in the scalable income statement. I suspect both HPE and Dell are years away from having anything that resembles SMCI’s modular architecture.
So long as the demand for proprietary datacenters persists, this competitive advantage can lead to considerable top line growth for SMCI. However, the cash flow statement and the balance sheet are murky. Management is now faced with the task of deploying amounts of capital that are unprecedented in the company’s history. Initiating a position now requires making an assumption about management’s ability to do so.
At the moment, while I know I can rely on SMCI’s engineering excellence, I do not have enough information to assess their ability to scale the operation into the big leagues. As previously mentioned, SMCI’s modular architecture will be a big tailwind which should help SMCI buffer much of the capital requirements and mistakes it may make along the way.
Over the coming quarters, I will be analyzing the company closely to learn more.
Until next time!
⚡ If you enjoyed the post, please feel free to share with friends, drop a like and leave me a comment.
You can also reach me at:
Twitter: @alc2022
LinkedIn: antoniolinaresc