Edited by Brian Birnbaum.
Mongo’s database schema has an edge in AI applications. As more AI apps come to the market, Mongo stands to benefit considerably.
MongoDB seems to have a database architecture that’s uniquely suited to the requirements of AI apps. If AI takes off, MongoDB has considerable runway beyond its $1.96B revenue run rate over the trailing twelve months. Mongo’s competitive advantage is the flexibility of its schema, which is considerably greater than that of SQL databases. AI applications require complex and rich data structures, together with lots of contextual real-time data. While SQL databases work well for Web 2.0 apps, they don’t quite work for AI apps.
SQL stands for Structured Query Language; it allows developers to store and retrieve data in databases made of tables. These tables are rigid, and once you build an application, changing their structure is usually a nightmare. AI applications have to deal with different data types–source data, vector data, metadata, and generated data–right alongside the live operational data.
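To illustrate the difference, here is a minimal Python sketch using plain tuples and dicts (no actual database; all names and values are hypothetical): a relational row has a fixed set of columns, while a document can hold source data, vector data, metadata, and generated data side by side, and gain new fields without a migration.

```python
# A rigid SQL-style row: a fixed set of columns. Adding a new field
# means an ALTER TABLE migration and, often, application rewrites.
sql_row = ("user_42", "Ada", "ada@example.com")  # (id, name, email) only

# A MongoDB-style document: source data, a vector embedding, metadata,
# and generated output live together in one flexible record.
ai_document = {
    "_id": "user_42",
    "name": "Ada",
    "email": "ada@example.com",
    "source_text": "Support ticket: login fails on mobile.",
    "embedding": [0.12, -0.48, 0.33],           # vector data for similarity search
    "metadata": {"channel": "mobile", "lang": "en"},
    "generated_summary": "User cannot log in on mobile.",
}

# No migration needed: a single document can be enriched with a new field.
ai_document["sentiment"] = "negative"
```

The point of the sketch is the last line: enriching one document with a new field requires no schema change, whereas the tuple-shaped row cannot grow without restructuring every row in the table.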
Additionally, AI applications run on inference, which means minimizing latency is critical. Optimizing latency for an AI application is a near-impossible task for a SQL database, because it’s hard to modify the schema at all. This means that once AI applications take off, we should see MongoDB rise in popularity.
As you can see in the graph below, SQL databases remain the most popular among developers. This is because Web 2.0 applications still account for the majority of apps on the market, and AI applications still haven’t taken off; the vast majority of AI CapEx is going toward infrastructure. CEO Dev Ittycheria explained this during the Q3 FY2025 earnings call:
At the start of the fiscal year, we told you that we didn't expect AI to be a meaningful tailwind for our business in fiscal year 2025, which has proven accurate.
[…]
Companies are currently focusing their spending on the infrastructure layer of AI and are still largely experimenting with AI applications.
It’s notable that, while we wait for AI applications to hit the market, Mongo remains the most popular database after the dominant SQL types: PostgreSQL, MySQL, SQLite, and Microsoft SQL Server. Among developers, Mongo is far more popular than alternatives from larger companies such as Oracle, Amazon (DynamoDB), Snowflake, and Databricks. Tentatively speaking, such preference points to a potential Innovation Stack.
Indeed, in FY2016, MongoDB brought in just $65.3M in revenue. It has nonetheless managed to fend off competition from Oracle and Amazon, and now trails only Microsoft. In Q4 2017, then-AWS CEO Andy Jassy said that Mongo was “kicking butt” on AWS. Since then, Mongo has increased its yearly revenue 29-fold. Dev Ittycheria has been Mongo’s CEO since 2014 and, although he’s not a founder-operator, the company has done well under his watch.
The above track record is further indicative of the competitive advantage of Mongo’s database schema. It’s also indicative of an organizational ability to iterate on the schema and overall platform faster than competitors such as Amazon’s DynamoDB, which makes Mongo a company worth watching as AI apps make their way to the market. Mongo is also an interesting barometer to gauge whether AI infrastructure spend does translate into useful apps.
For now, AI apps are not gaining widespread traction. That’s worth keeping an eye on. Mongo CEO Ittycheria shared some valuable insights in the Q3 FY2025 earnings call:
What we don't yet see is many of these apps actually achieving meaningful product market fit and therefore significant traction. In fact, as you take a step back and look at the entire universe of AI apps, a very small percentage of them have achieved the type of scale that we commonly see with enterprise-specific applications.
We do have some AI apps that are growing quickly, including one that is already a seven-figure workload that has grown 10x since the beginning of the year.
However, in Q1 FY2025, consumption of their database services grew slower than expected. In my view, communication of the slowdown was poor. At the start of the earnings call, management either couldn’t or wouldn’t pinpoint whether the slowdown was due to macroeconomic or internal factors. Towards the end of the call, the CFO defaulted to internal factors. In Q3 FY2025, CEO Ittycheria announced that the CFO was leaving the company.
The stock is currently down 47% since the call, even though cash from operations has been growing considerably, per the graph above. Although this looks like an opportunity, the Q1 incident dampens my view of the management team. Growth has resumed following the Q1 slowdown, but the handling of the situation was anaemic, and it’s hard to invest in the company when there are other companies with extraordinary management teams, such as Palantir, AMD, Tesla, and Hims.
Despite its popularity with developers, Mongo still has a low-single-digit market share. By rolling out its community product first and then adding the features that work to its enterprise offering, Mongo fosters a great relationship with developers and is able to leverage a bottom-up approach to sales. Developers have great power within organizations when it comes to deciding which tools will be used. The strength of this distribution channel, together with the superiority of the schema, has been the main driver of the financial performance over the past decade.
Generative AI paves the path for exponential growth for Mongo.
To change your database schema, you often have to rewrite your entire application code. The cost of rewriting apps has stalled Mongo’s growth by making it harder for companies to update their legacy systems. Generative AI is now making that process far easier, exponentially reducing costs over time. As a result, Mongo now sees its highest ROI in the enterprise channel.
Mongo is likely to accelerate its distribution and increase its margins as generative AI improves. Much like Okta, we see here an instance of the Nvidia algorithm: the operational framework that explains some of the most remarkable corporate triumphs of this era, such as Nvidia and Palantir. This powerful mental model is part of the framework that I use to pick winners in tech early, which I teach to my students in my Tech Stock Goldmine course.
When an organization dedicates decades to advancing a technology that dramatically improves humanity's capacity to process information or leverage energy, a sudden technological leap that makes distribution far easier can ignite exponential growth that appears to arise out of nowhere. In truth, these companies laid the groundwork long before reaching their critical turning points.
Mongo is executing the Nvidia algorithm depicted below. They have been working on a superior database schema since 2007. Now, the rise of AI makes their schema even more useful and makes migration exponentially more convenient for customers, as the cost of writing code with AI drops. If Mongo’s database is as hard to replicate as the progression of its financials over the past decade suggests, generative AI is likely to multiply Mongo’s earning power over the coming years.
In Q1 FY2025, Mongo completed two modernization pilots enabled by generative AI, which allegedly greatly reduced the time, cost, and risk of modernizing legacy applications. Management stated in the earnings call that they believed the pilots reduced the effort required to migrate to Mongo’s schema by 50%. In the Q2 FY2025 earnings call, Mongo CEO Dev Ittycheria shared some insights on how this changes the game for Mongo–and notice how he uses the term “sudden”, which perfectly captures how the Nvidia algorithm yields exponential growth as if by apparition:
So, what's been compelling about AI is that AI has finally created a shortcut to overcome that big hurdle.
And so essentially you can start basically diagnosing the code, understand the code, recreate a modern version of that code, and generate test suites to make sure the new code performs like the old code.
So, that definitely gets people's interest because now all of a sudden what may take years or multi-years, you can do in a lot less time. And the pilots that we have done, the time and cost savings have been very, very compelling.
According to management, the enterprise channel exceeded expectations in Q3 FY2025. Ittycheria shared during the call that the results from the pilots are driving “additional customer interest” that’s exceeding expectations. It’s interesting to see how the onset of the pilots correlates with the rapidly improving cash flow profile, which I include again below for your convenience.
Although OpEx as a percentage of revenue has declined from 85.72% to 79.20% YoY, the respective earnings calls make no mention of efficiency measures. However, over the last two quarters Mongo has also leaned into the enterprise channel, investing more in enterprise sales in an effort to move up-market. As such, the improved cash flow profile is primarily the result of stronger traction on the enterprise side, which bodes well for the pilots.
The above dynamic should make the rise of AI apps far more accretive to Mongo by making distribution easier and thus converting more revenue to the bottom line. Over the coming years, pilots will likely become increasingly productized, as has been the case with Palantir’s AIP bootcamps. The latter started off as highly manual live events but now can increasingly be accomplished anywhere, anytime. Should Mongo’s pilots go down a similar path, generative AI should put Mongo on an exponential growth trajectory.
Mongo’s customers are pursuing proprietary data moats.
According to the CEO, one of the main reasons companies are modernizing their applications is to leverage their proprietary data. One of the main premises I teach in my Tech Stock Goldmine course is that proprietary data will be the moat of the 21st century. Sole possession of such data enables you to train AI models that no one else can, and thus remain competitive. With AI improving exponentially by the month, the only way to stay alive is through a data advantage.
Here’s what Dev Ittycheria said about Mongo customers pursuing proprietary data moats in Q2 FY2025:
Fourth, because GenAI is so predicated on data and to build a competitive advantage, you need to leverage your proprietary data, people want to access that data and be able to do so easily.
Interestingly, Ittycheria believes general-purpose LLMs (large language models) will win. He also believes that people will differentiate their models on proprietary data. When I wrote my Chegg deep dive earlier this month, I was thinking more along the lines of companies training their own LLMs from scratch–so I welcome opening my mind up to this possibility, which will help me read competitive environments better.
Here’s what Ittycheria said about this during the Q2 FY2025 earnings call:
There are some questions about LLMs, whether a general purpose LLM or a fine-tuned LLM, what the trade-offs are.
Our belief is that given the performance of LLMs, you're going to see the general purpose LLMs probably win and will use RAG as the predominant approach to marry generally available data with proprietary data.
RAG stands for Retrieval-Augmented Generation, and it looks like we’ll be seeing plenty of it over the coming decade. RAG is a method by which an AI retrieves relevant information from external sources to generate more accurate and context-aware responses. After all, why build a new model from scratch when you can use one that already masters natural language and simply teach it a differentiating skillset? The cost of the latter option is bound to be lower, likely yielding better unit economics.
RAG works by combining two key steps: retrieval and generation. When a user asks a question, the AI first searches an external knowledge base or document store to retrieve relevant information. It then feeds both the original question and the retrieved data into a language model, which generates a response that combines the contextual understanding of the query with the factual content from the retrieved sources. This approach ensures more accurate, current, and context-aware answers while reducing the model's reliance on outdated or limited internal knowledge.
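The two steps above can be sketched in a few lines of Python. Everything here is illustrative: the knowledge base is made up, retrieval is naive keyword overlap rather than vector search, and the generation step is a stub standing in for a real LLM call.

```python
def tokenize(text):
    """Lowercase and strip basic punctuation to get a set of words."""
    return {word.strip(".,?!").lower() for word in text.split()}

def retrieve(query, knowledge_base, top_k=1):
    """Step 1 (retrieval): score each document by word overlap with the query."""
    query_words = tokenize(query)
    scored = sorted(
        knowledge_base,
        key=lambda doc: len(query_words & tokenize(doc)),
        reverse=True,
    )
    return scored[:top_k]

def generate(question, context):
    """Step 2 (generation): stub; a real system would prompt an LLM
    with both the question and the retrieved context."""
    return f"Q: {question}\nContext: {' | '.join(context)}"

# Hypothetical document store.
knowledge_base = [
    "MongoDB stores data as flexible JSON-like documents.",
    "SQL databases organize data into rigid tables.",
]

question = "How does MongoDB store data?"
context = retrieve(question, knowledge_base)
print(generate(question, context))
```

A production pipeline would swap the keyword overlap for embedding similarity over a vector index, but the shape is the same: retrieve first, then feed question plus context to the model.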
I like MongoDB, but I need to learn more about databases.
Although Q1 FY2025 makes me question the strength of the management team, Mongo’s financial performance over the past decade is suggestive of a strong technical moat. In turn, the rise of AI is set to benefit Mongo in two key ways: by increasing the demand for its flexible database schema; and by making migration to said schema exponentially easier. This promises to more than 10X revenue again over the coming decade, while yielding higher margins.
As the generative-AI-driven pilots get productized over time, they should generate a proprietary sub-dataset that will likely enable Mongo to further automate migrations in a way that’s increasingly harder to replicate. Over the long term, this flywheel can potentially increase Mongo’s earning power far beyond expectations.
The major risk is Mongo being disrupted by another player that provides a more flexible database schema at a lower cost. For now I don’t see a notable competitor in this sense, but in technology there’s always a lot that we don’t know that we don’t know. I need to learn more about databases to truly discern Mongo’s competitive position. In the meantime, as previously mentioned, the data points to an Innovation Stack scenario.
Either way, I look forward to tracking Mongo quarterly.
Until next time!
⚡ If you enjoyed the post, please feel free to share with friends, drop a like and leave me a comment.
You can also reach me at:
Twitter: @alc2022
LinkedIn: antoniolinaresc