Nebius: The Winner in the Era of LLM Commoditization
$NBIS Q3 2025 Earnings Update
The content of this analysis is for entertainment and informational purposes only and should not be considered financial or investment advice. Please conduct your own thorough research and due diligence before making any investment decisions and consult with a professional if needed.
Nebius recently signed two major deals with big cloud companies: one worth $17.4–$19.4 billion with Microsoft and another worth roughly $3 billion with Meta. These agreements validate Nebius's position in the AI infrastructure market, and they give the company the capital and credibility to grow revenue quickly and build more infrastructure.
But the real future for Nebius is with enterprise customers. I hope they come to make up over 70% of the business as soon as possible.
To explain what I mean and why, here are some key quotes from Nebius leaders.
Marc Boroditsky: “For our business model, it’s really important for us to be able to not only service large tech companies, but also be able to support our AI cloud and a very diverse set of customers. As a matter of fact, servicing start-ups and software vendors and enterprises is not only about delivering on their capacity needs today. We want to build partnerships with these customers and help them to meet their capacity requirements in the future, especially with enterprises because they don’t want to actually have a multitude of vendors. They prefer to align with a strategic partner.”
Arkady Volozh: “The situation of unbalanced demand supply is temporary, of course, eventually demand supply will level up. And what we are doing in addition just to growing this raw capacity we are building our AI cloud, which will support real businesses, real industries, real enterprise market where AI will be creating value. And we are big believers that the AI industry in general in our sector specifically, it’s going to be okay.”
In short, Nebius doesn't just want to work with the big cloud companies; it wants to become one.
I expect this shift to accelerate in 2026, with several large enterprise deals landing.
Marc Boroditsky: “We are adding a number of key leaders to our organization and we are expanding the overall sales organization for coverage in enterprise software vendors and key verticals. It will take some time for the sales team to ramp, but we are building the foundation between the functionality that I mentioned and the overall team coverage that I think will set us up for a strong 2026 with enterprises.”
Table of Contents
Q3 2025 Update
Lorenzo2cents (L2C) takeaways and performance
Business Ontology Framework by L2C
Business Ontology
4D Valuation Model
L2C portfolio strategy
Q3 2025 update
The Changing World of AI Providers
The more I learn about Nebius and how AI data centers are changing, the more sure I feel about this investment. Many people now think large language models (LLMs) will become basic tools that anyone can use cheaply—at least for most business AI tasks.
LLMs are moving fast toward becoming commodities. Base models will be widely available, largely interchangeable, and competed on price and access—like cloud computing or computer chips today. This is speeding up in 2025–2026 because costs are dropping, open-source options are growing, and tools are getting better.
I believe the market is splitting into two. High-end, private models from companies like OpenAI, Anthropic, Google, and xAI will stay valuable for critical uses—like spotting cancer in scans right away or catching fraud in fast trading. But open-source models will take over for high-volume, low-cost tasks.
Nebius is certainly set to win big in that open-source space—and the future looks very good.
Why Big Cloud Companies Aren’t Meeting Business and Startup Needs
Picture this: you're a large enterprise or AI startup choosing an AI cloud provider. What would make you commit to one long-term? No fear of price hikes, and no painful migrations when you switch AI models. Here's what you'd want:
The lowest cost per token with clear pricing—so no slow price increases that hurt your profits.
Strong enterprise-level infrastructure with a roadmap that matches your future needs, like more capacity or security tooling.
Easy access to top open-source LLMs, so you can switch models easily for different tasks.
Great speed and reliability in services and computing.
The top players—AWS, Azure, and Google Cloud—only earned Silver or Bronze in the November 2025 ClusterMax 2.0 review of 84 GPU providers. They fall short on price stability, lock-in risk, and ease of model switching for businesses and startups. Their general-purpose, shared infrastructure is built to bundle services for margin, which makes deep fixes hard without disrupting their huge existing businesses.
How Nebius Will Lead in the World of Commoditized LLMs
Nebius is ready to take a leading spot in this world of commoditized LLMs. It earns a Gold rating in ClusterMax 2.0 (just behind CoreWeave), and its product roadmap keeps evolving to fit what customers want. The new Nebius Token Factory launch? It's a big win. This purpose-built platform helps businesses and AI companies use LLMs in the easiest and most efficient way.
Here’s how Roman Chernin, Head of AI Cloud, explains it:
I will start a little bit from demand evolution. We fairly see now the next wave of AI demand growth. And it’s mostly driven by the companies, by the people who apply AI to real-world applications across all industries in B2C and B2B. It’s not necessarily foundational model builders like it was, let’s call it, in the first wave. And we, as Nebius realized that we needed our inference-as-a-service offering to make it serve a broader set of customers, including enterprises. So Token factory gives vertical AI product builders, ISVs and enterprises a platform to build, we call it Flywheel of applying LLMs and vertical AI use cases at scale, transforming -- we help them to transform open source models into optimized production-ready systems with guaranteed performance and transparent cost per Token.
We obviously leverage the underlying infrastructure to bring the most efficient, scalable solution to our customers when they can be sure that they get the best total cost of ownership and can confidently grow with us. So as a result, organizations can deploy and scale models such as OpenAI OSS, Qwen, DeepSeek, Llama, Nemotron and many others on dedicated endpoints with guaranteed performance tuned for super-low latency and 99.9% uptime. So in total, I must say we are excited about the opportunity of inference workloads. We believe that all companies will invest in inference to productize AI. And for us, like it will require significantly more compute and will support this wave of growth as well as we do for foundational model builders.
Nebius has built just what customers need: tools to create and scale AI workflows, agentic systems, and even autonomous businesses—without runaway costs or products falling behind.
L2C Takeaways and Performance
Nebius is on the right path, and this plan should pay off handsomely in 2026. Revenue grew over 30% quarter over quarter in Q3 2025, reaching $193M, and management guides for $7–9 billion in annualized run-rate revenue by the end of 2026. If they hit that and sustain triple-digit growth, the stock should re-rate sharply—especially with AI compute demand this strong.
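To put that guidance in perspective, here is a back-of-the-envelope check. It assumes the $193M Q3 figure annualizes linearly to a run rate (a simplifying assumption; the company's own ARR definition may differ):

```python
# Back-of-the-envelope check on the end-of-2026 guidance.
# Assumption: Q3 2025 revenue of $193M annualizes linearly (x4).
q3_revenue_m = 193.0                       # Q3 2025 revenue, $M (from the update)
current_arr_b = q3_revenue_m * 4 / 1000    # implied annualized run rate, $B

guidance_low_b, guidance_high_b = 7.0, 9.0  # guided ARR range for end of 2026, $B

# Implied growth multiple from today's run rate to the guided range
low_multiple = guidance_low_b / current_arr_b
high_multiple = guidance_high_b / current_arr_b

print(f"Current implied run rate: ${current_arr_b:.2f}B")
print(f"Growth needed to hit guidance: {low_multiple:.1f}x to {high_multiple:.1f}x")
```

On these assumptions the current run rate is roughly $0.77B, so the guidance implies growing the run rate about 9x–12x in a bit over a year—which is why hitting it would be such a powerful catalyst.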
As always, here’s the Deep Dive To Date (DDTD): Stock performance since my first deep dive and when I bought in on July 28, 2025, at $51.63.
+74% DDTD

From here on, the content is restricted to L2C Premium Members, folks who've chosen to unlock this toolkit and support my independent research:
Business Ontology Framework by L2C
Business Ontology: My core blueprint for modeling and tracking company performance at every level.
4D Valuation Model: The valuation tool I use to value all my investments (a single "fair price" is useless).
L2C Portfolio Strategy: My portfolio allocation and strategy in detail.
L2C Portfolio access & trade alerts: Real-time views into my holdings, plus instant notifications on buys, sells, and shifts.



