
Oracle conference call: AI business surges with $455 billion in orders, launching "AI Database" targeting the trillion-dollar inference market

Oracle disclosed a contract backlog of up to $455 billion, backed by AI industry giants such as OpenAI, xAI, and Meta. Chairman Larry Ellison stated plainly that AI is fundamentally changing Oracle, and that a trillion-dollar market larger than model training, AI inference, is the company's main battlefield for the future. At the same time, the company launched the "AI Database," aiming to seize the AI inference market by leveraging enterprise private data.
After the earnings report, the stock price surged 27%! Oracle is rewriting the market's traditional perception of it with its remarkable rise in the AI infrastructure field.
On September 9, during the latest earnings call, Oracle disclosed that its remaining performance obligations (RPO) have soared to $455 billion, a year-on-year increase of 359%, with $317 billion added in just the first quarter. This explosive growth is primarily due to the company's signing of large-scale cloud contracts with a series of top AI companies such as OpenAI, xAI, and Meta, making it a key infrastructure provider for AI model training. CEO Safra Catz stated, “Oracle has become the preferred destination for AI workloads.”
To meet the surging demand, Safra Catz announced an increase in the capital expenditure guidance for this fiscal year to approximately $35 billion, and provided an astonishing long-term forecast: Oracle's Cloud Infrastructure (OCI) is expected to grow 77% this fiscal year and continue to grow rapidly over the next four years, targeting a scale of over $100 billion.
Company Chairman and Chief Technology Officer Larry Ellison clearly pointed out during the call that AI is fundamentally changing Oracle and emphasized that the AI inference market will be “far larger” than the AI training market. He elaborated on the company's core strategy to seize the high ground in the inference market through its new “AI Database,” attempting to convince investors that Oracle's AI story is just beginning, with greater ambitions in the intelligent application of enterprise data.
Key Takeaways from the Earnings Call:
- Order Backlog Soars: Oracle's remaining performance obligations (RPO) grew to $455 billion, a year-on-year increase of 359%, with a quarter-on-quarter increase of $317 billion, reflecting extremely strong demand for AI-driven cloud infrastructure.
- Core AI Clients Confirmed: CEO Safra Catz explicitly stated that major AI companies such as OpenAI, xAI, and Meta have signed significant cloud contracts with Oracle, making it an important platform for AI training and inference workloads.
- Greater Potential in Inference Market: Chairman and CTO Larry Ellison suggested that the AI inference market will ultimately be “far larger” than the AI training market. Oracle is actively positioning itself in this larger market, leveraging its advantages in enterprise data management.
- Key Technology “AI Database”: Oracle has launched a new “AI Database,” whose core function is to enable AI models to understand enterprise private data through “vectorization,” allowing for advanced inference that combines private data with public information while ensuring data security.
- Building an Open AI Model Ecosystem: Oracle announced that its cloud platform integrates mainstream AI models such as ChatGPT, Gemini, Grok, and Llama, giving enterprise clients a wide range of options for connecting their private data.
- Significantly Upgraded Performance Guidance: Management has provided an extremely optimistic long-term outlook, expecting Oracle Cloud Infrastructure (OCI) to grow by 77% this fiscal year and to continue high-speed growth over the next four years, targeting a scale exceeding $100 billion.
- Corresponding Expansion of Capital Expenditure: To meet strong demand, Oracle expects capital expenditure (CapEx) to reach approximately $35 billion this fiscal year, primarily for the procurement of revenue-generating server equipment.
- Technological Differentiation: Ellison emphasized that Oracle's performance advantage in high-speed networking is key to achieving cost-effectiveness in AI training, stating that this can achieve "twice the speed at half the cost."
AI Orders Surge, RPO Soars by $317 Billion in a Single Quarter
The most striking data from Oracle's first fiscal quarter is not revenue or profit, but the astonishing growth of its Remaining Performance Obligations (RPO). RPO, a key indicator of future revenue, reached $455 billion this quarter, with a year-on-year increase of 359%. Even more shocking is that this figure surged by $317 billion compared to the end of the previous quarter, indicating the massive scale of its new contracts.
Behind this growth is Oracle's significant breakthrough in the AI training market. Safra Catz confirmed at the meeting that the company has signed important cloud contracts with "well-known companies in the AI field," including OpenAI, xAI, Meta, and others. The signing of these contracts means that Oracle's cloud infrastructure (OCI) has become a key platform for training cutting-edge AI models.
Based on strong demand and locked-in contracts, Safra Catz provided an extremely optimistic outlook for the future. She expects Oracle Cloud Infrastructure (OCI) to grow by 77% to $18 billion this fiscal year, and to sequentially grow to $32 billion, $73 billion, $114 billion, and $144 billion over the next four years. She emphasized, "Most of this revenue is already included in our $455 billion RPO."
Ellison's Ambition: From AI Training to the Trillion-Dollar Inference Market
Despite achieving great success in the AI training market, Larry Ellison's vision clearly extends further. He divides the AI market into two parts: AI model training and AI model inference, asserting that the latter's scale will far exceed the former.
"Training AI models is a massive market worth trillions of dollars," Ellison stated, "but if you look closely, you'll find an even larger market, which is AI inference." He believes that AI inference will be used in millions of business and government applications, including running robotic factories, autonomous vehicles, drug design, financial market betting, and automating legal and sales processes.
Ellison emphasized that while Oracle is actively pursuing the AI training market, it is also "actively pursuing the inference market." He believes Oracle has a unique advantage there because "Oracle is the world's largest custodian of high-value private enterprise data." This statement reveals Oracle's next strategic step: leveraging its traditional strengths in databases to combine AI capabilities with core enterprise data.
"AI Database": The "Core Weapon" to Seize the High Ground of Enterprise Data
In order to dominate the AI inference market, Ellison detailed a core weapon—the newly launched "AI Database" by Oracle. Its core logic is to enable enterprise customers to securely and conveniently utilize large language models (LLM) to analyze their private data.
"We have added a very important new way to store data, you can vectorize it," Ellison explained. Vectorization is a key technology that can transform various types of private data (such as financial, customer, and supply chain information) into a format that AI models can understand.
The strategy is divided into several key steps. First, through the new database functionality, customers can "vectorize" all their data. Second, Oracle Cloud Infrastructure (OCI) uniquely integrates the world's most advanced AI inference models, including OpenAI's ChatGPT, Google's Gemini, xAI's Grok, and Meta's Llama. Customers can directly connect their vectorized data to any LLM of their choice.
Once the connection is established, customers can pose complex business questions, such as "How will the latest tariffs affect next quarter's revenue and profits?" The large language model will combine the customer's private enterprise data with public data for advanced reasoning and provide answers. Ellison emphasized that the entire process "will not compromise the security of the customer's private data at all." He believes this is a feature that customers have been eager to achieve since the advent of ChatGPT, and Oracle will be the first company to realize it.
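The retrieval pattern described above can be sketched in a few lines of Python. This is an illustrative toy, not Oracle's actual API: the hash-based `embed` function stands in for a real embedding model, and names such as `VectorStore` and `build_prompt` are invented for this example. The point is the data flow: private records are vectorized once, the records closest to a question are retrieved by vector similarity, and only those records, not the whole database, are combined with the question into a prompt for the customer's chosen LLM.

```python
import hashlib
import math
import re

DIM = 512  # number of hash buckets in the toy embedding


def embed(text: str) -> list[float]:
    """Toy 'vectorization': hash each word into one of DIM buckets.
    A real system would use a learned embedding model instead."""
    vec = [0.0] * DIM
    for word in re.findall(r"[a-z0-9]+", text.lower()):
        bucket = int(hashlib.md5(word.encode()).hexdigest(), 16) % DIM
        vec[bucket] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]


def cosine(a: list[float], b: list[float]) -> float:
    # Vectors are already unit-normalized, so the dot product is the cosine.
    return sum(x * y for x, y in zip(a, b))


class VectorStore:
    """Holds (record, vector) pairs for the private enterprise data."""

    def __init__(self) -> None:
        self.items: list[tuple[str, list[float]]] = []

    def add(self, record: str) -> None:
        self.items.append((record, embed(record)))

    def search(self, question: str, k: int = 2) -> list[str]:
        qv = embed(question)
        ranked = sorted(self.items, key=lambda item: cosine(qv, item[1]),
                        reverse=True)
        return [record for record, _ in ranked[:k]]


def build_prompt(question: str, store: VectorStore) -> str:
    """Only the few retrieved records, not the whole database, go into the
    prompt that would be sent to the chosen LLM."""
    context = "\n".join(store.search(question, k=1))
    return f"Context from private data:\n{context}\n\nQuestion: {question}"


store = VectorStore()
store.add("New steel tariffs will reduce next quarter's revenue and margins")
store.add("The cafeteria menu now includes a vegetarian option")
prompt = build_prompt("How will the latest tariffs affect next quarter's revenue?", store)
```

Because the model only ever sees the retrieved snippets inside the prompt, the bulk of the private data never leaves the database, which is the security property the article attributes to the approach.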
Significantly Upgraded Performance Outlook Accelerates the Arms Race
To support the massive AI contracts and future growth expectations, Oracle's management has provided extremely optimistic growth guidance for the coming years.
Safra Catz announced that the company expects Oracle Cloud Infrastructure (OCI) revenue to grow by 77% to $18 billion in the current fiscal year, and to continue growing rapidly over the next four years, ultimately reaching an annual revenue scale of $144 billion.
To support this growth target, Oracle is making corresponding capital investments. Catz expects capital expenditures (CapEx) for the current fiscal year to reach approximately $35 billion, emphasizing that the vast majority of these investments are for "revenue-generating equipment, rather than land or buildings," to ensure investment efficiency.
In the face of fierce competition, Ellison also elaborated on Oracle's technological moat. He claimed that Oracle has won large AI training contracts because its gigawatt-level data centers are "faster and more cost-effective than any other company in the world" in training AI models. He attributed this to Oracle's underlying technological advantages, especially its networking capabilities: "Our network moves data very quickly... If we are twice as fast, the cost is half."
Additionally, Oracle is leveraging AI to transform itself. Ellison revealed that the company's latest applications are being generated automatically by AI, putting it far ahead in application development. He stated that Oracle will not charge separately for AI features in applications because "our new applications themselves are AI."

The following is the full transcript of the earnings call, translated by AI tools:
Event Date: September 9, 2025
Company Name: Oracle
Event Description: Q1 FY2026 Earnings Call
Source: Oracle
Presentation Segment
Conference Operator:
Hello everyone, thank you for your patience. I am Tiffany, and I will be your conference operator today.
At this time, I would like to welcome everyone to Oracle's Q1 FY2026 earnings call. All lines have been muted to prevent any background noise. There will be a Q&A session following the speakers' remarks. (Operator instructions)
Now, I would like to turn the call over to the Head of Investor Relations, Ken Bond. Ken, please go ahead.
Ken Bond, Head of Investor Relations:
Thank you, Tiffany. Good afternoon, everyone, and welcome to Oracle's Q1 FY2026 earnings call. Copies of the press release and financial tables, including GAAP and non-GAAP reconciliation tables and other supplemental financial information, are available for viewing and downloading on our Investor Relations website. Additionally, a list of many customers who have recently purchased Oracle Cloud services or gone live with Oracle Cloud will also be available on the Investor Relations website.
Joining today’s call are Chairman and Chief Technology Officer Larry Ellison; and Chief Executive Officer Safra Catz.
Just a reminder, today’s discussion will include forward-looking statements, including forecasts, expectations, estimates, or other information that may be considered forward-looking. Throughout the discussion, we will highlight some important factors related to the business that could potentially impact these forward-looking statements. These forward-looking statements are also subject to risks and uncertainties that could cause actual results to differ materially from those expressed today. Therefore, we caution you not to place undue reliance on these forward-looking statements and encourage you to review our latest reports, including the 10-K and 10-Q forms and any applicable amendments, for a complete discussion of these factors and other risks that may affect our future performance or stock market price.
Finally, we are not obligated to update our performance or these forward-looking statements based on new information or future events. Before we take questions, we will first have some prepared remarks. At this point, I would like to turn the call over to Safra.
Safra Catz, Chief Executive Officer:
Thank you, Ken, and good afternoon, everyone.
Clearly, we are off to an amazing start this year, as Oracle has become the preferred destination for AI workloads. We have signed significant cloud contracts with well-known companies in the AI space, including OpenAI, xAI, Meta, (inaudible), and many others.
At the end of Q1, remaining performance obligations (RPO) exceeded $455 billion. This is a 359% increase from last year and an increase of $317 billion from the end of Q4. Our cloud RPO has grown nearly 500%, building on last year's growth of 83%.
Now let's look at results using constant-currency growth rates. As you can see, we have made some changes to the presentation of our income statement to better reflect how we manage the business, so you can more directly see the dynamics of our cloud business. So let's get started:
Total cloud revenue (including applications and infrastructure) grew by 27%, reaching $7.2 billion.
Cloud infrastructure revenue was $3.3 billion, growing by 54%, building on last year's first quarter report growth of 46%.
OCI consumption revenue grew by 57%, and demand continues to significantly exceed supply.
Cloud database services grew by 32%, with current annualized revenue approaching $2.8 billion.
Autonomous database revenue grew by 43%, building on last year's first quarter report growth of 26%.
Multi-cloud database revenue (i.e., OCI regions embedded in AWS, Azure, and GCP) grew by 1529% in the first quarter.
Cloud application revenue was $3.8 billion, growing by 10%.
Our strategic back-office application revenue was $2.4 billion, growing by 16%.
Total software revenue for the quarter was $5.7 billion, down 2%.
In summary, total revenue for the quarter was $14.9 billion, an 11% increase over last year, higher than the 8% growth rate reported in last year's first quarter.
Operating income grew by 7%, reaching $6.2 billion. Internally, we have also been accelerating the adoption of AI to improve operational efficiency. I expect our operating income to achieve mid-teens growth this year and to grow even higher in fiscal year 2027.
Non-GAAP earnings per share (EPS) were $1.47, while GAAP earnings per share were around $1.00. The non-GAAP tax rate for the quarter was 20.5%, higher than the 19% guidance, resulting in a $0.03 decrease in earnings per share.
Over the past four quarters, operating cash flow grew by 13% to $21.5 billion, while free cash flow was negative $5.9 billion, with capital expenditures (CapEx) of $27.4 billion. Operating cash flow for the first quarter was $8.1 billion, free cash flow was negative $362 million, and capital expenditures were $8.5 billion.
At the end of the quarter, we had $11 billion in cash and marketable securities, with a short-term deferred revenue balance of $12 billion, growing by 5%.
Over the past 10 years, we have repurchased one-third of our outstanding shares at an average price of $55, less than a quarter of the current stock price. This quarter, we repurchased 440,000 shares for a total of $95 million. Additionally, we paid $5 billion in dividends over the past 12 months, and the board has again declared a quarterly dividend of $0.50 per share.
Given our RPO growth, I now expect capital expenditures for fiscal year 2026 to be around $35 billion. Just a reminder: the vast majority of our capital expenditure goes into revenue-generating equipment for data centers, rather than land or buildings.
As we bring more capacity online, we will convert the massive RPO backlog into accelerated revenue and profit growth.
Now, before I dive into the specific guidance for the second quarter, I would like to share some overall views on fiscal year 2026 and the coming years.
Clearly, this has been an outstanding quarter, with continued growth in demand for Oracle Cloud Infrastructure. I expect we will sign more customers worth billions of dollars, and RPO could grow to over $500 billion. The enormous scale of RPO growth allows us to significantly revise our financial plan for the cloud infrastructure segment upward.
We now expect Oracle Cloud Infrastructure to grow 77% in this fiscal year, reaching $18 billion, and then increase to $32 billion, $73 billion, $114 billion, and $144 billion over the next four years. Most of this revenue is already recorded in our $455 billion RPO figure, and we have had a very strong start this year.
While much attention is focused on our GPU-related business, our non-GPU infrastructure business continues to grow at a much faster pace than our competitors'. We are also seeing our industry-specific cloud applications drive customers to adopt our back-office cloud applications.
Finally, the Oracle Database is thriving, with 34 multi-cloud data centers now running on Azure, GCP, and AWS, and we will deliver another 37 data centers, bringing the total to 71. All these trends indicate that revenue growth will accelerate.
For fiscal year 2026, we remain confident and committed to a total revenue growth of 16% for the full year at constant currency.
Beyond fiscal year 2026, I am even more confident in our ability to further accelerate top-line and bottom-line growth rates. As mentioned, we will update our long-term financial goals at the Oracle AI World Financial Analyst Conference in Las Vegas in October.
Now, let me talk about my guidance for the second quarter, which I will review on a non-GAAP basis and assume exchange rates remain stable. The exchange rate may have a positive impact of $0.03 on earnings per share and a 1% positive impact on revenue. However, the actual exchange rate impact may vary as it did in the first quarter. As follows:
Total revenue is expected to grow 12% to 14% at constant currency, and 14% to 16% in U.S. dollars at current exchange rates.
Total cloud revenue is expected to grow 32% to 36% at constant currency, and 33% to 37% in U.S. dollars.
Non-GAAP earnings per share are expected to grow 8% to 10%, ranging between $1.58 and $1.62 at constant currency; non-GAAP earnings per share are expected to grow 10% to 12%, ranging between $1.61 and $1.65 in U.S. dollars.
Finally, my guidance for earnings per share in the second quarter assumes a base tax rate of (inaudible). However, one-time tax events may cause the actual tax rate to fluctuate, as it did this quarter.
Larry, over to you.
Lawrence J. Ellison, Chairman and Chief Technology Officer:
Thank you, Safra.
Ultimately, AI will change everything, but for now, AI is fundamentally changing Oracle and other parts of the computer industry, although not everyone fully grasps the scale of this impending tsunami. Look at our quarterly data. Some things are beyond doubt.
Several world-class AI companies have chosen Oracle to build large-scale GPU-centered data centers to train their AI models. This is because Oracle has built gigawatt-scale data centers that train AI models faster and more cost-effectively than any other company in the world.
Training AI models is a massive multi-trillion-dollar market. It's hard to imagine a larger technology market than that. But if you look closely, you'll find an even larger market. That is the AI inference market. Millions of customers use these AI models to operate businesses and governments.
In fact, the AI inference market will be much larger than the AI training market. AI inference will be used to operate robotic factories, robotic cars, robotic greenhouses, biomolecular simulations for drug design, interpret medical diagnostic images and lab results, automate laboratories, bet in financial markets, automate legal processes, automate financial processes, and automate sales processes.
AI will write, or more precisely generate, computer programs called AI agents that will automate your sales and marketing processes. Let me repeat that. AI will automatically write computer programs and then automate your sales processes, legal processes, and everything else, including your factories, and so on. Think about it: it is AI inference that will change everything.
Oracle is actively pursuing the AI training market, and by the way, we are not doing poorly there. But the inference market is larger, and Oracle is pursuing it just as actively. We believe we are in a quite favorable position in the inference market because Oracle is, to date, the world's largest custodian of high-value private enterprise data.
With the launch of our new AI database, we have added a very important new way for you to store data in our database. You can vectorize it. By vectorizing all your data, all your data can be understood by AI models.
Then, we enable customers to easily connect all their databases, all new Oracle AI databases, and cloud storage (OCI cloud storage) to the world's most advanced AI inference models, such as ChatGPT, Gemini, Grok, Llama, all of which are exclusively available in Oracle Cloud.
Once you vectorize your data and connect it to your chosen LLM (large language model), you can ask any question you can think of. For example, "How will the latest tariffs affect next quarter's revenue and profit?" You pose this question, and the large language model applies advanced reasoning to your private enterprise data combined with publicly available data. You get answers to important questions without compromising the security and safety of your private data.
Again, I want you to think about this. Many companies say we are heavily investing in AI because we are writing agents. Well, guess what? We are also writing a whole bunch of agents.
However, about three years ago when OpenAI launched ChatGPT, what you could do was have conversations and ask questions. You weren't automating a process with agents. You could ask any question you wanted and get a well-reasoned answer, along with the latest, best information and high-quality content.
Who is providing this (Q&A based on private and public data) to customers? When we deliver and demonstrate at AI World next month, we will be the first. This is what our customers have been asking for since ChatGPT 3.5 was launched nearly three years ago. "I want to ask questions about anything. So, you need to understand my business data and all publicly available data. Then you can answer the questions that matter most to me." Well, now they are going to ask those questions.
Back to you, Safra.
Ken Bond, Head of Investor Relations:
Thank you, Larry. Tiffany, please prepare the audience for questions.
Q&A Session
Conference Operator:
(Operator instructions) Your first question comes from John DiFucci of Guggenheim Securities. Please go ahead.
John DiFucci, Analyst:
Thank you for taking my question. Listen, even I am shocked by the future outlook. This question, I think, is intentionally open-ended.
So Larry and Safra, Oracle has become the de facto standard for AI training workloads, and you are making money from this, I am very confident about that. But clearly, this is not just about AI training. I know that is a big part of it. You talked about it. But can you elaborate on what other factors are driving these quite remarkable forecasts?
Safra Catz, CEO:
Larry, you go ahead. I think you were just covering this.
Lawrence J. Ellison, Chairman and Chief Technology Officer:
Yes. Well, many people are looking for inference capacity. I mean, people's inference capacity is running out fast. That company that called us, I mentioned it, I think last quarter or the quarter before: someone called and said, "We are going to take all the capacity you currently have unused anywhere in the world. We don't care." I have never received a call like that. That was a very unusual call. And that was for inference, not training. The demand for inference is enormous.
If you think about it, ultimately, all the money we invest in training must translate into products sold, and that involves inference. Again, the inference market is much larger than the training market.
Yes, like others, we are building agents with our applications. But we are doing much more than that. About three years ago, when ChatGPT 3.5 was released, you could simply converse with your computer, ask questions, and receive reasoning-rich, high-quality answers based on the latest and most accurate publicly available information, of which there is a lot.
However, if you combine publicly available data with enterprise data, which companies do not want to share, you must handle it in a way that keeps private enterprise data private while still allowing large language models to use it for reasoning. That is what it takes to answer questions like "How do the latest tariffs, or steel prices, or anything else, affect my quarterly performance? My ability to deliver products, my revenue, my costs?"
To answer such questions, we had to fundamentally change our database, and we have done so, so that you can vectorize all your data. Vectorization is how large language models understand information. Then we let people ask any questions they want about anything. This is exactly what we are doing.
However, unless you have a secure and reliable database connected to all the popular LLMs, which we have done (and tell me who else besides Oracle has that), it is difficult to provide a ChatGPT-like experience on top of your data plus publicly available data.
This is a unique value proposition for Oracle, because, again, we are the custodians of data, and we have far more data than any application company. They have their application data and tens of thousands of customers; our customers run millions of databases. Therefore, we believe we are better positioned than anyone to capture the inference market.
Safra Catz, CEO:
By the way, aside from our GPUs and so on, we have become the de facto cloud for many customers. Similarly, they want to place something in our public cloud or our competitors' public cloud to work in conjunction with Oracle databases.
But at the same time, there are many reasons they want what is called a dedicated region or Oracle Cloud@Customer, and we give customers so many options that we rarely cannot meet their needs in some way.
Of course, we have every part of the stack. We have the infrastructure, we have the databases, and you will hear a lot about this: the database is the only sensible repository for the data you want to analyze with AI models. Then we have all these applications that are just starting to take off. So we have many different layers. They are all moving in the same direction, and used together they benefit our customers. Listen, my...
Lawrence J. Ellison, Chairman and Chief Technology Officer:
Please go ahead, John. I got John's praise, and then I interrupted you, so I apologize for being rude.
John DiFucci, Analyst:
I was about to say (inaudible). I've been in this business for a long time, and I've told my entire team to pay attention to this, even those who don't cover Oracle, because this is a career event happening, and it looks amazing. I really feel happy for you all. Congratulations, it's incredible.
Lawrence J. Ellison, Chairman and Chief Technology Officer:
Thank you.
John DiFucci, Analyst:
But there's still a lot of work to do. Keep it up.
Lawrence J. Ellison, Chairman and Chief Technology Officer:
It takes a lot of work. Well, let me mention two actually shocking things.
First, we have condensed the entire Oracle Cloud, every feature and every function of Oracle Cloud, into something we can fit into just a few racks, three racks, which we call "Butterfly," costing $6 million. So we can offer you—we can provide you with a private version of Oracle Cloud, with every feature, every security function, everything we do, for just $6 million. I believe the cost for other hyperscale cloud providers is over 100 times this.
So we can actually offer our customers Cloud@Customer, a complete cloud at the customer's site. We have some companies like Vodafone, and I'm not sure which companies I can mention. We have large companies purchasing, basically, their own Oracle Cloud regions; in fact, multiple Oracle Cloud regions, because they don't want any neighbors in their cloud. They don't want other companies in their cloud, but they want a complete cloud. They want to pay as they consume; they want all the features, all the functionality, all the security. They don't want to buy it. They want us to purchase and own the software and hardware, maintain it, build the network, and provide all of this, and they just pay for consumption. We can do this at an entry price 99% lower than competitors can offer. That's one point.
Another thing. Let me give you one more example, and I'll stop here. We also have the most advanced application generator of any company. Interestingly, we are both an application company and a cloud infrastructure company, so we build applications. When we build applications, we want to improve efficiency. The way to improve efficiency is to build AI application generators, and we have been doing just that. The latest applications we are building are not built by us; they are generated by AI. We believe we are far ahead of any other application company in generating applications. So this is another very important advantage we have.
Of course, it's interesting. I'll note that we do not charge separately for AI in our applications, because our applications are AI. They are entirely AI. The new applications we are building are just a collection of AI agents we generate, connected together through workflows. That's it. How can you charge separately for that? Every application we have is like this. But the applications are better, and we hope to sell more, which is how we get compensated.
Anyway. John, thank you very much for your kind words.
John DiFucci, Analyst:
Thank you, Larry. Thank you, Safra.
Safra Catz, CEO:
Thank you, John, for being so kindly attentive to us over the years. Thank you. Have a great day. It might be time for the next question now.
Conference Operator:
Your next question comes from Brad Zelnick of Deutsche Bank. Please go ahead.
Brad Zelnick, Analyst:
Great. Thank you very much. I think we are all a bit shocked, in a very, very good way. Larry, there is no better testament to the seismic shift happening in the computing space than the results you just announced. Oracle has a nearly 50-year track record of navigating transformation and ultimately winning.
However, when we think about enterprise applications, investors are quite pessimistic these days, and I would love to hear your thoughts on what all of this means for the industry. Will market share flow to companies that do not have databases and do not have the advantages you have extended into the silicon layer? Could this be an extinction event? I'm very curious about your thoughts.
Lawrence J. Ellison, Chairman and CTO:
Well, I think we have a tremendous advantage because we are both an infrastructure company and an application company.
Two things have happened. As an application company, we — we need — we know we must start generating our applications. We can no longer rely on a manpower strategy. We still need people, don’t get me wrong, but the number of people we need has been drastically reduced. And we can build/generate better applications than can be built manually. We have been studying these AI application generators for some time, and we are actually using them.
But the key is, we are not just building application generators. We are building application generators and then building applications with them, which gives us insights to improve the generators. Being on both sides of this equation is a huge advantage: we are both an application builder and a builder of application-generation technology, as well as the underlying AI application code generator. This is a tremendous advantage.
Let me give you another advantage, one that is often a disadvantage: we are very large. We no longer sell individual, discrete applications; we sell application suites. We decided to enter the healthcare industry because we believe we can solve more problems than the incumbents; we are much larger than they are. By the way, we are much larger than Workday or ServiceNow. We address a broader range of issues. We can provide all of ERP, and then we can add all of CRM, with every component designed to work together. This makes it much easier for customers to consume.
Therefore, we believe that being good at application generation and underlying technology enables us to build better applications, allowing us to build more applications to solve more problems. So customers do not have to integrate all systems across multiple vendors. We can directly build a suite where all components are designed to work together. I think we have a huge advantage in the application space.
We have a huge advantage in AI inference as well. To reiterate, what we will showcase next month at Oracle AI World is that we bring together all of a customer's data, all of it. I don't want to reveal all the details right now. You can ask any question you want: "Who are my salespeople? Who are the top prospects in my region? What products should I pitch to them next? What reference cases would best convince them to use our products?"
If you are a salesperson, you can get immediate answers to all of these questions. An engineer can ask, "Which features of the Oracle financial system do users make the most mistakes with, so I can fix them and make them easier to use?" You just ask the question, because all of this data is available to the AI model. We are the only ones doing this. Is anyone else? As far as I know, no. This is a huge advantage.
Brad Zelnick, Analyst:
(Unclear)
Lawrence J. Ellison, Chairman and Chief Technology Officer:
Thank you.
Brad Zelnick, Analyst:
For AI World, Larry. Thank you. This is an amazing day for Oracle. It’s an extraordinary day for the industry. Thanks again and congratulations.
Lawrence J. Ellison, Chairman and Chief Technology Officer:
Thank you. Thank you very much.
Conference Operator:
Your next question comes from Derrick Wood of TD Cowen. Please go ahead.
Derrick Wood, Analyst:
That's great. Thank you for answering my question. I also want to congratulate you on this significant quarter. Safra, in fact, you delivered over $300 billion in new RPO in the first quarter, which is truly amazing, but it will require a lot of infrastructure build-out.
Can you provide more information on how much capital expenditure and operational cost structure is needed to fully fulfill these contracts, and how should we think about these costs relative to the revenue growth expected in the coming years? Overall, how should investors think about the return on investment (ROI) for this expenditure?
Safra Catz, CEO:
Certainly.
First of all, as I mentioned in my prepared remarks and have been very clear about before, we do not own real estate. We do not own buildings. What we own and design is the equipment, which is optimized for Oracle Cloud. It has very special networking capabilities, and it has technology from Larry and his team that allows us to run these workloads much faster. Therefore, depending on the workload, we are much cheaper than our competitors.
Because of this, we only deploy this equipment when the timing is right, and usually very quickly; once our customers accept it, we start generating revenue immediately. The faster they adopt the system and it meets their needs, the faster they start using it and the sooner we get revenue. I don't want to call it asset-light from a financial perspective, but it is relatively light in terms of assets, and that is a real advantage for us. I know some of our competitors prefer to own buildings. That is really not our expertise. Our expertise lies in unique technology, unique networking and storage, and the entire way we assemble these systems. By the way, the systems are all identical and very streamlined, which again allows us to maintain very high profitability while still offering customers very attractive prices.
I have noted that capital expenditures are expected to be around $35 billion this fiscal year. Because we monitor this closely, we install the equipment as soon as we receive it and then hand it over to generate revenue immediately. So we have a very clear view of our ability to deploy this equipment, and we essentially incur this capital expenditure just before we start generating revenue. At this point, I expect about $35 billion this year. It could be a little higher, but if it is higher, that is good news, because it means more capacity is being handed over to me in the form of floor space.
As you also know, we are embedded in our competitors' clouds. There too, what we actually pay for is just our equipment, and that equipment is put into use immediately. We will ultimately expand to 71 data centers embedded in our competitors' clouds or (inaudible, possibly referring to specific partners).
Lawrence J. Ellison, Chairman and Chief Technology Officer:
Let me add a few very brief things.
First, we just handed over a huge data center to a client. The acceptance period could have taken months, but it took only a week. From the time we officially delivered the equipment and ran the tests to the time they started paying: one week. We have an extraordinary team that did an excellent job getting the equipment running quickly so that the client, our client, could accept it. They wanted to accept it as soon as possible because they want to get to work; they want to train their models. This huge data center was accepted in just one week. That is extraordinary.
Additionally, we are very large consumers of network equipment, GPUs, etc. Because we are very large consumers, I believe we can obtain better financing terms from suppliers than others. So I think we have an advantage in this regard as well. I believe we will do very, very well financially. We have an advantage there too.
Derrick Wood, Analyst:
Great. Thank you, Larry. Thank you, Safra.
Safra Catz, CEO:
Of course.
Conference Operator:
Your next question comes from Mark Moerdler of Bernstein Research. Please go ahead.
Mark Moerdler, Analyst:
Thank you very much, Larry and Safra, and frankly, the Oracle team, great job, congratulations to you all.
I want to focus on the AI training business that you have been winning. Could you please explain to us how Oracle is able to create enough differentiation (inaudible) to ensure that this business does not commoditize, and how you continue to drive strong revenue and free cash flow from the training business even if training slows down? I think people really need to understand this.
Lawrence J. Ellison, Chairman and Chief Technology Officer:
Well, I can answer that in one sentence. Our network moves data very, very quickly. If we can move data faster than anyone else, our GPU superclusters have a performance advantage. And if you pay by the hour and we are twice as fast, our cost is half.
Mark Moerdler, Analyst:
I agree. Thank you.
Conference Operator:
Your final question comes from Alex Zukin of Wolfe Research. Please go ahead.
Alex Zukin, Analyst:
Hey, guys, thank you very much for squeezing my question in. I was going to ask whether the new Oracle AI Database really opens up the general enterprise inference market, and based on your remarks, it sounds like the answer is "absolutely" ("hell yes").
So my follow-up question is: how do you see this playing out over the coming years? After the launch of the Oracle AI Database, how long do you expect it will take for your seasoned enterprise clients to truly be willing to use their enterprise data in this way? And to what extent is the currently constrained supply environment holding this demand back, or causing it to stall?
Safra Catz, CEO:
Larry, I don't know if you want to take this one; you covered it. Mark (likely meaning Alex), please go ahead.
Lawrence J. Ellison, Chairman and Chief Technology Officer:
Please go ahead. Okay.
I think: who wouldn't want this? Everyone says they want to use AI. CEOs say they want to use AI. Heads of state and government leaders say they want to use AI. We've never had customers like this. Typically, we don't deal with CEOs, and now we are dealing with CEOs. Now we are dealing with heads of government and heads of state on this matter, because AI is that important. People want to use AI on their own data; that's exactly what they want to do. But they don't know how to do it safely. Frankly, they don't know how to do it at all. And then there is a big risk: "Oh my gosh, I can't share my data." JPMorgan can't share all its data with OpenAI. Goldman Sachs can't share all its data with OpenAI, or with xAI, or Llama, or Meta. They won't do that; they have to keep it private.
So we have to keep your private data private, and we have to keep it secure. But we also have to make it available to the latest and best reasoning models from OpenAI, xAI, and all the other companies. Because we have the database, because we can vectorize all the data in the database, and because we have very fine-grained security models in the Oracle Database, we can do all of this. We can deliver all of this.
What we then chose to do with the AI Database is not only ensure that we can vectorize all the data so that AI models can understand it, but also bundle it with all the AI models. That's why we made agreements with Google. That's why we did all these deals: you can get Gemini on Oracle Cloud, you can get Grok on Oracle Cloud, you can get ChatGPT on Oracle Cloud, you can get (unclear, possibly specific models like Claude or Cohere), and I could keep going down the list.
So we bundle them together. That way, our clients can very easily use these large language models on a combination of publicly available data and all of their enterprise data, which is exactly what they want. It is that combination that lets them ask, and get answers to, any question they can think of, any question that is important to them. Everyone wants it. I believe the demand will be insatiable, and we are well positioned to deliver database and AI capacity at scale through our cloud in the coming years.
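Larry's description, vectorizing private rows so that external reasoning models can answer questions over them, follows the general retrieval-augmented pattern. Below is a minimal, hypothetical sketch of that pattern only: the toy bag-of-words "embedding", the sample rows, and the `retrieve` helper are all illustrative assumptions, not Oracle's actual implementation or API.

```python
import math
from collections import Counter


def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; a real system would use a learned
    # embedding model stored alongside the rows as vectors.
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


# Hypothetical private enterprise rows, vectorized once and kept
# inside the secure data store.
rows = [
    "acme corp renewal due q3 owner alice",
    "globex support ticket gpu cluster latency",
    "initech pilot expansion proposal pending",
]
index = [(row, embed(row)) for row in rows]


def retrieve(question: str, k: int = 1) -> list[str]:
    # Rank stored rows by similarity to the question and return the top k.
    qv = embed(question)
    ranked = sorted(index, key=lambda item: cosine(qv, item[1]), reverse=True)
    return [row for row, _ in ranked[:k]]


# The retrieved rows, not the whole dataset, would be passed with the
# question to an external LLM, so only relevant snippets leave the store.
print(retrieve("which renewal is due in q3"))
```

The design point the sketch illustrates is the one Larry stresses: the model sees only the small retrieved slice of private data needed to answer, rather than the enterprise handing its full dataset to OpenAI, xAI, or anyone else.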
Safra Catz, CEO:
This will be one of the drivers of the eventual migration of Oracle databases, which still hold a significant share of the enterprise market, to the cloud. Many customers will use the Oracle AI Database to migrate to the public cloud, but many of the largest enterprises will want their own dedicated regions or Oracle Cloud@Customer. Either way, they can ultimately apply AI to their own data, using any LLM they want, since those models are all in our cloud too.
Alex Zukin, Analyst:
Sounds like high-margin AI revenue, guys. Congratulations.
Safra Catz, CEO:
Thank you. Thank you.
Conference Operator:
Okay.
Ken Bond, Head of Investor Relations:
Thank you, Alex. A replay of this conference call will be available on our investor relations website within 24 hours. Thank you for joining us today. I will now hand the call back to Tiffany for closing remarks.
Conference Operator:
Ladies and gentlemen, this concludes today's conference call. Thank you for your participation. You may now disconnect.
(Legal Disclaimer)
This transcript may not be 100% accurate and may contain spelling errors and other inaccuracies. This transcript is provided "as is" without any express or implied warranties.
This article is from the WeChat Official Account "Hard AI".