Introduction to Oracle's AI Revolution
Oracle is driving a transformative AI revolution, projected to create millions of jobs and add trillions in economic value by architecting AI and data together. Their approach ensures enterprises can trust AI-generated answers and applications, focusing on security, correctness, and dependability.
Oracle AI Database: Next-Gen AI-Native Architecture
- Oracle AI Database 26ai: Latest release, integrating AI deeply with existing Oracle databases.
- AI Vectors: New database data type representing semantic content (documents, images, videos) as numeric vectors to enable fast similarity search beyond exact matches.
- Combined Vector and Traditional Search: Finds relevant business data effectively, enabling retrieval-augmented generation (RAG) for precise AI answers.
- Autonomous AI Agents: Supports multi-step workflows, action-taking capabilities, and integration with multiple AI frameworks.
- Security and Privacy: Enterprise-grade enforcement at the database level to ensure data privacy and compliance.
AI App Development with Oracle’s Generative Development (GenDev)
- Focus on Solution-Centric Languages: SQL and Open Application Specification Language (Open AppSpec) to ensure understandability and trustworthiness rather than opaque code.
- Trusted Data APIs: JSON-relational duality enables consistent, business-rule-enforced, evolving, and privacy-constrained data access.
- Data Privacy Rules Embedded in Database: Ensures no unauthorized data exposure regardless of AI usage.
- Trusted Answer Search & APEX Interactive Reports: Allows natural language querying with transparent filters and verified answers.
- APEX AI-Native App Generator: Converts natural language app descriptions into secure, evolvable enterprise apps using the Open AppSpec language.
Open AI Analytics & Oracle Autonomous AI Lakehouse
- Open Standards & Multi-Cloud Support: Apache Iceberg for open lakehouse data formats, enabling cross-platform read/write access.
- Oracle Autonomous AI Lakehouse: Combines Iceberg's openness with Oracle's optimized SQL, indexing, data caching (Exadata), and scaled-out serverless data access.
- Federated Catalog: Unified discovery and access across multiple catalogs and data stores.
- ETL and Real-Time Data Ingestion: Integration with GoldenGate, Delta Sharing, and Kafka for feeding data into the lakehouse.
Oracle AI Data Platform: Integrated AI & Data Foundation
- Unified Data Foundation: Brings structured/unstructured, historical/real-time data under one AI-ready catalog.
- Developer Workbench: Integrates AI models, frameworks (OpenAI, LangChain), and open-source engines (Spark, Flink) with unified tools for AI app/agent development.
- Autonomous AI Agents: Low-code environment to build AI agents performing complex data workflows with multi-tool orchestration.
- Agent Hub: Single-pane interface allowing business users to access, coordinate, and trust AI-powered agents across applications.
- Enterprise-Grade Security & Governance: End-to-end lineage, access control, and auditing.
Real-World Impacts and Partner Ecosystem
- Customers across healthcare, manufacturing, and other sectors use Oracle AI Data Platform for better decision making, operational efficiencies, and predictive insights.
- Extensive global partnerships with major firms investing billions in training and developing AI use cases leveraging Oracle’s platform.
- For insight into enterprise AI transformation strategies, see How Infosys Drives Enterprise AI Transformation and Innovation.
Conclusion: Oracle Empowering AI Leadership
Oracle’s comprehensive AI ecosystem, from AI-native databases and trusted app development to open lakehouse analytics and the AI Data Platform, facilitates enterprise AI transformation. By embedding security, trust, and open standards, Oracle enables organizations to innovate confidently and maintain competitive advantage in the AI era.
For more details, visit Oracle’s AI platform resources and upcoming technical sessions.
The AI revolution is here and it's changing everything. Within the next 5 years, AI stands to create 78 million
net new jobs and add trillions of dollars in value to the global economy. The core of the revolution is data. Both
your data and the data that others entrust to you. But the question becomes, can you trust AI? Is it safe to
trust the answers it gives? Are the applications it generates ready for the enterprise, secure, correct, and
dependable? You can if you bring AI to your data and use a database where AI and data are
architected to work together. Please welcome to the stage Oracle's executive vice president of database
technologies, Juan Loaiza. [Applause] [Music]
All right, thanks everyone for joining today. I'm going to talk about the AI for data revolution, and it is a revolution. As we know, AI is changing everything; it's the next big thing in data management, without a doubt, and it's not a revolution any of us can afford to ignore. The quote I like best about AI is this one: AI won't replace humans, but humans with AI will replace humans without AI. And that applies to enterprises also. The key to thriving in this age of AI is to transform yourself and your enterprise into an AI leader, and to use AI to deliver breakthrough insights, innovations, and productivity before your competitors do.
And Oracle AI for data is all about helping you do that. The great news is that AI for data is both easy to learn and easy to use, and you'll see that when I describe things today. Before we jump in, I want to talk a little bit about the Oracle AI for data strategy: what we're doing, why we're doing it, what's different about it, and, most importantly, how it's going to benefit you and your enterprises.
I'm going to talk both about what we have today and about a few things that we will have very soon. It's important to understand that AI is a huge focus for us in the database team: we have over a hundred different AI projects going on, we're using AI throughout everything, it's the most rapidly changing field ever, and we're going to keep putting out AI innovations on a quarterly basis. So this is going to be changing in real time.

But before I launch into AI, I want to discuss briefly our overall strategy for Oracle Database. It's pretty straightforward: we design our products for customers who think and act strategically and who understand that gluing things together causes a lot of problems. It creates a lot of cost and a lot of complexity. We also really believe in open standards, and we know that you do too. Why? Because open standards are key to creating the competitive ecosystem that benefits everyone, vendors and customers alike. So we've been focusing for many years on a converged architecture, meaning an open, standards-based architecture that puts together best-of-breed support for all the different data types and workloads and makes them all work together seamlessly. For AI we're doing the exact same thing: putting it together with everything else and making it work seamlessly.

That's the overall strategy, and there are going to be three parts to my talk tonight. First, we believe that architecting AI and data together enables simpler and better results, and I'm going to show that. Then I'm going to talk about architecting AI, data, and app dev together, to enable faster and better innovations everywhere, innovations that you can trust; we'll get back to this whole trust topic. And finally, I'm going to talk about architecting AI, data, and open standards together. That's a big deal for us, and we're going to be making some announcements there. It's all about enabling AI insights for all your data, because it's open everywhere. Those are the three big parts.

So let's start with how we architect AI and data together across our four key products: AI Database, AI App Dev, AI Lakehouse, and AI Data Platform.

First up is Oracle AI Database. This is new: today we launched Oracle AI Database 26ai. Notice two things. One, it's Oracle AI Database now, not Oracle Database. Two, we have a new release number, 26ai. It's available now; you can download it and start using it today. The idea of Oracle AI Database is that we've architected AI and data together to create a next-generation, AI-native database. This is something we've been working on intensely, and the two things we focus on are the two key AI breakthroughs: one is LLMs, which we all know about; the other is AI vectors, a new data type in databases, which I'm going to talk about. In addition to those, there are dozens of other AI improvements.

This new release, 26ai, is a long-term support release that's fully compatible with, and replaces, Oracle 23ai. After today, there's no more Oracle 23ai; everything going forward is Oracle 26ai. And it's very easy to adopt: you basically apply the October patch, this month's patch, to your database, and you're on Oracle AI Database 26ai. It adds AI functionality on top of our previous Oracle 23ai; it's not changing any of the architecture, just adding AI on top. And if you're running Oracle Database 19c, it also gains AI capabilities, because a lot of the AI tools I talk about today run on top of Oracle Database 19c.

Okay, let's dive in. First, AI vectors. Some of you know this already, but it's new to a lot of people: this is the key new AI data type in databases. Oracle Database now has a new data type called an AI vector. What is an AI vector? It's a sequence of numbers that represents the semantic content, the meaning, of a complex object: a document, an image, a video, or a pattern of data. This is a brand-new data type enabled by AI, and Oracle generates these AI vectors from objects using the AI model of your choice. You hand Oracle a document or an image, we run the AI model, and we generate an AI vector for the object.
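As a concrete sketch of what that looks like in SQL: the statements below assume a hypothetical documents table and an embedding model already loaded into the database under the name doc_model (the names are illustrative, not from the talk).

```sql
-- A table whose DOC_VECTOR column holds the AI vector for each document.
CREATE TABLE benefits_docs (
  doc_id     NUMBER PRIMARY KEY,
  doc_text   CLOB,
  doc_vector VECTOR           -- the new AI vector data type
);

-- Generate the vector from the text with an in-database embedding model
-- (assumes a model named doc_model has been loaded into the database).
INSERT INTO benefits_docs (doc_id, doc_text, doc_vector)
VALUES (1, :doc_text,
        VECTOR_EMBEDDING(doc_model USING :doc_text AS data));
```

The point of the sketch is simply that the vector lives in an ordinary column, alongside the rest of the row's data.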
Okay, now that's what an AI vector is. But what is it good for? Oracle AI Database can store vectors, which is easy, but, more importantly, it can quickly find similar objects using AI vectors. Until very recently, the only thing databases could do with things like documents and images was find exact matches: the exact same pixels, the exact same words. But if you already know the pixels, there's no point in searching for something; you already have it. What you really want for these complex objects is similarity search: finding things that are similar. Take that picture of the couches. If you hand in the picture of the gray couch on the left, you don't want the exact same picture back; you want to find the couch that's most similar to it if you're trying to buy something similar.
And what you can see there is that some couches are more similar and some less similar, and the more similar ones have similar numbers in their vectors. That's the key attribute of vectors: the more similar the numbers in the vector, the more similar the objects. In Oracle we can find similar objects just by looking at the numbers, and we have very fast AI vector indexing, so in milliseconds we can find the closest match to any document, image, video, anything like that. That's the key idea behind AI vectors in the database, and that's the new technology they bring to databases.

Now, if we want to answer business questions with AI, we need to find the relevant business data in the database and then use the LLM to answer the questions. And to find relevant business data, you need to do two things: traditional database search and this new AI vector search, combined together. That's the key, and I'm going to show you how easy that is to do now with Oracle AI Database, because we have AI architected into the database.

Here's an example where we've architected the vectors and the AI to work together. An employee asks a question, in this case: does my dental plan cover braces for my 19-year-old? That's a question written in natural language. We take that question and vectorize it, convert it into a vector, and then we search the documents in the database to find the closest document that matches the question. Then we verify, using normal database search, that that benefits document applies to that employee. So that's a simple example of how you combine the new AI vector search, to find documents that answer a question, with traditional search, to make sure the document is relevant to that particular user. It's the combination of the two.

Now, how hard is it to do this? It's very simple. Here's a little SQL statement that does exactly that. I'll read you what it says: select the documents from the benefits documents table where the benefits document applies to this employee; and then the last line, the ORDER BY vector distance, says find the closest matching document. That's all it takes to answer that question in SQL in Oracle AI Database. So with one extra line of SQL, we're able to answer complex questions that were never possible before: taking an English-language question, finding a document, and returning it. And because both the documents and the business data are in the same database, it's completely consistent, everything is current, and we return the answer in milliseconds thanks to our vector indexing technology. That means it's good for both OLTP and analytics, because something that returns in milliseconds is very good for OLTP. And we can search across vectors and all types of data in Oracle: text, spatial, JSON, XML, you name it, we've made it all work together with these new AI vectors.

Okay, I just talked about finding the relevant document, but what about the answer to the question? You could go read the document, but it's even better if we take the result of searching for that document and pass it to the LLM along with the question; then the LLM can generate an exact answer, which you see there in purple at the bottom. This is called retrieval-augmented generation, or RAG. In Oracle AI Database we've automated that whole process: you can either invoke it through APIs or run the whole thing as a single SQL statement. Again, that's built in, which makes it super simple, because it's architected into the Oracle database. Okay, let's hear what a few customers
have to say about using Oracle AI vector search.
>> The number one reason we wanted to use vectors on Oracle Database is that it keeps them next to all the other data. It's converged, and that allows us to apply the same security policies and not have to have a whole separate system.
>> What we find today is that a lot of our customers' data is distributed across multiple different technologies. Oracle Autonomous Database helps them automate their entire deployment and bring AI to the data much quicker.
>> We have chosen Oracle Autonomous Database because of the ability to use all the different functionality to access data: vector search, Select AI, geospatial analysis, blockchain tables, and so on.
>> We compared Oracle Database vector search to a lot of the other options out there. The sheer scalability seemed larger; we know it scales higher than the limits I've heard about elsewhere. And it's faster. It's a great solution: it's cutting edge, it's scalable, it has the security you need, and it just plugs into so much. [Music]
>> All right, that's great. Thanks. So that's Oracle AI vector search. Now we have an even newer technology that goes beyond the RAG we just talked about, called AI agents. With vector search, you hand a question and some data to the LLM, and the LLM answers the question. With AI agents, the agents can run multi-step workflows: they can plan, they can try multiple approaches to answering the question, they can use tools, and they can take actions. That's the new generation of AI technologies, and we've built it into the Oracle database as well. AI agents are a huge deal across the industry, and we're integrating with all the different AI agent frameworks you see listed there, everything from OCI to GCP, Azure, and AWS, plus others like LangGraph. But in addition to that, we're also architecting AI agents directly into the Oracle database. We provide frameworks to easily build, deploy, and manage in-database AI agents, and these in-database agents will be, number one, more secure, because your data never leaves the database; number two, faster; and number three, much easier to use. The three examples we have there are Select AI Agent, Private Agent Factory, and the SQLcl MCP Server: three different approaches to AI agents in Oracle AI Database. Now I'm going to have Kris Rice show us how LLMs can use the Oracle SQLcl MCP server to easily answer questions about data in an Oracle database.
>> MCP provides standardized context to an LLM. MCP allows an LLM to answer natural language questions about your database in a very similar way to a SQL expert, but much faster. The LLM first automatically queries the database metadata to find relevant tables and columns. Then the LLM writes and executes SQL via MCP to provide an answer to the user. It's like having a SQL expert ready to assist you at any time. Oracle SQL is now accessible via MCP to enable powerful and simple agentic AI. SQL or SQLcl commands can be invoked by any MCP client, such as Cline, Claude, or Copilot. The SQLcl MCP server even works with Oracle Database 19c. Note that the client was identified, and the LLM in use was identified as well. These two pieces of information are captured and logged in the database audit trail, which allows the DBA or administrator to keep track of which LLMs and which clients are in use.
Now I've refined it further and asked it to summarize the employees by department. Next I'll ask it which departments are the most expensive, that is, where salaries are the highest; here it had to do a multi-table join to come up with the department names and salaries. Now I'll ask whether location has an impact; again it writes the query, a multi-table join, to come up with the correct answer. And finally, I'll ask it to produce the report in Markdown format so that I can save it and share it with anyone I need to.
>> All right. So that's a quick demo of the Oracle MCP agent.
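The first step the demo describes, the LLM inspecting the database's metadata before writing queries, is ordinary data-dictionary SQL. For example, a client connected through the MCP server might start with something like:

```sql
-- Discover candidate tables and columns in the current schema
-- (standard Oracle data-dictionary views; no special AI features needed).
SELECT table_name, column_name, data_type
FROM   user_tab_columns
ORDER  BY table_name, column_id;
```

From results like these, the LLM can pick the relevant tables and then generate the multi-table joins shown in the demo.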
It's basically a SQL expert on demand that you can use. And you can download this Oracle MCP agent today and use it against any Oracle Database, 19c or any later version; you can download it for free and use it today. It's a great way to get started with AI. Okay, I actually hit the button ahead of time; back it up, please. There we go. So we talked about how AI can answer questions about data in your database, but it's not limited to data in your database. It can also combine its own knowledge with things it finds in the public domain, for example by doing a web search, to answer questions. Here's an example: you ask how my ABC product is doing against the market. The AI agents will look through your database and find how ABC is doing; then they'll look out on the web, find out how other products are doing, and give you a very detailed answer. It's just amazing that this is even possible, and it's actually super simple to do.
All right, so I talked about generating answers using AI and how we built that into the AI database, but there's a lot more we've done with AI. One thing we've done is architect AI to help with database management pain points. Just as we have human specialists in different fields, we're also providing specialized AI chatbot assistants to help with different kinds of database management: an AI management assistant, a security assistant, a knowledge assistant, and a diagnostic assistant. Each of these uses different data and different strategies to get answers to help you manage your databases. It's another giant area we're investing in.

In addition to that, we're also transforming data development using AI. In fact, we're transforming every step in the data development workflow, and I'm going to go quickly through each of these steps to give you an idea of the benefits.

The first step in data development is creating a schema: you've got to go create your tables. Now you can easily generate a schema from a natural language description with the SQL Developer AI assistant; you just tell it what you want, and it generates the schema.

The next step is a new one: you need to declare the intent and semantics of your schema, your tables and columns, so that LLMs can understand them. If all an LLM has to go on is cryptic column names or cryptic comments, it's not going to do a good job. So we've added something called annotations to the database. You can annotate your tables, your columns, and your procedures to tell AI what their semantics are: what does this data do, what does it mean, when do you use it? Annotating schemas is a really important thing to do. Those of us who have been around databases for decades know that SQL tuning used to be the thing we did; the new version of that in the AI world is annotating your data, describing it to the LLM. And again, Kris Rice is going to demonstrate a tool we've developed that you can use to annotate your database data and schema.

>> We're going to annotate the schema in this database. To get started, all we have to do is select AI Enrichment in the tree; it will install the necessary PL/SQL procedures to begin annotation. These procedures have been backported, so they work across Database 19c and 26ai.
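The annotations being created in this demo correspond to ordinary DDL. A minimal sketch using the schema-annotation syntax (the table, column, and annotation text are illustrative, and exact syntax may vary slightly by release):

```sql
-- Annotate a table and a column so LLMs can interpret them correctly.
ALTER TABLE log_shipments
  ANNOTATIONS (ADD Description 'Logistics shipment records; the LOG_ prefix means logistics, not server logs');

ALTER TABLE log_shipments MODIFY ship_dt
  ANNOTATIONS (ADD Description 'Date the shipment left the warehouse');
```

Annotations are free-form key/value pairs stored in the data dictionary, so tools and LLMs can read them back alongside the schema itself.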
We're going to switch to a database where I've already begun annotating and creating AI enrichment. As we can see, the AI enrichment has already begun: there is a group called Human Resources, and there are tables awaiting enrichment. If we open the enrichment dashboard, we'll see that the schema itself has an enrichment saying it is a combination of multiple things, from personnel to accounting to sales to logistics. We also get a dashboard showing how annotated this schema is. There will also be features to import and export this for portability.
If we look at the Human Resources group, we'll see that the PRS tables actually make up the human resources schema. We can see here that these columns have a simple description to make the LLM more aware of what the actual values are. The annotations are free-form, so things like aliases or business terms can also be added. These annotations are a powerful and standard way for developers to declare the intent and semantics of the data, so the LLM is more aware of the data structures and the intent of the data. Here we're making a new group for shipping. These tables are all named LOG underscore, which stands for logistics, but by adding an annotation, the LLM won't confuse LOG with server logs; instead it will know it's for shipping and logistics. In addition, of course, the Oracle AI database also automatically provides high-level data usage information to assist the LLMs.

>> All right, great. So that was creating a schema, and we've annotated the schema; we told the AI what the schema is all about. The next thing is to load data. Now you can easily load data with AI-assisted creation of ETL pipelines: again, you tell it what pipeline you want, and it creates the syntax you need to create that pipeline. The next step: you've got to do some testing. So we've also added AI assistance for generating synthetic test data that closely matches the production data. All right. And
after that, you're ready to go: now we're ready to use the data. We have a feature called Select AI that lets you access the data using natural language. It works with a SELECT statement, but instead of SQL the statement contains a natural language expression. You can see the example on the right: SELECT AI how many employees have not submitted their benefits plan choice. That's a SELECT statement you can hand to the database from any tool. From that natural language question, it goes to the database, figures it out, and returns the answer to you. Then you can ask follow-up questions; you can say show counts by department, and it remembers where it was and adds to that question. Again, you can run this from any tool that accepts SQL. This is also really useful for generating SQL statements as a first draft for developers, and it does much more than this, even.

All right, so we've created the schema and we're now querying the data, but there are a lot of different models and a lot of different parameters to these models. So the other important thing to do is to test and optimize the embedding models, the LLMs, and the parameters for your data and your use case. We've provided a tool for that too, called the Oracle AI Optimizer and Toolkit. And that's the data development workflow.

The other important thing to understand is that all this AI technology is engineered and architected into the database, which means all of Oracle's mission-critical capabilities just work seamlessly. Disaster recovery works, transactions work, security, analytics, parallel SQL, everything works. Why? Because it's all architected together by us. You can just use it, and you don't have to worry about any of these things. And this is technology we've been maturing for decades, so it's very mature technology that you can run in mission-critical databases.
Okay, so that was a very brief overview of Oracle AI Database. As I mentioned, there are dozens of additional AI capabilities, way more than I have time to talk about today: for example, Exadata AI Smart Scan, GoldenGate distributed AI, and dozens more.

So that's the first section of the talk, AI Database: huge advancements that are really changing the way databases work. Now let's move on to AI app dev and talk about what's going on there.
Okay, so this is what we want the future of AI app dev to look like: you say what you want an app to do, and the LLM generates the app for you. Pretty simple. We've all seen videos of this happening; it can generate apps in seconds with thousands of lines of code, which is orders of magnitude faster than we can. So what's the catch? The catch is that these are enterprise apps we're talking about, and enterprises need to be able to trust those apps. It generates thousands of lines of code: how do you know you can trust that? How do you know it's secure, correct, and dependable? You can tell the app, hey, make sure this is correct, make sure this is dependable, but you can't trust that either. You can try to put guardrails around it, but that's trying to fix things after the fact. So what we're focused on at Oracle is designing trust into the core of the architecture, so that it just works and you can really trust it. We call these technologies and best practices, which maximize both innovation and trust by architecting AI, data, and app dev together, generative development for enterprise, or GenDev for short. I'm going to give you a quick walkthrough of what we're doing in this space.

The first thing to know about GenDev is that it's focused on high-level, solution-centric languages like SQL. Why? Because enterprises can't trust thousands of lines of code that no human understands and nobody can evolve.
And code is never static. If you think about it, it's a lot easier to understand 30 lines of SQL than the thousands of lines of code that do the equivalent of those 30 lines. That's why these solution-centric languages are really important, and that's what we're focused on. In the GenDev methodology, we generate solutions using solution-centric languages like SQL and the Open Application Specification language, which I'll talk about later, and not so much traditional code.

All right, so that's the first step: understandability, which is key to trusting the generated solution. But by itself it's not enough; there are other risks we have to deal with, data-level risks and application-level risks, and we have to resolve those risks before we can trust the application. We're going to start by talking about the data-level risks.
There are three big data-level risks you can run into even if you're using SQL. One is that the generated SQL can break data correctness, because it might violate data consistency or the business rules for the data. The second is data evolvability: if the SQL depends on the underlying format of the data, you're not going to be able to evolve that format, so we have to take care of that. And the third, and most important, is data privacy: we cannot allow a user to view another user's data, under any circumstances. I'm going to talk about these three things and the approach we're taking, because establishing this trust with AI is really important.

Okay, so how does GenDev address data correctness and evolvability? We do it by combining SQL with new trusted data APIs for accessing data, and the trusted data APIs use a technology I've talked about here before called JSON-relational duality.
exactly the data it wants. So uh in that little uh screen you see there that's a purchase order and the API would take
all the data in the purchase order everything and either read it or write it from the database uh exactly the data
that that particular application wants. Now the keys here are that these APIs ensure asset consistency for all the
data in that the API accesses. So the entire purchase order has asset consistency and it doesn't use locks
because if you use locks you can abuse locks and lock up the whole database. Another key thing it does is allows you
to validate business rules on the full business objects. For example on a full PO you can define the the rules around
that and it can validate that on every change no matter what. If you change it through SQL, you change one row, one
column, we can validate the entire business object to make sure it's valid. And the third key is that that data API
doesn't expose the underlying schema. So that enables both app independence. You can write different apps without them
interfering with each other and schema evolution. You can evolve the schema under the app. So again, these
technologies are really key to to architect trust into the underlying database so that you don't have to worry
about AI messing it up. All right, so we talked about those. The next one and probably the most important one is data
privacy. This is a huge risk in the age of AI. Why? Because you can't expose other users' private data. You can't expose people's medical records. You can't expose their financial records. You can't expose their buying records. You can actually go to jail for doing that. But the hard part is that the privacy rules are very complex and specific to the role of individuals in each company and each industry. A simple example: if you have an employee record, there are different people in the company who can see different aspects of it. The payroll department can see part of the employee record, the human resources department can see a different part, and managers can see parts of it. So there are very complex rules about who can see what, where, when, and how. And the database has never understood these rules, and this has been the problem. So today these data privacy rules are implemented in applications and enforced by custom application code. And this creates two big AI risks. One is that when you're using AI to generate new apps, how do you get it to enforce the rules? You can tell it, here are the rules, make sure when you generate the apps you enforce them, but you can't be sure that it's going to do it.
The second and even bigger risk is that to really gain from AI, you want AI to be able to directly access the data. And the trouble with that, of course, is that it bypasses these application privacy rules. That MCP example we just saw was a good example of that: the AI was going directly against the database and accessing data. How do you know it's not going to reveal private data? So these are two giant risks that have to be resolved in order to make AI useful in a real enterprise. And the only way this is going to happen, the only way we can fully trust data privacy enforcement, is to implement it down at the source, in the database itself. You can't trust the application, because it can be bypassed, and you can generate applications that don't follow the rules. So what we've done now is build into the database a very sophisticated rules engine that can specify exactly which data each end user in a company can access, and the database enforces these rules. So when an end user uses SQL to access data, or an AI generates SQL to access data for that end user, that end user is only going to see the data that he's supposed to see. He won't see any other data. SQL won't return columns and rows that that user is not authorized to see. So that whole enforcement mechanism is built into the database, where it can't be bypassed. And I think this is absolutely vital for using AI in the future without going to jail because you're exposing private data. Okay, so that's data-level risk. Let's move on
to app-level risks. Now, there are a lot of app-level risks, in three broad categories. One is that LLMs can be tricked into malicious actions or answers. Another is that LLMs can simply misunderstand what you ask for: you ask for something, and maybe what you said is ambiguous, maybe it misunderstands it. And, as we all know, LLMs can hallucinate and make things up, so they can actually make mistakes in generating solutions. All these things are very risky when you're dealing with enterprise data. Now, there are roughly two kinds of applications. One is internal applications that are used by experts who understand the risks and can deal with them. And there's another kind of application that you have to have full trust in: for example, applications that are used by the general public, whom you have no control over and can't train. To deal with that kind of application, you have to limit how AI can be used in the app. And there are two things we have to do: we have to restrict the LLM's inputs, usage, and results, and we have to validate the reasoning behind the results. That's the only way we can really trust AI to generate these things. And this is actually a very sophisticated area, and I'm only going to show an example here today, because it can get more complicated. But I'm going to show you how we use a combination of two things, one called trusted answer search and another called Apex interactive reports, to deliver trusted answers to natural language questions. So you can make sure that the answer you're getting is
correct. Okay, starting with trusted answer search. The way this works is that a user asks a question in natural language, similar to what we saw before. But instead of having the LLM try to answer that question, we use vector search to match the question to pre-created reports. So you might have 100, 200, 500 pre-created reports, and we match the question to the nearest pre-generated report. We don't use the LLM to directly answer the question. Now, you can use LLMs to generate these reports, or to help generate these reports, or to guide the answers. So: natural language question, vector search, and we've matched it to a report. Now Shakiba is going to demonstrate how Apex interactive reports provide trusted answers within the real-time report.
>> Millions of Apex applications rely on interactive reports to help people
explore data on their own. They surface trusted information in a way anyone can customize. And now, with natural language support, you can simply talk to your reports. Let me show you. Show me all employees that are managers. One question, and the interactive report applies the filters for me. What's happening here is important. The AI takes my intent and turns it into report settings I can see and change. It's not generating arbitrary SQL behind the scenes. Everything is applied as transparent filters, so I know I can trust the results. Group these managers by department and show me their team sizes. Now the report organizes them by department, with each team size right there. I can even ask imprecise or ambiguous questions, like: show me high-performing managers with a tenure of four years or more. This is interesting, because high-performing isn't a column in the data. The system interprets my intent, a rating of four or higher, and applies the right filters for me. I can also customize the appearance and shape of my report. Highlight those in the sales department in green. And just like that, the top sales managers are easy to spot. Now, let's pivot to see how departments are staffed: full-time, contract, remote. This gives me a straightforward comparison of staffing across all departments. Now, let's go and visualize this. Show me a chart of full-time versus contract employees. Sometimes I may have a question that goes beyond what interactive reports can do with a single filter, pivot, or group-by. For example: which departments have the highest average tenure, and within those, how many managers are rated outstanding? That's more complex; it requires multiple layers of aggregation. In that case, Apex brings up the analysis assistant. It shows the thinking behind each answer, so I can understand exactly what's happening. The AI still isn't generating SQL. Instead, it interprets my intent and runs safe, deterministic operations in the background. When possible, it refreshes the report with new filter chips. This extends the power of interactive reports to answering open-ended questions with clarity and trust. And from here, I can keep going, asking follow-ups or diving deeper. This is the next chapter for interactive reports in Apex: natural, conversational, trusted. And the best part: millions of existing reports can be AI-enabled just by upgrading.
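The matching step behind trusted answer search, mapping a natural-language question to the nearest pre-created report with vector search instead of letting the LLM answer directly, can be sketched roughly as follows. This is a minimal illustration, not Oracle's implementation: the toy `embed` function stands in for a real embedding model, and the report names and descriptions are invented for the example.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy embedding: bag-of-words term counts.
    # A real system would use a proper embedding model here.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Pre-created, pre-validated reports (names are hypothetical).
reports = {
    "managers_by_department": "managers grouped by department with team sizes",
    "sales_by_region": "top products by sales per region",
    "employee_tenure": "average employee tenure by department",
}

def trusted_answer(question: str) -> str:
    # Vector-match the question to the nearest pre-created report;
    # the LLM never answers the question directly.
    q = embed(question)
    return max(reports, key=lambda name: cosine(q, embed(reports[name])))

print(trusted_answer("show me managers and their team sizes by department"))
```

The key design point is that the answer surface is closed: whatever the question, the user only ever lands on a report that was validated ahead of time.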
>> All right, it's very impressive. It's amazing what AI can do now. And again, these are trusted answers. You know that they're correct. It shows you what it's doing. And it's limited: we've limited the LLM, so it can't do crazy stuff. All right. So, that's trusted answers. We're also applying this GenDev methodology of architecting AI, data, and app together to generate full applications. And we're doing that in the Apex application generator, which is a new generation of Apex. Now, what's Apex? Apex is Oracle's very popular low-code IDE that allows you to visually create apps. Apex creates mission-critical apps that scale to any level, and it's very popular: we have over 3 million Apex apps in production today, and thousands of new ones are generated every day. It's also important that Apex is completely free, both to develop and run. So there's no charge for it. Okay.
Yeah, there we go. We've rearchitected Apex to be an AI-native application generator. What we've done is rearchitect Apex to use a new solution-centric language. Remember I talked about how SQL is a solution-centric language? This one is called Open Application Specification Language, for creating apps. And the idea here is that this allows AI to focus on specifying what an app should do rather than how it should implement it. That makes it much more reliable and much more succinct, and like SQL, this Open AppSpec language is orders of magnitude more succinct and understandable than traditional app dev languages.
So the way this works is that you just describe the app functionality in natural language: the pages, the data that you want, the features that you want. Apex uses the LLM to convert that natural language description into a succinct and understandable Open AppSpec specification for the app. And then Apex simply compiles that specification into a runnable app. And this generated app will use trusted data APIs, the thing I talked about a little while ago, to ensure that its access to data is correct, evolvable, and secure. So that's how this works. It's very different from anything else out in the market. This is a very simple example of this Open AppSpec language. It just specifies a simple report. You can see it just says: here's the name of the report, here's the type of the report. You give it the SQL statement, and it automatically generates a screen for all the data returned by the SQL statement. Okay, that's a trivial example. Okay, so that was a
quick overview of AI app dev. What you've seen is that the key idea of this GenDev methodology is to enable ultra-fast innovation by architecting enterprise-level trust into the data and application stack, so it really can't be violated by the LLM. This is an approach that's unique to us, because we're really focusing on enterprise-capable apps. Okay, so that's the second section. Now, moving on to the next section, which is open AI analytics and our new AI lakehouse. Now, Oracle's always been about open standards, and we continue to be about open standards. In fact, we're bringing open standards and open platforms to the age of AI. Oracle AI for data supports all the leading models and frameworks, every LLM and every framework. Customers can either call them via APIs or deploy them as private instances for added security.
So we're open to all the models, and we're open to running these things everywhere: in all the public clouds, through our multi-cloud strategy, we can run Oracle Database in all the leading clouds. We can run Oracle Database on premises. We can run in the model called Cloud at Customer, where the database runs on premises but in a cloud model. And because we've architected all this technology into the database, you get to choose where you deploy it. Now, this advanced AI capability I've been talking about is not limited to data in the Oracle database. You can use Oracle AI on any of your data stores, using Oracle AI Database's advanced federated query capability to create an AI proxy database. An AI proxy database sits in front of your other databases, and it performs AI actions. A simple example: you say, Select AI, what were the top 10 products by sales per region? That goes to the AI proxy database. The AI proxy database understands these other data sources, on-premises sources and cloud sources, gathers the data it needs, and gives it to the LLM. The LLM produces answers. So this is great for all kinds of environments where the data is not all stored in one database. And one thing it's really good for is running against older versions. You can put this proxy database in front of older databases, older Oracle databases or anybody's older databases. So that's the AI proxy database.
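The proxy pattern described above can be sketched as a small federated dispatcher: the proxy knows where each table lives, gathers the relevant rows from every source, and hands them to a model. This is an illustrative stand-in, not the actual Select AI implementation; the sources, the `ask_llm` stub, and the routing logic are all invented for the example.

```python
# Hypothetical sketch of an AI proxy database: it knows where each
# table lives, gathers the relevant rows, and hands them to an LLM.

# Stand-in "data sources" (in reality: on-prem and cloud databases).
SOURCES = {
    "orders_db": {"sales": [
        {"product": "widget", "region": "EMEA", "amount": 120},
        {"product": "gadget", "region": "APAC", "amount": 340},
    ]},
    "legacy_db": {"sales": [
        {"product": "widget", "region": "APAC", "amount": 90},
    ]},
}

def gather(table: str):
    # Federated read: pull the table from every source that has it.
    rows = []
    for source in SOURCES.values():
        rows.extend(source.get(table, []))
    return rows

def ask_llm(question: str, rows) -> str:
    # Stub for the LLM call; a real proxy would send the gathered
    # rows as context to a model along with the question.
    total = sum(r["amount"] for r in rows)
    return f"{len(rows)} rows gathered, total sales {total}"

def select_ai(question: str, table: str) -> str:
    return ask_llm(question, gather(table))

print(select_ai("What were the top products by sales per region?", "sales"))
```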
In addition, for many years we've supported sophisticated analytics on data in object stores. We've already released things like native Delta Sharing, parallel SQL, and materialized views, all sorts of technologies against data in object stores. Now, and this is something we're announcing at this AI World, we're embracing Apache Iceberg to provide a truly open lakehouse. For those of you that don't know what Apache Iceberg is, it's a standard that defines a common table format and an open catalog for tables stored in object store files. So you store your data in an object store file in this format, and the whole goal is that once it's stored in this common standardized format, any product can access the data, both reading and writing it. So Iceberg tables can be both read and written by a multitude of databases, by things like Spark engines, and directly by Python programs. It's a completely open standard where all vendors are interoperable. Now, these Iceberg tables have a lot of benefits. They do things like data
versioning, partitioning, basic transactions, and basic schema evolution. But these kinds of shared tables have a bunch of drawbacks too: you can't have indexes, they're batch-oriented, transactions can't span tables, security is rudimentary, and storage is slow. So what we're bringing to market is a new version of our database, a new product called Oracle Autonomous AI Lakehouse. The idea is to provide the best of both worlds and resolve a lot of these trade-offs. What we're delivering is the vendor independence of Apache Iceberg, so we can read and write this vendor-independent data, with all the power of the Oracle AI database. And in fact, as of today, the 23 AI data warehouses that you already have are automatically converted into AI lakehouses. So now they're enabled to access all this data in a lakehouse in Apache Iceberg format. So with this Autonomous AI Lakehouse, we're providing all of
Oracle's best-of-breed analytics and SQL on top of all your data lake data. So what do you get? You get the AI vector search I talked about earlier, for fast indexed semantic search on all the data lake data. You get the richest analytic SQL, Oracle's, which handles advanced relational but also graph and JSON on the same Iceberg data. You get Exadata performance, because we will automatically cache frequently accessed data from the data lake in Exadata and then access it with Exadata performance, while keeping it completely consistent. You get scale-out data access with a new technology, a pay-per-use serverless data lake accelerator. And you get the secure data access that I talked about earlier, which ensures privacy and provides comprehensive security, governance, and sovereignty. Now, along with that, you also get a federated catalog of catalogs that comes with it. What that does is provide unified discovery and access to data across Iceberg and dozens of other data stores. So it's a catalog of catalogs: it finds all your data and catalogs it. And what it enables is plug-and-play access to Iceberg data from Oracle SQL. You just name the Iceberg tables and their catalog in the FROM clause of a SQL statement, and it finds them and accesses them. You can see the example there, where you just reference the catalog, schema, and table. And this works with any Iceberg catalog, whether it's our catalog or the Databricks catalog or the Snowflake catalog. So no matter who catalogs the data, we can find it, we make it really easy to access, and we can both read and write that data. Okay. Not only can you find that data,
you can also load data into Iceberg. We've adapted our industry-leading ETL and industry-leading GoldenGate replication products so that we can easily move data from your existing systems, both operational and analytic systems, into completely open Iceberg format, in real time.
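The real-time movement described here, operational changes replicated continuously into an open table format, boils down to a change-apply loop. The sketch below is a generic change-data-capture caricature, not GoldenGate's actual mechanics; the event shapes are invented for illustration.

```python
# Minimal change-data-capture sketch: apply insert/update/delete
# events from a source system to a target table kept in an open
# format. Event shapes here are invented for illustration.

target = {}  # key -> row, standing in for an Iceberg table

def apply_change(event: dict) -> None:
    op, key = event["op"], event["key"]
    if op == "delete":
        target.pop(key, None)
    else:  # insert and update are both upserts on the target
        target[key] = event["row"]

events = [
    {"op": "insert", "key": 1, "row": {"id": 1, "qty": 5}},
    {"op": "insert", "key": 2, "row": {"id": 2, "qty": 7}},
    {"op": "update", "key": 1, "row": {"id": 1, "qty": 9}},
    {"op": "delete", "key": 2},
]

for e in events:
    apply_change(e)

print(target)  # {1: {'id': 1, 'qty': 9}}
```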
Okay. So that's a big deal, this whole Iceberg open data lake. In this section, what I've talked about and shown is the power and simplicity of Oracle's open, architecturally converged, everything-works-together approach to AI. And you've seen that this enables you to deliver trusted AI insights, innovations, productivity, and apps for all your data: data in Oracle, data in other stores, data in the object store, data in data lakes, data in third-party databases. And one of the unique things is that we're running the same AI across everything. So whether it's operational data or data lake data, you get the same AI capabilities, so you can standardize and simplify your AI estate. And we're also continuing what we announced last year, which is that all these AI capabilities I've talked about are included in Oracle Database for free, at no additional charge. So you have these already if you have Oracle licenses. No additional charge. You already own everything I've
talked about. Okay. So that was the AI lakehouse. Now we're going to transition to our last topic, which is the Oracle AI Data Platform. And to describe that, I'm going to invite Oracle EVP TK Anand to the stage to talk about the Oracle AI Data Platform and how it extends the Oracle AI database that I just talked about by providing an integrated platform to bring all your enterprise data together and use AI to solve real-world business problems. So please welcome TK Anand.
Please welcome to the stage Oracle's executive vice president of AI Data Platform, TK Anand. Welcome, TK.
>> Hi. Hi, everyone. I hope you're all enjoying AI World.
I'm TK Anand. I'm excited to be here and talk to you about the AI Data Platform. You just heard Juan Loaiza talk about the Oracle AI database and all of the amazing AI innovation that's going on there. The AI Data Platform builds on top of the database and provides an end-to-end platform for data analytics and AI. All right, let's get into it. So AI is driving the next industrial revolution, and we know it's going to disrupt every industry. It's just a matter of time, months or years, not decades. The organizations that are going to survive and thrive in the AI era are those that can augment and reinvent every aspect of their business with AI agents. We've all seen what foundation models can do with the massive corpus of public domain data they've been trained on, but they know very little about your organization and your business. The key to achieving AI transformation is to get these models to understand your enterprise data, your business applications, and your workflows. So the
Oracle AI Data Platform is a comprehensive platform that brings your enterprise data together with industry-leading foundation models to help you build agentic applications and experiences for your business users. Firstly, it provides a data foundation that brings all your enterprise data together under one roof. This includes all your databases and applications, your structured and unstructured data, your historical and real-time data. All of this data is made AI-ready through a unified catalog, through semantic enrichment, vectorization, and so on. And then we have an AI platform on top of that data that enables developers to build AI applications and agents. The platform includes foundation models, AI frameworks, and developer tools. We also have agentic experiences for business users to consume these AI solutions. So you can think of the AI Data Platform as these two systems working in concert with each other: getting your data ready for AI, and then leveraging AI to transform your business. Next slide. All right. What is the data
platform? Let's look at the key capabilities. At the core of the data platform is an open lakehouse, built on open standards, and there's a unified catalog that brings all your data and AI assets together to enable integrated security and governance. On top of the data, we offer the best of Oracle and open-source data engines. You already heard Juan talk about the Oracle AI database, and that's core to this platform. In addition, we also offer Apache Spark and Flink, which are popular open-source engines for working with these data lakes. And then we have industry-leading AI models like OpenAI's, and we have AI frameworks like LangChain. All of these are built into the platform. And then, on top of this data and AI foundation, we have an integrated developer workbench that supports a variety of data analytics and AI use cases. And then we have an agentic user experience for business users to work with AI agents using interfaces like chat, data visualizations, business workflows, and so on.
So the AI Data Platform is a brand new PaaS service in OCI, but it integrates multiple underlying services together into a cohesive experience. It relies on OCI infrastructure services such as CPU and GPU compute for data processing, model training, and inferencing. It uses object storage to manage all of your structured and unstructured data in the lakehouse. It leverages Autonomous Database for all the amazing capabilities that Juan talked about, most notably high-performance query processing and data retrieval for AI applications, and the vector store for the RAG agents that Juan talked about. It leverages the OCI Generative AI service for all of the foundation models like OpenAI's. It leverages Oracle Analytics for its semantic modeling and data visualization capabilities. And finally, the AI Data Platform comes with a set of built-in services for things like the open-source Spark and Flink engines, the unified catalog, the developer workbench, and the business user experience. One of our goals in creating the AI Data Platform was to eliminate the developer effort required to wire all these products and services together, so that developers can instead just focus on building the solution. All right. Now, let's talk about how
you get your data ready for AI using the AI Data Platform. The AI Data Platform comes with an enterprise-grade data lakehouse that brings all your data together under one roof. You can bring data from all your databases and applications into the lakehouse. Like I said, it's structured and unstructured data, historical and real-time data, all made accessible through a unified catalog. You can ingest data into the lakehouse using a variety of techniques. You can use batch ETL or ELT pipelines. You can use streaming data pipelines using GoldenGate or Kafka. You can also just leave your data in the source system and access it live through the catalog, using zero-copy data integration. And then you can implement a medallion architecture on top of the lakehouse, which is a pretty common pattern. So, for example, all your source data from the source systems ends up populating the bronze layer. The silver layer is populated by transforming, cleansing, and enriching your data. And the gold layer typically represents the most curated version of the data, the one you use for your analytics, your AI applications, and so forth. The AI Data Platform supports open formats, namely Iceberg and Delta Lake, for managing your data in object storage for the bronze and silver layers. The gold layer is typically managed natively in the Autonomous Database for high-performance retrieval within analytics and AI applications. The AI Data Platform makes it really simple to
manage this medallion architecture, thanks to the unified catalog that provides integrated security, governance, and lineage across all layers. Now, on top of this lakehouse foundation, the AI Data Platform offers a developer workbench where developers can implement all their solutions in one environment. This includes projects like data integration, data engineering, data science, agent development, and so on. The workbench has an AI-assisted notebook interface that supports multiple languages, including SQL, Python, Scala, and Java. We also have a visual drag-and-drop interface for low-code developers. The workbench is integrated with Git to enable source control, versioning, and team development. And you can use the workbench to create jobs that run on any of the supported data engines. So, for example, you might create a job that runs on the Autonomous Database or on a Spark cluster, or you can create jobs that run across multiple of these data engines. So now let's take a look at the AI Data Platform's developer workbench.
Here's a quick demo.
Oracle AI Data Platform unifies AI models, enterprise data, and an intelligent developer experience to turn data into value. Meet Jordan, a data engineer at a leading telecom company tasked with predicting customer churn using customer profiles and customer reviews. In her AI Data Platform workspace, Jordan opens a notebook with direct access to her organization's data catalog, which manages all assets using a medallion architecture. From the bronze layer's object storage volume, she drags and drops the customer reviews file to begin exploring the data. With AI Code Assist, she joins customer reviews with detailed customer profiles, bringing together structured and unstructured data into a single data set.
Next, with just a drag and drop, Jordan instantly receives sample code that demonstrates how to call an OCI GenAI model. This ready-to-use code gives her a clear starting point. With a few quick edits, she customizes it to analyze the sentiment of each customer review. Then she writes additional code to save the results into a new table in the silver layer. AI Code Assist can also create code comments and generate full documentation explaining the code's purpose, logic, and use cases for developers. With profiles and sentiments processed in the silver layer, Jordan applies a machine learning model from the catalog to predict churn risk. Again, she uses AI Code Assist to run the model and save the output to the gold layer, powered by Oracle's Autonomous AI Lakehouse.
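Jordan's pipeline, raw reviews to sentiment to churn scores, follows the medallion pattern described earlier. Here is a toy version, with a keyword-based "sentiment model" and a threshold "churn model" standing in for the real OCI GenAI and ML calls; the data and logic are invented for illustration.

```python
# Toy medallion pipeline: bronze (raw) -> silver (enriched) -> gold
# (curated). The sentiment and churn "models" are trivial stand-ins.

bronze = [
    {"customer": "a1", "review": "great service, very happy"},
    {"customer": "b2", "review": "terrible support, slow network"},
]

def sentiment(text: str) -> float:
    # Stand-in for an OCI GenAI sentiment call.
    negative = {"terrible", "slow", "bad"}
    hits = sum(word in negative for word in text.split())
    return -1.0 if hits else 1.0

# Silver: cleansed/enriched rows with a sentiment score attached.
silver = [dict(row, sentiment=sentiment(row["review"])) for row in bronze]

# Gold: curated churn predictions (a threshold model as a stand-in).
gold = [
    {"customer": row["customer"], "churn_risk": row["sentiment"] < 0}
    for row in silver
]

print(gold)
```

Each layer is derived from the previous one, which is what lets a unified catalog track lineage from raw file to curated prediction.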
To validate the results, Jordan simply switches the notebook cell's language to Select AI, pointing directly at the gold layer. Then she drags and drops the table and asks Select AI how many customers are likely to churn. Autonomous AI Lakehouse generates the query, runs it, and delivers the answer. In just a few steps, Jordan has transformed raw unstructured data into predictive insights by combining GenAI, custom ML models, and enterprise data. Finally, Jordan schedules the notebook to run automatically when new reviews arrive, with notifications on completion, providing end-to-end orchestration with ease. Oracle AI Data Platform empowers teams to turn data into insights faster, smarter, and at enterprise scale, driving innovation across the business.
>> All right. Now let's talk about how you leverage all of that enterprise data in the lakehouse to build AI applications for your business.
The AI Data Platform comes with a comprehensive set of AI models and frameworks that are ready to use out of the box. We have industry-leading foundation models such as OpenAI's models, Grok, Llama, Cohere, and soon Gemini. We have popular AI frameworks like PyTorch, TensorFlow, LangChain, and LangGraph, also integrated into the platform. And these models and frameworks run on highly optimized CPU and GPU compute shapes in OCI. On top of these AI models and frameworks, the AI Data Platform has an agent studio that's integrated into the developer workbench. The agent studio supports a variety of use cases, such as gaining insights from data, semantic search over unstructured documents, orchestrating workflows, monitoring business processes, raising alerts, and so on. So let's see a quick demo of the agent studio in the AI Data Platform.
Oracle AI Data
platform reimagines how enterprises build and deploy AI applications. It brings together advanced AI models, a
modern developer experience, and enterprisegrade data security and governance.
Meet Alex, a developer at a soft drinks company. His task, build an AI agent to monitor new government regulations such
as restrictions on packaging or labeling. Inside the AI data platform workbench, Alex creates an agent flow.
He begins by setting the trigger. In this case, a daily schedule that executes the agent automatically.
Next, Alex adds an agent and starts assembling the tools it needs. First, he connects a rag tool to the agent and
assigns a regulation knowledge base that was built with Oracle 26 AI. This allows the agent to scan newly published
regulations, compare them to existing rules, and instantly spot any new proposals. With Oracle 26 AI vectorizing
the content, the agent has precise, reliable access to regulatory knowledge. Then, Alex adds an NL2 SQL tool. This
allows the agent to query the CX and supply chain systems to determine which products could be impacted by the new
rules, whether those involve packaging, labeling, ingredients, or other characteristics. To capture the details for auditing and analysis, Alex adds a SQL tool that writes a new entry into the company's regulations database.
Finally, he connects Slack and email outputs so the relevant product and supply chain managers are notified as
soon as a regulation is detected. With just a few clicks and minimal coding, the agent is ready for testing.
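The flow Alex assembles is essentially a trigger plus an ordered chain of tools (RAG lookup, NL2SQL query, SQL write, notifications). As a rough illustration only — the class names, tool names, and outputs below are invented for this sketch and are not Oracle's actual APIs:

```python
# Hypothetical sketch of the agent flow from the demo. The AI Data Platform
# builds this visually; every name here is made up for illustration.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Tool:
    name: str
    run: Callable[[dict], dict]  # takes the flow state, returns updates

@dataclass
class AgentFlow:
    trigger: str                 # e.g. a daily schedule
    tools: list[Tool] = field(default_factory=list)

    def execute(self) -> dict:
        state: dict = {"trigger": self.trigger}
        for tool in self.tools:  # run tools in the order they were wired up
            state.update(tool.run(state))
        return state

# Stub tools standing in for the RAG, NL2SQL, SQL-write, and notify steps.
rag = Tool("rag_search", lambda s: {"new_regulations": ["CA plastic-bottle limits"]})
nl2sql = Tool("nl2sql", lambda s: {"impacted_products": ["500ml PET bottle"]})
sql_write = Tool("sql_write", lambda s: {"audit_rows_written": len(s["new_regulations"])})
notify = Tool("notify", lambda s: {"alerts_sent": ["slack", "email"]})

flow = AgentFlow(trigger="daily@06:00", tools=[rag, nl2sql, sql_write, notify])
result = flow.execute()
print(result["impacted_products"], result["alerts_sent"])
```

The point of the sketch is the shape, not the details: each tool reads the accumulated state and adds to it, which is why the RAG step must run before the audit write that counts its findings.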
From the editor playground, Alex can simulate triggers and inputs, review outputs, and monitor the agent's behavior, accuracy, and performance step by step. Once validated, the agent is deployed to a production environment
where it runs automatically on schedule. Just a few days later, the agent proves its value.
When California proposes new limits on plastic bottles, the agent identifies the regulation, logs the details in the
database, and sends a Slack alert. The business owner immediately reviews which products are affected and begins
planning next steps. Agents enable organizations to act faster, stay compliant, and uncover opportunities.
With Oracle AI data platform, enterprises gain a foundation for building and deploying trustworthy AI
applications that keep them agile, competitive, and resilient. So the AI data platform has a unified catalog, not just for all of your data assets, but also for all of your data science and AI models, your AI agents, and your tools. All of the AI agents and tools that you build in the developer workbench are automatically registered in the catalog, and they support open interop standards like MCP and A2A. But over time we expect most organizations to have numerous AI agents that run in different platforms and applications. For example, you can have agents running in your SaaS applications like Fusion or Salesforce. You can have agents running within a productivity suite like Office 365. You might have custom agents that you build in a cloud platform like Azure or AWS. But the AI data platform will let you register these agents into its unified catalog regardless of where they run. Now, why is this interesting? Because the AI data platform comes with an agentic experience for business users called the agent hub. In the same way that BI tools and data warehouses enable business users to have a single pane of glass over their organization's data, the agent hub enables business users to have a single pane of glass over the organization's agents. So, for example, the agent hub will let business users navigate and search the catalog of agents and have a conversational interface with agents that enables things like getting access to information, gaining insights from data, initiating tasks, chaining tasks into workflows, creating a team of agents to solve a complex problem, etc. Our vision for the agent hub is to become a cross-platform and cross-application experience for business users to leverage AI agents to automate their work and improve productivity. So,
let's take a look at the agent hub. Oracle agent hub provides business users with a unified experience to interact
with all their agent powered applications and tools, delivering a central entry point for workflows,
tasks, and activities. Right from the start, users see their top tasks, key notifications, and quick
access to frequently used agents along with curated links to enterprise applications. When a user makes a
request, the system automatically identifies the right agents from the catalog and orchestrates them to
complete the task. For example, asking to send a compliance training reminder triggers HR agents to check completion
status. Once it confirms who hasn't completed training, it calls an email tool to send the reminder. The system
even suggests scheduling a daily reminder until the task is complete and sets it up automatically once approved.
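Conceptually, that request handling is a capability lookup over the registered agent catalog followed by a simple orchestration loop. A minimal sketch, with every agent name, capability key, and result string invented for illustration (this is not an Oracle API):

```python
# Toy model of Agent Hub request routing: match a request's required
# capabilities against a catalog of registered agents, run each matched
# agent, and collect the results. All names here are hypothetical.
from typing import Callable

# Catalog of registered agents, keyed by the capability they provide.
CATALOG: dict[str, Callable[[str], str]] = {
    "hr.training_status": lambda req: "3 employees pending compliance training",
    "email.send": lambda req: "reminder email queued",
    "sales.forecast": lambda req: "EMEA demand forecast ready",
}

def orchestrate(request: str, capabilities: list[str]) -> list[str]:
    """Run every cataloged agent whose capability the request needs, in order."""
    results = []
    for cap in capabilities:
        agent = CATALOG.get(cap)
        if agent is None:
            raise KeyError(f"no agent registered for capability {cap!r}")
        results.append(agent(request))
    return results

# "Send a compliance training reminder" needs an HR check, then an email tool.
out = orchestrate("send compliance training reminder",
                  ["hr.training_status", "email.send"])
print(out)
```

The design choice the demo highlights is that the catalog, not the user, resolves which agents participate, which is what lets agents registered from other platforms join the same workflow.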
The value comes alive even more with complex questions. A business user might ask, "Will EMEA be able to fulfill sales of our top five products next quarter while keeping overstock under 20%?" Agent Hub identifies the sales and supply
chain agents, pulls together their insights, and delivers a consolidated answer. The response can include
narratives, visual data, recommended actions, and links to applications. Because the reasoning behind the answer
is transparent, users can review the details, verify the outcome, and build confidence in the system. From there,
Agent Hub goes beyond analytics and into action. The user might ask, "How can we address the identified shortfall?" The
system then coordinates across agents to propose a supply chain strategy. It generates an actionable plan that can
be approved in one step or provides links into the supply chain application so leaders can review purchase orders
and other details before moving forward. By combining intelligence with execution, Agent Hub helps organizations
not only understand their business but also act on it immediately.
All right. So, let's recap. We've gone through all of the layers of the AI data platform, starting from the data and AI foundation to the developer workbench, the agent studio, and finally the agent hub experience for business users. I'm happy to announce that the AI data platform is now generally available. Many of the product features and capabilities that I've shown you today are already available in the product, while some, like the agent hub, are still in development and should be released over the next 12 months. Now, leading up to general availability, we worked with several customers who used the product as design partners and gave us valuable feedback, and here are two customers for which we'd like to share a brief video. This is University College Dublin and Clopay.
>> My name is Colin McMahon. I'm the executive director of the UCD Clinical
Research Center. Our mission at the UCD Clinical Research Center is to translate research into impact for better patient
care. The challenge we faced previously was unlocking significant amounts of unstructured and semi-structured data and being able to combine that with open data sets to, in this instance, deliver a use case around better respiratory care, specifically patients suffering with chronic diseases. We've been working with Fertise for several years now. They brought expertise on the Oracle AI data platform. They manage the governance and the compliance aspects and were able to prototype a demonstrator within a number of weeks. In the pilot, we were able to use fully synthetic data and open data sets to develop a decision support tool that ultimately we'll be able to scale in a fully governed and compliant way. So the Oracle AI data platform has allowed us for the first time to take stock of what's possible; it's allowed our clinicians to understand the art of the possible. The benefit of the Oracle AI data platform is that it provides a governed, fully integrated, and scalable solution. We're excited about the potential to take these unstructured data sets and convert those to structured data sets to allow us to perform longitudinal analysis, and the impact that that might have on patients but also on clinicians in terms of how their time can be managed by better understanding how patients can be managed in the community using this data. [Music]
>> Clopay is the largest garage door
manufacturer in the US. We are based out of Ohio. We have multiple manufacturing locations all within the US. We are
basically a mass manufacturer of custom products, millions of SKUs. Every door is made uniquely for our customers.
Prior to implementing Oracle AI data platform, we had to get to this information manually, pulling that
information from OBI into spreadsheets and manipulating spreadsheets to really look at different SKUs, geographies,
pricing to really understand what the issue was. Using Oracle AI data platform, we are able to predict our
dealer churn much more accurately. It's a more data-based decision. This helps us uncover exactly what is going on and
it's a great tool. It really drives our bottom line and also the predictability. It helps us understand where they're
going. It's not only looking back, it also gives us a view into the future. Using Oracle AI data platform was
natural to us. Clopay is an Oracle shop, and for us this is a natural extension of continuing to use Oracle tools in AI to help us improve our business performance. [Music]
All right. The rapid growth we've seen in the Oracle cloud over the past few years has only been possible due to the support and commitment of our partner community. And we've been working with a number of our global partners to get them ready for the AI data platform launch. I'm happy to note that some of these global partners have committed over $1.5 billion towards comprehensive training and development of industry use cases and solutions. So here's a brief video showcasing the work that these partners are doing towards the AI data platform.
Oracle AI data platform represents an exciting leap forward for organizations looking to reimagine what's possible
with data and AI. I believe it's ideal for enterprises seeking to modernize their data and AI capabilities.
At Infosys, Oracle's AI data platform is a top strategic priority for investments, for talent development, and
for growth. Infosys has invested over 140 million USD in R&D during FY25, as stated in our latest annual report, and plans to make significant investments in Oracle's AI data platform capabilities over the next few years. Today I'm excited to share that we are building several industry use cases that leverage generative or agentic AI. These use cases, built on Oracle's AI data platform, will be part of Infosys Topaz, our AI-first offering. We remain dedicated to
empowering our customers to achieve transformative outcomes with AI. For over 30 years, we've been committed
to driving client outcomes, leveraging the power of data, and now more recently with AI. I am super excited that
Cognizant is one of the strategic launch partners of Oracle's AI data platform. We at Cognizant announced a billion
dollar investment in AI last year. Our commitment in shaping the AI and Agentic journey for our enterprise clients is
steadfast. We see this AI data platform as a strategic component of this journey that our clients are on. Our goal is to
train over a thousand associates within the next 24 months. We're working with Oracle on joint offerings and use cases in industries as broad as manufacturing, retail, technology, travel, hospitality, and utilities. We also aim to create over 50 industry-specific agentic AI use cases that leverage both the Oracle AI data platform and our own agent
foundry. We are thrilled to see Oracle announce its new AI data platform, a major milestone in enterprise AI innovation. This advancement will allow Accenture's clients to unlock the full value of Oracle's capabilities from day one. Since our $3 billion commitment in 2023, Accenture has been heavily investing in AI technology and upskilling our people so that we're ahead of the curve for innovation. We've been excited to walk this path hand in hand with Oracle. Quite frankly, as partners, we are embedding Oracle capabilities into Accenture's AI Refinery, a modern unified framework built on Oracle cloud infrastructure and powered by NVIDIA.
For years at KPMG, we've been focused on helping our clients make better decisions and automate business
processes through the use of data, AI, and analytics. We're very excited about the release of Oracle's new AI data
platform, allowing us to further use data and AI to help our clients make better business decisions and automate
business processes. We're also going to utilize that data in the form of industry analytics where we've worked
with our industry experts in KPMG and created a way to very quickly access that information as we continue to focus
on adding valuable solutions to our clients. We anticipate investments of $200 million over the next 3 to 5 years
as well as training an additional 1,600 employees globally. We are empowering organizations to accelerate success using data and AI, especially in decision-making processes against the metrics that matter the most. We are extremely proud to be a partner of Oracle at the launch of this new AI data platform. As part of our billion-dollar commitment to AI initiatives, Oracle's AI data platform will serve as one of the key cornerstones in our AI strategy. At PwC, we have thousands of consultants across the globe equipped to drive the strategy and deliver with Oracle's latest AI technologies. Oracle's AI data platform will be a core enabler of our Agent Power Performance solution, PwC's flagship IP that combines AI agents, automation, and analytics.
analytics. Leaders often wonder why do my AI pilots stall and value takes so long. The
answer is usually siloed data, lagging integration and timewasted managing different tools which drives up costs
and compliance risk. Oracle's new AI data platform changes this by uniting data governance, analytics and AI in one
solution. Now with LTI Minry's transistor, you can quickly migrate KPIs, models, and pipelines to AIDP and
gain value from day one. Today, I'm happy to announce that we're investing over 200 million and training thousand
plus experts over the next 2 years to support this journey and unlock value for our customers.
>> All right. Before I wrap up, I want to make an important point about the AI data platform and how it relates to our SaaS applications. For all of our major application suites, like Fusion, NetSuite, Health, and Life Sciences, we will offer a tailored version of the AI data platform that comes with pre-built integration with the SaaS applications. This includes data pipelines, lakehouses, business semantics, analytics, and of course AI agents. This is just a natural evolution of our existing products like Fusion Data Intelligence and Health Data Intelligence. And the idea is that you can get started with the AI data platform with immediate value for your business users and still have the full power of the platform in OCI to extend and customize the solution.
All right. That's about it. I just want to wrap up by saying the AI data platform is a comprehensive and integrated platform for all your data, analytics, and AI use cases. It's built on a lakehouse architecture supporting open data formats. The unified catalog enables end-to-end security, governance, and lineage. We bring together the best of the Oracle AI database and industry-leading open source engines like Spark to give you high-performance and scalable processing on your data. We have industry-leading AI models like OpenAI and Gemini already integrated into the platform. Oracle apps customers can get started with the pre-built AI data platform solutions that I just talked about. We have a number of sessions at AI World that get into the technical capabilities of the AI data platform, so please check those out. And I just want to reiterate what Juan Loaiza said at the start of this keynote: the AI revolution is here, and your organization needs you to step up and be an AI leader. The Oracle AI database and the AI data platform can help you in this transformation journey. Please scan these QR codes to learn more, and thank you.
Oracle's AI Database introduces an AI-native architecture that integrates AI deeply with existing databases, using AI vectors to represent semantic content like documents and images as numeric vectors for fast similarity search. This enables more effective retrieval-augmented generation (RAG) for precise AI answers while combining vector and traditional searches, ensuring secure and reliable data handling for enterprise applications.
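The core mechanism behind that summary is similarity search over embedding vectors: content is ranked by how close its vector is to the query's vector rather than by exact keyword matches. A minimal illustration with toy 3-dimensional vectors (real systems use model-generated embeddings with hundreds or thousands of dimensions; every number and document name here is made up):

```python
# Toy cosine-similarity search: rank "documents" by how close their
# embedding vectors are to the query vector. The 3-d vectors are
# invented; real embeddings come from a model and are much larger.
import math

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    # math.dist(v, origin) is the Euclidean norm of v
    return dot / (math.dist(a, [0.0] * len(a)) * math.dist(b, [0.0] * len(b)))

docs = {
    "packaging rule": [0.9, 0.1, 0.0],
    "labeling rule":  [0.7, 0.3, 0.1],
    "tax guidance":   [0.0, 0.2, 0.9],
}
query = [0.8, 0.2, 0.0]  # stand-in for the embedding of "bottle packaging limits"

# Sort documents by similarity to the query, most similar first.
ranked = sorted(docs, key=lambda d: cosine(query, docs[d]), reverse=True)
print(ranked[0])  # most semantically similar document
```

In a RAG pipeline, the top-ranked documents retrieved this way are handed to the model as grounding context, which is what lets the combined vector-plus-traditional search produce answers tied to actual business data.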
GenDev focuses on solution-centric languages like SQL and Open Application Specification Language (Open AppSpec) to create understandable, trustworthy AI applications. It embeds data privacy rules directly in the database, offers trusted data APIs with JSON-relational duality for consistent access, and provides tools such as Apex AI Native App Generator to convert natural language app descriptions into secure, evolvable enterprise apps.
Oracle's Autonomous AI Lakehouse supports open standards like Apache Iceberg for multi-cloud data formats, enabling seamless cross-platform data access. It combines these open data formats with Oracle's optimized SQL, indexing, data caching via Exadata, and scalable serverless data access. Additionally, it features a federated catalog for unified discovery across multiple data stores and supports real-time data ingestion through integrations with GoldenGate, Delta Sharing, and Kafka.
Oracle enforces enterprise-grade security and privacy at multiple levels, including database-level controls that embed data privacy rules to prevent unauthorized data exposure. Their AI Data Platform provides end-to-end lineage, fine-grained access control, auditing, and compliance measures, ensuring that AI-generated answers and applications are trustworthy, secure, and compliant with enterprise standards.
Oracle's AI Data Platform offers a unified data foundation combining structured and unstructured data with historical and real-time inputs under an AI-ready catalog. It includes a developer workbench integrating popular AI models and frameworks like OpenAI and LangChain alongside open-source engines such as Spark and Flink, enabling seamless AI app and autonomous agent development within a low-code environment.
Businesses across industries such as healthcare and manufacturing use Oracle's AI Data Platform to improve decision-making, operational efficiency, and predictive analytics. Oracle’s extensive partner ecosystem and multi-cloud support help organizations implement AI use cases confidently while maintaining data security and compliance, driving measurable value and innovation in enterprise AI transformation.
Heads up!
This summary and transcript were automatically generated using AI with the Free YouTube Transcript Summary Tool by LunaNotes.