As enterprise AI adoption accelerates, the partner ecosystem is undergoing significant structural changes. Revenue models tied to traditional analytics, reporting, and large-scale data warehousing are evolving in response to rising demand for AI-led applications, domain expertise, and modern data architectures. This shift highlights a need to redefine value across data, analytics, and application delivery.
“When we go and talk to customers, and this is something our partners also see, the biggest area of focus and spend today is AI applications. However, customers also understand that to build these agents and applications, data is at the core,” said Srikanth Gokulnatha, senior vice president of AI Data Platform, Analytics, and Analytical Applications Products at Oracle, during an interaction with CRN India. In B2B use cases, enterprises face the challenge of combining private enterprise data with frontier large language models, a problem that extends beyond traditional data warehousing.
Unlike previous data warehouse projects, which primarily focused on structured datasets, current AI-driven use cases require handling both unstructured and transactional data. Gokulnatha noted that nearly 70 percent of the effort in building AI agents is tied to data engineering, making data the first hurdle organizations must navigate.
For partners, the emphasis is shifting as organizations replace traditional data workloads with enterprise lakehouse models to modernize architecture and support AI applications. While customers are eager to implement AI, they often struggle to identify specific use cases. This gap presents an opportunity for partners to provide domain-focused agent portfolios and structured go-to-market approaches.
Traditionally, the partner playbook in AI centered on data preparation, where partners prepared enterprise data first and defined use cases later. However, Gokulnatha observed a shift where partners now engage at the business requirement stage, starting with the problem before building AI solutions around it. “There are two approaches,” he explained. “One is the bottom-up approach, which is getting the data ready, identifying the use cases, and then building the solution. But increasingly, I see partners starting with the business problem first.”
This evolution in approach reflects the rapid advancement of AI capabilities. Solutions crafted around the constraints of last year’s models may quickly become obsolete. As frontier models improve—particularly in areas like coding and natural language processing—capabilities once deemed complex, such as NLP-to-SQL translation, are becoming more automated. As a result, partners cannot afford static solution blueprints but must continuously adapt.
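As a rough illustration of the NLP-to-SQL pattern mentioned above, the translation step can be framed as little more than a schema-aware prompt to a frontier model. This is a minimal sketch, not any vendor's implementation; the `call_llm` helper named in the comment is hypothetical.

```python
# Hedged sketch: framing NLP-to-SQL as a schema-aware prompt to an LLM.
# The model call itself is left as a hypothetical `call_llm` helper.

def build_nl_to_sql_prompt(schema_ddl: str, question: str) -> str:
    """Combine a table schema with a natural-language question so a
    model can translate the question into a SQL query."""
    return (
        "You are a SQL assistant. Given this schema:\n"
        f"{schema_ddl}\n"
        f"Translate into a single SQL query: {question}\n"
        "Return only the SQL."
    )

# Illustrative schema and question (not from the article):
schema = "CREATE TABLE orders (id INT, region TEXT, total NUMERIC, ordered_at DATE);"
prompt = build_nl_to_sql_prompt(schema, "Total sales by region last quarter")
# sql = call_llm(prompt)  # hypothetical model call, API not specified here
```

The point of the sketch is that the hard part has shifted: the prompt scaffolding is trivial, while grounding the model in accurate, well-engineered enterprise data is where most of the effort goes.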
Gokulnatha pointed out that larger firms are increasingly fielding teams of “forward deployed engineers,” embedded closely with customers to define AI-driven business outcomes and to iteratively build, test, and refine domain-specific agents. This structural shift necessitates reskilling within partner organizations.
While 70 percent of AI project work remains rooted in data engineering—a domain where many partners already excel—the real investment is shifting toward building competence in agentic AI and advanced data science capabilities. Partners are establishing internal training programs and collaborating with platform vendors to enhance these skills. “Where partners need help, and where they are investing in building internal competence, is around how to use agentic AI, and how to deepen and broaden their knowledge of data science and related areas,” Gokulnatha said.
To support their AI strategies, partners are increasingly setting up Centres of Excellence (CoEs) that serve as both labs and demonstration environments. These CoEs enable partners to build, test, and showcase agents developed for specific business use cases. Gokulnatha noted that this model is gaining traction as a preferred go-to-market approach for AI-led engagements, allowing customers to observe agents in action and assess their potential applications directly.
While traditional analytics and reporting projects are evolving rather than declining, Gokulnatha explained that these capabilities are becoming embedded within broader AI applications. A typical large enterprise relies on around 50 core reports, but the total often swells to 1,500 or more as niche reports proliferate. Maintaining this “long tail” of reports generates operational overhead.
AI-driven conversational interfaces can streamline this process, allowing organizations to keep standardized core reports while replacing less frequently used reports with chat-based access to insights. “The underlying demand for analytics does not disappear. Instead, reporting is being translated into AI-powered experiences,” Gokulnatha stated. He emphasized that while standalone dashboard projects may diminish in relevance, analytics will remain central, now integrated within AI agents and enterprise applications.
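The split Gokulnatha describes, keeping a small set of maintained core reports while answering the long tail conversationally, amounts to a simple routing decision. The sketch below is illustrative only; the report names and the agent hand-off are assumptions, not a real product's behavior.

```python
# Hedged sketch: route the ~50 maintained core reports to dashboards and
# send long-tail, niche requests to a conversational agent instead.
# All report names here are invented for illustration.

CORE_REPORTS = {"revenue_summary", "pipeline_overview", "churn_monthly"}

def route_request(report_name: str) -> str:
    """Return where a reporting request should be served from: a
    standardized dashboard for core reports, a chat-based agent
    for everything in the long tail."""
    if report_name in CORE_REPORTS:
        return f"dashboard:{report_name}"
    return f"chat_agent:{report_name}"

core = route_request("revenue_summary")        # -> dashboard:revenue_summary
niche = route_request("apac_q3_returns_by_sku")  # -> chat_agent:apac_q3_returns_by_sku
```

The design point is that the analytics demand does not disappear; only the delivery surface changes, which is why maintaining 1,500 niche reports can collapse into maintaining one conversational interface over the same data.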
As partners look ahead, Gokulnatha identified two investment priorities for the coming 18 months. The first is modern data architecture, as the adoption of open standards is becoming crucial for managing data through an open lakehouse approach, which is foundational for AI agents and applications. The second priority lies in maintaining a deep and evolving understanding of large language models, where assumptions about model capabilities can change swiftly.
“Patterns such as multi-agent orchestration are becoming more relevant as organizations attempt to move AI into mission-critical workflows,” Gokulnatha added. Deploying AI in these environments requires not just experimentation but architectural depth. As AI adoption deepens, partner revenue will increasingly depend on reusable domain intellectual property and architectural sophistication; partners who combine modern lakehouse foundations with strong agent design capabilities are best positioned to thrive in critical AI deployments.
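At its core, the multi-agent orchestration pattern Gokulnatha refers to is an orchestrator decomposing a request into sub-tasks and dispatching each to a specialized agent. The sketch below uses stub agents and invented names; it illustrates the control flow, not any particular framework.

```python
# Hedged sketch of multi-agent orchestration: an orchestrator runs an
# ordered plan of (agent, task) steps, each handled by a specialized
# agent. Agents here are illustrative stubs, not real model calls.

from typing import Callable, Dict, List, Tuple

def data_agent(task: str) -> str:
    """Stub for an agent that retrieves enterprise data."""
    return f"[data] fetched rows for: {task}"

def analysis_agent(task: str) -> str:
    """Stub for an agent that analyzes retrieved data."""
    return f"[analysis] summarized: {task}"

AGENTS: Dict[str, Callable[[str], str]] = {
    "fetch": data_agent,
    "analyze": analysis_agent,
}

def orchestrate(plan: List[Tuple[str, str]]) -> List[str]:
    """Execute each step of the plan in order, collecting results."""
    return [AGENTS[agent](task) for agent, task in plan]

results = orchestrate([("fetch", "Q3 orders"), ("analyze", "Q3 orders")])
```

In mission-critical workflows, the orchestration layer is where the architectural depth he mentions lives: error handling, retries, and guardrails sit around this loop rather than inside any single agent.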





















































