AI News Bureau
Written by: CDO Magazine
Updated 12:46 PM UTC, March 17, 2026
Schneider Electric operates at the intersection of energy and digital transformation, supplying technologies that help customers manage electricity, automate industrial operations, and run more efficient, resilient facilities, from homes and buildings to factories and data centers. The company positions itself as a partner in sustainability and efficiency, delivering connected products, software, and services across the energy lifecycle.
In Part 1 of the series, Philippe Rambach, Chief AI Officer at Schneider Electric, explored AI’s role in Schneider’s organizational strategy.
In this second part, Rambach continues his conversation with Dr. Julian Schirmer of OAO and HEC Paris, shifting from why AI matters to how it is operationalized inside the enterprise.
Rambach places Schneider Electric’s AI journey on a long timeline. “We have been doing AI almost since machine learning existed,” he says, noting that the company has maintained AI-related research teams since the 1980s, though AI’s role was largely centered on innovation and R&D.
The turning point came in mid-2021, when Schneider’s board chose to accelerate AI. For the organization, the decision was less about chasing a trend than about recognizing that the conditions were ready: the customers, the data, and the company’s ongoing work supporting customer digitization.
As Rambach describes it, the board asked him “to build what we need to deliver AI at scale for our customers and for our employees.”
Responding to Schirmer’s question about governance and reporting lines, Rambach explains that he reports to Chief Digital Officer Peter Weckesser, reflecting a structure that connects AI leadership closely with both IT and the broader digital transformation teams responsible for deploying technology across the organization.
He also highlights an important structural element: the core building blocks for customer solutions reside within the digital organization. The foundational technologies and the core data platform that support customer-facing capabilities are owned by Weckesser’s organization, and AI is embedded within that structure through Rambach’s reporting line. The result, in his view, is a unified digital leadership model.
Within that framework, Rambach distinguishes between customer-facing business lines, which develop products, applications, and solutions, and internal functions like HR, supply chain, and sales that support those activities.
Asked about the core principle behind Schneider’s AI transformation, Rambach returns to a single theme: scale. “To some extent, my team and I have been lucky because we started almost from a white page,” he says. And with that freedom comes an immediate priority: “Our obsession from day one was delivering AI value at scale.”
Rambach argues that prototypes are not the challenge. “With AI, it’s relatively easy to make MVPs,” he says. With generative AI, it becomes even easier to impress: “It is extremely easy to make demonstrations. People call them proof of concept.”
The real challenge, he insists, is what happens after the demo: “The difficulty is the scale.”
To solve for scale, Schneider Electric adopted a hub-and-spoke model, structured to concentrate AI expertise while ensuring it is fused with business and operational knowledge.
“We’re a central AI team,” he says, describing a pool of specialists across data engineering, natural language processing, agents, and more. But he is equally clear that this capability cannot operate as a detached center of excellence. The value comes from combining AI expertise with domain understanding.
A key design choice is where projects begin. “We never start from technology,” Rambach says. “We start from what we want to solve for our customers and employees.”
From there, Schneider organizes delivery through squads or cross-functional teams. The squad includes domain experts, product owners, IT talent, and specialists responsible for training and adoption. For customer-facing work, it also includes the go-to-market layer.
In Rambach’s framing, hub-and-spoke is less about central control and more about building critical mass and shared standards.
Next, Schirmer raises a familiar tension: even with a strong central team, the business must still invest time, resources, and attention, often amid uncertainty and the possibility of failure.
Rambach’s approach is to keep the work grounded in outcomes the business already cares about. “That’s why we start from the business value and the customer value or the internal need,” he says.
He returns to concrete examples: customers want “more energy savings and efficiency,” or an overloaded service organization asks, “Can AI help?” By framing work around such problems, teams can see the vision of what they want to achieve, understand the outcome, and commit to the effort required to reach it.
A second enabling mechanism is a small set of people inside Rambach’s team dedicated to a deep partnership with each business line and function. Their role is to understand what business leaders want to deliver and to manage an 18-month AI roadmap.
This roadmap-driven approach reframes AI as ongoing product and transformation work rather than a string of disconnected experiments. It also creates a structured pipeline: what should be built, what should be delivered, and what needs to be sequenced to reach production.
“We try to avoid seeing AI as an innovation, and treat it as a change like many other changes,” Rambach stresses. That principle shows up in process design. Schneider uses a staged flow from early exploration and ideation through production.
By keeping the same squad accountable end-to-end, Schneider forces itself to confront scale constraints from the beginning: data availability, integration, adoption, and operational realities, rather than discovering them after a POC succeeds.
Rambach’s third pillar focuses on controlling technology sprawl. In an environment where innovation moves quickly, a company the size of Schneider Electric risks fragmenting its technology stack if experimentation is left completely decentralized.
“If we want to be able to scale, we cannot have everyone trying every technology in every direction,” Rambach says.
To address this, Schneider has built a coordinated technical platform that guides technology choices across the organization. The goal is to manage the underlying complexity of enterprise data and systems while allowing teams to access data, run models, and integrate results into products and software without creating disconnected solutions.
“Using APIs, we access data where it resides,” Rambach explains. “The model runs on the platform, and the results are delivered directly to the software or product that needs them.”
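The pattern Rambach describes can be sketched as a simple three-stage pipeline: fetch data in place through an API, run the model on the central platform, and deliver results to the consuming product. The sketch below is purely illustrative; all names and the wiring are assumptions, not Schneider Electric’s actual platform code.

```python
# Hypothetical sketch of the platform pattern: data stays where it resides,
# the model runs on a central platform, and results are pushed to the
# software or product that needs them. All names here are invented.

class PlatformPipeline:
    def __init__(self, fetch, model, deliver):
        self.fetch = fetch      # callable: source_id -> records (data accessed at the source)
        self.model = model      # callable: records -> result (runs "on the platform")
        self.deliver = deliver  # callable: result -> None (pushes to the product)

    def run(self, source_id):
        records = self.fetch(source_id)
        result = self.model(records)
        self.deliver(result)
        return result

# Toy wiring: an in-memory store standing in for a real data API,
# and a mean calculation standing in for a real model.
_data = {"site-42": [10.0, 12.5, 9.8]}
delivered = []
pipeline = PlatformPipeline(
    fetch=lambda sid: _data[sid],
    model=lambda xs: sum(xs) / len(xs),
    deliver=delivered.append,
)
result = pipeline.run("site-42")
```

The point of the shape, not the toy logic, is the key: because every squad wires its work through the same fetch/model/deliver contract, solutions stay connected to the shared platform instead of becoming isolated one-offs.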
At the same time, Rambach acknowledges that the ecosystem of partners and tools evolves rapidly. Schneider therefore updates suppliers and technologies over time, but does so deliberately and in alignment with enterprise standards. As he puts it, the company aims to make “one decision for Schneider,” rather than a series of fragmented decisions across teams.
When asked how Schneider creates portfolio-level synergy and avoids rebuilding the same foundations across separate squads and use cases, Rambach points to a principle drawn from product development.
“We try to build a core platform and specialize as late as possible,” he says.
He illustrates the idea with customer-facing energy optimization. Flexibility optimization may look different for a home than for a commercial building, but the underlying intelligence is largely the same.
“At the core, the AI is the same module,” Rambach explains.
The same logic applies internally to knowledge solutions. Finance, customer care, and supply chain each rely on different knowledge bases, yet the underlying technology stack can still be shared.
For Rambach, the principle is strategic: “Differentiate as late as possible and maintain a shared technology foundation.”
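The late-differentiation idea can be illustrated with a minimal sketch: one shared optimization core, with thin segment-specific adapters applied only at the edge. The function names and the toy capping logic are assumptions for illustration, not Schneider’s actual modules.

```python
# Hypothetical sketch of "differentiate as late as possible": a single
# shared core module, specialized only at the last step per segment.

def optimize_flexibility(load_profile, capacity):
    """Shared core: cap each load at available capacity (toy stand-in logic)."""
    return [min(load, capacity) for load in load_profile]

def optimize_home(load_profile):
    # Home adapter: smaller capacity, same core module.
    return optimize_flexibility(load_profile, capacity=5.0)

def optimize_commercial_building(load_profile):
    # Commercial-building adapter: larger capacity, same core module.
    return optimize_flexibility(load_profile, capacity=50.0)
```

The adapters carry only what differs between segments; any improvement to the core immediately benefits every segment, which is the portfolio-level synergy Rambach points to.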
CDO Magazine appreciates Philippe Rambach for sharing his insights with our global community.