Our world is in the midst of a profound electrical transformation, a foundational shift comparable to the industrial revolution’s transition from steam to electricity or the telecommunication industry’s leap from analog to digital. While this change is driven by many forces — from the electrification of transport to the integration of renewables — it is being supercharged by one in particular: the artificial intelligence revolution.
The computational power required to train and run today’s advanced AI models comes with an almost insatiable thirst for electricity, creating an urgent and unprecedented challenge for our global energy infrastructure. The scale of this demand is staggering. The latest generation of AI-specific hardware, such as GPUs and TPUs, consumes immense amounts of energy, and the cooling required to keep these processors from overheating can account for nearly 40% of a data center’s total power load.
Consequently, a single large AI data center campus can require a power capacity of hundreds of megawatts, sometimes exceeding a gigawatt, rivaling the consumption of an entire city. This is forcing the tech industry and utility providers to plan and build new, dedicated electrical infrastructure, including entire substations, at a pace and scale never witnessed before. This explosive growth in AI is the primary catalyst fueling the need for a rapid expansion of our power grid.
And it is precisely here, at the critical intersection of urgent digital demand and physical grid construction, that the crisis begins. As we race to build the power systems to fuel the AI era, we are colliding with a wall of physical limitations and technological stagnation.
THE LOOMING CRISIS: A 20TH-CENTURY GRID UNDER PRESSURE
For any organization involved in building this vital infrastructure, the problem is immediate and acute. The construction of a new data center or the supporting utility substation faces staggering delays that can cripple a business case before the doors even open. The primary culprit is the extended lead time for essential power equipment. Ordering critical components like large power transformers or medium-voltage switchgear can mean a debilitating wait of 52 weeks, with some estimates now pushing 65 weeks or longer.
These delays have monumental real-world consequences. An AI data center, essential for a company’s competitive edge, might sit idle for months awaiting switchgear, its business case eroding with each passing day. This is not merely a supply chain issue; it is a systemic crisis rooted in decades of technological stagnation. It is exacerbated by a “Great Crew Change” in the specialized manufacturing workforce, as workers holding decades of tacit, hands-on knowledge of building electromechanical devices retire without being fully replaced.
At the very heart of this challenge lies an often-overlooked component: the sensor technology used in current and voltage transformers. This foundational technology — the basic senses of the grid — has remained fundamentally unchanged since Edison’s era. Its inherent physical and manufacturing complexity is the anchor weighing down our ability to build the grid of tomorrow at the pace the AI revolution demands.
The Achilles’ Heel: Deep Flaws of Traditional Sensor Technology
For over a century, the power industry has relied on conventional current transformers (CTs) and potential transformers (PTs). These devices are marvels of electromechanical engineering, consisting of heavy iron cores meticulously wrapped in thousands of copper windings and often submerged in barrels of insulating oil. Yet, the very principles that make them work are also the source of their most significant limitations.
- Their most critical technical flaw is magnetic saturation. A simple analogy illustrates the problem: Imagine trying to measure a flash flood with a one-gallon bucket. Once the bucket is full, you have no idea if there are two gallons or two hundred gallons more water flowing by. The iron core of a CT is that bucket. During a massive short circuit, the sheer magnitude of the fault current completely fills the magnetic capacity of the core. At this point, the CT stops measuring accurately and sends a distorted, clipped signal to the protective relay. This can cause a delayed trip command, allowing immense destructive energy to flow for extra milliseconds, or even cause the relay to misinterpret the fault entirely, leading to catastrophic equipment failure.
- Beyond saturation, traditional instrument transformers are plagued by other issues. They are susceptible to ferroresonance, a complex and dangerous phenomenon of electrical resonance that can produce extreme overvoltages, potentially destroying the transformers and any connected equipment. Their physical nature is a massive constraint; their immense weight and size dictate the entire physical design of a switchgear cabinet or substation bay. The maintenance burden is significant, requiring periodic oil testing for dissolved gases (DGA) to detect internal faults and replacing aging gaskets to prevent environmentally hazardous oil leaks. Finally, safety is a constant concern. An accidentally open-circuited secondary on a live CT can generate lethal voltages, a well-known hazard that makes routine testing and maintenance inherently risky work.
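The bucket analogy lends itself to a quick numerical sketch. The snippet below is a deliberately crude illustration with invented numbers (the CT ratio and saturation level are assumptions, not data from any real device): it shows how a saturated core reports the same clipped value to the relay for a moderate fault and a catastrophic one.

```python
# Illustrative sketch of CT saturation with hypothetical values.
RATIO = 400.0          # e.g. a 2000:5 CT (assumed)
SAT_SECONDARY = 20.0   # secondary amps at which the core saturates (assumed)

def ideal_ct(i_primary):
    """A perfect CT scales the primary current linearly."""
    return i_primary / RATIO

def saturated_ct(i_primary):
    """Crude clipping model: once the core's flux capacity is exhausted,
    the secondary output can no longer follow the primary current."""
    s = ideal_ct(i_primary)
    return max(-SAT_SECONDARY, min(SAT_SECONDARY, s))

# Normal load, moderate fault, severe fault: the saturated CT cannot
# distinguish the last two cases.
for i_primary in (2000.0, 8000.0, 40000.0):
    print(f"{i_primary:>8.0f} A primary -> ideal {ideal_ct(i_primary):6.1f} A, "
          f"saturated {saturated_ct(i_primary):6.1f} A")
```

Note that the 8,000 A and 40,000 A faults produce the same saturated reading, which is exactly the ambiguity that can delay or confuse a protective relay.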
A PARADIGM SHIFT: THE RISE OF LOW-POWER INSTRUMENT TRANSFORMER (LPIT) TECHNOLOGY
A revolution is underway to solve these deep-seated problems. The industry is rapidly shifting to a nimble, intelligent, and inherently safer electronic-based solution: the low-power instrument transformer (LPIT). This innovation, conforming to the IEC 61869 family of standards, is a complete rethinking of power measurement.
Instead of brute-force electromechanics, LPITs employ sophisticated electronic sensing components. They offer a cascade of benefits: radically reduced lead times, lower component costs, vastly simplified installation, and, for the first time, natively integrated temperature sensing.
This is how they achieve it:
- For current sensing. The heavy, saturable iron-core CT is replaced by an elegant Rogowski coil: a simple air-cored coil that senses the changing magnetic field around a conductor. Its output is a voltage proportional to the rate of change of the current, which is then digitally integrated with extreme precision. Because it has no iron core, it is physically impossible for it to saturate, ensuring a linear, highly accurate measurement from normal operation to the most severe fault conditions.
- For voltage sensing. The bulky wound PT is replaced by a precision-calibrated capacitive or resistive voltage divider, stepping down high voltages with exceptional stability and accuracy while avoiding the risks of ferroresonance entirely.
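The current-sensing principle above can be sketched in a few lines. This is an illustrative simulation with assumed values (coil mutual inductance, sample rate, current magnitude), not vendor code: the coil output is proportional to di/dt, and a simple digital trapezoidal integrator recovers the current waveform with no saturation limit.

```python
import math

# Assumed, illustrative parameters.
M = 1e-6          # mutual inductance of the Rogowski coil, henries (assumed)
f = 50.0          # system frequency, Hz
I_peak = 1000.0   # primary current peak, amps (assumed)
fs = 10000        # relay sample rate, Hz (assumed)

n = 200  # one 50 Hz cycle at 10 kHz
t = [k / fs for k in range(n)]
i_true = [I_peak * math.sin(2 * math.pi * f * tk) for tk in t]

# Coil output v = M * di/dt (analytic derivative, for the sketch).
v_coil = [M * I_peak * 2 * math.pi * f * math.cos(2 * math.pi * f * tk)
          for tk in t]

# Digital integrator (trapezoidal rule): i = (1/M) * integral of v dt.
i_rec = [0.0]
for k in range(1, n):
    i_rec.append(i_rec[-1] + (v_coil[k] + v_coil[k - 1]) / 2 / fs / M)

# The reconstruction tracks the true current closely across the full cycle.
err = max(abs(a - b) for a, b in zip(i_true, i_rec))
print(f"max reconstruction error: {err:.2f} A")
```

Because the relationship is linear in the current's rate of change, the same integration works unchanged at twenty times this amplitude, which is the property that eliminates saturation.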
Crucially, an LPIT is a point-to-point device. It uses a standard, shielded Ethernet cable to create a direct, dedicated connection to a protective relay, adhering specifically to IEC 61869-10 and -11 for standardized, reliable communication. This is not a fragile IT network link; it is a hardened, industrial-grade data link, as robust as a dedicated hardwired connection but with the speed and precision of digital communication, ensuring that critical protection functions are isolated from any potential network failure.
The Universal Relay: The App Store for Grid Protection
Working in perfect harmony with LPITs is the universal protective relay. This concept represents another significant advancement, mirroring the smartphone paradigm, in which a single hardware device’s functionality is defined by the software it runs. Instead of requiring dozens of unique relay models for different apparatus types, a single universal relay can now protect motors, breakers, capacitor banks, transformers, and generators, all through a simple software configuration.
This approach, a form of software-defined protection, has transformative benefits. For large corporations, it enables global standardization. A single, highly optimized protection philosophy can be developed and then deployed identically in facilities across the world, ensuring consistent performance. Engineers can now confidently copy and paste one-line diagrams without concerns about hardware compatibility. Technicians can manage programming in the office or the field without worrying about ordering or installing incorrect part numbers. The entire ecosystem of protection and control becomes more flexible, scalable, and efficient.
THE ECONOMICS OF DIGITAL TRANSFORMATION: A TOTAL COST OF OWNERSHIP PERSPECTIVE
While the technical benefits are clear, a complete picture requires an analysis of the economic advantages, which extend far beyond the initial component price. The adoption of LPITs and universal relays offers compelling savings across the entire lifecycle of an asset, affecting both capital and operational expenditures.
Reducing Capital Expenditure (CAPEX)
The upfront savings begin with the equipment itself but ripple throughout the project. The dramatically lower weight and smaller footprint of LPITs lead to significant reductions in secondary costs. Civil works are minimized, as smaller foundations and less physical space are required. The need for heavy structural steel within switchgear and substations is reduced, lowering material costs. The replacement of thick, heavy-gauge copper secondary wiring with a single, lightweight Ethernet-style cable per phase results in substantial savings on copper and the labor required to pull and terminate it. Finally, the simplified, standardized nature of the system reduces the number of engineering hours required for design, integration, and commissioning.
Reducing Operational Expenditure (OPEX)
Over the 30-to-40-year lifespan of a substation, the OPEX savings are even more profound.
- Maintenance. Traditional CTs and PTs require a rigorous maintenance schedule, including periodic oil sampling and dissolved gas analysis (DGA), re-gasketing, and mechanical inspections. This is a labor-intensive and costly process. LPITs, being solid-state electronic devices, require virtually no preventative maintenance beyond periodic visual inspection, freeing up skilled technicians for more critical tasks.
- Energy efficiency. Conventional instrument transformers are themselves small transformers that continuously consume power (excitation losses) simply to be ready to measure. While small on an individual basis, this parasitic load adds up across hundreds of devices in a facility over decades. LPITs consume a fraction of this power, leading to measurable energy savings over the life of the installation.
- Safety and insurance. The inherent safety of the LPIT system — no oil, no risk of explosive failure, no lethal open-circuit voltages — creates a safer working environment. This can lead to tangible benefits in the form of lower insurance premiums and a reduction in costs associated with workplace accidents and safety compliance.
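The energy-efficiency point is easy to quantify with back-of-envelope arithmetic. Every number below is an assumption chosen for illustration (device count, per-unit losses, lifespan), not a measured figure:

```python
# Back-of-envelope sketch with assumed numbers: cumulative excitation losses
# of conventional instrument transformers vs. an LPIT's electronics.
DEVICES = 300          # instrument transformers in a large facility (assumed)
W_CONVENTIONAL = 15.0  # continuous watts lost per conventional unit (assumed)
W_LPIT = 1.0           # continuous watts per LPIT channel (assumed)
YEARS = 35             # substation lifespan (assumed)

hours = YEARS * 365 * 24
saved_kwh = DEVICES * (W_CONVENTIONAL - W_LPIT) * hours / 1000
print(f"energy saved over {YEARS} years: {saved_kwh:,.0f} kWh")
```

Even with these modest per-device figures, the parasitic load compounds into well over a gigawatt-hour across the installation's life, which is why a loss that looks negligible per unit shows up in the OPEX ledger.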
Cybersecurity in the Modern Substation: A Digital Fortress
The digitalization of the substation naturally raises questions about cybersecurity. Introducing intelligent electronic devices and communication protocols could be perceived as creating new attack vectors. However, the architecture of this modern protection system is designed with security at its core, following a defense-in-depth strategy aligned with standards like IEC 62443.
The first and most important layer of defense is the point-to-point nature of the LPIT connection. The critical measurement data travels on a dedicated, isolated cable directly to the relay. It is not on a routable network, effectively creating a digital air gap that makes it immune to network-based cyberattacks. The most critical function — sensing a fault — is hardened by its very architecture.
For the universal relay itself, which does connect to station-level networks for monitoring and control, security is paramount. These devices are hardened industrial controllers featuring:
- Secure boot. Ensures the relay only loads trusted, digitally signed firmware, preventing malicious code from being loaded during startup.
- Role-based access control (RBAC). Restricts user permissions, ensuring that only authorized personnel can change critical protection settings.
- Encrypted communications. Uses strong encryption for all remote management traffic, protecting commands and data from eavesdropping or manipulation.
This layered approach ensures that the digital substation is not only more intelligent and efficient but also robustly secure against modern cyber threats.
PROVEN PERFORMANCE AND A DAY IN THE LIFE, TRANSFORMED
This new digital ecosystem is not merely theoretical. Its power and interoperability were recently demonstrated at Siemens’ Frankfurt facility. Inside a state-of-the-art, high-power lab, engineers subjected the system to simulated faults exceeding twenty times the normal operating current. In a key validation, LPIT sensors from three different vendors successfully integrated with a single Siemens universal relay, proving true plug-and-play interoperability with a low-latency protection response.
The adoption of this technology fundamentally revolutionizes the day-to-day work of the people who design, build, and operate our power systems.
- The design engineer. Before, their work was dominated by physical constraints — wrestling with catalogs of heavy components and designing oversized cabinets to accommodate them. Now, their canvas is digital. They can design elegant, compact systems with logical precision, knowing the sensors are small, lightweight, and standardized.
- The commissioning technician. Their day used to involve carefully wiring hundreds of connections, performing complex, multi-step injection tests, and working with constant awareness of high-voltage hazards. Now, their primary tools are a laptop and a patch cable. They can run automated test scripts through a safe Ethernet connection, completing in minutes what used to take hours, with far greater accuracy and personal safety.
- The grid operator. Their role evolves from reactive to proactive. Instead of analyzing a cryptic cascade of alarms after an outage has occurred, the continuous, real-time temperature data from the LPITs allows them to see a busbar connection slowly degrading over weeks. They can move from forensic analysis to predictive maintenance, scheduling a repair during a planned, low-cost outage and preventing the failure entirely.
GAZING INTO THE FUTURE: THE INTELLIGENT, CLOUD-CONNECTED GRID
As powerful as this technology is today, its true potential is only just beginning to be unlocked. It provides the high-fidelity, real-time data needed to build a truly intelligent grid. The industry is moving toward even more sophisticated solutions that build upon this digital foundation.
- The digital twin and cloud-native engineering. This technology directly enhances real-time digital simulation (RTDS), enabling the creation of high-fidelity digital twins of the power system. Furthermore, cloud integration will allow engineers to test and validate entire protection schemes in a virtual environment. This brings the continuous integration/continuous deployment (CI/CD) model of the software world to the grid, enabling new protection logic to be fully tested on a digital twin and deployed remotely without costly physical factory acceptance tests.
- From predictive to prescriptive AI. Future AI-driven systems will move beyond prediction alone. Rather than simply flagging that a motor might fail, they will offer prescriptive guidance: “This motor shows waveform signatures indicating bearing wear. Based on the current load profile, we recommend scheduling maintenance within the next 45 days to avoid a critical failure; the next planned site-wide outage is in 38 days, which is the optimal time for replacement.”
- The grid at the edge. Universal relays are powerful edge computers. This enables the grid to become decentralized and self-healing. Localized, autonomous control schemes can manage microgrids or distribution loops, isolating faults and re-routing power in microseconds without needing to communicate with a central control room for every event. The vision is of a scalable, software-based protection system that can handle hundreds of feeders simultaneously, adapting to the grid’s needs in real time.
- Mitigating new threats. These advances will also help address emerging challenges such as geomagnetic disturbances caused by severe solar storms. An intelligent system can use LPIT data to detect the tell-tale signs of geomagnetically induced currents and automatically adjust system parameters to protect vulnerable, high-value transformers from damage.
CONCLUSION: BUILDING THE GRID OF TOMORROW, TODAY
LPIT technology and universal protective relays are more than just incremental improvements. They constitute a fundamental shift in how we approach power sensing and control. This is the essential upgrade needed to solve the deep challenges of an aging grid facing the unprecedented and urgent demands of the AI age. As we continue to face the dual imperatives of decarbonization and electrification, these innovations — offering superior performance, lifecycle economic benefits, and a hardened security posture — are crucial for building a more resilient, efficient, and intelligent power infrastructure. We are finally giving our electrical grid the 21st-century digital nervous system it needs to securely and reliably power our future.

Simon Loo serves as the West Regional Sales Manager for Siemens’ Protection and Control department, bringing over 20 years of experience in relay systems and power protection. His expertise spans both traditional and emerging technologies in power system protection and control.
