
Exascale for Energy


The Electric Power Grid

The current U.S. power grid is a huge, interconnected network composed of power-generation stations, high-voltage transmission lines, lower-voltage distribution systems, and other support components (Figure 16). Supporting grid operation is an extensive communications and control infrastructure, and decision making requires sophisticated computational tools.


Figure 16. The contiguous U.S. electric power transmission grid. New computational modeling and analysis tools are needed that integrate real-time information to optimize power flow and prevent outages.
Source: FEMA

The current system is under tremendous stress. Transmission system expansion has been unable to keep pace with the growth in energy demand. Consequently, transmission congestion is creating large electricity price differentials between geographic regions. For the New York Independent System Operator (NYISO) alone, these price differentials account for nearly 25% of the region’s electricity costs. Furthermore, the existing grid structure and centralized control philosophy were not designed to handle the proliferation of small power sources resulting from deregulation, the intermittent supply from alternative sources such as wind and solar power, or the demand that could be created by widespread use of electric vehicles.

The power grid is also in the midst of a transformation from the analog to the digital age, exemplified by the term “smart grid.” The smart grid will manage and deliver electrical energy through a combined centralized and distributed system, in which many nodes are capable of producing, consuming, and storing electrical energy; sensors at each node will automatically provide continuous data about energy use, flow, and system status. Managing transactions on this network will involve extensive communications networks, real-time data monitoring and analysis, and distributed, hierarchical control schemes. Designing and simulating such a network represents a grand computational challenge on an unprecedented scale.
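
A simple way to picture the node-level information such a system would exchange is as a per-node status record. The sketch below, in Python, defines one hypothetical record for a node that produces, consumes, and stores energy and reports its condition; all field names, units, and values are illustrative assumptions rather than any actual smart grid standard.

```python
# A hypothetical per-node status record for a smart grid, illustrating the
# kind of telemetry each producing/consuming/storing node might report.
# Field names and units are illustrative assumptions, not a standard schema.
from dataclasses import dataclass

@dataclass
class NodeStatus:
    node_id: str
    timestamp: float          # seconds since epoch
    generation_kw: float      # local production (e.g., rooftop solar)
    consumption_kw: float     # local load
    storage_soc_kwh: float    # state of charge of local storage
    voltage_pu: float         # measured voltage, per unit

    def net_injection_kw(self) -> float:
        """Power this node pushes into (positive) or draws from the grid."""
        return self.generation_kw - self.consumption_kw

# Example: a node generating more than it consumes injects power into the grid.
status = NodeStatus("feeder-42/node-007", 1.7e9, generation_kw=5.2,
                    consumption_kw=3.1, storage_soc_kwh=8.0, voltage_pu=1.01)
print(status.net_injection_kw())   # 2.1 kW flowing back onto the feeder
```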

Figure 17. Simulation results show how cascading failures would break the Electric Reliability Council of Texas system into three independent islands (each shown in a different color), leading to blackouts.
Source: “Computational Research Needs for Alternative and Renewable Energy”

Smart grid technologies, tools, and techniques include hardware, software, and business practices that will generate massive amounts of new data. Near real-time data and visibility of wide-area grid conditions will enhance electric system planning and operations; improve the integration of renewable and energy efficient technologies; increase demand response, electric reliability, and electric asset utilization; and provide better situational awareness and faster response to prevent local disturbances from cascading into regional outages (Figure 17).

On March 30–31, 2009, the U.S. Department of Energy’s Office of Advanced Scientific Computing Research (ASCR) and Office of Electricity Delivery and Energy Reliability (OE) conducted a joint technical workshop to discuss data management, modeling and analysis, and visualization and decision tools for the 21st century electric power system. The purpose was to “get ahead of the curve” and anticipate the major computational and engineering challenges associated with the extensive amount of new and more detailed data on grid conditions expected over the next decade as part of national grid modernization efforts.

The electric power system has undergone extensive change over the past several decades and has become substantially more complex, dynamic, and uncertain as new market rules, business practices, regulatory policies, and electric generation, transmission, distribution, storage, and end-use technologies have been tried and adopted. There are now hundreds of times more transactions at the bulk power level and emerging requirements for the two-way flow of power and information at the electric distribution level.

The availability of more detailed data about system conditions, together with advanced metering infrastructure for dynamic pricing and demand response, can be a great benefit for improving grid planning and operations, but ways must be found to aggregate, organize, and secure the data so that it can be used to upgrade, extend, and replace existing modeling, analysis, and decision making tools.

There are a number of computational challenges that must be addressed to achieve this result. For example, the large volume of more detailed data creates information management challenges, but opens opportunities for grid operators to base decisions on actual measurements rather than estimates. Wide-area visibility that covers a much larger footprint of the power system than is currently possible will create data analysis and visualization challenges, but opens opportunities for interconnection-wide network and state estimation models. Access to information on grid conditions in near real time creates data analysis and error checking challenges, but opens opportunities to detect operating anomalies and disturbances so that actions can be taken to prevent them from becoming local problems or cascading into regional outages.
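
A small example helps make the “measurements rather than estimates” opportunity concrete. State estimation, the core of the wide-area models mentioned above, classically takes the form of a weighted least-squares fit of redundant, noisy measurements to a system state; the miniature sketch below shows that step on an invented two-state system, with large residuals serving as a simple bad-data flag. A real estimator is nonlinear and spans tens of thousands of buses.

```python
# Weighted least-squares (WLS) state estimation in miniature: estimate the
# state x from redundant, noisy measurements z = H x + noise, weighting each
# measurement by the inverse of its error variance. Real estimators use
# nonlinear AC measurement models over tens of thousands of buses; this
# 2-state, 4-measurement system is purely illustrative.
import numpy as np

H = np.array([[1.0,  0.0],     # measurement 1 observes state 1
              [0.0,  1.0],     # measurement 2 observes state 2
              [1.0, -1.0],     # measurement 3 observes a difference (e.g., a flow)
              [1.0,  1.0]])    # measurement 4 observes a sum
z = np.array([1.02, 0.97, 0.06, 1.98])          # noisy readings
sigma = np.array([0.01, 0.01, 0.02, 0.05])      # assumed sensor accuracies
W = np.diag(1.0 / sigma**2)                     # confidence weights

# Solve (H^T W H) x = H^T W z for the most likely state.
x_hat = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)
residuals = z - H @ x_hat                       # large residuals flag bad data
print("estimated state:", x_hat.round(3))
print("normalized residuals:", (residuals / sigma).round(2))
```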


There is a need for algorithms, not just computers, to transform the grid modeling paradigm and enable proactive decision making, not just contingency response.

A major barrier to addressing these computational challenges is the insufficiency of today’s electric system models and analysis and visualization tools. They cannot be counted on to provide accurate representations and information about the status of the grid, cannot handle system dynamics properly, and cannot evaluate multiple major contingencies simultaneously. Advanced models are needed for anticipating multiple contingencies that could have severe consequences, for evaluating large-scale cascading outage scenarios, and for developing accurate measures of a system’s brittleness, or susceptibility to outages from a variety of potential causes.

The ASCR-OE workshop focused on three broad areas: data management, modeling and analysis, and visualization and decision tools.

Effective power system planning, operations, and communications (intra- and inter-entity) require power system operators to analyze vast amounts of data, such as automated and computer-assisted control data, grid telemetry, market information, and environmental information, among others. With the advent of smart grid technologies, data will become richer and denser. More data can yield more precise forecasts and enable more robust active control. However, improperly managed data can have the opposite effect if it has been compromised or if it inundates the system with unnecessary information. Better data management environments are one of the keys to enabling decision makers to make correct inferences and judgments as they apply this information to making the grid more efficient, reliable, affordable, and secure.

Improving data management for the 21st century grid involves addressing a number of technical challenges. For example, massive amounts of data can introduce unacceptable latency (slowing communications) into energy systems if processed, stored, or utilized on today’s architectures. Furthermore, as redundant, missing, and poor-quality data substantially increases, so will the likelihood of faulty readings, which could cause energy systems to crash. Concerns about system stability have kept most utilities with access to grid telemetry data from taking advantage of it, because the capabilities to extract good data do not exist today. In addition, achieving an adequate level of cybersecurity and protection of ownership and access rights will be increasingly difficult, because underlying protocols will need to change constantly as they respond and adapt to an increasingly complex energy infrastructure and a highly dynamic risk environment.
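
As a simplified picture of the data-quality side of this challenge, the sketch below screens a batch of telemetry readings for exactly the failure modes named above: redundant (duplicate), missing, and implausible values. The field layout and plausibility limits are assumptions made for illustration.

```python
# A minimal telemetry-screening sketch: flag redundant (duplicate), missing,
# and implausible readings before they reach downstream applications.
# The (timestamp, value) layout and plausibility limits are assumptions.
import math

PLAUSIBLE_RANGE = (59.0, 61.0)   # e.g., grid frequency in Hz (assumed limits)

def screen(readings):
    """Split (timestamp, value) pairs into clean data and rejected data."""
    clean, rejected, seen = [], [], set()
    for timestamp, value in readings:
        if timestamp in seen:                          # redundant sample
            rejected.append((timestamp, value, "duplicate"))
        elif value is None or (isinstance(value, float) and math.isnan(value)):
            rejected.append((timestamp, value, "missing"))
        elif not (PLAUSIBLE_RANGE[0] <= value <= PLAUSIBLE_RANGE[1]):
            rejected.append((timestamp, value, "out of range"))
        else:
            clean.append((timestamp, value))
            seen.add(timestamp)
    return clean, rejected

clean, rejected = screen([(0, 60.01), (1, 60.02), (1, 60.02),
                          (2, None), (3, 72.5), (4, 59.98)])
print(len(clean), "clean;", [reason for _, _, reason in rejected])
```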

Significant effort will be needed in research, development, and demonstration to address these challenges. For example, multidisciplinary teams will need to develop a transition plan from legacy systems to future systems, including building and testing prototype architecture designs for planning, operations, and communications. Cybersecurity technologies and tools also must be developed to appropriately manage trust, security, and risk, and to aid compliance management and dynamic data sharing policies.

The transition to a smarter grid means that an enormous amount of data will be collected that will require analysis and verification so that it can be transformed into usable information. Computerized data modeling and analysis can provide a representation of the behavior of the grid system that will contribute to understanding the interaction of the parts of the electric grid and of the system as a whole.

Data modeling and analysis are already used by a number of stakeholders for a diverse set of applications, including operations, planning, training, and policymaking. But the existing algorithms and capabilities that have been used for data modeling and analysis are becoming dated and are not robust enough to handle all of the changes in the electricity and ancillary sectors.

There is a need for algorithms, not just computers, to transform the grid modeling paradigm from static, slow, and deterministic to dynamic, real time, wide area, and stochastic. This new paradigm will enable proactive decision making, not just contingency response.

Transmission congestion is one area that could benefit from proactive modeling and decision making. The Federal Energy Regulatory Commission (FERC) estimated that the economic costs of transmission congestion in the summers of 2000 and 2001 (Figure 18) ranged from less than $5 million to more than $50 million per incident. However, for one particular set of conditions in the eastern portion of New York during the summer of 2000, FERC estimated a cost of more than $700 million. Real-time modeling and proactive decision making may prevent or mitigate these costly transmission constraints in the future.


Figure 18. Transmission constraints in the contiguous U.S. from June through August 2000 and 2001.
Source: FERC

There are a number of technical challenges associated with data modeling and analysis for the future power system. For instance, one challenge is to conduct streaming data analysis for real-time detection of anomalies and improved forecasting. It is critical that the right kind of data is acquired and that uncertainties are handled properly so that there is confidence that the data is accurate. Another challenge is to develop new algorithms that are robust and scalable as the complexity of the problem increases.
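
As a toy illustration of the streaming-analysis challenge, the following sketch maintains exponentially weighted running statistics over a measurement stream and flags samples that deviate sharply from recent history. The decay rate, threshold, and warm-up length are arbitrary choices, and a production detector would be considerably more sophisticated.

```python
# A toy streaming anomaly detector: exponentially weighted running mean and
# variance over a measurement stream, flagging samples that deviate sharply
# from recent history. Decay, threshold, and warm-up are illustrative choices.
import math
from collections import namedtuple

Reading = namedtuple("Reading", "timestamp value")

class StreamingAnomalyDetector:
    def __init__(self, decay=0.05, z_threshold=4.0, warmup=30):
        self.decay = decay              # weight given to each new sample
        self.z_threshold = z_threshold  # deviations beyond this many std devs are flagged
        self.warmup = warmup            # samples to observe before flagging anything
        self.count = 0
        self.mean = 0.0
        self.var = 0.0

    def update(self, reading):
        """Return True if this reading looks anomalous relative to recent history."""
        self.count += 1
        if self.count == 1:             # first sample initializes the statistics
            self.mean = reading.value
            return False
        deviation = reading.value - self.mean
        std = math.sqrt(self.var)
        is_anomaly = (self.count > self.warmup and std > 0
                      and abs(deviation) > self.z_threshold * std)
        # Update running statistics (exponentially weighted moving estimates).
        self.mean += self.decay * deviation
        self.var = (1 - self.decay) * (self.var + self.decay * deviation**2)
        return is_anomaly

# Usage: a slowly varying frequency signal with one sudden excursion.
detector = StreamingAnomalyDetector()
stream = [Reading(t, 60.0 + 0.01 * math.sin(0.1 * t)) for t in range(300)]
stream.append(Reading(300, 59.3))       # simulated frequency dip
flagged = [r.timestamp for r in stream if detector.update(r)]
print("anomalous timestamps:", flagged)
```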

Planning and operating the nation’s electric grid is far more complicated today than it was in the past because of institutional, regulatory, market, and technological changes over the past several decades. For example, the level of transactions in bulk power markets is higher; the number of market participants is higher; the level of peak demand is higher; the costs of outages are higher; and public awareness about the environmental consequences of electricity production, delivery, and use is higher.

The market introduction of smart grid technologies, tools, and techniques will provide grid operators, consumers, and policy makers with significantly more information for managing the grid and addressing national energy and environmental priorities such as global climate change, economic recovery, oil import dependence, and critical infrastructure dependence. Having better visualization and decision tools is one of the keys for taking this data, turning it into useful information, and applying it to make the grid more efficient, reliable, affordable, and secure.

Improving visualization and decision tools for the 21st century grid involves addressing a number of technical challenges. For example, the system itself is becoming more complex, uncertain, and dynamic. Grid operators—the primary users of visualization and decision tools—face daily decisions which require near-real-time responses to prevent local disturbances from cascading into regional outages. The grid of the future will also involve real-time decision making by consumers about consumption and their demand response to dynamic pricing. And state, regional, and national policy makers and regulatory officials are becoming more heavily involved with electric resource planners and engineers in assessing needs for new power plants and transmission and distribution facilities. All of these participants—grid operators, planners and engineers, consumers, and policy makers—will need better visualization and decision tools to plan and operate the grid of the future properly.

The planning and operation of the electric power grid involves the deployment of diverse resources under uncertain conditions with imperfect information. From a mathematical viewpoint, many of these problems can be posed as nonlinear stochastic optimization problems containing both discrete and continuous variables. However, in practice, a lack of advanced algorithms and software has necessitated the use of simplified models and heuristics. Decision-making is divided into a series of simpler steps, each of which considers only a subsystem, or focuses exclusively on a particular time-scale or geographic area. The operational and planning decisions that emerge are inevitably inferior to those possible in a more global, multi-scale optimization of the entire system.
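
The structure of these problems can be illustrated with a deliberately tiny example: a two-scenario commitment-and-dispatch decision in which binary commitment choices must be made before the uncertain demand is known, while continuous dispatch adapts to each scenario. All costs, capacities, and demands below are invented, and the toy model is linear, whereas the real problems described above are nonlinear and vastly larger.

```python
# A toy two-stage stochastic commitment/dispatch sketch (invented numbers).
# First-stage binaries: commit each generator or not; second-stage continuous
# variables: dispatch of each committed generator in each demand scenario.
# The model is linear and minuscule; it only illustrates the mixed
# discrete/continuous, scenario-coupled structure described in the text.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

fixed_cost = np.array([100.0, 80.0])      # $ per commitment decision (assumed)
fuel_cost  = np.array([20.0, 35.0])       # $/MWh (assumed)
p_max      = np.array([400.0, 300.0])     # MW capacity (assumed)
demand     = np.array([350.0, 550.0])     # MW in each scenario (assumed)
prob       = np.array([0.6, 0.4])         # scenario probabilities

# Variable vector: [u1, u2, p[g1,s1], p[g2,s1], p[g1,s2], p[g2,s2]]
c = np.concatenate([fixed_cost, prob[0] * fuel_cost, prob[1] * fuel_cost])

# Power balance in each scenario: sum over generators of p[g,s] == demand[s]
A_bal = np.array([[0, 0, 1, 1, 0, 0],
                  [0, 0, 0, 0, 1, 1]], dtype=float)
balance = LinearConstraint(A_bal, demand, demand)

# Capacity coupling: p[g,s] - p_max[g] * u[g] <= 0
A_cap = np.array([[-p_max[0], 0, 1, 0, 0, 0],
                  [0, -p_max[1], 0, 1, 0, 0],
                  [-p_max[0], 0, 0, 0, 1, 0],
                  [0, -p_max[1], 0, 0, 0, 1]])
capacity = LinearConstraint(A_cap, -np.inf, 0.0)

integrality = np.array([1, 1, 0, 0, 0, 0])   # u binary, p continuous
bounds = Bounds(lb=np.zeros(6), ub=[1, 1, np.inf, np.inf, np.inf, np.inf])

res = milp(c, constraints=[balance, capacity],
           integrality=integrality, bounds=bounds)
print("commitments:", res.x[:2].round(), "expected cost: $%.0f" % res.fun)
```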

Optimal power flow (OPF) modeling is an important tool for determining the most efficient and economical operation of existing power systems as well as for planning future expansion. However, full-scale nonlinear OPF has not been widely adopted in real-time operation of large-scale power systems. Instead, system operators often use simplified OPF tools that are based on linear programming and decoupled system models. Historically, this is due both to the lack of powerful computing hardware in the industry and to the lack of efficient and robust OPF algorithms. With the advent of fast, low-cost computers, speed has now become a secondary concern, while algorithm robustness and scalability are the primary issues.
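
The flavor of those simplified, linear-programming OPF tools can be conveyed with a small DC optimal power flow: minimize generation cost on a made-up three-bus network subject to linearized power balance at each bus and thermal limits on each line. All network data here are invented; an actual OPF of the kind discussed would use the full nonlinear AC equations over thousands of buses.

```python
# DC optimal power flow on a made-up 3-bus network, solved as a linear program:
# minimize generation cost subject to linearized nodal power balance and line
# flow limits. Bus 1 is the angle reference; all data are illustrative.
import numpy as np
from scipy.optimize import linprog

# Decision variables: x = [p1, p2, theta2, theta3]   (theta1 = 0, per-unit data)
cost = [10.0, 30.0, 0.0, 0.0]        # $/p.u. for generators at buses 1 and 2
load3 = 3.0                          # load at bus 3 (p.u.)
b = 10.0                             # line susceptance 1/x for every line (x = 0.1)

# Nodal balance (DC power flow): injection_i = sum_j b * (theta_i - theta_j)
A_eq = np.array([[1, 0,    b,    b],   # bus 1: p1 = b*(0 - th2) + b*(0 - th3)
                 [0, 1, -2*b,    b],   # bus 2: p2 = b*(th2 - 0) + b*(th2 - th3)
                 [0, 0,   -b,  2*b]])  # bus 3: -load = b*(th3 - 0) + b*(th3 - th2)
b_eq = [0.0, 0.0, -load3]

# Thermal limits |flow| <= f_max on lines 1-2, 1-3, 2-3 (flows are linear in theta)
A_ub = np.array([[0, 0, -b,  0], [0, 0,  b,  0],   # f12 = b*(0 - th2)
                 [0, 0,  0, -b], [0, 0,  0,  b],   # f13 = b*(0 - th3)
                 [0, 0,  b, -b], [0, 0, -b,  b]])  # f23 = b*(th2 - th3)
b_ub = [1.0, 1.0, 2.0, 2.0, 2.0, 2.0]

bounds = [(0, 2.0), (0, 2.0), (None, None), (None, None)]   # generator limits
res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=bounds, method="highs")
p1, p2, th2, th3 = res.x
print(f"dispatch: p1={p1:.2f}, p2={p2:.2f} p.u., cost=${res.fun:.0f}")
```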

A transmission system operator comparable to the NYISO might contain 10,000 buses, 12,500 transmission lines, 2,000 generators, and 2,000 equipment contingencies, yielding an optimization problem with on the order of 6 × 10⁷ variables, 4 × 10⁷ nonlinear equality constraints, 5 × 10⁷ nonlinear inequality constraints, and 2 × 10⁷ linear inequality constraints, for a total of about 1 × 10⁸ constraints of various types. This complex and large-scale problem would adequately represent the situation the NYISO faces every day in its planning and operations. A larger problem would surface when accurately modeling the eastern National Interest Electric Transmission Corridor. Yet the largest OPF problem reliably solved to date is a 2,935-bus reduced-order model of New York without contingencies.
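
The arithmetic behind numbers of that size is straightforward: each of the roughly 2,000 contingencies replicates the full network model, so per-scenario counts of variables and constraints are multiplied by the number of scenarios. The rough sketch below uses common modeling assumptions (two state variables and two power-balance equations per bus, two flow limits per line, two output variables per generator) and reproduces the order of magnitude, though not the exact figures, quoted above.

```python
# Back-of-envelope scaling of a security-constrained optimal power flow for a
# NYISO-sized system: every contingency replicates the full AC network model.
# The per-bus/per-line/per-generator counts are common modeling assumptions
# and reproduce only the order of magnitude of the figures quoted in the text.
buses, lines, generators, contingencies = 10_000, 12_500, 2_000, 2_000
scenarios = contingencies + 1                 # the base case plus each outage

variables     = scenarios * (2 * buses + 2 * generators)  # |V|, angle, P, Q
nl_equality   = scenarios * (2 * buses)                    # P and Q balance per bus
nl_inequality = scenarios * (2 * lines)                    # flow limit per line end
print(f"variables            ~ {variables:.1e}")           # ~4.8e7
print(f"nonlinear equality   ~ {nl_equality:.1e}")          # ~4.0e7
print(f"nonlinear inequality ~ {nl_inequality:.1e}")        # ~5.0e7
```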

This example illustrates that the most immediate issue for power grid modeling is algorithm and software development, not more computing power. Berkeley Lab researchers are currently collaborating with mathematicians and power engineers from Cornell University and the University of Wisconsin to develop optimization algorithms that detect vulnerabilities in the power grid, analyze cascading outages, and perform resource allocation across multiple locations and times; they will then combine these algorithms into an integrated optimization framework (see sidebar).

When real-time, wide-area grid modeling, optimization, and control software is eventually developed to monitor and manage as many as 100 million distinct nodes, it could easily require continuous access to hundreds of thousands of processors.

 

