(GreenBiz) Last year, ComEd, the electric utility in northern Illinois and Chicago, initiated a collaborative group to study quantum computing for power systems. This effort included a team at the University of Denver along with representatives from the University of Chicago, Argonne National Laboratory and others.
Electric utilities continue to devote significant resources to predicting, responding to and recovering from extreme events — including extreme weather, equipment outages and sudden loss of renewable generation.
To date, classical computing has supported all three steps. However, an increasingly complex grid, facing more frequent and intense weather events, is testing the limits of classical computing. It is reasonable to assume that as customers become more dependent on a reliable and resilient supply of electricity, utilities will need to move beyond classical computing for monitoring and control — on blue-sky days as well as in severe weather.
Forecasting the path, intensity and impact of a severe weather event on the grid is rife with uncertainty. There is also a great deal of uncertainty in the output of renewable resources, as well as in the availability of supply and delivery infrastructure. Classical computing relies on predictive models that evolve as new data becomes available, and the response timeframes for different environmental events vary significantly.
Only through collaboration can we determine how quantum computing can best be applied to power grids. A multi-stakeholder approach is required, as every power grid has a unique network topology and a different geographic setting, and hence faces specific environmental threats. A multi-disciplinary approach is needed, as applying quantum computing to power systems runs the gamut of scientific, economic and other disciplines.