Distributed energy sources can reduce the cost of electricity by up to 50%, study says. Traditional grids will have to change. Modeling can help find the best way forward.

Data-driven planning of distributed energy resources amidst socio-technical complexities. Nature Energy 6:17112, July 2017. DOI: 10.1038/nenergy.2017.112

Stanford researchers find that DER can halve electricity costs. https://www.smart-energy.com/regional-news/north-america/stanford-university-der/ 

Researchers from Stanford University have published a paper describing a programme that explores how to model DER deployment in the most cost-effective manner.

The paper, published in Nature Energy, describes the programme “ReMatch”, which leverages smart grid data to match groups of consumers with different kinds of distributed resources based on the customers’ energy use and the ability to construct resources in that area, e.g. solar panels and batteries.

In its coverage, Ars Technica UK explains: “If a business district uses a lot of power around mid-day, for example, it might be worthwhile to offer incentives for that area to install solar panels. If a row of restaurants is open until 9pm, perhaps offering those businesses a solar-plus-battery option would be more cost-effective.”

It adds: “The modeling programme can also break down customer energy use by the hour. The software can, for example, pick out customers who use a lot of power in the morning and customers who use a lot of power in the afternoon. The utility can then use that information to balance the enrollment of each kind of customer, thereby evening out the demands on the grid.”

Programme trial

The Stanford researchers applied ReMatch to a 10,000-customer sample in California, using real hourly data gleaned from smart meters. They found that constructing DER infrastructure in a targeted way reduced the Levelized Cost of Electricity (the present value of a resource’s lifetime costs divided by the present value of the energy it delivers) by nearly 50%.

The paper concluded that by offering detailed data on intermittency, customer demand, and operating costs, utilities can take a targeted approach to incentivising DER infrastructure, which will help them meet renewables goals and reduce the costs associated with indiscriminate buildout.

“[O]ur results suggest that in order for DER infrastructure to become a reality we must design smart and targeted policies, programmes, and incentives that facilitate the balancing of consumer type enrollment in DER plans and programmes with the existing grid,” the researchers concluded.

“Under such smart policies, the optimal mix of consumers could be selected to become part of emerging utility models of organized ‘prosumer’ community groups to preserve the cost effectiveness of model-derived DER infrastructure plans.”

By Megan Geuss, Ars Technica, 18 July 2017

Dramatic changes are coming to the old power grid. As infrastructure ages and policy dictates a move away from fossil fuels, utilities and governments are looking at Distributed Energy Resources (DERs) as potential alternatives to continually building out a centralized grid.

DERs include all kinds of hardware that the utility may not necessarily own directly—solar panels, natural gas-fired microturbines, stationary batteries, and alternative cooling. Demand-response schemes, where a grid operator shifts electricity consumer use (usually through incentives) away from high-demand times, are also considered DERs.

Planning for DERs makes grid management trickier than it was when a company simply built a huge new plant and connected a power line to it. Without a lot of data, it’s hard to know what kinds of energy resources will have the most impact economically and environmentally and what will be most cost-effective for utilities. But a trio of researchers from Stanford University is attempting to make this planning problem easier for utilities and policy makers to solve. The researchers published a paper in Nature Energy this week describing a program they built to model DER deployment in a way that will result in the lowest cost to grid operators.

The program, called “ReMatch,” uses smart grid data to match groups of consumers with different kinds of distributed resources based on the customers’ energy use and the ability to construct resources in that area (like solar panels, batteries, and so on). If a business district uses a lot of power around mid-day, for example, it might be worthwhile to offer incentives for that area to install solar panels. If a row of restaurants is open until 9pm, perhaps offering those businesses a solar-plus-battery option would be more cost-effective.
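The article doesn’t detail ReMatch’s actual matching logic, but the core idea, pairing a neighborhood’s hourly load shape with the DER option whose output best coincides with it, can be sketched in a few lines of Python. The load and resource profiles below are stylized assumptions, not data from the paper, and cosine similarity is just one plausible scoring choice:

```python
import numpy as np

hours = np.arange(24)

# Stylized 24-hour load shapes (arbitrary units), invented for illustration:
# a business district peaking at mid-day, a restaurant row peaking around 7pm.
business_district = np.exp(-0.5 * ((hours - 12) / 3.0) ** 2)
restaurant_row = np.exp(-0.5 * ((hours - 19) / 2.0) ** 2)

# Stylized DER output shapes: solar follows a half-sine peaking at noon;
# "solar + battery" shifts half the solar output seven hours later, a crude
# stand-in for evening battery discharge.
solar = np.clip(np.sin(np.pi * (hours - 6) / 12.0), 0.0, None)
solar_plus_battery = 0.5 * solar + 0.5 * np.roll(solar, 7)

def match_score(load: np.ndarray, resource: np.ndarray) -> float:
    """Cosine similarity between a load shape and a resource output shape."""
    return float(load @ resource / (np.linalg.norm(load) * np.linalg.norm(resource)))

options = {"solar": solar, "solar + battery": solar_plus_battery}
for name, load in [("business district", business_district),
                   ("restaurant row", restaurant_row)]:
    best = max(options, key=lambda k: match_score(load, options[k]))
    print(f"{name}: best-matched DER is {best}")
    # -> the business district pairs with plain solar,
    #    the restaurant row with solar + battery
```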

The modeling program can also break down customer energy use by the hour. The software can, for example, pick out customers who use a lot of power in the morning and customers who use a lot of power in the afternoon. The utility can then use that information to balance the enrollment of each kind of customer, thereby evening out the demands on the grid.
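As a toy illustration of that balancing step, the sketch below builds synthetic hourly profiles for morning-peaking and afternoon-peaking customers and compares the aggregate load of a one-sided enrollment against an even mix. The numbers are invented; only the balancing idea comes from the article:

```python
import numpy as np

rng = np.random.default_rng(0)
hours = np.arange(24)
n = 200

# Synthetic smart-meter data: each customer's usage is a bump centered on
# either a morning (9am) or an afternoon (3pm) peak hour.
peaks = rng.choice([9, 15], size=n)
usage = np.exp(-0.5 * ((hours - peaks[:, None]) / 2.5) ** 2)

morning = np.flatnonzero(peaks == 9)
afternoon = np.flatnonzero(peaks == 15)

# Enrolling equal numbers from each group lets the two peaks offset each
# other, flattening the aggregate demand the grid has to serve.
k = min(len(morning), len(afternoon))
balanced = usage[np.concatenate([morning[:k], afternoon[:k]])].sum(axis=0)
one_sided = usage[morning].sum(axis=0)

print("peak-to-average demand, morning-only enrollment:",
      round(one_sided.max() / one_sided.mean(), 2))
print("peak-to-average demand, balanced enrollment:    ",
      round(balanced.max() / balanced.mean(), 2))
```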

The researchers applied ReMatch to a 10,000-customer sample in California, using real hourly data gleaned from smart meters. The model found that constructing DER infrastructure in a targeted way reduced the Levelized Cost of Electricity (the present value of a resource’s lifetime costs divided by the present value of the energy it delivers) by nearly 50 percent. This was, the paper states, due to a dramatic reduction in operating costs incurred by the utility.
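The paper’s own cost model is more involved, but the textbook LCOE calculation behind that comparison is simple: discount all lifetime costs, discount all lifetime energy output, and divide. A minimal sketch with illustrative numbers (not figures from the study):

```python
def lcoe(costs, energy, discount_rate):
    """Levelized Cost of Electricity: present value of lifetime costs
    divided by present value of lifetime energy delivered."""
    pv_costs = sum(c / (1 + discount_rate) ** t for t, c in enumerate(costs))
    pv_energy = sum(e / (1 + discount_rate) ** t for t, e in enumerate(energy))
    return pv_costs / pv_energy

# Illustrative only: a 25-year solar asset with a large year-0 capital
# outlay, flat O&M afterwards, and steady annual output in MWh.
costs = [100_000] + [2_000] * 25
energy = [0] + [180] * 25
print(f"LCOE: ${lcoe(costs, energy, 0.06):.2f}/MWh")  # roughly $55/MWh here
```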

By offering detailed data on intermittency, customer demand, and operating costs, utilities can take a targeted approach to incentivizing DER infrastructure, which will help them meet renewables goals and reduce costs associated with indiscriminate buildout. “[O]ur results suggest that in order for DER infrastructure to become a reality we must design smart and targeted policies, programs, and incentives that facilitate the balancing of consumer type enrollment in DER plans and programs with the existing grid,” the researchers concluded. “Under such smart policies, the optimal mix of consumers could be selected to become part of emerging utility models of organized ‘prosumer’ community groups to preserve the cost effectiveness of model-derived DER infrastructure plans.”

Nature Energy, 2017. DOI: 10.1038/nenergy.2017.112

Energy, Volume 117, Part 1, 15 December 2016, Pages 29-46

Recent trends in power system reliability and implications for evaluating future investments in resiliency

Abstract

This study examines the relationship between annual changes in electricity reliability reported by a large cross-section of U.S. electricity distribution utilities over a period of 13 years and a broad set of potential explanatory variables, including weather and utility characteristics. We find statistically significant correlations between the average number of power interruptions experienced annually and above average wind speeds, precipitation, lightning strikes, and a measure of population density: customers per line mile. We also find significant relationships between the average number of minutes of power interruptions experienced and above average wind speeds, precipitation, cooling degree-days, and one strategy used to mitigate the impacts of severe weather: the amount of underground transmission and distribution line miles.

Perhaps most importantly, we find a significant time trend of increasing annual average number of minutes of power interruptions over time—especially when interruptions associated with extreme weather are included. The research method described in this analysis can provide a basis for future efforts to project long-term trends in reliability and the associated benefits of strategies to improve grid resiliency to severe weather—both in the U.S. and abroad.
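As a rough sketch of the kind of panel regression the abstract describes, the code below fits annual interruption counts against standardized weather covariates and a linear time trend by ordinary least squares. The data are synthetic and the covariate list is only a stand-in for the study’s actual specification:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500  # synthetic utility-year observations

# Standardized covariates mimicking the abstract's explanatory variables.
wind = rng.normal(size=n)        # above-average wind speed
precip = rng.normal(size=n)      # above-average precipitation
lightning = rng.normal(size=n)   # lightning strikes
density = rng.normal(size=n)     # customers per line mile
year = rng.integers(0, 13, n)    # position in a 13-year panel

# Invented "true" relationship, including a positive time trend in
# interruptions like the one the study reports.
interruptions = (1.2 + 0.4 * wind + 0.3 * precip + 0.2 * lightning
                 + 0.1 * density + 0.05 * year + rng.normal(0, 0.3, n))

# Ordinary least squares via numpy's least-squares solver.
X = np.column_stack([np.ones(n), wind, precip, lightning, density, year])
coef, *_ = np.linalg.lstsq(X, interruptions, rcond=None)
for name, b in zip(["intercept", "wind", "precip", "lightning",
                    "density", "year trend"], coef):
    print(f"{name:>10}: {b:+.3f}")
```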
