In Part I of our “Chasing Waterfalls” blog series, we introduced our new Virtual Irradiance Performance Waterfall, a tool that the Locus analytics team developed to disaggregate the causes of residential solar PV system underperformance. In Part II, we share what the Waterfall can tell us about the extent to which a fleet of tens of thousands of systems is affected by soiling, shading, and snow cover.
Soiling loss rates are expected to differ significantly depending on where a system is located. To see whether our estimated soiling losses bear out this intuition, let’s start by looking at the 2016 soiling levels of systems in the Los Angeles area, broken down by month of year. The following image shows monthly snapshots of the percentage of site energy lost to soiling around Los Angeles.
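While the loss-attribution method behind the Waterfall itself is not detailed here, the basic bookkeeping for a monthly soiling-loss percentage can be sketched as the gap between modeled (irradiance-based) energy and measured energy, assuming other loss factors have already been deducted from the modeled figure. The function name and numbers below are hypothetical:

```python
def soiling_loss_pct(modeled_kwh, measured_kwh):
    """Percent of expected energy lost to soiling for one month,
    assuming modeled_kwh already has non-soiling losses deducted
    (a hypothetical convention, not Locus's actual method)."""
    if modeled_kwh <= 0:
        return 0.0
    # Clamp at zero: overproduction is not negative soiling.
    return max(0.0, 100.0 * (modeled_kwh - measured_kwh) / modeled_kwh)

# Example: a site modeled at 620 kWh for a dry month but metering
# only 589 kWh would show a 5% soiling loss.
print(soiling_loss_pct(620.0, 589.0))  # → 5.0
```

Mapping such a per-site, per-month figure across a metro area is what produces the geographic snapshots shown above.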
Seasonal and geographic clustering in soiling rates has long been suspected to be a major blind spot for relative performance metrics: because all sites in an area soil together, comparing a site against its neighbors cannot reveal the loss. This visualization confirms that those suspicions are well founded.
Another trend we expected to see in the data is that soiling levels drop to zero during rainy seasons and then increase slowly over time. We also expected that some of the coastal microclimates in a particular region would experience lower levels of soiling than the drier parts of the region further from the coast.
To examine this, we used New Jersey as a test case. In contrast to the soiling patterns in the LA region, soiling losses in New Jersey appear to be essentially non-existent, consistent with the region’s frequent year-round rainfall.
Looking at the shading trends in the LA area, we further confirm that our algorithms are working as intended.
Unlike soiling, shading is driven by site-specific obstructions such as trees and neighboring structures, so nearby systems can differ sharply. A severely soiled site down the street from an unsoiled one, by contrast, would most likely require an industrious homeowner taking it upon themselves to hose down the panels on their roof, which presumably occurs only rarely. As expected, the shading data show a much more even geographic mix of severely shaded and unshaded sites than the soiling data did.
Our second expectation was that the data would show the most severe shading occurring when the sun is lowest in the sky, that is, in wintertime. Visual inspection confirms that shading peaks in November, December, and January, just as expected.
Turning to snow, consider Massachusetts in the winter of 2014-15, when it was hit with record-breaking snowfall. The following image shows the percent of energy that sites in Massachusetts lost to snow in December, January, February, and March (columns) for the last three winters (rows).
Just as we would have expected, residential PV systems had dramatically higher energy losses due to snow in Winter 2014-15 than in the following two winters.
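One small wrinkle in building a winter-by-winter comparison like the grid above is that a “winter” spans a calendar-year boundary: December 2014 belongs with January–March 2015. A minimal sketch of that grouping, using a hypothetical labeling convention, might look like:

```python
def winter_season(year, month):
    """Label a (year, month) with its winter season, e.g. Dec 2014
    and Jan-Mar 2015 both map to '2014-15'. Returns None outside
    the Dec-Mar window. (Hypothetical convention for illustration.)"""
    if month == 12:
        start = year
    elif month in (1, 2, 3):
        start = year - 1
    else:
        return None
    return f"{start}-{str(start + 1)[-2:]}"

print(winter_season(2014, 12))  # → 2014-15
print(winter_season(2015, 2))   # → 2014-15
print(winter_season(2015, 7))   # → None
```

Grouping per-site snow-loss estimates by this season label is what lets the record-setting 2014-15 winter be compared directly against the two winters that followed.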
Hopefully these visualizations have given you an idea of the interesting results that can be found by disaggregating loss factors when analyzing PV system performance across a very large fleet. In the next post, we will apply this data to the task of efficiently diagnosing underperforming systems within a large residential fleet.
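The disaggregation idea itself can be illustrated with a simple multiplicative loss “waterfall”: starting from modeled (irradiance-based) energy, each estimated loss factor is applied in turn, so whatever gap remains between the adjusted expectation and the meter reading is left for diagnosis. This is a sketch of the general technique only, with hypothetical names and numbers, not the Waterfall’s actual implementation:

```python
def loss_waterfall(modeled_kwh, loss_fractions):
    """Apply ordered loss factors multiplicatively to modeled energy.

    loss_fractions: ordered dict of {factor_name: fraction lost}.
    Returns (per-factor kWh lost, loss-adjusted expected kWh).
    """
    steps = []
    remaining = modeled_kwh
    for name, frac in loss_fractions.items():
        lost = remaining * frac
        remaining -= lost
        steps.append((name, lost))
    return steps, remaining

steps, expected = loss_waterfall(
    1000.0, {"soiling": 0.05, "shading": 0.10, "snow": 0.02}
)
# Soiling removes 50 kWh, shading 95 kWh of the remainder, and snow
# 17.1 kWh of what is left, for an adjusted expectation near 837.9 kWh.
```

Any shortfall of metered energy below the adjusted expectation is then unexplained by the modeled loss factors, which is what flags a system for closer inspection.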
Locus Energy's Virtual Irradiance provides high-accuracy irradiance data derived from satellite imagery that enables asset managers, performance engineers, and operators to benchmark, model, and diagnose solar PV site issues. To learn more about Virtual Irradiance, please click here. The Virtual Irradiance Performance Waterfall feature in the LocusNOC platform provides a better understanding of how discrepancies arise between measured and modeled system performance, allowing users to reach new levels of efficiency and cost reduction. To request more information on the VI Performance Waterfall, please click here.