I am running a simulation comparing a Huawei system to a SolarEdge system with near-identical panel layouts, and there is a fair amount of shading on certain sections of the array.
I would expect SolarEdge to outperform Huawei; however, not only does it have a lower kWh yield, it also has a higher Mismatch (Configuration/Shading) loss. Is there anything I'm missing? I can't get my head around how this is the case.
SolarEdge losses below, with 0.56% mismatch losses.
Huawei simulation below, with only 0.53% mismatch losses.
I also ran a Huawei option with a few TIGO optimisers, and this reduced the mismatch losses to 0.40%, which makes sense; I just don't understand why SolarEdge is performing so poorly. For scale, a quick sketch of what those percentages mean in absolute terms is below.
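Just to put the numbers side by side, here is a rough back-of-the-envelope check of how much energy each quoted mismatch percentage represents. The 10,000 kWh baseline is purely a placeholder I picked for illustration, not my actual simulation output:

```python
# Rough check: absolute energy represented by the quoted mismatch percentages.
# The 10,000 kWh/yr baseline is a hypothetical figure, not the real simulation result.
baseline_yield_kwh = 10_000  # assumed annual array energy before mismatch loss

mismatch_losses = {
    "SolarEdge": 0.0056,       # 0.56 % mismatch (configuration/shading)
    "Huawei": 0.0053,          # 0.53 %
    "Huawei + TIGO": 0.0040,   # 0.40 %
}

for system, loss in mismatch_losses.items():
    lost_kwh = baseline_yield_kwh * loss
    print(f"{system:15s} mismatch loss ~ {lost_kwh:5.1f} kWh/yr")
```

Even on that assumed baseline, the gap between 0.56% and 0.53% is only a few kWh a year, so the lower SolarEdge yield presumably has to be coming from somewhere other than the mismatch line.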
Thank you.