Temperature model parameter translation code
- [ ] Closes #1442
- [ ] I am familiar with the contributing guidelines
- [ ] Tests added
- [ ] Updates entries in `docs/sphinx/source/reference` for API changes.
- [ ] Adds description and name entries in the appropriate "what's new" file in `docs/sphinx/source/whatsnew` for all changes. Includes link to the GitHub Issue with :issue:`num` or this Pull Request with :pull:`num`. Includes contributor name and/or GitHub username (link with :ghuser:`user`).
- [ ] New code is fully documented. Includes numpydoc compliant docstrings, examples, and comments where necessary.
- [ ] Pull request is nearly complete and ready for detailed review.
- [ ] Maintainer: Appropriate GitHub Labels and Milestone are assigned to the Pull Request and linked Issue.
Open for a first round of comments! @jsstein @cwhanse @wholmgren @mikofski @kanderso-nrel @mtheristis
Is the basic idea here that GenericLinearModel acts as a neutral ground between the various temperature models? If I follow correctly: an instance is initialized, a set method is used to set the neutral model using a specific temperature model's parameters, then that instance's get_<target_model> method returns parameters for the target model. Is that right?
@cwhanse that's right, that's the basic idea.
I'm pleased that my class design has survived the first three days of intense scrutiny! :)
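For readers following along, a minimal sketch of that flow, assuming the set_*/get_* method names discussed below (the exact names and constructor arguments may differ from the final code):

glm = GenericLinearModel(module_efficiency=0.19, absorptance=0.88)
glm.set_faiman(u0=16, u1=8)       # set the neutral parameters from Faiman coefficients
pvsyst_params = glm.get_pvsyst()  # read out the equivalent PVsyst coefficients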
This is an interesting idea. As I commented in #1442, I think it might be helpful to start with explaining the different temperature models: how they differ, what they have in common, and why we need conversions.
Regarding this code, I think starting with a smaller PR of just the core functions that convert PVsyst to Faiman and vice versa would be better. Then we can consider a class structure that delegates methods to the core functions. This combination of procedural and OOP paradigms is a defining principle of pvlib, expressed in the user guide.
Also, regarding the set and get functions: you may wish to consider Python properties, which are built in and can eliminate some redundant boilerplate.
I don't really see these as core simulation functions. In fact, temperature.py might not be the best location. An example script or notebook are other options.
From a user's point of view, would it be practical to confine the public interface to a function like

newparams = temperature.convert(from_model='pvsyst', to_model='faiman', params=<pvsyst parameter dict>)
The neutral class could operate behind the curtain. Maybe you have already considered and rejected that interface.
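For illustration, such a convert wrapper could delegate to the neutral class roughly like this (a sketch only; the set_*/get_* method names are assumptions based on the surrounding discussion, not the PR's actual API):

def convert(from_model, to_model, params, module_efficiency=None, absorptance=None):
    """Sketch: convert temperature model parameters via the neutral class."""
    glm = GenericLinearModel(module_efficiency=module_efficiency,
                             absorptance=absorptance)
    getattr(glm, 'set_' + from_model)(**params)  # e.g. glm.set_pvsyst(u_c=29, u_v=0)
    return getattr(glm, 'get_' + to_model)()     # e.g. glm.get_faiman()

newparams = convert('pvsyst', 'faiman', {'u_c': 29, 'u_v': 0},
                    module_efficiency=0.19, absorptance=0.88)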
I was thinking of something similar but in reverse. From a computing standpoint I always think from simple to complex. So the core functions would be explicit:
def convert_temp_params(eta, absorptance, u0, u1):
    """Core function to convert temperature parameters"""
    # do stuff
    return new_params
Then a class would delegate methods to the core functions:
class GenericTempParams:
    """Class to convert temperature parameters"""
    def convert(self, from_model, to_model, params: dict):
        return convert_temp_params(self.eta, self.absorptance, **params)
This allows users to bypass the OOP layer and use the procedural functions directly.
Thank you for this discussion!
I would say that I've certainly considered and experimented with options in those directions. To say that I rejected them is too strong, but over some time I developed a preference for the solution I proposed.
I like having a user interface where the empirical coefficients are explicitly named in the methods, which steered me away from a do-it-all function. But indeed it would be easy to add such a function that uses my class, as @cwhanse suggests, and I have nothing against that.
To first write all the conversion calculations as stand-alone functions seemed to me like an extra layer of code with little benefit. There would either have to be n*(n-1) functions to do conversions with a single function call, or 2*n functions requiring two function calls per conversion (e.g., with four models, that is 12 pairwise functions versus 8 to/from-generic functions).
What led me to a class conceptually is the differentiation between the extra inputs, efficiency and absorptance, which are properties of the PV module, and the empirical parameters. For potential users this class interface seems (to me) really quite simple and clean.
What do you think about creating a module tmtools?
> What do you think about creating a module `tmtools`?
I do not like abbreviations that aren't common. iotools is fine because IO is a universal and well-known acronym. But what does tm mean? Based on the PR I assume it refers to TeMperature or Temperature Model? I would prefer something more easily understandable for the uninitiated, e.g., temptools or perhaps better yet temperaturemodels or temperaturecorrection.
pvlib.temperature.tools?
Well, the main question is whether to create a separate module. The name would be secondary.
I'm -1 on a new module; I think it's fine to put this code in pvlib.temperature. I don't think there's a lot of code required for the parameter translator, and it doesn't strike me as growing over time.
@adriesse thank you for proposing this interesting idea. My initial reactions were similar to @mikofski's:
- Can we use functions and if necessary wrap them in a class? But I think @adriesse made good points for why this class design is reasonable in this case.
- Can we use properties instead of `set_*` and `get_*`? I too have an allergic reaction to `set_` methods in python, but it's not clear to me how this class would be refactored to support properties. The set methods are unique to the model and they set attributes `u_const` and `du_wind`. The get methods are again unique to the model and return dicts with different keys. Maybe the property model would look something like this?
>>> glm = GenericLinearModel(module_efficiency=0.19, absorptance=0.88)
>>> glm.faiman = 16, 8
>>> glm.faiman
{'u0': 16.0, 'u1': 8.0}
>>> glm.pvsyst
{'u_c': 11.40, 'u_v': 5.70, 'module_efficiency': 0.19, 'alpha_absorption': 0.88}
Doesn't feel like a win to me.
Also, if I understand correctly, this is more than just converting between different model parameters. The class also has a __call__ method for calculating module temperature from the parameters.
Here's a mockup of what I was thinking, let me know if it's clear:
"""
Tools for converting between generic linear temperature models (GLM).
GLM defines 4 basic parameters for linear temperature models:
* ``u_const``
* ``du_wind``
* ``alpha``
* ``eta``
The parameters for Faiman, PVsyst, and other models can then be generated from
the GLM basic parameters.
"""
from dataclasses import dataclass


# basic functions
def _calc_net_absorptance_glm(alpha, eta):
    return alpha - eta


def _calc_net_absorptance_pvsyst(alpha, eta):
    return alpha * (1.0 - eta)


def _calc_absorptance_ratio(alpha, eta):
    net_absorptance_glm = _calc_net_absorptance_glm(alpha, eta)
    net_absorptance_pvsyst = _calc_net_absorptance_pvsyst(alpha, eta)
    return net_absorptance_glm / net_absorptance_pvsyst
# Faiman
def from_faiman_to_glm(u0, u1, alpha, eta):
    """
    Convert Faiman params ``(u0, u1)`` to GLM.

    Parameters
    ----------
    u0 : float
        Faiman natural convection h.t. coefficient
    u1 : float
        Faiman forced convection h.t. coefficient
    alpha : float
        module absorptance
    eta : float
        module efficiency

    Returns
    -------
    u_const : float
        GLM natural convection h.t. parameter
    du_wind : float
        GLM forced convection h.t. parameter
    """
    net_absorptance = _calc_net_absorptance_glm(alpha, eta)
    u_const = u0 * net_absorptance
    du_wind = u1 * net_absorptance
    return u_const, du_wind


def to_faiman_from_glm(alpha, eta, u_const, du_wind):
    net_absorptance = _calc_net_absorptance_glm(alpha, eta)
    u0 = u_const / net_absorptance
    u1 = du_wind / net_absorptance
    return u0, u1
# PVsyst
def from_pvsyst_to_glm(u_c, u_v, eta, alpha):
    # XXX: note in PVsyst eta comes before alpha!
    absorptance_ratio = _calc_absorptance_ratio(alpha, eta)
    u_const = u_c * absorptance_ratio
    du_wind = u_v * absorptance_ratio
    return u_const, du_wind


def to_pvsyst_from_glm(alpha, eta, u_const, du_wind):
    absorptance_ratio = _calc_absorptance_ratio(alpha, eta)
    u_c = u_const / absorptance_ratio
    u_v = du_wind / absorptance_ratio
    return u_c, u_v
# generic class
@dataclass
class GenericLinearModel:
    alpha: float
    eta: float
    u_const: float = 29.0
    du_wind: float = 0.0

    def __repr__(self):
        return self.__class__.__name__ + ': ' + vars(self).__repr__()

    @property
    def faiman(self):
        """Get Faiman model params matching GLM"""
        u0, u1 = to_faiman_from_glm(
            self.alpha, self.eta, self.u_const, self.du_wind)
        params = {'u0': u0, 'u1': u1}
        return params

    @faiman.setter
    def faiman(self, params):
        """Set the generic model params to match Faiman."""
        # XXX: Will override existing GLM parameters!
        u0 = params['u0']
        u1 = params['u1']
        u_const, du_wind = from_faiman_to_glm(u0, u1, self.alpha, self.eta)
        self.u_const = u_const
        self.du_wind = du_wind

    @property
    def pvsyst(self):
        """Get PVsyst model params matching GLM"""
        u_c, u_v = to_pvsyst_from_glm(
            self.alpha, self.eta, self.u_const, self.du_wind)
        params = {
            'u_c': u_c, 'u_v': u_v,
            'module_efficiency': self.eta, 'alpha_absorption': self.alpha}
        return params

    @pvsyst.setter
    def pvsyst(self, params):
        """Set the generic model params to match PVsyst."""
        u_c = params['u_c']
        u_v = params['u_v']
        module_efficiency = params.get('module_efficiency')
        alpha_absorption = params.get('alpha_absorption')
        # XXX: Will override existing eta and alpha in GLM class!
        if module_efficiency is not None:
            self.eta = module_efficiency
        if alpha_absorption is not None:
            self.alpha = alpha_absorption
        u_const, du_wind = from_pvsyst_to_glm(u_c, u_v, self.eta, self.alpha)
        self.u_const = u_const
        self.du_wind = du_wind
# tests
if __name__ == '__main__':
    print('\nTesting GLM: create GLM(eta=0.19, alpha=0.88)')
    glm = GenericLinearModel(eta=0.19, alpha=0.88)
    print('\n\tGLM: %r' % glm)

    print('\nTest setting Faiman {u0: 16, u1: 8}')
    glm.faiman = {'u0': 16, 'u1': 8}
    print('\n\tGLM: %r' % glm)
    print('\n\tFaiman: %r' % glm.faiman)
    print('\n\tPVsyst: %r' % glm.pvsyst)

    print('\nTest setting PVsyst {u_c: 29, u_v: 0}')
    glm.pvsyst = {'u_c': 29, 'u_v': 0}
    print('\n\tGLM: %r' % glm)
    print('\n\tFaiman: %r' % glm.faiman)
    print('\n\tPVsyst: %r' % glm.pvsyst)

    print('\nTest setting PVsyst {u_c: 25, u_v: 1.2, module_efficiency: 0.1, alpha_absorption: 0.9}')
    glm.pvsyst = {
        'u_c': 25, 'u_v': 1.2,
        'module_efficiency': 0.1, 'alpha_absorption': 0.9}
    print('\n\tGLM: %r' % glm)
    print('\n\tFaiman: %r' % glm.faiman)
    print('\n\tPVsyst: %r' % glm.pvsyst)
this yields the following output:
Testing GLM: create GLM(eta=0.19, alpha=0.88)
GLM: GenericLinearModel: {'alpha': 0.88, 'eta': 0.19, 'u_const': 29.0, 'du_wind': 0.0}
Test setting Faiman {u0: 16, u1: 8}
GLM: GenericLinearModel: {'alpha': 0.88, 'eta': 0.19, 'u_const': 11.04, 'du_wind': 5.52}
Faiman: {'u0': 16.0, 'u1': 8.0}
PVsyst: {'u_c': 11.404800000000002, 'u_v': 5.702400000000001, 'module_efficiency': 0.19, 'alpha_absorption': 0.88}
Test setting PVsyst {u_c: 29, u_v: 0}
GLM: GenericLinearModel: {'alpha': 0.88, 'eta': 0.19, 'u_const': 28.072390572390564, 'du_wind': 0.0}
Faiman: {'u0': 40.68462401795734, 'u1': 0.0}
PVsyst: {'u_c': 29.0, 'u_v': 0.0, 'module_efficiency': 0.19, 'alpha_absorption': 0.88}
Test setting PVsyst {u_c: 25, u_v: 1.2, module_efficiency: 0.1, alpha_absorption: 0.9}
GLM: GenericLinearModel: {'alpha': 0.9, 'eta': 0.1, 'u_const': 24.691358024691358, 'du_wind': 1.1851851851851851}
Faiman: {'u0': 30.864197530864196, 'u1': 1.4814814814814814}
PVsyst: {'u_c': 25.0, 'u_v': 1.2, 'module_efficiency': 0.1, 'alpha_absorption': 0.9}
Is this what you were expecting?
Naming nitpick: GLM is a well-used acronym for generalized linear model, and it does not require a leap of imagination to think a user might assume that temperature.GLM is akin to an ordinary linear regression that predicts temperature. We should call GenericLinearModel something else: GenericTemperatureModel perhaps?
@mikofski I like those functions. I'm less sure about the API for the setters... looks pretty complicated for a setter. Another thought is a frozen data class with constructors for each model.
One thing I realized while mocking this up is that the PR itself can be split into smaller chunks that are easier to review and blame:
- Basic docs, maybe in user guide
- helper & basic functions, no state
- the basic data class with properties for each model and related get/set methods that delegate back to the basic functions above
- finally the module temperature calculation method, currently `__call__()` but could be more explicit like `GLTM.calc_temperature()`
> less sure about the API for the setters
Python property setters only take a single argument, so this was a compromise to use a dictionary, but I think a dataclass could work here too, e.g.:
# in GLTM class
@pvsyst.setter
def pvsyst(self, klass):
    u_c = klass.u_c
    u_v = klass.u_v
    # same as before
    # no return


@dataclass
class PVsystTemperatureModel:
    u_c: float = 29
    u_v: float = 0
    module_efficiency: float = 0.1
    alpha_absorption: float = 0.9


# test
pvsyst = PVsystTemperatureModel(u_c=25, u_v=1.2)
gltm = GeneralLinearTemperatureModel(alpha=0.9, eta=0.1)
gltm.pvsyst = pvsyst  # <-- setter uses dataclass instead of dict
gltm.other_temp_model_like_faiman
# {u0: 30.86, u1: 1.48}
> GLM is a well-used acronym
I was thinking GLTM?
I don't know what the eyes emoji means to you but to me it means I'm looking at all this stuff.
Note: I think the dictionary is pretty robust against errors:
>>> glm = GenericLinearModel(eta=0.16, alpha=0.89)
>>> glm
# GenericLinearModel: {'alpha': 0.89, 'eta': 0.16, 'u_const': 29.0, 'du_wind': 0.0}
>>> glm.pvsyst = {'uc': 25}
# ---------------------------------------------------------------------------
# KeyError Traceback (most recent call last)
# <ipython-input-14-b75df7ef15c2> in <module>
# ----> 1 glm.pvsyst = {'uc': 25}
# ~\Projects\pvl_temperature_glm.py in pvsyst(self, params)
# 119 Set the generic model parms to match pvsyst.
# 120 """
# --> 121 u_c = params['u_c']
# 122 u_v = params['u_v']
# 123 module_efficiency = params.get('module_efficiency')
# KeyError: 'u_c'
>>> glm.pvsyst = {'u_c': 25}
# ---------------------------------------------------------------------------
# KeyError Traceback (most recent call last)
# <ipython-input-15-c501e2033ccd> in <module>
# ----> 1 glm.pvsyst = {'u_c': 25}
# ~\Projects\pvl_temperature_glm.py in pvsyst(self, params)
# 120 """
# 121 u_c = params['u_c']
# --> 122 u_v = params['u_v']
# 123 module_efficiency = params.get('module_efficiency')
# 124 alpha_absorption = params.get('alpha_absorption')
# KeyError: 'u_v'
I had a funny feeling that using "set*" and "get*" might trigger an association with synonymous python constructs. I had previously used "from*" and "to*" and perhaps I should have stuck to them.
I can figure out how everything above works, but I'm just not sure how to evaluate what's better. One aspect is the user interface: how people will interact using a program, script, or command line. The other aspect is code clarity: how people will understand the code when reading it.
To pick up on the robustness illustration, I prefer the error message below to the key error above:
gltm.set_pvsyst(29)
Traceback (most recent call last):
File "<ipython-input-6-416ad2a2b952>", line 1, in <module>
gltm.set_pvsyst(29)
TypeError: set_pvsyst() missing 1 required positional argument: 'u_v'
Also, the __call__() function was kind of an afterthought and used for testing. It's small but it changes the class from a pure utility that's not part of a simulation to something that could be called in a simulation. It could be dropped for the sake of clarity.
I think the module temp calculate using __call__ is a cool feature, even if it was an afterthought, and imo it's a good use of OOP. My comment was to break it off into a separate PR, and perhaps use an explicit name like gltm.calc_mod_temperature(). IMO using dunder functions and other magic should be mostly avoided as overly clever.
> how to evaluate what's better
I think what's easiest for the developers. The only way to reckon what the user wants is by deploying and listening. But ultimately we and our descendants will have to maintain this soup. We need to make sure it's readable. I hesitate to even use decorators, but it's hard to know what is "too clever". When in doubt I try to be pythonic and use the batteries that Python comes with.
> how to evaluate what's better
>
> I think what's easiest for the developers.
I think the developers will be able to deal with much greater complexity than the average pvlib user, so the latter should be given some consideration even if we don't have a very clear description of what such a user is like.
I personally don't have a lot of experience with decorators so I still find that they make code harder to read and understand. I have less trouble with "dunder" methods (learned a new word today!) but I agree it would be clearer to give this one a more meaningful name than __call__.
I'm not quite sure how to move toward a consensus here. I think perhaps the key to gaining acceptance for this particular class design is to really position it as an auxiliary tool, and to that end I have added a separate generic_linear() module temperature calculation function that fits the pattern of the existing ones.
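For reference, the heat balance behind such a function is the one implied by the conversions above: absorbed irradiance net of electrical output, divided by a wind-dependent loss coefficient. A sketch matching the call pattern of the existing pvlib.temperature functions (the exact signature in the PR may differ):

def generic_linear(poa_global, temp_air, wind_speed,
                   u_const, du_wind, module_efficiency, absorptance):
    """Sketch of a generic linear module temperature model."""
    # heat retained by the module: optical absorption minus electrical output
    heat_input = poa_global * (absorptance - module_efficiency)
    # combined heat loss coefficient, linear in wind speed
    total_loss_factor = u_const + du_wind * wind_speed
    return temp_air + heat_input / total_loss_factor

With u_const = u0 * (absorptance - module_efficiency) and du_wind = u1 * (absorptance - module_efficiency), this reduces exactly to the Faiman model, temp_air + poa_global / (u0 + u1 * wind_speed).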
I comment not because I feel strongly or think my input is particularly useful here but because feedback was specifically requested :)
Using a class seems a bit unnecessarily heavy to me; I think of classes as being useful for objects that carry around state throughout a chunk of code, but here it seems that the stored state (self.u_const, self.du_wind) is in effect a temporary intermediate value just to connect the output of a use_x call to the input of a to_y call. Or put another way, if typical usage looks like newparams = GenericLinearModel().use_pvsyst(29, 0).to_faiman(), I wonder why bother having a class and an object if I never even store it to a variable.
I think it might be more idiomatic to use functions like @mikofski's https://github.com/pvlib/pvlib-python/pull/1463#issuecomment-1146269945, although instead of wrapping them in (data)classes, I think I'd prefer @cwhanse's suggested interface https://github.com/pvlib/pvlib-python/pull/1463#issuecomment-1140539159.
Anyway all that said I don't hate the current approach. It has the benefit of not requiring repeated eta, alpha parameters in the underlying functions. And it's friendly to IDE auto-completion for parameter names, something which would be lost with dataclass-style glm.faiman = {'u0': 16, 'u1': 8}.
> An example script or notebook are other options.
This is an interesting point that I'm not sure anyone has explicitly responded to, but I don't have any thoughts beyond a general agreement that a gallery example could be a good alternative to new code in pvlib itself.
Thanks very much, @kanderso-nrel. Perhaps if no one else has strong objections, then we can move forward more-or-less as is? One way or another I need to bring this to a close (or rather a merge).
After reviewing all of the above, the only thing that I am moderately opposed to is dictionary input because it is not self-documenting.
I previously commented:
> Another thought is a frozen data class with constructors for each model.
Maybe the "frozen data class" distracted people from the more important aspect: "class with constructors for each model". So the interface would be something like
glm = GenericLinearModel.from_pvsyst()
glm()
glm.to_faiman()
glm2 = GenericLinearModel.from_faiman()
# etc
Fine with me if we stick with the current interface, but I like multiple constructors and we use them in ModelChain.
Making multiple constructors would be ok with me. I had toyed with that early on, but thought using the single constructor I could better emphasize the difference between module properties and empirical parameters. I guess this is an educational justification and I don't think either way is perfect.
Without chaining, the usage is currently:
- Define module properties
- Specify empirical parameters
- Query equivalent parameters
The reason some use_* functions have the extra optional module property parameters is to match the parameter list of the corresponding temperature model functions, so a user who has a dictionary of parameters can use it with both the model function and the conversion method.
Using four constructors, the Faiman and Sandia model class constructors would acquire extra module property parameters.
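A rough sketch of how those constructors might look, reusing the conversion math from the mockups above (names and defaults here are assumptions, not the final API):

from dataclasses import dataclass


@dataclass
class GenericLinearModel:
    absorptance: float
    module_efficiency: float
    u_const: float = 29.0
    du_wind: float = 0.0

    @classmethod
    def from_faiman(cls, u0, u1, module_efficiency, absorptance):
        # Faiman has no module-property inputs of its own, so this
        # constructor acquires them as extra parameters
        net_absorptance = absorptance - module_efficiency
        return cls(absorptance, module_efficiency,
                   u0 * net_absorptance, u1 * net_absorptance)

    @classmethod
    def from_pvsyst(cls, u_c, u_v, module_efficiency=0.1, alpha_absorption=0.9):
        # PVsyst already carries efficiency and absorptance, matching the
        # parameter list of temperature.pvsyst_cell
        ratio = ((alpha_absorption - module_efficiency)
                 / (alpha_absorption * (1.0 - module_efficiency)))
        return cls(alpha_absorption, module_efficiency,
                   u_c * ratio, u_v * ratio)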