hydpytools

This module implements the main features for managing HydPy projects.

Module hydpytools implements the following members:


class hydpy.core.hydpytools.HydPy(projectname: str | None = None)[source]

Bases: object

The main class for managing HydPy projects.

In typical HydPy projects, one prepares a single instance of class HydPy. This instance, which we name “hp” throughout this documentation instead of “hydpy” to avoid a naming collision with the hydpy site package, provides many convenient methods to perform tasks like reading time-series data or starting simulation runs. Additionally, it serves as a root to access most details of a HydPy project, allowing for more granular control over the framework features.

We elaborate these short explanations by using the LahnH example project. Calling function prepare_full_example_1() copies the complete example project LahnH into the iotesting directory of the HydPy site package (alternatively, you can copy the LahnH example project, which can be found in subpackage data, into a working directory of your choice):

>>> from hydpy.examples import prepare_full_example_1
>>> prepare_full_example_1()

At first, the HydPy instance needs to know the name of the relevant project, which is identical to the name of the project’s root directory. Pass LahnH to the constructor of class HydPy:

>>> from hydpy import HydPy
>>> hp = HydPy("LahnH")

So far, our HydPy instance does not know any project configurations except its name. Most of this information would be available via properties nodes and elements, but we get the following error responses if we try to access them:

>>> hp.nodes
Traceback (most recent call last):
...
AttributeError: The actual HydPy instance does not handle any nodes at the moment.
>>> hp.elements
Traceback (most recent call last):
...
AttributeError: The actual HydPy instance does not handle any elements at the moment.

One could continue rather quickly by calling the method prepare_everything(), which would make our HydPy instance ready for its first simulation run in one go. However, we prefer to proceed step by step, calling the more specific preparation methods, which offers more flexibility.
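
For orientation only, such a one-go preparation would look roughly as follows (a sketch we do not execute in this step-by-step walkthrough; it relies on the imports and the simulation period definition introduced further below):

from hydpy import TestIO, pub
pub.timegrids = "1996-01-01", "1996-01-05", "1d"
with TestIO():
    hp.prepare_everything()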

First, the HydPy instance needs to know the relevant Node and Element objects. Method prepare_network() reads this information from so-called “network files”. Then, the Node and Element objects connect automatically and thereby define the topology or the network structure of the project (see the documentation on class NetworkManager and module devicetools for more detailed explanations):

>>> from hydpy import TestIO
>>> with TestIO():
...     hp.prepare_network()

(Using the “with” statement in combination with class TestIO makes sure we are reading the network files from a subdirectory of the iotesting directory. Here and in the following, you must omit such “with blocks” in case you copied the LahnH example project into your current working directory.)

Now, our HydPy instance offers access to all Node objects defined within the LahnH example project, which are grouped by a Nodes object:

>>> hp.nodes
Nodes("dill", "lahn_1", "lahn_2", "lahn_3")

Taking the node dill as an example, we can dive into the details and, for example, search for those elements which node dill is connected to (it receives water from element land_dill and passes it to element stream_dill_lahn_2), or inspect its simulated discharge value handled by a Sim sequence object (so far, zero):

>>> hp.nodes.dill.entries
Elements("land_dill")
>>> hp.nodes.dill.exits
Elements("stream_dill_lahn_2")
>>> hp.nodes.dill.sequences.sim
sim(0.0)

All Node objects are ready to be used. The same is only partly true for the Element objects, which are also accessible (via an Elements instance) and properly connected to the Node objects but do not yet handle workable Model objects, which are required for performing any simulation run:

>>> hp.elements
Elements("land_dill", "land_lahn_1", "land_lahn_2", "land_lahn_3",
         "stream_dill_lahn_2", "stream_lahn_1_lahn_2",
         "stream_lahn_2_lahn_3")
>>> hp.elements.stream_dill_lahn_2
Element("stream_dill_lahn_2",
        inlets="dill",
        outlets="lahn_2",
        keywords="river")
>>> hp.elements.land_dill.model
Traceback (most recent call last):
...
hydpy.core.exceptiontools.AttributeNotReady: The model object of element `land_dill` has been requested but not been prepared so far.

Hence, we need to call method prepare_models(), which instructs all Element objects to read the relevant parameter control files and prepare their Model objects. Note that the individual Element object does not know the relevant model type beforehand; both the information on the model type and the parameter settings are encoded in individual control files, making it easy to exchange individual models later (the documentation on method prepare_models() of class Elements is a good starting point for a deeper understanding of configuring HydPy projects via control files):

>>> with TestIO():
...     hp.prepare_models()
Traceback (most recent call last):
...
hydpy.core.exceptiontools.AttributeNotReady: While trying to initialise the model object of element `land_dill`, the following error occurred: The initialisation period has not been defined via attribute `timegrids` of module `pub` yet but might be required to prepare the model properly.

Oops, something went wrong. We forgot to define the simulation period, which might be relevant for some time-dependent configurations. We discuss some examples of such configurations below but now use this little accident to discuss the typical pattern of HydPy error messages. First, we usually try to add some additional “spatial” information (in this case: the name of the related Element object). Second, we try to explain in which program context an error occurs. This context is already available in much more detail in the so-called “stack trace” (the middle part of the printed error response we do not show). Stack trace descriptions are great for programmers but hard to read for others, which is why we often add “While trying to…” explanations to our error messages. In our example, one can see that the error occurred while trying to initialise the Model object of element land_dill, which is quite evident in our example but could be less evident in more complex HydPy applications.

The last sentence of the error message tells us that we need to define the attribute timegrids of module pub. pub stands for “public”, meaning module pub handles all (or at least most of) the globally available configuration data. One example is that module pub handles a Timegrids instance defining both the initialisation and the simulation period, which can be done by the following assignment (see the documentation on class Timegrid and on class Timegrids for further information):

>>> from hydpy import pub
>>> pub.timegrids = "1996-01-01", "1996-01-05", "1d"

Now method prepare_models() does not complain anymore and adds an instance of the hland_v1 application model to element land_dill, to which we set an additional reference to shorten the following examples:

>>> with TestIO():
...     hp.prepare_models()
>>> model = hp.elements.land_dill.model
>>> model.name
'hland_v1'

All control parameter values, defined in the corresponding control file, are correctly set. As an example, we show the values of control parameter IcMax, which in this case defines different values for hydrological response units of type FIELD (1.0 mm) and of type FOREST (1.5 mm):

>>> model.parameters.control.icmax
icmax(field=1.0, forest=1.5)

The appearance (or “string representation”) of all parameters that have a unit with a time reference (we call these parameters “time-dependent”), like PercMax, depends on the current setting of option parameterstep, which is one day by default (see the documentation on class Parameter for more information on dealing with time-dependent Parameter subclasses):

>>> model.parameters.control.percmax
percmax(1.39636)
>>> pub.options.parameterstep("1h")
Period("1d")
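
As a hedged illustration of this dependency (the following numbers are computed by hand from the value shown above and are not part of the original example): PercMax has a time-related unit, so its string representation scales with the parameter step, and a one-hour parameter step should show roughly 1.39636 / 24 ≈ 0.058182:

pub.options.parameterstep("1h")          # switch from the default "1d" to "1h"
print(model.parameters.control.percmax)  # expected: roughly percmax(0.058182)
pub.options.parameterstep("1d")          # restore the default parameter step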

The values of the derived parameters, which need to be calculated before starting a simulation run based on the control parameters and possibly on some other settings (e.g., the initialisation period), are also ready. Here we show the value of the derived parameter UH, representing the ordinates of a unit hydrograph (the single value of 1.0 means that the unit hydrograph does not cause any time delay):

>>> model.parameters.derived.uh
uh(1.0)

We define all class names in “CamelCase” letters (which is a Python convention) and, whenever practical, name the related objects identically but in lower case letters. We hope this eases finding the relevant parts of the online documentation when in trouble with a particular object. Three examples we already encountered are the Timegrids instance timegrids of module pub, the Nodes instance nodes of class HydPy, and the UH instance uh of application model hland_v1:

>>> from hydpy import classname
>>> classname(pub.timegrids)
'Timegrids'
>>> classname(hp.nodes)
'Nodes'
>>> classname(model.parameters.derived.uh)
'UH'

As shown above, all Parameter objects of the model of element land_dill are ready to be used. However, all sequences (which handle the time variable properties) contain numpy nan values, which we use to indicate missing data. We show this for the 0-dimensional input sequence T, the 1-dimensional factor sequence TC, the 1-dimensional state sequence SM, and the 0-dimensional flux sequence QT:

>>> model.sequences.inputs.t
t(nan)
>>> model.sequences.factors.tc
tc(nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan)
>>> model.sequences.states.sm
sm(nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan)
>>> model.sequences.fluxes.qt
qt(nan)

There are some other sequence types (see the documentation on module sequencetools for more details), but InputSequence, FactorSequence, FluxSequence, and StateSequence are the most common ones (besides the NodeSequence subtypes Obs and especially Sim).

StateSequence objects describe many aspects of the current state of a model (or, e.g., of a catchment). Each simulation run requires proper initial states, which we call initial conditions in the following (also covering memory aspects represented by LogSequence objects). We load all necessary initial conditions by calling the method load_conditions() (see the documentation on method load_conditions() for further details):

>>> with TestIO():
...     hp.load_conditions()

Now, the states of our model are also ready to be used. However, one should note that state sequence SM knows only the current soil moisture states of the twelve hydrological response units of element land_dill (more specifically, we loaded the soil moisture values related to the start date of the initialisation period, which is January 1 at 0:00). By default and for reasons of memory efficiency, sequences generally handle only the currently relevant values instead of complete time-series:

>>> model.sequences.inputs.t
t(nan)
>>> model.sequences.factors.tc
tc(nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan)
>>> model.sequences.states.sm
sm(185.13164, 181.18755, 199.80432, 196.55888, 212.04018, 209.48859,
   222.12115, 220.12671, 230.30756, 228.70779, 236.91943, 235.64427)
>>> model.sequences.fluxes.qt
qt(nan)

For states like SM, we only need to know the values at the beginning of the simulation period. All subsequent values are calculated successively during the simulation run. However, this is different for input sequences like T. Time-variable properties like the air temperature are external forcings. Hence, they must be available for the whole simulation period a priori. Such complete time-series can be made available via property series of class IOSequence, which has not happened for any sequence so far:

>>> model.sequences.inputs.t.series
Traceback (most recent call last):
...
hydpy.core.exceptiontools.AttributeNotReady: Sequence `t` of element `land_dill` is not requested to make any time-series data available.

Before loading time-series data, we need to reserve the required memory storage. We do this for all sequences at once (not only the ModelSequence objects but also the NodeSequence objects, such as the Sim instance handled by node dill) by calling the method prepare_allseries():

>>> hp.prepare_allseries()

Now property series returns an InfoArray object, which is a slight modification of the widely applied numpy ndarray. The first axis (or the only axis) corresponds to the number of days of the initialisation period (a HydPy convention). For the 1-dimensional sequences TC and SM, the second axis corresponds to the number of hydrological response units (a hland convention):

>>> model.sequences.inputs.t.series
InfoArray([nan, nan, nan, nan])
>>> model.sequences.factors.tc.series
InfoArray([[nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan],
           [nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan],
           [nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan],
           [nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan]])
>>> model.sequences.states.sm.series
InfoArray([[nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan],
           [nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan],
           [nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan],
           [nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan]])
>>> model.sequences.fluxes.qt.series
InfoArray([nan, nan, nan, nan])
>>> hp.nodes.dill.sequences.sim.series
InfoArray([nan, nan, nan, nan])

So far, each time-series array is empty. The LahnH example project provides time-series files for the input sequences only, which is the minimum requirement for starting a simulation run. We use method load_inputseries() to load this data:

>>> with TestIO():
...     hp.load_inputseries()
>>> from hydpy import round_
>>> round_(model.sequences.inputs.t.series)
-0.298846, -0.811539, -2.493848, -5.968849

Finally, we can perform the simulation run by calling the method simulate():

>>> hp.simulate()

The time-series arrays of all sequences now contain calculated values — except those of input sequence T, of course (for the sequences TC and SM, we show the time-series of the first hydrological response unit only):

>>> round_(model.sequences.inputs.t.series)
-0.298846, -0.811539, -2.493848, -5.968849
>>> round_(model.sequences.factors.tc.series[:, 0])
0.751154, 0.238461, -1.443848, -4.918849
>>> round_(model.sequences.states.sm.series[:, 0])
184.926173, 184.603966, 184.386666, 184.098541
>>> round_(model.sequences.fluxes.qt.series)
11.78038, 8.901179, 7.131072, 6.017787
>>> round_(hp.nodes.dill.sequences.sim.series)
11.78038, 8.901179, 7.131072, 6.017787

By comparison, you see that the most recently calculated (or read) time-series value is the current one of each Sequence object. This mechanism allows, for example, writing the final states of the soil moisture sequence SM and using them as initial conditions later, even if its complete time-series were not available:

>>> model.sequences.inputs.t
t(-5.968849)
>>> model.sequences.states.sm
sm(184.098541, 180.176461, 198.689343, 195.462014, 210.856923,
   208.319571, 220.881637, 218.898327, 229.022364, 227.431521,
   235.597338, 234.329294)
>>> model.sequences.fluxes.qt
qt(6.017787)
>>> hp.nodes.dill.sequences.sim
sim(6.017787)
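
For instance, writing the current conditions to disk at this point would store exactly these final states, which a later simulation run could load as its initial conditions. A brief sketch (see the documentation on methods save_conditions() and load_conditions() below for tested examples):

with TestIO():
    hp.save_conditions()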

In many applications, the simulated time-series is the result we are interested in. Hence, we close our explanations with some detailed examples on this topic, which also cover the potential problem of limited RAM availability.

The HydPy framework does not overwrite already existing time-series files by default. However, you can change this and related settings via the SequenceManager object available in module pub (module pub also handles ControlManager and ConditionManager objects for settings related to reading and writing control files and condition files). We change the default behaviour by setting the overwrite attribute to True:

>>> pub.sequencemanager.overwrite = True

Now we can (over)write all possible time series:

>>> with TestIO():
...     hp.save_inputseries()
...     hp.save_factorseries()
...     hp.save_fluxseries()
...     hp.save_stateseries()
...     hp.save_simseries()
...     hp.save_obsseries()

Alternatively, apply save_modelseries() to write the series of all the InputSequence, FactorSequence, FluxSequence, and StateSequence objects and save_nodeseries() to write the series of all Sim and Obs objects in one step:

>>> with TestIO():
...     hp.save_modelseries()
...     hp.save_nodeseries()

Even shorter, just apply the method save_allseries():

>>> with TestIO():
...     hp.save_allseries()

Next, we show how the reading of time-series works. We first set the time-series values of all considered sequences to zero for this purpose:

>>> model.sequences.inputs.t.series = 0.0
>>> model.sequences.factors.tc.series = 0.0
>>> model.sequences.states.sm.series = 0.0
>>> model.sequences.fluxes.qt.series = 0.0
>>> hp.nodes.dill.sequences.sim.series = 0.0

Now we can reload the time-series of all relevant sequences. However, doing so would result in a warning due to incomplete data (for example, of the observation data handled by the Obs sequence objects, which is not available in the LahnH example project). To circumvent this problem, we disable the checkseries option, which is one of the public options handled by the instance of class Options available as another attribute of module pub. We again use “with blocks”, making sure the option (and the current working directory) changes only temporarily while loading the time-series:

>>> with TestIO(), pub.options.checkseries(False):
...     hp.load_inputseries()
...     hp.load_factorseries()
...     hp.load_fluxseries()
...     hp.load_stateseries()
...     hp.load_simseries()
...     hp.load_obsseries()
>>> with TestIO(), pub.options.checkseries(False):
...     hp.load_modelseries()
...     hp.load_nodeseries()
>>> with TestIO(), pub.options.checkseries(False):
...     hp.load_allseries()

The read time-series data equals the previously written one:

>>> round_(model.sequences.inputs.t.series)
-0.298846, -0.811539, -2.493848, -5.968849
>>> round_(model.sequences.factors.tc.series[:, 0])
0.751154, 0.238461, -1.443848, -4.918849
>>> round_(model.sequences.states.sm.series[:, 0])
184.926173, 184.603966, 184.386666, 184.098541
>>> round_(model.sequences.fluxes.qt.series)
11.78038, 8.901179, 7.131072, 6.017787
>>> round_(hp.nodes.dill.sequences.sim.series)
11.78038, 8.901179, 7.131072, 6.017787

We mentioned the possibility for more granular control of HydPy by using the different objects handled by the HydPy object instead of using its convenience methods. Here is an elaborate example showing how to (re)load the states of an arbitrary simulation time step, which might be relevant for more complex workflows implementing data assimilation techniques:

>>> model.sequences.states.load_data(1)
>>> model.sequences.states.sm
sm(184.603966, 180.671117, 199.234825, 195.998635, 211.435809,
   208.891492, 221.488046, 219.49929, 229.651122, 228.055912,
   236.244147, 234.972621)

Using the node sequence Sim as an example, we also show the inverse functionality of changing time-series values:

>>> hp.nodes.dill.sequences.sim = 0.0
>>> hp.nodes.dill.sequences.save_data(2)
>>> round_(hp.nodes.dill.sequences.sim.series)
11.78038, 8.901179, 0.0, 6.017787
>>> hp.nodes.dill.sequences.load_data(1)
>>> hp.nodes.dill.sequences.sim
sim(8.901179)

In the examples above, we keep all data in rapid access memory (RAM), which can be problematic when handling long time-series in huge HydPy projects. When in trouble, first try to prepare only those time-series that are strictly required (very often, it is sufficient to call prepare_inputseries(), load_inputseries(), and prepare_simseries() only). If this does not work in your project, you can read input data from and write output data to NetCDF files during simulation. These files follow the NetCDF Climate and Forecast (CF) Metadata Conventions. To benefit from this feature, assign False to the allocate_ram argument of the individual “prepare series” methods (which disables handling the time-series in RAM) and assign True to the respective “jit” arguments (which prepares the “just-in-time” file access). The methods prepare_factorseries(), prepare_fluxseries(), and prepare_stateseries() deal with “output sequences”, whose read data would be overwritten during the simulation, and thus only support the write_jit argument. The prepare_inputseries() method, on the other hand, supports both the read_jit and the write_jit argument. However, in most cases, only reading makes sense. The argument write_jit is intended for cases where other methods (for example, data assimilation approaches) modify the input data, and we need to keep track of these modifications:

>>> hp.prepare_inputseries(allocate_ram=False, read_jit=True)
>>> hp.prepare_factorseries(allocate_ram=False, write_jit=True)
>>> hp.prepare_fluxseries(allocate_ram=False, write_jit=True)
>>> hp.prepare_stateseries(allocate_ram=False, write_jit=True)
>>> hp.prepare_simseries(allocate_ram=False, write_jit=True)
>>> hp.prepare_obsseries(allocate_ram=False, read_jit=True)

By doing so, you lose the previously available time-series information. We use function attrready() to check this:

>>> from hydpy import attrready
>>> attrready(model.sequences.inputs.t, "series")
False
>>> attrready(model.sequences.factors.tc, "series")
False
>>> attrready(model.sequences.states.sm, "series")
False
>>> attrready(model.sequences.fluxes.qt, "series")
False
>>> attrready(hp.nodes.dill.sequences.sim, "series")
False

Reloading the initial conditions and starting a new simulation run leads to the same results as the simulation run above:

>>> with TestIO(), pub.options.checkseries(False):
...     hp.load_conditions()
...     hp.simulate()

This time, reading input data from files happened during the simulation. Likewise, the calculated output data is not directly available in RAM but stored in different NetCDF files. To check that all results are identical to those shown above, we must load them into RAM. Therefore, we first need to prepare the series objects again:

>>> hp.prepare_allseries()

By default, HydPy handles time-series data in simple text files (“asc” files):

>>> pub.sequencemanager.filetype
'asc'

One way to prepare for loading the results from the available NetCDF files instead is to set the filetype attribute of the public SequenceManager object to “nc”:

>>> pub.sequencemanager.filetype = "nc"

Now we can load the previously written results into RAM (see the documentation on module netcdftools for further information) and inspect the results:

>>> with TestIO(), pub.sequencemanager.netcdfreading():
...     hp.load_modelseries()
...     hp.load_simseries()
>>> round_(model.sequences.inputs.t.series)
-0.298846, -0.811539, -2.493848, -5.968849
>>> round_(model.sequences.factors.tc.series[:, 0])
0.751154, 0.238461, -1.443848, -4.918849
>>> round_(model.sequences.states.sm.series[:, 0])
184.926173, 184.603966, 184.386666, 184.098541
>>> round_(model.sequences.fluxes.qt.series)
11.78038, 8.901179, 7.131072, 6.017787
>>> round_(hp.nodes.dill.sequences.sim.series)
11.78038, 8.901179, 7.131072, 6.017787

You can handle time-series in RAM and allow just-in-time NetCDF file access at the same time. Before showing how this works, we first disable both functionalities for all sequences and delete all previously written NetCDF files:

>>> hp.prepare_allseries(allocate_ram=False)
>>> attrready(model.sequences.inputs.t, "series")
False
>>> attrready(model.sequences.factors.tc, "series")
False
>>> attrready(model.sequences.states.sm, "series")
False
>>> attrready(model.sequences.fluxes.qt, "series")
False
>>> attrready(hp.nodes.dill.sequences.sim, "series")
False
>>> import os
>>> with TestIO():
...     for filename in os.listdir("LahnH/series/default"):
...         if "input" not in filename:
...             os.remove(f"LahnH/series/default/{filename}")

We again call method prepare_allseries(), but now with assigning True to the arguments allocate_ram and jit:

>>> hp.prepare_allseries(allocate_ram=True, jit=True)

After another simulation run, all input data (read during simulation) and output data (calculated during simulation) are directly available:

>>> with TestIO(), pub.options.checkseries(False):
...     hp.load_conditions()
...     hp.simulate()
>>> round_(model.sequences.inputs.t.series)
-0.298846, -0.811539, -2.493848, -5.968849
>>> round_(model.sequences.factors.tc.series[:, 0])
0.751154, 0.238461, -1.443848, -4.918849
>>> round_(model.sequences.states.sm.series[:, 0])
184.926173, 184.603966, 184.386666, 184.098541
>>> round_(model.sequences.fluxes.qt.series)
11.78038, 8.901179, 7.131072, 6.017787
>>> round_(hp.nodes.dill.sequences.sim.series)
11.78038, 8.901179, 7.131072, 6.017787

After deallocating and then reallocating the RAM storage (to refresh it), reading the previously written NetCDF files makes the same data available again:

>>> hp.prepare_allseries(allocate_ram=False)
>>> hp.prepare_allseries(allocate_ram=True)
>>> with TestIO(), pub.sequencemanager.netcdfreading():
...     hp.load_modelseries()
...     hp.load_simseries()
>>> round_(model.sequences.inputs.t.series)
-0.298846, -0.811539, -2.493848, -5.968849
>>> round_(model.sequences.factors.tc.series[:, 0])
0.751154, 0.238461, -1.443848, -4.918849
>>> round_(model.sequences.states.sm.series[:, 0])
184.926173, 184.603966, 184.386666, 184.098541
>>> round_(model.sequences.fluxes.qt.series)
11.78038, 8.901179, 7.131072, 6.017787
>>> round_(hp.nodes.dill.sequences.sim.series)
11.78038, 8.901179, 7.131072, 6.017787

deviceorder: List[Node | Element]

nodes

The currently handled Node objects.

You are allowed to get, set and delete the currently handled nodes:

>>> from hydpy.examples import prepare_full_example_2
>>> hp, pub, TestIO = prepare_full_example_2()
>>> hp.nodes
Nodes("dill", "lahn_1", "lahn_2", "lahn_3")
>>> del hp.nodes
>>> hp.nodes
Traceback (most recent call last):
...
AttributeError: The actual HydPy instance does not handle any nodes at the moment.
>>> hp.nodes = "dill", "lahn_1"
>>> hp.nodes
Nodes("dill", "lahn_1")

However, note that doing so might result in erroneous networks and that, even if the network remains correct, you most likely need to call method update_devices() before performing the next simulation run.
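
A minimal sketch of such an update, assuming the selections prepared by function prepare_full_example_2() are still available via module pub:

hp.update_devices(selection=pub.selections.headwaters)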

elements

The currently handled Element objects.

You are allowed to get, set and delete the currently handled elements:

>>> from hydpy.examples import prepare_full_example_2
>>> hp, pub, TestIO = prepare_full_example_2()
>>> hp.elements
Elements("land_dill", "land_lahn_1", "land_lahn_2", "land_lahn_3",
         "stream_dill_lahn_2", "stream_lahn_1_lahn_2",
         "stream_lahn_2_lahn_3")
>>> del hp.elements
>>> hp.elements
Traceback (most recent call last):
...
AttributeError: The actual HydPy instance does not handle any elements at the moment.
>>> hp.elements = "land_dill", "land_lahn_1"
>>> hp.elements
Elements("land_dill", "land_lahn_1")

However, note that doing so might result in erroneous networks and that, even if the network remains correct, you most likely need to call method update_devices() before performing the next simulation run.

prepare_everything() → None[source]

Convenience method to make the actual HydPy instance runnable.

Method prepare_everything() is the fastest approach to get a runnable HydPy object. You only need to import class HydPy, initialise it with the project name, define the simulation period via the Timegrids object of module pub, and call method prepare_everything() (in this documentation, we first need to prepare the example project via function prepare_full_example_1() and change the current working directory via class TestIO):

>>> from hydpy.examples import prepare_full_example_1
>>> prepare_full_example_1()
>>> from hydpy import HydPy, pub, round_, TestIO
>>> with TestIO():
...     hp = HydPy("LahnH")
...     pub.timegrids = "1996-01-01", "1996-01-05", "1d"
...     hp.prepare_everything()

Now you can start a simulation run and inspect the calculated time-series of all relevant sequences. We take the discharge values of the flux sequence QT of Element object land_dill and of the node sequence Sim of Node object dill as examples, which provide the same information:

>>> hp.simulate()
>>> round_(hp.elements.land_dill.model.sequences.fluxes.qt.series)
11.78038, 8.901179, 7.131072, 6.017787
>>> round_(hp.nodes.dill.sequences.sim.series)
11.78038, 8.901179, 7.131072, 6.017787

prepare_network() → None[source]

Load all network files as Selections (stored in module pub) and assign the “complete” selection to the HydPy object.

First, we call function prepare_full_example_1() to prepare the LahnH example project, including its network files headwaters.py, nonheadwaters.py, and streams.py:

>>> from hydpy.examples import prepare_full_example_1
>>> prepare_full_example_1()

Directly after initialising class HydPy, neither the resulting object nor module pub contain any information stemming from the network files:

>>> from hydpy import HydPy, pub, TestIO
>>> hp = HydPy("LahnH")
>>> pub.selections
Traceback (most recent call last):
...
hydpy.core.exceptiontools.AttributeNotReady: Attribute selections of module `pub` is not defined at the moment.

By calling the method prepare_network(), one loads all three network files into separate Selection objects, all handled by the Selections object of module pub. Additionally, there is a Selection object named complete, covering all Node and Element objects of the other Selection objects:

>>> with TestIO():
...     hp.prepare_network()
>>> pub.selections
Selections("complete", "headwaters", "nonheadwaters", "streams")
>>> pub.selections.headwaters <= pub.selections.complete
True
>>> pub.selections.nonheadwaters <= pub.selections.complete
True
>>> pub.selections.streams <= pub.selections.complete
True

Initially, the HydPy object is aware of the complete set of Node and Element objects:

>>> hp.nodes == pub.selections.complete.nodes
True
>>> hp.elements == pub.selections.complete.elements
True

See the documentation on method update_devices() on how to “activate” another selection in the safest manner.

prepare_models() → None[source]

Read all control files related to the current Element objects, initialise the defined models, and prepare their parameter values.

First, we call function prepare_full_example_1() to prepare the LahnH example project:

>>> from hydpy.examples import prepare_full_example_1
>>> prepare_full_example_1()

Now we can initialise a HydPy instance accordingly and call its methods prepare_network() and prepare_models():

>>> from hydpy import HydPy, pub, round_, TestIO
>>> with TestIO():
...     pub.timegrids = "1996-01-01", "1996-01-05", "1d"
...     hp = HydPy("LahnH")
...     hp.prepare_network()
...     hp.prepare_models()

As a result, each Element object handles a model of the type and with the parameter values defined in the relevant control file:

>>> hp.elements.land_dill.model.name
'hland_v1'
>>> hp.elements.land_dill.model.parameters.control.area
area(692.3)
>>> hp.elements.stream_lahn_1_lahn_2.model.name
'musk_classic'
>>> hp.elements.stream_lahn_1_lahn_2.model.parameters.control.nmbsegments
nmbsegments(lag=0.583)

The LahnH example project comes with one auxiliary file, named land.py. This file defines general parameter values that are valid for all parameter objects of the different model instances referencing it via the auxfile keyword argument. The following examples use the land_dill element to show that the affected parameters are also correctly prepared:

>>> control = hp.elements.land_dill.model.parameters.control
>>> control.alpha
alpha(1.0)
>>> control.pcorr
pcorr(1.0)
>>> control.resparea
resparea(True)
>>> control.icmax
icmax(field=1.0, forest=1.5)

We show that the IcMax values of two different elements differ to demonstrate that parameter values defined within a master control file (here: ZoneType) can affect the actual values of parameters defined in auxiliary control files:

>>> from hydpy import round_
>>> round_(control.icmax.values)
1.0, 1.5, 1.0, 1.5, 1.0, 1.5, 1.0, 1.5, 1.0, 1.5, 1.0, 1.5
>>> round_(
...     hp.elements.land_lahn_2.model.parameters.control.icmax.values)
1.0, 1.5, 1.0, 1.5, 1.0, 1.5, 1.0, 1.5, 1.0, 1.5

Missing parameter information in auxiliary files results in errors like the following:

>>> filepath = "LahnH/control/default/land.py"
>>> with TestIO():
...     with open(filepath) as infile:
...         text = infile.read().replace("alpha(1.0)", "")
...     with open(filepath, "w") as outfile:
...         outfile.write(text)
...     hp.prepare_models()   
Traceback (most recent call last):
...
RuntimeError: While trying to initialise the model object of element `land_dill`, the following error occurred: While trying to load the control file `...land_dill.py`, the following error occurred: While trying to extract information for parameter `alpha` from file `land`, the following error occurred: The selected auxiliary file does not define value(s) for parameter `alpha`.

Completely wrong control files result in the following error:

>>> with TestIO():
...     with open("LahnH/control/default/land_dill.py", "w"):
...         pass
...     hp.prepare_models()   
Traceback (most recent call last):
...
RuntimeError: While trying to initialise the model object of element `land_dill`, the following error occurred: Model parameters cannot be loaded from control file `...land_dill.py`.  Please refer to the HydPy documentation on how to prepare control files properly.

init_models() → None[source]

Deprecated! Use method prepare_models() instead.

>>> from hydpy import HydPy
>>> from unittest import mock
>>> from hydpy.core.testtools import warn_later
>>> with warn_later(), mock.patch.object(HydPy, "prepare_models") as mocked:
...     hp = HydPy("test")
...     hp.init_models()
HydPyDeprecationWarning: Method `init_models` of class `HydPy` is deprecated.  Use method `prepare_models` instead.
>>> mocked.call_args_list
[call()]

save_controls(parameterstep: timetools.PeriodConstrArg | None = None, simulationstep: timetools.PeriodConstrArg | None = None, auxfiler: auxfiletools.Auxfiler | None = None) → None[source]

Write the control files of all current Element objects.

We use the LahnH example project to demonstrate how to write a complete set of parameter control files. For convenience, we let function prepare_full_example_2() prepare a fully functional HydPy object, handling seven Element objects controlling four hland_v1 and three musk_classic application models:

>>> from hydpy.examples import prepare_full_example_2
>>> hp, pub, TestIO = prepare_full_example_2()

At first, there is only one control subfolder named “default”, containing the seven master control files used in the step above:

>>> import os
>>> with TestIO():
...     os.listdir("LahnH/control")
['default']

Next, we use the ControlManager to create a new directory and write analogous control files into it:

>>> with TestIO():
...     pub.controlmanager.currentdir = "newdir"
...     hp.save_controls()
...     sorted(os.listdir("LahnH/control"))
['default', 'newdir']

We focus our examples on the (shorter) control files of the application model musk_classic. These control files define the values of the parameters NmbSegments and Coefficients via the keyword arguments lag and damp. For the river channel connecting the outlets of subcatchment lahn_1 and lahn_2, the lag value is 0.583 days, and the damp value is zero:

>>> model = hp.elements.stream_lahn_1_lahn_2.model
>>> model.parameters.control
nmbsegments(lag=0.583)
coefficients(damp=0.0)

Its control file’s name equals the element’s name:

>>> dir_ = "LahnH/control/newdir/"
>>> with TestIO():
...     with open(dir_ + "stream_lahn_1_lahn_2.py") as controlfile:
...         print(controlfile.read())
# -*- coding: utf-8 -*-

from hydpy.models.musk_classic import *

simulationstep("1d")
parameterstep("1d")

nmbsegments(lag=0.583)
coefficients(damp=0.0)

The time step information stems from the Timegrid object available via pub:

>>> pub.timegrids.stepsize
Period("1d")

Use the Auxfiler class to avoid redefining the same parameter values in multiple control files. We prepare an Auxfiler object that handles the model’s two parameters discussed above:

>>> from hydpy import Auxfiler
>>> auxfiler = Auxfiler("musk_classic")
>>> auxfiler.musk_classic.add_parameter(
...     model.parameters.control.nmbsegments, filename="stream")
>>> auxfiler.musk_classic.add_parameter(
...     model.parameters.control.coefficients, filename="stream")

When passing the Auxfiler object to the method save_controls(), the control file of element stream_lahn_1_lahn_2 does not define the values of both parameters on its own but references the auxiliary file stream.py instead:

>>> with TestIO():
...     pub.controlmanager.currentdir = "newdir"
...     hp.save_controls(auxfiler=auxfiler)
...     with open(dir_ + "stream_lahn_1_lahn_2.py") as controlfile:
...         print(controlfile.read())
# -*- coding: utf-8 -*-

from hydpy.models.musk_classic import *

simulationstep("1d")
parameterstep("1d")

nmbsegments(auxfile="stream")
coefficients(auxfile="stream")

stream.py contains the actual value definitions:

>>> with TestIO():
...     with open(dir_ + "stream.py") as controlfile:
...         print(controlfile.read())
# -*- coding: utf-8 -*-

from hydpy.models.musk_classic import *

simulationstep("1d")
parameterstep("1d")

nmbsegments(lag=0.583)
coefficients(damp=0.0)

The musk_classic model of element stream_lahn_2_lahn_3 defines the same value for parameter Coefficients but a different one for parameter NmbSegments. Hence, only Coefficients can reference the control file stream.py:

>>> with TestIO():
...     with open(dir_ + "stream_lahn_2_lahn_3.py") as controlfile:
...         print(controlfile.read())
# -*- coding: utf-8 -*-

from hydpy.models.musk_classic import *

simulationstep("1d")
parameterstep("1d")

nmbsegments(lag=0.417)
coefficients(auxfile="stream")

Another option is to pass alternative step size information. The simulationstep information, which is not an integral part of control files but is helpful when testing them, has no impact on the written data. However, passing alternative parameterstep information changes the written values of time-dependent parameters both in the primary and in the auxiliary control files:

>>> with TestIO():
...     pub.controlmanager.currentdir = "newdir"
...     hp.save_controls(
...         auxfiler=auxfiler, parameterstep="2d", simulationstep="1h")
...     with open(dir_ + "stream_lahn_1_lahn_2.py") as controlfile:
...         print(controlfile.read())
# -*- coding: utf-8 -*-

from hydpy.models.musk_classic import *

simulationstep("1h")
parameterstep("2d")

nmbsegments(auxfile="stream")
coefficients(auxfile="stream")
>>> with TestIO():
...     with open(dir_ + "stream.py") as controlfile:
...         print(controlfile.read())
# -*- coding: utf-8 -*-

from hydpy.models.musk_classic import *

simulationstep("1h")
parameterstep("2d")

nmbsegments(lag=0.2915)
coefficients(damp=0.0)
>>> with TestIO():
...     with open(dir_ + "stream_lahn_2_lahn_3.py") as controlfile:
...         print(controlfile.read())
# -*- coding: utf-8 -*-

from hydpy.models.musk_classic import *

simulationstep("1h")
parameterstep("2d")

nmbsegments(lag=0.2085)
coefficients(auxfile="stream")

load_conditions() → None[source]

Load all currently relevant initial conditions.

The following examples demonstrate the functionality of both methods load_conditions() and save_conditions() based on the LahnH project, which we prepare via function prepare_full_example_2():

>>> from hydpy.examples import prepare_full_example_2
>>> hp, pub, TestIO = prepare_full_example_2()

Our HydPy instance hp is ready for the first simulation run, meaning the required initial conditions are available already. First, we start a simulation run covering the whole initialisation period and inspect the resulting soil moisture values of Element land_dill, handled by a sequence object of type SM:

>>> hp.simulate()
>>> sm = hp.elements.land_dill.model.sequences.states.sm
>>> sm
sm(184.098541, 180.176461, 198.689343, 195.462014, 210.856923,
   208.319571, 220.881637, 218.898327, 229.022364, 227.431521,
   235.597338, 234.329294)

By default, method load_conditions() always (re)loads the initial conditions from the directory with its name matching the start date of the simulation period, which we prove by also showing the related content of the respective condition file land_dill.py:

>>> with TestIO():
...     hp.load_conditions()
>>> sm
sm(185.13164, 181.18755, 199.80432, 196.55888, 212.04018, 209.48859,
   222.12115, 220.12671, 230.30756, 228.70779, 236.91943, 235.64427)
>>> path = "LahnH/conditions/init_1996_01_01_00_00_00/land_dill.py"
>>> with TestIO():
...     with open(path, "r") as file_:
...         lines = file_.read().split("\n")
...         print(lines[10])
...         print(lines[11])
sm(185.13164, 181.18755, 199.80432, 196.55888, 212.04018, 209.48859,
   222.12115, 220.12671, 230.30756, 228.70779, 236.91943, 235.64427)

Now we perform two consecutive runs, covering the first and the second half of the initialisation period, respectively, and write, in both cases, the resulting final conditions to disk:

>>> pub.timegrids.sim.lastdate = "1996-01-03"
>>> hp.simulate()
>>> sm
sm(184.603966, 180.671117, 199.234825, 195.998635, 211.435809,
   208.891492, 221.488046, 219.49929, 229.651122, 228.055912,
   236.244147, 234.972621)
>>> with TestIO():
...     hp.save_conditions()
>>> pub.timegrids.sim.firstdate = "1996-01-03"
>>> pub.timegrids.sim.lastdate = "1996-01-05"
>>> hp.simulate()
>>> with TestIO():
...     hp.save_conditions()
>>> sm
sm(184.098541, 180.176461, 198.689343, 195.462014, 210.856923,
   208.319571, 220.881637, 218.898327, 229.022364, 227.431521,
   235.597338, 234.329294)

Analogous to method load_conditions(), method save_conditions() writes the resulting conditions to a directory with its name matching the end date of the simulation period, which we prove by reloading the conditions related to the middle of the initialisation period and showing the relevant file content:

>>> with TestIO():
...     hp.load_conditions()
>>> sm
sm(184.603966, 180.671117, 199.234825, 195.998635, 211.435809,
   208.891492, 221.488046, 219.49929, 229.651122, 228.055912,
   236.244147, 234.972621)
>>> path = "LahnH/conditions/init_1996_01_03_00_00_00/land_dill.py"
>>> with TestIO():
...     with open(path, "r") as file_:
...         lines = file_.read().split("\n")
...         print(lines[10])
...         print(lines[11])
...         print(lines[12])
sm(184.603966, 180.671117, 199.234825, 195.998635, 211.435809,
   208.891492, 221.488046, 219.49929, 229.651122, 228.055912,
   236.244147, 234.972621)

You can define another directory by assigning a different name to the attribute currentdir of the actual ConditionManager instance:

>>> with TestIO():
...     pub.conditionmanager.currentdir = "test"
...     hp.save_conditions()
>>> path = "LahnH/conditions/test/land_dill.py"
>>> with TestIO():
...     with open(path, "r") as file_:
...         lines = file_.read().split("\n")
...         print(lines[10])
...         print(lines[11])
...         print(lines[12])
sm(184.603966, 180.671117, 199.234825, 195.998635, 211.435809,
   208.891492, 221.488046, 219.49929, 229.651122, 228.055912,
   236.244147, 234.972621)

This change remains permanent until you undo it manually:

>>> sm(0.0)
>>> pub.timegrids.sim.firstdate = "1996-01-01"
>>> with TestIO():
...     hp.load_conditions()
>>> sm
sm(184.603966, 180.671117, 199.234825, 195.998635, 211.435809,
   208.891492, 221.488046, 219.49929, 229.651122, 228.055912,
   236.244147, 234.972621)
>>> with TestIO():
...     del pub.conditionmanager.currentdir
...     hp.load_conditions()
>>> sm
sm(185.13164, 181.18755, 199.80432, 196.55888, 212.04018, 209.48859,
   222.12115, 220.12671, 230.30756, 228.70779, 236.91943, 235.64427)

save_conditions() → None[source]

Save all currently relevant final conditions.

See the documentation on method load_conditions() for further information.

trim_conditions() → None[source]

Check all values of the condition sequences (StateSequence and LogSequence objects) for boundary violations and fix them if necessary.

We use the LahnH example project to explain the functionality of the method trim_conditions(), which gives no response when all conditions are correctly set:

>>> from hydpy.examples import prepare_full_example_2
>>> hp, pub, TestIO = prepare_full_example_2()
>>> with pub.options.warntrim(True):
...     hp.trim_conditions()

If you try, for example, to set interception capacities (Ic) that violate the maximum capacity (IcMax), you get a direct response based on function trim():

>>> from hydpy.core.testtools import warn_later
>>> with pub.options.warntrim(True), warn_later():
...     hp.elements.land_dill.model.sequences.states.ic(1.2)
UserWarning: For variable `ic` of element `land_dill` at least one value needed to be trimmed.  The old and the new value(s) are `1.2, 1.2, 1.2, 1.2, 1.2, 1.2, 1.2, 1.2, 1.2, 1.2, 1.2, 1.2` and `1.0, 1.2, 1.0, 1.2, 1.0, 1.2, 1.0, 1.2, 1.0, 1.2, 1.0, 1.2`, respectively.

However, changing the boundaries themselves without adjusting the conditions cannot be detected automatically. Whenever in doubt, call method trim_conditions() explicitly:

>>> hp.elements.land_dill.model.parameters.control.icmax(1.1)
>>> with pub.options.warntrim(True), warn_later():
...     hp.trim_conditions()
UserWarning: For variable `ic` of element `land_dill` at least one value needed to be trimmed.  The old and the new value(s) are `1.0, 1.2, 1.0, 1.2, 1.0, 1.2, 1.0, 1.2, 1.0, 1.2, 1.0, 1.2` and `1.0, 1.1, 1.0, 1.1, 1.0, 1.1, 1.0, 1.1, 1.0, 1.1, 1.0, 1.1`, respectively.

reset_conditions() → None[source]

Reset all currently relevant condition sequences.

Method reset_conditions() is the most convenient way to perform simulations repeatedly for the same period, each time starting from the same initial conditions, e.g. for parameter calibration. Each StateSequence and LogSequence object remembers the last assigned values and can reactivate them for the mentioned purpose.

For demonstration, we perform a simulation for the LahnH example project spanning four days:

>>> from hydpy.examples import prepare_full_example_2
>>> hp, pub, TestIO = prepare_full_example_2()
>>> hp.simulate()
>>> from hydpy import print_values
>>> print_values(hp.nodes.lahn_3.sequences.sim.series)
54.043745, 37.320814, 31.922053, 28.413644

Just repeating the simulation gives different results due to applying the final states of the first simulation run as the initial states of the second run:

>>> hp.simulate()
>>> print_values(hp.nodes.lahn_3.sequences.sim.series)
26.218473, 25.039964, 24.205384, 23.296241

Calling reset_conditions() first allows repeating the first simulation run exactly multiple times:

>>> hp.reset_conditions()
>>> hp.simulate()
>>> print_values(hp.nodes.lahn_3.sequences.sim.series)
54.043745, 37.320814, 31.922053, 28.413644
>>> hp.reset_conditions()
>>> hp.simulate()
>>> print_values(hp.nodes.lahn_3.sequences.sim.series)
54.043745, 37.320814, 31.922053, 28.413644

property conditions: Dict[str, Dict[str, Dict[str, float | ndarray[Any, dtype[float64]]]]]

A nested dictionary that contains the values of all condition sequences of all currently handled models.

The primary purpose of property conditions is similar to that of method reset_conditions(): to allow repeated calculations starting from the same initial conditions. However, conditions is more flexible when handling multiple sets of conditions, which can, for example, be useful for applying ensemble-based assimilation algorithms.
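
As a hedged illustration of this flexibility (the variable name is ours and not part of the original examples), one can memorise several condition snapshots and switch between them at will:

snapshots = [hp.conditions]    # memorise the current model states
# ...perform simulation runs or modify individual states...
hp.conditions = snapshots[0]   # return to the memorised states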

For demonstration, we perform simulations for the LahnH example project spanning the first three months of 1996. We begin with a preparation run beginning on January 1 and ending on February 20:

>>> from hydpy.examples import prepare_full_example_1
>>> prepare_full_example_1()
>>> from hydpy import HydPy, pub, TestIO, print_values
>>> with TestIO():
...     hp = HydPy("LahnH")
...     pub.timegrids = "1996-01-01", "1996-04-01", "1d"
...     hp.prepare_everything()
>>> pub.timegrids.sim.lastdate = "1996-02-20"
>>> hp.simulate()
>>> print_values(hp.nodes.lahn_3.sequences.sim.series[48:52])
70.553509, 94.344086, nan, nan

At the end of the preparation run, a snow layer is covering the Lahn catchment. In the lahn_1 subcatchment, this snow layer contains 19.5 mm of frozen water and 1.7 mm of liquid water:

>>> lahn1_states = hp.elements.land_lahn_1.model.sequences.states
>>> print_values([lahn1_states.sp.average_values()])
19.543831
>>> print_values([lahn1_states.wc.average_values()])
1.745963

Now we save the current conditions and perform the first simulation run from the 20th day of February until the end of March:

>>> conditions = hp.conditions
>>> hp.nodes.lahn_3.sequences.sim.series = 0.0
>>> pub.timegrids.sim.firstdate = "1996-02-20"
>>> pub.timegrids.sim.lastdate = "1996-04-01"
>>> hp.simulate()
>>> first = hp.nodes.lahn_3.sequences.sim.series.copy()
>>> print_values(first[48:52])
0.0, 0.0, 85.150677, 63.902098

To exactly repeat the last simulation run, we assign the memorised conditions to property conditions:

>>> hp.conditions = conditions
>>> print_values([lahn1_states.sp.average_values()])
19.543831
>>> print_values([lahn1_states.wc.average_values()])
1.745963

All discharge values of the second simulation run are identical to the ones of the first simulation run:

>>> hp.nodes.lahn_3.sequences.sim.series = 0.0
>>> pub.timegrids.sim.firstdate = "1996-02-20"
>>> pub.timegrids.sim.lastdate = "1996-04-01"
>>> hp.simulate()
>>> second = hp.nodes.lahn_3.sequences.sim.series.copy()
>>> print_values(second[48:52])
0.0, 0.0, 85.150677, 63.902098
>>> all(first == second)
True

We selected the snow period as an example due to potential problems with the limited water holding capacity of the snow layer, which depends on the ice content of the snow layer (SP) and the relative water holding capacity (WHC). To give an example of such a problem, we set WHC to zero temporarily, apply the memorised conditions, and finally reset the original WHC values:

>>> for element in hp.elements.catchment:
...     element.whc = element.model.parameters.control.whc.values
...     element.model.parameters.control.whc = 0.0
>>> with pub.options.warntrim(False):
...     hp.conditions = conditions
>>> for element in hp.elements.catchment:
...     element.model.parameters.control.whc = element.whc

Without any water holding capacity of the snow layer, its water content is zero despite the actual memorised value of 1.7 mm:

>>> print_values([lahn1_states.sp.average_values()])
19.543831
>>> print_values([lahn1_states.wc.average_values()])
0.0

What happens in such conflicts partly depends on the implementation of the respective application model. For safety, we suggest setting the option warntrim to True before resetting conditions.
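
A corresponding sketch, applying the warntrim option in the same temporary manner as shown in the documentation on method trim_conditions():

with pub.options.warntrim(True):
    hp.conditions = conditions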

property networkproperties: Dict[str, int | Dict[str, int] | Dict[devicetools.NodeVariableType, int]]

Some properties of the network defined by the currently relevant Node and Element objects.

See the documentation on method print_networkproperties() for further information.
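
A minimal access sketch (we only show the pattern here; the concrete figures for the LahnH example appear in the printout of method print_networkproperties() below):

properties = hp.networkproperties
# a dictionary whose values are plain counts or nested count dictionaries,
# matching the type annotation above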

print_networkproperties() → None[source]

Print some properties of the network defined by the currently relevant Node and Element objects.

Method print_networkproperties() conveniently summarises specific network measures such as segregatednetworks.

The LahnH example project defines a small, single network, with all catchments ultimately discharging to node lahn_3:

>>> from hydpy.examples import prepare_full_example_1
>>> prepare_full_example_1()
>>> from hydpy import HydPy, pub, TestIO
>>> pub.timegrids = "1996-01-01", "1996-01-05", "1d"
>>> with TestIO():
...     hp = HydPy("LahnH")
...     hp.prepare_network()
...     hp.prepare_models()
>>> hp.print_networkproperties()
Number of nodes: 4
Number of elements: 7
Number of end nodes: 1
Number of distinct networks: 1
Applied node variables: Q (4)
Applied model types: hland_v1 (4) and musk_classic (3)

property endnodes: Nodes

All currently relevant Node objects that define a downstream endpoint of the network.

The LahnH example project defines a small, single network, with all catchments ultimately discharging to node lahn_3:

>>> from hydpy.examples import prepare_full_example_1
>>> prepare_full_example_1()
>>> from hydpy import HydPy, TestIO
>>> with TestIO():
...     hp = HydPy("LahnH")
...     hp.prepare_network()
>>> hp.endnodes
Nodes("lahn_3")

After breaking the connection between node lahn_1 and its downstream river channel element stream_lahn_1_lahn_2, lahn_1 also becomes an end node:

>>> hp.nodes.lahn_1.exits.mutable = True
>>> hp.elements.stream_lahn_1_lahn_2.inlets.mutable = True
>>> del hp.nodes.lahn_1.exits.stream_lahn_1_lahn_2
>>> del hp.elements.stream_lahn_1_lahn_2.inlets.lahn_1
>>> hp.endnodes
Nodes("lahn_1", "lahn_3")

Even with a proper connection to a downstream element, a node counts as an end node as long as this element is not part of the currently relevant network (i.e., the network currently handled by the HydPy object):

>>> del hp.elements.stream_dill_lahn_2
>>> hp.nodes.dill.exits
Elements("stream_dill_lahn_2")
>>> hp.endnodes
Nodes("dill", "lahn_1", "lahn_3")

Connections with “remote” elements are considered irrelevant:

>>> stream = hp.elements.stream_lahn_2_lahn_3
>>> stream.inlets.mutable = True
>>> stream.receivers.mutable = True
>>> stream.receivers += stream.inlets.lahn_2
>>> del stream.inlets.lahn_2
>>> hp.endnodes
Nodes("dill", "lahn_1", "lahn_2", "lahn_3")

property segregatednetworks: Selections

The segregated networks defined by the currently relevant Node and Element objects.

Each end node (as defined by property endnodes) eventually defines a single network, segregated from the networks of the other end nodes. As the LahnH example project defines only a single end node, there is, accordingly, only one segregated network:

>>> from hydpy.examples import prepare_full_example_1
>>> prepare_full_example_1()
>>> from hydpy import HydPy, TestIO
>>> with TestIO():
...     hp = HydPy("LahnH")
...     hp.prepare_network()
>>> hp.segregatednetworks
Selections("lahn_3")
>>> hp.segregatednetworks.lahn_3
Selection("lahn_3",
          nodes=("dill", "lahn_1", "lahn_2", "lahn_3"),
          elements=("land_dill", "land_lahn_1", "land_lahn_2",
                    "land_lahn_3", "stream_dill_lahn_2",
                    "stream_lahn_1_lahn_2", "stream_lahn_2_lahn_3"))

Revisiting the examples of the documentation on property endnodes, we get similar results. Note that the segregated networks are always Selection objects that do not overlap (meaning, no Node or Element object occurs more than once):

>>> hp.nodes.lahn_1.exits.mutable = True
>>> hp.elements.stream_lahn_1_lahn_2.inlets.mutable = True
>>> del hp.nodes.lahn_1.exits.stream_lahn_1_lahn_2
>>> del hp.elements.stream_lahn_1_lahn_2.inlets.lahn_1
>>> hp.segregatednetworks
Selections("lahn_1", "lahn_3")
>>> hp.segregatednetworks.lahn_1
Selection("lahn_1",
          nodes="lahn_1",
          elements="land_lahn_1")
>>> hp.segregatednetworks.lahn_3
Selection("lahn_3",
          nodes=("dill", "lahn_2", "lahn_3"),
          elements=("land_dill", "land_lahn_2", "land_lahn_3",
                    "stream_dill_lahn_2", "stream_lahn_1_lahn_2",
                    "stream_lahn_2_lahn_3"))
>>> del hp.elements.stream_dill_lahn_2
>>> hp.nodes.dill.exits
Elements("stream_dill_lahn_2")
>>> hp.segregatednetworks
Selections("dill", "lahn_1", "lahn_3")
>>> hp.segregatednetworks.dill
Selection("dill",
          nodes="dill",
          elements="land_dill")
>>> hp.segregatednetworks.lahn_1
Selection("lahn_1",
          nodes="lahn_1",
          elements="land_lahn_1")
>>> hp.segregatednetworks.lahn_3
Selection("lahn_3",
          nodes=("lahn_2", "lahn_3"),
          elements=("land_lahn_2", "land_lahn_3",
                    "stream_lahn_1_lahn_2", "stream_lahn_2_lahn_3"))
>>> stream = hp.elements.stream_lahn_2_lahn_3
>>> stream.inlets.mutable = True
>>> stream.receivers.mutable = True
>>> stream.receivers += stream.inlets.lahn_2
>>> del stream.inlets.lahn_2
>>> hp.segregatednetworks
Selections("dill", "lahn_1", "lahn_2", "lahn_3")
>>> hp.segregatednetworks.dill
Selection("dill",
          nodes="dill",
          elements="land_dill")
>>> hp.segregatednetworks.lahn_1
Selection("lahn_1",
          nodes="lahn_1",
          elements="land_lahn_1")
>>> hp.segregatednetworks.lahn_2
Selection("lahn_2",
          nodes="lahn_2",
          elements=("land_lahn_2", "stream_lahn_1_lahn_2"))
>>> hp.segregatednetworks.lahn_3
Selection("lahn_3",
          nodes="lahn_3",
          elements=("land_lahn_3", "stream_lahn_2_lahn_3"))

In all examples above, the number of end nodes and the number of segregated networks are identical, which is not the case when two or more end nodes drain the same network. We restore our original network and add two additional end nodes, nowhere and somewhere, linking the first one with element stream_lahn_2_lahn_3 and the second one with the additional element stream_lahn_1_nowhere, which we connect to node lahn_1:

>>> with TestIO():
...     hp = HydPy("LahnH")
...     hp.prepare_network()
>>> from hydpy import Element
>>> _ = Element("stream_lahn_2_lahn_3", outlets="nowhere")
>>> hp.nodes += "nowhere"
>>> hp.elements += Element("stream_lahn_1_nowhere",
...                        inlets="lahn_1",
...                        outlets="somewhere")
>>> hp.nodes += "somewhere"

Now there are three end nodes but only two segregated networks, as node nowhere references only upstream devices that node lahn_3 references as well. The unique feature of elements land_lahn_3 and stream_lahn_1_nowhere is that they drain to either node lahn_3 or node somewhere but not to both, which is why they are the only members of selections lahn_3 and somewhere, respectively:

>>> hp.endnodes
Nodes("lahn_3", "nowhere", "somewhere")
>>> hp.segregatednetworks
Selections("lahn_3", "somewhere")
>>> hp.segregatednetworks.lahn_3
Selection("lahn_3",
          nodes="lahn_3",
          elements="land_lahn_3")
>>> hp.segregatednetworks.somewhere
Selection("somewhere",
          nodes="somewhere",
          elements="stream_lahn_1_nowhere")
property variables: Dict[devicetools.NodeVariableType, int]

Summary of all variable properties of the currently relevant Node objects.

>>> from hydpy.examples import prepare_full_example_1
>>> prepare_full_example_1()
>>> from hydpy import HydPy, TestIO
>>> with TestIO():
...     hp = HydPy("LahnH")
...     hp.prepare_network()
>>> hp.variables
{'Q': 4}
>>> from hydpy import FusedVariable, Node
>>> from hydpy.inputs import hland_T
>>> hp.nodes += Node("test", variable=FusedVariable("T", hland_T))
>>> hp.variables
{'Q': 4, FusedVariable("T", hland_T): 1}
property modeltypes: Dict[str, int]

Summary of all Model subclasses of the currently relevant Element objects.

>>> from hydpy.examples import prepare_full_example_1
>>> prepare_full_example_1()
>>> from hydpy import HydPy, pub, TestIO
>>> with TestIO():
...     hp = HydPy("LahnH")
...     hp.prepare_network()
>>> hp.modeltypes
{'unprepared': 7}
>>> pub.timegrids = "1996-01-01", "1996-01-05", "1d"
>>> with TestIO():
...     hp.prepare_models()
>>> hp.modeltypes
{'hland_v1': 4, 'musk_classic': 3}
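
As a purely illustrative aside (not part of the original documentation), one could use this summary for a simple readiness check before starting a simulation. The helper function below is hypothetical:

def simulate_if_prepared(hp):
    # Hypothetical helper: simulate only if every element already handles
    # a prepared model instance (see the "unprepared" entry above).
    if "unprepared" in hp.modeltypes:
        raise RuntimeError("call method `prepare_models` first")
    hp.simulate()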
update_devices(*, selection: selectiontools.Selection | None = None, nodes: devicetools.NodesConstrArg | None = None, elements: devicetools.ElementsConstrArg | None = None) None[source]

Determine the order in which method simulate() processes the currently relevant Node and Element objects.

Passed Node and Element objects (for example, those contained within a Selection object) replace the existing ones.

As described in the documentation on the method prepare_network(), a HydPy object usually starts with the “complete” network of the considered project:

>>> from hydpy.examples import prepare_full_example_2
>>> hp, pub, TestIO = prepare_full_example_2()

The safest approach to “activate” another selection is to use the method update_devices(). The first option is to pass a complete Selection object:

>>> pub.selections.headwaters
Selection("headwaters",
          nodes=("dill", "lahn_1"),
          elements=("land_dill", "land_lahn_1"))
>>> hp.update_devices(selection=pub.selections.headwaters)
>>> hp.nodes
Nodes("dill", "lahn_1")
>>> hp.elements
Elements("land_dill", "land_lahn_1")

Method update_devices() automatically updates the deviceorder, ensuring method simulate() processes “upstream” model instances before it processes their “downstream” neighbours:

>>> for device in hp.deviceorder:
...     print(device)
land_dill
land_lahn_1
dill
lahn_1

Second, you can pass nodes only, which, incidentally, removes the previously handled elements:

>>> hp.update_devices(nodes="dill")
>>> hp.nodes
Nodes("dill")
>>> hp.elements
Elements()
>>> for device in hp.deviceorder:
...     print(device)
dill

Third, you can pass elements only, which, incidentally, removes the previously handled nodes:

>>> hp.update_devices(elements=["land_lahn_1", "land_dill"])
>>> hp.nodes
Nodes()
>>> hp.elements
Elements("land_dill", "land_lahn_1")
>>> for device in hp.deviceorder:
...     print(device)
land_dill
land_lahn_1

Fourth, you can pass nodes and elements at the same time:

>>> hp.update_devices(nodes="dill",
...                   elements=["land_lahn_1", "land_dill"])
>>> hp.nodes
Nodes("dill")
>>> hp.elements
Elements("land_dill", "land_lahn_1")
>>> for device in hp.deviceorder:
...     print(device)
land_dill
land_lahn_1
dill

Fifth, you can pass no argument at all, which only updates the device order:

>>> del hp.nodes.dill
>>> for device in hp.deviceorder:
...     print(device)
land_dill
land_lahn_1
dill
>>> hp.update_devices()
>>> for device in hp.deviceorder:
...     print(device)
land_dill
land_lahn_1

Method update_devices() does not allow passing single devices and devices contained within a selection at the same time:

>>> hp.update_devices(selection=pub.selections.headwaters,
...                   nodes="dill")
Traceback (most recent call last):
...
ValueError: Method `update_devices` of class `HydPy` does not allow to use both the `selection` argument and the `nodes` or  the `elements` argument at the same time.
>>> hp.update_devices(selection=pub.selections.headwaters,
...                   elements=["land_lahn_1", "land_dill"])
Traceback (most recent call last):
...
ValueError: Method `update_devices` of class `HydPy` does not allow to use both the `selection` argument and the `nodes` or  the `elements` argument at the same time.
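
The following short sketch (not part of the original examples) combines update_devices() with a subsequent simulation run. It relies only on the objects returned by prepare_full_example_2() and on methods documented in this section:

from hydpy.examples import prepare_full_example_2

# Prepare a runnable LahnH setup, restrict it to the "headwaters"
# selection, and simulate this subnetwork only.
hp, pub, TestIO = prepare_full_example_2()
hp.update_devices(selection=pub.selections.headwaters)
hp.simulate()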
property methodorder: List[Callable[[int], None]]

All methods of the currently relevant Node and Element objects to be processed by method simulate() during a simulation time step, ordered in a correct execution sequence.

Property methodorder should be of interest to framework developers only.

simulate() None[source]

Perform a simulation run over the actual simulation period defined by the Timegrids object stored in module pub.

We let function prepare_full_example_2() prepare a runnable HydPy object related to the LahnH example project:

>>> from hydpy.examples import prepare_full_example_2
>>> hp, pub, TestIO = prepare_full_example_2()

First, we execute a default simulation run covering the whole simulation period and inspect the discharge series simulated at the outlet of the river basin, represented by node lahn_3:

>>> hp.simulate()
>>> from hydpy import round_
>>> round_(hp.nodes.lahn_3.sequences.sim.series)
54.043745, 37.320814, 31.922053, 28.413644

After resetting the initial conditions via method reset_conditions(), we repeat the simulation run and get the same results:

>>> hp.reset_conditions()
>>> hp.simulate()
>>> round_(hp.nodes.lahn_3.sequences.sim.series)
54.043745, 37.320814, 31.922053, 28.413644

Simulation runs do not need to cover the whole initialisation period at once. After setting the lastdate property of the sim Timegrid of the Timegrids object stored in module pub to the middle of the initialisation period, method simulate() calculates the first two discharge values only:

>>> hp.reset_conditions()
>>> hp.nodes.lahn_3.sequences.sim.series = 0.0
>>> pub.timegrids.sim.lastdate = "1996-01-03"
>>> hp.simulate()
>>> round_(hp.nodes.lahn_3.sequences.sim.series)
54.043745, 37.320814, 0.0, 0.0

After adjusting both the firstdate and the lastdate of the sim Timegrid to the second half of the initialisation period, method simulate() completes the time series:

>>> pub.timegrids.sim.firstdate = "1996-01-03"
>>> pub.timegrids.sim.lastdate = "1996-01-05"
>>> hp.simulate()
>>> round_(hp.nodes.lahn_3.sequences.sim.series)
54.043745, 37.320814, 31.922053, 28.413644

In the above examples, each Model object (handled by an Element object) passes its simulated values via a Node object to its downstream Model object. There are four ways to deviate from this default behaviour, selectable for each node individually via property deploymode. We focus on node lahn_2 as the upstream neighbour of node lahn_3. So far, its deploy mode is newsim, meaning that the node passes newly calculated simulation values to the downstream element stream_lahn_2_lahn_3:

>>> hp.nodes.lahn_2.deploymode
'newsim'

Under the second option, oldsim, node lahn_2 does not pass the discharge values simulated in the next simulation run but the “old” discharge values already available in the series array of the Sim sequence. This behaviour can, for example, be useful when calibrating successive subareas of a river basin sequentially, beginning with the headwaters and continuing with their downstream neighbours (a rough sketch of this workflow follows after the next example). For the clarity of this example, we decrease all values of the “old” simulated series of node lahn_2 by 10 m³/s:

>>> round_(hp.nodes.lahn_2.sequences.sim.series)
42.3697, 27.210443, 22.930066, 20.20133
>>> hp.nodes.lahn_2.deploymode = "oldsim"
>>> hp.nodes.lahn_2.sequences.sim.series -= 10.0

After performing another simulation run (over the whole initialisation period, again), the manually modified discharge values of node lahn_2 remain unchanged. Compared to the newsim runs, the simulated values of node lahn_3 are decreased by 10 m³/s (there is no time delay or damping of the discharge between the two nodes because the lag time of the application model musk_classic is smaller than the simulation time step):

>>> hp.reset_conditions()
>>> pub.timegrids.sim.firstdate = "1996-01-01"
>>> pub.timegrids.sim.lastdate = "1996-01-05"
>>> hp.simulate()
>>> round_(hp.nodes.lahn_2.sequences.sim.series)
32.3697, 17.210443, 12.930066, 10.20133
>>> round_(hp.nodes.lahn_3.sequences.sim.series)
44.043745, 27.320814, 21.922053, 18.413644
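
The following sketch outlines the sequential workflow mentioned above. It is not part of the original documentation, and the selection name downstream is hypothetical; it stands for a user-defined Selection containing the remaining elements together with all nodes they are connected to (including the headwater outlets):

from hydpy.examples import prepare_full_example_2

hp, pub, TestIO = prepare_full_example_2()

# Step 1: restrict the network to the headwater catchments and run
# (or calibrate) their models.
hp.update_devices(selection=pub.selections.headwaters)
hp.simulate()

# Step 2: let the headwater outlet nodes deploy the discharge just stored
# in their sim series instead of newly calculated values.
for node in pub.selections.headwaters.nodes:
    node.deploymode = "oldsim"

# Step 3: switch to a user-defined selection (hypothetically named
# "downstream") covering the remaining elements and their nodes, and run
# (or calibrate) the downstream models.
hp.update_devices(selection=pub.selections.downstream)
hp.simulate()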

The third option is obs: node lahn_2 still receives and stores the values from its upstream models but passes other, observed values, handled by sequence Obs, which we, for simplicity, set to zero for the complete initialisation and simulation period (more often, one would read measured data from files via methods such as load_obsseries()):

>>> hp.nodes.lahn_2.deploymode = "obs"
>>> hp.nodes.lahn_2.sequences.obs.series = 0.0

Now the simulated values of node lahn_2 are identical to those of the newsim example, but the simulated values of node lahn_3 are lower because it receives the observed instead of the simulated values from upstream:

>>> hp.reset_conditions()
>>> hp.nodes.lahn_3.sequences.sim.series = 0.0
>>> hp.simulate()
>>> round_(hp.nodes.lahn_2.sequences.obs.series)
0.0, 0.0, 0.0, 0.0
>>> round_(hp.nodes.lahn_2.sequences.sim.series)
42.3697, 27.210443, 22.930066, 20.20133
>>> round_(hp.nodes.lahn_3.sequences.sim.series)
11.674045, 10.110371, 8.991987, 8.212314

Unfortunately, observation time-series are often incomplete. HydPy generally uses numpy nan to represent missing values. Passing nan inputs to a model usually results in nan outputs. Hence, after assigning nan to some entries of the observation series of node lahn_2, the simulation series of node lahn_3 also contains nan values:

>>> from numpy import nan
>>> with pub.options.checkseries(False):
...     hp.nodes.lahn_2.sequences.obs.series = 0.0, nan, 0.0, nan
>>> hp.reset_conditions()
>>> hp.nodes.lahn_3.sequences.sim.series = 0.0
>>> hp.simulate()
>>> round_(hp.nodes.lahn_2.sequences.obs.series)
0.0, nan, 0.0, nan
>>> round_(hp.nodes.lahn_2.sequences.sim.series)
42.3697, 27.210443, 22.930066, 20.20133
>>> round_(hp.nodes.lahn_3.sequences.sim.series)
11.674045, nan, 8.991987, nan

To avoid calculating nan values, one can select the fourth option, obs_newsim. Now the priority for node lahn_2 is to deploy its observed values. However, for each missing observation, it deploys its newly simulated value instead:

>>> hp.nodes.lahn_2.deploymode = "obs_newsim"
>>> hp.reset_conditions()
>>> hp.simulate()
>>> round_(hp.nodes.lahn_2.sequences.obs.series)
0.0, nan, 0.0, nan
>>> round_(hp.nodes.lahn_2.sequences.sim.series)
42.3697, 27.210443, 22.930066, 20.20133
>>> round_(hp.nodes.lahn_3.sequences.sim.series)
11.674045, 37.320814, 8.991987, 28.413644

The fifth option, obs_oldsim, serves the same purpose as option obs_newsim but uses already available “old” simulation results as substitutes:

>>> hp.nodes.lahn_2.deploymode = "obs_oldsim"
>>> hp.reset_conditions()
>>> hp.nodes.lahn_2.sequences.sim.series = (
...     32.3697, 17.210443, 12.930066, 10.20133)
>>> hp.simulate()
>>> round_(hp.nodes.lahn_2.sequences.obs.series)
0.0, nan, 0.0, nan
>>> round_(hp.nodes.lahn_2.sequences.sim.series)
32.3697, 17.210443, 12.930066, 10.20133
>>> round_(hp.nodes.lahn_3.sequences.sim.series)
11.674045, 27.320814, 8.991987, 18.413644

The last example shows that resetting option deploymode to newsim restores the default behaviour of method simulate():

>>> hp.nodes.lahn_2.deploymode = "newsim"
>>> hp.reset_conditions()
>>> hp.simulate()
>>> round_(hp.nodes.lahn_2.sequences.sim.series)
42.3697, 27.210443, 22.930066, 20.20133
>>> round_(hp.nodes.lahn_3.sequences.sim.series)
54.043745, 37.320814, 31.922053, 28.413644
doit() None[source]

Deprecated! Use method simulate() instead.

>>> from hydpy import HydPy
>>> from hydpy.core.testtools import warn_later
>>> from unittest import mock
>>> with warn_later(), mock.patch.object(HydPy, "simulate") as mocked:
...     hp = HydPy("test")
...     hp.doit()
HydPyDeprecationWarning: Method `doit` of class `HydPy` is deprecated.  Use method `simulate` instead.
>>> mocked.call_args_list
[call()]
prepare_allseries(allocate_ram: bool = True, jit: bool = False) None[source]

Tell all current IOSequence objects how to handle time-series data.

Assign True to the allocate_ram argument (default) to activate the series property of all sequences so that their time-series data can become available in RAM.

Assign True to the jit argument to activate the “just-in-time” reading from NetCDF files for all InputSequence and Obs objects and to activate the “just-in-time” writing of NetCDF files for all FactorSequence, FluxSequence, StateSequence and Sim objects.

See the main documentation on class HydPy for further information.
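
A minimal usage sketch (not taken from the original examples) might look as follows. It assumes the runnable LahnH setup returned by prepare_full_example_2() and combines methods documented below, such as load_inputseries() and save_simseries():

from hydpy.examples import prepare_full_example_2

hp, pub, TestIO = prepare_full_example_2()

# Keep the time-series data of all sequences in RAM (allocate_ram=True is
# the default).
hp.prepare_allseries()

with TestIO():
    hp.load_inputseries()  # (re)read the meteorological input data
    hp.simulate()
    hp.save_simseries()  # write the simulated node discharge to files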

prepare_modelseries(allocate_ram: bool = True, jit: bool = False) None[source]

An alternative method for prepare_allseries() specialised for model sequences.

prepare_inputseries(allocate_ram: bool = True, read_jit: bool = False, write_jit: bool = False) None[source]

An alternative method for prepare_allseries() specialised for model input sequences.

prepare_factorseries(allocate_ram: bool = True, write_jit: bool = False) None[source]

An alternative method for prepare_allseries() specialised for model factor sequences.

prepare_fluxseries(allocate_ram: bool = True, write_jit: bool = False) None[source]

An alternative method for prepare_allseries() specialised for model flux sequences.

prepare_stateseries(allocate_ram: bool = True, write_jit: bool = False) None[source]

An alternative method for prepare_allseries() specialised for model state sequences.

prepare_nodeseries(allocate_ram: bool = True, jit: bool = False) None[source]

An alternative method for prepare_allseries() specialised for node sequences.

prepare_simseries(allocate_ram: bool = True, read_jit: bool = False, write_jit: bool = False) None[source]

An alternative method for prepare_allseries() specialised for simulation sequences of nodes.

prepare_obsseries(allocate_ram: bool = True, read_jit: bool = False, write_jit: bool = False) None[source]

An alternative method for prepare_allseries() specialised for observation sequences of nodes.

save_allseries() None[source]

Write the time-series data of all current IOSequence objects at once to data file(s).

See the main documentation on class HydPy for further information.

save_modelseries() None[source]

An alternative method for save_allseries() specialised for model sequences.

save_inputseries() None[source]

An alternative method for save_allseries() specialised for model input sequences.

save_factorseries() None[source]

An alternative method for save_allseries() specialised for model factor sequences.

save_fluxseries() None[source]

An alternative method for save_allseries() specialised for model flux sequences.

save_stateseries() None[source]

An alternative method for save_allseries() specialised for model state sequences.

save_nodeseries() None[source]

An alternative method for save_allseries() specialised for node sequences.

save_simseries() None[source]

An alternative method for save_allseries() specialised for simulation sequences of nodes.

save_obsseries() None[source]

An alternative method for save_allseries() specialised for observation sequences of nodes.

load_allseries() None[source]

Read the time-series data of all current IOSequence objects at once from data file(s).

See the main documentation on class HydPy for further information.

load_modelseries() None[source]

An alternative method for load_allseries() specialised for model sequences.

load_inputseries() None[source]

An alternative method for load_allseries() specialised for model input sequences.

load_factorseries() None[source]

An alternative method for load_allseries() specialised for model factor sequences.

load_fluxseries() None[source]

An alternative method for load_allseries() specialised for model flux sequences.

load_stateseries() None[source]

An alternative method for load_allseries() specialised for model state sequences.

load_nodeseries() None[source]

An alternative method for load_allseries() specialised for node sequences.

load_simseries() None[source]

An alternative method for load_allseries() specialised for simulation sequences of nodes.

load_obsseries() None[source]

An alternative method for load_allseries() specialised for observation sequences of nodes.

hydpy.core.hydpytools.create_directedgraph(devices: HydPy | Selection) DiGraph[source]

Create a directed graph based on the given devices.
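
The following sketch is purely illustrative and not part of the original documentation. It assumes that the returned DiGraph is a standard networkx graph whose nodes are the passed Node and Element objects:

import networkx

from hydpy.core.hydpytools import create_directedgraph
from hydpy.examples import prepare_full_example_2

hp, pub, TestIO = prepare_full_example_2()

# Build the directed graph of the complete LahnH network and inspect it
# with standard networkx tools.
graph = create_directedgraph(hp)
assert networkx.is_directed_acyclic_graph(graph)
for device in networkx.topological_sort(graph):
    print(device)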