Alpine3D

trunk/alpine3d/MainPage.h

/***********************************************************************************/
/* Copyright 2009-2015 WSL Institute for Snow and Avalanche Research SLF-DAVOS     */
/***********************************************************************************/
/* This file is part of Alpine3D.
    Alpine3D is free software: you can redistribute it and/or modify
    it under the terms of the GNU Lesser General Public License as published by
    the Free Software Foundation, either version 3 of the License, or
    (at your option) any later version.

    Alpine3D is distributed in the hope that it will be useful,
    but WITHOUT ANY WARRANTY; without even the implied warranty of
    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
    GNU Lesser General Public License for more details.

    You should have received a copy of the GNU Lesser General Public License
    along with Alpine3D. If not, see <http://www.gnu.org/licenses/>.
*/
#ifndef MAINPAGE_H
#define MAINPAGE_H

/**
 * @mainpage Table of content
 * -# External Links
 *    -# <A HREF="https://models.slf.ch/p/alpine3d/">Alpine3D's home page</A>
 *    -# <A HREF="https://models.slf.ch/p/alpine3d/page/Getting-started/">Installation, compilation</A>
 *    -# <A HREF="https://models.slf.ch/p/alpine3d/page/Running-a-simulation/">Running a simulation</A>
 * -# End User documentation
 *    -# General principles
 *       -# \subpage model_principles "Model principles"
 *       -# \subpage inputs "Inputs"
 *       -# \subpage outputs "Outputs"
 *       -# \subpage tools "Simulation tools"
 *    -# Modules configuration
 *       -# \subpage radiation_balance "Radiation balance"
 *       -# \subpage snowpack "Snowpack"
 *       -# \subpage snowdrift "Snowdrift"
 *       -# \subpage runoff "Runoff"
 *       -# \subpage glaciers "Glaciers katabatic flows"
 *       -# \subpage techsnowA3D "Technical snow production"
 *    -# \subpage running_simulations "Running a simulation"
 * -# Expanding Alpine3D
 *    -# \subpage coding_style "Coding style"
 *
 * <center><hr></center>
 * <center><i><small>
 * <p>
 * Alpine3D is a spatially distributed (surface), three dimensional (atmospheric) model for
 * analyzing and predicting dynamics of snow-dominated surface processes in mountainous topography.
 * It includes models for snow cover (<A HREF="https://models.slf.ch/p/snowpack/">SNOWPACK</A>),
 * vegetation and soil, snow transport, radiation transfer and runoff which can be enabled or disabled on demand.
 *
 * The model supports a variety of input options including interpolation of meteorological weather stations,
 * input from a meteorological model or from remote sensing data
 * (<A HREF="https://models.slf.ch/p/meteoio">MeteoIO</A>) and has been parallelized
 * in order to run on computing grids or clusters
 * (using <A HREF="https://en.wikipedia.org/wiki/Message_Passing_Interface">MPI</A> and <A HREF="http://openmp.org/wp/">OpenMP</A>).
 *
 * Alpine3D has a broad variety of potential applications. Most dominant is the assessment of
 * snow water resource dynamics in mountain catchments (Michlmayr et al., 2008). This includes predictions of
 * future snow on the basis of climate change scenarios (Bavay et al., 2009; Bavay et al., 2013). One exotic application of the model
 * system Alpine3D is the forecasting of surface temperatures on ski-pistes, e.g. for the Vancouver winter Olympics.
 * For this forecast, local shading might change the surface temperature by up to 5 °C.
 * </p>
 * <p>
 * This model is available under GPL version 3 or above, see <a href="http://www.gnu.org/licenses/gpl.txt">www.gnu.org</a>.
 * </p></small></i></center>
 */

/**
 * @page running_simulations Running a simulation
 *
 * @section running_intro Introduction
 *
 * @subsection model_workflow Simulation workflow
 * When running a simulation, it is important to keep in mind that the model is organized as several modules that interact together. It is possible to configure
 * some parameters for the various modules and to enable/disable modules. Some modules can be used outside of Alpine3D (like
 * <A HREF="https://models.slf.ch">MeteoIO</A> that is used in various applications or libSnowpack that is used by the standalone
 * <A HREF="https://models.slf.ch">Snowpack</A> model).
 *
 * \image html simulation_workflow.png "Simulation workflow"
 * \image latex simulation_workflow.eps "Simulation workflow" width=0.9\textwidth
 *
 * @subsection installing Installing Alpine3D
 * Please follow the instructions given on <a href="https://models.slf.ch/p/alpine3d/page/Getting-started/">the forge</a> in order to download Alpine3D (from svn, from source or from a
 * binary package) and its dependencies (Snowpack and MeteoIO, knowing that binary packages might already contain all the required dependencies). If you've
 * downloaded a binary package, there is nothing special to do, just install it on your system.
 *
 * When installing Alpine3D from sources, please keep in mind that Alpine3D's compilation process is exactly the same as Snowpack's or MeteoIO's (all are based on cmake). After
 * a successful compilation, it is necessary to install Alpine3D (the same **must** have been done for Snowpack and MeteoIO). There are two options:
 * - system-wide install. In this case, you need to have administrator permissions. Simply set CMAKE_INSTALL_PREFIX (in cmake) to a system path (by default, it is /usr/local)
 * and install by typing '*make install*' in a terminal opened at Alpine3D's source root folder.
 * - user install: if you do not have administrator permissions, you can still install Alpine3D into a directory where you have write permissions. Simply set CMAKE_INSTALL_PREFIX
 * (in cmake) to a directory where you can write (it is *highly* recommended to set it to '${HOME}/usr'), make sure this directory exists and is writable and then install by
 * typing '*make install*' in a terminal opened at Alpine3D's source root folder. Make sure that CMAKE_INSTALL_PREFIX/bin is in your PATH and that CMAKE_INSTALL_PREFIX/lib
 * is recognized as a library path (variables PATH and LD_LIBRARY_PATH for Linux, PATH and DYLD_FALLBACK_LIBRARY_PATH for macOS,
 * see <a href="https://models.slf.ch/p/snowpack/page/Getting-started/#wikititle_4">Getting Started</a> on the forge).
 *
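 * As an illustration, a user install could look like the following commands (a minimal sketch: the source directory name is just an example and the cmake call assumes an in-source build as implied above):
 * @code
 * cd alpine3d                                   # Alpine3D's source root folder
 * cmake -DCMAKE_INSTALL_PREFIX=${HOME}/usr .    # configure for a user install
 * make                                          # compile
 * make install                                  # install below ${HOME}/usr
 * export PATH=${HOME}/usr/bin:${PATH}                          # so the alpine3d binary is found
 * export LD_LIBRARY_PATH=${HOME}/usr/lib:${LD_LIBRARY_PATH}    # so the libraries are found (Linux)
 * @endcode
 *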
 * After Alpine3D has been installed, you can check that it works by opening a terminal and typing "alpine3d". Alpine3D should be found and display its help message.
 *
 * @subsection intro_directories Simulation directories setup
 * After you installed a binary package or compiled and installed Alpine3D (on most compute clusters, you will need to install in your own home directory, see
 * \ref installing "installing Alpine3D" above),
 * you can run your first simulation. We highly recommend that you use the following structure: first, create a directory for your simulation, for example
 * "Stillberg". Then, create the following sub-directories:
 * - input, to put all the input data required by the simulation
 * - input/meteo, for the meteorological input data
 * - input/surface-grids, for the DEM, land cover and optional catchments grids
 * - input/snowfiles, for the input .sno files
 * - output, where Alpine3D will write the results
 * - output/grids, where Alpine3D will write the gridded results
 * - output/snowfiles, where Alpine3D will write its status files (sno files)
 * - setup, to put the configuration of the simulation and its start scripts
 *
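 * Assuming the simulation is called "Stillberg" as in the example above, this directory structure could be created in one go (a minimal sketch, only listing the sub-directories named above):
 * @code
 * mkdir -p Stillberg/input/meteo Stillberg/input/surface-grids Stillberg/input/snowfiles
 * mkdir -p Stillberg/output/grids Stillberg/output/snowfiles Stillberg/setup
 * @endcode
 *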
 * Edit the "run.sh" script to set it up with the proper start and end dates, the modules that you want to enable and the proper configuration for a sequential or parallel run.
 *
 * @section intro_run_simple Simple sequential simulation
 * This is the simplest way of running an Alpine3D simulation: it runs on one core on one computer. The drawback is that if the simulation is quite large,
 * it might require a very long time to run or even run out of memory (RAM) once there is snow in the simulated domain. In order to run a
 * sequential simulation, set \c "PARALLEL=N" in the \em run.sh script. Then run the following command in a terminal (this can be a remote terminal
 * such as \em ssh) on the computer where the simulation should run:
 * @code
 * nohup ./run.sh &
 * @endcode
 * \em nohup means that the simulation will keep going even if you close the terminal; \em & means that the terminal prompt can accept other commands after you've submitted this one. In order to monitor what is going on with your simulation, simply run something such as the command below (\em -f means that it keeps updating with the new content of this file; replace it with something such as \em -500 to show the last 500 lines of this file):
 * @code
 * tail -f stdouterr.log
 * @endcode
 *
 * If you need to terminate the simulation, first find out its Process ID (PID) by doing
 * @code
 * ps ux
 * @endcode
 * Then kill the process
 * @code
 * kill {PID}
 * @endcode
 *
 * @section intro_parallel Parallel simulations
 * When a simulated domain gets bigger, the computational requirements (memory and runtime) also grow. In order to reduce these requirements, it is possible to
 * run the simulation in parallel across multiple cores or nodes.
 *
 * @subsection intro_run_openmp Multi-core simulation
 * This is the easiest way to run a parallel simulation because it does not require any specific software, only a compiler that supports <a href="http://openmp.org">OpenMP</a> (see also its <a href="https://en.wikipedia.org/wiki/OpenMP">wikipedia</a> page), such as gcc or clang. The limitations on the memory still remain (i.e. a simulation requiring lots of memory will still only have access
 * to the local computer's memory) but the run time will be roughly divided by the number of cores that are given to the simulation. In order
 * to run such a simulation, please compile Alpine3D with the \b OpenMP option set to ON in cmake. Then in the simulation's \em run.sh file, set \c "PARALLEL=OPENMP" as well as the number of cores you want to use as \c "NCORES=". Then run the simulation as laid out in the previous section.
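 *
 * As an illustration, such a setup could look like the following (a sketch only: it assumes that the cmake option is simply named OPENMP, the number of cores is just an example and the run.sh lines are shown in isolation):
 * @code
 * # when configuring and building Alpine3D
 * cmake -DOPENMP=ON . && make && make install
 *
 * # in the simulation's run.sh
 * PARALLEL=OPENMP
 * NCORES=8
 * @endcode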
 *
 * @subsection intro_run_mpi Multi-node simulation
 * This is the most powerful way of running a simulation: the load is distributed among distinct computing nodes, therefore reducing the amount
 * of memory that must be available on each node. For very large simulations, this might be the only way to proceed. This is achieved by relying on
 * <a href="https://en.wikipedia.org/wiki/Message_Passing_Interface">MPI</a> to exchange data between the nodes and distribute the workload. In
 * order to run such a simulation, please compile Alpine3D with the \b MPI option set to ON in cmake. Then in the simulation's \em run.sh file, set \c "PARALLEL=MPI" as well as the number of processors/cores you want to use as \c "NPROC=" and a machine file. This machine file contains the list of machines
 * to use for the simulation as well as how many processors/cores to use on each of them. For example, such a file could be:
 * @code
 * 192.168.0.11:2
 * 192.168.0.5
 * 192.168.0.4:3
 * 192.168.2.25
 * @endcode
 * Then run the simulation as laid out in the previous section.
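 *
 * The corresponding lines of \em run.sh could then look like the following (a sketch only: the number of processes is an example and the name of the variable pointing to the machine file is purely hypothetical, check your own run.sh):
 * @code
 * PARALLEL=MPI
 * NPROC=6
 * MACHINEFILE=machines.txt    # hypothetical variable name, pointing to a machine file as shown above
 * @endcode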
 *
 * \image html mpi_scaling.png "Scaling of Alpine3D with the number of processors when using MPI (Gemstock_1m, 1 year simulation)"
 * \image latex mpi_scaling.eps "Scaling of Alpine3D with the number of processors when using MPI (Gemstock_1m, 1 year simulation)" width=0.9\textwidth
 *
 * \note Please make sure that the environment variables $TEMPDIR, $TEMP and $TMP (if defined) don't point to a shared drive (such as an NFS mounted home directory),
 * otherwise MPI might run very slowly.
 *
 * @subsection intro_run_sge Sun/Oracle Grid Engine
 * If your computing infrastructure relies on <a href="https://en.wikipedia.org/wiki/Oracle_Grid_Engine">Sun/Oracle Grid Engine (SGE)</A> (for example on a computing cluster),
 * you need to do things differently. First, the job management directives/options must be provided, either on the command line or in the \em run.sh script.
 * These lines rely on special comments, starting with \em "#" followed by \em "$":
 * @code
 * #$ -m aes
 * #$ -N {simulation name}
 * #$ -S /bin/bash
 * #$ -cwd
 * #$ -j y
 * #$ -pe smp {number of cores to use}
 * @endcode
 * The last line specifies the computing profile that should be used. Since the job manager will allocate the resources, there is no need to provide
 * either NCORES or NPROC. The machine file (for MPI) is also not used. Then, submit the computing job to the system: \c "qsub ./run.sh". This
 * should return almost immediately with a message providing the allocated job number. This job number is useful to delete
 * the job \c "qdel {job_number}" or query its status \c "qstat {job_number}" (or \c "qstat" to see all jobs).
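 *
 * In short, a typical SGE session could look like this (simply restating the commands quoted above, with an arbitrary job number as example):
 * @code
 * qsub ./run.sh         # submit the job, returns a job number (for example 12345)
 * qstat                 # check the status of all your jobs
 * qstat 12345           # check the status of this job only
 * qdel 12345            # delete this job if needed
 * @endcode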
 *
 * If the job submission fails with an error message such as \em unknown \em command, please check that there is no extra "#$" in the script.
 * This happens frequently when commenting out some part of the script, which is then mis-interpreted by SGE. In such a case, simply add an
 * extra "#" in front of the comment.
 *
 * \note At WSL, the computing profiles are either \em smp for shared memory runs such as OpenMP or \em orte for load distributed among
 * nodes such as MPI.
 *
 * @section data_strategies Data strategies
 * Depending on the meteorological data availability, the meteorological data quality as well as convenience (if the raw data is difficult to access, for example), there
 * are different data strategies that can be used. This is illustrated in the figure below.
 *
 * \image html data_strategies.png "Meteorological data strategies"
 * \image latex data_strategies.eps "Meteorological data strategies" width=0.9\textwidth
 *
 * The simplest case is <b>1</b>: the meteorological forcings are directly read and processed by Alpine3D. Since it embeds MeteoIO, it can perform the whole
 * preprocessing on the fly. In order to have a look at the pre-processed data (or to spare a lengthy data extraction), it is possible to preprocess the data first,
 * then dump them to files and run Alpine3D from these intermediate files (case <b>2</b>). Since Alpine3D embeds MeteoIO anyway, it would be possible to add
 * some pre-processing to be done on the fly within Alpine3D (such as resampling to the sampling rates that Alpine3D needs). This is actually highly recommended
 * in order to guarantee that Alpine3D gets the data it needs (i.e. proper sampling rates as well as basic checks on the ranges, etc).
 *
 * In some cases, it might be interesting to first run the data through Snowpack in order to produce some meteorological parameters from other parameters, such as
 * the ISWR from RSWR (since Snowpack computes the snow albedo, the generated data would be much better than assuming a fixed albedo) or to generate
 * precipitation from the measured snow height. This is shown in case <b>3</b>.
 *
 * Case <b>4</b> shows a strategy that could be used to prepare an Alpine3D simulation: it consists of simple Snowpack simulations performed at spatially
 * interpolated locations, using MeteoIO's virtual stations concept. This is useful to (relatively) quickly validate the spatially interpolated fields with some
 * measured data (especially if there are some meteorological data that could be compared to the spatially interpolated meteorological fields).
 *
 *
 * @subsection snowpack_coupling Coupling with Snowpack
 * Alpine3D needs spatially interpolated forcings for each grid point. Unfortunately, this limits the choice of forcing parameters: for some parameters
 * (such as HS or RSWR), there are no reliable interpolation methods. One way to make use of the existing measurements that could not be easily
 * interpolated is to run a <A HREF="https://models.slf.ch/p/snowpack">Snowpack</A> simulation at the stations that provided these measurements,
 * then use alternate, computed parameters (such as PSUM or ISWR) as inputs to Alpine3D.
 *
 * This process is made easier by writing Snowpack's outputs in the smet format and making sure all the necessary parameters are written out.
 * This means that Snowpack should be configured along the following lines (only using one slope):
 * @code
 * [Output]
 * TS_WRITE = TRUE
 * TS_FORMAT = SMET
 * TS_DAYS_BETWEEN = 0.04166667;so we get hourly values
 *
 * OUT_CANOPY = FALSE
 * OUT_HAZ = FALSE
 * OUT_SOILEB = FALSE
 * OUT_HEAT = FALSE
 * OUT_T = FALSE
 * OUT_STAB = FALSE
 * OUT_LW = TRUE
 * OUT_SW = TRUE
 * OUT_MASS = TRUE
 * OUT_METEO = TRUE
 *
 * AVGSUM_TIME_SERIES = TRUE
 * CUMSUM_MASS = FALSE
 * PRECIP_RATES = FALSE
 * @endcode
 *
 * Then the output smet files produced by Snowpack can be directly used as Alpine3D inputs, performing some on-the-fly renaming and conversions
 * (here, from split precipitation to precipitation/phase):
 * @code
 * [Input]
 * METEO = SMET
 * METEOPATH = ../input/meteo
 * STATION1 = WFJ2
 *
 * PSUM_S::MOVE = MS_Snow
 * PSUM_L::MOVE = MS_Rain
 * HS::MOVE = HS_meas;so we can still compare measured vs modelled snow height
 * TSG::MOVE = T_bottom;so we can compare the ground temperatures
 * TSS::MOVE = TSS_meas;so we can compare the surface temperatures
 *
 * WFJ2::KEEP = TA TSS TSG RH ISWR ILWR HS VW DW PSUM_S PSUM_L PSUM PSUM_PH;so we do not keep all kind of unnecessary parameters
 *
 * PSUM_PH::create = PRECSPLITTING
 * PSUM_PH::PRECSPLITTING::type = THRESH
 * PSUM_PH::PRECSPLITTING::snow = 274.35
 * PSUM::create = PRECSPLITTING
 * PSUM::PRECSPLITTING::type = THRESH
 * PSUM::PRECSPLITTING::snow = 274.35
 *
 * [SNOWPACK]
 * ENFORCE_MEASURED_SNOW_HEIGHTS = FALSE
 * @endcode
 *
 * Of course, other stations can also be part of the meteo input and their inputs should remain unaffected (assuming they don't use parameter names
 * such as MS_Snow, MS_Rain or HS_meas and assuming that their parameters are not rejected by the KEEP command). As a side note, the last three parameter
 * renamings (with MOVE) must be set when using Snowpack's outputs as inputs for a new Snowpack simulation.
 */

/**
 * @page model_principles Model principles
 * Here, we expose the core principles underlying the Alpine3D model. This should just give a quick overview of the model and help you understand the
 * global architecture of Alpine3D. If you want to go deeper into the details, please have a look at the publications covering the whole model
 * (M. Lehning et al. <i>"ALPINE3D: a detailed model of mountain surface processes and its application to snow hydrology"</i>, Hydrological Processes, \b 20.10, 2006, pp 2111-2128; and
 * P. Kuonen, M. Bavay, M. Lehning, <i>"POP-C++ and ALPINE3D: petition for a new HPC approach"</i>, Advanced ICTs for disaster management and threat detection: collaborative and distributed frameworks, IGI Global, 2010, pp 237-61.)
 * The individual modules are described below and contain references to the relevant papers.
 *
 * @section at_the_core At the core...
 * Here we expose the very foundations of Alpine3D. These remain valid independently of which modules are enabled when running the model.
 *
 * @subsection principles_snowpack Distributed 1D soil/snow/canopy column
 * \image html distributed_sn.png "Distributed SNOWPACK over the domain taking into account the land cover"
 * \image latex distributed_sn.eps "Distributed SNOWPACK over the domain taking into account the land cover" width=0.9\textwidth
 * At the core of the model is the <A HREF="http://models.slf.ch/p/snowpack/">SNOWPACK</A> model, a physically based,
 * energy balance model for a 1D soil/snow/canopy column.
 * This gives us a very detailed description of the snow stratigraphy and a very good evaluation of the mass and energy balance (therefore also of
 * quantities such as Snow Water Equivalent (SWE) or temperature profile). This 1D energy balance is performed for each pixel of the domain
 * (therefore it is a distributed SNOWPACK simulation) and for one time step (usually one hour). Any quantity that the user would like to get
 * out of the simulation can be written out from this module.
 *
 * @subsection distributed_meteo Distributed meteo fields
 * \image html 2d_interpolations.png "Spatially interpolating the meteorological fields"
 * \image latex 2d_interpolations.eps "Spatially interpolating the meteorological fields" width=0.9\textwidth
 * In order to perform a SNOWPACK simulation at every pixel of the domain, it is necessary to get the meteorological forcing for each pixel.
 * But the meteorological parameters are usually measured by a set of stations, which means that the data is only available at a set of points.
 * Interpolating these point measurements to every pixel of the domain is performed by means of statistical interpolations with
 * <A HREF="http://models.slf.ch/p/meteoio">MeteoIO</A>. If the forcing data comes from another model (such as a meteorological model),
 * the input grids most probably have a resolution that is much too coarse for Alpine3D and therefore need downscaling. If the downscaling factor
 * is very large, we often end up with only a few points from the meteorological model that are part of the Alpine3D domain, therefore such points
 * can be considered as "virtual stations" and spatially interpolated similarly to weather stations.
 *
 * @section lateral_fluxes Lateral fluxes
 * The core principles laid out in the previous section rely on the assumption that there are no lateral fluxes,
 * which is too strong of an assumption. Therefore the lateral fluxes deemed relevant are introduced by other modules:
 * - the EBalance module computes the radiation fields, taking into account atmospheric cloudiness, topographic shading effects and reflections by the
 * surrounding terrain.
 * - the SnowDrift module simulates the transport of snow by the wind. It performs a 3D simulation of the saltation, suspension and diffusion processes.
 * - the runoff module collects the precipitation and/or melt water at each pixel and transfers it to a hydrological routing module.
 *
 * @subsection principles_ebalance Radiation balance
 * \image html ebalance.png "Radiation balance with shading and terrain reflections"
 * \image latex ebalance.eps "Radiation balance with shading and terrain reflections" width=0.9\textwidth
 * Once the albedo of each pixel of the domain has been initialized or taken from the last time step, the radiation balance is computed. First, the
 * incoming short wave radiation measured at one reference station is used to compute the splitting (between direct and diffuse, see
 * D. G. Erbs, S.A. Klein, J.A. Duffie, <i>"Estimation of the diffuse radiation fraction for hourly, daily and monthly-average global radiation"</i>, Solar Energy, <b>28</b>, 4, 1982, Pages 293-302 and summarized in M. Iqbal, <i>"An introduction to solar radiation"</i>, 1983, Academic Press, ISBN: 0-12-373750-8).
 * This splitting will be assumed to be constant over the whole domain. Then, according to the meteorological parameters and elevation of each pixel,
 * the direct and diffuse radiation fields are computed. Since the position of the sun has been computed (
 * J. Meeus, <i>"Astronomical Algorithms"</i>, 1998, 2nd ed, Willmann-Bell, Inc., Richmond, VA, USA, ISBN 0-943396-61-1),
 * it is used to compute the topographic shading for each pixel.
 * If the terrain reflections have been enabled, a radiosity approach is used to compute the reflections by the surrounding terrain (
 * N. Helbig, H. Löwe, M. Lehning, <i>"Radiosity Approach for the Shortwave Surface Radiation Balance in Complex Terrain"</i>, Journal of the Atmospheric Sciences, \b 66.9, 2009).
 * Finally, the direct and diffuse radiation fields are returned.
 *
 * @subsection principles_snowdrift Snowdrift
 * \image html snowdrift.png "Snowdrift: saltation, suspension, sublimation"
 * \image latex snowdrift.eps "Snowdrift: saltation, suspension, sublimation" width=0.9\textwidth
 * Externally computed wind fields (for example with <A HREF="http://arps.ou.edu/">ARPS</A>) are assigned to each time step.
 * If the surface shear stress exceeds a given threshold at a given pixel, the saltation will be computed. This in turn can feed the suspension
 * if the saltation layer is saturated. While in suspension, some of the mass will sublimate and contribute to the relative humidity field (
 * C. D. Groot Zwaaftink et al. <i>"Drifting snow sublimation: A high‐resolution 3‐D model with temperature and moisture feedbacks"</i>, Journal of Geophysical Research: Atmospheres (1984–2012), \b 116.D16, 2011).
 *
 * @subsection principles_runoff Runoff
 * \image html runoff.png "Runoff: simple bucket approach with PREVAH or sub-catchments sums"
 * \image latex runoff.eps "Runoff: simple bucket approach with PREVAH or sub-catchments sums" width=0.9\textwidth
 * Several options are available for collecting the melt water or precipitation running out of each pixel. The historical approach relies on the
 * PREVAH hydrological modeling system (
 * D. Viviroli et al. <i>"An introduction to the hydrological modelling system PREVAH and its pre-and post-processing-tools"</i>, Environmental Modelling \& Software, \b 24.10, 2009, pp 1209-1222)
 * to perform on-the-fly hydrological simulation. Another approach consists of collecting the runoff sums for each sub-catchment defined by the user
 * (M. Bavay, T. Grünewald, M. Lehning, <i>"Response of snow cover and runoff to climate change in high Alpine catchments of Eastern Switzerland"</i>, Advances in Water Resources, \b 55, 2013, pp 4-16).
 * Finally, it is also possible to output the hourly distributed runoff (i.e. the runoff for each pixel, once per hour) and process this runoff in
 * an external model
 * (F. Comola et al. <i>"Comparison of hydrologic response between a conceptual and a travel time distribution model for a snow-covered alpine catchment using Alpine3D"</i>, EGU General Assembly Conference Abstracts, Vol. \b 15, 2013;
 * J. Magnusson et al. <i>"Quantitative evaluation of different hydrological modelling approaches in a partly glacierized Swiss watershed"</i>, Hydrological Processes, \b 25.13, 2011, pp 2071-2084).
 * This approach enables the external model to be calibrated without having to re-run Alpine3D.
 *
 */

/**
 * @page inputs Inputs
 * Several categories of data are necessary for the input:
 * - Landscape data: Digital Elevation Model and Land Cover model
 * - Snowcover and soil data
 * - Meteorological data
 *
 * The landscape data can be prepared with any GIS software (for example, the Open Source <A HREF="http://qgis.org/">QGIS</A>), the meteorological data with
 * statistical computing tools (for example <A HREF="https://www.r-project.org/">R</A>) or scripts (for example
 * <A HREF="https://www.python.org/">Python</A> or <A HREF="https://en.wikipedia.org/wiki/AWK">AWK</A>) and the snowcover data has to be prepared
 * with scripts.
 *
 * @section landscape_input Landscape data
 * @subsection dem_input Digital Elevation Model
 * \image html DEM.png "Example of a Digital Elevation Model grid"
 * \image latex DEM.eps "Example of a Digital Elevation Model grid" width=0.9\textwidth
 * Once you have defined the domain that you want to simulate, you have to get a
 * <A HREF="http://www.fgi.fi/fgi/themes/digital-elevation-model">Digital Elevation Model</A> that contains this area.
 * The DEM must be rectangular, but you will be able to restrict the actual domain within this
 * rectangular area by setting the cells you want to ignore to nodata. However, you need to keep in mind the following:
 * - The DEM defines the spatial grid that will be used in Alpine3D
 * - Therefore, the resolution of the DEM is the spatial resolution of the simulation
 * - Each cell whose DEM value is nodata will be skipped by all modules of Alpine3D
 * - In order to properly compute the shading effects, do not exclude cells that could cast shadows on an interesting part of the domain!
 *
 * \anchor dem_geolocalization
 * These last two points are important for both the definition of the rectangular DEM as well as filling the necessary cells with nodata.
 * All other grids (either land cover or potential meteorological grids) will have to use the exact same
 * <A HREF="https://en.wikipedia.org/wiki/Geolocation">geolocalization</A> as the DEM
 * (same position, same size, same resolution) unless otherwise specified.
 *
 * DEM data can either come from your own measurements (for example from laser scans, see P. Axelsson, <i>"DEM generation from laser scanner data using adaptive TIN models"</i>, International Archives of Photogrammetry and Remote Sensing, \b 33.B4/1, PART 4, 2000, pp 111-118),
 * from your national topographic service (<A HREF="http://www.swisstopo.admin.ch/internet/swisstopo/en/home/products/height.html">Swiss Topo</A> for Switzerland,
 * <A HREF="http://professionnels.ign.fr/catalogue">IGN</A> for France,
 * <A HREF="http://ned.usgs.gov/">National Elevation Dataset</A> for the USA, <A HREF="https://geoservice.ist.supsi.ch/helidem/">HeliDEM</A> for the southern Alps)
 * or from global initiatives (
 * <A HREF="https://lta.cr.usgs.gov/GTOPO30">GTOPO30</A>, a 30-arc-second global DEM,
 * <A HREF="http://www.ngdc.noaa.gov/mgg/topo/globe.html">GLOBE</A>, a 30-arc-second global DEM,
 * <A HREF="http://srtm.usgs.gov/">SRTM</A>, a 1-arc-second global DEM between 60N and 56S averaged at 90m resolution,
 * <A HREF="http://gdem.ersdac.jspacesystems.or.jp/">ASTER DEM</A>, a 30m resolution global DEM between 83N and 83S).
 *
 * @subsection lus_input Land Cover model
 * \image html LUS.png "Example of a Land Cover grid"
 * \image latex LUS.eps "Example of a Land Cover grid" width=0.9\textwidth
 * For each cell of the domain, a land cover code must be provided. It must use the exact same geolocalization as the DEM.
 * Such data are usually available in various classifications depending on the country (such as
 * <A HREF="http://www.eea.europa.eu/publications/COR0-landcover">CORINE</A> for Europe,
 * the <A HREF="http://landcover.usgs.gov/">US National Land Cover Dataset</A> for the USA,
 * the <A HREF="http://www.countrysidesurvey.org.uk/">countryside survey</A> for the UK,
 * <A HREF="https://www.bfs.admin.ch/bfs/de/home/statistiken/raum-umwelt/nomenklaturen/arealstatistik/noas2004.html">Arealstatistik NOAS04</A> for Switzerland,
 * <A HREF="http://data.ess.tsinghua.edu.cn/">GLC</A> for a 30m resolution global land cover or
 * <A HREF="http://data.fao.org/map?entryId=6c34ec8b-f31e-4976-9344-fd11b738a850">FAO GeoNetwork</A> for multiple land cover data sets including a 30" resolution global land cover).
 *
 * Currently, Alpine3D improperly calls the Land Cover Model <i>"Land Use"</i>, abbreviated as LUS, and uses an ARC ASCII file
 * (see <A HREF="https://models.slf.ch/docserver/meteoio/html/arc.html">MeteoIO's documentation</A>) with PREVAH landuse codes
 * that have the format 1LLDC where:
 * - LL is the land use code as given in the table below
 * - D is the soil depth (unused)
 * - C is the field capacity (unused)
 *
 * <center><table border="0">
 * <caption>PREVAH land cover codes</caption>
 * <tr><td>
 * <table border="1">
 * <tr><th>LL code</th><th>PREVAH land use class</th></tr>
 * <tr><td>01</td><td>water</td></tr>
 * <tr><td>02</td><td>settlement</td></tr>
 * <tr><td>03</td><td>coniferous forest</td></tr>
 * <tr><td>04</td><td>deciduous forest</td></tr>
 * <tr><td>05</td><td>mixed forest</td></tr>
 * <tr><td>06</td><td>cereals</td></tr>
 * <tr><td>07</td><td>pasture</td></tr>
 * <tr><td>08</td><td>bush</td></tr>
 * <tr><td>09</td><td>undefined</td></tr>
 * <tr><td>10</td><td>undefined</td></tr>
 * <tr><td>11</td><td>road</td></tr>
 * <tr><td>12</td><td>undefined</td></tr>
 * <tr><td>13</td><td>firn</td></tr>
 * <tr><td>14</td><td>bare ice</td></tr>
 * <tr><td>15</td><td>rock</td></tr>
 * </table></td><td><table border="1">
 * <tr><th>LL code</th><th>PREVAH land use class</th></tr>
 * <tr><td>16</td><td>undefined</td></tr>
 * <tr><td>17</td><td>undefined</td></tr>
 * <tr><td>18</td><td>fruit</td></tr>
 * <tr><td>19</td><td>vegetables</td></tr>
 * <tr><td>20</td><td>wheat</td></tr>
 * <tr><td>21</td><td>alpine vegetation</td></tr>
 * <tr><td>22</td><td>wetlands</td></tr>
 * <tr><td>23</td><td>rough pasture</td></tr>
 * <tr><td>24</td><td>subalpine meadow</td></tr>
 * <tr><td>25</td><td>alpine meadow</td></tr>
 * <tr><td>26</td><td>bare soil vegetation</td></tr>
 * <tr><td>27</td><td>free</td></tr>
 * <tr><td>28</td><td>corn</td></tr>
 * <tr><td>29</td><td>grapes</td></tr>
 * <tr><td>30-99</td><td>undefined</td></tr>
 * </table></td></tr>
 * </table></center>
 *
 * \remarks There is a common confusion between land use and land cover, when actually a land cover is determined by direct
 * observations (<A HREF="http://stats.oecd.org/glossary/detail.asp?ID=6489">OECD definition</A>) while a
 * land use requires socio-economic interpretation of the activities that take place on that surface
 * (<A HREF="http://stats.oecd.org/glossary/detail.asp?ID=6493">OECD definition</A>), see
 * P. Fisher, A. Comber, and R. Wadsworth, <i>"Land use and Land cover: Contradiction or Complement"</i>, Re-presenting GIS, 2005, pp 85-98.
 *
 * @subsection sub_catch_input Catchments definition
 * \image html catchments.png "Example of catchments definition"
 * \image latex catchments.eps "Example of catchments definition" width=0.9\textwidth
 * For hydrological modeling, hydrological subcatchments must be defined. The precipitation, snow melt and glacier melt
 * contributions are provided as well as the total catchment runoff, for each subcatchment. The subcatchments are defined by
 * creating a grid that contains a code describing which catchments each cell belongs to, based on sums of powers of two.
 * For example, a pixel belonging to catchments 0 and 3 would receive the code: 2^0+2^3=9. A pixel belonging to catchments
 * 2, 5 and 6 would receive the code 2^2+2^5+2^6=100. Finally, this grid must have the same geolocalization as the DEM.
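 *
 * This power-of-two encoding can be checked with a one-line shell computation (purely an illustration of the arithmetic above):
 * @code
 * echo $(( 2**0 + 2**3 ))          # catchments 0 and 3   -> 9
 * echo $(( 2**2 + 2**5 + 2**6 ))   # catchments 2, 5, 6   -> 100
 * @endcode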
 *
 * @section sno_input Snow cover and soil data
 * \image html soil.png "Initial soil and snow definition"
 * \image latex soil.eps "Initial soil and snow definition" width=0.9\textwidth
 * Each cell must be assigned an initial soil and snow profile (as well as a few canopy parameters). Usually, to make things easier, the simulation starts
 * at a time when no snow is present in the domain, making the snow profile empty. The same file also contains potential
 * soil layers data. This file is written in a format supported by Snowpack for snow files (see the Snowpack documentation, "Data File Formats" > "Single Snow Profiles") and usually kept
 * with all other similar files in a separate directory (in order to keep the simulation tidy). Finally, the profile and layers
 * must be dated from before the simulation starts.
 *
 * There are two possibilities for assigning these files to each cell of the simulation domain (see \subpage reading_snow_files "reading snow files"):
 * - by land cover classes. In this case, every cell receives an initial soil/snow profile based on its land cover class. The files must be named as {LAND_USE_CLASS}_{EXPERIMENT_NAME}.{ext};
 * - independently for each (i,j) pixel. In this case, the files must be named as {i_index}_{j_index}_{EXPERIMENT_NAME}.{ext};
 *
 * @note In any case, you MUST set the key "EXPERIMENT_NAME" in the [Output] section.
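 *
 * To illustrate the naming scheme, the content of \em input/snowfiles could look like the listing below for an experiment named "Stillberg" (a purely hypothetical example: the .sno extension and the actual class or index values depend on your setup, and only one of the two naming options would be used at a time):
 * @code
 * 10300_Stillberg.sno    # one file per land cover class (first option)
 * 23_42_Stillberg.sno    # one file per (i,j) pixel, here i=23 and j=42 (second option)
 * @endcode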
 *
 * @section meteo_input Meteorological data
 * Usually, the meteorological inputs are provided as point-measurement time series and spatially interpolated within Alpine3D (using MeteoIO).
 * But it is also possible to provide some (or all) data as gridded data.
 *
 * Meteorological inputs often come from automatic weather stations that are part of a measurement network (for example
 * M. Lehning et al. <i>"A network of automatic weather and snow stations and supplementary model calculations providing snowpack information for avalanche warning"</i>, Proceedings of the International Snow Science Workshop “a merging of theory and practice”, <b>27</b>, 1998 or the US <A HREF="http://www.wcc.nrcs.usda.gov/snow/">SNOTEL</A> network)
 * or are deployed for a specific experiment and area. The networks are often managed by national weather services.
 *
 * It is also possible to find some other data producers such as
 * <A HREF="http://mesonet.agron.iastate.edu/request/download.phtml?network=CN_ASOS">airports</A>, <A HREF="http://www.ogimet.com/index.phtml.en">Synop stations</A> (a script to download and convert the data is available in <i>MeteoIO's tools directory</i>),
 * railway operators,
 * <A HREF="https://pub-apps.th.gov.bc.ca/saw-paws/weatherstation">highways</A> (or <A HREF="http://www.chart.state.md.us/travinfo/weatherstationdata.asp">this</A>),
 * <A HREF="http://www.parkcitymountain.com/site/mountain-info/conditions/weather/weather-station-reports">ski lift operators</A> or
 * <A HREF="http://www.wxqa.com/index.html">networks of citizen weather stations</A>.
 *
 * Another source of data can be reanalysis runs performed with weather forecasting models on measured data
 * (<A HREF="http://www.metoffice.gov.uk/datapoint/product/uk-daily-site-specific-forecast">UK metoffice</A>,
 * <A HREF="http://www.yr.no/place/Norway/">Norwegian Meteorological Institute</A>,
 * <A HREF="http://www.euro4m.eu/datasets.html">Euro4M</A> for all of Europe as well as worldwide
 * <A HREF="http://www.esrl.noaa.gov/psd/data/gridded/reanalysis/">NOAA ESRL</A> and
 * <A HREF="http://www.ecmwf.int/en/research/climate-reanalysis">ECMWF</A> (with associated <A HREF="http://data-portal.ecmwf.int/data/d/interim_full_invariant/">DEM</A>) reanalysis).
 *
 * @subsection stations_input Point measurements
 * This is mostly centered around the concept of a station: a specific point in space where multiple parameters are measured for a given period.
 * The file formats vary greatly depending on which meteo plugin is used. It is nevertheless always possible to use stations that don't cover
 * the whole period or contain data gaps. It is also possible to use stations that are outside the domain (but they should not be too far away,
 * otherwise this does not make sense!). The model will run with an hourly resolution, so the data should be either hourly or could be
 * meaningfully resampled to an hourly resolution (i.e., daily measurements would usually not qualify). Please note that in any case, the
 * following must always be provided:
 * - at least \em one station providing each of:
 *    - air temperature (TA)
 *    - relative humidity (RH)
 *    - wind speed (VW)
 *    - precipitation (PSUM)
 * - at least \em one point offering simultaneously the following:
 *    - air temperature (TA)
 *    - relative humidity (RH)
 *    - incoming short wave radiation (ISWR)
 *    - incoming long wave radiation (ILWR)
 *
 * If the air pressure (P, usually in Pa) is provided, it will be used for improving some of the parametrizations, but this is absolutely not mandatory.
 * However, it is recommended to provide stations well distributed over the domain and over the elevation range, so the elevation gradients
 * can be properly computed.
 *
 * @subsection grids_input Gridded meteorological data
 * This relies on MeteoIO's USER spatial interpolation algorithm. The grids must follow a specific naming scheme and be placed in a specific
 * directory so they can be found, and they must have the same \ref dem_geolocalization "geolocalization" as the DEM. This is detailed in
 * MeteoIO's documentation.
 *
 * Please note that if not all necessary grids are provided, MeteoIO will revert to point measurements inputs (according to the spatial interpolation
 * algorithms declared in the configuration file). It is therefore possible to only provide a few specific grids according to the needs
 * (for example, the precipitation grids only when precipitation occurs).
 *
 */

/**
 * @page outputs Outputs
 * Alpine3D writes four types of outputs:
 * - grids, that is the distributed values of a given parameter (see \ref gridded_outputs "gridded outputs");
 * - .met and .pro files for points of interest. These are the standard SNOWPACK outputs and can be written out for any number of points (see \ref poi_outputs "POI outputs");
 * - sno files, that is the status of each pixel with respect to its snow cover. These files are necessary in order to restart Alpine3D from a
 * previous point (see \subpage restarts "restarts" and \subpage reading_snow_files "reading snow files");
 * - subcatchment runoff sums. One file per subcatchment is generated and contains the sums of all runoff components at an hourly resolution as well as some other
 * relevant catchment parameters (such as mean air temperature, etc). The runoff is split between precipitation, glacier melt and snow melt, and the
 * global sum is also provided (see \ref runoff_sums "Runoff sums").
 *
 * The grid files are written in any format supported by MeteoIO, as configured by the user. This means that it is for example possible to directly write PNG files
 * from Alpine3D. The sno files as well as the .met and .pro files are written according to the SNOWPACK standalone model documentation.
 *
 */

/**
 * @page tools Simulation tools
 * Several tools are available to help using Alpine3D. As for SNOWPACK, it is possible to use <A HREF="https://models.slf.ch/p/inishell">inishell</A> to
 * configure the simulations. There is also another Java tool, "view" in the "Interface" subdirectory, that can be used to visualize ARC ASCII grids as
 * well as to visualize DEM and LUS files in this format. This can also be used to generate a LUS file by opening an aerial picture and manually tagging
 * the pixels (one by one, along lines or within polygons). Finally, this tool can also generate a POI (points of interest) file for more detailed
 * outputs at some specific points.
 * \image html view_tool.png "\"View\" application for visualizing grids"
 * \image latex view_tool.eps "\"View\" application for visualizing grids" width=0.9\textwidth
 */

/**
 * @page coding_style Coding style
 * @section coding_sty Recommended coding style
 * The recommended coding style for Alpine3D is the <A HREF="http://www.kernel.org/doc/Documentation/CodingStyle">Kernel coding style</A> with a few exceptions:
 * - we don't enforce a strict 80 character line width: try to remain reasonable, but don't necessarily cut everything off at 80 characters
 * - try to intelligently use spaces to visually group elements of a complex formula. If the formula can be split into meaningful elements,
 * please do it (using some "const double element = " constructs).
 * - try to properly qualify variables: for example, if a variable will not be changed, will never be negative and always integer,
 * then use "const unsigned int". When some specific types are used for some standard library calls, try to properly use these types (for example, "size_t")
 * - use the C++ method naming convention: a method name starts with a lowercase letter and each subsequent word in the name starts capitalized.
 * Usually, no underscores are used in a method name. For example, a method that would return the lapse rate contained in an object would be named "getLapseRate()"
 * - qualify variables and parameters with "const" when appropriate (see <A HREF="http://jriddell.org/const-in-cpp.html">const-in-cpp</A>).
 *
 * A few important points to emphasize (from the <A HREF="http://www.kernel.org/doc/Documentation/CodingStyle">Kernel coding style</A>):
 * - Functions should be short and sweet, and do just one thing. They should fit on one or two screenfuls of text, and do one thing and do that well.
 * - If you have a complex function, and you suspect that a less-than-gifted first-year high-school student might not even understand
 * what the function is all about, you should adhere to the maximum limits all the more closely. Use helper functions with descriptive names.
 * - Comments are good, but there is also a danger of over-commenting. NEVER try to explain HOW your code works in a comment:
 * it's much better to write the code so that the _working_ is obvious, and it's a waste of time to explain badly written code.
 *
 * @section code_indentation Indentation
 * Since every user has his/her own preference for the ideal indentation width, please use <A HREF="http://www.emacswiki.org/emacs/SmartTabs">"smart tabs"</A>.
 * That practically means:
 * - indent with tabs
 * - align with spaces
 *
 * This way, each developer can set his/her indentation size as he/she wishes without forcing this choice on others...
 *
 * @section containers Memory management and Containers
 * Please do NOT manage memory manually but use <A HREF="https://secure.wikimedia.org/wikipedia/en/wiki/Standard_Template_Library">Standard Template Library (STL)
 * </A> <A HREF="http://www.cplusplus.com/reference/stl/">containers</A> instead.
 * This dramatically reduces memory errors (i.e. <A HREF="https://secure.wikimedia.org/wikipedia/en/wiki/Segmentation_fault">segfaults</A>), often
 * offers more performance and provides you with lots of <A HREF="http://www.cplusplus.com/reference/algorithm/">associated algorithms</A>
 * (like sorting, searching, filling, etc).
 *
 * When you need your own data class, please design it based on these STL containers (like Grid2DObject is based on std::vector). Basically, this means
 * that you will replace mallocs and arrays with vectors (for 1d, 2d, 3d grids), maps (for multiple key/value pairs), lists (for unordered collections), etc.
 *
 * @section exceptions_handling Exceptions handling
 * The recommended C++ usage should be followed: <b>"throw by value, catch by reference"</b> (as specified in <i>C++ Coding Standards: 101 Rules, Guidelines,
 * and Best Practices</i>, Herb Sutter, Andrei Alexandrescu, 2004, Addison-Wesley Professional). Moreover, we should consider catching by
 * <b>const reference</b> and not even declaring a variable if not doing anything with it: something like `catch(const IOException&)` would often be enough.
 *
 */

#endif
