Uncertainty in bottom-up modelling of crop- and grasslands
[Diagram: sources of uncertainty: measured/statistical uncertainty (type A), conceptual uncertainty, scenario uncertainty; ecosystem model (type E)]
Example study
• One model at one site:
– DNDC at the Oensingen cropland site (Hastings et al., submitted, and Wattenbach et al., poster session)
• Cross-model comparison at one site:
– Five models applied at the same site to compare model uncertainty: DNDC, FASSET, PASIM, EPIC, (CENTURY) (M. Wattenbach et al., in preparation)
[Figure: DNDC at the Oensingen site: histogram (count vs. NEE, x-axis 6000-7200 kg C ha-1) of simulated NEE under input-factor variation ((nitrogen) application, temperature +/- 1 °C, precipitation +/- 5%, global radiation +/- 5%, clay content +/- 10%); annual NEE values for Oensingen: 5851, 6735, 6675 kg C ha-1; plus a 2004 daily time series of EC NEE vs. DNDC NEE over day of year]
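The Monte Carlo procedure behind a histogram like this can be sketched in a few lines (a toy stand-in model of my own, not DNDC; the factor names and perturbation ranges follow the figure, the response function is invented):

```python
import random
import statistics

def toy_nee(temperature, precipitation, clay):
    # stand-in for the ecosystem model: annual NEE (kg C/ha) from three input factors
    return 6600.0 + 150.0 * temperature + 3.0 * precipitation - 10.0 * clay

random.seed(42)
samples = []
for _ in range(1000):
    # perturb each factor around its site value, as in the figure
    t = random.uniform(-1.0, 1.0)            # temperature +/- 1 degC
    p = 100.0 * random.uniform(-0.05, 0.05)  # precipitation +/- 5%
    c = 20.0 * random.uniform(-0.10, 0.10)   # clay content +/- 10%
    samples.append(toy_nee(t, p, c))

# the histogram of `samples` is the uncertainty distribution; summarise it:
mean = statistics.mean(samples)
sd = statistics.pstdev(samples)
print(f"NEE = {mean:.0f} +/- {sd:.0f} kg C/ha")
```

The spread of the resulting sample is the model's baseline uncertainty range for this site.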
Cross site: PaSim model at European grassland sites
Cross site: PaSim factor importance
[Figure: radar plots of PaSim factor importance (scale -20 to 100) at Oensingen and Carlow for 2002 and 2003; axes: N input, temperature, global radiation, precipitation, atmospheric CO2 concentration, bulk density, clay content, pH, and the carbon pools c struct, c metabolic, c active, c slow, c passive]
[Figure: radar plot of the normalized contribution index (change of standard deviation in %, scale -10 to 90) for PASIM, DNDC84G, EPIC and FASSET in 2002; factors: Temperature, Precipitation, Clay, iniSOC, Nfert]
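One simple way to obtain a factor-importance measure of this kind is one-at-a-time variation (my sketch on a toy model; the actual analysis used Monte Carlo sampling, and the function and numbers below are invented for illustration):

```python
def toy_nee(temperature, precipitation, n_input):
    # stand-in "model": NEE as an arbitrary function of three input factors
    return 100.0 * temperature + 10.0 * precipitation - 2.0 * n_input

baseline = {"temperature": 1.0, "precipitation": 5.0, "n_input": 20.0}
perturbation = 0.10  # vary each factor by +10%, one at a time

ref = toy_nee(**baseline)
importance = {}
for name in baseline:
    varied = dict(baseline)
    varied[name] *= 1.0 + perturbation
    # percent change of the output relative to the reference run
    importance[name] = 100.0 * (toy_nee(**varied) - ref) / ref

print(importance)
```

Plotting one such value per input factor on a radar axis gives a chart of the shape shown above; a negative value means the perturbation pushed the output downwards.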
What we need (ISO ?)
• Standardized methods for uncertainty and
sensitivity analysis for ecosystem models
• Standardized datasets to allow inter-model
comparison of uncertainty and sensitivity
measures.
• Standardized software interfaces for ecosystem
models to allow access to databases for model
experiments and results.
• Databases for model evaluation results to give
scientists, stakeholders and policy makers
easy access to information on model quality and
uncertainty.
• To implement the approach we propose a
web-based client-server architecture
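A standardized software interface of the kind proposed could look roughly like this (a hypothetical Python sketch; the class and method names are my own invention, not an existing standard, and the toy model is a stand-in):

```python
from abc import ABC, abstractmethod

class EcosystemModel(ABC):
    """Hypothetical standard interface an ecosystem model would expose
    so a central framework server can drive multi-run experiments."""

    @abstractmethod
    def set_input_factors(self, factors: dict) -> None:
        """Receive one sampled set of input factors (parameters/variables)."""

    @abstractmethod
    def run(self) -> dict:
        """Execute one model run and return named outputs, e.g. {'NEE': ...}."""

class ToyModel(EcosystemModel):
    # a stand-in "model": NEE responds linearly to temperature and N input
    def set_input_factors(self, factors):
        self.factors = factors

    def run(self):
        f = self.factors
        return {"NEE": 100.0 * f["temperature"] - 2.0 * f["n_input"]}

model = ToyModel()
model.set_input_factors({"temperature": 1.0, "n_input": 10.0})
print(model.run())  # -> {'NEE': 80.0}
```

Any model wrapped behind such an interface could be sampled, run and evaluated by the same server-side machinery, which is what makes the results comparable across models.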
[Diagram: framework architecture: a central modelling framework server provides multi-run control, input factors & sampling schemes, model experiment design, datasets for comparison, a result database and a model evaluation result database; ecosystem model clients and evaluation clients (post-processing & visualization) connect to the server]
Conclusions
• Ecosystem models produce very
heterogeneous uncertainties
• Results are only meaningful if they are
accompanied by uncertainty ranges
• Standardisation is necessary to reach
inter-comparability
• The presented framework approach might
be a way to achieve this target
Things I learned recently, don't
really understand, but think might
be important
• Presentation by John Norton and Ken Reckhow on
“Modelling and Monitoring Environmental Outcomes in
Adaptive Management (AM)”, IEMSS 2006
Principles:
• Design management as continuing trial-and-error learning, in
which some variation in system state is valuable because it
yields information about the system’s behaviour: “learning by
doing”
• Compare results of alternative policies, through selected
indicators, rather than attempting to optimise some cost
function
• Include resilience to disturbance as an objective
Things I learned recently, don't
really understand, but think might
be important
• Lyapunov stability ( from Wikipedia)
• In mathematics, the notion of Lyapunov stability
occurs in the study of dynamical systems.
– In simple terms, if all points that start out near a point
x stay near x forever, then x is Lyapunov stable. More
strongly, if all points that start out near x converge to
x, then x is asymptotically stable.
– The idea of Lyapunov stability can be extended to
infinite-dimensional manifolds, where it is known as
structural stability, which concerns the behaviour of
different but "nearby" solutions to differential
equations.
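A minimal numeric illustration of the definition (my sketch): for the one-dimensional system dx/dt = -x, the point x = 0 is asymptotically stable, since every trajectory starting near it not only stays near it but converges to it:

```python
def trajectory(x0, dt=0.01, steps=1000):
    # explicit Euler integration of dx/dt = -x from starting point x0
    x = x0
    for _ in range(steps):
        x += dt * (-x)
    return x

# points starting near 0 stay near 0 (Lyapunov stability) and
# their magnitudes shrink towards 0 (asymptotic stability)
for x0 in (0.5, -0.5, 1.0):
    print(x0, "->", trajectory(x0))
```

For ecosystem models, the analogous question would be whether simulated states return to a baseline after a disturbance, which connects to the resilience objective mentioned above.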
Thank you !
Necessity of consistency
• Changing expectations of models
– In the beginning, models were made to explore the
behaviour of systems for scientific reasons only
– Today, models are increasingly used as predictive
tools for risk analysis and as policy support systems
• Consequences
– We need to understand the predictive capacities and
restrictions of our models
– We need standardized quality checks to give
meaningful uncertainty ranges
Tools
• Monte Carlo method
– Monte Carlo methods: algorithms for solving various kinds of computational problems by
using random numbers
• Advantage: easy to use
• Disadvantage: requires many model runs
– Latin Hypercube sampling: a stratified sampling method that characterises the
population as well as simple random sampling, but with a smaller sample size
• The tool we are using is Simlab (http://sensitivity-analysis.jrc.cec.eu.int/),
software designed for Monte Carlo based uncertainty and sensitivity analysis
– Advantage: easy to use because of the graphical user interface providing a great number of
different distributions, sampling methods and parameter interactions
– Disadvantage: difficult to integrate external models
• Alternative tools and methods to Monte Carlo
– University of Sheffield: Gaussian Emulation Machine for Sensitivity Analysis (GEM-SA)
see http://www.shef.ac.uk/st1mck/code.html
– Advantage: easy to use, fast and efficient and high precision
– Disadvantage: problems with thresholds, and no time dependence integrated yet (will be done)
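The difference between simple random sampling and Latin Hypercube sampling can be sketched in a few lines of Python (a one-dimensional toy illustration of the stratification idea, not Simlab itself):

```python
import random

def simple_random_sample(n):
    # n independent uniform draws: some strata may be missed, others hit twice
    return [random.random() for _ in range(n)]

def latin_hypercube_sample(n):
    # split [0, 1) into n equal strata, draw exactly one point per stratum,
    # then shuffle so the stratum order is random
    points = [(i + random.random()) / n for i in range(n)]
    random.shuffle(points)
    return points

n = 10
lhs = latin_hypercube_sample(n)
# every stratum [i/n, (i+1)/n) contains exactly one LHS point
covered = sorted(int(p * n) for p in lhs)
print(covered)  # -> [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```

Because every stratum is guaranteed to be covered, the input distribution is characterised with fewer model runs, which matters when each run of an ecosystem model is expensive.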
Sources of model uncertainty
• Type A and B contributing to Type C - baseline
uncertainty:
– uncertainty resulting from accuracy and precision of measurements
used to determine input factors
– Input factors: parameters and variables
– Accuracy and precision of the model to represent the processes it is
supposed to simulate
– Internal parameters: their accuracy and precision are harder to
evaluate, as they are often derived parameters (e.g. statistical)
based on different measurement methods. In addition, they are
mostly hard-coded.
Sources of model uncertainty
• Subset of type B - scenario uncertainty
– resulting from the vagueness in scenarios of the
future
– The input factors are uncertain as they depend
on unpredictable conditions
– The base assumptions of our models may be
uncertain because they are based on current
system conditions, which may change in the future