
The profile method is designed to test the sensitivity of the input parameters in a calibration context.
Although it may look similar in principle to traditional sensitivity analysis, the calibration profile
algorithm goes deeper: it captures the full effect of a parameter variation on the model fitness, every other input
being calibrated to optimise the fitness.

Profiles reveal a lot about your model: they may show that an input (i.e. a parameter of the model) has so strong an effect on the model dynamics that it must lie within a certain interval for the model to produce acceptable dynamics. Conversely, calibration profiles may show that an input can be compensated by the other inputs to such an extent that it does not, by itself, constrain the model towards acceptable dynamics.


## Method scores

The calibration profile method is ideal for revealing a model's sensitivity to its parameters, hence the highest possible score in sensitivity. However, it retrieves no information about the structure of the input or output spaces, as it focuses on **one** parameter/input, every other input being left free. As the studied parameter varies, the other parameters are calibrated (see below), so this method scores very well on calibration, which is also why it can handle stochasticity: calibration does too. Finally, since the profile method runs a calibration over the other inputs for each subinterval of the input under study, the more inputs there are, the more sensitive the method is to the dimensionality of the input space.

## Run

Given a distance measure between the model output values and the
data, the profile of a selected parameter **i** is constructed by
dividing the interval within which **i** can vary into subintervals
of equal size, and calibrating the model within each subinterval to
minimise the distance between outputs and data (as in Calibration). The
optimisation is carried out over the other parameters of the model, which are
left free.
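
The procedure above can be sketched outside OpenMOLE. In the following illustration, the two-parameter toy model and the grid-search "calibration" of the free parameter are our own assumptions (OpenMOLE uses an evolutionary algorithm for the inner calibration):

```python
# Sketch of the calibration profile idea on a toy model (not OpenMOLE code).
# Hypothetical model: the distance to data is minimal when p1 + p2 == 10.

def model(p1, p2):
    """Toy distance between the model output and the data."""
    return abs(p1 + p2 - 10.0)

def profile(n_x=10, p1_bounds=(0.0, 20.0), p2_bounds=(0.0, 5.0)):
    """Profile of p1: for each subinterval of p1, minimise the
    distance over the free parameter p2."""
    lo1, hi1 = p1_bounds
    lo2, hi2 = p2_bounds
    points = []
    for j in range(n_x):
        p1 = lo1 + (j + 0.5) * (hi1 - lo1) / n_x  # midpoint of subinterval j
        # Crude grid search standing in for the evolutionary calibration
        best = min(model(p1, lo2 + k * (hi2 - lo2) / 100) for k in range(101))
        points.append((p1, best))
    return points

if __name__ == "__main__":
    for x, y in profile():
        print(f"{x:5.1f} -> {y:.3f}")
```

On this toy model the resulting curve is low only where some value of `p2` can compensate `p1`, which is exactly the kind of validity interval the method is meant to expose.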

## Typed signature

The profile method can be typed as follows:

Profile : (𝕏_{1} × … × 𝕏_{k} → Y) → {1, …, k} → [(𝕏_{i}, Y)]

such that:

Profile(M)(i) = [(𝕩_{i1}, y_{1}), …, (𝕩_{in}, y_{n})], where ∀ j ∈ [1; n], y_{j} = min_{𝕩 ∈ X_{-ij}} M(𝕩) and X_{-ij} = 𝕏_{1} × … × 𝕏_{i-1} × {𝕩_{ij}} × 𝕏_{i+1} × … × 𝕏_{k}

with M the model, X = 𝕏_{1} × … × 𝕏_{k} the input space and Y the output space.

In other words: this function takes a model M (whose signature is X → Y) and the index i of the profiled input; for each of the n subintervals of that input, it calibrates M over the remaining inputs and records the best distance to the data achieved, yielding one (value, best error) pair per subinterval.

By defining a threshold below which the distance between the model output and the data is considered acceptable, the results of the profile method can be interpreted as the subset of values of the parameter for which the model output reproduces the data sufficiently well. The Profile method takes the following parameters:

- `inputs`: a list of the model parameters with their minimum and maximum bounds
- `objective`: a variable defined in the OpenMOLE script that contains the distance between the data and the model output
- `x`: the parameter which is being profiled
- `nX`: the size of the subintervals
- `stochastic`: the seed provider, mandatory if your model contains randomness

```
val param1 = Val[Double]
val param2 = Val[Double]

val exploration =
  GenomeProfileEvolution(
    evaluation = modelTask,   // task that runs the model
    parallelism = 10,
    termination = 100,
    x = param1,               // the parameter being profiled
    nX = 20,                  // size of the subintervals
    genome =
      Seq(
        param1 in (0.0, 99.0),
        param2 in (0.0, 99.0)),
    objective = fitness,      // distance between model output and data
    stochastic = Stochastic(seed = myseed))
```

where `param1` and `param2` are inputs of the task that runs the model (and refer to the model parameters), and `fitness` is an output of that same task. The number of inputs is unlimited. Here, `x = param1` specifies that we are profiling `param1`.

## Interpretation of the profiles

A calibration profile is a 2D curve with the value of the parameter under study on the X-axis and the best possible calibration error on the Y-axis. To ease the interpretation of the profiles, we propose to define an acceptance threshold on the calibration error: under this threshold the calibration error is considered sufficiently satisfying and the dynamics exposed by the model acceptable; above it, the calibration error is considered too high and the dynamics exposed by the model unacceptable.

The computed calibration profiles may take very diverse shapes depending on the effect of the parameter on the model dynamics; however, some of these shapes are recurrent. The most typical shapes are shown in the figure below. They are distinguished according to how the values of the profile compare to the threshold value:

- Shape 1 appears when a parameter is restricting with respect to the calibration criterion and the model is able to produce acceptable dynamics only for a specific range of the parameter. In this case a connected validity interval can be established for the parameter.
- Shape 2 appears when a parameter is restricting with respect to the calibration criterion, but the validity domain of the parameter is not connected. It might mean that several qualitatively different dynamics of the model meet the calibration requirement. In this case the model dynamics should be observed directly to determine whether the different kinds of dynamics are all suitable, or whether some of them are mistakenly accepted by the calibration objective.
- Shape 3 appears when the model cannot be calibrated: the profile exposes no acceptable dynamics according to the calibration criterion. In this case, the model should be improved or the calibration criterion adapted.
- Shape 4 appears when a parameter does not restrict the model dynamics with regard to the calibration criterion: the model can always be calibrated, whatever the value of the parameter. In this case the parameter constitutes a superfluous degree of freedom for the model, since its effect can always be compensated by a variation of the other parameters. In general this means that the parameter should be fixed, that a mechanism of the model should be removed, or that the model should be reduced by expressing the value of this parameter as a function of the other parameters.
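
Given an acceptance threshold, these four shapes can be told apart mechanically from the profile values. The helper below is our own illustrative sketch (not part of OpenMOLE): it groups the acceptable subintervals into maximal runs and classifies the result.

```python
# Classify a calibration profile against an acceptance threshold
# (illustrative helper, not an OpenMOLE API).

def validity_intervals(profile, threshold):
    """Indices where the profile is under the threshold,
    grouped into maximal runs of consecutive indices."""
    intervals = []
    for i, y in enumerate(profile):
        if y <= threshold:
            if intervals and intervals[-1][1] == i - 1:
                intervals[-1] = (intervals[-1][0], i)  # extend the current run
            else:
                intervals.append((i, i))               # start a new run
    return intervals

def shape(profile, threshold):
    """Map the validity intervals to the four typical profile shapes."""
    intervals = validity_intervals(profile, threshold)
    if not intervals:
        return "shape 3: never acceptable"
    if intervals == [(0, len(profile) - 1)]:
        return "shape 4: always acceptable"
    if len(intervals) == 1:
        return "shape 1: one connected validity interval"
    return "shape 2: disconnected validity domain"
```

For instance, `shape([5.0, 2.0, 1.0, 2.0, 5.0], 3.0)` reports a single connected validity interval (shape 1), while a profile that dips under the threshold in two separate regions reports shape 2.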

The calibration profile algorithm has been published in the following paper: Romain Reuillon, Clara Schmitt, Ricardo De Aldama, and Jean-Baptiste Mouret, «A New Method to Evaluate Simulation Models: The Calibration Profile (CP) Algorithm», published in *Journal of Artificial Societies and Social Simulation* (JASSS), Vol. 18, Issue 1, 2015. [online version] [BibTeX]