Calibrating SDR model

What is the issue or question you have?

I am trying to optimise k_parameter, sdr_max, and ic0

I have batch run the SDR model for a given area, iterating over every combination of:
sdr_max for 0 < sdr_max < 4
k_parameter for 1 < k_parameter < 4
ic0 for 0 < ic0 < 2

That is 8,000 iterations in total, and for each run I plotted the sum of all non-nodata values of sed_deposition.tif
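In case it helps to see the setup concretely, here is a minimal sketch of how a grid like this can be built and how the raster sum can be computed. The step counts, the nodata value, and the exact range endpoints are assumptions (20 steps per axis gives the 8,000 combinations mentioned above); the demo array just stands in for sed_deposition.tif, which you would normally read with rasterio or pygeoprocessing.

```python
import itertools
import numpy as np

# Hypothetical parameter grid: 20 steps per axis -> 20**3 = 8000 runs.
# The exact endpoints within the open intervals above are an assumption.
sdr_max_vals = np.linspace(0.2, 4.0, 20)
k_vals = np.linspace(1.15, 4.0, 20)
ic0_vals = np.linspace(0.1, 2.0, 20)

grid = list(itertools.product(sdr_max_vals, k_vals, ic0_vals))
print(len(grid))  # 8000

def sum_deposition(array, nodata=-1.0):
    """Sum all pixels of a sed_deposition array, skipping nodata."""
    return float(np.sum(array[array != nodata]))

# Tiny synthetic array standing in for sed_deposition.tif:
demo = np.array([[1.0, -1.0],
                 [2.5, 3.0]])
print(sum_deposition(demo))  # 6.5
```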

Another modeler has estimated the same land area as having ~6400 tons of sediment

The 3D scatterplot shows all of the model runs whose sum(sed_deposition) < 9,000 tons. You can see there are several runs that fall in a similar area, some with sdr_max around 3, 1.5-2, or 0.5, with k_param near 4 and 0.5 < ic0 < 2.

I have 2 questions:
Are these all valid parameter-space values, or are the parameters (k_param, ic0, sdr_max) constrained to some range (e.g., 0-1)?

And: using the outputs from the various iterations, can I attach an error estimate or confidence interval to the model?
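One simple way to get a spread (an ensemble range rather than a formal confidence interval) is to collect the sum(sed_deposition) totals from all the runs, keep those within some tolerance of the independent ~6,400-ton estimate, and take percentiles of that subset. The tolerance, the percentile choice, and the synthetic `totals` array below are all assumptions standing in for your actual batch outputs.

```python
import numpy as np

# Synthetic stand-in for the per-run sum(sed_deposition) totals;
# in practice this would be collected from the 8,000 batch runs.
rng = np.random.default_rng(0)
totals = rng.normal(7000.0, 1500.0, size=8000)

target = 6400.0            # the other modeler's estimate, in tons
tolerance = 0.1 * target   # keep runs within 10% of the target (an assumption)

near = totals[np.abs(totals - target) < tolerance]
lo, hi = np.percentile(near, [2.5, 97.5])
print(f"{len(near)} runs within tolerance; 2.5-97.5 percentile spread: "
      f"{lo:.0f}-{hi:.0f} tons")
```

Note that this quantifies parameter-choice spread only, not observational or structural model error, so it is best described as a sensitivity range rather than a confidence interval.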

What do you expect to happen?

I hope to find optimal parameters

What have you tried so far?

I have an excellent graph

Attach the logfile here:


It looks like the question is being answered over on Model sensitivity and selecting parameter space, so I’ll close this thread.