Commit
Make sure all initial parameters are floats
fivegrant committed Jun 30, 2023
1 parent 98d8757 commit 775bb43
Showing 1 changed file with 2 additions and 1 deletion: src/operations/Operations.jl
```diff
@@ -34,7 +34,8 @@ for custom loss functions, we probably just allow an enum of functions defined i
 function calibrate(; model, dataset::DataFrame, context)
     timesteps, data = select_data(dataset)
     prob = to_prob(model, extrema(timesteps))
-    p = [Num(param) => model.defaults[param] for param in parameters(model)]
+    p = Vector{Pair{Num, Float64}}([Num(param) => model.defaults[param] for param in parameters(model)])
+    @show p
     data = symbolize_args(data, states(model))
     fitp = EasyModelAnalysis.datafit(prob, p, timesteps, data)
     @info fitp
```
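The motivation for the change: when every value in `model.defaults` happens to be an integer, the plain comprehension infers an element type like `Pair{Num, Int64}`, and the downstream fitting routine expects floating-point initial parameters. Wrapping the comprehension in a `Vector{Pair{Num, Float64}}` constructor makes Julia convert each pair's value to `Float64`. A minimal sketch of the same mechanism, using `Symbol` keys as a stand-in for `Symbolics.Num` (the names and values here are illustrative, not from the commit):

```julia
# Integer defaults produce an integer-typed pair vector.
p_raw = [:S => 990, :I => 10]
# eltype(p_raw) == Pair{Symbol, Int64}

# The typed constructor converts each value to Float64,
# mirroring the fix in calibrate().
p = Vector{Pair{Symbol, Float64}}(p_raw)
# eltype(p) == Pair{Symbol, Float64}; values are 990.0 and 10.0
```

The same effect could be had with an explicit `float(...)` inside the comprehension; annotating the container type is simply the more direct way to guarantee the element type the solver sees.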
