Custom loss function with Hessians #625
-
Hello @MilesCranmer, and thank you for making PySR! I would like to write a custom loss function that looks like this:

```julia
function loss_hess(tree, dataset::Dataset{T,L}, options) where {T,L}
    X = copy(dataset.X)           # shape (x, n): features are rows, samples are columns
    y = copy(dataset.y)           # shape (n,)
    features = custom_fn1(X)      # shape (f, n)
    # eval_tree_array returns (output, completed); bail out if evaluation failed
    y_pred, completed = eval_tree_array(tree, features, options)
    !completed && return L(Inf)
    hess = batch_hessian(y_pred, X)  # shape (x, x, n)
    return L(custom_fn2(X, y, hess))
end
```

That is, I need to do symbolic regression with a loss that depends on the Hessian of the predictions with respect to the inputs. Thank you very much in advance.
Replies: 1 comment
-
Hi @mirjanic,

Maybe you could use `eval_grad_tree_array` to get the first-order derivatives, and then use finite differences to get the second-order derivatives? It might even be faster than computing the exact Hessian.

Also check through other discussions in the forums; there are a few about including derivatives in the custom loss that might be useful.
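A minimal sketch of that finite-difference idea, assuming the tree acts directly on an input matrix in SymbolicRegression's `(nfeatures, n)` layout; the helper name `fd_hessian` and the omission of the chain rule through a transform like `custom_fn1` are illustrative assumptions, not part of the library:

```julia
using SymbolicRegression: eval_grad_tree_array

# Hypothetical helper: approximate the Hessian of the tree's output with
# respect to each input variable by forward-differencing the exact
# first-order gradients that eval_grad_tree_array provides.
function fd_hessian(tree, X::AbstractMatrix{T}, options; h=sqrt(eps(T))) where {T}
    nfeat, n = size(X)
    hess = zeros(T, nfeat, nfeat, n)
    _, g0, completed = eval_grad_tree_array(tree, X, options; variable=true)
    completed || return nothing  # evaluation failed (e.g. division by zero)
    for j in 1:nfeat
        Xp = copy(X)
        Xp[j, :] .+= h  # perturb feature j for every sample
        _, gj, ok = eval_grad_tree_array(tree, Xp, options; variable=true)
        ok || return nothing
        # Forward difference of the gradient along x_j approximates H[j, :, :]
        hess[j, :, :] .= (gj .- g0) ./ h
    end
    return hess
end
```

A custom loss could then call `fd_hessian(tree, X, options)` and return `L(Inf)` whenever it yields `nothing`.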
Cheers,
Miles
P.S. Another option is Enzyme.jl, as explained here: https://symbolicml.org/DynamicExpressions.jl/dev/eval/#Enzyme. However, this is experimental and won't be easy; it will require you to get pretty deep into the Julia codebase. But if this is really important for you, then it's worth considering. I do think that Enzyme will eventually be the way to do this type of thing, but it hasn't reached a level of stability where I'm comfortable recommending it as the first thing to reach for.
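For a flavor of the Enzyme route, here is a rough first-order sketch, not a definitive recipe: the toy expression, the scalar `toy_loss`, and the activity annotations are assumptions for illustration, and a full Hessian would additionally require forward-over-reverse. The linked docs page is the authoritative reference.

```julia
using DynamicExpressions, Enzyme

operators = OperatorEnum(; binary_operators=(+, -, *), unary_operators=(cos,))
x1 = Node{Float64}(; feature=1)
tree = cos(x1 * 0.9) - 0.3

# Scalar toy loss so reverse mode yields one gradient entry per input entry
function toy_loss(tree, X, operators)
    out, completed = eval_tree_array(tree, X, operators)
    return sum(out)
end

X = randn(3, 100)
dX = zero(X)  # Enzyme accumulates d(toy_loss)/dX here
# Experimental per the linked docs; may need extra Enzyme configuration
autodiff(Reverse, toy_loss, Active, Const(tree), Duplicated(X, dX), Const(operators))
```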