Symbolic integrals with SymbolicRegression.jl #401
MilesCranmer started this conversation in Show and tell
@tomaklutfu I have added your request to this PR: MilesCranmer/SymbolicRegression.jl#249. Could you try it out? For example:

```julia
function derivative_loss(tree, dataset::Dataset{T,L}, options, idx) where {T,L}
    # Select from the batch indices, if given
    X = idx === nothing ? dataset.X : view(dataset.X, :, idx)

    # Evaluate both f(x) and f'(x), where f is defined by `tree`
    ŷ, ∂ŷ, completed = eval_grad_tree_array(tree, X, options; variable=true)
    !completed && return L(Inf)

    y = idx === nothing ? dataset.y : view(dataset.y, idx)
    ∂y = idx === nothing ? dataset.extra.∂y : view(dataset.extra.∂y, idx)

    mse_deriv = sum(i -> (∂ŷ[i] - ∂y[i])^2, eachindex(∂y)) / length(∂y)
    mse_value = sum(i -> (ŷ[i] - y[i])^2, eachindex(y)) / length(y)

    return mse_value + mse_deriv
end
```

where you pass the extra data as a NamedTuple:

```julia
model = SRRegressor(;
    binary_operators=[+, -, *],
    unary_operators=[cos],
    loss_function=derivative_loss,
    enable_autodiff=true,
    batching=true,
    batch_size=25,
    niterations=100,
    early_stop_condition=1e-6,
)
mach = machine(model, X, y, (; ∂y=∂y))
```
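For concreteness, here is a minimal sketch of how the `X`, `y`, and `∂y` arrays in that `machine(...)` call might be prepared. The data below is purely illustrative (a sine curve and its known derivative); it is not from the original discussion, and it assumes the NamedTuple extra-data API from the PR above:

```julia
# Illustrative data for the derivative_loss example above:
# f(x) = sin(x), so the known derivative is f'(x) = cos(x).
x  = collect(range(0, 2π; length=100))
X  = reshape(x, :, 1)   # samples × features matrix
y  = sin.(x)            # function values
∂y = cos.(x)            # matching derivative values

# The derivative values then ride along as extra data:
#   mach = machine(model, X, y, (; ∂y=∂y))
```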
Example of solving an integral with SymbolicRegression.jl. Since automatic differentiation is enabled (via Zygote.jl), you can simply evaluate the forward derivative of each candidate expression and compute the loss against your target function. The resulting expression is then the integral (or some approximation of it)!
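To illustrate the idea behind this (this is not the author's code, and no symbolic-regression run is involved): a candidate antiderivative `F` is scored by how well `F'` matches the integrand `f` on a grid. Here central finite differences stand in for the Zygote forward derivative:

```julia
# Score candidate expressions F by how well F' matches the integrand f.
f(x) = cos(x)                                   # integrand we want to "integrate"
xs = collect(range(0.0, 2π; length=1000))       # evaluation grid

# Central finite differences stand in for forward-mode AD in this sketch.
deriv(F; h=1e-5) = (F.(xs .+ h) .- F.(xs .- h)) ./ (2h)
loss(F) = sum(abs2, deriv(F) .- f.(xs)) / length(xs)

loss(sin)          # ≈ 0: sin is an antiderivative of cos
loss(x -> x^2)     # large: x^2 is not
```

A search that minimizes this loss over expressions therefore converges on an antiderivative, up to the constant of integration (which the derivative cannot see).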
Screen.Recording.2023-08-07.at.15.26.36.mp4
Code: