Recursive training on infinite stream of data #833
schlichtanders changed the title from "Recursive training for infinite stream of data" to "Recursive training on infinite stream of data" on Jun 16, 2023.
Hello,

Via the DiffEqFlux documentation I found this paper, which demonstrates that neural stochastic differential equations (neural SDEs) can be used to model financial data.

I am now looking for a way to train such neural SDE models iteratively/recursively, so that I can adapt to new incoming data efficiently.

The paper itself used a mean-squared-error-style loss, for which you would need to store all historic data. Instead of relearning from the entire dataset every time, I would rather update the previous fit in an efficient manner.

While I also learned the mathematics of SDEs in academia, I am more familiar with Bayesian probabilistic modelling, where the update logic is handled by using the previously found posterior as the new prior. Can something similar be achieved for SDEs?

I don't see any potential issues with online learning of SDEs, especially with a Bayesian approach that uses the previous posterior as the prior for the next update. This would just be a form of data assimilation on neural SDEs. It should just work. Please open an issue with a concrete feature request or bug report if it doesn't work as intended.
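The posterior-as-prior update described in the reply can be illustrated in a toy setting. Below is a minimal sketch (plain NumPy, not DiffEqFlux or neural SDE code) of recursive Bayesian estimation of a Gaussian mean with known noise variance, a conjugate case where processing the stream batch by batch gives exactly the same posterior as a single fit over all the data; the function and variable names are hypothetical.

```python
import numpy as np

def bayes_update(prior_mean, prior_var, data, noise_var):
    # Conjugate Gaussian update: posterior over the unknown mean
    # after observing one batch of data with known noise variance.
    n = len(data)
    post_var = 1.0 / (1.0 / prior_var + n / noise_var)
    post_mean = post_var * (prior_mean / prior_var + np.sum(data) / noise_var)
    return post_mean, post_var

rng = np.random.default_rng(0)
stream = rng.normal(2.0, 1.0, size=(10, 50))  # 10 incoming batches of 50 points

# Recursive pass: yesterday's posterior becomes today's prior.
mean, var = 0.0, 10.0  # broad initial prior
for batch in stream:
    mean, var = bayes_update(mean, var, batch, noise_var=1.0)

# Batch pass over all stored data, for comparison only.
m_all, v_all = bayes_update(0.0, 10.0, stream.ravel(), noise_var=1.0)

assert np.isclose(mean, m_all) and np.isclose(var, v_all)
```

For a neural SDE the posterior is no longer conjugate, so the same idea would need an approximate carrier of the posterior (e.g. a variational distribution or an ensemble, as in data-assimilation methods), but the recursion structure is identical: the fitted state after batch k is the starting point for batch k+1, and no historic data needs to be stored.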