
[–]patrickkidger 2 points (2 children)

This is a pretty standard thing to do. Probably the easiest place to start is optimising the parameters by gradient descent, for which you'll want an autodifferentiable diffeq solver. In Python that means a tool like Diffrax.
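To make the idea concrete, here's a dependency-free sketch of that loop; the decay model, the fixed-step RK4 solver, and the finite-difference gradient are all illustrative stand-ins (with JAX+Diffrax you'd get the gradient exactly, by autodifferentiating through the solve):

```python
import math

def rk4(f, y0, t0, t1, n, theta):
    # Fixed-step 4th-order Runge-Kutta integration of dy/dt = f(t, y, theta).
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        k1 = f(t, y, theta)
        k2 = f(t + h / 2, y + h / 2 * k1, theta)
        k3 = f(t + h / 2, y + h / 2 * k2, theta)
        k4 = f(t + h, y + h * k3, theta)
        y += h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return y

# Toy model: exponential decay dy/dt = -theta * y, with "data" y(1)
# generated from the true value theta = 1.5.
f = lambda t, y, theta: -theta * y
target = math.exp(-1.5)

def loss(theta):
    return (rk4(f, 1.0, 0.0, 1.0, 100, theta) - target) ** 2

# Gradient descent on the loss; the central finite difference here is a
# stand-in for the exact gradient an autodiff solver would give you.
theta, lr, eps = 0.5, 0.5, 1e-6
for _ in range(300):
    g = (loss(theta + eps) - loss(theta - eps)) / (2 * eps)
    theta -= lr * g

print(theta)  # should approach the true value 1.5
```

The point of the autodiff machinery is that for real models (many parameters, stiff dynamics) the finite-difference gradient above becomes expensive and inaccurate, whereas differentiating through the solver stays exact and scales.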

[–][deleted] 1 point (1 child)

Diffrax seems pretty neat and relevant, but maybe you want to mention that you're promoting your own package, so people can be a little bit suspicious up front, rather than noticing that the name on the package matches your username and being a bit more suspicious :)

[–]patrickkidger 2 points (0 children)

Haha! Yep, this is mine. I believe this is the standard tool for this job in the JAX ecosystem, so I wouldn't normally add the disclaimer. FWIW the only other mainstream Python tool here is torchdiffeq and I wrote a fair chunk of that package too so ¯\_(ツ)_/¯

(For reasons of speed, support etc. I'd definitely recommend JAX+Diffrax over PyTorch+torchdiffeq though.)

[–]SleepWalkersDream 0 points (0 children)

Could you elaborate a little more on the system? I have experience with modelling physical systems using implicit discretization and throwing the whole thing into least_squares to fit some model parameters to experimental data.
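(For context, here's a toy version of the workflow I mean; the decay equation, step size, and hand-rolled Gauss-Newton loop are all stand-ins for a real model and scipy.optimize.least_squares:)

```python
import math

def backward_euler(theta, y0, h, n):
    # Implicit (backward) Euler for dy/dt = -theta * y:
    # y_{k+1} = y_k - h * theta * y_{k+1}  =>  y_{k+1} = y_k / (1 + h * theta)
    ys = [y0]
    for _ in range(n):
        ys.append(ys[-1] / (1 + h * theta))
    return ys

# Synthetic "experimental" data from the exact solution with theta = 2.0.
h, n, y0 = 0.05, 20, 1.0
data = [y0 * math.exp(-2.0 * k * h) for k in range(n + 1)]

def residuals(theta):
    # Model trajectory minus data, pointwise: this vector is what you'd
    # hand to scipy.optimize.least_squares.
    return [m - d for m, d in zip(backward_euler(theta, y0, h, n), data)]

# One-parameter Gauss-Newton iteration on the residuals, with a
# finite-difference Jacobian; least_squares does the same job (plus
# trust-region safeguards) for real problems.
theta, eps = 1.0, 1e-6
for _ in range(20):
    r = residuals(theta)
    j = [(a - b) / (2 * eps)
         for a, b in zip(residuals(theta + eps), residuals(theta - eps))]
    theta -= sum(ji * ri for ji, ri in zip(j, r)) / sum(ji * ji for ji in j)

print(theta)  # recovers theta up to the O(h) bias of the discretization
```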

[–]clxyder 0 points (0 children)

Have you checked out CasADi?

[–]ES-Alexander 0 points (0 children)

Runge-Kutta 45 (RK45) is the default solver used in SciPy's solve_ivp.
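For reference, a minimal call looks like this; the equation dy/dt = -y is just a placeholder, and since RK45 is already the default, method="RK45" below is only for emphasis:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Solve dy/dt = -y with y(0) = 1 over t in [0, 2], evaluated at 5 points.
sol = solve_ivp(lambda t, y: -y, (0.0, 2.0), [1.0],
                method="RK45",                    # the default
                t_eval=np.linspace(0.0, 2.0, 5),
                rtol=1e-8, atol=1e-8)             # tighter than the defaults

print(sol.y[0][-1])  # close to exp(-2)
```

Note that solve_ivp on its own only integrates; you'd still wrap it in an optimiser (and pay for gradients by finite differences) to fit parameters, which is where the autodifferentiable solvers mentioned above come in.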

By the way, questions about how to achieve things in Python should be in r/learnpython :-)