I am looking to solve (in Python) a differential algebraic equation of the form x'(t) = f(x(t), y(t)) subject to g(x) = 0, for a function g: R^n -> R^m defining the constraints on the state variable x. I think it should be possible to do this without resorting to fsolve and hence manually building up a numerical method.
For example, it seems reasonable that this is possible using scipy.integrate.solve_ivp or something similar. Can you suggest a specific, effective option?
I am trying to solve the differential equation 4(y')^3 - y' = 1/x^2 in Python. I am familiar with using odeint to solve coupled ODEs and linear ODEs, but can't find much guidance on nonlinear ODEs such as the one I'm grappling with.
I attempted to use odeint and SciPy but can't seem to implement it properly.
Any thoughts are much appreciated
NB: y is a function of x
The problem is that at each point of the phase space you get up to 3 valid solutions for the direction (counting double roots), and every selection criterion breaks down at double roots.
One way is to use a DAE solver (which does not exist in scipy) on the system y' = v, 4v^3 - v = x^-2.
The second way is to take the derivative of the equation to get an explicit second-order ODE, y'' = -2/(x^3*(12*y'^2 - 1)).
Both methods require the selection of the initial direction from the 3 roots of the cubic at the initial point.
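A minimal sketch of the second approach with scipy.integrate.solve_ivp, picking the initial direction from the three roots of the cubic; the starting point x0 = 1, the value y(1) = 0, and the integration interval are assumptions, since the question gives no initial data:

```python
import numpy as np
from scipy.integrate import solve_ivp

x0, y0 = 1.0, 0.0                                 # assumed initial point and value

# Pick the initial direction v0 = y'(x0) from the roots of 4*v**3 - v = 1/x0**2.
roots = np.roots([4.0, 0.0, -1.0, -1.0 / x0**2])
v0 = float(np.real(roots[np.isreal(roots)][0]))   # choose one of the real roots

def rhs(x, u):
    # u = [y, y']; differentiating the original equation gives y''.
    y, v = u
    return [v, -2.0 / x**3 / (12.0 * v**2 - 1.0)]

sol = solve_ivp(rhs, (x0, 5.0), [y0, v0], rtol=1e-8, dense_output=True)
print(sol.y[0, -1])                               # y at the final x
```

Note that the right-hand side blows up where 12*y'^2 - 1 = 0, which is exactly where the double roots mentioned above occur.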
Link to Complicated Equation
I'm trying to find Ts for a given time. I have all necessary variables in the above equation, except for Ts. I realized that I can integrate with respect to t, which makes things a little easier. From there, how do I go about solving for Ts?
This is not a root-finding problem. This is a differential equation, which you can solve using e.g. solve_ivp from scipy.integrate. You will need the initial value for T(t=0).
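A minimal sketch of that, assuming the equation has the first-order form dTs/dt = f(t, Ts); the right-hand side and the initial value below are placeholders, since the actual equation is only given as a link:

```python
import numpy as np
from scipy.integrate import solve_ivp

def f(t, Ts):
    # Placeholder right-hand side; substitute the actual dTs/dt expression here.
    return -0.1 * (Ts - 300.0)

Ts0 = [350.0]                                     # assumed initial value Ts(t=0)
sol = solve_ivp(f, (0.0, 100.0), Ts0, t_eval=np.linspace(0.0, 100.0, 101))
print(sol.t[-1], sol.y[0, -1])                    # Ts at the final time
```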
I'm trying to set up a fast numerical solver in Python for a differential problem of the form:
where r is some constant.
I want to integrate A over some time period t of interest. However, this is complicated by the fact that the dA/dt equation includes another variable B, which is itself described by an ODE dB/dt. B is actually a vector, but I've simplified the expression to try and highlight my problems more clearly.
I currently have a solution using a manual Euler method: i.e. compute dB/dt (then use B = B_previous + dB/dt * dt) and manually step along with a fixed time step size dt. However, this is slow and unreliable. I imagine it would be far better to use the built-in ODE solvers in SciPy, but I'm not sure whether that is possible given the coupled nature of the problem I'm trying to solve.
Is this possible using SciPy's odeint or solve_ivp? If so, can anyone suggest any pointers? Thanks.
What you have is a pair of coupled differential equations, which are standard to solve using Runge-Kutta, Euler's, and many other methods. You can use this example to guide you in writing your Python code:
https://scipy-cookbook.readthedocs.io/items/CoupledSpringMassSystem.html
Keep in mind that not all equations can be solved with odeint. If your ODE is a "stiff" ODE, then you will have to choose your algorithm carefully. Stiffness is not precisely defined, but it usually arises when you have large or non-integer powers of the dependent variable in your ODE.
The first step in solving a coupled ODE, though, is to try the standard methods. If they don't work, then look into something else.
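As a concrete illustration, here is a minimal sketch of how such a coupled system goes into scipy.integrate.solve_ivp (the solvers live in SciPy, not NumPy); the right-hand sides, the initial values, and the length of the vector B are placeholders, since the actual dA/dt and dB/dt are only shown as images:

```python
import numpy as np
from scipy.integrate import solve_ivp

r = 0.5                                           # the constant from the question

def rhs(t, u):
    # Pack A and the vector B into one state vector u = [A, B_1, ..., B_n].
    A, B = u[0], u[1:]
    dAdt = -r * A + B.sum()                       # placeholder coupling to B
    dBdt = -B + A                                 # placeholder vector ODE for B
    return np.concatenate(([dAdt], dBdt))

u0 = np.concatenate(([1.0], np.zeros(3)))         # assumed initial A and B
# LSODA switches automatically between stiff and non-stiff methods.
sol = solve_ivp(rhs, (0.0, 10.0), u0, method="LSODA", rtol=1e-8)
print(sol.y[0, -1])                               # A at the final time
```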
I have been trying to solve the following equation for more than a week:
I have to use the Newton-Raphson method to get an approximate solution for u. I have the script to do that, but I need to "linearize" this nonlinear ODE. The coefficients k1-k4 are not constants: at each grid point (x = 1-100) they take a different, computed value. The initial condition is u(0) = 0.
Is this a homework assignment?
Also, is it a boundary value problem or an initial value problem? From what you write, it sounds like a BVP. Also, your boundary condition at u(0) alone is not enough.
If it is a BVP, you can just use scikits.bvp_solver or scikits.bvp1lg, which do the difficult parts for you.
If it is an initial value problem (a plain ODE), write the problem as a first-order system and use scipy.integrate.odeint or scipy.integrate.ode.
Regarding linearization (assuming this is a BVP): in practice it is usually enough to compute the partial derivative required for the Newton method via numerical differentiation.
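As a rough illustration of that last point, here is a sketch of a Newton iteration whose Jacobian is built by finite differences; the residual() function is purely a placeholder for the discretized equations, since the actual ODE and the k1-k4 values are not given:

```python
import numpy as np

def residual(u):
    # Placeholder for the discretized nonlinear equations F(u) = 0 on the grid.
    x = np.linspace(1.0, 100.0, u.size)
    k = 1.0 / x                                   # stand-in for the varying k1..k4
    F = np.empty_like(u)
    F[0] = u[0]                                   # condition at the first grid point
    F[1:] = (u[1:] - u[:-1]) + k[1:] * u[1:]**2 - 1.0
    return F

def fd_jacobian(F, u, eps=1e-8):
    # Forward-difference approximation of dF/du, built column by column.
    F0 = F(u)
    J = np.empty((F0.size, u.size))
    for j in range(u.size):
        du = np.zeros_like(u)
        du[j] = eps
        J[:, j] = (F(u + du) - F0) / eps
    return J

u = np.zeros(100)                                 # initial guess on 100 grid points
for _ in range(20):                               # Newton iteration
    F0 = residual(u)
    if np.linalg.norm(F0) < 1e-10:
        break
    u -= np.linalg.solve(fd_jacobian(residual, u), F0)
print(u[:5])
```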
I am trying to solve a system of coupled iterative equations, each of which contains lots of integrals and derivatives.
First I used Maxima (embedded in Sage) to solve it analytically, but the solution was too dependent on the initial guesses I had made for my unknown functions: constant initial guesses returned an answer almost immediately, while symbolic functions used as initial guesses sent the system deep into calculations, sometimes seemingly never-ending ones.
However, what I tried in Sage was actually a simplified version of my original equations, so I thought I might have no choice but to treat the integrals and derivatives numerically. That, however, brought problems of its own that I could not ignore:
integrals were only allowed to have numerical limits, and variables were not allowed as, e.g., their upper limits (I thought a numerical algorithm might be faster than the analytic one even if I left a variable or parameter in the calculation, but it just didn't work that way).
integrands also couldn't contain extra variables or parameters that were not being integrated over.
the derivative function was itself a big obstacle, as I wasn't able to compute partial derivatives or to use a derivative in the integrand of an integral.
To get rid of the problems with the numerical derivative I replaced it with the symbolic diff() function, and the speed improvement was still promising, but the problems with numerical integration persisted.
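For what it's worth, the integration limitations described above may not apply if the numerical side is done with SciPy: scipy.integrate.quad passes extra parameters to the integrand through args, and a variable upper limit can be handled by wrapping the integral in an ordinary Python function that can then be differentiated numerically. The integrand g and the parameter a below are purely illustrative:

```python
import numpy as np
from scipy.integrate import quad

def g(t, a):
    return np.exp(-a * t**2)                      # hypothetical integrand with parameter a

def F(x, a):
    # Integral of g from 0 to a variable upper limit x; quad passes a via args.
    val, _ = quad(g, 0.0, x, args=(a,))
    return val

def dFdx(x, a, h=1e-6):
    # Simple central-difference derivative of F with respect to x.
    return (F(x + h, a) - F(x - h, a)) / (2.0 * h)

print(F(1.0, 2.0), dFdx(1.0, 2.0))                # dF/dx should be close to g(1.0, 2.0)
```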
Now I have three questions:
a- Is it right to conclude that I have no option other than to discretize the equations and do a fully numerical treatment instead of a mixed one?
b- If so, is there any way to do this automatically? My equations are not differential equations, so I can't use odeint or the like; they are iterative equations, and I use the integrals and derivatives only to update my unknowns to their new values at each step.
c- If my calculations turn out to be very large, is there any advice on switching from Python to Fortran or something similar as well?
Best Regards