Non-convex quadratic optimization solvers in Python

Let me start by saying that I am by no means an expert in optimization, so any suggestion would be greatly appreciated. I have a non-convex quadratic optimization problem for which I need a solver. I am used to working with the modelling language CVXPY and some of its default solvers (ECOS, CVXOPT, ...), so something similar would be great. I cannot use CVXPY on this problem because it is only suitable for convex optimization problems.
After googling, I saw some people recommending Pyomo, saying it could deal with non-convex problems, but my problem requires matrix algebra, and as far as I know Pyomo does not include matrix-algebra capabilities. So I am looking for a modelling language or solver that can handle non-convex quadratic problems.
The type of problem I am actually interested in solving has the following structure:
where w is the vector variable to be optimized, X is a known data matrix, and t is a prespecified parameter value.
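For reference, a minimal sketch (with stand-in data, not the actual problem from the picture above) of how a local solver such as scipy.optimize.minimize with SLSQP can take a non-convex quadratic objective written in plain numpy matrix algebra. Note that SLSQP only returns a local optimum; for certified global optimality on non-convex QPs you would need a global solver (e.g. Gurobi, or BARON/Couenne through Pyomo).

```python
import numpy as np
from scipy.optimize import minimize

# Stand-in data: the actual objective/constraints from the question are not
# reproduced here, so these matrices are purely illustrative.
rng = np.random.default_rng(0)
n = 5
X = rng.normal(size=(20, n))            # known data matrix
Q = X.T @ X - 25.0 * np.eye(n)          # shifted so Q is (very likely) indefinite
t = 1.0                                 # prespecified parameter

def objective(w):
    return w @ Q @ w                    # ordinary numpy matrix algebra

# Illustrative smooth inequality constraint involving t:  w'w <= t
constraints = [{"type": "ineq", "fun": lambda w: t - w @ w}]

res = minimize(objective, x0=np.full(n, 0.1), method="SLSQP",
               constraints=constraints)
print(res.x, res.fun)                   # a local optimum, not necessarily global
```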

Related

How to efficiently formalize an optimization problem for python

I have an optimization problem that I could set up either as a rather complex univariate problem with a non-linear objective function, or as a multivariate problem with a linear objective function.
Is it possible to generalize, from a programming perspective, whether one is more efficient than the other? I.e., even without going into the details of the functions and constraints, are solvers more efficient with functionally complex univariate problems than with simple multivariate problems, or vice versa?
I tried to google the problem, but the results I found all looked at examples that didn't even require numerical solving (e.g. this).
Once I've figured out the best approach, I'll implement it in Python.

Solving an se(3) optimization problem using SciPy?

I want to use the scipy.optimize package to solve a rigid-body optimization problem (finding the optimal translation and rotation of an object).
One way to parametrize this problem is through Lie group theory and the Lie algebra se(3) (often called twist coordinates). This means the state vector is composed of 6 parameters (i.e. pose = [rho_x, rho_y, rho_z, theta_0, theta_1, theta_3]) from which the SE(3) transformation matrix (a composition of a translation vector and a rotation matrix) can be obtained through the so-called 'wedge' or 'hat' operator and the matrix exponential. (I found this cheat-sheet to be an excellent overview.)
Now my question: can scipy be used to optimize a 3D pose that is parametrized by the twist coordinates? I read somewhere that such problems require 'on-manifold optimization' methods. So can I use scipy's least_squares() or minimize(method="BFGS") for this, or do these only work for flat Euclidean spaces?
Alternatively, it would also help me a lot if anyone knows of an example where scipy is used to solve a 3D pose optimization problem. I have been looking for examples for the last couple of days, but without much success.
This question is an implementation-specific part of a more general question I posted on the Robotics Stack Exchange here.
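For what it's worth, since the twist coordinates already live in R^6, one common shortcut is to treat them as ordinary Euclidean decision variables, rebuild the SE(3) matrix with scipy.linalg.expm inside the cost function, and hand everything to scipy.optimize.minimize. A rough sketch for a hypothetical point-cloud registration cost (not the actual objective from the question):

```python
import numpy as np
from scipy.linalg import expm
from scipy.optimize import minimize

def hat(xi):
    """4x4 twist matrix from a 6-vector xi = [rho (translation), theta (rotation)]."""
    rho, theta = xi[:3], xi[3:]
    Theta = np.array([[0.0, -theta[2], theta[1]],
                      [theta[2], 0.0, -theta[0]],
                      [-theta[1], theta[0], 0.0]])
    T = np.zeros((4, 4))
    T[:3, :3] = Theta
    T[:3, 3] = rho
    return T

def transform(xi, points):
    """Apply exp(hat(xi)) in SE(3) to an (N, 3) array of points."""
    T = expm(hat(xi))
    pts_h = np.hstack([points, np.ones((len(points), 1))])   # homogeneous coordinates
    return (T @ pts_h.T).T[:, :3]

# Hypothetical registration problem: recover the pose that maps src onto dst.
rng = np.random.default_rng(0)
src = rng.normal(size=(20, 3))
true_xi = np.array([0.3, -0.1, 0.2, 0.05, 0.1, -0.2])
dst = transform(true_xi, src)

def cost(xi):
    return np.sum((transform(xi, src) - dst) ** 2)

res = minimize(cost, x0=np.zeros(6), method="BFGS")
print(res.x)   # should land close to true_xi (local convergence)
```

This optimizes the twist coordinates in the flat parameter space, which is usually fine for small problems; dedicated on-manifold libraries (e.g. pymanopt) only become necessary if you want retraction-based updates directly on SO(3)/SE(3).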

Bi-level optimization using Python scipy.optimize

How can I code a bi-level optimization problem using scipy.optimize.minimize that consists of the following two levels, with (1) being the upper-level problem and (2) the lower-level problem it is subject to:
(1) minimize a linear programming problem, subject to
(2) the minimizer of a quadratic programming problem
I know how to write a single objective function for quadratic programming as a typical scipy.optimize.minimize routine (step 2 by itself), but how can I also add an upper-level minimization problem that has its own objective function? Can scipy handle a minimization problem subject to a lower minimization problem? A rough outline of the code using minimize and linprog would help.
I don't think the scipy solvers are very well suited for this.
A standard approach for bilevel problems is to form the KKT conditions of the inner problem and add these as constraints to the outer problem. The complementarity conditions are what make the problem hard: you need a global NLP solver or a MIP solver to solve this thing to proven optimality. Here are some formulations for linear bilevel problems. If the inner problem is a (convex) QP, things become a little more difficult, but the same idea can be applied (form the KKT conditions of the inner problem).
For more information see Bard, Practical Bilevel Optimization: Algorithms and Applications, Springer, 1998.
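As a rough illustration of that KKT reformulation (with a made-up toy problem, Pyomo as the modelling layer, and CBC assumed to be installed), the inner QP is replaced by its stationarity and complementarity conditions, and complementarity is linearised with a big-M binary:

```python
import pyomo.environ as pyo

# Toy bilevel problem (hypothetical numbers):
#   outer:  min_x  x - 2*y,          0 <= x <= 10
#   inner:  y solves  min_y  0.5*y**2 - x*y,   y >= 0
# Inner KKT conditions: y - x - lam = 0, lam >= 0, y >= 0, lam*y = 0.
M = 100.0                       # big-M; must bound y and lam (assumption)

m = pyo.ConcreteModel()
m.x   = pyo.Var(bounds=(0, 10))
m.y   = pyo.Var(bounds=(0, M))
m.lam = pyo.Var(bounds=(0, M))
m.z   = pyo.Var(domain=pyo.Binary)

# Stationarity of the inner Lagrangian: d/dy(0.5*y^2 - x*y) - lam = 0
m.stat  = pyo.Constraint(expr=m.y - m.x - m.lam == 0)
# Complementarity lam*y = 0, linearised with the binary z
m.comp1 = pyo.Constraint(expr=m.y   <= M * m.z)
m.comp2 = pyo.Constraint(expr=m.lam <= M * (1 - m.z))

# Outer (upper-level) objective over x and the inner response y
m.obj = pyo.Objective(expr=m.x - 2 * m.y, sense=pyo.minimize)

pyo.SolverFactory("cbc").solve(m)   # everything is linear here, so a MILP solver suffices
print(pyo.value(m.x), pyo.value(m.y))
```

For a non-trivial inner QP the stationarity constraint stays linear as long as the QP data are fixed, so the reformulated single-level problem remains a MILP (or MIQP if the outer objective is quadratic).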

Find equilibrium points and eigen values of a system of differential equations

I have a system of non-linear differential equations that models memory processes in the brain. I am using scipy.integrate.odeint to solve this system of ODEs. This first step executes fine, and I get a nicely oscillating system.
The next step, however, is to find the equilibrium points and eigenvalues of this oscillating system, which will show what kind of bifurcation I am seeing. In Python, this seems to be an impossible task. As far as I have googled, people talk about using AUTO in XPPAUT, but I am not comfortable using another tool, as that involves a lot of redundant work.
Is it possible to find the equilibrium points and eigenvalues of a system of ODEs in Python?
Please help.
Thanks in advance.
Esash
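It is possible with scipy and numpy alone. A hedged sketch (with a made-up two-variable system standing in for the actual brain model): find equilibria by applying scipy.optimize.fsolve to the right-hand side, then evaluate the eigenvalues of the Jacobian at each equilibrium with numpy.linalg.eigvals to classify its stability.

```python
import numpy as np
from scipy.optimize import fsolve

# Hypothetical 2-variable system dx/dt = f(x); replace with the actual model.
def f(state):
    x, y = state
    return [x * (1 - x) - x * y,     # example right-hand side
            y * (x - 0.5)]

def jacobian(state, eps=1e-8):
    """Numerical Jacobian of f by forward differences."""
    state = np.asarray(state, dtype=float)
    f0 = np.asarray(f(state))
    J = np.zeros((len(f0), len(state)))
    for i in range(len(state)):
        pert = state.copy()
        pert[i] += eps
        J[:, i] = (np.asarray(f(pert)) - f0) / eps
    return J

# Find an equilibrium near an initial guess, then classify it by its eigenvalues
eq = fsolve(f, x0=[0.4, 0.4])
eigvals = np.linalg.eigvals(jacobian(eq))
print("equilibrium:", eq, "eigenvalues:", eigvals)
```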

Multivariate optimization in Python

Can anyone please let me know how to solve multivariate optimization problems in Python? The objective function is given in picture 1 and a few constraints in picture 2.
Objective function
Constraints
You are asking how to solve a linear integer programming problem in Python. You could use Pyomo, PuLP or python-MIP to model your optimization problem and solve it with a suitable solver of your choice. Gurobi and CPLEX are common commercial solvers (both offer a free academic licence), while GLPK, SCIP and CBC are commonly used non-commercial ones.
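Since the actual objective and constraints are only shown as pictures, here is a generic PuLP sketch with made-up numbers, just to show the modelling pattern (the CBC solver ships bundled with PuLP, so no extra install is needed):

```python
import pulp

# Hypothetical toy integer programme; substitute the actual objective and
# constraints from the question's pictures.
prob = pulp.LpProblem("toy_ilp", pulp.LpMaximize)
x = pulp.LpVariable("x", lowBound=0, cat="Integer")
y = pulp.LpVariable("y", lowBound=0, cat="Integer")

prob += 3 * x + 2 * y              # objective
prob += 2 * x + y <= 10            # constraint 1
prob += x + 3 * y <= 15            # constraint 2

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print(pulp.LpStatus[prob.status], x.value(), y.value())
```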
