Defining several functions (Lambda) in sympy - python

I am trying to define several functions of, let's say, one variable.
First I defined x variable:
from sympy import *
x=var('x')
I want to define a series of functions using Lambda, something like this:
f0=Lambda(x,x)
f1=Lambda(x,x**2)
....
fn=....
How can I define this?
Thanks

It took me a while to understand what you're after. It seems that you most likely want a loop, although you do not say so. I would suggest this:
from sympy import *
x = symbols('x')
f = []
for i in range(1, 11):  # generate exponents 1 to 10
    f.append(Lambda(x, x**i))
# then you use them like this
print(f[0](2))  # indexes are 0-based
print(f[1](2))
# ... up to f[9]
That said, your question is not really clear about what the progression should be.
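The same family of functions can also be built with a list comprehension, which reads closer to the mathematical definition; a minimal sketch:

```python
from sympy import Lambda, symbols

x = symbols('x')

# Same family as the loop above: f[i-1](x) = x**i for i = 1..10
f = [Lambda(x, x**i) for i in range(1, 11)]

print(f[0](2))  # 2
print(f[1](2))  # 4
```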
EDIT: As for generating random functions, here is an example that generates a polynomial of growing order plus a random subset of lower-order terms:
from random import randint
from sympy import *
x = symbols('x')
f = []
for i in range(1, 11):
    fun = x**i
    for j in range(i):
        fun += randint(0, 1) * x**j
    f.append(Lambda(x, fun))
# then you use them like this
# note: I am using Python 2.7; if you use 3.x, wrap the map calls in list()
print(map(str, f))  # show them all
def call(me):
    return me(2)
print(map(call, f))
Your random procedure may differ; there are infinitely many ways to randomize. Note that the result is different each time you run the creation loop; use a random seed if you need the same generation between runs. Once created, the functions are stable within a process.
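The seeding advice above, as a minimal sketch with the stdlib random module:

```python
import random

random.seed(42)  # fix the seed so each run draws the same coefficients
first = [random.randint(0, 1) for _ in range(5)]

random.seed(42)  # re-seeding replays exactly the same sequence
second = [random.randint(0, 1) for _ in range(5)]

print(first == second)  # True
```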

Related

Modify or add functions to an existing package

I want to modify functions from a package in Python. To be precise, I'll take the example of the fastkde package. I have two questions.
Looking at the source code of the function pdf_at_points, the object returned is pdf. Instead, I'd like to return the _pdfobj object. How can I do so without modifying the source code? Ideally, I'd like to do it in an arbitrary script, so that my code is "transferable".
The last two lines of the pdf_at_points function call the applyBernacchiaFilter and __transformphiSC_points__ functions, which cannot be called from a script. How can I make them accessible in an arbitrary script?
Here's an idea of the ideal output:
from fastkde import fastKDE
import numpy as np
N = int(2e5)  # size must be an integer
var1 = 50*np.random.normal(size=N) + 0.1
var2 = 0.01*np.random.normal(size=N) - 300
var3 = 50*np.random.normal(size=N) + 0.1
var4 = 0.01*np.random.normal(size=N) - 300
test_points = list(zip(var3, var4))
# Some lines of code here that I'm looking for,
# that would create the pdf_at_points_modified function
# and that would make applyBernacchiaFilter and __transformphiSC_points__
# functions usable
myPDF = fastKDE.pdf_at_points_modified(var1,var2)
myPDF.applyBernacchiaFilterModified()
pred = myPDF.__transformphiSC_points__(test_points)
pdf_at_points_modified corresponds to the modified pdf_at_points function.
Thanks in advance.
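No answer is recorded here, but the usual pattern for this kind of change is monkey-patching: attach a modified copy of the function to the imported module at runtime, leaving the package source untouched. A minimal sketch on a stand-in module (pkg and its functions are hypothetical illustrations, not the real fastkde API):

```python
import types

# Stand-in for a third-party module (hypothetical, not fastkde itself)
pkg = types.ModuleType("pkg")

def pdf_at_points(x):
    internal = x * 2      # imagine this is the private object you want
    return internal + 1   # ...but the function returns something else

pkg.pdf_at_points = pdf_at_points

# From your own script, attach a modified copy to the module at runtime;
# the package source on disk is never changed.
def pdf_at_points_modified(x):
    internal = x * 2
    return internal       # return the intermediate object instead

pkg.pdf_at_points_modified = pdf_at_points_modified

print(pkg.pdf_at_points(3))           # 7: original behaviour
print(pkg.pdf_at_points_modified(3))  # 6: patched copy
```

Note that module-level functions with underscore names are usually still reachable as plain attributes of the module object; double-underscore name mangling only applies to names defined inside a class.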

dask.delayed and import statements

I'm learning dask and I want to generate random strings. But this only works if the import statements are inside the function f.
This works:
import dask
from dask.distributed import Client, progress
c = Client(host='scheduler')
def f():
    from random import choices
    from string import ascii_letters
    rand_str = lambda n: ''.join(choices(population=list(ascii_letters), k=n))
    return rand_str(5)
xs = []
for i in range(3):
    x = dask.delayed(f)()
    xs.append(x)
res = c.compute(xs)
print([r.result() for r in res])
This prints something like ['myvDi', 'rZnYO', 'MyzaG']. This is good, as the strings are random.
This, however, doesn't work:
from random import choices
from string import ascii_letters
import dask
from dask.distributed import Client, progress
c = Client(host='scheduler')
def f():
    rand_str = lambda n: ''.join(choices(population=list(ascii_letters), k=n))
    return rand_str(5)
xs = []
for i in range(3):
    x = dask.delayed(f)()
    xs.append(x)
res = c.compute(xs)
print([r.result() for r in res])
This prints something like ['tySQP', 'tySQP', 'tySQP'], which is bad because all the random strings are the same.
So I'm curious how I'm going to distribute large non-trivial code. My goal is to be able to pass arbitrary json to a dask.delayed function and have that function perform analysis using other modules, like google's ortools.
Any suggestions?
Python's random module is odd.
It creates some global state when it is first imported and uses that state when generating random numbers. Unfortunately, carrying this state around makes it difficult to serialize and move between processes.
Your solution of importing random within your function is what I do.
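The state-copying effect can be demonstrated without dask at all. In this stdlib sketch, pickle stands in for the serialization that ships your function to each worker: every deserialized copy of a Random instance continues the exact same sequence.

```python
import pickle
import random

# Serializing a Random instance copies its internal state, so every
# deserialized copy continues the same sequence -- the same effect as
# shipping module-level random state to each worker.
rng = random.Random(0)
blob = pickle.dumps(rng)

copies = [pickle.loads(blob) for _ in range(3)]
values = [c.random() for c in copies]
print(values)  # three identical values, like the repeated 'tySQP'

# Re-seeding inside each "task" restores independence
reseeded = []
for c in copies:
    c.seed()  # no argument: seeds from OS entropy
    reseeded.append(c.random())
print(reseeded)  # three different values
```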

How do i iterate functions in Python to create array?

In Scala, since it is a functional programming language, I can repeatedly apply a function to a starting value to create an array of [f(initial), f(f(initial)), f(f(f(initial))), ...].
For example, if I want to predict the future temperature based on the current temperature, I can do something like this in Python:
import random as rnd
def estimateTemp(previousTemp):
    # function to estimate the temperature; for simplicity assume it is as follows:
    return previousTemp * rnd.uniform(0.8, 1.2) + rnd.uniform(-1.0, 1.0)
Temperature = [0.0 for i in range(100)]
for i in range(1, 100):
    Temperature[i] = estimateTemp(Temperature[i-1])
The problem with the previous code is that it uses a for loop and requires a predefined array for the temperature; in many languages you can replace the for loop with an iterator. For example, in Scala you can easily do the previous example by using the iterate method to create a list:
val Temperature = List.iterate(0.0, 100)( n =>
  (n * (scala.util.Random.nextDouble()*0.4+0.8)) +
  (scala.util.Random.nextDouble()*2-1)
)
Such an implementation is easy to follow and clearly written.
Python has the itertools module, which imitates some functional programming idioms. Does itertools have a method that imitates Scala's iterate?
You could turn your function into an infinite generator and take an appropriate slice:
import random as rnd
from itertools import islice
def estimateTemp(startTemp):
    while 1:
        yield startTemp
        startTemp = startTemp * rnd.uniform(0.8, 1.2) + rnd.uniform(-1.0, 1.0)
temperature = list(islice(estimateTemp(0.0), 0, 100))
An equivalent program can be produced using itertools.accumulate (with the one-step estimateTemp from the question, not the generator above):
from itertools import accumulate
accumulate(range(0, 100), lambda x, y: estimateTemp(x))
So here we have an accumulator x that is updated; the y parameter (the next element of the iterable) is ignored, so the range serves only as a way to iterate 100 times.
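A runnable version of the accumulate approach, with itertools.repeat supplying the initial value and the ignored dummy elements (the seed is only there to make runs reproducible):

```python
import random as rnd
from itertools import accumulate, repeat

rnd.seed(0)  # only for reproducibility between runs

def estimateTemp(previousTemp):
    return previousTemp * rnd.uniform(0.8, 1.2) + rnd.uniform(-1.0, 1.0)

# repeat(0.0, 100) yields the initial value plus 99 ignored dummies;
# accumulate threads the running temperature through estimateTemp.
temperature = list(accumulate(repeat(0.0, 100), lambda acc, _: estimateTemp(acc)))
print(len(temperature))  # 100
```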
Unfortunately, itertools does not have this functionality built-in. Haskell and Scala both have this function, and it bothered me too. An itertools wrapper called Alakazam that I am developing has some additional helper functions, including the aforementioned iterate function.
Runnable example using Alakazam:
import random as rnd
import alakazam as zz
def estimateTemp(previousTemp):
    return previousTemp * rnd.uniform(0.8, 1.2) + rnd.uniform(-1.0, 1.0)
Temperature = zz.iterate(estimateTemp, 0.0).take(100).list()
print(Temperature)

How to check python codes by reduction?

import numpy
def rtpairs(R, T):
    for i in range(numpy.size(R)):
        o = 0.0
        for j in range(T[i]):
            o += 2*numpy.pi/T[i]
            yield R[i], o
R=[0.0,0.1,0.2]
T=[1,10,20]
for r, t in genpolar.rtpairs(R, T):
    plot(r*cos(t), r*sin(t), 'bo')
This program is supposed to be a generator, but I would like to check whether I'm doing the right thing by first asking it to return some values for pheta (see below):
import numpy as np
def rtpairs(R=None, T=None):
    R = np.array(R)
    T = np.array(T)
    for i in range(np.size(R)):
        pheta = 0.0
        for j in range(T[i]):
            pheta += (2*np.pi)/T[i]
    return pheta
Then I typed import omg as o at the prompt and ran:
x = [o.rtpairs(R=[0.0,0.1,0.2], T=[1,10,20])]
# I tried to collect all values generated by the loops
It turns out this gives me only one value, which is 2π. I have a habit of checking my code halfway through. Is there a way to get a list of angles from the code above? I don't understand why the generator version must be checked differently, when with a normal loop I can check the values directly.
Normal loop e.g.
x=[i for i in range(10)]
x=[0,1,2,3,4,5,6,7,8,9]
Here I can see a list of values I should get.
return pheta
You switched to return instead of yield. It isn't a generator any more; it's stopping at the first return. Change it back.
x = [o.rtpairs(R=[0.0,0.1,0.2],T=[1,10,20])]
This wraps the rtpairs return value in a 1-element list. That's not what you want. If you want to extract all elements from a generator and store them in a list, call list on the generator:
x = list(o.rtpairs(R=[0.0,0.1,0.2],T=[1,10,20]))
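Putting both fixes together, a sketch of the corrected generator (assuming the yield belongs inside the inner loop so every angle is produced; math.pi stands in for numpy.pi to keep it self-contained):

```python
import math

def rtpairs(R, T):
    # Yield (radius, angle) pairs: T[i] equally spaced angles on circle R[i]
    for i in range(len(R)):
        pheta = 0.0
        for j in range(T[i]):
            pheta += 2 * math.pi / T[i]
            yield R[i], pheta

# list() drains the generator, so every intermediate angle can be inspected
pairs = list(rtpairs([0.0, 0.1, 0.2], [1, 10, 20]))
print(len(pairs))  # 31 pairs: 1 + 10 + 20
```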

Parallelize resolution of differential equation in Python

I am solving a system of ordinary differential equations using the odeint function. Is it possible (and if so, how) to easily parallelize this kind of problem?
The other answer is wrong: solving an ODE numerically requires evaluating the function f(t, y) = y' several times per iteration, e.g. four times for Runge-Kutta. But I don't know of any Python package that does this.
Numerically integrating an ODE is an intrinsically sequential operation, since you need each result to compute the following one (well, except if you're integrating from multiple starting points). So I guess the answer is no.
EDIT: Wow, I've just realised this question is more than 3 years old. I'll still leave my answer hoping it finds its way to someone in the same predicament. Sorry for that.
I had the same problem, and I was able to parallelize the process as follows.
First you need dispy. In there you'll find some programs that will do the parallelization for you. I am not an expert on dispy, but I had no problems using it, and I didn't need to configure anything.
So, how to use it?
Run python dispynode.py -d. If you do not run this script before running your main program, the parallel jobs won't be performed.
Run your main program. Here I post the one I used (sorry for the mess). You'll need to change the function sim, and adapt what is done with the results to your needs. I hope my program works as a reference for you.
import os, sys, inspect
# Add dispy to your path
cmd_folder = os.path.realpath(os.path.abspath(os.path.split(inspect.getfile( inspect.currentframe() ))[0]))
if cmd_folder not in sys.path:
    sys.path.insert(0, cmd_folder)
# use this if you want to include modules from a subfolder
cmd_subfolder = os.path.realpath(os.path.abspath(os.path.join(os.path.split(inspect.getfile( inspect.currentframe() ))[0], cmd_folder+"/dispy-3.10/")))
if cmd_subfolder not in sys.path:
    sys.path.insert(0, cmd_subfolder)
#----------------------------------------#
#This function contains the differential equation to be simulated.
def sim(ic, e, O):  # ic=initial conditions; e=Epsilon; O=Omega
    from scipy.integrate import ode
    import numpy as np
    # Diff. Eq.
    def sys(t, x, e, O, z, b, l):
        p = 2.*e*O*np.sin(O*t)*(1-e*np.cos(O*t))/(z+(1-e*np.cos(O*t))**2)
        q = (1+4.*b/l*np.cos(O*t))*(z+(1-e*np.cos(O*t)))/( z+(1-e*np.cos(O*t))**2 )
        dx = np.zeros(2)
        dx[0] = x[1]
        dx[1] = -q*x[0] - p*x[1]
        return dx
    # Simulation.
    t0=0; tEnd=10000.; dt=0.1
    r = ode(sys).set_integrator('dop853', nsteps=10, max_step=dt)  # definition of the integrator
    Y=[]; S=[]; T=[]
    # - parameters - #
    z=0.5; l=1.0; b=0.06
    # -------------- #
    color=1
    r.set_initial_value(ic, t0).set_f_params(e, O, z, b, l)  # set the parameters, the initial condition and the initial time
    # Loop to integrate.
    while r.successful() and r.t + dt < tEnd:
        r.integrate(r.t+dt)
        Y.append(r.y)
        T.append(r.t)
        if r.y[0] > 1.25*ic[0]:  # bound; this is due to my own requirements
            color=0
            break
    # r.y contains the solutions and r.t contains the time vector.
    return e, O, color  # for each pair (e, O) return e, O and a color for the stability chart (0=unstable, 1=stable)
# ------------------------------------ #
#MAIN PROGRAM where the parallel magic happens
import matplotlib.pyplot as plt
import dispy
import numpy as np
F = 100  # total files
# Range of the values of Epsilon and Omega
Epsilon = np.linspace(0, 1, 100)
Omega_intervals = np.linspace(0, 4, F)
ic = [0.1, 0]
cluster = dispy.JobCluster(sim)  # assign the job sim to the cluster (array of processors)
jobs = []  # initialize the array of jobs
for i in range(F-1):
    Data_Array = []
    jobs = []
    Omega = np.linspace(Omega_intervals[i], Omega_intervals[i+1], 10)
    print(Omega)
    for e in Epsilon:
        for O in Omega:
            job = cluster.submit(ic, e, O)  # send the cluster a job with the specified parameters
            jobs.append(job)  # collect all the jobs submitted above
    cluster.wait()  # run the jobs
    for job in jobs:
        e, O, color = job()
        Data_Array.append([e, O, color])
    # Save the results of the simulation.
    file_name = 'Data' + str(i) + '.txt'
    f = open(file_name, 'a')
    f.write(str(Data_Array))
    f.close()
