I'm trying to demonstrate a concept like the Fourier transform. While searching the web, I encountered an image on Wikipedia:
Is it possible to plot this figure in Python or MATLAB?
Have a look at the documentation of plot3 and patch as well as some standard plot tools.
This code produces the following image:
t = 0:.01:2*pi;
x1 = 1/2*sin(2*t);
x2 = 1/3*sin(4*t);
x3 = 1/4*sin(8*t);
x4 = 1/6*sin(16*t);
x5 = 1/8*sin(24*t);
x6 = 1/10*sin(30*t);
step = double(x1>0);
step(step==0) = -1;
step = step*.5;
figure
hold on
plot3(t,ones(size(t))*0,step,'r')
plot3(t,ones(size(t))*1,x1,'b')
plot3(t,ones(size(t))*2,x2,'b')
plot3(t,ones(size(t))*3,x3,'b')
plot3(t,ones(size(t))*4,x4,'b')
plot3(t,ones(size(t))*5,x5,'b')
plot3(t,ones(size(t))*6,x6,'b')
plot3([2*pi+.5 2*pi+.5],[.5 6],[0 0],'b')
plot3([2*pi+.5 2*pi+.5],[1 1],[0 1/2],'b')
plot3([2*pi+.5 2*pi+.5],[2 2],[0 1/3],'b')
plot3([2*pi+.5 2*pi+.5],[3 3],[0 1/4],'b')
plot3([2*pi+.5 2*pi+.5],[4 4],[0 1/6],'b')
plot3([2*pi+.5 2*pi+.5],[5 5],[0 1/8],'b')
plot3([2*pi+.5 2*pi+.5],[6 6],[0 1/10],'b')
hold off
view([45,45])
patch([0 2*pi 2*pi 0 0],[0 0 0 0 0],[-1 -1 1 1 -1],'g','FaceAlpha',.3,'EdgeColor','none')
patch([2*pi+.5 2*pi+.5 2*pi+.5 2*pi+.5 2*pi+.5],[.5 6 6 .5 .5],[-1 -1 1 1 -1],'g','FaceAlpha',.3,'EdgeColor','none')
zlim([-1,1])
xlim([-.5,2*pi+.5])
ylim([-.5,6.5])
axis off
It could serve as a starting point.
Since you have already read the article about the FFT, I leave the red plot as an exercise for you ;-)
The function to plot 3D lines is plot3.
The following code will produce the various lines:
T=(0:.01:2).';
X = repmat(1:6,[length(T),1]);
phase = bsxfun(@times,T*2*pi,1:2:11);
Z = 4/pi*bsxfun(@rdivide,sin(phase),1:2:11);
Xsum = zeros(size(T));
Zsum = sum(Z,2);
figure;
plot3(X,T,Z,'b');
hold on
plot3(Xsum,T,Zsum,'r');
patch objects with an alpha channel can be used for the grey surfaces.
Xpatch=zeros(4,1);
Ypatch= [0 2 2 0].';
Zpatch= [2 2 -2 -2].';
patch(Xpatch,Ypatch,Zpatch,[.5 .5 .5],'FaceAlpha',.3,'EdgeColor',[.5 .5 .5]);
% patch(X,Y,Z,FaceColor_RGB_triplet,'Name','Value',...)
% FaceAlpha : transparency
% EdgeColor : RGB triplet for the edge
The same approach can be used to plot the frequency spectrum.
I have a dataset of x and y coordinates of eye gaze data with fixation duration.
I want to plot a heatmap over a PNG image so that the output looks like the picture in the link below.
How do I plot it in Python?
Let's assume the dataset is as shown below.
Each entry holds x, y and time, e.g. [900.399, 980.142, 0.78]; the longest time represents a high temperature (a hot spot) and the shortest time a low temperature.
x and y are the coordinates of the eye fixation on the image, so they range over the image's width and height.
data = [[900.399, 980.142, 0.78], [922.252, 880.885, 0.68], [724.311, 780.543, 0.58],
        [523.195, 582.994, 0.46], [623.431, 680.427, 0.76], [926.363, 881.791, 1.81],
        [722.942, 783.257, 0.75], [223.751, 279.995, 0.16], [723.215, 781.004, 0.64],
        [724.541, 779.889, 0.55]]
Let's also assume the image I want to plot on has width and height [1920, 1080].
Can someone help me design a method in Python to generate the heatmap?
https://i.insider.com/53ce61e16bb3f7dd693ffa82?width=1000&format=jpeg&auto=webp
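One common approach, shown here as a minimal sketch (the background file name and the blur width sigma=40 are assumptions, not values from the question), is to accumulate the fixation durations into a 2D grid, smooth it with a Gaussian filter, and overlay it on the image:
import numpy as np
import matplotlib.pyplot as plt
from scipy.ndimage import gaussian_filter

width, height = 1920, 1080
data = [[900.399, 980.142, 0.78], [922.252, 880.885, 0.68], [724.311, 780.543, 0.58]]  # first triplets from the question

heat = np.zeros((height, width))
for x, y, duration in data:
    heat[int(y), int(x)] += duration      # longer fixation -> hotter spot

heat = gaussian_filter(heat, sigma=40)    # spread each fixation into a blob

img = plt.imread("background.png")        # hypothetical background image
plt.imshow(img, extent=[0, width, height, 0])
plt.imshow(heat, cmap="jet", alpha=0.5, extent=[0, width, height, 0])
plt.axis("off")
plt.show()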
I am trying to find a solution to a problem. I have searched, and the closest thing I can find is a post on interpolation from these forums, with which I have had limited success.
I have multiple polygons, each defined as a separate scatter plot given by a number of x-y points. What I am attempting to do is draw a horizontal line at a given y-value and find the x-min and x-max values at which that line intersects the polygon. I would like to do this for the full range of y-values, so in theory I could step through a loop and record the values at y=1, y=2, etc. Engineering software I am using requires input parameters in this format, hence my attempt to find a solution.
Any advice or pointers on the best approach to this problem would be much appreciated, and I will give it a go.
import matplotlib.pyplot as plt
import numpy as np
x =[1,2.6,2.56,2.57,10,11.66,13.07,11.78,11.27,6.49,5.98,5.76,3.02,1.87,1]
y =[15.59,15.09,15.14,15.15,16,17,25.47,26,27,27,28,28,26.67,16.37,15.59]
plt.plot(x,y)
plt.grid()
plt.show()
You can linearly interpolate between the two points where the polygon crosses the line defined by y == ys. To find those points, subtract the ys value from the y values of the polygon, find the pairs of neighbouring points between which the sign changes, and take the minimum and maximum of the interpolated crossings.
import numpy as np
import matplotlib.pyplot as plt

x = [1,2.6,2.56,2.57,10,11.66,13.07,11.78,11.27,6.49,5.98,5.76,3.02,1.87,1]
y = [15.59,15.09,15.14,15.15,16,17,25.47,26,27,27,28,28,26.67,16.37,15.59]

def findminmax(t, x, zero=0):
    """Return the min and max t at which the polyline (t, x) crosses x == zero."""
    t = np.array(t); x = np.array(x)
    ta = []
    p = (x - zero) > 0                                # sign of each vertex relative to the line
    ti = np.where(np.bitwise_xor(p[1:], p[:-1]))[0]   # indices where the sign changes
    for i in ti:
        y_ = np.sort(x[i:i+2])
        z_ = t[i:i+2][np.argsort(x[i:i+2])]
        t_ = np.interp(zero, y_, z_)                  # linearly interpolated crossing
        ta.append(t_)
    if ta:
        return min(ta), max(ta)
    else:
        return None, None

plt.plot(x, y)

ys = np.arange(13, 29, 0.2)
result = []
for s in ys:
    mi, ma = findminmax(x, y, zero=s)
    if mi and ma:
        result.append([mi, ma, s])
        print("y = {}, minimum {}, maximum {}".format(s, mi, ma))

result = np.array(result)
plt.scatter(result[:,0], result[:,2], label="min", color="limegreen")
plt.scatter(result[:,1], result[:,2], label="max", color="crimson")
plt.legend()
plt.grid()
plt.show()
I'm trying to design an IIR notch filter in Python using NumPy arrays and the SciPy library to remove a sine tone from an imported wave file (I'm using the wave module to do so). My file was generated by Adobe Audition: it is a pure sine at 1.2 kHz, sampled at 48, 96 or 192 kHz, in order to have "pseudo-periodic" data for my circular FFT (just ask if I'm not clear enough).
Here is the code I used to implement the coefficients of my filter (I took the coefficients from the article "Second-order IIR Notch Filter Design and Implementation of Digital Signal Processing System" by C. M. Wang & W. C. Xiao):
f_cut = 1200.0
wn = f_cut/rate
r = 0.99
B, A = np.zeros(3), np.zeros(3)
A[0],A[1],A[2] = 1.0, -2.0*r*np.cos(2*np.pi*wn), r*r
B[0],B[1],B[2] = 1.0, -2.0*np.cos(2*np.pi*wn), 1.0
filtered = signal.lfilter(B, A, data_flt_R, axis=0)
where data_flt_R is a NumPy array containing my right channel as float64, and rate is my sampling frequency. I plot the frequency response and the FFT of my data using the matplotlib module to see if everything is OK:
N = len(data_flt_R)
w, h = signal.freqz(B,A, N)
pyplot.subplot(2,1,1)
pyplot.semilogx(w*rate/(2*np.pi), 20*np.log10(np.absolute(h)))
fft1 = fftpack.fft(data_flt_R, N)
fft_abs1 = np.absolute(fft1)
ref = np.nanmax(fft_abs1)
dB_unfiltered = 20*np.log10(fft_abs1/ref)
fft2 = fftpack.fft(filtered, N)
fft_abs2 = np.absolute(fft2)
dB_filtered = 20*np.log10(fft_abs2/ref)
freqs = fftpack.fftfreq(N, 1.0/rate)   # frequency axis in Hz
pyplot.subplot(2,1,2)
pyplot.semilogx(freqs, dB_unfiltered, 'r', label='unfiltered')
pyplot.semilogx(freqs, dB_filtered, 'b', label='filtered')
pyplot.grid(True)
pyplot.legend()
pyplot.ylabel('power spectrum (in dB)')
pyplot.xlim(10,rate/2)
pyplot.xlabel('frequencies (in Hz)')
And here is what I get:
I don't understand the results and values I get before and after my fc. Shouldn't I get a plot which looks like the red one but without the main peak? Why do I have a slope at high frequencies? Is this linked to windowing?
Moreover, the result changes if I change my sampling frequency and/or the data length (16, 24 or 32 bits). Can anyone enlighten me?
I have a binary file from which I have to read data. The file consists of a 128x128x243 matrix (hex-formatted) which I have read with the following code:
with open("zubal_voxel_man.dat", "rb") as fileHandle:
dim_x = 128
dim_y = 128
dim_z = 243
data = np.zeros((dim_x,dim_y,dim_z), dtype=np.int)
for p in range(0, dim_x):
for q in range (0, dim_y):
for r in range(0, dim_z):
data[p][q][r] = ord(fileHandle.read(1))
How do I visualize these data with Python? Each x,y,z position has a value from 0 to 255 (grey scale) which I would like to render.
Any help is greatly appreciated!
Part of your problem is with the code:
datax = data[:,0]
datay = data[:,1]
dataz = data[:,2]
This is not doing what you expect: rather than slicing along a single axis, it takes the slice at Y=0, then Y=1, then Y=2, and plots them against each other. Your other issue is that you have a three-dimensional array of values, which gives each point four dimensions (X, Y, Z, value), yet you are trying to plot these as a surface, which only has three dimensions.
I think your first priority is to clarify what your data represents and how it is structured.
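That said, if the goal is simply to inspect the greyscale volume, a minimal sketch (assuming data is the 128x128x243 array read in the question; the slice index is an arbitrary choice) is to look at it one z-slice at a time:
import matplotlib.pyplot as plt

z_index = 120  # arbitrary slice to inspect; data is the array read in the question
plt.imshow(data[:, :, z_index], cmap="gray", vmin=0, vmax=255)
plt.title("slice z = {}".format(z_index))
plt.colorbar(label="voxel value")
plt.show()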
I am dealing with a rather huge data set on which I need to do binary classification using a kernelized perceptron. I am using this source code: https://gist.github.com/mblondel/656147 .
There are three things that can be parallelized here: 1) the kernel computation, 2) the update rule, 3) the projection part. I also did some other speed-ups, like computing only the upper triangular part of the kernel and then making it a full symmetric matrix:
K = np.zeros((n_samples, n_samples))
for index in itertools.combinations_with_replacement(range(n_samples), 2):
    K[index] = self.kernel(X[index[0]], X[index[1]], self.gamma)
# make the full KERNEL
K = K + np.triu(K, 1).T
I also parallelized the projection part like this:
def parallel_project(self, X):
    """Function for parallelizing prediction"""
    y_predict = np.zeros(self.nOfWorkers, "object")
    pool = mp.Pool(processes=self.nOfWorkers)
    results = [pool.apply_async(prediction_worker,
                                args=(self.alpha, self.sv_y, self.sv, self.kernel, (parts,)))
               for parts in np.array_split(X, self.nOfWorkers)]
    pool.close()
    pool.join()
    i = 0
    for r in results:
        y_predict[i] = r.get()
        i += 1
    return np.hstack(y_predict)
and the worker:
def prediction_worker(alpha, sv_y, sv, kernel, samples):
    """Worker for parallelizing the prediction part"""
    print("starting: " + mp.current_process().name)
    X = samples[0]
    y_predict = np.zeros(len(X))
    for i in range(len(X)):
        s = 0
        for a1, sv_y1, sv1 in zip(alpha, sv_y, sv):
            s += a1 * sv_y1 * kernel(X[i], sv1)
        y_predict[i] = s
    return y_predict.flatten()
But the code is still too slow. So can you give me any hint regarding parallelization or any other speed-up?
Remark: please provide a general solution; I am not dealing with customized kernel functions.
Thanks.
Here's something that should give you an instant speedup. The kernels in Mathieu's example code take single samples, but then full Gram matrices are computed using them:
K = np.zeros((n_samples, n_samples))
for i in range(n_samples):
    for j in range(n_samples):
        K[i, j] = self.kernel(X[i], X[j])
This is slow, and can be avoided by vectorizing the kernel functions:
def linear_kernel(X, Y):
    return np.dot(X, Y.T)

def polynomial_kernel(X, Y, p=3):
    return (1 + np.dot(X, Y.T)) ** p

# the Gaussian RBF kernel is a bit trickier (see the sketch at the end of this answer)
Now the Gram matrix can be computed as just
K = kernel(X, X)
The project function should be changed accordingly to speed that up as well.
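For the RBF case, a minimal sketch of a vectorized version (this is not from Mathieu's gist; the gamma parameterization is an assumption) could look like this:
import numpy as np

def gaussian_kernel(X, Y, gamma=0.1):
    # Squared Euclidean distances between every row of X and every row of Y,
    # computed without an explicit Python loop: ||x||^2 + ||y||^2 - 2*x.y
    X = np.atleast_2d(X)
    Y = np.atleast_2d(Y)
    sq_dists = (np.sum(X ** 2, axis=1)[:, None]
                + np.sum(Y ** 2, axis=1)[None, :]
                - 2 * np.dot(X, Y.T))
    return np.exp(-gamma * sq_dists)

K = gaussian_kernel(X, X) then gives the full Gram matrix in a single call, and the same function can be reused inside project by calling it on the support vectors and the test samples.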