I am trying to create a cylindrical 3D surface plot using Python, where my independent variables are z and theta, and the dependent variable is radius (i.e., radius is a function of vertical position and azimuth angle).
So far, I have only been able to find ways to create a 3D surface plot that:
has z as a function of r and theta
has r as a function of z, but does not change with theta (so, the end product looks like a revolved contour; for example, the case of r = sin(z) + 1 ).
I would like to have r as a function of z and theta, because my function will produce a shape that, at any given height, will be a complex function of theta.
On top of that, I need the surface plot to be able to have an open top or bottom (though it does not have to, depending on the properties of the function). For example, if r is constant from z = 0 to z = 1 (a perfect cylinder), I would want a surface plot consisting only of the side of the cylinder, not the top or bottom. The plot should look like a hollow shell.
I already have the function r defined.
Thanks for any help!
Apparently, after some trial and error, the best/easiest thing to do in this case is simply to convert the r, theta, and z data points (defined as 2D arrays, just like for an x, y, z plot) into Cartesian coordinates:
import numpy

# convert from cylindrical to rectangular (Cartesian) coordinates
x = r * numpy.cos(theta)
y = r * numpy.sin(theta)
z = z  # z carries over unchanged
The new x,y,z arrays can be plotted just like any other x,y,z arrays generated from a polynomial where z is a function of x,y. I had originally thought that the data points would get screwed up because of overlapping z values or maybe the adjacent data points would not be connected correctly, but apparently that is not the case.
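Putting it all together, here is a minimal self-contained sketch; the particular r(z, theta) used below is an illustrative assumption, not the asker's actual function. Because theta spans a full revolution and z spans only the side wall, the resulting shell is open at the top and bottom:

```python
import numpy as np
import matplotlib.pyplot as plt

# grid over the independent variables: height z and azimuth theta
z = np.linspace(0.0, 1.0, 50)
theta = np.linspace(0.0, 2.0 * np.pi, 50)
theta, z = np.meshgrid(theta, z)

# hypothetical radius function r(z, theta); substitute your own
r = 1.0 + 0.3 * np.sin(3.0 * theta) * z

# convert to Cartesian coordinates and plot as an ordinary surface
x = r * np.cos(theta)
y = r * np.sin(theta)

fig = plt.figure()
ax = fig.add_subplot(projection='3d')
ax.plot_surface(x, y, z)
plt.show()
```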
I have generated a graph using a basic function -
plt.plot(tm, o1)
tm is a list of all x coordinates and o1 is a list of all y coordinates
NOTE
there is no specific function such as y = f(x); rather, a certain y value remains constant for a given range of x (see figure for clarity).
My question is how to integrate this function, either using the matplotlib figure or using the lists (tm and o1)
The integral corresponds to computing the area under the curve.
The easiest way to compute (or approximate) the integral numerically is the rectangle rule, which approximates the area under the curve by summing the areas of rectangles (see https://en.wikipedia.org/wiki/Numerical_integration#Quadrature_rules_based_on_interpolating_functions).
In your case it is quite straightforward, since it is a step function.
First, I recommend using numpy arrays instead of lists (handier for numerical computing):
import matplotlib.pyplot as plt
import numpy as np
x = np.array([0,1,3,4,6,7,8,11,13,15])
y = np.array([8,5,2,2,2,5,6,5,9,9])
plt.plot(x,y)
Then, we compute the width of rectangles using np.diff():
w = np.diff(x)
Then, the height of the same rectangles (multiple possibilities exist):
h = y[:-1]
Here I chose the first of each pair of successive y values, so the top-left corner of each rectangle lies on the curve; for a step function that holds its value until the next x, this is exact. Alternatively, you can take the mean of each pair of successive y values, h = (y[1:]+y[:-1])/2, in which case the midpoint of the top of each rectangle coincides with the curve (this is the trapezoidal rule).
Then you just need to multiply and sum:
area = (w*h).sum()
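All the steps combined into one runnable snippet, using the sample data above; for this particular step data, both height choices happen to give the same area:

```python
import numpy as np

x = np.array([0, 1, 3, 4, 6, 7, 8, 11, 13, 15], dtype=float)
y = np.array([8, 5, 2, 2, 2, 5, 6, 5, 9, 9], dtype=float)

w = np.diff(x)   # rectangle widths
h = y[:-1]       # left-edge heights (exact for this step function)
area = (w * h).sum()
print(area)      # 77.0

# midpoint-height variant (equivalent to the trapezoidal rule)
h_mid = (y[1:] + y[:-1]) / 2
area_mid = (w * h_mid).sum()
print(area_mid)  # also 77.0 for this data
```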
I am trying to generate a contour plot of the gravitational potential of a 2 body system. The input is the masses of the 2 bodies and their separation. I keep receiving the error
TypeError: Input z must be at least a 2x2 array.
which I assume is referring to the PHI term in ax.contour(X,Y,PHI).
I have tried changing the x, y, and phi so that they match (I don't understand why they don't already match because phi is generated from the x and y values). Also, I have never used the contour plot before and I have limited programming experience so please forgive my ignorance.
import matplotlib.pyplot as plt
import numpy as np
n=100
#x and y evenly spaced values
x=np.arange(-n,n,0.01)
y=np.arange(-n,n,0.01)
X,Y=np.meshgrid(x,y)
r=np.array([X,Y])
#r is the scalar distance from the center that each test particle resides at
#r=np.array([(X**2+Y**2)**0.5])<--this was used to generate the output image
def Lagrange(m1,m2,a):
    mtot=m1+m2
    #the distance from the bodies to the center of mass of the system
    x1=-(m2/mtot)*a
    x2=(m1/mtot)*a
    omsq=mtot/(a**3) #omega squared term
    def phi(r):#gravitational potential function
        phi= -m1/abs(r-x1)-m2/abs(r-x2)-0.5*omsq*r**2
        return phi
    #I also had a vector plot included in this code (among other details),
    #but decided to omit them as it's not relevant to the error I am receiving.
    fig=plt.figure()
    ax=fig.add_subplot(111)
    PHI=np.meshgrid(phi(r))
    ax.contour(X,Y,PHI) #3rd dimension is the contour lines
    plt.show()

Lagrange(3.0,1.0,1.0)
I expected that taking the input x and y coordinates of the system (their lengths match) together with the values generated by the phi function (which should match the shape of x and y) would generate a contour plot in which the contour lines represent the gravitational potential given by phi.
Here is the pair of images I mentioned in the comments. python code (left) vs working matlab code (right)
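A sketch of one likely repair (not a confirmed answer from the thread): evaluate the potential elementwise on the X, Y meshgrid so that PHI is already a 2-D array and the extra np.meshgrid call is unnecessary, use np.hypot for the true planar distance to each body instead of abs(r - x), and use a smaller grid than n=100 with step 0.01 (which would allocate a 20000x20000 mesh). The extent and contour levels below are illustrative assumptions:

```python
import numpy as np
import matplotlib.pyplot as plt

def lagrange_contours(m1, m2, a, extent=4.0, step=0.01):
    mtot = m1 + m2
    # positions of the two bodies relative to the center of mass
    x1 = -(m2 / mtot) * a
    x2 = (m1 / mtot) * a
    omsq = mtot / a**3  # omega squared term (G = 1 units)

    x = np.arange(-extent, extent, step)
    y = np.arange(-extent, extent, step)
    X, Y = np.meshgrid(x, y)

    # effective potential evaluated on the 2-D grid; np.hypot gives the
    # planar distance to each body, and the result is already a 2-D
    # array, so no np.meshgrid call is needed before contouring
    with np.errstate(divide='ignore'):
        PHI = (-m1 / np.hypot(X - x1, Y)
               - m2 / np.hypot(X - x2, Y)
               - 0.5 * omsq * (X**2 + Y**2))

    fig, ax = plt.subplots()
    ax.contour(X, Y, PHI, levels=np.linspace(-5.0, -1.5, 30))
    plt.show()
    return PHI

PHI = lagrange_contours(3.0, 1.0, 1.0)
```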
I have a very simple script with which I'm trying to plot 2 points with a set size:
from mayavi.mlab import *
x = [0.,3.]
y = [0.,0.]
z = [0.,0.]
scalars = [1.5,1.5]
pts = points3d(x, y, z, scalars, scale_factor = 1)
However, I can't figure out how, with this simple example, to set the size of the two points so that the points just touch each other. I want to set the size in the same units as the coordinates of the points. So I separate the two points by 3 units and set the size of the two points to 1.5.
However, in the image attached, the two points don't touch like expected.
Any idea why?
In mayavi, the scale of spheres determines their diameter and not their radius.
Use
pts = points3d(x, y, z, scalars, scale_factor=2, resolution=100)
The resolution argument makes a smoother sphere (it sets the number of angular points). Beware of high resolution values if you intend to display many spheres.
I want to create a small simulation, and I think I know how, but in order to actually see what happens I need to visualize it.
I started with a 5x5x5 array, which I want to fill up with values.
data = numpy.zeros(shape=(5,5,5))
data[:,:,0]=4
data[:,:,1]=3
data[:,:,2]=2
data[:,:,3]=1
data[:,:,4]=0
This should create a cube which has a gradient in the upward direction (if the third axis is z).
Now, how can I plot this? I don't want a surface plot or a wireframe, just a point at each coordinate, maybe color-coded or made transparent according to its value.
As a test I tried plotting all coordinates using
ax.scatter(numpy.arange(5),numpy.arange(5),numpy.arange(5))
but this will only plot a line consisting of 5 dots.
So... how can I get the 125 dots, that I want to create?
Thx.
You can encode the value in color like this:
import numpy as np
import matplotlib.pyplot as plt
x = np.arange(5)
X, Y, Z = np.meshgrid(x, x, x)
v = np.arange(125)  # one value per point, encoded as color
ax = plt.figure().add_subplot(projection='3d')
ax.scatter(X, Y, Z, c=v)
See here for the documentation.
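Applied to the 5x5x5 gradient cube from the question, a sketch that colors each of the 125 points by its own value (the alpha value is an illustrative choice for transparency):

```python
import numpy as np
import matplotlib.pyplot as plt

data = np.zeros(shape=(5, 5, 5))
for k in range(5):
    data[:, :, k] = 4 - k  # the gradient cube from the question

# one (x, y, z) triple per array element; indexing='ij' keeps the
# coordinate order aligned with the array axes
X, Y, Z = np.meshgrid(np.arange(5), np.arange(5), np.arange(5), indexing='ij')

ax = plt.figure().add_subplot(projection='3d')
# flatten everything to 125 points; c= colors by value, alpha adds transparency
sc = ax.scatter(X.ravel(), Y.ravel(), Z.ravel(), c=data.ravel(), alpha=0.6)
plt.colorbar(sc)
plt.show()
```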
I have a set of x, y, z data with a constraint on x and y such that y < x:
I would like to draw contours of these data on a 2D plot. I first interpolated the data with the scipy.interpolate.SmoothBivariateSpline class and then drew contours. But to do that, I defined a rectangular grid from the range of x and y and used that grid for both the interpolation and the contour plot. How can I avoid using a rectangular grid? Is there an equivalent of the meshgrid function for a triangular grid that can be used to plot contours?
At the end I obtain the following plot. I added a filled triangle to hide the irrelevant data.
But if you look at the result of the interpolation, the splines diverge where there is no data. It is not a problem as I do not use that region, but I would prefer to use a grid matching the constraint on x and y:
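One way to avoid the rectangular grid entirely is matplotlib's triangulation support: triangulate the scattered points themselves and contour with tricontour, so nothing is ever evaluated outside the data region. A sketch with synthetic data satisfying the constraint (the sample function and point count are assumptions):

```python
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.tri as mtri

rng = np.random.default_rng(0)

# synthetic scattered points satisfying the constraint y < x
x = rng.uniform(0.0, 1.0, 200)
y = x * rng.uniform(0.0, 1.0, 200)
z = np.sin(3 * x) + np.cos(3 * y)  # placeholder for the real z data

# triangulate the scattered points directly: the Delaunay triangulation
# only covers the convex hull of the data, so the spline divergence
# outside the y < x region never appears
tri = mtri.Triangulation(x, y)

fig, ax = plt.subplots()
ax.tricontourf(tri, z)
ax.tricontour(tri, z, colors='k', linewidths=0.5)
plt.show()
```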