I'm making a Python script that needs to find the length of an arc, where it is given this information:
center of arc: x1, y1
start point of arc: x2, y2
end point of arc: x3, y3
direction: cw or ccw
So far I have been able to calculate the radius successfully, and I tried calculating the angle using the code below.
But for any arc with an angle greater than pi radians (180 degrees), it returns the inside angle instead, which is mathematically correct but not what I want.
What is the correct way, knowing the radius and these three points, to find the angle of the arc over the full range of 0 to 2*pi radians (0 to 360 degrees), going in either the clockwise or counterclockwise direction? (It can be either, and I need to be able to handle both cases.)
Code:
# code to find theta
import numpy as np

aVector = np.array([x1 - x2, y1 - y2])  # vector from the start point to the center
bVector = np.array([x1 - x3, y1 - y3])  # vector from the end point to the center
aMag = np.linalg.norm(aVector)
bMag = np.linalg.norm(bVector)
theta = np.arccos(np.dot(aVector, bVector) / (aMag * bMag))
As you can see, I'm using arccos, which to my dismay only outputs angles between 0 and 180 degrees.
Solution/Working code:
# equation for angle using atan2
import math

start = math.atan2(y2 - y1, x2 - x1)  # angle of the start point relative to the center
end = math.atan2(y3 - y1, x3 - x1)    # angle of the end point relative to the center
if gcodeAnalysis[tempLineNum][4] == "G3":  # going CW
    start, end = end, start
tau = 2.0 * math.pi
theta = math.fmod(math.fmod(end - start, tau) + tau, tau)  # positive angle in [0, tau)
Working Values:
X1 = 0.00048399999999998444
Y1 = 0.0002720000000007161
X2 = 0.378484
Y2 = -14.694728
X3 = 3.376
Y3 = -14.307
Proper result/value
Theta = 6.077209477545957
Assume this arc was done CCW
As you noticed, the range of math.acos is [0, pi], making it rather useless for telling you the relative directions of the vectors. To get full circular information about a pair of angles, you can use math.atan2. While regular math.atan has a range of [-pi/2, pi/2], atan2 takes the y and x components as separate arguments and returns an angle in the range (-pi, pi]. You can compute the angles relative to any reference, not necessarily relative to each other:
start = math.atan2(y2 - y1, x2 - x1)
end = math.atan2(y3 - y1, x3 - x1)
Now you can use some common formulae to find the difference between the angles in whatever direction you want. I've implemented some of these in a small utility library I made called haggis. The specific function you want is haggis.math.ang_diff_pos.
First, the "manual" computation:
if direction == 'cw':
    start, end = end, start
tau = 2.0 * math.pi
angle = math.fmod(math.fmod(end - start, tau) + tau, tau)
If you want to use my function, you can do
if direction == 'cw':
    start, end = end, start
angle = ang_diff_pos(start, end)
All of these operations can be easily vectorized using numpy if you find yourself dealing with many points all at once.
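For example, here is a minimal sketch of what the vectorized version might look like; the array names and sample values are made up for illustration:

import numpy as np

# hypothetical angles (radians) for several arcs, e.g. obtained with np.arctan2
starts = np.array([0.1, 2.5, -3.0])
ends = np.array([1.2, -2.5, 3.0])
cw = np.array([False, True, False])  # per-arc direction flags

# swap start and end wherever the arc runs clockwise
starts, ends = np.where(cw, ends, starts), np.where(cw, starts, ends)

tau = 2.0 * np.pi
theta = (ends - starts) % tau  # numpy's % already wraps into [0, tau)
print(theta)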
You can use the cross product of the two vectors to determine whether you need to rotate clockwise or counterclockwise to get from one vector to the other.
See the code below:
import numpy as np
from numpy import linalg as LA

x1, y1 = 0, 0      # center
x2, y2 = 2, 0      # start point
x3, y3 = 2, -2     # end point
direction = 'ccw'  # or 'cw'

v1 = np.array([x2 - x1, y2 - y1])
v2 = np.array([x3 - x1, y3 - y1])

# if the (2D) cross product is positive, v1 rotates counterclockwise to reach v2
rot = np.cross(v1, v2)
vdir = 'ccw' if rot > 0 else 'cw'

# inside angle from the dot product
r = (v1[0] * v2[0] + v1[1] * v2[1]) / (LA.norm(v1) * LA.norm(v2))
deg = np.arccos(r) / np.pi * 180

# if the requested direction disagrees with the cross product, take the reflex angle
if vdir != direction:
    deg = 360 - deg
print(deg)
I am making a geometry interface in python (currently using tkinter) but I have stumbled upon a major problem: I need a function that is able to return a point, that is at a certain angle with a certain line segment, is a certain length apart from the vertex of the angle. We know the coordinates of the points of the line segment, and also the angle at which we want the point to be. I have attached an image below for a more graphical view of my question.
The problem: I can calculate it using trigonometry, where
x, y = vertex.getCoords()
endx = x + length * cos(radians(angle))
endy = y + length * sin(radians(angle))
p = Point(endx, endy)
The angle I input is in degrees. That calculation is only true when the line segment is parallel to the abscissa; otherwise the angles I get back are very strange, to say the least. I want the function to work wherever the first two points are on the tkinter canvas and whatever the angle is, and I am very lost as to what I should do to fix it.
What I found out: the output is a point that, when connected to the vertex, makes a line at the desired angle to the abscissa. So when the first arm (leg, shoulder) of the angle is parallel to the abscissa, the function runs flawlessly (because of alternate angles, the Z formation). As soon as I make it not parallel, it becomes weird. This is because we are taking the y of the vertex, not the point where the foot of the perpendicular lands (C1 on the attached image). I am pretty good at math, so feel free to post more technical solutions; I will understand them.
EDIT: Just a quick recap of my question: how should I construct a point that is at a certain angle from a given line segment? I have already made functions that create the angle with respect to the X and Y axes, but I have no idea how to do it with respect to the input line. Some code for the two functions:
from math import cos, sin, radians, pi

def inRespectToXAxis(vertex, angle, length):
    # angle is measured from the positive X axis, in degrees
    x, y = vertex.getCoords()
    newx = x + length * cos(radians(angle))
    newy = y + length * sin(radians(angle))
    p = Point(abs(newx), abs(newy))
    return p

def inRespectToYAxis(vertex, length, angle):
    # angle is measured from the positive Y axis, in degrees
    x, y = vertex.getCoords()
    theta_rad = pi / 2 - radians(angle)
    newx = x + length * cos(theta_rad)
    newy = y + length * sin(theta_rad)
    p = Point(newx, newy)
    return p
It seems you need to add the line segment's angle to get the proper result. You can calculate it from the segment end coordinates (x1, y1) and (x2, y2):
lineAngle = math.atan2(y2 - y1, x2 - x1)
The result is in radians, so apply it as
endx = x1 + length * cos(radians(angle) + lineAngle)
and similarly for endy.
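Putting the pieces together, a minimal sketch (the function name is just a placeholder; it returns plain coordinates rather than a Point):

import math

def point_at_angle_from_segment(x1, y1, x2, y2, angle_deg, length):
    # angle of the segment (x1, y1) -> (x2, y2) relative to the x axis
    line_angle = math.atan2(y2 - y1, x2 - x1)
    # add the segment angle so angle_deg is measured from the segment, not the x axis
    endx = x1 + length * math.cos(math.radians(angle_deg) + line_angle)
    endy = y1 + length * math.sin(math.radians(angle_deg) + line_angle)
    return endx, endy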
I face a seemingly simple problem that somehow I haven't managed to solve, despite looking into several trigonometry and geometry intros.
I have a 2D space in which x=0, y=0 is the centre. Given a position x1, y1 (the coordinates of one end of a segment), a length, and an angle (0 denoting a vertical line), I would like to find the coordinates of the other end of the segment.
In other words, being able to move from one set of parameters (x1; y1; angle; length) to (x1; y1; x2; y2) and vice versa.
Thanks a lot,
For this you want to use sine and cosine. Here is some example code:
from math import cos, sin, radians

a = radians(45)    # angle, measured from vertical (0 = straight up)
l = 10             # length of the segment
x1, y1 = (10, 15)  # known end of the segment

x2 = x1 + sin(a) * l
y2 = y1 + cos(a) * l
Here is an article about how and why this works.
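For the reverse direction ((x1, y1, x2, y2) back to angle and length), a sketch using the same convention as above, where 0 degrees means a vertical line:

from math import atan2, hypot, degrees

x1, y1 = 10, 15
x2, y2 = 17.07, 22.07  # roughly the other end produced by the example above

length = hypot(x2 - x1, y2 - y1)
# note atan2(dx, dy), not atan2(dy, dx), because the angle is measured from vertical
angle = degrees(atan2(x2 - x1, y2 - y1))
print(length, angle)  # ~10, ~45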
I'm working with 3D images and have to rotate them according to Euler angles (phi, psi, theta) in the 'zxz' convention (these Euler angles are part of a dataset, so I have to use that convention). I found the function scipy.ndimage.rotate, which seems useful in that regard.
arrayR = scipy.ndimage.rotate(array , phi, axes=(0,1), reshape=False)
arrayR = scipy.ndimage.rotate(arrayR, psi, axes=(1,2), reshape=False)
arrayR = scipy.ndimage.rotate(arrayR, the, axes=(0,1), reshape=False)
Sadly, this does not do what was intended. Here is why:
Definition:
In the z-x-z convention, the x-y-z frame is rotated three times: first
about the z-axis by an angle phi; then about the new x-axis by an
angle psi; then about the newest z-axis by an angle theta.
However, with the above code the rotations are always performed with respect to the original axes, which is why the obtained rotations are not correct. Does anyone have a suggestion for obtaining the correct rotations, as explained in the definition?
In other words, in the present 'zxz' convention the rotations are intrinsic (rotations about the axes of the rotating coordinate system XYZ, solidary with the moving body, which changes its orientation after each elemental rotation). With the above code, the rotations are extrinsic (rotations about the axes xyz of the original coordinate system, which is assumed to remain motionless). I need a way of performing the rotations as described in the definition, in Python.
I found a satisfying solution following this link: https://nbviewer.jupyter.org/gist/lhk/f05ee20b5a826e4c8b9bb3e528348688
This method uses np.meshgrid and scipy.ndimage.map_coordinates. The linked notebook uses a third-party library for generating the rotation matrix, whereas I use scipy.spatial.transform.Rotation. This class lets you define both intrinsic and extrinsic rotations: see the description of scipy.spatial.transform.Rotation.from_euler.
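For reference, scipy uses lowercase axis letters for extrinsic rotations and uppercase letters for intrinsic ones; a tiny illustration (the angles are arbitrary):

from scipy.spatial.transform import Rotation as R

# extrinsic z-x-z: each rotation is about the fixed axes of the original frame
r_ext = R.from_euler('zxz', [30, 20, 10], degrees=True)

# intrinsic Z-X-Z: each rotation is about the axes of the already-rotated frame
r_int = R.from_euler('ZXZ', [30, 20, 10], degrees=True)

print(r_ext.as_matrix())
print(r_int.as_matrix())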
Here is my function:
import numpy as np
from scipy.spatial.transform import Rotation as R
from scipy.ndimage import map_coordinates
# Rotates 3D image around image center
# INPUTS
# array: 3D numpy array
# orient: list of Euler angles (phi,psi,the)
# OUTPUT
# arrayR: rotated 3D numpy array
# by E. Moebel, 2020
def rotate_array(array, orient):
    phi = orient[0]
    psi = orient[1]
    the = orient[2]

    # create meshgrid
    dim = array.shape
    ax = np.arange(dim[0])
    ay = np.arange(dim[1])
    az = np.arange(dim[2])
    coords = np.meshgrid(ax, ay, az)

    # stack the meshgrid to position vectors, center them around 0 by subtracting dim/2
    xyz = np.vstack([coords[0].reshape(-1) - float(dim[0]) / 2,  # x coordinate, centered
                     coords[1].reshape(-1) - float(dim[1]) / 2,  # y coordinate, centered
                     coords[2].reshape(-1) - float(dim[2]) / 2]) # z coordinate, centered

    # create transformation matrix
    r = R.from_euler('zxz', [phi, psi, the], degrees=True)
    mat = r.as_matrix()

    # apply transformation
    transformed_xyz = np.dot(mat, xyz)

    # extract coordinates
    x = transformed_xyz[0, :] + float(dim[0]) / 2
    y = transformed_xyz[1, :] + float(dim[1]) / 2
    z = transformed_xyz[2, :] + float(dim[2]) / 2

    x = x.reshape((dim[1], dim[0], dim[2]))
    y = y.reshape((dim[1], dim[0], dim[2]))
    z = z.reshape((dim[1], dim[0], dim[2]))  # reason for the strange ordering: see next line

    # the coordinate system seems to be strange, it has to be ordered like this
    new_xyz = [y, x, z]

    # sample the original array at the transformed coordinates
    arrayR = map_coordinates(array, new_xyz, order=1)
    return arrayR
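A quick usage sketch (the volume and angles are made up, just to show the call):

import numpy as np

volume = np.random.rand(40, 40, 40)                  # hypothetical 3D image
rotated = rotate_array(volume, [30.0, 20.0, 10.0])   # (phi, psi, the) in degrees
print(rotated.shape)  # (40, 40, 40)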
Note:
You can also use this function for intrinsic rotations; simply adapt the first argument of from_euler to your Euler convention. In that case you obtain a result equivalent to my first post (using scipy.ndimage.rotate). However, I noticed that the present code is about 3x faster (0.01 s for a 40^3 volume versus 0.03 s with scipy.ndimage.rotate).
Hope this will help someone!
There seems to be a bit of confusion about the "axes" parameter in your first post. To do a rotation about the x axis, the plane of rotation would be the yz plane, which means your "axes" parameter should be set to (1, 2). Also, the first and the third rotations are presumably about the x and z axes, but both of your rotations are in the xy plane. Could these possibly be the reasons behind the discrepancies in your answers? I am not convinced by your explanation about the new and original axes. The independent calls to the "rotate" function do not have access to the old data in any form or shape; each call only sees the new axes and the rotated array.
I checked the code at https://nbviewer.jupyter.org/gist/lhk/f05ee20b5a826e4c8b9bb3e528348688
There is a minor bug: the tested image is square, but a rectangular image will run into problems. Below are corrected versions for 2D and 3D rotations (note that the Euler angle sequence used in my example is 'ZYZ'; you should define this before using it):
import numpy as np
from scipy.ndimage import map_coordinates
from scipy.spatial.transform import Rotation as R

def rotate_array_2D(array, orient):
    # create a transformation matrix
    angle = orient / 180. * np.pi
    c = np.cos(angle)
    s = np.sin(angle)
    mat = np.array([[c, s], [-s, c]])

    # create meshgrid
    dim = array.shape
    ax = np.arange(dim[0])
    ay = np.arange(dim[1])
    coords = np.meshgrid(ax, ay)

    # stack the meshgrid to position vectors, center them around 0 by subtracting dim/2
    xy = np.vstack([coords[0].reshape(-1) - float(dim[0]) / 2,  # x coordinate, centered
                    coords[1].reshape(-1) - float(dim[1]) / 2]) # y coordinate, centered

    # apply transformation
    transformed_xy = np.dot(mat, xy)

    # extract coordinates
    x = transformed_xy[0, :] + float(dim[0]) / 2
    y = transformed_xy[1, :] + float(dim[1]) / 2
    x = x.reshape((dim[1], dim[0]))
    y = y.reshape((dim[1], dim[0]))
    new_xy = [x, y]

    # sample
    arrayR = map_coordinates(array, new_xy, order=1).T
    return arrayR
def rotate_array_3D(array, orient):
    rot = orient[0]
    tilt = orient[1]
    phi = orient[2]

    # create meshgrid
    dim = array.shape
    ax = np.arange(dim[0])
    ay = np.arange(dim[1])
    az = np.arange(dim[2])
    coords = np.meshgrid(ax, ay, az)

    # stack the meshgrid to position vectors, center them around 0 by subtracting dim/2
    xyz = np.vstack([coords[0].reshape(-1) - float(dim[0]) / 2,  # x coordinate, centered
                     coords[1].reshape(-1) - float(dim[1]) / 2,  # y coordinate, centered
                     coords[2].reshape(-1) - float(dim[2]) / 2]) # z coordinate, centered

    # create transformation matrix
    r = R.from_euler('ZYZ', [rot, tilt, phi], degrees=True)
    mat = r.as_matrix()

    # apply transformation
    transformed_xyz = np.dot(mat, xyz)

    # extract coordinates
    x = transformed_xyz[0, :] + float(dim[0]) / 2
    y = transformed_xyz[1, :] + float(dim[1]) / 2
    z = transformed_xyz[2, :] + float(dim[2]) / 2
    x = x.reshape((dim[1], dim[0], dim[2]))
    y = y.reshape((dim[1], dim[0], dim[2]))
    z = z.reshape((dim[1], dim[0], dim[2]))  # I tested the rotation in 2D and this strange ordering can be explained
    new_xyz = [x, y, z]

    arrayR = map_coordinates(array, new_xyz, order=1).T
    return arrayR
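A quick check of the 2D version with a deliberately rectangular image (the values are arbitrary):

import numpy as np

img = np.random.rand(30, 50)          # rectangular on purpose
img_rot = rotate_array_2D(img, 25.0)  # rotate by 25 degrees
print(img_rot.shape)  # (30, 50), same as the input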
I'm programming a function in Python in Autodesk Maya (using PyMel for Maya)
I have three 3D points; p0, p1, p2.
Then they undergo a rigid transformation, so after the transformation (an affine transformation) I have their new positions: q0, q1, q2.
I also have a fourth point before the transformation, p3. I want to calculate its position after the same transformation: q3.
So I need to calculate the transformation matrix and then apply it to p3. I don't know how to do either. (list is an array of objects.)
import pymel.core as pm
import pymel.core.datatypes as dt
p0 = dt.Vector(pm.getAttr(list[0]+".tx"), pm.getAttr(list[0]+".ty"), pm.getAttr(list[0]+".tz"))
p1 = dt.Vector(pm.getAttr(list[1]+".tx"), pm.getAttr(list[1]+".ty"), pm.getAttr(list[1]+".tz"))
p2 = dt.Vector(pm.getAttr(list[2]+".tx"), pm.getAttr(list[2]+".ty"), pm.getAttr(list[2]+".tz"))
p3 = dt.Vector(pm.getAttr(list[3]+".tx"), pm.getAttr(list[3]+".ty"), pm.getAttr(list[3]+".tz"))
The 3D points are read from animated objects in the Maya scene. So at another frame,
I run this code to get
q0 = dt.Vector(pm.getAttr(list[0]+".tx"), pm.getAttr(list[0]+".ty"), pm.getAttr(list[0]+".tz"))
q1 = dt.Vector(pm.getAttr(list[1]+".tx"), pm.getAttr(list[1]+".ty"), pm.getAttr(list[1]+".tz"))
q2 = dt.Vector(pm.getAttr(list[2]+".tx"), pm.getAttr(list[2]+".ty"), pm.getAttr(list[2]+".tz"))
#q3 = TransformationMatrix between (p0,p1,p2) and (q0,q1,q2), applied to p3
I tried to calculate with vectors, but I ended up with errors due to divisions by zero...
So I figured that a transformation matrix should solve it without problems.
I've got a deadline not far ahead and I REALLY need help solving this!
PLEASE HELP!
Edit:
how to perform coordinates affine transformation using python?
I need this function "solve_affine", but it should take only 3 points from each set instead of 4. And I can't use numpy...
Here's a solution using numpy and scipy. scipy is mostly used to generate random rotations, except for scipy.linalg.norm which is easy to code oneself. The main things used from numpy are cross product and matrix multiplication, which are also easy to code oneself.
The basic idea is this: given three non-collinear points x1,x2,x3, it's possible to find an orthogonal triple of vectors (axes) v1,v2,v3, with v1 in the direction of x2-x1, v2 in the plane spanned by (x2-x1) and (x3-x1), and v3 completing the triple.
The points y1,y2,y3 are rotated and translated relative to x1,x2,x3. The axes w1,w2,w3 generated from y1,y2,y3 are rotated (i.e., no translation) from v1,v2,v3. These two sets of triples are each orthogonal, so it's easy to find the rotation from them: R = W * transpose(V)
Once we have the rotation, finding the translation is simple: y1 = R*x1 + t, so t = y1 - R*x1. It might be better to use a least-squares solver and combine all three points to get an estimate of t.
import numpy
import scipy.linalg


def rand_rot():
    """Return a random rotation

    Return a random orthogonal matrix with determinant 1"""
    q, _ = scipy.linalg.qr(numpy.random.randn(3, 3))
    if scipy.linalg.det(q) < 0:
        # does this ever happen?
        print("got a negative det")
        q[:, 0] = -q[:, 0]
    return q


def rand_noncollinear():
    """Return 3 random non-collinear vectors"""
    while True:
        b = numpy.random.randn(3, 3)
        sigma = scipy.linalg.svdvals(b)
        if sigma[2] / sigma[0] > 0.1:
            # "very" non-collinear
            break
        # "nearly" collinear; try again
    return b[:, 0], b[:, 1], b[:, 2]


def normalize(a):
    """Return argument normalized"""
    return a / scipy.linalg.norm(a)


def colstack(a1, a2, a3):
    """Stack three vectors as columns"""
    return numpy.hstack((a1[:, numpy.newaxis],
                         a2[:, numpy.newaxis],
                         a3[:, numpy.newaxis]))


def get_axes(a1, a2, a3):
    """Generate orthogonal axes from three non-collinear points"""
    # I tried to do this with QR, but something didn't work
    b1 = normalize(a2 - a1)
    b2 = normalize(a3 - a1)
    b3 = normalize(numpy.cross(b1, b2))
    b4 = normalize(numpy.cross(b3, b1))
    return b1, b4, b3


# random rotation and translation
r = rand_rot()
t = numpy.random.randn(3)

# three non-collinear points
x1, x2, x3 = rand_noncollinear()
# some other point
x4 = numpy.random.randn(3)

# the images of the above in the transformation.
# y4 is for checking only -- won't be used to estimate r or t
y1, y2, y3, y4 = [numpy.dot(r, x) + t
                  for x in (x1, x2, x3, x4)]

v1, v2, v3 = get_axes(x1, x2, x3)
w1, w2, w3 = get_axes(y1, y2, y3)

V = colstack(v1, v2, v3)
W = colstack(w1, w2, w3)

# W = R V, so R = W * inverse(V); but V orthogonal, so inverse(V) is
# transpose(V):
rfound = numpy.dot(W, V.T)

# y1 = R x1 + t, so...
tfound = y1 - numpy.dot(rfound, x1)

# get error on images of x2 and x3, just in case
y2err = scipy.linalg.norm(numpy.dot(rfound, x2) + tfound - y2)
y3err = scipy.linalg.norm(numpy.dot(rfound, x3) + tfound - y3)
# and check error image of x4 -- getting an estimate of y4 is the
# point of all of this
y4err = scipy.linalg.norm(numpy.dot(rfound, x4) + tfound - y4)

print("y2 error: ", y2err)
print("y3 error: ", y3err)
print("y4 error: ", y4err)
Both the description and your code are confusing. The description is a bit vague, while the code examples are missing important bits and pieces. So here is how I understand the question:
Knowing three points in two spaces, how do you construct a transform from space A to space B?
Image 1: How to form a transformation between 2 spaces.
The answer depends on the type of transform between the spaces. You see, three points always form a planar span. This means you can recover the rotation, translation, and uniform scale of the new space. You can also recover the shear and nonuniform scale within that plane. However, you cannot know what the shear or nonuniform scale would be in the plane's normal direction.
Therefore, to make sense, the question becomes: how do you rotate and translate two spaces so they match? This is pretty easy to do. The translation part is simply:
trans = q0 - p0
That leaves you with rotation which has been explained in several posts:
python + maya: Rotate Y axis to be along vector
How to convert three dimensional vector to an Euler rotation in software like Maya using python
You can also calculate a scaling factor after this.
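A minimal sketch of that scaling step, assuming uniform scale and using the distance between the first two points of each set (the helper is hypothetical, not from the posts above):

import math

def vlen(v):
    # works for pymel dt.Vector or anything with .x/.y/.z attributes
    return math.sqrt(v.x * v.x + v.y * v.y + v.z * v.z)

scale = vlen(q1 - q0) / vlen(p1 - p0)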
I've figured it out
p0p1 = p1 - p0
p0p2 = p2 - p0
p0p3 = p3 - p0
q0q1 = q1 - q0
q0q2 = q2 - q0

before = dt.Matrix([p0.x, p0.y, p0.z, 0], [p1.x, p1.y, p1.z, 0], [p2.x, p2.y, p2.z, 0], [0, 0, 0, 1])
after = dt.Matrix([q0.x, q0.y, q0.z, 0], [q1.x, q1.y, q1.z, 0], [q2.x, q2.y, q2.z, 0], [0, 0, 0, 1])

# project p3 onto the plane p0p1p2
normal = p0p1.cross(p0p2).normal()
dist = p0p3.dot(normal)
q3 = p3 - dist * normal

# transform the projected point
transformMatrix = before.inverse() * after
solve = dt.Matrix(q3.x, q3.y, q3.z, 1) * transformMatrix
q3 = dt.Vector(solve[0][0], solve[0][1], solve[0][2])

# move it back out of the plane by the same distance, along the new normal
newNormal = q0q1.cross(q0q2).normal()
q3 = q3 + newNormal * dist

pm.move(list[3], q3, r=False)
The transformation matrix only worked for points that are within the plane p0p1p2, so I solved it by transforming the projection of p3 onto that plane, then moving the result back out of the plane by the same distance.
If you have a solution that only involves a matrix, feel free to share, it may still help me! :)