I am using Open3D in Python to cast shadows and determine intersections on an object. In the example below I use a 2-twist Möbius strip from the Open3D library as the object and create a ray for each point sampled on the strip. Each ray's origin is a point on the object, and the direction is the same for all rays: [1, 0, 0]. Thus, roughly speaking, things to the left (negative x direction) should generally intersect with the object, and things to the right (positive x direction) generally will not. On a macro level I do get this result, as you can see in the two images below, but the lit (yellow) section is very spotty for some reason. I have tried this with several shapes and get the same result. Why does the raycasting in Open3D generate such an incorrect and spotty intersection result?
Code:
import numpy as np
import open3d as o3d
from scipy.interpolate import griddata
import plotly.graph_objects as go
import plotly.io as pio

# create mobius mesh and sample points on it
mesh = o3d.geometry.TriangleMesh.create_mobius(twists=2)
mesh.compute_vertex_normals()
pcd = mesh.sample_points_uniformly(number_of_points=1000000)
points = np.asarray(pcd.points)

# create a scene and add the triangle mesh for ray tracing
cube = o3d.t.geometry.TriangleMesh.from_legacy(mesh)
scene = o3d.t.geometry.RaycastingScene()
cube_id = scene.add_triangles(cube)

# create the ray direction, shared by all rays
ray = np.array([1.0, 0.0, 0.0])
ray = ray / np.linalg.norm(ray)

# one row per sampled point, every row holding the same direction
directions = np.tile(ray, (len(points), 1))

# rays: origin at each point of the mobius strip, same direction for all
tensorrays = np.hstack([points, directions])
rays = o3d.core.Tensor([[tensorrays]], dtype=o3d.core.Dtype.Float32)
ans = scene.cast_rays(rays)

# determine if each ray intersected the object (1 = no hit, 0 = hit)
intersections = ans['t_hit'].numpy()[0][0]
intersections[intersections == float('inf')] = 1
intersections[intersections != 1] = 0
pts = 1000000
[x, y] = np.meshgrid(np.linspace(np.min(points[:, 0]), np.max(points[:, 0]), int(np.sqrt(pts))),
                     np.linspace(np.min(points[:, 1]), np.max(points[:, 1]), int(np.sqrt(pts))))
z = griddata((points[:, 0], points[:, 1]), points[:, 2], (x, y), method='linear', rescale=True)
color = griddata((points[:, 0], points[:, 1]), intersections, (x, y), method='linear', rescale=True)
#create surface for mobius, colored by 0's and 1's for ray intersection
trace = go.Surface(x=x,y=y,z=z, surfacecolor=color)
fig_data=[trace]
#plot
layout=go.Layout(margin={'l': 0, 'r': 0, 'b': 0, 't': 0})
fig = go.Figure(data=fig_data, layout=layout)
path = r'C:\Users\JosephKenrick\test.html'
pio.write_html(fig, file=path, auto_open=True, validate=False)
Related
I use Shapely to work with contours. I need to expand contours of different sizes outward by a given value: not scale the contour by a certain percentage, but offset the border by the same fixed value, regardless of the size of the contour itself.
I am trying to do it like this:
from shapely.geometry import Polygon, LinearRing
coords = [(30.3283020760901, 59.929340439331035), (30.32625283518726, 59.929669569762034), (30.326617897500824, 59.93065894162025), (30.328354001537814, 59.93056342794558), (30.329838363175877, 59.93089851628186), (30.330225213253033, 59.929729335995624), (30.3283020760901, 59.929340439331035)]
poly_B = Polygon(coords)
poly_A = poly_B.buffer(0.005, quad_segs=10.0, cap_style=1, join_style=2, mitre_limit=10.0)
Or like this:
r = LinearRing(coords)
poly_B = Polygon(r)
poly_A = Polygon(r.buffer(0.005).exterior, [r])
But every time I get a contour in which the Y coordinate is doubled (see image).
Help me figure out where I'm wrong.
I need the margin of the larger contour to be uniform relative to the smaller one.
I'm trying to scale a QPolygonF that is on a QGraphicsScene's QGraphicsView around its origin.
However, even after translating the polygon (poly_2) to its origin (using QPolygon.translate() and the center coordinates of the polygon obtained via boundingRect as (x+width)/2 and (y+height)/2), the new polygon is still placed at the wrong location.
The blue polygon should be scaled relative to the origin of poly_2 (please see the image below: black is the original polygon, blue is the result of the code below, and orange represents the intended outcome).
I thought the issue might be that the coordinates are global and should be local, yet this does not solve the issue, unfortunately.
Here's the code:
import sys
import PyQt5
from PyQt5 import QtCore
from PyQt5.QtCore import *  # QPointF, QRectF
from PyQt5.QtGui import *  # QPainterPath, QPolygonF, QBrush, QPen, QFont, QColor, QTransform
from PyQt5.QtWidgets import *  # QApplication, QGraphicsScene, QGraphicsView, QGraphicsSimpleTextItem
poly_2_coords= [PyQt5.QtCore.QPointF(532.35, 274.98), PyQt5.QtCore.QPointF(525.67, 281.66), PyQt5.QtCore.QPointF(518.4, 292.58), PyQt5.QtCore.QPointF(507.72, 315.49), PyQt5.QtCore.QPointF(501.22, 326.04), PyQt5.QtCore.QPointF(497.16, 328.47), PyQt5.QtCore.QPointF(495.53, 331.71), PyQt5.QtCore.QPointF(488.24, 339.02), PyQt5.QtCore.QPointF(480.94, 349.56), PyQt5.QtCore.QPointF(476.09, 360.1), PyQt5.QtCore.QPointF(476.89, 378.76), PyQt5.QtCore.QPointF(492.3, 393.35), PyQt5.QtCore.QPointF(501.22, 398.21), PyQt5.QtCore.QPointF(527.17, 398.21), PyQt5.QtCore.QPointF(535.28, 390.1), PyQt5.QtCore.QPointF(540.96, 373.89), PyQt5.QtCore.QPointF(539.64, 356.93), PyQt5.QtCore.QPointF(541.46, 329.0), PyQt5.QtCore.QPointF(543.39, 313.87), PyQt5.QtCore.QPointF(545.83, 300.89), PyQt5.QtCore.QPointF(545.83, 276.56), PyQt5.QtCore.QPointF(543.39, 267.64), PyQt5.QtCore.QPointF(537.81, 268.91)]
def main():
    app = QApplication(sys.argv)
    scene = QGraphicsScene()
    view = QGraphicsView(scene)
    pen = QPen(QColor(0, 20, 255))
    scene.addPolygon(QPolygonF(poly_2_coords))
    poly_2 = QPolygonF(poly_2_coords)
    trans = QTransform().scale(1.5, 1.5)
    #poly_22 = trans.mapToPolygon(QRect(int(poly_2.boundingRect().x()), int(poly_2.boundingRect().y()), int(poly_2.boundingRect().width()), int(poly_2.boundingRect().height())))
    #trans.mapToPolygon()
    #scene.addPolygon(QPolygonF(poly_22), QPen(QColor(0, 20, 255)))
    poly_2.translate((poly_2.boundingRect().x() + poly_2.boundingRect().width()) / 2,
                     (poly_2.boundingRect().y() + poly_2.boundingRect().height()) / 2)
    print(f'poly_2.boundingRect().x() {poly_2.boundingRect().x()}+poly_2.boundingRect().width(){poly_2.boundingRect().width()}')
    trans = QTransform().scale(1.4, 1.4)
    #poly_2.setTransformOriginPoint()
    poly_22 = trans.map(poly_2)
    scene.addPolygon(poly_22, QPen(QColor(0, 20, 255)))
    view.show()
    sys.exit(app.exec_())


if __name__ == "__main__":
    main()
Edit: I've tried saving the polygon as a QGraphicsItem, setting its transformation origin point to the bounding rect's middle X, Y, and then mapping from global to scene coordinates, yet no luck: the new polygon is still drawn at the wrong place.
poly_2 = QPolygonF(poly_2_coords)
poly = scene.addPolygon(poly_2)
point = QPoint((poly_2.boundingRect().x()+poly_2.boundingRect().width())/2,(poly_2.boundingRect().y()+poly_2.boundingRect().height())/2)
poly.setTransformOriginPoint(point)
poly.setScale(3)
If I replace point with just the X, Y of the bounding rectangle, the result seems to be closer to what I need. However, in this case the origin point is obviously wrong. Is it just random luck that this result seems closer to what I need?
Before addressing the translation problem, there is a more important aspect to consider: if you want to create a transformation based on the center of a polygon, you must first find that center. That point is called the centroid, the geometric center of any polygon.
While there are simple formulas for all basic geometric shapes, finding the centroid of a (possibly irregular) polygon with an arbitrary number of vertices is a bit more complex.
Using the arithmetic mean of vertices is not a viable option, as even in a simple square you might have multiple points on a single side, which would move the computed "center" towards those points.
The formula can be found in the Wikipedia article linked above, while a valid python implementation is available in this answer.
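For reference, this is the formula the function below implements (vertices are paired cyclically, so the last point pairs with the first):

$$A = \frac{1}{2}\sum_{i=0}^{n-1}\bigl(x_i\,y_{i+1} - x_{i+1}\,y_i\bigr)$$
$$C_x = \frac{1}{6A}\sum_{i=0}^{n-1}(x_i + x_{i+1})\bigl(x_i\,y_{i+1} - x_{i+1}\,y_i\bigr),\qquad C_y = \frac{1}{6A}\sum_{i=0}^{n-1}(y_i + y_{i+1})\bigl(x_i\,y_{i+1} - x_{i+1}\,y_i\bigr)$$

where the indices are taken modulo n, so $(x_n, y_n) = (x_0, y_0)$.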
I modified the formula of that answer in order to accept a sequence of QPoints, while improving readability and performance, but the concept remains the same:
def centroid(points):
    if len(points) < 3:
        raise ValueError('At least 3 points are required')
    # https://en.wikipedia.org/wiki/Centroid#Of_a_polygon
    # https://en.wikipedia.org/wiki/Shoelace_formula
    # computation uses concatenated pairs from the sequence, with the
    # last point paired to the first one:
    # (p[0], p[1]), (p[1], p[2]) [...] (p[n], p[0])
    area = cx = cy = 0
    p1 = points[0]
    for p2 in points[1:] + [p1]:
        shoelace = p1.x() * p2.y() - p2.x() * p1.y()
        area += shoelace
        cx += (p1.x() + p2.x()) * shoelace
        cy += (p1.y() + p2.y()) * shoelace
        p1 = p2
    A = 0.5 * area
    factor = 1 / (6 * A)
    return cx * factor, cy * factor
Then, you have two options, depending on what you want to do with the resulting item.
Scale the item
In this case, you create a QGraphicsPolygonItem like the original one, then set its transform origin point using the formula above, and scale it:
poly_2 = QtGui.QPolygonF(poly_2_coords)
item2 = scene.addPolygon(poly_2, QtGui.QPen(QtGui.QColor(0, 20, 255)))
item2.setTransformOriginPoint(*centroid(poly_2_coords))
item2.setScale(1.5)
Use a QTransform
With Qt transformations some special care must be taken, as scaling always uses 0, 0 as origin point.
To scale around a specified point, you must first translate the matrix to that point, then apply the scale, and finally restore the matrix translation to its origin:
poly_2 = QtGui.QPolygonF(poly_2_coords)
cx, cy = centroid(poly_2_coords)
trans = QtGui.QTransform()
trans.translate(cx, cy)
trans.scale(1.5, 1.5)
trans.translate(-cx, -cy)
poly_2_scaled = trans.map(poly_2)
scene.addPolygon(poly_2_scaled, QtGui.QPen(QtGui.QColor(0, 20, 255)))
This is exactly what QGraphicsItems do when using the basic setScale() and setRotation() transformations.
Shape origin point and item position
Remember that QGraphicsItems are always created with their position at 0, 0.
This might not seem obvious, especially for basic shapes: when you create a QGraphicsRectItem giving its x, y, width, height, the position will still be 0, 0. When dealing with complex geometry management, it's usually better to create basic shapes with the origin/reference at 0, 0 and then move the item to x, y.
For complex polygons like yours, a possibility could be to translate the centroid of the polygon to 0, 0, and then move the item to the actual centroid coordinates:
item = scene.addPolygon(polygon.translated(-cx, -cy))
item.setPos(cx, cy)
item.setScale(1.5)
This might make things easier for development (the mapped points will always be consistent with the item position), and the fact that you don't need to change the transform origin point anymore makes reverse mapping even simpler.
I have a set of points in a text file: random_shape.dat.
The initial order of points in the file is random. I would like to sort these points in a counter-clockwise order as follows (the red dots are the xy data):
I tried to achieve that by using the polar coordinates: I calculate the polar angle of each point (x,y) then sort by the ascending angles, as follows:
"""
Script: format_file.py
Description: This script will format the xy data file accordingly to be used with a program expecting CCW order of data points, By soting the points in Counterclockwise order
Example: python format_file.py random_shape.dat
"""
import sys
import numpy as np

# Read the file name
filename = sys.argv[1]

# Get the header name from the first line of the file (without the newline character)
with open(filename, 'r') as f:
    header = f.readline().rstrip('\n')

angles = []

# Read the data from the file
x, y = np.loadtxt(filename, skiprows=1, unpack=True)

for xi, yi in zip(x, y):
    angle = np.arctan2(yi, xi)
    if angle < 0:
        angle += 2*np.pi  # map the angle to the 0,2pi interval
    angles.append(angle)

# create a numpy array
angles = np.array(angles)

# Get the arguments of the sorted 'angles' array
angles_argsort = np.argsort(angles)

# Sort x and y
new_x = x[angles_argsort]
new_y = y[angles_argsort]

print("Length of new x:", len(new_x))
print("Length of new y:", len(new_y))

with open(filename.split('.')[0] + '_formatted.dat', 'w') as f:
    print(header, file=f)
    for xi, yi in zip(new_x, new_y):
        print(xi, yi, file=f)

print("Done!")
By running the script:
python format_file.py random_shape.dat
Unfortunately I don't get the expected results in random_shape_formatted.dat! The points are not sorted in the desired order.
Any help is appreciated.
EDIT: The expected results:
Create a new file named filename_formatted.dat that contains the sorted data according to the image above (the first line contains the starting point, and the following lines contain the points as shown by the blue arrows, in counter-clockwise direction in the image).
EDIT 2: The xy data added here instead of using github gist:
random_shape
0.4919261070361315 0.0861956168831175
0.4860816807027076 -0.06601587301587264
0.5023029456281289 -0.18238249845392662
0.5194784026079869 0.24347943722943777
0.5395164357511545 -0.3140611471861465
0.5570497147514262 0.36010146103896146
0.6074231036252226 -0.4142604617604615
0.6397066014669927 0.48590810704447085
0.7048302091822873 -0.5173701298701294
0.7499157837544145 0.5698170011806378
0.8000108666123336 -0.6199254449254443
0.8601249660418364 0.6500974025974031
0.9002010323281716 -0.7196585989767801
0.9703341483292582 0.7299242424242429
1.0104102146155935 -0.7931355765446666
1.0805433306166803 0.8102046438410078
1.1206193969030154 -0.865251869342778
1.1907525129041021 0.8909386068476981
1.2308285791904374 -0.9360074773711129
1.300961695191524 0.971219008264463
1.3410377614778592 -1.0076702085792988
1.4111708774789458 1.051499409681228
1.451246943765281 -1.0788793781975592
1.5213800597663678 1.1317798110979933
1.561456126052703 -1.1509956709956706
1.6315892420537896 1.2120602125147582
1.671665308340125 -1.221751279024005
1.7417984243412115 1.2923406139315234
1.7818744906275468 -1.2943211334120424
1.8520076066286335 1.3726210153482883
1.8920836729149686 -1.3596340023612745
1.9622167889160553 1.4533549783549786
2.0022928552023904 -1.4086186540731989
2.072425971203477 1.5331818181818184
2.1125020374898122 -1.451707005116095
2.182635153490899 1.6134622195985833
2.2227112197772345 -1.4884454939000387
2.292844335778321 1.6937426210153486
2.3329204020646563 -1.5192876820149541
2.403053518065743 1.774476584022039
2.443129584352078 -1.5433264462809912
2.513262700353165 1.8547569854388037
2.5533387666395 -1.561015348288075
2.6234718826405867 1.9345838252656438
2.663547948926922 -1.5719008264462806
2.7336810649280086 1.9858362849271942
2.7737571312143436 -1.5750757575757568
2.8438902472154304 2.009421487603306
2.883966313501766 -1.5687258953168035
2.954099429502852 2.023481896890988
2.9941754957891877 -1.5564797323888229
3.0643086117902745 2.0243890200708385
3.1043846780766096 -1.536523022432113
3.1745177940776963 2.0085143644234558
3.2145938603640314 -1.5088557654466737
3.284726976365118 1.9749508067689887
3.324803042651453 -1.472570838252656
3.39493615865254 1.919162731208186
3.435012224938875 -1.4285753640299088
3.5051453409399618 1.8343467138921687
3.545221407226297 -1.3786835891381335
3.6053355066557997 1.7260966810966811
3.655430589513719 -1.3197205824478546
3.6854876392284703 1.6130086580086582
3.765639771801141 -1.2544077134986225
3.750611246943765 1.5024152236652237
3.805715838087476 1.3785173160173163
3.850244800627849 1.2787337662337666
3.875848954088563 -1.1827449822904361
3.919007794704616 1.1336638361638363
3.9860581363759846 -1.1074537583628485
3.9860581363759846 1.0004485329485333
4.058012891753723 0.876878197560016
4.096267318663407 -1.0303482880755608
4.15638141809291 0.7443374218374221
4.206476500950829 -0.9514285714285711
4.256571583808748 0.6491902794175526
4.3166856832382505 -0.8738695395513574
4.36678076609617 0.593855765446675
4.426894865525672 -0.7981247540338443
4.476989948383592 0.5802489177489183
4.537104047813094 -0.72918339236521
4.587199130671014 0.5902272727272733
4.647313230100516 -0.667045454545454
4.697408312958435 0.6246979535615904
4.757522412387939 -0.6148858717040526
4.807617495245857 0.6754968516332154
4.8677315946753605 -0.5754260133805582
4.917826677533279 0.7163173947264858
4.977940776962782 -0.5500265643447455
5.028035859820701 0.7448917748917752
5.088149959250204 -0.5373268398268394
5.138245042108123 0.7702912239275879
5.198359141537626 -0.5445838252656432
5.2484542243955445 0.7897943722943728
5.308568323825048 -0.5618191656828015
5.358663406682967 0.8052154663518301
5.41877750611247 -0.5844972451790631
5.468872588970389 0.8156473829201105
5.5289866883998915 -0.6067217630853987
5.579081771257811 0.8197294372294377
5.639195870687313 -0.6248642266824076
5.689290953545233 0.8197294372294377
5.749405052974735 -0.6398317591499403
5.799500135832655 0.8142866981503349
5.859614235262157 -0.6493565525383702
5.909709318120076 0.8006798504525783
5.969823417549579 -0.6570670995670991
6.019918500407498 0.7811767020857934
6.080032599837001 -0.6570670995670991
6.13012768269492 0.7562308146399057
6.190241782124423 -0.653438606847697
6.240336864982342 0.7217601338055886
6.300450964411845 -0.6420995670995664
6.350546047269764 0.6777646595828419
6.410660146699267 -0.6225964187327819
6.4607552295571855 0.6242443919716649
6.520869328986689 -0.5922077922077915
6.570964411844607 0.5548494687131056
6.631078511274111 -0.5495730027548205
6.681173594132029 0.4686727666273125
6.7412876935615325 -0.4860743801652889
6.781363759847868 0.3679316979316982
6.84147785927737 -0.39541245791245716
6.861515892420538 0.25880333951762546
6.926639500135833 -0.28237987012986965
6.917336127605076 0.14262677798392165
6.946677533279001 0.05098957832291173
6.967431210462995 -0.13605442176870675
6.965045730326905 -0.03674603174603108
I find that an easy way to sort points with x,y coordinates like these is to sort them by the angle (called alpha in the example) between the horizontal and the line from each point to the center of mass of the whole polygon. The coordinates of the center of mass (x0 and y0) can easily be calculated by averaging the x,y coordinates of all points. You then calculate the angle using numpy.arccos, for instance: when y-y0 is larger than 0 you take the angle directly, otherwise you subtract the angle from 360° (2π). I have used numpy.where for the calculation of the angle and then numpy.argsort to produce a mask for indexing the initial x,y values. The following function sort_xy sorts all x and y coordinates with respect to this angle. If you want to start from any other point you could add an offset angle; in your case that would be zero, though.
def sort_xy(x, y):
    x0 = np.mean(x)
    y0 = np.mean(y)
    r = np.sqrt((x-x0)**2 + (y-y0)**2)
    angles = np.where((y-y0) > 0, np.arccos((x-x0)/r), 2*np.pi - np.arccos((x-x0)/r))
    mask = np.argsort(angles)
    x_sorted = x[mask]
    y_sorted = y[mask]
    return x_sorted, y_sorted
Plotting x, y before sorting using matplotlib.pyplot.plot (points are obviously not sorted):
Plotting x, y using matplotlib.pyplot.plot after sorting with this method:
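A minimal usage sketch, assuming the data file from the question (random_shape.dat with one header line) and the sort_xy function above:

import numpy as np
import matplotlib.pyplot as plt

x, y = np.loadtxt('random_shape.dat', skiprows=1, unpack=True)
x_sorted, y_sorted = sort_xy(x, y)

# close the loop for plotting by appending the first point at the end
plt.plot(np.append(x_sorted, x_sorted[0]), np.append(y_sorted, y_sorted[0]), '-o')
plt.show()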
If it is certain that the curve does not cross the same X coordinate (i.e. any vertical line) more than twice, then you could visit the points in X-sorted order and append a point to one of two tracks you follow: to the one whose last end point is the closest to the new one. One of these tracks will represent the "upper" part of the curve, and the other, the "lower" one.
The logic would be as follows:
dist2 = lambda a, b: (a[0]-b[0])*(a[0]-b[0]) + (a[1]-b[1])*(a[1]-b[1])

z = list(zip(x, y))  # get the list of coordinate pairs
z.sort()             # sort by x coordinate

cw = z[0:1]   # first point in clockwise direction
ccw = z[1:2]  # first point in counter clockwise direction
# reverse the above assignment depending on how first 2 points relate
if z[1][1] > z[0][1]:
    cw = z[1:2]
    ccw = z[0:1]

for p in z[2:]:
    # append to the list to which the next point is closest
    if dist2(cw[-1], p) < dist2(ccw[-1], p):
        cw.append(p)
    else:
        ccw.append(p)

cw.reverse()
result = cw + ccw
This would also work for a curve with steep fluctuations in the Y-coordinate, for which an angle-look-around from some central point would fail, like here:
No assumption is made about the range of the X nor of the Y coordinate: like for instance, the curve does not necessarily have to cross the X axis (Y = 0) for this to work.
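To produce the output file the question asks for, the ordered (x, y) pairs in result can be written back out; a minimal sketch, assuming the file and header names from the question (random_shape_formatted.dat, header line random_shape):

# write the ordered points to random_shape_formatted.dat,
# keeping the original header line on top
with open('random_shape_formatted.dat', 'w') as f:
    print('random_shape', file=f)
    for xi, yi in result:
        print(xi, yi, file=f)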
Counter-clock-wise order depends on the choice of a pivot point. From your question, one good choice of the pivot point is the center of mass.
Something like this:
# Find the Center of Mass: data is a numpy array of shape (Npoints, 2)
mean = np.mean(data, axis=0)
# Compute angles
angles = np.arctan2((data-mean)[:, 1], (data-mean)[:, 0])
# Transform angles from [-pi,pi] -> [0, 2*pi]
angles[angles < 0] = angles[angles < 0] + 2 * np.pi
# Sort
sorting_indices = np.argsort(angles)
sorted_data = data[sorting_indices]
Not really a Python question I think, but you could still try sorting by -sign(y) * x, doing something like:
def counter_clockwise_sort(points):
    return sorted(points, key=lambda point: point['x'] * (-1 if point['y'] >= 0 else 1))
should work fine, assuming you read your points properly into a list of dicts of format {'x': 0.12312, 'y': 0.912}
EDIT: This will work as long as you cross the X axis only twice, like in your example.
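For completeness, a small illustrative sketch of that assumed input format, building the dicts from existing x and y coordinate arrays:

# given x and y as equal-length sequences of coordinates
points = [{'x': xi, 'y': yi} for xi, yi in zip(x, y)]
ordered = counter_clockwise_sort(points)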
If the shape is arbitrarily complex and the point spacing is ~random, then I think this is a really hard problem.
For what it's worth, I have faced a similar problem in the past, and I used a traveling salesman solver. In particular, I used the LKH solver. I see there is a Python repo for solving the problem, LKH-TSP. Once you have an order to the points, I don't think it will be too hard to decide on a clockwise vs counter-clockwise ordering.
If we want to answer your specific problem, we need to pick a pivot point.
Since you want to sort according to the starting point you picked, I would take a pivot in the middle (x=4,y=0 will do).
Since we're sorting counterclockwise, we'll take arctan2(-(y - pivot_y), -(x - pivot_x)) (we're flipping the x axis).
We get the following, with a gradient colored scatter to prove correctness (fyi I removed the first line of the dat file after downloading):
import numpy as np
import matplotlib.pyplot as plt
points = np.loadtxt('points.dat')
#oneliner for ordering points (transform, adjust for 0 to 2pi, argsort, index at points)
ordered_points = points[np.argsort(np.apply_along_axis(lambda x: np.arctan2(-x[1],-x[0]+4) + np.pi*2, axis=1,arr=points)),:]
#color coding 0-1 as str for gray colormap in matplotlib
plt.scatter(ordered_points[:,0], ordered_points[:,1],c=[str(x) for x in np.arange(len(ordered_points)) / len(ordered_points)],cmap='gray')
Result (in the colormap 1 is white and 0 is black), they're numbered in the 0-1 range by order:
For points with comparable distances between neighbouring points, we can use a KDTree to get the two closest points for each point, then draw lines connecting those to give us a closed shape contour. We then make use of OpenCV's findContours to get the contour, which is always traced in counter-clockwise manner. Since OpenCV works on images, we need to sample the data from the provided float format into a uint8 image; given comparable distances between points, that should be pretty safe. OpenCV also handles sharp corners in curvatures well, i.e. smooth or non-smooth data works just fine. And there's no pivot requirement, etc., so all kinds of shapes should be good to work with.
Here's the implementation -
import numpy as np
import matplotlib.pyplot as plt
from scipy.spatial.distance import pdist
from scipy.spatial import cKDTree
import cv2
from scipy.ndimage.morphology import binary_fill_holes

def counter_clockwise_order(a, DEBUG_PLOT=False):
    b = a - a.min(0)
    d = pdist(b).min()
    c = np.round(2*b/d).astype(int)
    img = np.zeros(c.max(0)[::-1]+1, dtype=np.uint8)
    d1, d2 = cKDTree(c).query(c, k=3)
    b = c[d2]
    p1, p2, p3 = b[:, 0], b[:, 1], b[:, 2]
    for i in range(len(b)):
        cv2.line(img, tuple(p1[i]), tuple(p2[i]), 255, 1)
        cv2.line(img, tuple(p1[i]), tuple(p3[i]), 255, 1)
    img = (binary_fill_holes(img == 255)*255).astype(np.uint8)
    if int(cv2.__version__.split('.')[0]) >= 3:
        _, contours, hierarchy = cv2.findContours(img.copy(), cv2.RETR_TREE, cv2.CHAIN_APPROX_NONE)
    else:
        contours, hierarchy = cv2.findContours(img.copy(), cv2.RETR_TREE, cv2.CHAIN_APPROX_NONE)
    cont = contours[0][:, 0]
    f1, f2 = cKDTree(cont).query(c, k=1)
    ordered_points = a[f2.argsort()[::-1]]
    if DEBUG_PLOT == 1:
        NPOINTS = len(ordered_points)
        for i in range(NPOINTS):
            plt.plot(ordered_points[i:i+2, 0], ordered_points[i:i+2, 1], alpha=float(i)/(NPOINTS-1), color='k')
        plt.show()
    return ordered_points
Sample run -
# Load data in a 2D array with 2 columns
a = np.loadtxt('random_shape.csv',delimiter=' ')
ordered_a = counter_clockwise_order(a, DEBUG_PLOT=1)
Output -
Sorry for such a specific question, guys; I think only people with knowledge of Maya will answer, though. In Maya I have cubes of different sizes, and I need to find with Python which face of each cube is pointing down the Y axis (the pivot is at the center). Any tips will be appreciated.
Thanks a lot :)
import re
from maya import cmds
from pymel.core.datatypes import Vector, Matrix, Point

obj = 'pCube1'

# Get the world transformation matrix of the object
obj_matrix = Matrix(cmds.xform(obj, query=True, worldSpace=True, matrix=True))

# Iterate through all faces
for face in cmds.ls(obj + '.f[*]', flatten=True):
    # Get face normal in object space
    face_normals_text = cmds.polyInfo(face, faceNormals=True)[0]
    # Convert to a list of floats
    face_normals = [float(digit) for digit in re.findall(r'-?\d*\.\d*', face_normals_text)]
    # Create a Vector object and multiply with matrix to get world space
    v = Vector(face_normals) * obj_matrix
    # Check if vector faces downwards
    if max(abs(v[0]), abs(v[1]), abs(v[2])) == -v[1]:
        print(face, v)
If you just need a quick solution without vector math, Pymel, or the API, you can use cmds.polySelectConstraint to find the faces aligned with a normal. All you need to do is select all the faces, then use the constraint to keep only the ones pointing the right way. This will select all the faces in a mesh that are pointing along a given axis:
import maya.cmds as cmds

def select_faces_by_axis(mesh, axis=(0, 1, 0), tolerance=45):
    cmds.select(mesh + ".f[*]")
    cmds.polySelectConstraint(mode=3, type=8, orient=2, orientaxis=axis, orientbound=(0, tolerance))
    cmds.polySelectConstraint(dis=True)  # remember to turn the constraint off!
The axis is the x,y,z axis you want and tolerance is the slop in degrees you'll tolerate. To get the downward faces you'd do
select_faces_by_axis('your_mesh_here', (0, -1, 0))
or
select_faces_by_axis('your_mesh_here', (0, -1, 0), 1)
# this would get faces only within 1 degree of downward
This method has the advantage of operating mostly in Maya's C++, it's going to be faster than python-based methods that loop over all the faces in a mesh.
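If you then need the face names in Python (to print them or operate on them), you can read the resulting selection back; a small sketch, assuming the function above has just been called:

# the constraint-filtered selection is still active, so just list it
down_faces = cmds.ls(selection=True, flatten=True)
print(down_faces)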
With pymel the code can be a bit more compact. Selecting the faces pointing downwards:
import pymel.core as pm

n = pm.PyNode("pCubeShape1")
s = []
for f in n.faces:
    if f.getNormal(space='world')[1] < 0.0:
        s.append(f)
pm.select(s)
I want to generate a surface that looks like a hemisphere. What I have done so far is read an already existing BEM mesh and show the scalar values on it. But now I have to show the scalar values on a hemisphere instead of the BEM mesh, and I don't know how to generate a triangular mesh that looks like a hemisphere.
This hemisphere needs to contain a set of N points (x, y, z) [used with mlab.triangular_mesh], and at each vertex I need to represent N data values (floats), either as a value or using a colormap (e.g. blue for the lowest value through red for the highest). data is an array of 2562 float values; it could be randomly generated, as it comes from other code. The points also come from another piece of code and have shape (2562, 3), but that shape is not a hemisphere.
This was the program I used for viewing using the BEM surface
fname = data_path + '/subjects/sample/bem/sample-5120-5120-5120-bem-sol.fif'
surfaces = mne.read_bem_surfaces(fname, add_geom=True)
print("Number of surfaces : %d" % len(surfaces))

head_col = (0.95, 0.83, 0.83)  # light pink
colors = [head_col]

try:
    from enthought.mayavi import mlab
except ImportError:
    from mayavi import mlab

mlab.figure(size=(600, 600), bgcolor=(0, 0, 0))
for c, surf in zip(colors, surfaces):
    points = surf['rr']
    faces = surf['tris']
    s = data
    mlab.triangular_mesh(points[:, 0], points[:, 1], points[:, 2], faces,
                         color=c, opacity=1, scalars=s[:, 0])
    #mesh = mlab.triangular_mesh(x, y, z, triangles, representation='wireframe', opacity=0)
    #point_data = mesh.mlab_source.dataset.point_data
    #point_data.scalars = t
    #point_data.scalars.name = 'Point data'
    #mesh2 = mlab.pipeline.set_active_attribute(mesh, point_scalars='Point data')
As others have pointed out your question is not very clear, and does not include an easily reproducible example -- your example would take considerable work for us to reproduce and you have not described the steps you have taken very clearly.
What you are trying to do is easy. Scalars can be defined for each vertex (i.e., each VTK point):
surf = mlab.triangular_mesh(x,y,z,triangles)
surf.mlab_source.scalars = t
And you need to set a flag to get them to appear, which I think might be your problem:
surf.actor.mapper.scalar_visibility=True
Here is some code to generate a half-sphere. It produces a VTK polydata. I'm not 100% sure if the mayavi source is the same source type as triangular_mesh but I think it is.
res = 250. #desired resolution (number of samples on sphere)
phi,theta = np.mgrid[0:np.pi:np.pi/res, 0:np.pi:np.pi/res]
x=np.cos(theta) * np.sin(phi)
y=np.sin(theta) * np.sin(phi)
z=np.cos(phi)
mlab.mesh(x,y,z,color=(1,1,1))
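Putting the two pieces together, here is a minimal sketch (not your exact solution, since your real points and 2562-value data array come from elsewhere) that colors such a half-sphere grid with per-vertex scalars:

import numpy as np
from mayavi import mlab

res = 250.
phi, theta = np.mgrid[0:np.pi:np.pi/res, 0:np.pi:np.pi/res]
x = np.cos(theta) * np.sin(phi)
y = np.sin(theta) * np.sin(phi)
z = np.cos(phi)

# illustrative scalar data, one value per vertex of the grid
t = np.random.rand(*x.shape)

surf = mlab.mesh(x, y, z, scalars=t)        # surface colormapped by t
surf.actor.mapper.scalar_visibility = True  # make sure the scalars are shown
mlab.show()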