Create a triangular mesh using Delaunay methods - python

I'm trying to create a triangular mesh using Python.
Since I know the boundary points, I think the Delaunay method is more appropriate.
I tried using scipy; the code is very simple:
import numpy as np
import matplotlib.pyplot as plt
from scipy.spatial import Delaunay

pixelpoints = np.transpose(np.nonzero(binaryImage))
tri = Delaunay(pixelpoints)

plt.triplot(pixelpoints[:,0], pixelpoints[:,1], tri.simplices.copy())
plt.plot(pixelpoints[:,0], pixelpoints[:,1], 'o')
plt.show()
But this isn't what I want. I'd like to mesh only inside the image bounds, and I don't want to mesh inside the holes.
Can I control the number of triangles used to cover the surface?
Is there an alternative way to do this?
Thank you.

You can easily remove the extra triangles using Polygon.IsPointInside(tCentroid), where tCentroid is the triangle centroid. IsPointInside() can be derived from this: http://geomalgorithms.com/a03-_inclusion.html.
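A minimal sketch of that filtering step in Python, using matplotlib.path.Path for the point-in-polygon test and illustrative boundary/hole arrays (both are assumptions, not part of the original answer):

import numpy as np
from matplotlib.path import Path

def filter_triangles(points, simplices, boundary, holes=()):
    # boundary: (N, 2) array of polygon vertices; holes: one (M, 2) array per hole
    outline = Path(boundary)
    hole_paths = [Path(h) for h in holes]
    kept = []
    for simplex in simplices:
        centroid = points[simplex].mean(axis=0)   # the tCentroid from the answer
        if outline.contains_point(centroid) and not any(
                p.contains_point(centroid) for p in hole_paths):
            kept.append(simplex)
    return np.array(kept)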

The Triangle program supports both of these needs: refining triangles to a prescribed size and removing triangles outside the polygon or inside holes. There is a Python interface floating around: its API describes how to specify holes and a maximum triangle area.
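One such interface is the triangle package; a sketch assuming its triangulate() entry point (check the package's docs for the exact options), with made-up geometry:

import numpy as np
import triangle  # Python wrapper around Shewchuk's Triangle

# Square outline with a square hole in the middle (illustrative data).
outer = np.array([[0, 0], [4, 0], [4, 4], [0, 4]], dtype=float)
inner = np.array([[1.5, 1.5], [2.5, 1.5], [2.5, 2.5], [1.5, 2.5]], dtype=float)
planar_graph = dict(
    vertices=np.vstack([outer, inner]),
    segments=[[0, 1], [1, 2], [2, 3], [3, 0],   # outer boundary
              [4, 5], [5, 6], [6, 7], [7, 4]],  # hole boundary
    holes=[[2.0, 2.0]],                         # any point inside the hole
)
# 'p' = triangulate a planar straight-line graph, 'q' = quality mesh,
# 'a0.1' = maximum triangle area of 0.1 (this controls the triangle count)
mesh = triangle.triangulate(planar_graph, 'pqa0.1')
# mesh['vertices'] and mesh['triangles'] hold the refined, hole-free mesh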

Related

How to export 3d delaunay triangle mesh with python?

I want to use scipy.spatial.Delaunay to get a triangle mesh from a 3D point cloud, and I use trimesh to save the triangle mesh in .ply format. But the result looks bad: every point is used as a vertex, so it is slow and the mesh quality is poor. How can I get a better mesh output with this Delaunay method?
Code:
import trimesh
from scipy.spatial import Delaunay

# pcd is the 3D point cloud, an (N, 3) array
tri = Delaunay(pcd)
# faces (simplices of the triangulation)
f = tri.simplices
# mesh built from the same points
mesh = trimesh.Trimesh(vertices=pcd, faces=f)
# show the mesh and export it
mesh.show()
mesh.export(file_obj="mesh.ply")
I wrote meshio for this. Just doing
import meshio
meshio.Mesh(pcd, {"triangles": tri.simplices}).write("out.ply")
should do the job.

Is there a way to crop/set a scipy Voronoi diagram?

I've implemented Sibson/natural neighbor spatial interpolation using scipy's Voronoi and Shapely's polygons.
But I need to restrict my Voronoi diagram, as some polygons extend unrealistically far outside the domain I'm interpolating in, which causes inaccuracies. There are also non-finite regions in the Voronoi diagram for the outer points, which I'd rather not have.
I often see that the Voronoi diagram is bounded in a rectangular box, but I don't know how to implement that. I have looked for it in the scipy documentation without success.
Edit: found how to do it thanks to this post: Colorize Voronoi Diagram
Thanks to anyone that took the time to read and/or write.
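For anyone landing here later, the basic idea is to intersect each Voronoi cell with a bounding polygon. A sketch using Shapely (as in the question) with made-up data; it only handles the finite regions, while the linked post shows how to reconstruct the infinite ones first:

import numpy as np
from scipy.spatial import Voronoi
from shapely.geometry import Polygon, box

points = np.random.rand(30, 2)           # illustrative data in the unit square
vor = Voronoi(points)
bounding_box = box(0.0, 0.0, 1.0, 1.0)   # the domain to clip against

clipped_cells = []
for region_index in vor.point_region:
    region = vor.regions[region_index]
    if not region or -1 in region:       # skip empty or unbounded regions
        continue
    cell = Polygon(vor.vertices[region])
    clipped_cells.append(cell.intersection(bounding_box))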

How to convert 3D cloud datapoints to mesh using python?

I have a set of 3D data points shaped roughly like a sphere. I need these data points to be connected as a watertight mesh so that it can be used for simulation.
I have worked with MeshLab and obtained a reasonable mesh, but not a watertight one.
After this, I tried the Open3D Python library using the ball-pivoting algorithm, but I am unable to obtain a watertight mesh that way either. I also tried the hole_fixer external library (Hole_fixer), but ran into an error when installing it with cmake.
I have included the code and the "xyz" data points used for Open3D.
import numpy as np
import open3d as o3d

dataname = 'meshdata2.xyz'
point_cloud = np.loadtxt(dataname, skiprows=1)

# build an Open3D point cloud and estimate normals
pcd = o3d.geometry.PointCloud()
pcd.points = o3d.utility.Vector3dVector(point_cloud[:,:3])
pcd.estimate_normals()

# choose ball-pivoting radii from the average nearest-neighbor distance
distances = pcd.compute_nearest_neighbor_distance()
avg_dist = np.mean(distances)
radius = 5*avg_dist
bpa_mesh = o3d.geometry.TriangleMesh.create_from_point_cloud_ball_pivoting(pcd, o3d.utility.DoubleVector([radius, radius*2, radius*0.5]))

print(bpa_mesh.is_watertight())
o3d.visualization.draw_geometries([bpa_mesh])
Link for "xyz file": xyz_file_link
Mesh obtained from Open3D: Mesh_from_open3D
I would like to know how to obtain a watertight mesh for these data points.
Any leads will be appreciated.
Regards,
Sunag R A.
To achieve a watertight mesh, you can use o3d.geometry.TriangleMesh.create_from_point_cloud_poisson.
However, Poisson reconstruction requires consistent normal orientation. In your case, you can just orient all normals toward the center of your point cloud. To do that:
import numpy as np
import open3d as o3d
pcd = o3d.io.read_point_cloud('./meshdata2.xyz')
pcd.estimate_normals()
# to obtain a consistent normal orientation
pcd.orient_normals_towards_camera_location(pcd.get_center())
# or you might want to flip the normals to make them point outward, not mandatory
pcd.normals = o3d.utility.Vector3dVector(-np.asarray(pcd.normals))
# surface reconstruction using Poisson reconstruction
mesh, _ = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=9)
# paint uniform color to better visualize, not mandatory
mesh.paint_uniform_color(np.array([0.7, 0.7, 0.7]))
o3d.io.write_triangle_mesh('a.ply', mesh)
Mesh obtained using the above code snippet:
For point clouds with complex topology, it might not be easy to obtain a consistent normal orientation, read my other answer for more info.
If Open3D does not produce watertight meshes (e.g. due to this bug), one can use the Python bindings of MeshLab:
import pymeshlab
ms = pymeshlab.MeshSet()
ms.load_new_mesh("meshdata2.xyz")
ms.compute_normal_for_point_clouds()
ms.generate_surface_reconstruction_ball_pivoting()
# or ms.generate_surface_reconstruction_screened_poisson()
ms.meshing_remove_unreferenced_vertices()
ms.save_current_mesh("meshdata2.ply")
As already pointed out by the OP, the surface reconstruction filters of MeshLab do not seem to be favorable for the given test dataset.

Is it an orthogonal projection in the Nearside Perspective?

I'm currently working on world map visualizations. For now, I use home-made software (Java) for visualizations and point projections, but I would like to upgrade it so that I can use a similar tool in Python.
Thus, I wanted to use cartopy with the PROJ4 module, so as not to re-code everything, and to make use of the wonderful existing libraries.
It perfectly works for the PlateCarree projection, but when I want to use the Nearside Perspective, I observe a small difference between the two methods.
The two following pictures are extracted from the Java software (1) and the cartopy plot (2).
Cartopy (0.17) is used with matplotlib (3.0.2) and proj4 (4.9.1). In both pictures, we are observing at lon=lat=0° and at 400 km.
Here is the first image (Java):
Java visualization
Here is the second one (Cartopy):
Cartopy representation
As one can observe, land masses are over-represented in the cartopy plot. Assuming that I want to get exactly the same projection as the one in my Java software (the same representation as the "TrueView angles" used in telecom), I found this in the cartopy crs module:
class NearsidePerspective(_Satellite):
"""
Perspective view looking directly down from above a point on the globe.
In this projection, the projected coordinates are x and y measured from
the origin of a plane tangent to the Earth directly below the perspective
point (e.g. a satellite).
"""
So I have this question: which projection is this? Are angles preserved, which would mean I have an undetected problem somewhere? Or is it an orthogonal projection onto the tangent plane? In that case angles are not preserved, and I would need a way to apply another projection (the correct one in my case), since I might be using the wrong one...
Thanks for your time,
Lou
I'm not sure if it's an orthogonal projection, but what CartoPy is using is directly from Proj4:
https://proj4.org/operations/projections/nsper.html
I think the coordinates in this Nearside Perspective projection are Cartesian distances (distances from the origin on a plane), not angles. It sounds like angles are what your projection uses. Have you looked at using the Geostationary projection, but with a different satellite height?
https://scitools.org.uk/cartopy/docs/latest/crs/projections.html#geostationary
I can say that in that projection the coordinates are angles (multiplied by the satellite height). That might be what you're looking for.
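A minimal sketch of that suggestion (the 400 km height comes from the question; the rest of the plotting setup is just illustrative):

import cartopy.crs as ccrs
import matplotlib.pyplot as plt

# Geostationary coordinates are scanning angles multiplied by the satellite
# height, so a custom satellite_height approximates the "true view angle" case.
proj = ccrs.Geostationary(central_longitude=0.0, satellite_height=400e3)
ax = plt.axes(projection=proj)
ax.coastlines()
ax.set_global()
plt.show()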

Interpolation with Delaunay Triangulation

Having a point cloud shaped like some sort of distorted paraboloid, I would like to use Delaunay triangulation to interpolate the points. I have tried other techniques (for example, splines) but did not manage to enforce the desired behavior.
I was wondering if there's a quick way to use the results of scipy.spatial.Delaunay so that I can give the (x, y) coords and get the z-coord of the point on the containing simplex (triangle).
From the documentation it looks like I can pull out the index of the simplex, but I am not sure how to take it from there.
You can give the Delaunay triangulation to scipy.interpolate.LinearNDInterpolator together with the set of Z-values, and it should do the job for you.
If you really want to do the interpolation yourself, you can build it up from find_simplex and transform.
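A minimal sketch of both routes (the data here is made up for illustration; points is the (N, 2) array of x/y coordinates and z the matching values):

import numpy as np
from scipy.spatial import Delaunay
from scipy.interpolate import LinearNDInterpolator

points = np.random.rand(100, 2)                  # illustrative (x, y) data
z = points[:, 0]**2 + points[:, 1]**2            # illustrative z values
tri = Delaunay(points)

# Route 1: let scipy do the work (it accepts the Delaunay object directly)
interp = LinearNDInterpolator(tri, z)
z_query = interp(np.array([[0.5, 0.5]]))

# Route 2: do it by hand with find_simplex and transform (2D case)
xi = np.array([[0.5, 0.5]])
s = tri.find_simplex(xi)                         # containing simplex, -1 if outside
T = tri.transform[s, :2]                         # affine part of the barycentric transform
r = tri.transform[s, 2]                          # offset
b = np.einsum('ijk,ik->ij', T, xi - r)           # first two barycentric coordinates
bary = np.c_[b, 1 - b.sum(axis=1)]               # third coordinate completes them
z_manual = (z[tri.simplices[s]] * bary).sum(axis=1)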
