I have a set of 3D data points that looks similar to a sphere. I need these data points connected as a watertight mesh so that it can be used in a simulation.
I have worked with MeshLab and obtained a reasonable mesh, but it is not watertight.
After this, I tried the Open3D Python library using the ball-pivoting algorithm, but I am unable to obtain a watertight mesh from it either. I also tried the external hole-fixer library (Hole_fixer), but I am getting an error when installing it with CMake.
I have inserted the code and also "xyz" datapoints used for open3D.
import numpy as np
import open3d as o3d

dataname = 'meshdata2.xyz'
# skip the header row; keep only x, y, z columns
point_cloud = np.loadtxt(dataname, skiprows=1)

pcd = o3d.geometry.PointCloud()
pcd.points = o3d.utility.Vector3dVector(point_cloud[:, :3])
pcd.estimate_normals()

# derive the pivoting-ball radii from the average nearest-neighbor distance
distances = pcd.compute_nearest_neighbor_distance()
avg_dist = np.mean(distances)
radius = 5 * avg_dist

bpa_mesh = o3d.geometry.TriangleMesh.create_from_point_cloud_ball_pivoting(
    pcd, o3d.utility.DoubleVector([radius, radius * 2, radius * 0.5]))
print(bpa_mesh.is_watertight())
o3d.visualization.draw_geometries([bpa_mesh])
Link for "xyz file": xyz_file_link
Mesh obtained from Open3D: Mesh_from_open3D
I would like to know how to obtain a watertight mesh for these data points.
Any leads will be appreciated.
Regards,
Sunag R A.
To achieve a watertight mesh, you can use o3d.geometry.TriangleMesh.create_from_point_cloud_poisson.
However, Poisson reconstruction requires consistently oriented normals. In your case, you can simply orient all normals toward the center of your point cloud. To do that:
import numpy as np
import open3d as o3d
pcd = o3d.io.read_point_cloud('./meshdata2.xyz')
pcd.estimate_normals()
# to obtain a consistent normal orientation
pcd.orient_normals_towards_camera_location(pcd.get_center())
# or you might want to flip the normals to make them point outward, not mandatory
pcd.normals = o3d.utility.Vector3dVector(-np.asarray(pcd.normals))
# surface reconstruction using Poisson reconstruction
mesh, _ = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=9)
# paint uniform color to better visualize, not mandatory
mesh.paint_uniform_color(np.array([0.7, 0.7, 0.7]))
o3d.io.write_triangle_mesh('a.ply', mesh)
Mesh obtained using the above code snippet:
For point clouds with complex topology, it might not be easy to obtain a consistent normal orientation, read my other answer for more info.
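For a roughly star-shaped cloud like this sphere, the center-based orientation amounts to flipping every normal whose dot product with the outward direction (point minus centroid) is negative. A minimal NumPy sketch of that idea, assuming (n, 3) arrays of points and normals (it will not work for complex topology, as noted above):

```python
import numpy as np

def orient_normals_outward(points, normals):
    """Flip each normal so it points away from the centroid.

    Only valid for clouds that are roughly star-shaped around
    their centroid (like a sphere).
    """
    center = points.mean(axis=0)
    outward = points - center
    # Row-wise dot products between normals and outward directions.
    signs = np.sign(np.einsum("ij,ij->i", normals, outward))
    signs[signs == 0] = 1.0
    return normals * signs[:, None]

# Example: points on a unit sphere, half the normals flipped inward.
rng = np.random.default_rng(0)
pts = rng.normal(size=(100, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)
nrm = pts * rng.choice([-1.0, 1.0], size=(100, 1))
fixed = orient_normals_outward(pts, nrm)  # all normals now point outward
```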
If Open3D does not produce watertight meshes (e.g. due to this bug), one can use the Python bindings of MeshLab:
import pymeshlab
ms = pymeshlab.MeshSet()
ms.load_new_mesh("meshdata2.xyz")
ms.compute_normal_for_point_clouds()
ms.generate_surface_reconstruction_ball_pivoting()
# or ms.generate_surface_reconstruction_screened_poisson()
ms.meshing_remove_unreferenced_vertices()
ms.save_current_mesh("meshdata2.ply")
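Whichever reconstruction filter is used, the result can be sanity-checked independently: a closed, edge-manifold mesh has every undirected edge shared by exactly two triangles, which is the main failure mode here. A minimal sketch, assuming the faces are available as an (n, 3) index array (e.g. from pymeshlab's face matrix, if I recall the API correctly):

```python
from collections import Counter
import numpy as np

def is_closed(faces):
    """True if every undirected edge appears in exactly two triangles."""
    edges = Counter()
    for a, b, c in np.asarray(faces):
        for e in ((a, b), (b, c), (c, a)):
            edges[tuple(sorted(e))] += 1
    return all(count == 2 for count in edges.values())

# A closed tetrahedron passes; removing one face opens a hole.
tetra = [(0, 1, 2), (0, 3, 1), (1, 3, 2), (2, 3, 0)]
print(is_closed(tetra))       # True
print(is_closed(tetra[:3]))   # False
```

Note that Open3D's is_watertight() additionally checks vertex-manifoldness and self-intersection, so this is a necessary but not sufficient test.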
As already pointed out by the OP, the surface reconstruction filters of MeshLab do not seem to be favorable for the given test dataset.
Related
I am making maps of meteorological data (x, y coordinates in meters) using matplotlib.pyplot.contourf(). I want to plot a coastline, but all the examples I find on the internet use lat-lon data (with cartopy or basemap).
Is there a way (without transforming the data to a lat-lon grid) to plot a coastline on my Cartesian grid? I know the size of the grid and its center's lat-lon coordinates.
I haven't tried anything except looking for similar examples, which I could not find.
The solution is to use cartopy's gnomonic projection: https://scitools.org.uk/cartopy/docs/v0.15/crs/projections.html#gnomonic , e.g.
import cartopy.crs as ccrs
proj = ccrs.Gnomonic(central_latitude=0, central_longitude=0)
The origin of the data needs to be specified (in lat-lon), and cartopy then expects the data coordinates to be distances in meters from that origin. After that, the normal cartopy features (like coastlines) work as usual.
I am trying to create a 3D mesh in Open3D given a set of 3D points. My code is the following:
pcd = o3d.geometry.PointCloud()
pcd.points = o3d.utility.Vector3dVector(points)
hull, _ = pcd.compute_convex_hull()
hull_ls = o3d.geometry.LineSet.create_from_triangle_mesh(hull)
hull_ls.paint_uniform_color((1, 0, 0))
o3d.visualization.draw_geometries([pcd, hull_ls])
The visualization I obtain is the following:
The main issue is that there are 2 points that are not connected correctly. See highlighted area.
What I am trying to obtain instead is something like this:
The second image was drawn using an Open3D LineSet, since I have the points and the order in which they need to be connected. The issue with the second visualization is that I am not able to export it as a mesh file, since it is just a set of lines.
Any help is appreciated.
The expected mesh is not a convex hull, so compute_convex_hull will not produce the expected result anyway.
You should be able to export a LineSet using write_line_set and read using read_line_set.
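Since the connection order of the points is known, an alternative sketch (untested on the actual data, and only valid when the ordered loop is star-shaped with respect to its first vertex, e.g. convex) is to triangulate the loop directly with a triangle fan and export that as a mesh:

```python
import numpy as np

def fan_triangulate(loop_points):
    """Triangulate an ordered closed loop of 3D points with a triangle fan.

    Valid when the polygon is star-shaped w.r.t. the first vertex
    (always true for convex loops).
    """
    pts = np.asarray(loop_points, dtype=float)
    n = len(pts)
    # Fan faces: (0, i, i + 1) for i = 1 .. n - 2
    faces = np.array([(0, i, i + 1) for i in range(1, n - 1)], dtype=np.int32)
    return pts, faces

# Example: a planar hexagon yields 4 triangles.
theta = np.linspace(0, 2 * np.pi, 6, endpoint=False)
hexagon = np.c_[np.cos(theta), np.sin(theta), np.zeros(6)]
verts, faces = fan_triangulate(hexagon)

# The result can then be wrapped in an Open3D mesh and exported, e.g.:
# mesh = o3d.geometry.TriangleMesh(
#     o3d.utility.Vector3dVector(verts), o3d.utility.Vector3iVector(faces))
# o3d.io.write_triangle_mesh("loop.ply", mesh)
```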
I want to use scipy.spatial.Delaunay to get a triangle mesh from a 3D point cloud, and I use trimesh to save the triangle mesh in .ply format. But the result looks bad: all points act as vertices, so it runs slowly and the output is poor. How can I get a better mesh output with this Delaunay method?
Code:
import trimesh
from scipy.spatial import Delaunay

# pcd is the (n, 3) 3D point cloud
tri = Delaunay(pcd)
# faces
f = tri.simplices
# mesh
mesh = trimesh.Trimesh(vertices=pcd, faces=f)
# show the mesh and export it
mesh.show()
mesh.export(file_obj="mesh.ply")
I wrote meshio for this. Just do
import meshio
meshio.Mesh(pcd, {"triangles": tri.simplices}).write("out.ply")
and it should do the job.
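One caveat worth noting (my own observation, not part of the answer above): for a 3D cloud, scipy's Delaunay returns tetrahedra (4 indices per simplex), not surface triangles, which is likely why the exported .ply looks wrong. For convex shapes, scipy's ConvexHull already yields usable triangle faces. A small sketch illustrating the difference:

```python
import numpy as np
from scipy.spatial import ConvexHull, Delaunay

# A random 3D point cloud.
rng = np.random.default_rng(0)
pts = rng.normal(size=(200, 3))

tri = Delaunay(pts)
print(tri.simplices.shape[1])   # 4 -- tetrahedra, not triangle faces

hull = ConvexHull(pts)
print(hull.simplices.shape[1])  # 3 -- proper surface triangles

# For a convex shape, hull.simplices can be passed directly to
# trimesh or meshio as the faces array.
```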
I'm using Open3D to create a mesh. The official documentation shows how to compute the triangle normals of a triangle mesh, but how can I visualize the normals of the surface?
Thank you for the help
Once you have computed the normals, you can render them by pressing Ctrl + 9 in the visualizer, e.g.
import open3d as o3d
mesh = o3d.io.read_triangle_mesh('path_to_mesh')
mesh.compute_vertex_normals()
o3d.visualization.draw_geometries([mesh])
This will give you something like this:
If you want to see the vertex normals as lines, I'm not sure this is supported in Open3D for meshes yet. But you can convert the mesh to a point cloud:
pcd = o3d.geometry.PointCloud()
pcd.points = o3d.utility.Vector3dVector(np.asarray(mesh.vertices))
pcd.estimate_normals()
and then visualize the normals by pressing n in the visualizer. This will give you something like this:
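If you do want explicit normal lines for the mesh itself, one workaround (a sketch; the LineSet wrapping is assumed from the Open3D docs and not tested here) is to build the segment endpoints yourself with NumPy and feed them to a LineSet:

```python
import numpy as np

def normal_segments(points, normals, scale=0.05):
    """Endpoints and index pairs for line segments along each normal.

    Returns a (2n, 3) endpoint array and an (n, 2) index array, suitable
    for o3d.geometry.LineSet(Vector3dVector(ends), Vector2iVector(idx)).
    """
    points = np.asarray(points)
    normals = np.asarray(normals)
    n = len(points)
    # First n rows: segment starts; last n rows: starts + scaled normals.
    ends = np.vstack([points, points + scale * normals])
    idx = np.column_stack([np.arange(n), np.arange(n) + n])
    return ends, idx

# Example with two points and upward normals.
pts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
nrm = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]])
ends, idx = normal_segments(pts, nrm, scale=0.1)

# ls = o3d.geometry.LineSet(o3d.utility.Vector3dVector(ends),
#                           o3d.utility.Vector2iVector(idx))
# o3d.visualization.draw_geometries([mesh, ls])
```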
I'm trying to create a triangular mesh using Python.
As I know the boundary points, I think the Delaunay method is more appropriate.
I tried using scipy; the code is very simple:
import numpy as np
import matplotlib.pyplot as plt
from scipy.spatial import Delaunay

# binaryImage is a 2D binary image; its nonzero pixels are the points to mesh
pixelpoints = np.transpose(np.nonzero(binaryImage))
tri = Delaunay(pixelpoints)

plt.triplot(pixelpoints[:, 0], pixelpoints[:, 1], tri.simplices.copy())
plt.plot(pixelpoints[:, 0], pixelpoints[:, 1], 'o')
plt.show()
But I don't want this. I'd like to mesh only inside the image bounds. Also, I don't want to mesh inside the holes.
Can I control the number of triangles to cover a surface?
Is there an alternative way to do this?
Thank you.
You can easily remove the additional triangles using Polygon.IsPointInside(tCentroid), where tCentroid is the triangle centroid. IsPointInside() can be derived from this: http://geomalgorithms.com/a03-_inclusion.html.
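A minimal sketch of that centroid filter, using matplotlib's Path.contains_points as the point-in-polygon test instead of a hand-rolled IsPointInside() (Polygon/IsPointInside above are the answer's own hypothetical helpers):

```python
import numpy as np
from matplotlib.path import Path
from scipy.spatial import Delaunay

def filter_triangles(points, simplices, boundary):
    """Keep only triangles whose centroid lies inside the boundary polygon."""
    poly = Path(boundary)
    centroids = points[simplices].mean(axis=1)  # (ntri, 2)
    return simplices[poly.contains_points(centroids)]

# Example: a unit-square boundary plus one external point; the triangle
# hanging outside the square gets dropped.
pts = np.array([[0, 0], [1, 0], [1, 1], [0, 1], [2, 0.5]], dtype=float)
tri = Delaunay(pts)
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
inside = filter_triangles(pts, tri.simplices, square)  # 2 triangles remain
```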
The Triangle program supports both of these needs: refining triangles to a prescribed size and removing triangles outside the polygon or inside holes. There seems to be a Python interface floating around: the API describes how to specify holes and a maximum triangle area.