For my research I am using pygalmesh to create a mesh from an array. The problem I am trying to solve is a vascular network: it contains multiple vessels that are not meshed, while the surrounding tissue is meshed. I am able to create the array, but I am having trouble setting the correct variables. In the README documentation there are a lot of variables for which I can't find any information.
For example:
max_edge_size_at_feature_edges=0.025,
min_facet_angle=25,
max_radius_surface_delaunay_ball=0.1,
max_facet_distance=0.001
Is there a file where all those variables are explained, i.e. what they actually change in the mesh, and which other ones can be passed in?
My current goal with meshing is to reduce the number of 2D elements around my vessels, in order to reduce my array dimensions in the later computation.
Thanks!
PS: If you know of other meshing alternatives that can mimic pygalmesh's meshing from an array and are easy to use, let me know!
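For what it's worth, those keyword names map onto CGAL's mesh criteria: min_facet_angle is a lower bound (in degrees) on surface-triangle angles, max_radius_surface_delaunay_ball caps surface-triangle size, max_facet_distance bounds how far a facet may deviate from the true surface, and max_edge_size_at_feature_edges limits edge length along sharp 1D features. Below is a minimal sketch of the kind of setup described; the volume size, voxel spacing, and parameter values are made up for illustration, and the pygalmesh call is guarded in case the library is not installed.

```python
import numpy as np

# Label array: 0 = vessel (left unmeshed), 1 = tissue (meshed).
# A toy 40x40x40 volume with one straight cylindrical "vessel" along z.
n = 40
vol = np.ones((n, n, n), dtype=np.uint8)
x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
vessel = (x - n // 2) ** 2 + (y - n // 2) ** 2 < 5 ** 2
vol[vessel, :] = 0  # carve the vessel out of the tissue label

try:
    import pygalmesh

    mesh = pygalmesh.generate_from_array(
        vol,
        voxel_size=(0.025, 0.025, 0.025),      # hypothetical spacing
        max_facet_distance=0.001,              # surface approximation error
        max_radius_surface_delaunay_ball=0.1,  # caps surface-triangle size
    )
except ImportError:
    mesh = None  # pygalmesh not installed; the array above still shows the setup
```

Raising max_radius_surface_delaunay_ball (and max_facet_distance) around the vessels is the lever that should cut down the number of 2D surface elements there.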
Desired outcome: I would like to count polygons in a given image using a Python script. I am looking for a Python library/module or code example that can be trained, like an AI, on the attached image, such that when I provide another similar image it counts the polygons correctly.
A practical use case for this would be counting the number of homes in a given area. See the attached image: a map of a neighborhood with houses outlined as red polygons.
Any suggestions where to start?
I'm new to Stack Overflow, so if any further clarification is required on this question, please ask (rather than simply downvoting or closing it).
Thank you!
If you load the image as an array of pixel values, NumPy or similar libraries are a good starting point.
If the input is a .jpg/.png/... file, you could try TensorFlow for a learning-based approach, or NumPy for classical image processing.
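Before reaching for a trained model, note that the classical starting point here is connected-component labeling: threshold the red outlines into a binary mask of filled polygons, then count the blobs. A sketch with SciPy; the mask below is synthetic, and a real pipeline would first extract the red channel from the image (with OpenCV or plain NumPy) to produce it.

```python
import numpy as np
from scipy import ndimage

# Synthetic stand-in for "houses outlined with red polygons": a binary mask
# where each filled blob is one polygon.
mask = np.zeros((60, 60), dtype=bool)
mask[5:15, 5:15] = True      # polygon 1
mask[20:30, 35:50] = True    # polygon 2
mask[45:55, 10:25] = True    # polygon 3

labels, count = ndimage.label(mask)  # connected-component labeling
print(count)  # → 3
```

This only breaks down when polygons touch or overlap, which is where a learned detector would earn its keep.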
I have a random distribution of points in 2D space, and a 2D grid. I want to log in which cell in the grid each point is. For example, grid[5,6] will return [2, 53, 70, 153], which are the indices of the points located inside the cell [5,6].
It is crucial that the data is saved in the grid by indexing the points, and not the other way round, since later I'm going to use this grid structure to compare close points to each other, and the grid will allow me to see which points are close.
I'm working in python, and the points are stored as a 2D numpy array.
What is the best way to implement the grid data structure? Notice that the number of points in each cell is non-constant and unknown.
Thanks a lot!
P.S. As a non-native speaker, I think the title for my question is cumbersome and unclear, but I couldn't find a better way to summarize my question. If anyone has a better way to express that, please feel free to fix my title. Thanks!
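One common way to get exactly the grid[i, j] -> point-indices lookup described above is a dictionary of lists keyed by cell coordinates: NumPy computes every point's cell in one vectorized step, and the lists absorb the non-constant, unknown number of points per cell. A sketch, assuming points in the unit square and a 10x10 grid (both made up for illustration):

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(0)
points = rng.random((200, 2))   # 200 random points in the unit square
nx = ny = 10                    # grid resolution
cell_size = 1.0 / nx            # assumes a [0, 1) x [0, 1) domain

# Cell index of every point at once (floor division, then clip edge cases).
ij = np.clip((points // cell_size).astype(int), 0, nx - 1)

# grid[(i, j)] -> list of point indices in that cell; a dict of lists copes
# with the varying number of points per cell.
grid = defaultdict(list)
for idx, (i, j) in enumerate(ij):
    grid[(i, j)].append(idx)

# Sanity check: every point is recorded exactly once.
assert sum(len(v) for v in grid.values()) == len(points)
```

For very large point sets, sorting a flattened cell index with np.argsort gives the same grouping without the Python loop.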
I'm trying to build a simulation that will take place in a 1000x1000x1000 space. For each point in space, I need to be able to encode 2 or 3 properties.
I also need to be able to do some basic operations on the space, such as, given a point, find the properties of the 26 adjacent 3D neighbors of the point.
All points will lie on vertices in the 1000x1000x1000 space (i.e. every point is discrete).
I wrote up a version in python using numpy and it is much too slow. I've also looked for libraries that would speed it up but couldn't find anything.
Does anyone know of a python library that would provide useful helper functions and would be appropriate for a simulation of this size?
Using NumPy together with the Numba Python compiler for the more intricate algorithms can take you a long way.
Also, I think you are referring to a "stencil" algorithm, and Numba has specific stencil functionality that could help you.
But start with a smaller grid during development!
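To make the neighbor lookup concrete: with the space stored as one dense NumPy array per property, the 26 neighbors of an interior point are just a 3x3x3 slice with the center removed. A sketch on a deliberately small grid; loops over many points like this are exactly what numba.njit or numba.stencil would then compile.

```python
import numpy as np

# One array per property; at 1000^3 a uint8 property costs ~1 GB, so a
# smaller grid is used here for illustration.
n = 20
prop = np.arange(n ** 3, dtype=np.int64).reshape(n, n, n)

def neighbor_values(a, x, y, z):
    """Values of the 26 adjacent cells of (x, y, z) (interior points only)."""
    block = a[x - 1:x + 2, y - 1:y + 2, z - 1:z + 2].ravel()
    return np.delete(block, 13)  # index 13 is the center of the 3x3x3 block

vals = neighbor_values(prop, 5, 5, 5)
print(len(vals))  # → 26
```

Boundary points would need padding or clamped slices, which is one of the things Numba's stencil decorator handles for you.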
Image mosaics use a set of predefined square images to build a larger image (example here).
There are a lot of solutions and it's quite trivial to achieve this effect. However, it becomes much harder with the following constraints:
The shape of the original mosaics is abstract. Any convex polygon could do.
Each mosaic can only be used once.
There is no need for the mosaics to be absolutely packed (i.e. occupying 100% of the canvas), but they should be as packed as possible without overlapping.
I'm trying to automate the ancient art of tessellation, specifically the Opus palladianum technique.
My idea is to use simulated annealing or some other heuristic to optimize the position and rotation of each irregular mosaic tile, swapping two in each iteration, trying to minimize some energy function that reflects both the similarity to the target image and the "packedness" of the tiles.
I'm trying to achieve this in python, any ideas and help would be greatly appreciated.
You could probably use a GA (Genetic Algorithm) with a "non-overlapping" constraint for this job.
Parameters for each individual (a convex polygon) would be:
initial position
rotation
(size ?)
Your fitness function would then give the best score to an individual whose polygon does not overlap any other (and is close to the other individuals).
You may look at this video and this one as examples.
Regards
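To make the swap-based annealing idea from the question concrete, here is a generic simulated-annealing skeleton over a toy energy function; the real energy would score colour similarity to the target image plus tile overlap, which is abstracted away here. Everything about the toy problem (N, the energy, the cooling schedule) is made up for illustration.

```python
import math
import random

random.seed(0)

# Toy stand-in for the mosaic problem: an assignment of N tiles to N slots,
# with an energy that is lowest when tile i sits in slot i.
N = 30
state = list(range(N))
random.shuffle(state)

def energy(s):
    return sum(abs(tile - slot) for slot, tile in enumerate(s))

def anneal(s, t0=10.0, cooling=0.995, steps=20000):
    e = energy(s)
    t = t0
    for _ in range(steps):
        i, j = random.sample(range(len(s)), 2)  # swap move, as in the question
        s[i], s[j] = s[j], s[i]
        e_new = energy(s)
        # Accept improvements always, worsenings with Boltzmann probability.
        if e_new <= e or random.random() < math.exp((e - e_new) / t):
            e = e_new
        else:
            s[i], s[j] = s[j], s[i]              # reject: undo the swap
        t *= cooling
    return s, e

initial = energy(state)
state, final = anneal(state)
assert final <= initial
```

For the real problem the expensive part is the energy; incremental evaluation (only re-scoring the two swapped tiles) is what makes this loop affordable.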
To frame this question: I am a complete beginner in terms of 3d rendering, and I would like to get my feet wet.
My goal is to create a command-line script (ideally in Python) which takes some kind of 3d model file (e.g. a sphere), maps a texture onto it, and outputs the result as an image file. That is, I would like my program to essentially be able to "do" the following:
From my reading, this appears to be something known as "uv mapping", but almost everything I've found on the subject is about how to do it in Blender, which I would prefer to avoid: in a 2D analogy, it seems to me that Blender is like Photoshop, where I'm looking for something like ImageMagick. Beyond that, I haven't been able to find much.
The closest I have found is this other stackoverflow question:
uv mapping works bad on low resolution (warning: lot of images)
But I don't quite understand what's going on there, because it doesn't import a 3d model at all -- it is my [perhaps mistaken?] understanding that EXR is a 2d image format.
Any guidance on how to get started would be greatly appreciated!
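As a starting point, the heart of UV mapping a sphere is just the parametrisation that sends a 3D surface point to 2D texture coordinates; loading a model and rasterising build on top of that. A minimal sketch of the standard equirectangular mapping (for actual headless rendering from Python, libraries such as trimesh or pyrender may be worth a look, though I haven't verified they fit this exact workflow):

```python
import numpy as np

def sphere_uv(points):
    """Spherical UV mapping: unit-sphere points -> (u, v) in [0, 1]^2.

    u follows longitude (atan2), v follows latitude (arccos of z);
    this is the standard equirectangular parametrisation.
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    u = 0.5 + np.arctan2(y, x) / (2.0 * np.pi)
    v = np.arccos(np.clip(z, -1.0, 1.0)) / np.pi
    return np.column_stack([u, v])

# The "north pole" (0, 0, 1) maps to v = 0; a point on the equator to v = 0.5.
uv = sphere_uv(np.array([[0.0, 0.0, 1.0], [1.0, 0.0, 0.0]]))
```

Rendering then samples the texture at (u, v) for each pixel covered by the sphere; that sampling step is what the Blender tutorials are doing behind the scenes.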