ArcGIS: selecting polygons from an irregular grid to remove isolated cells - Python

I have an irregular grid of triangles in one polygon shapefile. These cells are themed to show only the triangles above my threshold level of 'interest'. Adjacent triangles that are visible are considered real; spatially isolated triangles need to be removed, as they could be spurious.
I can filter with a definition query to remove the triangles below the threshold, but I cannot figure out how to remove the isolated triangles.
I'm aware that I probably need to use Polygon Neighbors.
(screenshot from ArcGIS)
please send help!

I was facing a similar kind of issue, so I did a workaround and set an appropriate threshold:
from shapely.geometry import Polygon

# two triangles that are far apart
coords1 = [(54.950899, 60.169158), (54.953492, 60.169158), (54.950958, 60.169990)]
poly1 = Polygon(coords1)
coords2 = [(24.950899, 60.169158), (24.953492, 60.169158), (24.950958, 60.169990)]
poly2 = Polygon(coords2)

# distance between the two polygons (in the units of the coordinates)
poly1.distance(poly2)
# 29.997407
poly1.distance(poly1)
# 0.0
You can set a threshold value to identify the spatially isolated triangles.
P.S. This workaround worked for me; this solution is for your reference. Random polygons are used here.
Reference:
https://automating-gis-processes.github.io/site/index.html
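Building on that idea, a minimal sketch (not the answerer's code; the file name and the 50-unit threshold are made up, and the pairwise loop is O(n²)) to flag isolated triangles with geopandas might look like this:

import geopandas as gpd

# flag triangles whose nearest other triangle is farther than a chosen threshold
tris = gpd.read_file("triangles.shp")   # hypothetical input layer
threshold = 50.0

def nearest_other_distance(geom, geoms):
    # distance to the closest *other* geometry; 0.0 means they touch or overlap
    return min(geom.distance(other) for other in geoms if other is not geom)

tris["isolated"] = [
    nearest_other_distance(geom, tris.geometry) > threshold
    for geom in tris.geometry
]
kept = tris[~tris["isolated"]]   # the adjacent, non-isolated triangles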

I would use the Near tool with the same feature class as both the Input Features and the Near Features. After it runs, check the attribute table for the new field NEAR_DIST, which stores the distance to the nearest feature.
All records with NEAR_DIST = 0 touch another polygon; records where NEAR_DIST > 0 are the spatially isolated polygons that you are after.
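If you want to script that, a rough arcpy sketch could look like the following (the file and layer names are assumptions, not part of the original answer):

import arcpy

# work on a feature layer so we can select and delete features
arcpy.management.MakeFeatureLayer("triangles.shp", "tri_lyr")

# run Near with the same layer as Input Features and Near Features
arcpy.analysis.Near("tri_lyr", "tri_lyr")

# NEAR_DIST = 0 -> touches another triangle; NEAR_DIST > 0 -> spatially isolated
arcpy.management.SelectLayerByAttribute("tri_lyr", "NEW_SELECTION", "NEAR_DIST > 0")
arcpy.management.DeleteFeatures("tri_lyr")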

Related

Clip lower value polygons with larger value polygons in shapefile

I have a dataset of circular polygons that correspond to tree crowns.
Many polygons overlap with each other, and some are even completely covered by larger polygons (or larger polygons covered by many small polygons). I would like to clip polygons based on attribute value (tree height), where the maximum height polygons clip the polygons with lower height values.
Image below describes the situation, where 1 is the lowest tree height and 3 is the tallest:
I attempted using this workflow in QGIS (https://gis.stackexchange.com/questions/427555/cut-polygons-with-each-other-based-on-attribute-value), but it took very long and was unusable for larger datasets.
I would prefer to use Python, but if you can accomplish it with any other programming language I would accept that. Thanks in advance!
Test dataset located at:
https://github.com/arojas314/data-sharing/blob/main/niwo010_treepolys.zip
I attempted it, but only got as far as splitting the polygons with the boundaries (lines) of each polygon, creating smaller polygons where they overlap:
import shapely.ops
import geopandas as gpd

# tree_polys_valid: GeoDataFrame of the (valid) tree-crown polygons
# 1. Convert polygons to boundary lines
tree_lines = tree_polys_valid.boundary
# 2. Split polygons by lines
merged_lines = shapely.ops.linemerge(tree_lines.values)
border_lines = shapely.ops.unary_union(merged_lines)
decomposition = shapely.ops.polygonize(border_lines)
# 3. Convert into a GeoSeries
poly_series = gpd.GeoSeries(list(decomposition))
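One possible continuation of that decomposition (a sketch, not a tested solution; it assumes tree_polys_valid has a 'height' column and a recent geopandas with sjoin's predicate argument): give each split piece the height of the tallest original crown that covers it, then dissolve the pieces back into one geometry per winning tree.

# build a GeoDataFrame of the split pieces
pieces = gpd.GeoDataFrame(geometry=poly_series, crs=tree_polys_valid.crs)
pieces["piece_id"] = pieces.index

# match each piece to the original crowns via its representative point
pieces["rep_point"] = pieces.representative_point()
joined = gpd.sjoin(
    pieces.set_geometry("rep_point"),
    tree_polys_valid[["height", "geometry"]],
    predicate="within",
    how="inner",
)

# for every piece keep only the tallest crown that covers it
winners = joined.sort_values("height", ascending=False).drop_duplicates("piece_id")

# dissolve the pieces by the index of the winning crown
result = (
    pieces.loc[winners["piece_id"], ["geometry"]]
    .assign(tree_index=winners["index_right"].values)
    .dissolve(by="tree_index")
)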

locating cell containing point in irregular 3D grid

I have an irregular 3D grid which looks something like this:
Typical dimensions of the grid are 100/100/100 cells. Each cell is spatially defined by the coordinates of its 8 corner nodes. The 4 vertices of each face of a cell are not necessarily co-planar, so I represent each face as a pair of triangles, and thus a cell as a polyhedron consisting of 12 triangles (2 per face). I am trying to locate the IJK index of the cell that contains an XYZ point using Python. I sequentially bisect the cell range in the I, J and K directions and test which half of the grid the point lies in, using the method described here: Testing whether a 3D point is inside a 3D polyhedron.
Unfortunately, this does not work in some cases. In the above figure, point A is physically outside the grid but inside the current bisection range (defined by the brown dotted lines), while point B is inside the grid but outside the current range. I think the reason for this is that the triangles representing the faces of the cells within the current range (e.g. the large brown triangles in the figure) are not co-planar with the triangles that comprise the individual cell faces within that range (e.g. those shaded yellow, blue etc.). I have tried to show this in 2D below:
The current bisection range is shown by the brown dotted line and brown vertices. Initially, the red point is within the current range. We bisect in the X direction (bisection 1) and the red point is within the current range (dotted brown line), so we discard the right half. We now bisect in the Y direction (bisection 2) and the red point is outside this range, so we discard the top half. We eventually arrive at the final step, where we have a single index in each of the I and J directions. As shown here, this places the red point in the wrong cell.
Would appreciate any suggestions for an alternative algorithm to the one I am currently trying to implement. Stepping back, I am actually interested in calculating the faces within the grid intersected by a series of line segments, so I am using the "point in a polyhedron" method as an intermediate step. I looked at geomdl, which can represent each face as a NURBS object but does not seem to implement intersection between a ray and a NURBS object. I also had a quick look at the Python bindings to CGAL, but that looked like a massive learning curve to climb, so I put it aside. Thanks in advance!
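For reference, the per-cell point-in-polyhedron test the question relies on can be written as a ray-casting check against a cell's 12 triangles; applying it to individual candidate cells (rather than to the super-cell of a bisection range) sidesteps the non-planar-face problem at the cost of more tests. A rough sketch (not the poster's code; each triangle is assumed to be a tuple of three NumPy corner points):

import numpy as np

def ray_intersects_triangle(origin, direction, v0, v1, v2, eps=1e-12):
    # Moeller-Trumbore ray/triangle intersection; True for a hit at t > 0
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(direction, e2)
    det = np.dot(e1, p)
    if abs(det) < eps:          # ray is parallel to the triangle plane
        return False
    inv_det = 1.0 / det
    s = origin - v0
    u = np.dot(s, p) * inv_det
    if u < 0.0 or u > 1.0:
        return False
    q = np.cross(s, e1)
    v = np.dot(direction, q) * inv_det
    if v < 0.0 or u + v > 1.0:
        return False
    t = np.dot(e2, q) * inv_det
    return t > eps              # intersection lies in front of the ray origin

def point_in_cell(point, cell_triangles, direction=np.array([1.0, 0.3, 0.7])):
    # an odd number of crossings of the cell's 12 triangles means the point is inside
    hits = sum(ray_intersects_triangle(point, direction, *tri) for tri in cell_triangles)
    return hits % 2 == 1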

Create polygon grid that covers Earth

TL;DR:
The exact question is: how do I generate semi-equal polygons of a desired size, e.g. around 100x100 m, 1000x1000 m, or 5000x5000 m, in a grid that will cover the Earth?
Background story:
I'm building a Python-based microservice that, for a given LAT,LON (WGS84), will return a JSON with some data, e.g. the matched country/city or the selected polygon grid cell.
The country/city/clutter part works fine so far, as I'm using a shapefile and an R-tree for a quick check whether a Point is within an area.
I'm struggling with the following case: imagine I have a high number of GPS-based samples with some data that I would like to, for example, average over some geo-bins (a grid).
I'm trying to divide the Earth into semi-rectangular areas (in the Mercator projection) that I could later use with "contains" or "within" functions.
Currently this is done by an SQL query with "GROUP BY", using SIN/COS and rounding.
(links to pictures: samples aggregated into bins)
Since I'm working in WGS84 with the shapefiles and the incoming request data, my idea was to jump into Mercator (or Web Mercator), generate geopandas Polygons, and use the to_crs function to jump back to WGS84.
import geopandas as gp
from shapely.geometry import Polygon

# world is assumed to be the geopandas sample dataset, e.g.
# world = gp.read_file(gp.datasets.get_path('naturalearth_lowres'))
world = world[(world.name != "Antarctica") & (world.name != "Fr. S. Antarctic Lands")]
world = world.to_crs({'init': 'epsg:3857'})
plotworld = world.plot(figsize=(20, 10))
plotworld.set_title("Mercator")
# Keep map proportionate
plotworld.axis('equal')

# Draw a sample polygon rectangle (in Mercator)
x_point_list = [0.5*1e7, 0.75*1e7, 0.75*1e7, 0.5*1e7]
y_point_list = [-0*1e7, 0*1e7, 0.25*1e7, 0.25*1e7]
polygon_geom = Polygon(zip(x_point_list, y_point_list))
crs = {'init': 'epsg:3857'}
polygon = gp.GeoDataFrame(index=[0], crs=crs, geometry=[polygon_geom])
polygon.plot(ax=plotworld, color='red')

# Transform back to WGS84
world = world.to_crs({'init': 'epsg:4326'})
polygon = polygon.to_crs({'init': 'epsg:4326'})
plotworld2 = world.plot(figsize=(20, 10))
polygon.plot(ax=plotworld2, color='red')
My question is: how do I generate semi-equal polygons of a desired size, e.g. around 100x100 m, 1000x1000 m, or 5000x5000 m, in a grid that will cover the Earth?
I've gone through a number of geopandas/shapely sites that show tutorials about shapes and bins, but none of them mentions how to draw/generate bins of a desired size.
I fully understand that the dimensions of the polygons will vary a bit, but that does not hurt me much.
Any help appreciated!
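For what it's worth, the Mercator round trip described in the question could be turned into an actual grid roughly like this (a sketch only; the 5000 m cell size and the extent are arbitrary assumptions):

import numpy as np
import geopandas as gp
from shapely.geometry import box

# build a fishnet of (roughly) fixed-size square cells in Web Mercator
cell_size = 5000                                       # metres in EPSG:3857
xmin, ymin, xmax, ymax = 0.5e7, 0.0, 0.55e7, 0.05e7    # small test window

xs = np.arange(xmin, xmax, cell_size)
ys = np.arange(ymin, ymax, cell_size)
cells = [box(x, y, x + cell_size, y + cell_size) for x in xs for y in ys]

grid = gp.GeoDataFrame(geometry=cells, crs='epsg:3857')
grid_wgs84 = grid.to_crs('epsg:4326')   # ready for "contains"/"within" checks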

How to select from the output of scipy Delaunay triangulation only simplices under certain volume (or under total line length)?

I am using Delaunay triangulation on a set of points, trying to isolate clusters of points in a regular pattern.
This is my first experience with the qhull.Delaunay object, so bear with me...
from scipy.spatial import Delaunay

# array: an (N, 2) array of the point coordinates
tri = Delaunay(array)
It currently looks like this:
I've found I can print(tri.simplices) to get the list. I want to isolate only the triangles that are in the obvious clusters, which I imagine could be done by removing those with a line length or volume over a certain threshold, but I'm unsure how to manipulate the result to do this.
Found the answer - posting in case it is useful for others.
The Delaunay output gives you a list of the coordinates of each point, and a nested list of which three points form each triangle.
To access their areas, first convert this into a list of Shapely polygons; then your polygons are your oyster.
from shapely.geometry.polygon import Polygon

# group the point coordinates of each simplex, then build one polygon per triangle
coord_groups = [tri.points[x] for x in tri.simplices]
polygons = [Polygon(x) for x in coord_groups]

# area of the first polygon
polygons[0].area
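From there, filtering by a threshold is straightforward (a sketch; the threshold values below are arbitrary, and the polygon perimeter stands in for the "total line length" of a triangle):

# keep only the triangles whose area is below a chosen threshold
max_area = 1.0
small_simplices = [s for s, poly in zip(tri.simplices, polygons) if poly.area < max_area]

# or filter on the total edge length instead, via the polygon perimeter
max_perimeter = 5.0
small_polygons = [poly for poly in polygons if poly.length < max_perimeter]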

Efficient way to measure region properties using shapely?

First of all, I apologize for posting this easy question. I have a polygon:
from shapely.geometry import Polygon
polygon = Polygon([(560023.4495758876400000, 6362057.3904932579000000),(560023.4495758876400000, 6362060.3904932579000000),(560024.4495758876400000, 6362063.3904932579000000),(560026.9495758876400000, 6362068.3904932579000000),(560028.4495758876400000, 6362069.8904932579000000),(560034.9495758876400000, 6362071.8904932579000000),(560036.4495758876400000, 6362071.8904932579000000),(560037.4495758876400000, 6362070.3904932579000000),(560037.4495758876400000, 6362064.8904932579000000),(560036.4495758876400000, 6362063.3904932579000000),(560034.9495758876400000, 6362061.3904932579000000),(560026.9495758876400000, 6362057.8904932579000000),(560025.4495758876400000, 6362057.3904932579000000),(560023.4495758876400000, 6362057.3904932579000000)])
My goal is to compute the minor and major axes of this polygon, following the example in the figure:
I found this example in scikit-image, but before using a second module I wanted to ask whether the shapely module has a method to calculate these measures.
Thanks in advance.
This question is a bit old, but I ran into this myself recently; here's what I did:
from shapely.geometry import Polygon, LineString
polygon = Polygon([(560023.4495758876400000, 6362057.3904932579000000),(560023.4495758876400000, 6362060.3904932579000000),(560024.4495758876400000, 6362063.3904932579000000),(560026.9495758876400000, 6362068.3904932579000000),(560028.4495758876400000, 6362069.8904932579000000),(560034.9495758876400000, 6362071.8904932579000000),(560036.4495758876400000, 6362071.8904932579000000),(560037.4495758876400000, 6362070.3904932579000000),(560037.4495758876400000, 6362064.8904932579000000),(560036.4495758876400000, 6362063.3904932579000000),(560034.9495758876400000, 6362061.3904932579000000),(560026.9495758876400000, 6362057.8904932579000000),(560025.4495758876400000, 6362057.3904932579000000),(560023.4495758876400000, 6362057.3904932579000000)])
# get the minimum bounding rectangle and zip coordinates into a list of point-tuples
mbr_points = list(zip(*polygon.minimum_rotated_rectangle.exterior.coords.xy))
# calculate the length of each side of the minimum bounding rectangle
mbr_lengths = [LineString((mbr_points[i], mbr_points[i+1])).length for i in range(len(mbr_points) - 1)]
# get major/minor axis measurements
minor_axis = min(mbr_lengths)
major_axis = max(mbr_lengths)
Shapely makes it easy to compute the MBR via minimum_rotated_rectangle, but the opposite sides do not appear to be of exactly equal length. Because of this, the code above calculates the length of each side and then takes the min/max.
First calculate the Minimum Bounding Rectangle (MBR) of the polygon - see the process described in How to find the minimum-area-rectangle for given points?, except that you start from the convex hull. In Shapely, use the .convex_hull property to calculate the convex hull of your polygon.
Then once you have the MBR, you can find the major/minor axes.
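In recent Shapely versions both steps are simple property accesses (a quick sketch reusing the polygon defined above):

# convex hull of the polygon, then its minimum rotated rectangle (the MBR)
hull = polygon.convex_hull
mbr = hull.minimum_rotated_rectangle
print(mbr.exterior.coords[:])   # the four corners (plus the repeated closing point)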
