I have a table full of longitude/latitude pairs in decimal format (e.g., -41.547, 23.456). I want to display the values in "Easting and Northing"/UTM format. Does geopy provide a way to convert from decimal to UTM? I see in the code that it will parse UTM values, but I don't see how to get them back out, and the geopy Google Group has gone the way of all things.
Nope. You need to reproject your points, and geopy isn't going to do that for you.
What you need is libgdal and some Python bindings. I always use the bindings in GeoDjango, but there are other alternatives.
EDIT: It is just a mathematical formula, but it's non-trivial. There are thousands of different ways to represent the surface of the Earth. See here for a huge but incomplete list.
There are two parts to a geographic projection of the Earth: a coordinate system and a datum. The latter is essentially a three-dimensional model of the planet. When you say you want to convert latitude/longitude points to UTM values, you're missing a couple of pieces of the puzzle.
Let's assume that your lat/long points are based on the WGS84 datum, because that's a pretty common standard for lat/long points these days. You want to convert those points to a UTM coordinate system. But to which UTM coordinate system? There are 60 of them.
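To make the zone question concrete, here is a sketch (assuming the pyproj package, and that the input really is WGS84 lat/long) that derives the "natural" zone from the longitude and projects with the matching EPSG code. The helper name and the sample point are mine, not from the question:

```python
from pyproj import Transformer

def latlon_to_utm(lat, lon):
    """Project a WGS84 lat/long point into its natural UTM zone."""
    zone = int((lon + 180) // 6) + 1              # zones 1-60, 6 degrees wide each
    epsg = (32600 if lat >= 0 else 32700) + zone  # WGS84 / UTM north vs. south
    to_utm = Transformer.from_crs("EPSG:4326", f"EPSG:{epsg}", always_xy=True)
    easting, northing = to_utm.transform(lon, lat)
    return easting, northing, zone

# A point on Chicago's north side falls in zone 16.
easting, northing, zone = latlon_to_utm(41.9987, -87.6802)
print(zone, round(easting), round(northing))
```

This ignores the UTM edge cases (the Norway/Svalbard zone exceptions, lon = 180), which is fine for bulk conversion of ordinary points but worth knowing about.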
I think I may have over-complicated things. All I wanted was the DMS values (so 42.519540, -70.896716 becomes 42°31'10.34" N 70°53'48.18" W). You can get this by creating a geopy Point object with your latitude and longitude, then calling format(). However, as of this writing, format() is broken and requires the patch here.
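If you would rather not depend on a patched geopy, the decimal-to-DMS arithmetic is simple enough to write out by hand. A minimal sketch (the function name is mine):

```python
def to_dms(lat, lon):
    """Format decimal lat/long as degrees, minutes, seconds with hemisphere."""
    def fmt(value, positive, negative):
        hemi = positive if value >= 0 else negative
        value = abs(value)
        degrees = int(value)
        minutes = int((value - degrees) * 60)
        seconds = (value - degrees - minutes / 60) * 3600
        return f"{degrees}\u00b0{minutes}'{seconds:.2f}\" {hemi}"
    return f"{fmt(lat, 'N', 'S')} {fmt(lon, 'E', 'W')}"

print(to_dms(42.519540, -70.896716))
# 42°31'10.34" N 70°53'48.18" W
```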
I'm trying to accomplish a fairly simple goal, which is that I have a starting LatLong coordinate, I convert this into UTM, thus ending up with some particular zone number and letter (I'm using the Python library UTM), I offset the UTM eastings and northings by some values, and then convert everything back into LatLong. This generally seems to work fine, except for edge conditions if my offset causes the current UTM coordinates to go out of bounds of the current zone and into a new zone.
I'm very new to working with these geographical coordinate systems, so does it even make sense to ask whether I can do offsets while preserving the current zone, or does it wrap around? Is it possible to have UTM coordinates that are technically out of bounds for the current zone but still convert correctly back to the proper lat/long coordinates, or will they be wrong?
Thanks!
You can have coordinates outside a zone; see, e.g., the subsection "Overlapping grids" in the Wikipedia article on UTM.
In fact, this coordinate system (or better, MGRS) was designed with such cases in mind: on a battlefield you should not have to worry about changing zones or performing transformations (and hence about spherical or ellipsoidal coordinates).
Just test that your libraries allow such values: some libraries are stricter (they may require normalized coordinates). By UTM's design they should allow coordinates outside the proper zone, but a test is always better.
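One quick way to run that test yourself, sketched here with pyproj (assumed installed; the point and zones are arbitrary): project a point that nominally belongs to zone 33 using the zone-32 CRS, and check that the out-of-range easting still round-trips to the same lat/long.

```python
from pyproj import Transformer

# WGS84 <-> UTM zone 32N, used deliberately for a point in zone 33.
to_utm32 = Transformer.from_crs("EPSG:4326", "EPSG:32632", always_xy=True)
to_wgs84 = Transformer.from_crs("EPSG:32632", "EPSG:4326", always_xy=True)

lon, lat = 16.0, 48.0            # nominally zone 33, 7 degrees east of zone 32's meridian
e, n = to_utm32.transform(lon, lat)
lon2, lat2 = to_wgs84.transform(e, n)

print(round(e))                  # easting well beyond the nominal ~833,000 upper bound
print(round(lon2, 6), round(lat2, 6))
```

The pure-Python utm package mentioned in the question is an example of a stricter library: if I remember correctly, its to_latlon() raises on out-of-range values unless you pass strict=False.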
Maybe somebody knows something, since I am not able to find anything that makes sense to me.
I have a dataset of positions (lon, lat) and I want to snap them to the nearest road and calculate the distance between them.
So far I have discovered OSM; however, I can't find a working example of how to use the API with Python.
If any of you could help, I would be thankful for every little detail.
I will try to figure it out myself in the meantime and publish the answer if successful (I couldn't find any similar question, so maybe it will help someone in the future).
Welcome! OSM is a wonderful resource, but it is essentially a raw dataset that you have to download and process yourself. There are a number of ways to do this; if you need a relatively small extract of the data (as opposed to the full planet file), the Overpass API is the place to look. Overpass turbo (docs) is a useful tool to help with this API.
Once you have the road network data you need, you can use a library like Shapely to snap your points to the road network geometry, and then either calculate the distance between them (if you need "as the crow flies" distance), or split the road geometry by the snapped points and calculate the length of the line. If you need real-world distance that takes the curvature of the earth into consideration (as opposed to the distance as it appears on a projected map), you can use something like Geopy.
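The snapping step looks roughly like this with Shapely (assumed installed; the road geometry and GPS fix below are made up): project the point onto the line, then interpolate back to get the nearest on-road point.

```python
from shapely.geometry import LineString, Point

road = LineString([(0, 0), (10, 0)])        # stand-in for a real road geometry
gps = Point(3, 4)                            # a raw GPS fix, off the road

distance_along = road.project(gps)           # how far along the road the fix falls
snapped = road.interpolate(distance_along)   # nearest point on the road itself

print(snapped.x, snapped.y, distance_along)
```

Note that Shapely works in planar coordinates, so for real data you would reproject the lat/long points and road geometry into a suitable projected CRS first, exactly as discussed above.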
You may also want to look into the Map Matching API from Mapbox (full disclosure, I work there), which takes a set of coordinates, snaps them to the road network, and returns the snapped geometry as well as information about the route, including distance.
You might use sklearn's KDTree for this. You fill an array with the coordinates of candidate roads (I downloaded these from OpenStreetMap). Then use KDTree to build a tree from this array. Finally, use KDTree.query(your_point, k=1) to get the nearest point of the tree (which is the nearest node of the candidate roads). Since searching the tree is very fast (essentially log(N) for N points in the tree), you can query lots of points.
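A minimal sketch of that idea, assuming scikit-learn is installed (the road-node coordinates below are made up Chicago-area points, not real OSM data):

```python
import numpy as np
from sklearn.neighbors import KDTree

# Candidate road-node coordinates, e.g. as extracted from OpenStreetMap.
road_nodes = np.array([
    [41.9987, -87.6802],
    [41.9346, -87.7467],
    [41.8781, -87.6298],
])
tree = KDTree(road_nodes)

# Nearest road node to a GPS fix; idx indexes into road_nodes.
dist, idx = tree.query([[41.9300, -87.7400]], k=1)
print(idx[0][0])  # nearest is node 1
```

One caveat: a KDTree built on raw lat/long uses Euclidean distance on degrees, which is only a reasonable approximation over a small area. For proper great-circle nearest-neighbour queries, sklearn's BallTree with metric="haversine" (on coordinates converted to radians) is the usual choice.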
I am just a starter to Spatial Analysis and am stuck at a point.
I have a crime dataset where the points are given in latitude and longitude. I have another dataset (a shapefile of Chicago) and I would like to plot all the lat/long points on top of a map plot using the polygons from the shapefile.
The problem is that the shapefile contains polygon information in a different format which I don't recognize. I retrieved the shapefile from
https://data.cityofchicago.org/Facilities-Geographic-Boundaries/Boundaries-Neighborhoods/9wp7-iasj
From the above download I use the Neighborhoods_2012b.shp file
Latitude Longitude from crime data:
POINT (-87.680162979 41.998718085)
POINT (-87.746717696 41.934629749)
Polygon shapes in the Chicago Shapefile: (All are positive values)
POLYGON ((1182322.0429 1876674.730700001, 1182...
POLYGON ((1176452.803199999 1897600.927599996,...
I tried transforming the latitude and longitude information into different projections (Mercator), such as epsg:3857 and epsg:3395, but these projections give me both positive and negative values
epsg:3857:
POINT (-9760511.095493518 5160787.421333898)
POINT (-9767919.932699846 5151192.321624438)
I even tried transforming all lat/long into UTM (using the Python utm library), which does give me all positive values, but it still doesn't seem to be the right format, as the plots are at a very different scale.
Using the utm Python library (utm.from_latlon):
POINT (4649857.621612935 443669.2483944244)
POINT (4642787.870839979 438095.1726599361)
I am not sure how to handle this situation. Is there a way to know what type of projection is used, given the spatial points?
I'd be glad for any help.
The prj file says:
PROJCS["NAD_1983_StatePlane_Illinois_East_FIPS_1201_Feet",GEOGCS["GCS_North_American_1983",DATUM["D_North_American_1983",SPHEROID["GRS_1980",6378137.0,298.257222101]],PRIMEM["Greenwich",0.0],UNIT["Degree",0.0174532925199433]],PROJECTION["Transverse_Mercator"],PARAMETER["False_Easting",984250.0],PARAMETER["False_Northing",0.0],PARAMETER["Central_Meridian",-88.33333333333333],PARAMETER["Scale_Factor",0.999975],PARAMETER["Latitude_Of_Origin",36.66666666666666],UNIT["Foot_US",0.3048006096012192]]
I opened the layer with QGIS and it did not use the prj file directly. However, with the information of the prj file, you can use the CRS selector to retrieve it. Search parameters are : NAD83 Illinois East here. Choose the one that is in Feet as suggested by the prj file. EPSG = 6455 is a good one for instance. I think you now have enough information to continue...
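To put the crime points on the same axes as the polygons, reproject them into that CRS. A sketch with pyproj (assumed installed); EPSG:3435 (NAD83 / Illinois East, US feet) is another code matching the .prj above, used here alongside the answer's EPSG:6455 suggestion:

```python
from pyproj import Transformer

# WGS84 lat/long -> NAD83 Illinois East State Plane (US survey feet).
to_stateplane = Transformer.from_crs("EPSG:4326", "EPSG:3435", always_xy=True)

# One of the crime points from the question (note the lon, lat order).
x, y = to_stateplane.transform(-87.680162979, 41.998718085)
print(round(x), round(y))  # all-positive feet, same ballpark as the polygon vertices
```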
I am working on GPS data that has latitude/longitude, vehicle speed, vehicle IDs, and so on.
Each day, at different times, vehicle speeds differ for each side of the road.
I created this graph with Plotly Mapbox; the color difference relates to the speed of the vehicle.
So my question is: can I use any clustering algorithm to find which side of the road a vehicle is on? I tried DBSCAN but I could not find a clear answer.
It depends on the data you have about the different spots: if you know the time and speed at each point, you can estimate the range within which the next point should fall, and afterwards order the points as a function of distance. Otherwise it is going to be complicated with no more information than position and speed for all those points.
PS: there is a computationally heavy method to try to estimate the route, using tangents to determine the angle between segments of consecutive points.
Many GPS hardware sets will compute direction and insert this information into the standard output data alongside latitude, longitude, speed, etc. You might check whether your source data contains information about direction or "heading", often specified in degrees, where zero degrees is North, 90 degrees is East, and so on. You may need to parse the data and convert from binary or ASCII hexadecimal values to integer values, depending on the data structure specifications, which vary between hardware designs. If such data exists in your source, this may be a simpler and more reliable approach to determining direction.
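If the raw receiver output is available as NMEA sentences, for example, the course over ground sits in field 8 of $GPRMC. A hedged sketch (the sentence below is the standard documentation example, not real data; real receivers differ, and the checksum is not verified here):

```python
def rmc_course(sentence):
    """Extract course over ground (degrees, 0 = North) from a $GPRMC sentence."""
    fields = sentence.split(",")
    if not fields[0].endswith("RMC") or fields[2] != "A":  # 'A' = valid fix
        return None
    return float(fields[8])

nmea = "$GPRMC,123519,A,4807.038,N,01131.000,E,022.4,084.4,230394,003.1,W*6A"
print(rmc_course(nmea))  # 84.4, i.e. heading roughly east
```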
This is the first time I'm using GeoDjango with postGIS. After installation and some tests with everything running fine I am concerned about query performance when table rows will grow.
I'm saving longitudes and latitudes that I get from Google geocoding (WGS84, or SRID 4326) in a geometry point. My problem is that distance operations are very common in my application. I often need to get spots near a landmark. Geometry math is very complex, so even with a spatial index, it will probably take too long in the future, once there are more than 1000 spots in a nearby area.
So is there any way to project this geometry type to do distance operations faster? Does anyone know a Django library that can render a Google map containing some of these points?
Any advice on how to speed up spatial queries in GeoDjango?
If you can fit your working area into a map projection, that will always be faster, as there are fewer math calls necessary for things like distance calculations. However, if you have truly global data, suck it up: use geography. If you only have continental USA data, use something like EPSG:2163 http://spatialreference.org/ref/epsg/2163/
The more constrained your working area, the more accurate results you can get in a map projection. See the state plane projections for highly constrained, accurate projections for regional areas in the USA. Or UTM projections for larger sub-national regions.
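As a rough illustration of the trade-off (assuming pyproj; the two test points are arbitrary Chicago-area coordinates): planar distance in a projection like the EPSG:2163 mentioned above is a couple of subtractions and a hypot, while the geodesic computation iterates on the ellipsoid, yet within a well-chosen CRS the two agree closely.

```python
import math
from pyproj import Geod, Transformer

# Geodesic distance on the WGS84 ellipsoid (the "geography" way).
geod = Geod(ellps="WGS84")
_, _, geodesic_m = geod.inv(-87.6802, 41.9987, -87.7467, 41.9346)

# Planar distance in EPSG:2163 (US National Atlas Equal Area, metres).
to_2163 = Transformer.from_crs("EPSG:4326", "EPSG:2163", always_xy=True)
x1, y1 = to_2163.transform(-87.6802, 41.9987)
x2, y2 = to_2163.transform(-87.7467, 41.9346)
planar_m = math.hypot(x2 - x1, y2 - y1)

print(round(geodesic_m), round(planar_m))  # agree to within a couple of percent
```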
I'm researching on this topic. As far as I have found, coordinates that you get from geopy library are in SRID 4326 format, so you can store them in a geometry field type without problems. This would be an example of a GeoDjango model using geometry:
class Landmark(models.Model):
    point = models.PointField(spatial_index=True, srid=4326, geography=True)
    objects = models.GeoManager()
By the way, be very careful to pass longitude / latitude to the PointField, in that exact order. geopy returns latitude / longitude coordinates, so you will need to reverse them.
For transforming points from one coordinate system to another, we can use GEOS with GeoDjango. In the example I will transform a point in the famous Google projection 900913 back to 4326:
from django.contrib.gis.geos import Point
punto = Point(40, -3, srid=900913)
punto.transform(4326)
punto.wkt
Out[5]: 'POINT (0.0003593261136478 -0.0000269494585230)'
This way we can store coordinates in a projected system, where the distance math performs better.
For showing points on a Google map in the admin site interface, we can use this great article.
I have decided to go on with geography types, and I will convert them in the future, in case I need to improve performance.
Generally, GeoDjango will create and use spatial indexes on geometry columns where appropriate.
For an application dealing primarily with distances between points, the geography type (introduced in PostGIS 1.5 and supported by GeoDjango) may be a good fit. The GeoDjango documentation says it gives "much better performance on WGS84 distance queries" [link].