My question is very simple and can be understood in one line:
Is there a way or tool, using Google Maps, to get an overlay of the entire area that lies within a certain travel time of a given point?
I hope the question is clear, but I couldn't find anything related on the web.
If you have any information, I'll take it!
Here is what I mean illustrated by an example:
I search for a new place to live. I know the precise address of my office.
In this case I would like to overlay on Google Maps the area that is within a certain travel time of the office (let's say 30 minutes by car).
The result would be an elongated colored area along motorways (because you travel faster there) and narrower colored bands along ordinary roads.
I think you are looking for something like Mapnificent: it shows you the areas you can reach with public transportation in a given time (video).
A similar site with even more options is How Far Can I Travel. Here you enter your speed of travel together with either a travel time or a distance. Optionally, you can also specify how accurate you want the results to be.
Now, how can you create such maps yourself? See this related question, where the accepted answer explains, step by step, how you can get travel-time data from the Google Maps API.
Finally, for $8.75, you can buy the article Stata utilities for geocoding and generating travel time and travel distance information by Adam Ozimek and Daniel Miles that describes traveltime, a command that uses Google Maps to provide spatial information for data.
The traveltime command takes latitude and longitude information and finds travel distances between points, as well as the time it would take to travel that distance by either driving, walking, or using public transportation.
Besides the ones in @BioGeek's answer, here are some more:
The Nokia HERE Maps API can give you the exact shape of the output; they call it a time-based isoline. See here: Requesting a time based isoline
For travel times under 10 minutes, Isoscope is available at this address.
Also this looks promising: Route360
Update:
Route360 can be used for free in the following places:
Africa
Austria
Australia and New Zealand
British Isles
British Columbia
Czech Republic
Denmark
France
Germany
Italy
Malaysia, Singapore, and Brunei
Mexico
Norway
Portugal
Spain
Sweden
Switzerland
USA
I think you are looking for the Google Distance Matrix API. It returns not routes but the duration and distance for each origin/destination pair. It has a usage limit of 100 elements per 10 seconds.
So you can make an educated guess about the distance that matches the desired time of travel, choose six or eight equally distributed points on a circle of that radius, and query the corresponding durations. Then refine the distances according to the results and calculate intermediary points. This way you can get a (quite rough) map in a few iterations.
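For illustration, here is a rough Python sketch of that refinement loop, assuming the Distance Matrix web service and a placeholder API key; the helper names, the initial 30 km guess, and the example origin are made up for the sketch:
import math
import requests

API_KEY = "YOUR_KEY"          # placeholder; a real Distance Matrix key goes here
ORIGIN = (48.8566, 2.3522)    # made-up origin for the example
TARGET_MIN = 30               # desired travel time in minutes

def point_at(origin, bearing_deg, distance_km):
    # Destination point from origin along a bearing (spherical Earth approximation).
    R = 6371.0
    lat1, lon1 = map(math.radians, origin)
    brg, d = math.radians(bearing_deg), distance_km / R
    lat2 = math.asin(math.sin(lat1) * math.cos(d) +
                     math.cos(lat1) * math.sin(d) * math.cos(brg))
    lon2 = lon1 + math.atan2(math.sin(brg) * math.sin(d) * math.cos(lat1),
                             math.cos(d) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lon2)

def durations_min(origin, points):
    # One Distance Matrix request: driving time from origin to each point, in minutes.
    # A real implementation should also check each element's "status" field.
    resp = requests.get(
        "https://maps.googleapis.com/maps/api/distancematrix/json",
        params={"origins": "%f,%f" % origin,
                "destinations": "|".join("%f,%f" % p for p in points),
                "mode": "driving", "key": API_KEY}).json()
    return [e["duration"]["value"] / 60.0 for e in resp["rows"][0]["elements"]]

radii = {b: 30.0 for b in range(0, 360, 45)}   # initial guess: 30 km in every direction
for _ in range(3):                             # a few refinement passes
    points = [point_at(ORIGIN, b, r) for b, r in radii.items()]
    for (b, r), m in zip(list(radii.items()), durations_min(ORIGIN, points)):
        radii[b] = r * TARGET_MIN / m          # scale the radius toward the target time

isochrone = [point_at(ORIGIN, b, r) for b, r in radii.items()]
print(isochrone)   # rough polygon vertices to draw with the Maps JavaScript API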
I don't think there is a simple way of doing this, but here's an idea:
You'd first need to get the lat/long coordinates of your start position. You would then need to work out, say, 50 coordinates around that start position that are, say, 1 kilometer away from it (the answer here can help). You'd then need to go over each of these points and ask for the driving time to reach it from your start position using the Google Directions API.
You'd then need to go over the points again to find those still below the allowed travel time (e.g. 30 minutes in your question), move each of them another kilometer or so further out (in the same direction it originally moved from the start position), and repeat the driving-time requests until all points are above the allowed travel time. Finally you end up with 50 coordinates which you can plot onto Google Maps as a polygon using the Maps JavaScript API.
This method requires a lot of requests to Google, so you'll need to think about Google's limit on the number of requests you can make per day.
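As a rough sketch of that expansion loop (assuming the Directions web service, a placeholder API key, and geopy for stepping a point out along a bearing; the origin, step size, and limit are made up):
import requests
from geopy.distance import distance as geo_distance

API_KEY = "YOUR_KEY"            # placeholder
ORIGIN = (51.5074, -0.1278)     # made-up start position
LIMIT_MIN = 30                  # allowed travel time in minutes
N_BEARINGS = 50                 # number of directions to probe
STEP_KM = 1.0                   # how far to push a point out on each pass

def driving_minutes(origin, dest):
    # Driving time from origin to dest via the Directions web service, in minutes.
    resp = requests.get(
        "https://maps.googleapis.com/maps/api/directions/json",
        params={"origin": "%f,%f" % origin, "destination": "%f,%f" % dest,
                "mode": "driving", "key": API_KEY}).json()
    return resp["routes"][0]["legs"][0]["duration"]["value"] / 60.0

polygon = []
for i in range(N_BEARINGS):
    bearing = i * 360.0 / N_BEARINGS
    km = STEP_KM
    while True:   # push the point outwards until the limit is exceeded
        p = geo_distance(kilometers=km).destination(ORIGIN, bearing=bearing)
        if driving_minutes(ORIGIN, (p.latitude, p.longitude)) > LIMIT_MIN:
            break
        km += STEP_KM
    # p is the first point past the limit, so the boundary lies within one step of it.
    polygon.append((p.latitude, p.longitude))

print(polygon)   # vertices to draw as a google.maps.Polygon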
You can use the GraphHopper Directions API; this part of the API is also open source.
As we've only recently added public transit and this isochrone feature, the two do not yet work together. But this is planned, and until then you can enjoy road-network isochrones :)
Disclaimer: I'm one of the GraphHopper founders.
You can draw a circle around your current position and check for a road at an angle every X degrees.
Another idea is to use a contour plot and isolines.
I'm dealing with a problem that involves propagating N satellites (around 6k) over some period of time (usually 2 weeks) with a time step of 15 s. Additionally, I have a few observers with known positions on the Earth in ITRF and known fields of view (in Az/Alt). What I want to accomplish is to check when, and which, satellites are visible to those observers. Right now I'm using a combination of skyfield and pyephem to do the job. Skyfield gives me the ITRF coordinates of my observer and of a satellite (which I need to work out whether that satellite is visible to the observer), but unfortunately I also need to check whether the satellite is eclipsed, and I can't handle that efficiently. For that I use pyephem, just to check whether the satellite is eclipsed, but this requires additional computations. Maybe someone has had similar problems and knows a better method?
TL;DR:
I need to find every observable satellite over some period of time, given the observer's position on the Earth, its field of view (in Az/Alt), and a TLE catalog of satellites.
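For what it's worth, a minimal skyfield-only sketch of this workflow might look like the following; the TLE file name, observer location, and field-of-view window are placeholders, and skyfield's is_sunlit() may already cover the eclipse check without falling back to pyephem:
import numpy as np
from skyfield.api import load, wgs84

ts = load.timescale()
eph = load('de421.bsp')                      # Sun/Earth ephemeris for is_sunlit()
sats = load.tle_file('catalog.tle')          # placeholder TLE catalog
observer = wgs84.latlon(52.23, 21.01, elevation_m=100)   # placeholder observer

# Two weeks at a 15 s step.
t = ts.utc(2024, 1, 1, 0, 0, np.arange(0, 14 * 24 * 3600, 15))

MIN_ALT, AZ_MIN, AZ_MAX = 10.0, 90.0, 180.0  # placeholder field of view, degrees

for sat in sats:
    alt, az, _ = (sat - observer).at(t).altaz()
    sunlit = sat.at(t).is_sunlit(eph)        # eclipse check done by skyfield itself
    visible = (sunlit & (alt.degrees > MIN_ALT)
               & (az.degrees > AZ_MIN) & (az.degrees < AZ_MAX))
    if visible.any():
        first = int(np.argmax(visible))      # index of the first visible sample
        print(sat.name, t[first].utc_iso())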
I'd like to ask if anyone knows how I can get a list of all objects in the Solar System. I mean all the planets and their natural satellites, or the first 400 objects closest to the barycenter of the Solar System. The only thing I can get so far are the planets, and not even exactly the planets but their barycenters, so these aren't even the correct coordinates.
So my husband is a planetary astronomer who studies moons of the outer planets (and Pluto). I thought he'd be able to tell me the answer right off. His response was "It's complicated, there is no central repository."
Most astronomers use the SPICE Toolkit and its Python wrapper. This toolkit has built-in databases for many of the objects that astronomers are interested in.
You can also find the ephemeris of most bodies at JPL Horizons. He has only used it to get one ephemeris at a time, but it may have the ability to generate multiple ephemerides.
Information about minor planets (Pluto, Ceres, Pallas, etc.) can be found at the Minor Planet Center.
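As a small illustration of the SPICE route, a spiceypy sketch for querying one body's position could look like this; the kernel file names are placeholders you would first download from NAIF, and natural satellites need additional SPK kernels beyond the planetary ephemeris:
import spiceypy as spice

spice.furnsh('naif0012.tls')   # leap-second kernel (placeholder path, from NAIF)
spice.furnsh('de440.bsp')      # planetary ephemeris kernel (placeholder path, from NAIF)

et = spice.str2et('2024-01-01T00:00:00')

# Position of the Earth-Moon barycenter relative to the solar-system barycenter,
# in km, J2000 frame, no aberration correction. Moons of the outer planets need
# extra SPK kernels loaded the same way.
pos, light_time = spice.spkpos('EARTH BARYCENTER', et, 'J2000', 'NONE',
                               'SOLAR SYSTEM BARYCENTER')
print(pos)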
Maybe somebody knows something, since I am not able to find anything that makes sense to me.
I have a dataset of positions (lon, lat) and I want to snap them to the nearest road and calculate the distance between them.
So far I have discovered OSM; however, I can't find a working example of how to use the API from Python.
If any of you can help, I am thankful for every little detail.
I will try to figure it out myself in the meantime and publish the answer if successful (I couldn't find any similar question, so maybe it will help someone in the future).
Welcome! OSM is a wonderful resource, but it is essentially a raw dataset that you have to download and process yourself. There are a number of ways to do this; if you need a relatively small extract of the data (as opposed to the full planet file), the Overpass API is the place to look. Overpass turbo (docs) is a useful tool to help with this API.
Once you have the road network data you need, you can use a library like Shapely to snap your points to the road network geometry, and then either calculate the distance between them (if you need "as the crow flies" distance), or split the road geometry by the snapped points and calculate the length of the line. If you need real-world distance that takes the curvature of the earth into consideration (as opposed to the distance as it appears on a projected map), you can use something like Geopy.
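A minimal Shapely sketch of that snapping step might look like this; the road geometry and points are made-up planar coordinates, so real lon/lat data should be projected first (e.g. with pyproj) for the distances to be meaningful:
from shapely.geometry import Point, LineString

road = LineString([(0, 0), (100, 0), (200, 50)])   # placeholder road geometry
p1, p2 = Point(20, 10), Point(150, 40)             # placeholder GPS positions

# Snap each point to the nearest location on the road.
snapped1 = road.interpolate(road.project(p1))
snapped2 = road.interpolate(road.project(p2))

# Distance along the road between the two snapped points (same units as the coordinates).
along_road = abs(road.project(p2) - road.project(p1))
print(snapped1, snapped2, along_road)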
You may also want to look into the Map Matching API from Mapbox (full disclosure, I work there), which takes a set of coordinates, snaps them to the road network, and returns the snapped geometry as well as information about the route, including distance.
You might use sklearn's KDTree for this. You fill an array with the coordinates of candidate road nodes (I downloaded these from OpenStreetMap). Then use KDTree to build a tree from this array. Finally, use KDTree.query(your_point, k=1) to get the nearest point in the tree (which is the nearest node of the candidate roads). Since searching the tree is very fast (essentially log(N) for N points that form the tree), you can query lots of points.
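A short sketch of that idea; the road-node coordinates are placeholders standing in for nodes extracted from an OSM download, and note that a plain KDTree uses Euclidean distance on lat/lon, so for real data you would project first or use a BallTree with the haversine metric:
import numpy as np
from sklearn.neighbors import KDTree

# Placeholder (lat, lon) pairs standing in for road nodes pulled from OSM.
road_nodes = np.array([[52.520, 13.400], [52.530, 13.410], [52.510, 13.390]])
tree = KDTree(road_nodes)

dist, idx = tree.query(np.array([[52.515, 13.395]]), k=1)   # nearest node per query point
print(road_nodes[idx[0][0]], dist[0][0])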
I am trying to obtain a list of all the William Hill betting shops in the UK using the Google Places API. However, the Google Places API will only return up to 60 results. Does anyone know of a way around this, or an alternative approach to the problem I am having?
The Places API has an optional pagetoken parameter that is used to return the next set of results.
https://developers.google.com/places/web-service/search#find-place-responses
Edit: I didn't realize that you could only get a maximum of 60 results. To get around this, you can search smaller areas that won't each return more than 60 results.
You can use a combination of location (latitude and longitude) and radius. You can create a box and iteratively shift the coordinates left to right and up and down until your entire area is covered. You can use something like geopy to help you calculate the coordinates and then send the requests to the Google Places API.
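A rough sketch of that tiling approach, assuming the Places Nearby Search web service, a placeholder API key, and geopy to shift the search centre; the bounding box, grid spacing, and keyword are made up:
import requests
from geopy.distance import distance as geo_distance

API_KEY = "YOUR_KEY"           # placeholder
SOUTH_WEST = (51.3, -0.5)      # made-up bounding box around the search area
NORTH_EAST = (51.7, 0.3)
STEP_KM = 5                    # grid spacing; keep each cell well under 60 results
RADIUS_M = 4000

results = {}
lat = SOUTH_WEST[0]
while lat <= NORTH_EAST[0]:
    lon = SOUTH_WEST[1]
    while lon <= NORTH_EAST[1]:
        resp = requests.get(
            "https://maps.googleapis.com/maps/api/place/nearbysearch/json",
            params={"location": "%f,%f" % (lat, lon), "radius": RADIUS_M,
                    "keyword": "William Hill", "key": API_KEY}).json()
        for place in resp.get("results", []):
            results[place["place_id"]] = place   # de-duplicate overlapping cells
        lon = geo_distance(kilometers=STEP_KM).destination((lat, lon), bearing=90).longitude
    lat = geo_distance(kilometers=STEP_KM).destination((lat, SOUTH_WEST[1]), bearing=0).latitude

print(len(results))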
I want to use GeoDjango to do basic location searches. Specifically I want to give the search function a ZIP code/city/county and find all the ZIP codes/cities/counties within 5mi, 10mi, 20mi, etc. I found the following paragraph in the documentation:
Using a geographic coordinate system may introduce complications for the developer later on. For example, PostGIS does not have the capability to perform distance calculations between non-point geometries using geographic coordinate systems, e.g., constructing a query to find all points within 5 miles of a county boundary stored as WGS84. [6]
What does this exactly mean if I want to use PostGIS and to be able to do the searches described above across the USA? The docs suggest using a projected coordinate system to cover only a specific region. I need to cover the whole country so this I suppose is not an option.
Basically in the end I want to be able to find neighbouring ZIP codes/cities/counties given a starting location and distance. I don't really care how this is done on a technical level.
Also where would I find a database that contains the geographic boundaries of ZIP codes/cities/counties in the USA that I can import into a GeoDjango model?
UPDATE
I found a database that contains the latitude and longitude coordinates of all ZIP codes in the USA here. My plan is to import these points into a GeoDjango model and use PostGIS to construct queries that can find other points within x miles of a given point. This gets around the issue raised in the documentation because all the ZIP codes are treated as points instead of as polygons. This is fine for my use case because perfect accuracy is not something I care about.
The good: the data file is free
The bad: this data is from the 2000 census so it is quite dated
The somewhat hopeful: the United States Census Bureau conducts a census every 10 years and it is almost 2010
The conclusion: it's good enough for me
To get around the limitation in the quote, you can just take the centroid of the ZIP code region provided by the user, and then from that point find all ZIP code regions that intersect a circle of 5, 10, or however many miles around it. I'm not sure how that would be achieved in GeoDjango, but with PostGIS it's definitely possible.
The limitation you quoted basically says you can't write a query that says "give me all points that are within 5 miles on the inside of the border of Ohio."
In [1]: o = Place.objects.get(pk=2463583) # Oakland, CA
In [2]: sf = Place.objects.get(pk=2487956) # San Francisco, CA
In [3]: o.coords.transform(3410) # use the NSIDC EASE-Grid Global projection
In [4]: sf.coords.transform(3410) # use the NSIDC EASE-Grid Global projection
In [5]: o.coords.distance(sf.coords) # find the distance between Oakland and San Francisco (in meters)
Out[5]: 14401.942808571299
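Following the point-based plan from the update above, a minimal GeoDjango sketch of the "ZIP codes within X miles" query might look like this (the Zip model and field names are hypothetical):
from django.contrib.gis.db import models
from django.contrib.gis.measure import D

class Zip(models.Model):                       # hypothetical model for the imported points
    code = models.CharField(max_length=5)
    point = models.PointField(geography=True)  # geography column: PostGIS distances in metres

# All ZIP centroids within 5 miles of a given starting ZIP:
origin = Zip.objects.get(code="94607").point
nearby = Zip.objects.filter(point__distance_lte=(origin, D(mi=5))).exclude(code="94607")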