Python GPS Radius / Distance

I have a list in a file "file.txt" with GPS coordinates, in the format "latitude, longitude". I will try to explain the example or code I would like in Python, the language I am trying to learn.
GPS = current position + RADIUS / MARGIN = 0.9 (900 meters)
The current GPS position would be read from the serial port "/dev/ttyS0", using a GPS module connected to a Raspberry Pi 3 (Raspbian).
I need to know whether my current position (within a RADIUS / MARGIN of 900 meters) is TRUE or FALSE according to the list of coordinates I have in "file.txt".
file.txt
-34.61517, -58.38124
-34.61517, -58.38124
-34.61527, -58.38123
-34.61586, -58.38121
-34.61647, -58.38118
-34.61762, -58.38113
-34.61851, -58.38109
-34.61871, -58.38109
-34.61902, -58.38108
-34.61927, -58.38108
-34.61953, -58.38108
-34.61975, -58.38106
-34.61979, -58.38112
-34.6198, -58.38113
-34.61981, -58.38115
-34.61983, -58.38116
-34.61986, -58.38117
-34.61993, -58.38118
-34.62011, -58.38119
-34.62037, -58.38121
-34.62059, -58.38122
-34.62075, -58.38122
-34.6209, -58.38122
-34.62143, -58.38117
-34.62157, -58.38115
-34.62168, -58.38115
-34.6218, -58.38114
-34.62191, -58.38115
-34.62199, -58.38116
-34.62206, -58.38119
-34.62218, -58.38123
-34.62227, -58.38128
-34.62234, -58.38134
-34.62241, -58.3814
-34.62249, -58.38149
-34.62254, -58.38156
-34.62261, -58.38168
-34.62266, -58.38179
-34.62273, -58.38194
-34.62276, -58.38201
-34.62283, -58.38238
-34.62282, -58.38261
-34.62281, -58.38291
-34.62281, -58.38309
-34.62281, -58.38313
-34.62281, -58.3836
-34.62281, -58.38388
-34.62282, -58.38434
-34.62282, -58.38442
-34.62283, -58.3845
-34.62283, -58.38463
-34.62285, -58.38499
-34.62287, -58.3853
-34.6229, -58.38581
-34.62291, -58.38589
-34.62292, -58.38597
-34.62297, -58.38653
-34.623, -58.3868
-34.62303, -58.3871
-34.623, -58.38713
-34.62299, -58.38714
-34.62298, -58.38715
-34.62298, -58.38716
-34.62297, -58.38717
-34.62297, -58.38728
-34.62297, -58.38735
-34.62298, -58.38755
-34.62299, -58.3877
-34.62305, -58.38829
-34.62308, -58.38848
-34.6231, -58.38865
-34.62311, -58.38874
-34.62316, -58.3892
-34.62318, -58.38933
Is this possible in Python?
Thanks in advance (:

I'm not sure this is exactly what you wanted to know. This solution addresses the problem "Given a point and my current position, is my distance from that point less than a specific value?".
If that's the case and the distances are small enough (less than about 1 km), you can use the Pythagorean theorem:
distance = c * 6371 * (pi/180) * sqrt((currentPosition.lat - targetLat)**2 + (currentPosition.long - targetLong)**2)
where c is a correction coefficient you have to determine for your area (in Italy it is about 0.8, for example; it mainly compensates for the fact that a degree of longitude shrinks with the cosine of the latitude). To calibrate it, divide a known real distance, which you can obtain with Google Maps, by the result you get with c set to 1. 6371 is the Earth's radius in kilometers and pi is 3.14159. Then you can just compare the distance with the maximum distance you want:
distance < maxDistance
In this case, maxDistance is 0.9.
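As a minimal sketch of that approximation in Python (with the hand-tuned coefficient c replaced by the cos(latitude) scaling it approximates; function and variable names are just illustrative):
from math import pi, sqrt, cos, radians

EARTH_RADIUS_KM = 6371.0

def approx_distance_km(lat1, lon1, lat2, lon2):
    # Flat-Earth approximation: fine for distances well under a few km.
    mean_lat = radians((lat1 + lat2) / 2)
    dlat = lat1 - lat2
    dlon = (lon1 - lon2) * cos(mean_lat)  # a degree of longitude shrinks with latitude
    return EARTH_RADIUS_KM * (pi / 180) * sqrt(dlat ** 2 + dlon ** 2)

maxDistance = 0.9  # 900 meters, expressed in km
print(approx_distance_km(-34.61517, -58.38124, -34.61647, -58.38118) < maxDistance)  # True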
Note that this formula is an approximation but, given the short distances you're dealing with, it can be accurate enough. You should use spherical trigonometry if the distances are larger (for example, the approximation makes no sense if you want to compare two points on different continents). In that case, this is the formula you should use, the great-circle formula:
distance = 6371 * acos(sin(lat1)*sin(lat2) + cos(lat1)*cos(lat2)*cos(long1 - long2))
where (lat1, long1) and (lat2, long2) are the spherical coordinates of the two points you are measuring, expressed in radians. Then compare the distance with maxDistance as in the previous expression, and you're done.
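The great-circle formula translates almost directly into Python; a sketch, assuming latitudes and longitudes are given in degrees and converted to radians first:
from math import radians, sin, cos, acos

EARTH_RADIUS_KM = 6371.0

def great_circle_km(lat1, lon1, lat2, lon2):
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    # min() guards against rounding pushing the argument just above 1.0
    return EARTH_RADIUS_KM * acos(min(1.0, sin(lat1) * sin(lat2) + cos(lat1) * cos(lat2) * cos(lon1 - lon2)))

print(great_circle_km(-34.61517, -58.38124, -34.62318, -58.38933))  # roughly 1.2 km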
If you want to solve the problem for the whole set of points in the txt file, just read those values and iterate over them with a for loop.
See https://en.wikipedia.org/wiki/Great-circle_distance for further details.
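Putting it together for the question above, a sketch that reads file.txt and reports whether the current position is within 900 meters of any listed point (it assumes great_circle_km() from the previous snippet; reading the position from the GPS module on /dev/ttyS0 is left out):
MAX_DISTANCE_KM = 0.9  # 900 meters

def load_points(path="file.txt"):
    points = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line:
                lat_text, lon_text = line.split(",")
                points.append((float(lat_text), float(lon_text)))
    return points

def within_radius(current_lat, current_lon, points, max_km=MAX_DISTANCE_KM):
    # True if the current position is within max_km of at least one listed point
    return any(great_circle_km(current_lat, current_lon, lat, lon) <= max_km
               for lat, lon in points)

points = load_points()
print(within_radius(-34.62, -58.383, points))  # prints True or False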

Related

How to calculate unnested watersheds in GRASS GIS?

I am running into a few issues using the GRASS GIS module r.accumulate from Python. I use the module to calculate subwatersheds for over 7000 measurement points. Unfortunately, the output of the algorithm is nested, so all subwatersheds overlap each other. Running the r.accumulate subwatershed calculation takes roughly 2 minutes for either one or multiple points; I assume the bottleneck is loading the direction raster.
I was wondering if there is an unnested variant available in GRASS GIS and, if not, how to overcome the bottleneck of loading the direction raster every time the module is called. Below is a code snippet of what I have tried so far (resulting in a nested variant):
import grass.script as gs
from grass.pygrass.vector import VectorTopo

locations = VectorTopo('locations', mapset='PERMANENT')
locations.open('r')
points = []
for i in range(len(locations)):
    points.append(locations.read(i + 1).coords())

for j in range(0, len(points), 255):
    output = "watershed_batch_{}@Watersheds".format(j)
    gs.run_command("r.accumulate", direction='direction@PERMANENT', subwatershed=output, overwrite=True, flags="r", coordinates=points[j:j + 255])
    gs.run_command('r.stats', flags="ac", input=output, output="stat_batch_{}.csv".format(j), overwrite=True)
Any thoughts or ideas are very welcome.
I already replied to your email, but now I see your Python code and better understand your "overlapping" issue. In this case, you don't want to feed individual outlet points one at a time. You can just run
r.accumulate direction=direction@PERMANENT subwatershed=output outlet=locations
r.accumulate's outlet option can handle multiple outlets and will generate non-overlapping subwatersheds.
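In the grass.script style used in the question, that single call might look roughly like this (a sketch; the map names follow the question and the subwatershed output name is just an example):
import grass.script as gs

# One call with the whole outlet vector map: r.accumulate generates
# non-overlapping subwatersheds for all outlets at once.
gs.run_command("r.accumulate",
               direction="direction@PERMANENT",
               subwatershed="subwatersheds",
               outlet="locations",
               overwrite=True)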
The answer provided via email was very useful. To share it, I have included the code below to do an unnested subwatershed calculation. A small remark: I had to feed the coordinates in batches because the list of coordinates exceeded the maximum command-line length Windows could handle.
Thanks to @Huidae Cho, the call to r.accumulate to calculate subwatersheds and longest flow paths can now be done in one call instead of two separate calls.
The output is unnested basins: the larger subwatersheds are separated from the smaller subbasins instead of being clipped up into the smaller basins. This has to do with the fact that the output is in raster format, where each cell can only represent one basin.
gs.run_command('g.mapset', mapset='Watersheds')
gs.run_command('g.region', rast='direction@PERMANENT')
StationIds = list(gs.vector.vector_db_select('locations_snapped_new', columns='StationId')["values"].values())
XY = list(gs.vector.vector_db_select('locations_snapped_new', columns='x_real,y_real')["values"].values())

for j in range(0, len(XY), 255):
    output_ws = "watershed_batch_{}@Watersheds".format(j)
    output_lfp = "lfp_batch_{}@Watersheds".format(j)
    output_lfp_unique = "lfp_unique_batch_{}@Watersheds".format(j)
    gs.run_command("r.accumulate", direction='direction@PERMANENT', subwatershed=output_ws, flags="ar", coordinates=XY[j:j + 255], lfp=output_lfp, id=StationIds[j:j + 255], id_column="id", overwrite=True)
    gs.run_command("r.to.vect", input=output_ws, output=output_ws, type="area", overwrite=True)
    gs.run_command("v.extract", input=output_lfp, where="1 order by id", output=output_lfp_unique, overwrite=True)
To export the unique watersheds I used the following code. I had to convert the longest flow path to points because some of the longest flow paths intersected with the corner boundary of the neighbouring watershed and were thus not fully within their own subwatershed. See the image below, where the red line (longest flow path) touches the subwatershed boundary:
[image: red longest flow path touching the boundary of the adjacent subwatershed]
import numpy as np

gs.run_command('g.mapset', mapset='Watersheds')
lfps = gs.list_grouped('vect', pattern='lfp_unique_*')['Watersheds']
ws = gs.list_grouped('vect', pattern='watershed_batch*')['Watersheds']
files = np.stack((lfps, ws)).T
#print(files)
for file in files:
    print(file)
    ids = list(gs.vector.vector_db_select(file[0], columns="id")["values"].values())
    for idx in ids:
        idx = int(idx[0])
        expr = f'id="{idx}"'
        gs.run_command('v.extract', input=file[0], where=expr, output="tmp_lfp", overwrite=True)
        gs.run_command("v.to.points", input="tmp_lfp", output="tmp_lfp_points", use="vertex", overwrite=True)
        gs.run_command('v.select', ainput=file[1], binput="tmp_lfp_points", output="tmp_subwatersheds", overwrite=True)
        gs.run_command('v.db.update', map="tmp_subwatersheds", col="value", value=idx)
        gs.run_command('g.mapset', mapset='vector_out')
        gs.run_command('v.dissolve', input="tmp_subwatersheds@Watersheds", output="subwatersheds_{}".format(idx), col="value", overwrite=True)
        gs.run_command('g.mapset', mapset='Watersheds')
        gs.run_command("g.remove", flags="f", type="vector", name="tmp_lfp,tmp_subwatersheds")
I ended up with a vector map for each subwatershed.

how to calculate all points (latitude and longitude) between 2 points every 10 km?

I have 2 points on a map (lat,lon):
point 1: (40.437,-3.7325) call it Madrid
point 2: (48.853,2.351074) call it Paris.
Say I draw a line between the 2 points (Madrid --> Paris); I wish to have a list of all the lat/lon points between the 2 points.
I'm using the Haversine formula to calculate the distance between the 2 points, but I couldn't figure out how to advance 10 km at a time in the direction of point 2.
Looking for a solution in Python.
import pyproj
madrid = (40.437,-3.7325) # lat, lon
paris = (48.853, 2.351074)
# look, we're using proper geometry, not a sphere approximation!
g = pyproj.Geod(ellps="WGS84")
fwd_azimuth, back_azimuth, distance = g.inv(paris[1], paris[0], madrid[1], madrid[0])
# distance is 1051578.506043991
spacing = 10000  # it's in meters
npts = int(distance // spacing)
points = g.fwd_intermediate(paris[1], paris[0], fwd_azimuth, npts, spacing)
The result of list(points) is:
[105,
10000.0,
1060000.0,
array('d', [2.2841147476974495, 2.2173634769260593, 2.150819074399177, 2.0844804335061093, 2.018346454270266, 1.9524160433074975, 1.886688113784584, 1.8211615853779506, 1.755835384232539, 1.6907084429209114, 1.6257797004025019, 1.5610481019831033, 1.4965125992745119, 1.432172150154378, 1.3680257187263005, 1.3040722752800191, 1.2403107962518984, 1.1767402641855804, 1.113359667692788, 1.050168001414417, 0.9871642659817574, 0.9243474679779276, 0.8617166198995521, 0.7992707401185934, 0.7370088528443952, 0.6749299880859474, 0.6130331816143366, 0.5513174749254073, 0.48978191520261016, 0.4284255552800733, 0.36724745360588695, 0.30624667420553875, 0.24542228664561305, 0.18477336599765692, 0.12429899280225021, 0.06399825303331674, 0.0038702380625750266, -0.05608595537575356, -0.11587122522005133, -0.17548646411623947, -0.23493255945243385, -0.29421039339350097, -0.3533208429153225, -0.4122647798389152, -0.4710430708643054, -0.5296565776042774, -0.5881061566178034, -0.6463926594434017, -0.7045169326321932, -0.762479817780831, -0.8202821515641912, -0.8779247657678653, -0.935408487320482, -0.9927341383258104, -1.049902536094676, -1.1069144931766535, -1.163770817391641, -1.2204723118611298, -1.2770197750393688, -1.3334140007442956, -1.389655778188284, -1.4457458920087038, -1.501685122298269, -1.5574742446352121, -1.6131140301133118, -1.6686052453715994, -1.7239486526240326, -1.7791450096888881, -1.8341950700179903, -1.8890995827257604, -1.943859292618073, -1.9984749402208979, -2.0529472618088387, -2.1072769894334136, -2.161464850951148, -2.2155115700515773, -2.2694178662849596, -2.32318445508986, -2.3768120478205614, -2.4303013517742893, -2.4836530702182533, -2.5368679024164877, -2.589946543656604, -2.642889685276232, -2.695698014689428, -2.7483722154128123, -2.8009129670915507, -2.8533209455252124, -2.905596822693419, -2.957741266781308, -3.0097549422048506, -3.0616385096360426, -3.113392626027825, -3.165017944638929, -3.216515115058533, -3.267884783230732, -3.319127591478851, -3.3702441785296267, -3.421235179537185, -3.4721012261068913, -3.5228429463190163, -3.573460964752241, -3.6239559025070514, -3.674328377228901, -3.724579003131279]),
array('d', [48.77470199636237, 48.69636414316576, 48.61798666503888, 48.539569785051285, 48.46111372472616, 48.38261870405296, 48.304084941500015, 48.225512654026964, 48.14690205709707, 48.06825336468954, 47.98956678931156, 47.91084254201041, 47.83208083238533, 47.753281868599345, 47.67444585739103, 47.59557300408608, 47.51666351260883, 47.43771758549373, 47.35873542389656, 47.27971722760575, 47.20066319505346, 47.12157352332656, 47.04244840817768, 46.96328804403592, 46.88409262401768, 46.80486233993726, 46.72559738231744, 46.64629794039994, 46.566964202155795, 46.48759635429563, 46.4081945822799, 46.328759070328914, 46.24929000143294, 46.1697875573621, 46.0902519186762, 46.01068326473454, 45.931081773705536, 45.85144762257637, 45.77178098716245, 45.69208204211687, 45.612350960939736, 45.53258791598748, 45.45279307848195, 45.37296661851962, 45.29310870508056, 45.2132195060374, 45.13329918816417, 45.05334791714519, 44.97336585758367, 44.893353173010425, 44.813310025892434, 44.7332365776413, 44.65313298862177, 44.572999418159945, 44.49283602455168, 44.412642965070695, 44.33242039597678, 44.25216847252387, 44.17188734896792, 44.091577178574994, 44.01123811362902, 43.930870305439605, 43.85047390434976, 43.770049059743606, 43.68959592005384, 43.60911463276944, 43.52860534444292, 43.448068200697946, 43.3675033462365, 43.28691092484621, 43.20629107940761, 43.1256439519012, 43.04496968341459, 42.964268414149515, 42.88354028342878, 42.80278542970319, 42.72200399055841, 42.64119610272169, 42.56036190206868, 42.47950152363003, 42.39861510159805, 42.317702769333245, 42.23676465937086, 42.15580090342729, 42.07481163240646, 41.993796976406195, 41.91275706472454, 41.83169202586589, 41.75060198754728, 41.669487076704414, 41.58834741949782, 41.50718314131885, 41.42599436679559, 41.3447812197989, 41.26354382344815, 41.18228230011717, 41.10099677143994, 41.01968735831632, 40.93835418091777, 40.856997358692944, 40.77561701037325, 40.69421325397846, 40.6127862068221, 40.53133598551695, 40.44986270598043]),
None]
Why reinvent the wheel? There are free libraries which can do that:
C++: the original GeographicLib by Charles Karney
C#: the C# port, Flitesys.GeographicLib
You first measure the initial heading for a flight from Madrid to Paris (call Geodesic.Inverse(..)) and then you travel 10km, 20km, 30km and so on in that direction (by calling Geodesic.Direct(..)).
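In Python the same two steps are available through the geographiclib package; a sketch (the variable names are illustrative):
from geographiclib.geodesic import Geodesic

madrid = (40.437, -3.7325)    # lat, lon
paris = (48.853, 2.351074)

geod = Geodesic.WGS84
inv = geod.Inverse(madrid[0], madrid[1], paris[0], paris[1])
heading = inv["azi1"]         # initial bearing from Madrid towards Paris
total = inv["s12"]            # geodesic distance in meters

points = []
d = 10000                     # step along the geodesic every 10 km
while d < total:
    pos = geod.Direct(madrid[0], madrid[1], heading, d)
    points.append((pos["lat2"], pos["lon2"]))
    d += 10000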
This process is known as line densification in typical geo software packages. A good answer to this problem (in Python) can be found on GIS Stack Exchange:
https://gis.stackexchange.com/questions/372912/how-to-densify-linestring-vertices-in-shapely-geopandas
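For completeness, the shapely approach behind that link boils down to interpolating along the line at fixed steps; a sketch, assuming the line has already been projected to a metric CRS (shapely works on planar coordinates, so the example coordinates and lengths below are hypothetical and in meters):
from shapely.geometry import LineString

line = LineString([(440000, 4477000), (455000, 5412000)])  # example projected coordinates
step = 10000  # every 10 km
densified = [line.interpolate(d) for d in range(0, int(line.length) + 1, step)]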

Google ortools CVRP - different distance matrix by vehicle

In ortools, I know you can run the CVRP with different capacities per vehicle. However, can you pass a different distance matrix based upon the vehicle? For example, two cities may be 1000 miles apart, but it may be much faster to get there by airplane than by automobile, so in doing CVRP work I may wish to pass a time matrix, not an actual distance matrix. That time matrix would be different based upon the vehicle type.
Should be close to this:
callback_indices = []
for vehicle_idx in range(data['n_vehicles']):
    def vehicle_callback(from_index, to_index, i=vehicle_idx):
        from_node = manager.IndexToNode(from_index)
        to_node = manager.IndexToNode(to_index)
        return data['vehicle_costs'][i] * data['time_matrices'][i][from_node][to_node]
    callback_index = routing.RegisterTransitCallback(vehicle_callback)
    callback_indices.append(callback_index)

max_transit = 3000  # placeholder: maximum accumulated value allowed per vehicle
routing.AddDimensionWithVehicleTransits(
    callback_indices,
    0,            # slack
    max_transit,
    False,        # don't force the cumul variable to start at zero
    'DimensionName')
You can pass a vector/list of evaluators.
Here is the C++ API.
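If the per-vehicle times should also drive the routing objective (not just a dimension), the same registered callbacks can be attached as per-vehicle arc cost evaluators; a sketch, reusing the callback_indices list built above:
# Each vehicle gets its own arc cost: its cost-weighted time matrix.
for vehicle_idx, callback_index in enumerate(callback_indices):
    routing.SetArcCostEvaluatorOfVehicle(callback_index, vehicle_idx)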

Longitude of lunar nodes using Skyfield

I am trying to find out the longitude of ascending/descending moon nodes using Skyfield but unable to find any reference in documentation. Is it possible?
Also do any of the JPL Files provide this data already?
Update:
Skyfield’s almanac module now supports this computation directly! See: Lunar Nodes
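With a recent Skyfield, a sketch of that direct computation looks roughly like this (almanac.moon_nodes yields 0 for a descending and 1 for an ascending node):
from skyfield import almanac
from skyfield.api import load

ts = load.timescale()
eph = load('de421.bsp')

t0 = ts.utc(2018, 1, 1)
t1 = ts.utc(2018, 3, 1)
t, y = almanac.find_discrete(t0, t1, almanac.moon_nodes(eph))

for ti, yi in zip(t, y):
    print(ti.utc_iso(), 'ascending' if yi else 'descending')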
Original answer, for those wanting these details:
It is easy to at least find them relative to the J2000 ecliptic — which might be fine for dates far from the year 2000 as well, since I think that only the definition of ecliptic longitude changes with the passing years, but not latitude (which is what the nodes care about)?
In any case, you'd proceed like this. Let's say you want the ascending node. It must happen within the next 30 days, because that's more than a full orbit of the Moon, so let's look for the day on which the latitude of the Moon passes from negative to positive:
from skyfield.api import load

ts = load.timescale()
eph = load('de421.bsp')
earth = eph['earth']
moon = eph['moon']

t = ts.utc(2018, 1, range(14, 14 + 30))
lat, lon, distance = earth.at(t).observe(moon).ecliptic_latlon()
angle = lat.radians

for i in range(len(angle)):
    if angle[i] < 0 and angle[i + 1] > 0:
        break

print(t[i].utc_jpl(), angle[i])
print(t[i + 1].utc_jpl(), angle[i + 1])
The result is the discovery that the ascending node must happen sometime on January 31st:
A.D. 2018-Jan-31 00:00:00.0000 UT -0.0188679292421
A.D. 2018-Feb-01 00:00:00.0000 UT 0.00522392011676
To find the exact time, install the SciPy library, and ask one of its solvers to find the exact time at which the value reaches zero. You just have to create a little function that takes a number and returns a number, by converting the number to a Skyfield time and then the angle back to a plain number:
from scipy.optimize import brentq

def f(jd):
    t = ts.tt(jd=jd)
    angle, lon, distance = earth.at(t).observe(moon).ecliptic_latlon()
    return angle.radians

node_t = brentq(f, t[i].tt, t[i + 1].tt)
print(ts.tt(jd=node_t).utc_jpl())
The result should be the exact moment of the node:
A.D. 2018-Jan-31 18:47:54.5856 UT

Calculating star position in the sky with PyEphem

I have difficulties finding the current coordinates (RA, Dec) of a star in the sky.
On the net I have found only this one tutorial on how to use the ephem library: http://asimpleweblog.wordpress.com/2010/07/04/astrometry-in-python-with-pyephem/
As I understand it, I need to:
1. Create an observer:
telescope = ephem.Observer()
telescope.long = ephem.degrees('10')
telescope.lat = ephem.degrees('60')
telescope.elevation = 200
2. Create a body object for the star. Here is the trouble: I only have the (RA, Dec) coordinates for the star.
3. Calculate the position with .calculate(now()).
4. From the new coordinates, find the altitude.
One more question about the accuracy of this library: how accurate is it? I have compared the Julian date and sidereal time between this program and kstars, and they look quite similar, and also against this: http://www.jgiesen.de/astro/astroJS/siderealClock/
PS: Or maybe someone can recommend a better library for this purpose.
I guess you're looking for FixedBody?
telescope = ephem.Observer()
telescope.long = ephem.degrees('10')
telescope.lat = ephem.degrees('60')
telescope.elevation = 200
star = ephem.FixedBody()
star._ra = 123.123
star._dec = 45.45
star.compute(telescope)
print(star.alt, star.az)
I don't know about the accuracy; pyephem uses the same code as xephem, and e.g. the positions of the planets are given by truncated VSOP87 solutions (accuracy better than 1 arcsecond); kstars appears to use the full VSOP solution.
But this will really depend on your needs; e.g. don't rely on it to blindly guide your telescope, there are better solutions for that.
star = ephem.FixedBody(ra=123.123, dec=45.45)
In my case, creating the FixedBody with keyword arguments does not work; it should be:
star = ephem.FixedBody()
star._ra = ephem.hours('10:10:10')
star._dec = ephem.degrees('10:10:10')
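Putting the two fragments together, a complete Python 3 sketch (string RA/Dec so PyEphem parses hours and degrees rather than radians; the observer values are the ones from the question):
import ephem

telescope = ephem.Observer()
telescope.long = ephem.degrees('10')
telescope.lat = ephem.degrees('60')
telescope.elevation = 200
telescope.date = ephem.now()

star = ephem.FixedBody()
star._ra = ephem.hours('10:10:10')     # catalogue RA, hours:minutes:seconds
star._dec = ephem.degrees('10:10:10')  # catalogue Dec, degrees:minutes:seconds
star.compute(telescope)

print(star.alt, star.az)   # altitude and azimuth for this observer and time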
