Using .isel() to select a noncontiguous range of data with xarray - python

I'm working with MERRA-2 data using xarray, and I'm attempting to plot data only over the Pacific Ocean. However, my data are in units of degrees East, such that the International Date Line is at index 0 with a value of -180, and the Prime Meridian/Greenwich Meridian is at index 288 (576/2) with a value approximating zero. To a rough estimation, the Pacific Ocean can be found in this coordinate system between indices 0 and 96, as well as from 480 to 575 (these indices correspond to 120 E and 120 W, respectively). How can I use .isel() (or .loc() or .sel(), if one of those is more appropriate) to choose JUST the region I'm describing?
For reference, the below picture is the result of using data.isel(lon=slice(96,480)).plot(), which is my current best guess. Do I need to use a cyclic point? To my understanding, that would solve the opposite problem I have.
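Since .isel() accepts arrays of indices as well as slices, one approach is to pass both index ranges as a single concatenated array, e.g. via np.r_. A minimal sketch, using a synthetic 576-point longitude grid standing in for the MERRA-2 data:

```python
import numpy as np
import xarray as xr

# Synthetic stand-in for the MERRA-2 longitude grid: 576 points from -180 eastward
lon = np.linspace(-180.0, 180.0, 576, endpoint=False)
data = xr.DataArray(np.arange(576.0), coords={"lon": lon}, dims="lon")

# Pacific region: indices 0..96 plus 480..575, as described in the question
pacific = data.isel(lon=np.r_[0:97, 480:576])
print(pacific.sizes["lon"])  # 193
```

Equivalently, xr.concat([data.isel(lon=slice(0, 97)), data.isel(lon=slice(480, 576))], dim='lon') selects the same points. Note that the selected longitude coordinate still jumps across the Prime Meridian, so for a seamless plot you may still want to roll the longitude coordinate so the Pacific is contiguous.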

Related

Python convert angles between two different planes of coordinates?

I do not quite know how to word my problem, but I am mostly interested in the logic I could use in Python, since my Python knowledge does not extend far into math.
I have two different planes: one is fixed, where north is always 0, east is 90, south is 180, and west is 270. I also have my own plane of reference, where 0 is always directly in front of me.
Now, this sounds simple to me mathematically, in my mind and on paper, but in Python I just don't have a good grasp on how to express it. Wherever I am facing is always 0 in my own plane of reference, but in the other plane of reference (north, east, south, west) I am facing some compass angle. For example, let's say I have something at 70 degrees in my own plane of reference, and I know that in the compass reference I am facing 270 degrees (which means that at 0 degrees in my reference, I am facing 270 degrees on the compass). I want to determine at which compass angle an object sits that is at around 170 degrees in my own plane of reference. Mathematically, I can do this by simply adding 100 degrees to the compass reference, and once I reach 360 I go back to 0, so the object is at 10 degrees on the compass.
I know the answer would be simple in terms of programming, maybe an if degrees > 360, then degrees = 0. But I have no idea if there is an easier way in Python to consider all cases (degrees < 0, degrees > 360).
You can use the modulo operator which returns the remainder of a division:
calculated_angle %= 360 # will keep the value between 0 & 359
So once your calculated angle (achieved by adding the angle from your frame of reference to the one from your compass) is stored in calculated_angle, the modulo operator ensures that if the angle exceeds 360 degrees, it wraps back to 0 and starts counting up again: 460 % 360 is 100. And if the angle goes below 0, it wraps back down from 360: -50 % 360 is 310.
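Putting that together, a minimal sketch (to_compass is a hypothetical helper name) could look like this:

```python
def to_compass(heading, relative_angle):
    """Convert an angle in the observer's own frame to a compass bearing.

    heading: compass bearing of the observer's own 0-degree direction
    relative_angle: angle of the target in the observer's frame
    """
    # modulo keeps the result in [0, 360), handling both overflow and negatives
    return (heading + relative_angle) % 360

print(to_compass(270, 100))  # 10
print(to_compass(0, 460))    # 100
print(to_compass(0, -50))    # 310
```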

Is there a way to get 20 or 30 km/h speed limit roads and pedestrian-only roads in OSMnx?

I am trying to get the length (in m) AND surface (in square m) of all the walkable streets in a given city, for example Paris. From the documentation, I found this code to get the area of all walkable streets in square meters.
How can I know if "pedestrian only" streets are also included in this, apart from all pavements?
Also, is there a way (separately from point 1) to get the zones/streets where traffic is limited to 20 or 30 km/h?
Below is the code from the documentation which shows the surface and length of all walkable streets in Paris.
# Get the network graph for all streets and paths that pedestrians can use
G = ox.graph_from_place('Paris, France', network_type='walk')
fig, ax = ox.plot_graph(G, node_size=0, bgcolor='k')
# what sized area does our network cover in square meters?
G_proj = ox.project_graph(G)
nodes_proj = ox.graph_to_gdfs(G_proj, edges=False)
graph_area_m = nodes_proj.unary_union.convex_hull.area
graph_area_m
# show some basic stats about the network, "street_length_total" shows the length of all streets in the upper graph
ox.basic_stats(G_proj, area=graph_area_m, clean_intersects=True, circuity_dist='euclidean')
# street_length_total = sum of all edge lengths in the undirected graph
How can I know if "pedestrian only" streets are also included in this, apart from all pavements?
I'd recommend familiarizing yourself with OSM's tagging, including how pedestrian related data are handled. Then you can easily inspect your graph, or convert it to a GeoDataFrame, or filter its nodes/edges by certain key:value tag pairs.
Also, is there a way (separately from point 1) to get the zones/streets where traffic is limited to 20 or 30 km/h?
Yes. If max speed data exist in OSM for a given edge, you will find it in the edge's maxspeed attribute. You can filter by these attribute values.
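As a sketch of that filtering: in OSM, maxspeed values are stored as strings and can be lists when OSMnx merges several ways into one edge, and the attribute is missing entirely on many edges. The DataFrame below simulates the edges GeoDataFrame you would get from ox.graph_to_gdfs (the real download call is shown in a comment):

```python
import pandas as pd

# Real data would come from:
#   edges = ox.graph_to_gdfs(G, nodes=False)
# Simulated here with a plain DataFrame for illustration.
edges = pd.DataFrame({"maxspeed": ["30", "50", ["20", "30"], None, "20"]})

def is_limited(ms, limits=("20", "30")):
    """True if the edge's maxspeed matches one of the target limits (km/h)."""
    if isinstance(ms, list):
        return any(m in limits for m in ms)
    return ms in limits

slow_edges = edges[edges["maxspeed"].apply(is_limited)]
print(len(slow_edges))  # 3
```

The same pattern works for the first sub-question: filter the edges GeoDataFrame on its highway attribute (e.g. values like "pedestrian" or "footway") to see which pedestrian-related way types made it into the graph.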

How can I connect the paths of two objects?

I have time series data for the position of two objects. The second object roughly follows the path of the first object. I want to join the two objects with a curved line that best represents the combined paths of the two objects. This is post-processing, so I already know the future paths of both objects. I can use information about where the second object will be to compute the path. Link to .csv file of source data in Google Drive - blue is columns 0,1 and yellow is columns 3,4.
My source data looks like this:
The objects are spaced fairly equally. Object two reaches the position of object one in around 50 frames. My initial approach was to take the past 25 frames of the blue object and the future 25 frames of the yellow object, and use scipy.signal.savgol_filter() to smooth the results (shown in pink).
positions = leading_object[frame_number - 25: frame_number]
positions += trailing_object[frame_number: frame_number + 25]
x, y = zip(*positions)
window_length = int(len(x) * 0.5)
if window_length % 2 == 0:  # savgol_filter requires an odd window length
    window_length -= 1
x = signal.savgol_filter(x, window_length, polyorder)
y = signal.savgol_filter(y, window_length, polyorder)
positions = list(zip(x, y))
This works okay, but the smoothed line jogs from one path to another. I'd like the path to be smooth.
Link to complete code used to generate animations.
You are essentially trying to do curve fitting for a curve that joins the two positions and interpolates some points of the two lines. As things stand the problem is a little overdetermined in that you have rather too many points. This leads to 'kinks' in the curve.
Perhaps choosing fewer points e.g. 5th, 10th, 15th of each partial trajectory to give 6 points plus your fixed endpoints would work better.
I would then choose a curve-fitting strategy that gives good continuity of the derivatives, such as a non-uniform rational B-spline (NURBS) or maybe a Chebyshev polynomial.
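As a sketch of that idea with SciPy: fit a smoothing parametric B-spline through a handful of subsampled points. The point coordinates below are made up for illustration; in practice they would be the subsampled frames from the two trajectories.

```python
import numpy as np
from scipy import interpolate

# Hypothetical subsampled points: a few from the end of the leading path
# and a few from the start of the trailing path (e.g. every 5th frame)
pts = np.array([[0.0, 0.0], [1.0, 0.2], [2.0, 0.5],
                [3.0, 1.1], [4.0, 1.9], [5.0, 3.0]])

# Fit a parametric cubic B-spline; s > 0 allows smoothing instead of
# forcing the curve through every point, which helps avoid kinks
tck, u = interpolate.splprep([pts[:, 0], pts[:, 1]], s=0.1, k=3)

# Evaluate the smooth joining curve at 100 parameter values
xs, ys = interpolate.splev(np.linspace(0.0, 1.0, 100), tck)
```

Increasing s trades fidelity to the sample points for smoothness of the joining curve.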

python interp2d strange checkers

I am interpolating an array (values_2d) of size 105 by 109 with scipy.interpolate.interp2d.
function = interp2d(x_coords_2d, y_coords_2d, values_2d)
interpolated = function(x_finer_coords, y_finer_coords)
I am having an issue where interpolated comes out with proper values in most locations, but in certain areas there are checkerboards and stripes of huge positive and negative numbers, when the data are supposed to be between 0 and ~300.
values_2d is a relatively continuous field with a few places with large jumps between adjacent coordinates; it is a map projection of latitude and longitude coordinates, so the coordinates are not a regular grid but are distorted the way a flat map is.
The picture on the right is values_2d and the picture on the left is interpolated.
Using scipy.interpolate.griddata instead, which accepts scattered (non-grid) sample points, yielded great results with no issues.
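For reference, a minimal sketch of griddata on scattered sample points, with synthetic data standing in for the distorted map-projection coordinates:

```python
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(0)

# Scattered sample coordinates, as produced by a distorted map projection
x = rng.random(200)
y = rng.random(200)
values = np.sin(3 * x) + np.cos(3 * y)

# Regular fine target grid; points outside the convex hull of the
# samples come back as NaN with method="linear"
xi, yi = np.meshgrid(np.linspace(0.2, 0.8, 50), np.linspace(0.2, 0.8, 50))
interpolated = griddata((x, y), values, (xi, yi), method="linear")
print(interpolated.shape)  # (50, 50)
```

Unlike interp2d, griddata triangulates the sample points first, so it makes no assumption that they lie on a rectangular grid.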

Difficulties with RA/Dec and Alt/Azi conversions with pyEphem

I'm trying to go from alt/azi to RA/Dec for a point on the sky at a fixed location, trying out pyEphem. I've tried a couple of different ways, and I get sort of the right answer, within a couple of degrees, but I'm expecting better, and I can't figure out where the problems lie.
I've been using Canopus as a test case (I'm not after stars specifically, so I can't use the in-built catalogue). So in my case, I know that at
stn = ephem.Observer()
# yalgoo station, wa
stn.long = '116.6806'
stn.lat = '-28.3403'
stn.elevation = 328.0
stn.pressure = 0 # no refraction correction.
stn.epoch = ephem.J2000
stn.date = '2014/12/15 14:32:09' #UTC
Stellarium, checked against other web sites, tells me Canopus should be at
azi, alt '138:53:5.1', '56:09:52.6' or in equatorial RA 6h 23m 57.09s/ Dec. -52deg 41' 44.6"
but trying:
cano = ephem.FixedBody()
cano._ra = '6:23:57.1'
cano._dec = '-52:41:44.0'
cano._epoch = ephem.J2000
cano.compute( stn)
print( cano.az, cano.alt)
>>>(53:22:44.0, 142:08:03.0)
about 3 degrees out. I've also tried the reverse,
ra, dec = stn.radec_of('138:53:5.1', '56:09:52.6')
>>>(6:13:18.52, -49:46:09.5)
where I'm expecting 6:23 not 6:13. Turning on refraction correction makes a small difference, but not enough, and I've always understood aberration and nutation were much smaller effects than this offset as well?
As a follow up, I've tried manual calculations, based on 'Practical Astronomy with your calculator'; so for dec:
LAT = math.radians(-28.340335)
LON = math.radians(116.680621667)
ALT = math.radians(56.16461)
AZ = math.radians(138.88475)
sinDEC = (math.sin( LAT)*math.sin( ALT)
+ math.cos( LAT)*math.cos( ALT)*math.cos( AZ) )
DEC = math.asin( sinDEC)
DEC_deg = math.degrees(DEC)
print( 'dec = ', DEC_deg )
>>>('dec = ', -49.776032754148986)
again, quite different from '56:09:52.6', but reasonably close to pyEphem - so now I'm thoroughly confused! I now suspect the problem is my understanding rather than pyEphem - could someone enlighten me about the correct way to do RA/Dec-Alt/Azi conversions, and why things are not lining up?!
First some notes:
Atmospheric refraction and the relative speed between observer and object produce at most ~0.6 degrees of error (near the horizon), which is nowhere near your error.
How can an altitude be over 90 degrees? You have the azimuth and altitude values swapped.
I put your observer data into my program and the result was similar to yours, but I visually searched for the star instead of entering its coordinates. The result was also about 3-4 degrees off on the RA axis:
RA=6.4h Dec=-52.6deg
azi=142.4deg alt=53.9deg
My engine is in C++, using Kepler's equation.
Now, what can be wrong:
My stellar catalog could be wrongly converted, or rotated by some margin, but I strongly doubt it is by 3 degrees. Perspective transforms can also add some error while rendering at 750 AU distance from the observer. I have never tested the Southern sky (it is not visible from my location).
We may be using a different Earth reference frame than the data you are comparing to. I found that some sites like NASA Horizons use a reference frame which does not correspond with my observations. Look here:
calculate the time when the sun is X degrees below/above the Horizon
At the start of that answer is a link to two sites with different reference frames; when you compare their results, they are off. The second link corresponds with my observations; the rest deals (source code included) with a Solar system simulation based on Kepler's equation. The other sublinks are also worth looking into.
I could have a bug in my simulation/data. I have calibrated my data against this engine, which could partially hide computation errors from my observer position, so read all of the above with that in mind.
You could be using a wrong time/Julian date to sidereal time conversion; if your time is off, the angles will not match...
How to resolve this?
Pick up your telescope, set up an equatorial coordinate system/mount on it, and measure RA/Dec and Azi/Alt for a known (distant) object in reality, then compare with the computed positions. Only this way can you decide which value is good or wrong (for the reference frame you are using). Do this on a star, not a planet!!! Do this at high altitude angles, not near the horizon!!!
How to transform between azimuthal and equatorial coordinates
I compute a transform matrix Earth representing Earth's coordinate system (upper right) in the heliocentric coordinate system as the global coordinate system (left), then I compute another matrix NEH representing the observer on Earth's surface (North, East, High/Altitude ... lower right).
After this it is just a matter of matrix and vector multiplications and conversions between Cartesian and spherical coordinate systems; look here:
Representing Points on a Circular Radar Math approach
for more insight into azimuthal coordinates. If you use just a simple equation like the one in your example, then you do not account for many things... The Earth's position is computed by Kepler's equation; its rotation is given by the daily rotation, with nutation and precession included.
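The Cartesian/spherical conversion step can be sketched like this (azimuth measured clockwise from north, altitude above the horizon, both in radians; the north/east/up axis convention is an assumption):

```python
import math

def azalt_to_cart(az, alt):
    # unit vector in the NEH (north, east, high) frame
    x = math.cos(alt) * math.cos(az)  # north component
    y = math.cos(alt) * math.sin(az)  # east component
    z = math.sin(alt)                 # up component
    return x, y, z

def cart_to_azalt(x, y, z):
    r = math.sqrt(x * x + y * y + z * z)
    az = math.atan2(y, x) % (2.0 * math.pi)  # wrap into [0, 2*pi)
    alt = math.asin(z / r)
    return az, alt

# Round-trip check with the azimuth/altitude values quoted above
az, alt = math.radians(142.4), math.radians(53.9)
az2, alt2 = cart_to_azalt(*azalt_to_cart(az, alt))
```

Once the direction is a Cartesian vector, rotating it between the NEH frame and the equatorial frame is plain matrix multiplication, as described above.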
I use 64-bit floating point values, which can introduce rounding errors, but not that large...
I use the geometric North Pole as the observer reference (this could add some serious error near the poles).
The biggest thing that can affect this is the speed of light, but that matters for near-Earth 'moving' objects like planets, not stars (except the Sun), because their computed position becomes visible only after some time... For example, the Sun-Earth distance is about 8 light-minutes, so we see the Sun where it was 8 minutes ago. If the ephemeris data is geometric only (does not account for this), it can lead to large errors if not handled properly.
Newer ephemeris models use gravity integration instead of Kepler's equation, so their data must be geometric, and the final output is then corrected by the time shift...
