I have a pandas dataframe containing MULTIPOLYGON coordinates in (LON, LAT) format. I need to use these coordinates to add a polygon to an ipyleaflet map, but I need to change the order of the coordinates to (LAT, LON).
df['Footprint'][0]
'MULTIPOLYGON (((-3.870231 39.827106,-3.49322 41.329609,-6.624273 41.739006,-6.931492 40.237854,-3.870231 39.827106)))'
# Here in locations, I have manually changed the order
from ipyleaflet import Map, Polygon

polygon = Polygon(
    locations=[(39.827106, -3.870231), (41.329609, -3.49322), (41.739006, -6.624273), (40.237854, -6.931492), (39.827106, -3.870231)],
    color="green",
    fill_color="green"
)
m = Map(center=(39.5531, -3.6914), zoom=6)
m.add_layer(polygon);
m
Any idea how to do this swap automatically instead of by hand?
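For reference, a minimal sketch of doing the swap automatically, assuming shapely is installed and the 'Footprint' column holds valid WKT strings:

from shapely import wkt
from ipyleaflet import Polygon

multipolygon = wkt.loads(df['Footprint'][0])
poly = list(multipolygon.geoms)[0]  # first (here: only) polygon of the multipolygon
# exterior ring coordinates come back as (lon, lat); flip them for ipyleaflet
locations = [(lat, lon) for lon, lat in poly.exterior.coords]

polygon = Polygon(locations=locations, color="green", fill_color="green")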
I have a satellite image file in the GEOS projection, and I have the longitude and latitude information as 2D numpy arrays. I want to get the (nearest) pixel value at a latitude/longitude of interest.
I have loaded the satellite image file into a dask data array:
from satpy import Scene
import matplotlib.pyplot as plt
import os
cwd = os.getcwd()
fn = os.path.join(cwd, 'EUMETSAT_data/1Jan21/MSG1-SEVI-MSG15-0100-NA-20210101185741.815000000Z-20210101185757-1479430.nat')
files = [fn]
scn = Scene(filenames=files, reader='seviri_l1b_native')
scn.load(["VIS006"])
da = scn['VIS006']
This is what the dask array looks like:
I read lon lats from the area attribute with the help of satpy:
lon, lat = scn['VIS006'].attrs['area'].get_lonlats()
print(lon.shape)
print(lat.shape)
(1179, 808)
(1179, 808)
This gives me a 2D numpy array each for longitude and latitude. They are coordinates, but I cannot use them directly for slicing or selecting.
What is the best practice/method to get the pixel value nearest to a given lat/lon?
How do I project the data onto lat/lon coordinates that I can then use for indexing to arrive at the pixel value?
In the end, I want the (nearest) pixel value at the lat/lon of interest.
Thanks in advance!!!
The AreaDefinition object you are using (.attrs['area']) has a few methods for getting different coordinate information.
area = scn['VIS006'].attrs['area']
col_idx, row_idx = area.get_xy_from_lonlat(lons, lats)
scn['VIS006'].values[row_idx, col_idx]
Note that row and column are flipped. The get_xy_from_lonlat method should work for arrays or scalars.
There are other methods for getting the X/Y coordinates of each pixel if that is what you're interested in.
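For a single point of interest this might look like the following (the lon/lat values here are just illustrative):

lon_pt, lat_pt = -3.5, 40.0  # hypothetical point of interest
col_idx, row_idx = area.get_xy_from_lonlat(lon_pt, lat_pt)
value = scn['VIS006'].values[row_idx, col_idx]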
You can find the location with the following:
import numpy as np

px, py = (23.0, 55.0)  # some location to take out values
# distance matrix from point (px, py)
dist = np.sqrt(np.cos(lat * np.pi / 180.0) * (lon - px)**2 + (lat - py)**2)
# find location where the distance is minimum
kkout = np.squeeze(np.where(np.abs(dist) == np.nanmin(dist)))
print(kkout)  # you will see the row and column where to take out data
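Once kkout gives the row and column, the pixel value can be read out of the array. A minimal sketch, assuming the VIS006 data loaded in the question:

row_idx, col_idx = kkout  # row and column of the nearest pixel
value = scn['VIS006'].values[row_idx, col_idx]
print(value)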
@Serge Ballesta - thanks for the direction.
Answering my own question.
Project the latitude and longitude (PlateCarree projection) onto the GEOS projection CRS to find x and y. Then use this x and y with xarray's nearest-neighbour .sel method to get the pixel value from the dask array.
import cartopy.crs as ccrs

data_crs = ccrs.Geostationary(central_longitude=41.5, satellite_height=35785831,
                              false_easting=0, false_northing=0, globe=None, sweep_axis='y')

lon = 77.541677  # longitude of interest
lat = 8.079148   # latitude of interest
# lon/lat are given in the geographic (PlateCarree) system
x, y = data_crs.transform_point(lon, lat, src_crs=ccrs.PlateCarree())
# ds is the xarray DataArray with x/y coordinates (presumably scn['VIS006'] from the question)
dn = ds.sel(x=x, y=y, method='nearest')
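To get the scalar pixel value out of the selected DataArray, something like this should work (a sketch):

pixel_value = float(dn.values)  # dn is a 0-d DataArray after the nearest-neighbour selection
print(pixel_value)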
Unfortunately my projection from Irish Transverse Mercator (ITM) to WGS84 latitude-longitude seems to have gone wrong as the plotted coordinates don't line up with a map of Dublin sourced from the CSO (see below).
My transformed coordinates plotted on a map of Dublin
The transformed data was sourced from the Irish Valuation Office, and the ITM X & Y coordinates were fed into a function adapted from a previous Stack Overflow discussion, which uses geopandas' built-in points_from_xy method to transform coordinates between Coordinate Reference Systems:
import geopandas as gpd

def create_geodf_from_GPS(df, latitude, longitude, crs):
    locations = gpd.points_from_xy(longitude, latitude)
    geo_df = gpd.GeoDataFrame(df, geometry=locations)
    geo_df.crs = crs
    return geo_df
VO_geo = create_geodf_from_GPS(VO, VO[" X ITM"], VO[" Y ITM"], crs = 'epsg:2157')
VO_geo = VO_geo.to_crs('epsg:4326')
Does anyone have any idea what may have gone wrong here?
Very simple fix thanks to @joris.
Altered function using x & y as arguments for gpd.points_from_xy instead of the previously mixed up longitude & latitude:
def create_geodf_from_GPS(df, x, y, crs):
    locations = gpd.points_from_xy(x, y)
    geo_df = gpd.GeoDataFrame(df, geometry=locations)
    geo_df.crs = crs
    return geo_df
Now plotting the data in WGS84 latitude-longitude works as expected:
VO_geo = create_geodf_from_GPS(VO, x=VO[" X ITM"], y=VO[" Y ITM"], crs = 'epsg:2157')
VO_geo.to_crs('epsg:4326').plot()
Note: the data had to be cleaned to remove obvious outliers by filtering out non-Dublin data with geopandas' (gpd) spatial join function:
VO_geo_clean = gpd.sjoin(VO_geo.to_crs('epsg:4326'), map_of_Dublin)
Result:
VO data plotted over a map of Dublin
I have a CSV file which contains the coordinates of points (more than 100 rows). Within the CSV file there are 2 columns: Latitude, Longitude.
These points are the top-left corners of some polygons (squares).
All of the polygons have the same size (for example 100 x 100 meters).
Latitude Longitude
56.37769816725615 -4.325049868061924
55.37769816725615 -3.325049868061924
51.749167440074324 -4.963575226888083
...
I can load the CSV into a dataframe, and I can make points (or 4 points per row) from the coordinates with GeoPandas.
But how can I make a polygon for each row which connects the 4 points?
Thanks for your help.
import pandas as pd
import geopandas

df = pd.read_csv('ExportPolyID.csv', nrows=10)
gdf = geopandas.GeoDataFrame(df, geometry=geopandas.points_from_xy(df.long, df.lat))
gdf['point2'] = gdf.translate(2, 2)
gdf['point3'] = gdf.translate(3, 3)
gdf['point4'] = gdf.translate(4, 4)
# After this I have 4 points for each row, but I can't connect them to create polygons
If you want to define the square in meters, make sure you are using a projected CRS (http://geopandas.org/projections.html#re-projecting).
Then you can use something like this (there might be more effective ways, but this one is explicit):
import geopandas as gpd
from shapely.geometry import Polygon

lat = [0, 2, 4]
lon = [0, 2, 4]

gdf = gpd.GeoDataFrame()
gdf['lat'] = lat
gdf['lon'] = lon

dim = 1  # define the length of the side of the square

geoms = []
for index, row in gdf.iterrows():
    ln = row.lon
    lt = row.lat
    geom = Polygon([(ln, lt), (ln + dim, lt), (ln + dim, lt - dim), (ln, lt - dim)])
    geoms.append(geom)

gdf['geometry'] = geoms
This will generate square polygons of size dim x dim from the given coordinates, with the point defined by those coordinates being the top-left corner.
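As a possibly more compact alternative, here is a sketch using shapely's box helper under the same assumptions (gdf with 'lon'/'lat' columns in a projected CRS, side length dim):

from shapely.geometry import box

# box(minx, miny, maxx, maxy): (ln, lt) is the top-left corner,
# so the square extends dim to the right and dim downwards
gdf['geometry'] = [box(ln, lt - dim, ln + dim, lt) for ln, lt in zip(gdf['lon'], gdf['lat'])]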
I have a dataframe with a column of linestrings. I want to convert the linestrings to their corresponding latitude/longitude so that I can plot them with basemap. My code is as follows:
import geopandas as gpd

gdf = gpd.read_file('./call2016.shp')  # read the data into a variable
streetsaslinestring = gdf.loc[:, "geometry"]  # getting the linestring column
Next, I want to convert the data to lon/lat as described:
streetsinlatlong = convert_etrs89_to_lonlat(streetsaslinestring)
streetsinlatlong.to_file('./streetslonglat.shp') #store it as .shp in order to plot it with basemap
m.readshapefile('./streetslonglat', 'streets') #read as shape file
The geometry column looks like this:
How can I convert the linestring data?
I think you can simply use
lons, lats = LineStringObject.coords.xy
coords.xy returns the x (longitude) and y (latitude) coordinate arrays separately.
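A sketch of applying this across the whole geometry column, assuming the shapefile declares its CRS and using geopandas' to_crs to reproject to lon/lat first:

# Each entry becomes a (lons, lats) pair of coordinate lists for one LineString
lonlat_per_street = [
    (list(line.coords.xy[0]), list(line.coords.xy[1]))
    for line in gdf.to_crs('epsg:4326').geometry
]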
I have tried debugging my code and I've realised that ultimately it breaks down when I try to save my AltAz coordinates into a .csv file, because it's not a numpy array, it's a SkyCoord object. Could someone suggest a simple way of converting a large table of Equatorial coordinates to AltAz, or how I can get my code to save to file?
import numpy as np
import healpy as hp
import astropy.time
import astropy.coordinates
import astropy.units as u

# Get time now
time = astropy.time.Time.now()
time.delta_ut1_utc = 0

# Geodetic coordinates of observatory (example here: Munich)
observatory = astropy.coordinates.EarthLocation(
    lat=48.21*u.deg, lon=11.18*u.deg, height=532*u.m)

# Alt/az reference frame at observatory, now
frame = astropy.coordinates.AltAz(obstime=time, location=observatory)

# Look up (celestial) spherical polar coordinates of HEALPix grid.
# nside and npix describe the HEALPix grid and are assumed to be defined earlier.
theta, phi = hp.pix2ang(nside, np.arange(npix))

# Convert to Equatorial coordinates
radecs = astropy.coordinates.SkyCoord(
    ra=phi*u.rad, dec=(0.5*np.pi - theta)*u.rad)

# Transform grid to alt/az coordinates at observatory, now
altaz = radecs.transform_to(frame)

# Transpose array from rows to columns
altaz_trans = np.transpose(altaz)
np.savetxt('altaz.csv', altaz_trans, fmt='%s', delimiter=',')
You'll want to use the to_string() method on altaz. That will give you a list of strings, each entry of which has an altitude and an azimuth number (they are separated by a space, so you can .split() them or whatever). Then you can write it out with numpy or your other library of choice.
Alternatively, if you want to go straight to a file, you can create an astropy Table with columns 'alt' and 'az' that you set equal to altaz.alt and altaz.az respectively. Then you can .write(format='ascii') that table.
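A minimal sketch of the Table route, assuming the altaz object from the question (the column names here are arbitrary):

from astropy.table import Table

t = Table([altaz.alt.deg, altaz.az.deg], names=('alt_deg', 'az_deg'))
t.write('altaz.csv', format='ascii.csv', overwrite=True)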