Determine coordinates at highest point of raster - python

I have a raster from which I want to derive the coordinates of the highest point (elevation) in the raster.
Getting the highest elevation is easy, but I don't know how to get its coordinates.
What I have so far:
# required modules
from osgeo import gdal
from osgeo import osr
import numpy as np
import rasterio
# allow GDAL to use python exceptions
gdal.UseExceptions()
# save paths to the files needed
input_raster = 'data/dem.tif'
# open input raster file with error catching
try:
    ds = gdal.Open(input_raster)
except RuntimeError as err:
    print(err)
    exit(keep_kernel=True)
if ds is None:
    print('Unable to open %s' % input_raster)
    exit(keep_kernel=True)
#access size of file
cols = ds.RasterXSize
rows = ds.RasterYSize
#access band and data as numpy arrays
band = ds.GetRasterBand(1)
data1 = band.ReadAsArray(0, 0, cols, rows).astype(float)
# mask out nodata values
nodata_val = band.GetNoDataValue()
print(nodata_val)
data_masked = np.ma.masked_equal(data1, nodata_val)
# determine the highest elevation value and its coordinates
highest_val = data_masked.max()
geotransform = ds.GetGeoTransform()
originX = geotransform[0]
originY = geotransform[3]
pixelWidth = geotransform[1]
pixelHeight = geotransform[5]
I'm stuck and thankful for any advice.

Get the row/column indices of the max value(s):
indices = np.where(data_masked == data_masked.max())
Also decide what to do when there are multiple cells with the maximum value.
Then compute the coordinates with the geotransform. Note that np.where returns (rows, cols), so the column index goes with x and the row index with y:
x = originX + indices[1][0] * pixelWidth
y = originY + indices[0][0] * pixelHeight
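Putting it together, here is a minimal sketch using the variables from the question; the half-pixel shift to the cell centre is an optional assumption on my part:

import numpy as np

# flat index of the (first) maximum; unravel_index turns it into (row, col)
row, col = np.unravel_index(data_masked.argmax(), data_masked.shape)

# upper-left corner of that cell, from the geotransform
x = originX + col * pixelWidth
y = originY + row * pixelHeight

# optionally shift to the cell centre (pixelHeight is usually negative)
x_center = x + pixelWidth / 2.0
y_center = y + pixelHeight / 2.0

print(data_masked[row, col], x_center, y_center)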

Related

Write all numpy arrays to binary file in a loop

I have this code:
from osgeo import gdal
import numpy as np
ds = gdal.Open('image.tif')
# loop through each band
for bi in range(ds.RasterCount):
    band = ds.GetRasterBand(bi + 1)
    # Read this band into a 2D NumPy array
    ar = band.ReadAsArray()
    print('Band %d has type %s' % (bi + 1, ar.dtype))
    ar.astype('uint16').tofile("converted.raw")
As a result, I get the converted.raw file, but it only contains data from the last iteration of the for loop. How do I make a file that contains the data from all iterations together?
Use np.save and keep the output file open for the whole loop:
from osgeo import gdal
import numpy as np

ds = gdal.Open('image.tif')
# loop through each band, appending each array to the same open file
with open("converted.raw", "wb") as outfile:
    for bi in range(ds.RasterCount):
        band = ds.GetRasterBand(bi + 1)
        # Read this band into a 2D NumPy array
        ar = band.ReadAsArray()
        print('Band %d has type %s' % (bi + 1, ar.dtype))
        np.save(outfile, ar.astype('uint16'))
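A note on the format: np.save writes each array in NumPy's .npy format, so the file gains a small header per band; if a headerless raw dump is required, one could instead call ar.astype('uint16').tofile(outfile) inside the same with-block. Here is a minimal sketch for reading the np.save version back, assuming band_count is the number of bands that were written:

import numpy as np

arrays = []
with open("converted.raw", "rb") as infile:
    # np.load advances the file position past each saved array
    for _ in range(band_count):
        arrays.append(np.load(infile))
stack = np.stack(arrays)  # shape: (bands, rows, cols)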

Find pixel coordinates from lat/long point in .geotiff using python and gdal

I have a (lat, long) coordinate describing the position of a point in a .geotiff image.
I wish to find the equivalent pixel coordinates of the lat,long ones inside the image.
I succeeded using gdaltransform from the command line with the following instruction:
gdaltransform -i -t_srs epsg:4326 /path/imagename.tiff
-17.4380493164062 14.6951949085676
But I would like to perform this kind of conversion from Python code. I tried the following:
from osgeo import osr
source = osr.SpatialReference()
source.ImportFromUrl(path + TIFFFilename)
target = osr.SpatialReference()
target.ImportFromEPSG(4326)
transform = osr.CoordinateTransformation(target,source )
point_xy = np.array(transform.TransformPoint(-17.4380493164062,14.6951949085676))
But it returns this error:
NotImplementedError: Wrong number or type of arguments for overloaded function 'CoordinateTransformation_TransformPoint'.
Possible C/C++ prototypes are:
OSRCoordinateTransformationShadow::TransformPoint(double [3])
OSRCoordinateTransformationShadow::TransformPoint(double [3],double,double,double)
What am I doing wrong? I tried to work around this error but without any success. Is there another way to do it?
EDIT 1 :
I achieved a single transformation via a gdaltransform command in the terminal:
gdaltransform -i -t_srs epsg:4326 /path/image.tiff
-17.4380493164062 14.6951949085676
As I need to retrieve the pixel in a pythonic way, I tried calling the command using subprocess like this:
# TRY 1:
subprocess.run(['gdaltransform','-i',' -t_srs','epsg:4326','/pat/img.tiff\n'], stdout=subprocess.PIPE)
# TRY 2 :
cmd = '''gdaltransform -i -t_srs epsg:4326 /home/henri/Work/imdex_visio/AllInt/Dakar_X118374-118393_Y120252-120271_PHR1A_2016-03-10T11_45_39.781Z_Z18_3857.tiff
-17.4380493164062 14.6951949085676'''
subprocess.Popen(cmd,stdout=subprocess.PIPE, shell=True)
But it does not work, maybe because of the way the command itself behaves: it does not return a result and exit, but rather displays the result and stays busy.
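As an aside on the subprocess attempts: gdaltransform reads its coordinate pairs from standard input, so the coordinates have to be fed in via input= rather than appended to the command line. A minimal sketch, with the image path as a placeholder:

import subprocess

result = subprocess.run(
    ['gdaltransform', '-i', '-t_srs', 'epsg:4326', '/path/image.tiff'],
    input='-17.4380493164062 14.6951949085676\n',
    capture_output=True, text=True)
print(result.stdout)  # pixel/line pair printed by gdaltransform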
According to the cookbook, you have the use of transform and point reversed: you should call Transform on the point, passing it the transformation, not the other way around. It also seems like you have source and target swapped, but since you do that twice, it still works out.
However, I believe that ImportFromUrl(path + TIFFFilename) will not work. Instead you can extract the spatial reference from the geotiff using gdal.
Something like the following should work:
from osgeo import osr, ogr, gdal
# Extract target reference from the tiff file
ds = gdal.Open(path + TIFFFilename)
target = osr.SpatialReference(wkt=ds.GetProjection())
source = osr.SpatialReference()
source.ImportFromEPSG(4326)
transform = osr.CoordinateTransformation(source, target)
point = ogr.Geometry(ogr.wkbPoint)
point.AddPoint(-17.4380493164062, 14.6951949085676)
point.Transform(transform)
print(point.GetX(), point.GetY())
This provides you with the coordinates in your geotiffs reference, however this is not pixel coordinates.
To convert the point to pixels you could use something like the following (the minus for the line might have to be flipped, based on where in the world you are)
def world_to_pixel(geo_matrix, x, y):
    """
    Uses a gdal geomatrix (gdal.GetGeoTransform()) to calculate
    the pixel location of a geospatial coordinate
    """
    ul_x = geo_matrix[0]
    ul_y = geo_matrix[3]
    x_dist = geo_matrix[1]
    y_dist = geo_matrix[5]
    pixel = int((x - ul_x) / x_dist)
    line = -int((ul_y - y) / y_dist)
    return pixel, line
So your final code would look something like:
from osgeo import osr, ogr, gdal
def world_to_pixel(geo_matrix, x, y):
    """
    Uses a gdal geomatrix (gdal.GetGeoTransform()) to calculate
    the pixel location of a geospatial coordinate
    """
    ul_x = geo_matrix[0]
    ul_y = geo_matrix[3]
    x_dist = geo_matrix[1]
    y_dist = geo_matrix[5]
    pixel = int((x - ul_x) / x_dist)
    line = -int((ul_y - y) / y_dist)
    return pixel, line
# Extract target reference from the tiff file
ds = gdal.Open(path + TIFFFilename)
target = osr.SpatialReference(wkt=ds.GetProjection())
source = osr.SpatialReference()
source.ImportFromEPSG(4326)
transform = osr.CoordinateTransformation(source, target)
point = ogr.Geometry(ogr.wkbPoint)
point.AddPoint(-17.4380493164062, 14.6951949085676)
point.Transform(transform)
x, y = world_to_pixel(ds.GetGeoTransform(), point.GetX(), point.GetY())
print(x, y)
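One caveat worth adding: with GDAL 3 and later the EPSG:4326 axis order defaults to latitude/longitude, so the AddPoint(lon, lat) call above may end up with its coordinates interpreted in the wrong order. If you hit that, the source SRS can be pinned to the traditional x/y order:

from osgeo import osr

source = osr.SpatialReference()
source.ImportFromEPSG(4326)
# keep the familiar (x, y) == (lon, lat) order under GDAL 3+
source.SetAxisMappingStrategy(osr.OAMS_TRADITIONAL_GIS_ORDER)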
The proposed solution will work in most cases, since the row/column rotation terms are typically zero, but they should at least be checked, or better, included:
import numpy as np
from osgeo import gdal
def world_to_pxl(gt, x, y):
    # 'Affine transformation': W = A * pxl + ul
    # world[2, 1] = a[2, 2] * pxl[2, 1] + upper_left[2, 1]
    world = np.array([[x], [y]])  # world coordinates
    upper_left = np.array(
        [
            [gt[0]], [gt[3]]  # upper left corner of image
        ]
    )
    a = np.array([[gt[1], gt[2]],
                  [gt[4], gt[5]]])
    # Reformulate: A^-1 * (W - ul) = pxl
    pxl = np.matmul(np.linalg.inv(a), (world - upper_left))
    pixel = pxl[0]  # x (pixel/column)
    line = pxl[1]   # y (line/row)
    return pixel, line
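As an alternative worth mentioning, GDAL's Python bindings ship helpers that perform this inversion for you, so the hand-rolled matrix algebra can usually be avoided. A minimal sketch, assuming GDAL 2+ where InvGeoTransform returns the inverse geotransform directly (older versions return a (success, inv_gt) tuple):

from osgeo import gdal

ds = gdal.Open(path + TIFFFilename)
gt = ds.GetGeoTransform()

inv_gt = gdal.InvGeoTransform(gt)  # inverse geotransform (None if not invertible)
px, py = gdal.ApplyGeoTransform(inv_gt, point.GetX(), point.GetY())
col, row = int(px), int(py)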

Python 2D array -- How to plug in x and retrieve y value?

I have been looking for an answer since yesterday but no luck. I have a 1D spectrum (.fits) file with a flux value at each wavelength. I have converted it into a 2D array (x, y) = (wavelength, flux) and want to write a program which will return the flux (y) at some assigned wavelengths (x). I have tried this:
#modules
import scipy
import numpy as np
import pyfits as pf
#Target Global Vaiables
hdulist_tg = pf.open('cutmask1-2.0001.fits')
hdr_tg = hdulist_tg[0].header
flux_tg = hdulist_tg[0].data
crval_tg = hdr_tg['CRVAL1'] #Starting wavelength
cdel_tg = hdr_tg['CDELT1'] #Wavelength axis width
wave_tg = crval_tg + np.arange(3183)*cdel_tg #Create an x-axis
wavelist = [6207,6315,6369,6438,6490,6565,6588]
wave_flux=[]
diff = 10
for wave in wave_tg:
    for flux in flux_tg:
        wave_flux.append((wave, flux))
for item in wave_flux:
    wave = item[0]
    flux = item[1]
    # Where I got my actual wavelength that exists in wave_tg
    diffmatch = np.abs(wave - wavelist[0])
    if diffmatch < diff:
        flux_wave = flux
        diff = diffmatch
        wavematch = wave
print wavelist[0], flux_wave, wavematch
but the program always returns the same flux value even though the wavelength is different. Please help...
I would skip the creation of the two dimensional table altogether and just use interp:
fluxvalues = np.interp(wavelist, wave_tg, flux_tg)
For the file you posted, your code doesn't work due to the hard-coded length of the wave_tg array. I would therefore recommend using
wave_tg = crval_tg + np.arange(len(flux_tg))*cdel_tg
Also, for some reason the file you posted doesn't actually seem to reach the wavelengths you are looking up. You might need to check that you are calculating the corresponding wavelengths correctly, or that you are looking up the right wavelengths.
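Putting both points together, a minimal sketch (assuming the same header keywords as in the question; pyfits can be swapped for astropy.io.fits):

import numpy as np
import pyfits as pf

hdulist_tg = pf.open('cutmask1-2.0001.fits')
flux_tg = hdulist_tg[0].data
hdr_tg = hdulist_tg[0].header

crval_tg = hdr_tg['CRVAL1']  # starting wavelength
cdel_tg = hdr_tg['CDELT1']   # wavelength step

# build the wavelength axis from the actual flux length, not a hard-coded 3183
wave_tg = crval_tg + np.arange(len(flux_tg)) * cdel_tg

wavelist = [6207, 6315, 6369, 6438, 6490, 6565, 6588]
fluxvalues = np.interp(wavelist, wave_tg, flux_tg)
print(fluxvalues)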
I've made some changes in your code:
using numpy to create wave_flux as an ndarray, via np.vstack(), np.repeat() and np.tile()
using fancy indexing to get the values matching your search
The resulting code is:
#modules
import scipy
import numpy as np
import pyfits as pf
#Target Global Vaiables
hdulist_tg = pf.open('cutmask1-2.0001.fits')
hdr_tg = hdulist_tg[0].header
flux_tg = hdulist_tg[0].data
crval_tg = hdr_tg['CRVAL1'] #Starting wavelength
cdel_tg = hdr_tg['CDELT1'] #Wavelength axis width
wave_tg = crval_tg + np.arange(3183)*cdel_tg #Create an x-axis
wavelist = [6207,6315,6369,6438,6490,6565,6588]
wave_flux = np.vstack((np.repeat(wave_tg, len(flux_tg)),
                       np.tile(flux_tg, len(wave_tg)))).transpose()
wave_ref = wavelist[0]
diff = 10
print wave_flux[ np.abs(wave_flux[:,0]-wave_ref) < diff ]
Which will return a sub-group of wave_flux with the wave values in column 0 and flux values in column 1:
[[ 6197.10300138 500.21020508]
[ 6197.10300138 523.24102783]
[ 6197.10300138 510.6390686 ]
...,
[ 6216.68436446 674.94732666]
[ 6216.68436446 684.74255371]
[ 6216.68436446 712.20098877]]
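One practical note: the repeat/tile table has len(wave_tg) * len(flux_tg) rows, which gets large quickly. If only the nearest sample per target wavelength is needed, a plain argmin over the 1-D wavelength axis avoids building the table. A minimal sketch, assuming wave_tg, flux_tg and wavelist as defined above:

import numpy as np

for wave_ref in wavelist:
    i = np.argmin(np.abs(wave_tg - wave_ref))  # index of the nearest wavelength
    print(wave_ref, wave_tg[i], flux_tg[i])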

Python GDAL: Georeference array using other file for projection

I have an array of data; for each point I know its latitude and longitude, and I'd like to write the data to a GTiff with the projection taken from another file. How do I properly georeference the new file?
This is what I'm attempting just now:
import numpy as np
import gdal
from gdalconst import *
from osgeo import osr
def GetGeoInfo(FileName):
    SourceDS = gdal.Open(FileName, GA_ReadOnly)
    GeoT = SourceDS.GetGeoTransform()
    Projection = osr.SpatialReference()
    Projection.ImportFromWkt(SourceDS.GetProjectionRef())
    return GeoT, Projection
def CreateGeoTiff(Name, Array, driver,
                  xsize, ysize, GeoT, Projection):
    DataType = gdal.GDT_Float32
    NewFileName = Name + '.tif'
    # Set up the dataset
    DataSet = driver.Create(NewFileName, xsize, ysize, 1, DataType)
    # the '1' is for band 1.
    DataSet.SetGeoTransform(GeoT)
    DataSet.SetProjection(Projection.ExportToWkt())
    # Write the array
    DataSet.GetRasterBand(1).WriteArray(Array)
    return NewFileName
def ReprojectCoords(x, y, src_srs, tgt_srs):
    trans_coords = []
    transform = osr.CoordinateTransformation(src_srs, tgt_srs)
    x, y, z = transform.TransformPoint(x, y)
    return x, y
# Some Data
Data = np.random.rand(5,6)
Lats = np.array([-5.5, -5.0, -4.5, -4.0, -3.5])
Lons = np.array([135.0, 135.5, 136.0, 136.5, 137.0, 137.5])
# A raster file that exists in approximately the same region.
RASTER_FN = 'some_raster.tif'
# Open the raster file and get the projection, that's the
# projection I'd like my new raster to have, it's 'projected',
# i.e. x, y values are numbers of pixels.
GeoT, TargetProjection = GetGeoInfo(RASTER_FN)
# Meanwhile my raster is currently in geographic coordinates.
SourceProjection = TargetProjection.CloneGeogCS()
# Get the corner coordinates of my array
LatSize, LonSize = len(Lats), len(Lons)
LatLow, LatHigh = Lats[0], Lats[-1]
LonLow, LonHigh = Lons[0], Lons[-1]
# Reproject the corner coordinates from geographic
# to projected...
TopLeft = ReprojectCoords(LonLow, LatHigh, SourceProjection, TargetProjection)
BottomLeft = ReprojectCoords(LonLow, LatLow, SourceProjection, TargetProjection)
TopRight = ReprojectCoords(LonHigh, LatHigh, SourceProjection, TargetProjection)
# And define my Geotransform
GeoTNew = [TopLeft[0], (TopLeft[0]-TopRight[0])/(LonSize-1), 0,
TopLeft[1], 0, (TopLeft[1]-BottomLeft[1])/(LatSize-1)]
# I want a GTiff
driver = gdal.GetDriverByName('GTiff')
# Create the new file...
NewFileName = CreateGeoTiff('Output', Data, driver, LatSize, LonSize, GeoTNew, TargetProjection)
If all you want to do is save the data to a raster for use in QGIS, you can simply construct a new Geotiff (or any other GDAL format) from your data. There is no need for a 'target raster' unless you want to do some form of reprojection or interpolation.
Here is an example:
import gdal
import osr
import numpy as np
data = np.random.rand(5,6)
lats = np.array([-5.5, -5.0, -4.5, -4.0, -3.5])
lons = np.array([135.0, 135.5, 136.0, 136.5, 137.0, 137.5])
xres = lons[1] - lons[0]
yres = lats[1] - lats[0]
ysize = len(lats)
xsize = len(lons)
ulx = lons[0] - (xres / 2.)
uly = lats[-1] - (yres / 2.)
driver = gdal.GetDriverByName('GTiff')
ds = driver.Create('D:\\test.tif', xsize, ysize, 1, gdal.GDT_Float32)
# this assumes the projection is Geographic lat/lon WGS 84
srs = osr.SpatialReference()
srs.ImportFromEPSG(4326)
ds.SetProjection(srs.ExportToWkt())
gt = [ulx, xres, 0, uly, 0, yres ]
ds.SetGeoTransform(gt)
outband = ds.GetRasterBand(1)
outband.WriteArray(data)
ds = None
In this example I assumed that your lat/lon values refer to the center of a pixel; since GDAL works with the pixel edge, shifting by half a pixel size is necessary.
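Most GIS tools expect a north-up raster (negative y-resolution, first row at the northernmost latitude). Assuming the rows of data run south to north like lats (my assumption, the question does not say), the corresponding lines of the example above would become:

import numpy as np

data_north_up = np.flipud(data)      # flip rows so row 0 is the northernmost latitude
uly = lats[-1] + (yres / 2.)         # top edge of the topmost row
gt = [ulx, xres, 0, uly, 0, -yres]   # negative y step: y decreases row by row
ds.SetGeoTransform(gt)
ds.GetRasterBand(1).WriteArray(data_north_up)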

Extract Point From Raster in GDAL

I have a raster file and a WGS84 lat/lon point.
I would like to know what value in the raster corresponds with the point.
My feeling is that I should use GetSpatialRef() on the raster object or one of its bands and then apply an ogr.osr.CoordinateTransformation() to the point to map it into the raster's space.
My hope would then be that I could simply ask the rasters' bands what is at that point.
However, the raster object doesn't seem to have a GetSpatialRef() or a way to access a geo-located point, so I'm somewhat at a loss for how to do this.
Any thoughts?
Say I have a GeoTIFF file test.tif. Then the following code should look up the value somewhere near the pixel. I am not that confident about the part that looks up the cell, and will fix it if there is an error. The "GDAL Data Model" page should help.
Also, you may go to gis.stackexchange.com to find experts, if you haven't already.
import gdal, osr
class looker(object):
    """let you look up pixel value"""
    def __init__(self, tifname='test.tif'):
        """Give name of tif file (or other raster data?)"""
        # open the raster and its spatial reference
        self.ds = gdal.Open(tifname)
        srRaster = osr.SpatialReference(self.ds.GetProjection())
        # get the WGS84 spatial reference
        srPoint = osr.SpatialReference()
        srPoint.ImportFromEPSG(4326)  # WGS84
        # coordinate transformation
        self.ct = osr.CoordinateTransformation(srPoint, srRaster)
        # geotransformation and its inverse
        gt = self.ds.GetGeoTransform()
        dev = (gt[1] * gt[5] - gt[2] * gt[4])
        gtinv = (gt[0], gt[5] / dev, -gt[2] / dev,
                 gt[3], -gt[4] / dev, gt[1] / dev)
        self.gt = gt
        self.gtinv = gtinv
        # band as array
        b = self.ds.GetRasterBand(1)
        self.arr = b.ReadAsArray()
    def lookup(self, lon, lat):
        """look up value at lon, lat"""
        # get coordinate of the raster
        xgeo, ygeo, zgeo = self.ct.TransformPoint(lon, lat, 0)
        # convert it to pixel/line on band
        u = xgeo - self.gtinv[0]
        v = ygeo - self.gtinv[3]
        # FIXME this int() is probably bad idea, there should be
        # half cell size thing needed
        xpix = int(self.gtinv[1] * u + self.gtinv[2] * v)
        ylin = int(self.gtinv[4] * u + self.gtinv[5] * v)
        # look the value up
        return self.arr[ylin, xpix]
# test
l = looker('test.tif')
lon, lat = -100, 30
print l.lookup(lon, lat)
lat, lon = 28.816944, -96.993333
print l.lookup(lon, lat)
Yes, the API isn't consistent. The raster (the data source) has a GetProjection() method instead (which returns WKT).
Here is a function that does what you want (drawn from here):
import struct

from affine import Affine
from osgeo import gdal, osr

def extract_point_from_raster(point, data_source, band_number=1):
    """Return floating-point value that corresponds to given point."""
    # Convert point co-ordinates so that they are in same projection as raster
    point_sr = point.GetSpatialReference()
    raster_sr = osr.SpatialReference()
    raster_sr.ImportFromWkt(data_source.GetProjection())
    transform = osr.CoordinateTransformation(point_sr, raster_sr)
    point.Transform(transform)
    # Convert geographic co-ordinates to pixel co-ordinates
    x, y = point.GetX(), point.GetY()
    forward_transform = Affine.from_gdal(*data_source.GetGeoTransform())
    reverse_transform = ~forward_transform
    px, py = reverse_transform * (x, y)
    px, py = int(px + 0.5), int(py + 0.5)
    # Extract pixel value
    band = data_source.GetRasterBand(band_number)
    structval = band.ReadRaster(px, py, 1, 1, buf_type=gdal.GDT_Float32)
    result = struct.unpack('f', structval)[0]
    if result == band.GetNoDataValue():
        result = float('nan')
    return result
Its documentation is as follows (drawn from here):
spatial.extract_point_from_raster(point, data_source, band_number=1)
data_source is a GDAL raster, and point is an OGR point object. The
function returns the value of the pixel of the specified band of
data_source that is nearest to point.
point and data_source need not be in the same reference system, but
they must both have an appropriate spatial reference defined.
If the point does not fall in the raster, RuntimeError is raised.
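A minimal usage sketch (raster.tif and the lon/lat values are placeholders), showing how to give the OGR point a spatial reference before passing it in:

from osgeo import gdal, ogr, osr

data_source = gdal.Open('raster.tif')

# WGS84 point; it needs an assigned spatial reference so the function can reproject it
wgs84 = osr.SpatialReference()
wgs84.ImportFromEPSG(4326)
point = ogr.Geometry(ogr.wkbPoint)
point.AssignSpatialReference(wgs84)
point.AddPoint(-96.993333, 28.816944)  # lon, lat

print(extract_point_from_raster(point, data_source))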
project = self.ds.GetProjection()
srPoint = osr.SpatialReference(wkt=project)
Done. With that, the vector file has adopted the projection from the input raster file.
