How to perform spline interpolation on GPS coordinates? - python

I have GPS coordinates in a CSV file that I predicted using a regression model: just two columns with longitudes and latitudes that represent a race track. Now I want to plot it on Google Maps to see what it looks like.
When I do that, I notice that the curve is not smooth, which makes sense since I predicted those values with my regression model and they were not taken directly from a GPS.
I searched for how to solve this problem and found that a spline interpolation is usually used for this, but I have no idea how to apply it. All the examples I found on the internet assume that we have x (the data) and y (a function of x); in my case there is no function, I just give the data to the model and it predicts these values. So if I have only longitudes and latitudes, is it possible to do some sort of interpolation so that the curve looks smooth when I plot it?
Example:
Let's say this is my data:
latitudes = array([58.846563, 58.846573, 58.846586, 58.846601, 58.846618, 58.846637,
58.846658, 58.846681, 58.846705, 58.846731])
longitudes = array([9.903741, 9.903733, 9.903724, 9.903713, 9.9037 , 9.903686,
9.90367 , 9.903652, 9.903633, 9.903612])
When I plot this data, each point is connected to the next with a straight line, but what I want is to smooth it out. Is this possible when longitudes and latitudes are all I have? I'd appreciate any help.
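A parametric spline sidesteps the missing y = f(x): treat both longitude and latitude as functions of an auxiliary curve parameter u. A minimal sketch with SciPy's splprep/splev, using the sample arrays above:

```python
import numpy as np
from scipy.interpolate import splprep, splev

latitudes = np.array([58.846563, 58.846573, 58.846586, 58.846601, 58.846618,
                      58.846637, 58.846658, 58.846681, 58.846705, 58.846731])
longitudes = np.array([9.903741, 9.903733, 9.903724, 9.903713, 9.9037,
                       9.903686, 9.90367, 9.903652, 9.903633, 9.903612])

# Fit a parametric B-spline through the points; the curve parameter u
# plays the role of the missing "x", so no explicit y = f(x) is needed.
# s controls smoothing: s=0 interpolates exactly, larger s smooths more.
tck, u = splprep([longitudes, latitudes], s=0)

# Evaluate the spline on a dense parameter grid to get a smooth track.
u_fine = np.linspace(0, 1, 200)
lon_smooth, lat_smooth = splev(u_fine, tck)
```

Plotting lon_smooth against lat_smooth then gives a smooth curve through (with s=0) or near (with s>0) the predicted points; since the model's predictions are noisy, a small positive s is probably what you want.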

Related

Line Detection within Scatter Plots

I want to detect lines within Scatter Plots, using python.
Specifically, my data set is of the Cartesian coordinates of points. (This data was gathered using four ultrasound sensors on servos).
Here are some data sets,
And here are the lines I'd like to detect.
The problem is to write a Python program that returns the start and end points of high-certainty lines, given a list of points on the scatter plot.
The difficulty is that piecewise linear regression can't be applied directly, since the data is vertically stacked.
Is there a well-known solution to this problem? Or maybe an ingenious application of piecewise linear regression could work?
I'd really appreciate some functioning python code!
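One approach (a sketch, not a canonical solution) is sequential RANSAC: repeatedly fit the line with the most inliers, record its endpoints, remove those inliers, and repeat. Because distances are measured perpendicular to the candidate line, vertically stacked data (where y = f(x) regression breaks down) is handled naturally. The function name, thresholds, and the synthetic demo segments below are all illustrative:

```python
import numpy as np

def detect_lines(points, n_lines=2, threshold=0.05, iters=500,
                 min_inliers=10, seed=0):
    """Sequential RANSAC line detection returning (start, end) pairs."""
    rng = np.random.default_rng(seed)
    pts = np.asarray(points, dtype=float)
    lines = []
    for _ in range(n_lines):
        best_mask = None
        for _ in range(iters):
            i, j = rng.choice(len(pts), size=2, replace=False)
            d = pts[j] - pts[i]
            norm = np.linalg.norm(d)
            if norm == 0.0:
                continue
            normal = np.array([-d[1], d[0]]) / norm   # unit normal to the line
            dist = np.abs((pts - pts[i]) @ normal)    # perpendicular distances
            mask = dist < threshold
            if best_mask is None or mask.sum() > best_mask.sum():
                best_mask = mask
        if best_mask is None or best_mask.sum() < min_inliers:
            break
        inliers = pts[best_mask]
        # principal axis of the inliers gives the line direction;
        # extreme projections onto it give the segment endpoints
        centre = inliers.mean(axis=0)
        direction = np.linalg.svd(inliers - centre)[2][0]
        t = (inliers - centre) @ direction
        lines.append((centre + t.min() * direction,
                      centre + t.max() * direction))
        pts = pts[~best_mask]
        if len(pts) < min_inliers:
            break
    return lines

# demo on two synthetic segments, one of them vertical
t = np.linspace(0, 1, 50)
segment_a = np.c_[t, 0.5 * t]            # y = 0.5 x
segment_b = np.c_[np.full(50, 1.2), t]   # vertical line x = 1.2
lines = detect_lines(np.vstack([segment_a, segment_b]))
```

The threshold and min_inliers parameters would need tuning against the noise level of the ultrasound data.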

Plot the probability density function in a way that the output can be a smooth curve

I am trying to plot the pdf of a dataset in a way that the pdf appears as a smooth curve.
For that, I was using seaborn.kdeplot. The problem with this is that the dataset has a strict range, and the KDE plot tends to cross the range at both edges. To limit the pdf curve to the range I tried the clip parameter, but it makes the edges abrupt rather than smooth. The abrupt changes at the ends do not look good visually, so I am looking for other ways to plot the pdf.
Could you please provide some insights on this issue? Is there any other way that I can plot the pdf?
As an example, please see the following code:
import numpy as np
import seaborn as sns

data = np.random.uniform(0, 1, 100)
sns.kdeplot(data)
sns.kdeplot(data, clip=(0, 1))
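One common workaround (a sketch, not a seaborn feature) is boundary reflection: mirror the sample about each edge before estimating the density, so the kernel mass that would leak outside the range is folded back in and the curve stays smooth while still integrating to one on the interval. Using scipy.stats.gaussian_kde:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(42)
data = rng.uniform(0, 1, 100)
lo, hi = 0.0, 1.0

# Reflect the sample about both boundaries, fit the KDE on the
# augmented sample, and evaluate only inside [lo, hi].
augmented = np.concatenate([data, 2 * lo - data, 2 * hi - data])
kde = gaussian_kde(augmented)

x = np.linspace(lo, hi, 200)
# factor 3: the augmented sample contains three copies of the data
pdf = 3 * kde(x)
```

The resulting pdf can then be drawn with plt.plot(x, pdf); it rises smoothly but does not spill past the range the way an unclipped kdeplot does.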

Calculate a curve consisting of clothoids (Euler curves) and radii

I am trying to calculate points for a curve that is required for testing purposes (to get [x, y] coordinates in a Cartesian system).
The desired curve looks like this:
I am able to calculate the first clothoid (using Python), but cannot get the other curves to join up.
Can someone please help me with this problem?
Thank you.
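For reference, a single clothoid can be sampled with SciPy's Fresnel integrals; chaining further segments (circular arcs and more clothoids) is then a matter of rotating and translating each piece so that position and tangent direction match at the joints. A sketch, with A an assumed free scale parameter:

```python
import numpy as np
from scipy.special import fresnel

def clothoid_points(A=1.0, n=200, t_max=2.0):
    """Sample points on a clothoid (Euler spiral) starting at the origin.

    With this scaling the arc length is s = A*sqrt(pi)*t and the
    curvature grows linearly with s, which is what lets clothoids join
    straight lines to circular arcs with continuous curvature.
    """
    t = np.linspace(0.0, t_max, n)
    S, C = fresnel(t)                 # scipy.special.fresnel returns (S, C)
    x = A * np.sqrt(np.pi) * C
    y = A * np.sqrt(np.pi) * S
    return x, y

x, y = clothoid_points()
```

To continue the curve, take the end point and the end heading (theta = pi*t_max**2/2 for this parametrisation), start the next segment, here an arc of the matching radius, at that pose, and repeat; a mirrored clothoid (negated y before rotation) bends the other way.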

Interpolation of irregularly spaced data into 3d grid

I'm working with some instrument data that records the temperature at a specific latitude, longitude, and pressure (height) coordinate. I need to create a 3d grid from this instrument data that I can then use to take vertical cross-sections of the interpolated gridded data. I've looked at pretty much every interpolation function/library I can find and I'm still having trouble just wrapping my head around how to do this.
I'd prefer not to use Mayavi, since it seems to bug out on my school's server and I'd rather not try to deal with fixing it right now.
The data is currently in 4 separate 1d arrays and I used those to mock up some scatter plots of what I'm trying to get.
Here is the structure of my instrument data points:
And here is what I'm trying to create:
Ultimately, I'd like to create some kind of 3d contour from these points that I can take slices of. Each of the plotted points has a corresponding temperature attached to it, which is really what I think is throwing me off in terms of dimensions and whatnot.
There are a few options to go from the unstructured data which you have to a structured dataset.
The simplest option might be to use the scipy interpolate.griddata method, which can interpolate unstructured points using nearest-neighbour, linear, or cubic interpolation (cubic is only available in one and two dimensions).
Another option is to define your grid and then average all of the unstructured points which fall into each grid cell, giving you some gridded representation of the data. You could use a tool such as CIS to do this easily (full disclosure, I wrote this package to do exactly this kind of thing).
Or, there are more complicated methods of interpolating the data by trying to determine the most likely value of the grid points based on the unstructured data, for example using kriging with the pyKriging package, though I've never used this.
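As a sketch of the griddata route, with synthetic stand-ins for the four 1-D instrument arrays (the grid extents and the toy temperature formula are assumptions):

```python
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(0)
# stand-ins for the four 1-D instrument arrays
lat = rng.uniform(-10, 10, 500)
lon = rng.uniform(-10, 10, 500)
pressure = rng.uniform(100, 1000, 500)
temperature = 15 - 0.05 * pressure + 0.1 * lat   # synthetic signal

# target regular grid (points outside the data's convex hull get NaN)
gl, go, gp = np.meshgrid(np.linspace(-9, 9, 20),
                         np.linspace(-9, 9, 20),
                         np.linspace(150, 950, 10), indexing="ij")

grid_temp = griddata((lat, lon, pressure), temperature,
                     (gl, go, gp), method="linear")

# a vertical cross-section is then just a slice of the 3-D array,
# e.g. temperature over (lat, pressure) at one longitude index
cross_section = grid_temp[:, 10, :]
```

Once the data lives on a regular (lat, lon, pressure) grid, slicing any axis gives the cross-section directly, which is usually the conceptual hurdle with four parallel 1-D arrays.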

How interpolate 3D coordinates

I have data points in x,y,z format. They form a point cloud of a closed manifold. How can I interpolate them using R-Project or Python? (Like polynomial splines)
It depends on what the points originally represented. Just having an array of points is generally not enough to derive the original manifold from. You need to know which points go together.
The most common low-level boundary representation ("brep") is a bunch of triangles. This is e.g. what OpenGL and DirectX get as input. I've written a Python program that can convert triangular meshes in STL format to e.g. a PDF image. Maybe you can adapt that for your purpose. Interpolating a triangle is usually not necessary, but rather trivial to do. Create three new points, each halfway between two original points. These three points form an inner triangle, and the rest of the surface forms three more triangles. So with this you have transformed one triangle into four triangles.
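The midpoint subdivision just described can be sketched in a few lines (a hypothetical `subdivide` helper, not part of the STL software mentioned):

```python
import numpy as np

def subdivide(triangle):
    """Split one triangle into four by inserting the midpoint of each
    edge: the three midpoints form the inner triangle, and each corner
    keeps a smaller outer triangle."""
    a, b, c = (np.asarray(p, dtype=float) for p in triangle)
    ab, bc, ca = (a + b) / 2, (b + c) / 2, (c + a) / 2
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

tris = subdivide([(0, 0, 0), (1, 0, 0), (0, 1, 0)])
```

Applied recursively, this refines a mesh without changing its shape, since every new vertex lies on an existing edge.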
If the points are control points for spline surface patches (like NURBS, or Bézier surfaces), you have to know which points together form a patch. Since these are parametric surfaces, once you know the control points, all the points on the surface can be determined. Below is the function for a Bézier surface. The parameters u and v are the parametric coordinates of the surface. They run from 0 to 1 along two adjacent edges of the patch. The control points are k_ij.
The B functions are weight functions for each control point;
Suppose you want to approximate a Bézier surface by a grid of 10x10 points. To do that you have to evaluate the function p for u and v running from 0 to 1 in 10 steps (generating the steps is easily done with numpy.linspace).
For each (u,v) pair, p returns a 3D point.
If you want to visualise these points, you could use mplot3d from matplotlib.
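As an illustration of the evaluation just described, here is a sketch that evaluates a Bézier patch on an n x n grid of (u, v) values using Bernstein weight functions; the (rows, cols, 3) layout of the control-point array k_ij is an assumption:

```python
import numpy as np
from math import comb

def bezier_surface(control, n=10):
    """Evaluate a Bezier patch on an n x n grid of (u, v) parameters.

    control has shape (m+1, k+1, 3); the weight of control point k_ij
    is the product of Bernstein polynomials B_i^m(u) * B_j^k(v).
    """
    m, k = control.shape[0] - 1, control.shape[1] - 1
    u = np.linspace(0, 1, n)
    v = np.linspace(0, 1, n)
    Bu = np.array([[comb(m, i) * ui**i * (1 - ui)**(m - i)
                    for i in range(m + 1)] for ui in u])
    Bv = np.array([[comb(k, j) * vj**j * (1 - vj)**(k - j)
                    for j in range(k + 1)] for vj in v])
    # p(u, v) = sum_ij B_i(u) B_j(v) k_ij, as a tensor contraction
    return np.einsum('ui,vj,ijc->uvc', Bu, Bv, control)

# demo: a bilinear patch (degree 1 in each direction)
pts = bezier_surface(np.array([[[0, 0, 0], [1, 0, 0]],
                               [[0, 1, 0], [1, 1, 1]]], dtype=float))
```

Each row of the result is a 3D point, so pts.reshape(-1, 3) feeds directly into an mplot3d scatter or plot_surface call.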
By "compact manifold" do you mean a lower dimensional function like a trajectory or a surface that is embedded in 3d? You have several alternatives for the surface-problem in R depending on how "parametric" or "non-parametric" you want to be. Regression splines of various sorts could be applied within the framework of estimating mean f(x,y) and if these values were "tightly" spaced you may get a relatively accurate and simple summary estimate. There are several non-parametric methods such as found in packages 'locfit', 'akima' and 'mgcv'. (I'm not really sure how I would go about statistically estimating a 1-d manifold in 3-space.)
Edit: But if I did want to see a 3D distribution and get an idea of whether it was a parametric curve or trajectory, I would reach for package:rgl and just plot it in a rotatable 3D frame.
If you are instead trying to form the convex hull (for which the word interpolate is probably the wrong choice), then I know there are 2-d solutions and suspect that searching would find 3-d solutions as well. Constructing the right search strategy will depend on specifics that, as the two comments so far reflect, are absent here. I'm speculating that modelling lower and higher order statistics like the 1st and 99th percentiles as a function of (x, y) could be attempted if you wanted to use a regression effort to create boundaries. There is a well-supported quantile regression package, 'quantreg', by Roger Koenker.