How to correlate a sample curve with a reference curve - python

I have a sensor that is continually collecting data every minute (shown in blue) and outputs a voltage. I have a reference sensor collecting data (shown in red) that outputs in the units I am interested in. I want to determine a scaling factor so that I can scale the blue sensor's data to match the red sensor's data.
Normally, I would do a simple linear regression between the values of the two sensors at any given time, which would give me a scaling factor from the slope of the regression. I have noticed, however, that the red sensor is slower at sensing a change in the environment, and can be anywhere from 6-15 minutes behind -- this makes a regression difficult because at any given time the two sensors may be measuring different things.
I was wondering if there is any sort of curve fitting that can be performed so that I can extract a scaling factor to scale the blue sensor's data to match the red sensor's.
I typically work in Python, so any Python packages (e.g. NumPy/SciPy) that would help with this would be especially helpful.

Thanks for the help. What I ended up doing was finding all the local maxima and minima on the reference curve, then using those peak locations to search for the same maxima or minima on the sample curve. I used each of the reference curve's maxima/minima as the center of a "window" and searched for the highest/lowest point on the sample curve within a few minutes of that center point.
Once I had found all the matched maxima/minima on the sample curve, I could then perform a linear regression between these points to determine a scaling factor.
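A minimal sketch of that approach, assuming both series sit on the same 1-minute grid; the window width and the use of SciPy's find_peaks/linregress are my own choices, not part of the original setup:

```python
import numpy as np
from scipy.signal import find_peaks
from scipy.stats import linregress

def matched_extrema_scale(sample, reference, window=15):
    """sample, reference: 1D arrays sampled once per minute.
    Pairs each reference extremum with the nearest sample extremum
    inside a +/- `window`-minute search window, then regresses the pairs."""
    ref_vals, smp_vals = [], []
    for extrema, pick in ((find_peaks(reference)[0], np.argmax),    # maxima
                          (find_peaks(-reference)[0], np.argmin)):  # minima
        for p in extrema:
            lo, hi = max(p - window, 0), min(p + window + 1, len(sample))
            m = lo + pick(sample[lo:hi])   # matched extremum on the sample curve
            ref_vals.append(reference[p])
            smp_vals.append(sample[m])
    fit = linregress(smp_vals, ref_vals)
    return fit.slope, fit.intercept        # mapping sample -> reference
```

The slope is the scaling factor; a clearly nonzero intercept would suggest the two sensors also differ by an offset.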

Related

How to check 3D plots within reference path?

For a drone competition, I have:
A- reference 3D trajectory (around 30 points or less)
B- tolerance from that ideal trajectory (say in cm or less)
C- experimental 3D points read from the drone's GPS (about 5000 points)
A bit like this: RED = reference (A), BLUE = experimental (C), GREEN = "tolerance" (A+B, well not really, but you get the idea...).
I'd like to know the best way to check that the "experimental 3D points" (C) all fall within the "reference 3D trajectory" (A) plus/minus the expected "tolerance" (B).
The closest matches I've found here are:
Finding out if a curve is inside or outside a boundary in python. X axis with different resolutions (python)
Approximating data with a multi segment cubic bezier curve and a distance as well as a curvature contraint
Fit Curve-Spline to 3D Point Cloud
How to compare two 3D curves in Python?
https://stackoverflow.com/questions/8980101/what's-the-best-method-to-compare-original-trajectory-with-two-compressed-trajec
From github I've found something close:
https://pypi.org/project/similaritymeasures/
Yet I'd like to be sure I can compare trajectories with different numbers of points (the "reference" +/- "tolerance" provides a simplified "tunnel"). That difference in dataset sizes is the main drawback, since the drone can stop, and perhaps even go backward a little before resuming the race.
Maybe displaying the result using Matplotlib and/or Payton as well.
While the timing is a factor, it would also be nice to compute some stats, like the min/max deviation from the "reference 3D trajectory" (A).
Probably out of scope, but:
https://www.researchgate.net/publication/281188521_Model_Predictive_Path-Following_Control_of_an_ARDrone_Quadrotor
https://www.researchgate.net/publication/247935750_UAV_Motion_Estimation_using_Hybrid_Stereoscopic_Vision
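A hedged sketch of the core check (does every point of C lie within distance B of the polyline A?): densify the reference polyline and query nearest-neighbour distances with scipy.spatial.cKDTree. The sampling density is an assumption, and distances to the sampled polyline slightly overestimate the exact point-to-segment distance:

```python
import numpy as np
from scipy.spatial import cKDTree

def check_tunnel(reference, experimental, tolerance, samples_per_leg=100):
    """reference: (M, 3) waypoints (A); experimental: (N, 3) GPS fixes (C);
    tolerance: allowed deviation (B) in the same units as the coordinates.
    Returns (all_inside, min_deviation, max_deviation)."""
    reference = np.asarray(reference, dtype=float)
    # Densify the reference polyline so that nearest-neighbour distance
    # approximates distance to the continuous trajectory.
    dense = np.vstack([np.linspace(p, q, samples_per_leg, endpoint=False)
                       for p, q in zip(reference[:-1], reference[1:])]
                      + [reference[-1:]])
    # Nearest densified reference point for every experimental fix. No time
    # alignment is needed, so stops or small backtracks are harmless.
    dist, _ = cKDTree(dense).query(experimental)
    return bool(dist.max() <= tolerance), dist.min(), dist.max()
```

The max deviation doubles as the min/max statistic asked for; for a tunnel whose tolerance varies along the track you would compare per-point distances against the local B instead of a single scalar.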

Getting 3D Position Coordinates from an IMU Sensor on Python

I am planning to acquire position in 3D Cartesian coordinates from an IMU (inertial sensor) containing an accelerometer and a gyroscope, and to use this to track the object's position and trajectory in 3D.
1- From my limited knowledge, I was under the assumption that an accelerometer alone would be enough: it gives acceleration along the xyz axes, A(Ax, Ay, Az), which would need to be integrated twice to get velocity and then position. But integrating adds an unknown constant, and this error, called drift, increases with time. How do I remove this error?
2- Furthermore, why is there a need for a gyroscope in the first place? Can't we just translate the x-y-z acceleration into displacement? If the accelerometer tells us the axis of motion, why check orientation with a gyroscope? Sorry, this is a very basic question; everywhere I checked, both gyro and accelerometer were used together, but I don't know why.
3- Even when the sensor is stationary and not in any motion, Earth's gravity acts on it, which will always add to the values attributable to the sensor's motion. How do you remove the gravity?
Once this has been done, I'll apply a Kalman filter to fuse the readings and smooth the values. How accurate is this method for trajectory estimation of an object in environments where GPS is not an option? I'm getting the accelerometer and gyroscope values from an Arduino and importing them into Python, where they will be plotted on a 3D graph updating in real time. Any help would be highly appreciated, especially links to similar code.
1 - An accelerometer can be calibrated to account for some of this drift, but in the end no sensor is perfect, and inaccuracy will inevitably cause drift. To fix this you need a filter such as the Kalman filter: use the accelerometer for short-term, high-frequency data, and a secondary sensor such as a camera to periodically get the absolute position and correct the internal position estimate. This is the fundamental idea behind the Kalman filter.
2 - Accelerometers aren't very good for high-frequency rotational data, and from accelerometer data alone the system could not differentiate between a horizontal linear acceleration and a tilted orientation. The gyroscope is used for the high-frequency data, while the accelerometer provides low-frequency data to adjust for and counteract the rotational drift. A Kalman filter is one possible solution to this problem, and there are many great online resources explaining it.
3 - You would have to use gyro/accel sensor fusion to get the 3D orientation of the sensor and then use vector math to subtract 1 g, rotated into the sensor's current orientation, from the measured acceleration.
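A minimal sketch of points 2 and 3 together, using a complementary filter (a lighter stand-in for the Kalman filter mentioned above) built on scipy.spatial.transform.Rotation. The sample rate, filter weight, and axis conventions are assumptions and will differ between IMUs:

```python
import numpy as np
from scipy.spatial.transform import Rotation

G = 9.81      # gravity magnitude, m/s^2
DT = 0.01     # sample period, s (assumes a 100 Hz IMU)
ALPHA = 0.98  # complementary-filter weight on the gyro

def fuse_and_remove_gravity(accel, gyro):
    """accel: (N, 3) in m/s^2, gyro: (N, 3) in rad/s, both in the sensor frame.
    Returns (N, 3) world-frame linear acceleration with gravity removed."""
    rot = Rotation.identity()            # current sensor-to-world orientation
    linear = np.empty_like(accel, dtype=float)
    for i, (a, w) in enumerate(zip(accel, gyro)):
        # High-frequency part: integrate the gyro rate over one time step.
        rot = rot * Rotation.from_rotvec(w * DT)
        # Low-frequency part: tilt implied by the accelerometer, assuming it
        # mostly sees gravity (only roll/pitch are observable, not yaw).
        roll = np.arctan2(a[1], a[2])
        pitch = np.arctan2(-a[0], np.hypot(a[1], a[2]))
        rpy = rot.as_euler("xyz")
        rpy[0] = ALPHA * rpy[0] + (1 - ALPHA) * roll
        rpy[1] = ALPHA * rpy[1] + (1 - ALPHA) * pitch
        rot = Rotation.from_euler("xyz", rpy)
        # Rotate the reading into the world frame and subtract gravity.
        linear[i] = rot.apply(a) - np.array([0.0, 0.0, G])
    return linear
```

In practice the onboard fusion mentioned below, or a full orientation filter such as Madgwick's, will be more robust than this sketch.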
You would most likely be better off looking at some online resources to get the gist of it and then using a pre-built sensor fusion system, whether a library or a fusion system onboard the IMU itself (present on most IMUs today, including the MPU6050). These onboard systems typically do a better job than a simple Kalman filter and can combine other sensors, such as magnetometers, to gain even more accuracy.

Curvature of a one-pixel wide curve

I have a numpy array depicting a one-pixel-wide, discrete, connected curve, obtained by the skeletonization operation of image processing. I am trying to find the curvature of this curve at an arbitrary point, in order to detect bends/kinks (which will have a high curvature value).
I tried to implement this using the general formula for curvature. However, since this is a pixelated, discrete curve whose generating function is unknown, I resorted to using numpy's gradient instead.
The problem with the above is that, since the curve is one pixel wide, at any point the slope can only be 0, ±1, or infinity. As a result, the curvature values I get are mostly meaningless or useless.
I am looking for suggestions on where to start in order to get a smooth curve out of the above, so that I can calculate the curvature in a more meaningful way. Can somebody suggest a mathematical operation or convolution that I can apply to achieve this? Below is a representative binary image that I have.
P.S. I am very, very new to image processing, so references to standard algorithms (in math books) or library implementations would be very helpful.
An established way to do this is to fit a low-order parametric curve to each of the skeletonized points using two or more neighbouring points, then compute the curvature at the point analytically from the fitted curve's parameters. Several curve models can be used. The two main models are:
A circle. The radius of curvature R is the reciprocal of the curvature; for a curve, it equals the radius of the circular arc that best approximates the curve at that point. You can fit a circle to a set of 2D data points using various methods. A Python library that has implemented several is here.
A quadratic. This can be fitted to the point and its neighbours, and curvature can then be estimated through second-order differentiation of the fitted curve. You can use numpy.polyfit to fit this model. A simple strategy is to first estimate the tangent vector at the point by fitting a local line (e.g. with polyfit using an order-1 curve). Then you rotate the points to align the tangent vector with the x axis. Finally, you fit a 1D quadratic f(x) to the rotated points using polyfit.
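A hedged sketch of that quadratic recipe; I use an SVD for the local tangent instead of the order-1 polyfit suggested above (it behaves better near vertical tangents), and the window size is a free choice, as discussed next:

```python
import numpy as np

def curvature_at(points, idx, half_window=5):
    """Estimate curvature at points[idx] from a window of neighbours.
    points: (N, 2) array of ordered (x, y) coordinates along the curve."""
    nb = points[max(idx - half_window, 0): idx + half_window + 1].astype(float)
    nb -= points[idx].astype(float)       # put the point of interest at the origin
    # Local tangent = direction of largest variance in the (mean-centred) window.
    _, _, vt = np.linalg.svd(nb - nb.mean(axis=0), full_matrices=False)
    tx, ty = vt[0]
    # Rotate so the tangent lies along the x axis.
    u = nb[:, 0] * tx + nb[:, 1] * ty
    v = -nb[:, 0] * ty + nb[:, 1] * tx
    # Fit v = a*u^2 + b*u + c; curvature of the fit at u = 0 is |2a| / (1 + b^2)^1.5.
    a, b, _ = np.polyfit(u, v, 2)
    return abs(2 * a) / (1 + b ** 2) ** 1.5
```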
The tricky thing with making any curvature estimator is that curvature can be estimated at different scales. For example, do I want my estimator to be sensitive to high frequency detail or is this actually noise? This decision manifests in the choice of neighbourhood size. Too small, and errors from noise and discretization lead to unstable estimates. However too large, and there may be large modelling error (error by approximating the curve as a parametric function). Generally you have to select the best neighbourhood size yourself.
You're also going to have some poor curvature estimates at junction points, but that's largely unavoidable as curvature is not well defined there. A naïve fix could be to segment all paths at junction points, and then estimate curvature on each path individually.
Toby gave an excellent suggestion regarding junction points: detect the junction points and take each line in between those independently.
Detecting junction points (and end points). This is quite simple: all pixels that are set and have more than two neighbors are junction points. All pixels that are set and have exactly one neighbor are end points. Detect all those points and put their coordinates in a list.
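A minimal sketch of that neighbour count, assuming an 8-connected skeleton stored as a boolean NumPy array:

```python
import numpy as np
from scipy import ndimage

def classify_points(skel):
    """skel: 2D boolean skeleton image.
    Returns (junction_coords, end_coords) as arrays of (row, col) pairs."""
    kernel = np.array([[1, 1, 1],
                       [1, 0, 1],
                       [1, 1, 1]])
    # Count of set 8-neighbours at every pixel.
    neighbours = ndimage.convolve(skel.astype(int), kernel, mode="constant")
    junctions = np.argwhere(skel & (neighbours > 2))
    ends = np.argwhere(skel & (neighbours == 1))
    return junctions, ends
```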
Finding the lines in between pairs of points. Starting at each coordinate in your list, look for a line starting there. Note that for the junction points, you'll have at least three lines starting there. If you do this, you'll find each line two times. You can remove duplicates by reversing the lines that end to the left of where they start (and if the two end points are on the same image column, take the one on top as the start). Now they will be directly comparable, so you can delete the duplicates (or not store them in the first place). Note that just comparing start and end point is not sufficient as you can have different lines with the same start and end points.
Tracing each line. The step above requires that you trace each line. See if you can figure it out, it's fun! Here is a description of an algorithm that traces the outline of objects, you can use it as inspiration as this problem is very similar. Store a vector with x-coordinates and one with y-coordinates for each line.
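In case you get stuck, a rough sketch that traces one line from a given start point; starting one trace per unvisited neighbour of each junction, and the duplicate removal described above, are left out:

```python
OFFSETS = [(-1, 0), (1, 0), (0, -1), (0, 1),       # orthogonal first,
           (-1, -1), (-1, 1), (1, -1), (1, 1)]     # then diagonal neighbours

def trace_line(skel, start, stop_points):
    """skel: 2D boolean array; start: (row, col) end/junction point;
    stop_points: set of (row, col) end/junction coordinates.
    Returns the traced line as a list of (row, col) pixels."""
    path, visited = [start], {start}
    r, c = start
    while True:
        for dr, dc in OFFSETS:
            nr, nc = r + dr, c + dc
            if (0 <= nr < skel.shape[0] and 0 <= nc < skel.shape[1]
                    and skel[nr, nc] and (nr, nc) not in visited):
                break
        else:
            return path                     # dead end: the line is complete
        path.append((nr, nc))
        visited.add((nr, nc))
        if (nr, nc) in stop_points:
            return path                     # reached another end/junction point
        r, c = nr, nc
```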
Smoothing the lines. As you noticed, consecutive steps are in one of 8 directions, so angles are strongly discretized. You can prevent this by smoothing the coordinate vectors. This is a quick-and-dirty trick, but it works. Think of these vectors as 1D images, and apply a smoothing filter (I prefer the Gaussian filter for many reasons). Here you filter the vector with x-coordinates separately from the vector with y-coordinates.
Computing the curvature. Finally, you can compute the curvature of the curve, as the norm of the derivative of the unit normal to the curve. Don't forget to take the distance between points into account when computing derivatives!
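Putting the last two steps together, a short sketch; sigma is a free smoothing parameter, and note that np.gradient assumes unit spacing between samples, so for exact arc-length derivatives you would reparameterize by cumulative point-to-point distance:

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def smoothed_curvature(x, y, sigma=3.0):
    """x, y: 1D coordinate vectors of one traced line.
    Smooths the trace, then evaluates the parametric curvature
    kappa = |x'y'' - y'x''| / (x'^2 + y'^2)^(3/2) at every point."""
    xs = gaussian_filter1d(np.asarray(x, dtype=float), sigma)
    ys = gaussian_filter1d(np.asarray(y, dtype=float), sigma)
    dx, dy = np.gradient(xs), np.gradient(ys)
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    return np.abs(dx * ddy - dy * ddx) / (dx ** 2 + dy ** 2) ** 1.5
```

This closed form is equivalent to the norm of the derivative of the unit normal with respect to arc length.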

detect point on a curve boundary

I have the boundaries of semicircle- or ellipse-shaped objects. An example image is shown below.
The boundary can be slightly jagged (when you zoom in). I am looking to detect a point of interest (x and y location) on these curves, where we see a definite change in the shape, such as the one shown in the example.
There can be two outputs:
No point of interest: we cannot find specific features
Point of interest with x and y location
Currently, I am using Python and OpenCV. I cannot think of an efficient and effective way to solve this problem. Any help will be really appreciated.
Nothing says that others will agree with my closure vote, so ...
I suggest two steps:
Fit an ellipse to the given points. I'm sure you've already found curve-fitting algorithms (and perhaps software packages) by now -- and asking for those is specifically proscribed on Stack Overflow.
Code a small anomaly detector, which works on the difference between the fitted curve and the actual data points.
Step 2 depends heavily on your definition of "point of interest". What are the criteria? I notice that your second point of interest actually lies very close to the fitted curve; it's the region on either side that deviates inward.
I suggest that you do your fitting in polar coordinates, and then consider the result in terms of theta and radius. Think of "flattening" the two curves as a single unit, so that the central angle (theta) is the new x-coordinate, and the distance from the center is the new y-coordinate.
Now, subtract the two curves and plot the difference (or just store this new curve as an array of points). Look for appropriate anomalies in these differences. This is where you have to decide what you need. Perhaps a sufficient deviation in the "r" value (radius, distance from center); perhaps a change in the gradient (find a peak/valley, but not a gently sloping bulge). Do you want the absolute difference, or an integral of the deviation (the area between the fit and the anomaly)? Do you want it linear or squared ... or some other function? Does the width of the anomaly figure into your criteria?
That's what you need to decide. Does this get you moving?
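A hedged sketch of those two steps in polar form, using OpenCV's fitEllipse and SciPy's find_peaks; the residual threshold min_dev is a placeholder criterion, and OpenCV's ellipse-angle convention should be double-checked:

```python
import numpy as np
import cv2
from scipy.signal import find_peaks

def radial_anomalies(contour, min_dev=2.0):
    """contour: (N, 1, 2) OpenCV contour along the boundary.
    Returns the (x, y) points whose radial deviation from a fitted
    ellipse exceeds min_dev pixels; empty if there is no point of interest."""
    pts = contour.reshape(-1, 2).astype(float)
    (cx, cy), (w, h), angle = cv2.fitEllipse(contour)
    # "Flatten" the curve: angle about the fitted centre vs. distance from it.
    dx, dy = pts[:, 0] - cx, pts[:, 1] - cy
    theta = np.arctan2(dy, dx) - np.deg2rad(angle)
    r = np.hypot(dx, dy)
    # Radius of the fitted ellipse at each point's angle.
    a, b = w / 2.0, h / 2.0
    r_fit = a * b / np.sqrt((b * np.cos(theta)) ** 2 + (a * np.sin(theta)) ** 2)
    # Peaks in the absolute residual are the candidate points of interest.
    peaks, _ = find_peaks(np.abs(r - r_fit), height=min_dev)
    return pts[peaks]
```

Whether you threshold the raw residual, its gradient, or its integral over the anomaly's width is exactly the criteria decision described above.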

Signal processing-remove noise for a series of spectra

I am a chemist and have measured a series of spectra of one compound under increasing temperature (-200 degrees to 0 degrees). The shape of the spectra is very similar at different temperatures; the only difference is the intensity: at higher temperature the intensity is lower.
My problem is that at high temperature, e.g. 0 degrees, the real signal's intensity is quite close to the background noise's amplitude, which makes the spectra at high temperature very noisy. I tried some simple smoothing methods but the results were not good.
The noise is much less affected by the temperature change than the real signal (which means we can assume the background noise doesn't change much). Thus, I wonder whether there is any method that can remove the noise (background) using the series of spectra I have, since they share a "common" background noise.
Any information (e.g. the name of a method, tools in Python or R, references) would be helpful. Thanks for your help!
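One family of methods that exploits exactly this shared structure is low-rank denoising via SVD/PCA across the whole series (numpy or sklearn.decomposition.PCA in Python, prcomp in R): the common line shape concentrates in the leading components while uncorrelated noise spreads over the tail. A minimal sketch; the number of retained components is an assumption to validate against your data:

```python
import numpy as np

def svd_denoise(spectra, n_components=2):
    """spectra: (n_spectra, n_channels) matrix, one spectrum per row.
    Reconstructs the series from its leading singular components only."""
    u, s, vt = np.linalg.svd(spectra, full_matrices=False)
    s[n_components:] = 0.0                 # drop the noise-dominated tail
    return (u * s) @ vt
```

If the background itself (not just its statistics) is truly identical across temperatures, you can also estimate it directly, e.g. by subtracting a scaled low-temperature spectrum from a high-temperature one and inspecting the residual.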
