I want to draw a dotted curve between the boundaries. For example, consider the following image:
From the image, I have the dark outlines; what I want to generate through code is the dotted part. As of now, I have the following algorithm:
For each point, say A, find its two nearest points.
From these two points, find the slope of the line through them, and then the slope of the line perpendicular to it.
Then, using the perpendicular slope and point A, determine the perpendicular line and find where it intersects the other boundary, say at point B.
Draw a point halfway between A and B on this perpendicular line.
This method is brute force. I think that with NumPy, OpenCV, or plotting libraries this might be trivial. Do you have any suggestions?
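For what it's worth, a close cousin of the steps above compresses into a few lines of NumPy, assuming the two boundaries are already available as point arrays (the names `outer`/`inner` and the toy half-circle data are just for illustration): instead of intersecting perpendicular lines, pair each point on one boundary with its nearest neighbor on the other and take the midpoint.

```python
import numpy as np

def midline(curve_a, curve_b):
    """For each point on curve_a, find its nearest neighbor on curve_b
    and return the midpoint of the pair (one dotted-curve sample each)."""
    a = np.asarray(curve_a, dtype=float)   # shape (N, 2)
    b = np.asarray(curve_b, dtype=float)   # shape (M, 2)
    # pairwise squared distances via broadcasting, shape (N, M)
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(axis=2)
    nearest = b[d2.argmin(axis=1)]         # closest b-point for each a-point
    return (a + nearest) / 2.0

# toy "boundaries": two concentric half-circles of radius 10 and 6
t = np.linspace(0, np.pi, 50)
outer = np.c_[np.cos(t), np.sin(t)] * 10
inner = np.c_[np.cos(t), np.sin(t)] * 6
mid = midline(outer, inner)                # samples lie near radius 8
```

Nearest-neighbor pairing is not identical to the perpendicular-line construction, but for smooth, roughly parallel boundaries it lands in the same place and avoids the slope bookkeeping entirely.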
Thank you
I am fairly new to OpenCV and am not sure how to proceed.
I have this thresholded image:
And using this image, I need to calculate the distance between two points. The points are also unknown. Illustrated here:
I need to calculate the 'd' value. It is the distance from the midpoint of the middle line to where the top line would have been. I am not sure how to proceed with identifying the points and getting the distance; any help would be appreciated!
While I'm not sure about the point-selection part of your problem, calculating the distance between two points is trivial. You can imagine your two points as the beginning and end of the hypotenuse of a right triangle, and use the Pythagorean theorem to find the length of that hypotenuse.
The math module has a dist function which takes two points, or you can write the function yourself very simply.
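For example (math.dist is available in Python 3.8+; math.hypot is the do-it-yourself version of the same Pythagorean calculation):

```python
import math

p1 = (1.0, 2.0)
p2 = (4.0, 6.0)

# Python 3.8+: Euclidean distance directly
d = math.dist(p1, p2)

# or Pythagoras by hand: sqrt(dx**2 + dy**2)
d_manual = math.hypot(p2[0] - p1[0], p2[1] - p1[1])

print(d, d_manual)  # both 5.0 for this 3-4-5 triangle
```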
I have boundaries of semi-circle or ellipse shaped objects. Example image is
The boundary can be slightly jagged (when you zoom in). I am looking to detect a point of interest (location x and y) on these curves, where we see a definite change in the shape, such as
There can be two outputs:
No point of interest: we cannot find specific features
Point of interest with x and y location
Currently, I am using Python and OpenCV. I cannot think of an efficient and effective way to solve this problem. Any help will be really appreciated.
Nothing says that others will agree with my closure vote, so ...
I suggest two steps:
Fit an ellipse to the given points. I'm sure you've already found curve-fitting algorithms (and perhaps software packages) by now -- and asking for those is specifically proscribed on Stack Overflow.
Code a small anomaly detector, which works on the difference between the fitted curve and the actual data points.
Step 2 depends heavily on your definition of "point of interest". What are the criteria? I notice that your second point of interest actually lies very close to the fitted curve; it's the region on either side that deviates inward.
I suggest that you do your fitting in polar coordinates, and then consider the result in terms of theta and radius. Think of "flattening" the two curves as a single unit, so that the central angle (theta) is the new x-coordinate, and the distance from the center is the new y-coordinate.
Now, subtract the two curves and plot the difference (or just store this new curve as an array of points). Look for appropriate anomalies in these differences. This is where you have to decide what you need. Perhaps a sufficient deviation in the r value (radius, distance from the center); perhaps a change in the gradient (find a peak/valley, but not a gently sloping bulge). Do you want the absolute difference, or an integral of the deviation (the area between the fit and the anomaly)? Do you want it linear or squared, or some other function? Does the width of the anomaly figure into your criteria?
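As a rough sketch of the flatten-and-subtract idea (using a moving-average baseline as a crude stand-in for the fitted ellipse, and a simple radius-deviation threshold as the anomaly criterion — both are assumptions you'd replace with your own fit and criteria):

```python
import numpy as np

def radial_anomalies(points, threshold):
    """Flatten a closed curve into (theta, r) around its centroid and flag
    samples whose radius deviates from a smoothed baseline by more than
    `threshold`.  For a real part, subtract the fitted ellipse's radius
    at each theta instead of this moving average."""
    pts = np.asarray(points, dtype=float)
    center = pts.mean(axis=0)
    d = pts - center
    theta = np.arctan2(d[:, 1], d[:, 0])       # new x-coordinate
    r = np.hypot(d[:, 0], d[:, 1])             # new y-coordinate
    order = np.argsort(theta)                  # walk around the curve
    theta, r = theta[order], r[order]
    # baseline: circular moving average (curve is closed, so wrap-pad)
    k = 7
    rp = np.pad(r, k // 2, mode="wrap")
    baseline = np.convolve(rp, np.ones(k) / k, mode="valid")
    resid = r - baseline
    return theta, resid, np.flatnonzero(np.abs(resid) > threshold)

# toy data: a circle of radius 10 with a single dented point
t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
r0 = np.full(100, 10.0)
r0[50] = 7.0                                   # the "anomaly"
pts = np.c_[r0 * np.cos(t), r0 * np.sin(t)]
theta, resid, flagged = radial_anomalies(pts, threshold=1.5)
```

On this toy input, only the dented sample exceeds the threshold; the smooth remainder of the curve produces near-zero residuals.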
That's what you need to decide. Does this get you moving?
In OpenCV, for a given point (x,y), what is the best way to find the point closest to it that belongs to a known contour cnt? (I assume the point lies outside the contour.)
dist = cv2.pointPolygonTest(cnt, (x, y), True)
pointPolygonTest returns the distance to the closest contour point, but I do not see a way to get the actual point.
Of course, I could loop over the list of contour points and recalculate the distance finding the one with minimum distance. (A couple of questions on SO explain more sophisticated ways for finding the closest point out of a list of points to a given point.)
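The looping option can at least be vectorized with NumPy; note, though, that this finds the nearest contour *vertex*, while pointPolygonTest measures the distance to the contour's edges, so the two can differ for sparse contours (the square below is just toy data):

```python
import numpy as np

def closest_contour_point(cnt, p):
    """Nearest contour vertex to p.  cnt is an OpenCV-style (N, 1, 2)
    contour array.  This only approximates pointPolygonTest's distance,
    which is measured to the polygon's edges rather than its vertices."""
    pts = cnt.reshape(-1, 2).astype(float)
    d2 = ((pts - np.asarray(p, dtype=float)) ** 2).sum(axis=1)
    i = int(d2.argmin())
    return pts[i], float(np.sqrt(d2[i]))

cnt = np.array([[[0, 0]], [[10, 0]], [[10, 10]], [[0, 10]]])  # a square
nearest, d = closest_contour_point(cnt, (12, 1))
```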
Alternatively, I could draw a circle with radius dist and see where the circle and the contour touch.
Both options seem a little clunky, so I wonder if I am missing something more straightforward.
I am trying to build a Python application that inspects a part that consists of a straight line followed by a curved arc (as seen in the picture). The goal is to find the curvature and arc length of the curved section. My approach so far has been to threshold and skeletonize the image to produce a series of points. I then use least-squares fitting to fit a piecewise mathematical function that is composed of a line and an arc. This works great for level parts.
However, if the parts are placed such that the straight section is not level (parallel with the horizontal axis), I run into problems. My solution thus far has been to fit a line to the left quarter of the image to detect the slope of the straight section. When doing my least-squares fitting I then add in this factor (i.e. add mx + b to the piecewise function). This produces a result that is close.
The error I found in this approach is actually kind of interesting. Least-squares fitting tries to minimize error along the y-axis for both level and rotated parts. This is fine for level parts, but for rotated parts the error should really be defined as the distance between the data and the curve along a line perpendicular to the straight section, so that the error is measured the same way in both circumstances.
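One way around this (a sketch, assuming the points sit in an (N, 2) array and the straight section's slope has already been estimated) is to rotate the data so the straight section becomes horizontal before fitting; vertical least-squares residuals in the rotated frame are then perpendicular to the straight section, and the fitted parameters can be rotated back afterwards:

```python
import numpy as np

def level_points(points, slope):
    """Rotate the (N, 2) point cloud about its centroid so that the
    detected straight section (slope m) becomes horizontal.  Ordinary
    least squares in this frame then measures residuals perpendicular
    to the straight section."""
    pts = np.asarray(points, dtype=float)
    ang = -np.arctan(slope)                  # undo the part's tilt
    c, s = np.cos(ang), np.sin(ang)
    R = np.array([[c, -s], [s, c]])          # 2-D rotation matrix
    center = pts.mean(axis=0)
    return (pts - center) @ R.T + center

# a tilted "straight section" (y = x, slope 1) becomes flat after leveling
pts = np.c_[np.arange(5.0), np.arange(5.0)]
leveled = level_points(pts, slope=1.0)
```

Alternatively, orthogonal distance regression (e.g. scipy.odr) minimizes the perpendicular error directly, without the explicit rotation step.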
Any help with the overall design of this problem would be greatly appreciated. I didn't think template matching was a good solution, as my parts have different parameters (i.e. curvature, arc length).
Here is an example where the straight section is a horizontal line:
I have a set of points extracted from an image. I need to join these points to form a smooth curve. After drawing the curve on the image, I need to find the tangent to the curve and represent it on the image. I looked at cv2.approxPolyDP, but it already requires a curve?
You can build a polyline if the order of the points is defined. Then it is possible to simplify this polyline with the Douglas-Peucker algorithm (if the number of points is too large). Then you can construct some kind of spline interpolation to create a smooth curve.
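As a self-contained illustration of the pipeline (cv2.approxPolyDP implements Douglas-Peucker for the simplification step, and scipy.interpolate.splprep/splev gives a true spline; here Chaikin's corner-cutting stands in for the spline so the sketch needs only NumPy, and the input points are assumed to be already ordered):

```python
import numpy as np

def chaikin(points, iterations=3):
    """Chaikin corner-cutting: each pass replaces every segment with
    points at 1/4 and 3/4 along it; the polyline converges to a smooth
    quadratic-B-spline-like curve while keeping both endpoints."""
    pts = np.asarray(points, dtype=float)
    for _ in range(iterations):
        q = 0.75 * pts[:-1] + 0.25 * pts[1:]
        r = 0.25 * pts[:-1] + 0.75 * pts[1:]
        interleaved = np.empty((2 * len(q), 2))
        interleaved[0::2], interleaved[1::2] = q, r
        pts = np.vstack([pts[:1], interleaved, pts[-1:]])
    return pts

def unit_tangent(pts, i):
    """Approximate unit tangent at sample i by central differences."""
    v = pts[min(i + 1, len(pts) - 1)] - pts[max(i - 1, 0)]
    return v / np.linalg.norm(v)

ordered = np.array([[0, 0], [1, 2], [2, 0], [3, 2]], dtype=float)
smooth = chaikin(ordered)                  # densified, smoothed polyline
t_mid = unit_tangent(smooth, len(smooth) // 2)
```

The tangent can then be drawn on the image as a short line segment through the chosen sample in the direction of `t_mid`.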
If your question is related to the points being extracted in random order, the tool you need is probably the so-called 2D alpha-shape. It is a generalization of the convex hull and will let you trace the "outline" of your set of points, and from there perform interpolation.