I am thinking about the simplest way of rotating an object in a coordinate system that looks like this:
        0
 -90         90
    -180/180
I need to rotate an object until it reaches a given angle, and it has to support rotation in both directions. I receive information about the object's current rotation, and I have to create a while-loop condition that decides when it should stop rotating. It cannot be a simple equality check, since the information I receive is not that precise.
EDIT:
The object is a drone which sends me data about its current rotation around the z-axis. I rotate it by sending a request to rotate in a given direction. Based on the information about its current rotation and the angle by which I want it to rotate (plus the direction of rotation), I need to set up a condition for when it should stop rotating and send the appropriate request.
Theoretically, you would need another variable that holds the direction of rotation (clockwise or anti-clockwise). If the direction is clockwise, you would assign the angles 0, 90, 180, 270 going clockwise, and do the same for anti-clockwise.
Your while loop would then be "while angle < given angle", since with the values assigned based on the direction of rotation you wouldn't have to worry about +/- signs on the angles.
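Something like this, for example (an untested sketch; get_current_heading(), send_rotate() and send_stop() are placeholders for however you actually talk to the drone):

# Untested sketch of a tolerance-based stop condition.
# get_current_heading() returns degrees in [-180, 180];
# send_rotate()/send_stop() are placeholders for the drone API.

TOLERANCE_DEG = 3.0  # accept headings within +/- 3 degrees of the target

def angle_difference(target, current):
    """Smallest signed difference between two headings, in [-180, 180)."""
    return (target - current + 180.0) % 360.0 - 180.0

def rotate_to(target_deg, direction):
    """direction: +1 for clockwise, -1 for anti-clockwise (sign convention assumed)."""
    send_rotate(direction)
    while abs(angle_difference(target_deg, get_current_heading())) > TOLERANCE_DEG:
        pass  # or sleep briefly before polling the drone again
    send_stop()

Using the wrapped difference also avoids trouble at the -180/180 seam of the coordinate system above.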
This is probably a really basic answer and not what you were hoping for, but I hope it helped a little!
Related
What I want to do: write a function that draws a depth map for a 3D object.
Problem: calculate the Z-buffer myself.
Questions:
Q1. Do I need to transform both the camera and the object into the view (camera) coordinate system to calculate the Z-buffer?
Q2. Since, in a right-handed coordinate system, the direction coming out of the monitor is the +z direction, the near and far values of the clipping plane are positive in theory, but is it correct to use negative values (-far, -near) in the actual calculation?
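To make the convention concrete, here is my understanding of the usual OpenGL-style mapping, assuming a right-handed view space where the camera looks down the -z axis (near and far are positive distances, but visible view-space z values are negative):

# Sketch of the standard perspective depth mapping (right-handed view space,
# camera looking down -z). near and far are positive distances, but visible
# points satisfy -near >= z_view >= -far.

def ndc_depth(z_view, near, far):
    """Map a view-space z (negative, in [-far, -near]) to NDC depth in [-1, 1]."""
    a = -(far + near) / (far - near)      # projection matrix entry P[2][2]
    b = -2.0 * far * near / (far - near)  # projection matrix entry P[2][3]
    clip_z = a * z_view + b
    clip_w = -z_view                      # from P[3][2] = -1
    return clip_z / clip_w

print(ndc_depth(-1.0, 1.0, 100.0))    # -> -1.0 (near plane)
print(ndc_depth(-100.0, 1.0, 100.0))  # ->  1.0 (far plane)

If that is the right convention, then the z values actually fed into the calculation are indeed negative, between -near and -far.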
I need help understanding rotation values in Webots. How do I calculate and set them?
I want my robot to rotate towards a certain object.
For example, if the ball is rolling around the robot, the robot should get the position of the ball and rotate towards it, so that the robot is always facing the ball.
Does anybody have an idea how I can do this?
My thoughts on coding it:
Get position of the ball
Get position of the robot
Calculate the angle between them
Rotate the robot by the calculated angle (a rough sketch of this step is below)
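Roughly, I imagine the angle step looking something like this (untested; how the positions and heading are read in Webots is left as a placeholder):

import math

# Untested sketch. ball_pos and robot_pos are (x, y) tuples in the world plane,
# robot_heading is the robot's current yaw in radians; how these are obtained
# (supervisor, GPS, compass, ...) depends on the setup.

def rotation_towards(ball_pos, robot_pos, robot_heading):
    """Signed angle (radians) the robot should turn to face the ball."""
    dx = ball_pos[0] - robot_pos[0]
    dy = ball_pos[1] - robot_pos[1]
    target_heading = math.atan2(dy, dx)
    diff = target_heading - robot_heading
    # wrap to [-pi, pi) so the robot turns the short way around
    return (diff + math.pi) % (2.0 * math.pi) - math.pi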
Thanks in advance!
You don't have to calculate the angle; it is enough to find the position of the ball in the 2D image plane. If the ball is to the left of the image center, the robot should rotate left, and if it is to the right of the image center, the robot should rotate right.
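A rough sketch of that idea (ball_x_in_image() and set_wheel_speeds() are placeholders for your actual camera processing and motor commands):

# Untested sketch of "turn towards the ball in the image".
# ball_x_in_image() and set_wheel_speeds() are placeholders.

KP = 0.01          # turn gain, tune for your robot
BASE_SPEED = 2.0   # forward speed

def tracking_step(image_width):
    x = ball_x_in_image()            # pixel column of the ball, or None if not seen
    if x is None:
        set_wheel_speeds(1.0, -1.0)  # spin in place to search for the ball
        return
    error = x - image_width / 2.0    # > 0: ball is right of center, < 0: left
    turn = KP * error
    set_wheel_speeds(BASE_SPEED + turn, BASE_SPEED - turn)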
You can find an example here:
https://github.com/lukicdarkoo/webots-example-visual-tracking
and you can see the result here:
https://lukicdarkoo.github.io/webots-example-visual-tracking/
I have a set of polygons and they can overlap with each other, like this:
I want to modify them in such a way that they don't overlap and the resulting surface area stays the same. Something like this:
It is okay if the shape or the position changes. The main thing is that they should not overlap with each other and the area should not change much (I know the area changed a little in the second image, but I drew it manually, so let's just assume the areas did not change).
I am trying to do it programmatically with Python. Basically, I have stored the polygons in a PostGIS database, and with a script I want to retrieve and modify them.
I am very new to GIS, so this seems like a difficult task.
What is the correct way of doing this? Is there an algorithm that solves this kind of problem?
Take a look at ST_Buffer and try passing a negative float as the second argument (the distance to shrink by, in the geometry's units, e.g. degrees for lat/long data):
SELECT ST_Buffer(the_geom, -0.01) AS geom
Be careful with negative buffers, as you can run into issues if the buffer size exceeds the radius, see here.
Here is what I did:
I iterated over all the polygons and found the ones that overlap. Then I moved each overlapping polygon in different directions and picked the best direction by finding which one gave the minimum resulting overlap area. Finally, I kept moving the polygon in that best direction until there was no overlapping area.
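Roughly, in Shapely terms, the idea looks like this (a simplified sketch, not the exact code; the step size and candidate directions are arbitrary choices, and the results still have to be written back to PostGIS):

from shapely.affinity import translate

# Simplified sketch of the "move in the least-overlapping direction" idea.
# polygons is a list of Shapely Polygons already loaded from PostGIS.

STEP = 1.0  # move distance per pass, in the same units as the geometries
DIRECTIONS = [(1, 0), (-1, 0), (0, 1), (0, -1), (1, 1), (-1, 1), (1, -1), (-1, -1)]

def separate(polygons, max_passes=1000):
    for _ in range(max_passes):
        moved = False
        for i, poly in enumerate(polygons):
            others = [p for j, p in enumerate(polygons) if j != i]
            overlap = sum(poly.intersection(p).area for p in others if poly.intersects(p))
            if overlap == 0:
                continue
            # try each candidate direction, keep the one that shrinks the overlap most
            best_poly, best_overlap = poly, overlap
            for dx, dy in DIRECTIONS:
                cand = translate(poly, dx * STEP, dy * STEP)
                cand_overlap = sum(cand.intersection(p).area for p in others if cand.intersects(p))
                if cand_overlap < best_overlap:
                    best_poly, best_overlap = cand, cand_overlap
            if best_poly is not poly:
                polygons[i] = best_poly
                moved = True
        if not moved:
            break
    return polygons

Since the polygons are only translated, their areas stay exactly the same.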
I have a device called the Myo Armband and am using the orientation data it's outputting to control the position of an 'arm' in python (using Vizard for the rendering). Right now, I have it where I can do
arm.setQuat(armbandQuat)
and it kinda works. Up and down are accurate, but rotation about the vertical axis starts in a random direction each time. To fix this, I made it so that when you press the spacebar it records the offset quaternion. What I want to happen is that whenever you press space, the arm onscreen goes to a set '0' position and all motion about the vertical axis will be relative to that. In other words, the user will press space with their arm horizontal to calibrate it.
What do I need to do with these two quaternions to setQuat to the correctly rotated coordinates? Originally I thought you could just multiply the two quaternions together (or the inverse of the second), but if one is 0,0,0,0 then the arm will be frozen at 0,0,0,0.
I can also get and set Euler angles instead, if that's easier. It seems like quaternions are a better fit for rotations like this, though.
I think it should be somewhat simple, but I've spent hours looking up the answer and trying things out and nothing works.
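For concreteness, multiplying by the inverse of the offset is what I mean, roughly like this (untested; I'm assuming (w, x, y, z) component order, and get_armband_quat() stands in for however the Myo is read):

import numpy as np

# Sketch of applying a calibration offset with quaternions in (w, x, y, z) order.
# Check which order the Myo SDK and Vizard actually use and convert if needed.
# get_armband_quat() and arm.setQuat() stand in for the calls in the question.

def qmul(a, b):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def qconj(q):
    """Conjugate; equals the inverse for a unit quaternion."""
    w, x, y, z = q
    return np.array([w, -x, -y, -z])

# note: the identity rotation is (1, 0, 0, 0) in this order, not (0, 0, 0, 0)
offset = np.array([1.0, 0.0, 0.0, 0.0])

def on_spacebar():
    global offset
    offset = np.array(get_armband_quat())  # remember the current pose as the new zero

def on_frame():
    current = np.array(get_armband_quat())
    corrected = qmul(qconj(offset), current)  # rotation relative to the calibration pose
    arm.setQuat(corrected)

This re-zeroes the whole orientation at the moment space is pressed; restricting the correction to just the vertical axis would be a further step.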
It may be a very simple question; if you have the answer, please share.
Given a series (say for t0..tn) of matrices (2D arrays) of velocities in the X and Y directions (UX, UY), obtained by applying the Lattice Boltzmann method (LBM) to a simulation of 2D fluid flow, the question is how to make an animation of the fluid flow.
We should be able to use the velocities to find the positions of (??) by applying Position = Velocity x Time. Any ideas what (??) could be?
We think we could keep a matrix of particles of the same size as the velocity matrix at time t0 and find the next position matrix as described above, so as to move the particles accordingly.
Please share your knowledge!
Is the chosen approach correct?
Any other methods, etc., are also welcome.
For this problem, tips in Python are more than welcome!
Pseudo-code would be even more helpful!
To simplify the question: the following is the velocity map at time tn. How can a fluid flow map be produced from it?
If the initial distribution of your particles is fairly regular (a grid, or uniformly random), you'll find that after a while all the particles tend to cluster together, leaving entire areas of your fluid empty and thus invisible.
I found that a good method is to have short-lived particles (on the order of seconds). When a particle dies, it is respawned in a random position. Also, because each particle traces only a short path, the accuracy of the integration method used doesn't matter so much: a midpoint method or even forward Euler does the job just fine.
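A sketch of that setup with NumPy (ux and uy are one pair of the UX, UY arrays; nearest-neighbour sampling of the field keeps the sketch short, bilinear interpolation would be smoother):

import numpy as np

# Sketch of short-lived tracer particles advected through the velocity field.
# ux, uy: 2D arrays of shape (ny, nx) with the velocities for one time step.
# Particle positions are in grid units.

rng = np.random.default_rng()

def spawn(n, nx, ny, max_age):
    pos = rng.uniform([0, 0], [nx - 1, ny - 1], size=(n, 2))  # (x, y) per particle
    age = rng.integers(0, max_age, size=n)                    # stagger the lifetimes
    return pos, age

def step(pos, age, ux, uy, dt, max_age):
    ny, nx = ux.shape
    ix = np.clip(np.round(pos[:, 0]).astype(int), 0, nx - 1)
    iy = np.clip(np.round(pos[:, 1]).astype(int), 0, ny - 1)
    # forward Euler: position += velocity * dt
    pos[:, 0] += ux[iy, ix] * dt
    pos[:, 1] += uy[iy, ix] * dt
    age += 1
    # respawn particles that left the domain or exceeded their lifetime
    dead = (age >= max_age) | (pos[:, 0] < 0) | (pos[:, 0] > nx - 1) \
         | (pos[:, 1] < 0) | (pos[:, 1] > ny - 1)
    if dead.any():
        pos[dead] = rng.uniform([0, 0], [nx - 1, ny - 1], size=(int(dead.sum()), 2))
        age[dead] = 0
    return pos, age

# Per frame: call step() with the UX, UY matrices for that time step and
# scatter-plot pos (e.g. with matplotlib) to build the animation.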