I'm new to ROS, and my task is to develop an algorithm that lets the robot move forward as long as there is no obstacle in front of it, but it keeps getting stuck on the obstacles I've placed in front of it in the Gazebo simulation.
When I dug into it, it seemed that my robot scans to the sides instead of straight ahead. And when I checked the specs for the laser scanner, it said the scan angles should be at most -90 degrees to 90 degrees, and preferably much less than that. So it seems I can't complete my task due to a "hardware" problem, but that seems strange to me.
Can anyone please help?
Here is my code:
#!/usr/bin/python
#
# stopper.py
#
# Created on:
# Author:
#
import rospy
import math
from geometry_msgs.msg import Twist
from sensor_msgs.msg import LaserScan


class Stopper(object):
    def __init__(self, forward_speed):
        self.forward_speed = forward_speed
        self.min_scan_angle = -10 / 180 * math.pi
        self.max_scan_angle = 10 / 180 * math.pi
        self.min_dist_from_obstacle = 0.5
        self.keep_moving = True
        self.command_pub = rospy.Publisher("/cmd_vel_mux/input/teleop", Twist, queue_size=10)
        self.laser_subscriber = rospy.Subscriber("scan", LaserScan, self.scan_callback, queue_size=1)

    def start_moving(self):
        rate = rospy.Rate(10)
        rospy.loginfo("Starting to move")
        while not rospy.is_shutdown() and self.keep_moving:
            self.move_forward()
            rate.sleep()

    def move_forward(self):
        move_msg = Twist()
        move_msg.linear.x = self.forward_speed
        self.command_pub.publish(move_msg)

    def scan_callback(self, scan_msg):
        for dist in scan_msg.ranges:
            if dist < self.min_dist_from_obstacle:
                self.keep_moving = False
                break
You should be able to select the angles you are interested in yourself. The -90 and +90 degrees are just the endpoints the laser scanner measures, so you get a dataset with many distances at different angles. To detect obstacles in front of the robot you need to select one (or several) measurements in the middle of the dataset (my knowledge is rusty, but I assume the ranges are sorted from -90° to 90°, so 0° is in the middle of the array). In other words, you don't want to loop through all the distances in msg.ranges, just a subset.
I found this tutorial that shows how to read out the data and access the values at different angles.
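As a minimal sketch of that idea (assuming the standard angle_min and angle_increment fields of the LaserScan message, and the ±10° window already defined in the original code), the callback could check only the rays near 0°:

def scan_callback(self, scan_msg):
    # Index of the ray pointing straight ahead (angle 0);
    # math is already imported in the original module
    center = int(round(-scan_msg.angle_min / scan_msg.angle_increment))
    # Number of rays in a 10-degree window on each side
    half_window = int(math.radians(10) / scan_msg.angle_increment)
    for dist in scan_msg.ranges[center - half_window : center + half_window + 1]:
        if dist < self.min_dist_from_obstacle:
            self.keep_moving = False
            break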
Is there any way to move the cursor in Python by a percentage, starting from a certain coordinate like (1,1)?
I am using pyautogui at the moment for automation, and I thought it would be quite convenient if this were independent of the monitor size, making it universal.
Thanks in advance for your help!
It's possible indirectly. As detailed at the top of the Mouse Control Functions page of the documentation, you can get the screen size using the size() function (which returns a tuple of (X, Y)). You can then do the math to figure out how many screen pixels equal the percentage you're looking for, then call moveTo() to go there.
# script to move mouse 50% to the right and down
import pyautogui as pag
percentage = 0.5
cur_X, cur_Y = pag.position() # current X and Y coordinates
size_X, size_Y = pag.size() # screen size
goto_X = (size_X - cur_X) * percentage + cur_X # current location plus half
goto_Y = (size_Y - cur_Y) * percentage + cur_Y # the distance to the edge
pag.moveTo(goto_X, goto_Y, 1) # move to new position, taking 1 second
I have written a program that creates a random walk using a given distribution. For example, for a Cauchy distribution, the program generates an appropriate random number and appends it to a list. This list is then used by the turtle as a list of steps in random directions. Now I would like to measure the distance between the turtle at each step and the point (0,0) where the random walk starts, and I can't seem to find a good solution for that. It cannot be done with turtle.distance() because I want to plot this distance as a function of the step count. My first idea was to get the coordinates of each point where my turtle stops and then calculate the distance using the Pythagorean theorem, but I'm making a mistake somewhere along the way and can't retrieve them. Could someone explain why, please? Or maybe there is a better way to do this?
cauchylist = []
for i in randomlist:
    cauchylist.append(0.15 * math.tan((i - 0.5) * math.pi))

Franklin = turtle.Turtle()
u = input("Which walk to walk? C - Cauchy or G - Gauss.")
if u == "C":
    start()
    walk(Franklin, cauchylist)
    cv = turtle.getcanvas()
    cv.postscript(file="walk_cauchy.ps")
    Franklin.penup()
    Franklin.fd()
    Franklin.end_poly()
    coordinates = Franklin.get_poly()
turtle.done()
The distance of the current position of the turtle from the origin is calculated by squaring the position values, adding them, and taking the square root of that:
from turtle import *
from math import sqrt
from random import seed, choice

seed(42)  # ensure same randoms for demonstration purposes

def calc_dist(t):
    return sqrt(sum(map(lambda x: x * x, t.pos())))

t = Turtle()
dists = []
for _ in range(20):
    t.forward(choice(range(100)) - 50)
    t.left(choice(range(360)) - 180)
    dists.append(calc_dist(t))

print(dists)
Output:
[31.0, 68.97157492789178, 87.25113481192517, 98.48389701852805, 134.17622966161073,
141.40760908816227, 138.90181656585844, 128.76423765225354, 113.79561063931857,
108.70118700467432, 66.53516784607132, 87.40888728250773, 113.65616399911761,
115.22672747425487, 122.12225694530929, 128.35176588895283, 157.57222689310848,
128.33399580245668, 129.3846600939952, 153.87822281203412]
Do the calculation sqrt(sum(map(lambda x:x*x, t.pos()))) after each move of the turtle and store the result in a list.
get_poly() will return a tuple of tuples.
Example:
coordinates = turtle.get_poly()
coordinates = ((0.00,0.00), (100.00,0.00), (128.19,10.26), (136.87,59.50))
You can easily access each tuple with a for loop:
for c in coordinates:
    print(c)
Output:
(0.00,0.00)
(100.00,0.00)
(128.19,10.26)
(136.87,59.50)
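If you then want the distance of each point from the origin, the Pythagorean theorem can be applied to each pair. A minimal sketch, assuming coordinates holds the tuples shown above:

import math

# Distance of every polygon point from the starting point (0, 0)
distances = [math.hypot(x, y) for x, y in coordinates]
print(distances)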
I would like to measure the distance between the turtle in each step
and point (0,0) where the random walk is starting and I can't seem to
find a good solution for that. It cannot be done with
turtle.distance()
This premise doesn't hold. We can do turtle.distance(0, 0), which doesn't affect the turtle's state but gives us the distance to the origin. To demonstrate, here is my rework of @PatrickArtner's example using distance():
from turtle import Screen, Turtle
from random import seed, randint

seed(42)  # repeatable randomness for demonstration purposes

screen = Screen()
turtle = Turtle()

distances = []
for _ in range(20):
    turtle.forward(randint(-50, 50))
    turtle.left(randint(-180, 180))
    distances.append(turtle.distance(0, 0))

print(distances)
screen.exitonclick()
Console Output
> python3 test.py
[31.0, 68.97157492789178, 87.25113481192517, 98.48389701852807, 134.17622966161073,
141.40760908816227, 138.90181656585844, 128.7642376522535, 113.79561063931855,
108.70118700467431, 66.5351678460713, 87.4088872825077, 113.65616399911758,
115.22672747425486, 122.12225694530927, 128.35176588895283, 157.57222689310848,
128.33399580245668, 129.3846600939952, 153.87822281203412]
>
I cannot use the t.forward and t.left commands because I get the error
"invalid command name . canvas turtle". I think it's because my turtle
walk is defined by a function that looks like this:
def walk(turtle, steps):
    x = 0
    y = 0
    for i in steps:
        kat = random.random() * 360
        turtle.setheading(kat)
        turtle.forward(math.fabs(i))
        bounds(turtle)
However, if I don't use it as in the example you gave, I create a list
containing only the last distance value. Can anything be done about that?
We'd need to see more code to address this issue, e.g. the definition of bounds() and so forth.
I am commanding a robot from a base station over radio. The base station gets location/orientation information from an overhead camera using an AR tag on the robot (with OpenCV). From that location information it also calculates the path the robot should take to reach the target (A*, with each grid cell being 30 by 30 camera pixels). The robot can only turn left/right (about its center point) and go forward/backward. It is built around an Arduino Uno with two Lego NXT motors.
I use the following code to get the robot to point in the right direction. However, when the robot gets close to the angle it is supposed to travel at, instead of stopping and going forward it tries to fix its orientation indefinitely.
def correctOrientation(self, rx: int, ry: int):
    # Returns the direction the robot needs to point.
    direction = self.getDirection((self.sx, self.sy), (rx, ry))
    # Method to stop the robot.
    self.comms.stop()
    anglediff = (self.angle - direction + 180 + 360) % 360 - 180
    while not (abs(anglediff) < 15):
        # Decide which way to turn.
        if self.isTurnLeft(self.angle, direction):
            self.comms.turnLeft()
        else:
            self.comms.turnRight()
        # Sleeps compensate for the delay in the camera feed,
        # allowing it to get the location right.
        time.sleep(0.3)
        self.comms.stop()
        # Update position.
        self.getPos()
        time.sleep(1)
        # Recalculate the robot's orientation and update it.
        self.angle = self.calcOrientation()
        anglediff = (self.angle - direction + 180 + 360) % 360 - 180
        print(anglediff)
        time.sleep(1)
These are my helper functions. I calculate the orientation of the robot by taking two known points on the robot and drawing a line between them; that line is parallel to the robot's orientation.
def isTurnLeft(self, angle, touchAngle):
    diff = touchAngle - angle
    if diff < 0:
        diff += 360
    if diff < 180:
        return False
    else:
        return True

def calcOrientation(self) -> float:
    return self.getDirection(self.marker[0], self.marker[3])

def getDirection(self, source: Tuple[int], target: Tuple[int]) -> float:
    return (math.degrees(math.atan2(target[1] - source[1], target[0] - source[0])) + 360) % 360
I can't figure out whether the logic of my code is the problem. If so, what can I do about it? And if the code is fine and the problem is the delay/setup of the system, what other ways are there to control the robot?
Thank you for the help.
Solved the problem by changing the image library used for AR tag recognition. We were using two tags on one robot; detecting two tags is significantly slower and more failure-prone, so I updated it to use only one tag. Moreover, I switched from angle-based calculations to vector-based ones, which are much simpler to reason about.
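For illustration, a minimal sketch of what "vector based" can look like (the names here are hypothetical, not the actual code): the sign of the 2D cross product between the robot's heading vector and the vector toward the target tells you which way to turn, with no angle wraparound to handle.

def turn_direction(heading, to_target):
    # heading and to_target are 2D vectors (x, y) in a y-up frame;
    # a positive cross product means the target lies to the left.
    cross = heading[0] * to_target[1] - heading[1] * to_target[0]
    return "left" if cross > 0 else "right"

# Example: facing along +x, target up and to the left -> turn left
print(turn_direction((1, 0), (-1, 1)))  # prints "left"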
I wrote a simple 3D engine for pygame. Now I'm trying to create a simple first-person controller for moving around a scene. The engine assumes a 3D grid with the player starting out facing the negative z axis. The camera can rotate about the three axes as if it were at the origin. Currently the first-person controller can only rotate around the y axis, i.e. look around 360 degrees. I'm trying to make the WASD controls move the player relative to the direction the camera is facing.
Here is what I currently have (self.camera is a camera object from my 3D engine, with attributes x, y, z (position) and x_rot, y_rot, z_rot (rotation)):
import python3d as p3d
import pygame
import math

class first_person_controller:
    def __init__(self, scene, camera, end_func):
        pygame.event.set_grab(True)
        pygame.mouse.set_visible(False)
        self.scene = scene
        self.camera = camera
        self.end_func = end_func

    def fps_control(self):
        keys = pygame.key.get_pressed()
        relative = pygame.mouse.get_rel()
        self.camera.change_camera_rot([0, relative[0] / 1000 * -1, 0])
        if keys[pygame.K_w]:
            self.camera.z += math.cos(self.camera.y_rot) * 5 * -1
            self.camera.x += math.sin(self.camera.y_rot) * 5 * -1
        if keys[pygame.K_s]:
            self.camera.z += math.cos(self.camera.y_rot) * 5
            self.camera.x += math.sin(self.camera.y_rot) * 5
        if keys[pygame.K_a]:
            self.camera.z += math.cos(self.camera.y_rot + math.radians(90)) * 5 * -1
            self.camera.x += math.sin(self.camera.y_rot + math.radians(90)) * 5 * -1
        if keys[pygame.K_d]:
            self.camera.z += math.cos(self.camera.y_rot + math.radians(90)) * 5
            self.camera.x += math.sin(self.camera.y_rot + math.radians(90)) * 5
        if keys[pygame.K_ESCAPE]:
            self.end_func()
        if keys[pygame.K_c]:
            self.camera.y = 0
        elif not keys[pygame.K_c]:
            self.camera.y = 5
This is supposed to fit into a normal pygame loop like:
while True:
clear screen
fps_control()
render scene
However, when implemented, the first-person controller goes off course and at least appears not to walk in a straight line. I can't tell if this is because of my math or my engine.
If someone could tell me which is the issue - the trigonometry or the 3D engine - and, if it's the trigonometry, suggest better equations, that would be great.
If it helps, here is the 3D engine, but as of right now, the code isn't very readable.
Thanks.
I'll probably be voted down for it, but whatever.
Your code reminds me a lot of an earlier attempt at a 3D engine of my own. However, after facing numerous problems, I figured out that vectors would help me a LOT, and then I learned matrices, as these helped TONS MORE.
And here is a link to a course named Vector Math for 3D computer graphics, which is already great, but then you should go a step further by using 4x4 matrices. It may seem like boring stuff, but you'll be glad to see how easy the logic becomes once you get it.
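For example, with the conventions of the code above (assuming y_rot is in radians, and matching the signs of the original key handlers), a single helper that rotates a camera-local movement vector into world space can replace the four per-key trigonometry blocks. A sketch:

import math

def move_local(camera, dx, dz):
    # Rotate the camera-local movement vector (dx, dz) by the camera's
    # yaw and add it to the world position; y_rot assumed in radians.
    cos_y, sin_y = math.cos(camera.y_rot), math.sin(camera.y_rot)
    camera.x += dx * cos_y + dz * sin_y
    camera.z += -dx * sin_y + dz * cos_y

# W: move_local(camera, 0, -5)   S: move_local(camera, 0, 5)
# A: move_local(camera, -5, 0)   D: move_local(camera, 5, 0)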
I built a robot that runs on Python. For its autonomous program I need it to travel a certain distance (say 10 feet). Currently I am using time to make it cover the distance, but is there any way to use distance in the code to make it more exact? Thank you.
This is code from an old robotics competition I did, and I want to learn by improving it. I used these libraries:
import sys
import wpilib
import logging
from time import time
This is the code:
def autonomous_straight(self):
    '''Called when autonomous mode is enabled'''
    t0 = time()
    slow_forward = 0.25
    t_forward_time = 6.5
    t_spin_time = 11
    t_shoot_time = 11.5
    while self.isAutonomous() and self.isEnabled():
        t = time() - t0
        if t < t_forward_time:
            self.motor_left.set(slow_forward)
            self.motor_right.set(-slow_forward)
            self.motor_gobbler.set(1.0)
        elif t < t_spin_time:
            self.motor_left.set(2 * slow_forward)
            self.motor_right.set(2 * slow_forward)
            self.motor_shooter.set(-1.0)
        elif t < t_shoot_time:
            self.motor_mooover.set(.5)
        else:
            self.full_stop()
            self.motor_gear.set(-1.0)
        wpilib.Timer.delay(0.01)
    self.full_stop()
It looks like you're trying to drive a distance based on time. While this may work over short distances at known speeds, it's generally much better to drive based on sensor feedback. What you'll want to look at are encoders, more specifically rotary encoders. Encoders are simply counters that keep track of 'ticks'; each 'tick' represents a fraction of a rotation of the drive shaft. The distance traveled is then

d = (cir / res) * num

where d is the distance traveled, cir is the wheel circumference, res is the encoder 'resolution' (the number of ticks per rotation), and num is the current tick count (read off the encoder).
# Example implementation
import wpilib
import math

# Some dummy speed controllers
leftSpeed = wpilib.Spark(0)
rightSpeed = wpilib.Spark(1)

# Create a new encoder linked to DIO pins 0 & 1
encoder = wpilib.Encoder(0, 1)

# Transform ticks into distance traveled,
# assuming 6" wheels, 512 resolution
def travelDist(ticks):
    return ((math.pi * 6) / 512) * ticks

# Define auto, takes travel distance in inches
def drive_for(dist=10):
    while travelDist(encoder.get()) < dist:
        rightSpeed.set(1)
        leftSpeed.set(-1)  # negative because one controller must be inverted
This simple implementation lets you call drive_for(dist) to travel a desired distance with a fair degree of accuracy. It does, however, have quite a few problems. Here we attempt to set a linear speed; notice there is no acceleration control. This causes error to build up over longer distances. The solution is PID control, and wpilib has constructs to simplify the math: simply take the difference between your setpoint and the current travel distance and feed it into your PID controller as the error. The PID controller will output a new value to set your motor controllers to, and with some tuning it can account for acceleration, inertia, and overshoot.
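As a rough sketch of that error-feedback idea (a proportional-only stand-in for wpilib's full PID controller, with a made-up gain), reusing the names from the example above:

kP = 0.05       # proportional gain, needs tuning on the real robot
setpoint = 120  # target travel distance in inches

def drive_pid_step():
    # Error is the distance still left to travel
    error = setpoint - travelDist(encoder.get())
    # Proportional output, clamped to the motor range [-1, 1]
    output = max(-1.0, min(1.0, kP * error))
    rightSpeed.set(output)
    leftSpeed.set(-output)  # one side inverted, as above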
docs on PID:
http://robotpy.readthedocs.io/projects/wpilib/en/latest/_modules/wpilib/pidcontroller.html?