Make the robot follow a ball
Make sure you know how to disconnect the robot from charging and are able to turn it on. You should have received valid login/password details for the robot.
Ensure that you have set up the code on your user account. You may have done this already. Follow the instructions at Set up your user account.
In this tutorial, we will be connecting the robot and a desktop machine together. To do so, both machines need to expose their IP address to the ROS System. Do the following:
- Select a robot (the robot should have a kinect) and a desktop machine. Figure out the name of the desktop machine; in this example we'll use scruffy.csres. Depending on which machine you select, remember to modify the instructions below appropriately.
- On the desktop machine (logged in with your user account), edit the file ~/.bashrc and add the following line at the bottom, after the line that sources your catkin workspace:
export ROS_IP=`rosrun segbot_bringup get_addr eth0`
- On the robot (logged in with the common account provided), edit the file ~/.bashrc and add the following lines at the bottom, after the line that sources your catkin workspace. Make sure that you don't have leftover lines from someone else working on the assignment:
export ROS_IP=`rosrun segbot_bringup get_addr wlan0`
export ROS_MASTER_URI=http://scruffy.csres.utexas.edu:11311 # REMEMBER to change scruffy to whatever machine you use
Every time you work on this assignment, you may have to go through these instructions, in full or in part, to ensure the settings are correct.
On the desktop and the robot, close all open terminals and reopen them. This ensures that the configuration changes you made in the last section take effect.
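To confirm that the settings took effect, you can print the two environment variables in a freshly opened terminal and check that they look sensible:
echo $ROS_IP
echo $ROS_MASTER_URI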
- On the desktop Run the ROS master on the desktop. The master is a central book-keeper that keeps track of what processes are running in a ROS system. For the desktop and the robot to talk to each other, they will need to connect to the same master, which we will run on the desktop machine:
roscore
- On the robot Run the segbot configuration with a kinect on the robot. This configuration launches the robot base and the kinect driver.
roslaunch segbot_bringup segbot_kinect.launch
- On the desktop Look at the various ROS topics being published by the kinect (under the namespace nav_kinect). You will see a lot of topics relating to the kinect sensor, publishing various information from the color and depth sensors.
rostopic list -v
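You can also check that a topic is actually streaming data, and how fast, by asking rostopic for its publish rate:
rostopic hz /nav_kinect/rgb/image_color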
- On the desktop Take a look at the camera image from the Kinect. The images will stream slowly, as they are being sent over the wireless network:
rosrun image_view image_view image:=/nav_kinect/rgb/image_color
- On the robot We've prepared some blob detection code using the cmvision library that may already be available on the robots. We'll download it again just to be sure:
cd ~/catkin_ws/src
wstool set cmvision --git http://github.com/utexas-bwi/cmvision.git --version-new utexas-tutorial-stable
wstool update cmvision
cd ~/catkin_ws
catkin_make
- On the desktop Re-run the commands for adding cmvision on the desktop in your user account.
- On the robot Run the code using the command below:
roslaunch cmvision cmvision.launch
You can take a look at the launch file that runs the blob detector, cmvision.launch. First, navigate to the package:
roscd cmvision
Now you can open the launch file in your favorite editor (emacs, gedit, vim, nano). You'll see where the image topic is remapped to /nav_kinect/rgb/image_color, and you'll also see where the color map file path is set. You can also take a look at the color map file:
colors.txt
It simply gives a range of values for Red, Green, and Blue that should be classified as orange.
- On the robot You can view the output of the blob detection code by running the following command. The command may first produce a few error messages, and it may take about 10 seconds before it starts printing blob info. Be patient.
rostopic echo /blobs
This command will print out each message on topic /blobs that is sent by the system. By default, the blob detector is only looking for orange blobs, so use one of the 2 orange balls in the lab. You will see lots of messages go by (particularly if there's an orange object in front of the camera). Again, these blob messages tell you about orange blobs (sections of all orange) found in the camera image. The /blobs message has a few important pieces of information. It gives you a blob_count of how many blobs it found in the image, and then for each gives an area, and coordinates of the center and corners of the blob found in the image. You will notice that it sometimes detects small bits of orange without there being a ball (something else it is seeing looks orange). Also, the ball may sometimes be broken into multiple blobs. Please hold the ball in the air in front of your robot, as on the ground the robot will have to be very far away to see the ball.
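If you want to see exactly what fields these messages contain, you can print the message definition:
rosmsg show cmvision/Blobs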
- On the desktop Drive the robot around (It is a bad idea to move the robot by pushing it while it is on).
rosrun segbot_bringup teleop_twist_keyboard
If your robot does not move, you may need to restart the base (press the yellow button, then the green to turn it off, then green and then yellow to turn it back on).
In this section, we'll try and write a ROS Node that autonomously controls the robot by reading the output of the blobs topic and publishing command velocities to the robot (instead of teleop_twist_keyboard). To do so, it is recommended you briefly read through the following ROS tutorials (the information will be useful reference in the future) - remember to select catkin in these tutorials:
- Navigating the Filesystem
- Creating a Package
- Build a Package
- Understanding ROS Nodes
- Understanding ROS Topics
- Writing a Publisher and Subscriber in C++
Next, we'll use some information from these tutorials to perform the following tasks:
- On the desktop Create a package called tutorial1 in your workspace. You'll need to modify package.xml and add build + run dependencies on cmvision, geometry_msgs and roscpp.
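One way to do this is with catkin_create_pkg, which fills in package.xml with the dependencies you list (as described in the Creating a Package tutorial linked above):
cd ~/catkin_ws/src
catkin_create_pkg tutorial1 cmvision geometry_msgs roscpp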
- Inside the tutorial1 package, create a new ROS C++ node using the following template. You can name the file tutorial1.cpp. This code subscribes to the blob data, and publishes zero velocities to the robot.
#include <iostream>
#include "ros/ros.h"
#include "geometry_msgs/Twist.h"
#include "cmvision/Blobs.h"

ros::Publisher velocity_pub;

// This method is called whenever a blob message is received
void blobCallback(const cmvision::Blobs::ConstPtr& msg){

  // This is the output velocity that we will publish
  geometry_msgs::Twist output;

  // This is an example print message
  std::cout << "got blob message, count: " << msg->blob_count << std::endl;

  // first, we can check if any blobs were found
  // blobs are regions of a single color found in the camera image
  if (msg->blob_count > 0){

    // we may want to access / look at multiple blobs in the array
    // this array is all the blobs found in the latest camera image
    // you may have multiple blobs because there were multiple balls,
    // but also because the ball may have been broken into multiple blobs
    for (int i = 0; i < msg->blob_count; i++){

      // another example print with some blob info
      std::cout << "Detected blob " << i << " with area " << msg->blobs[i].area << std::endl;

      // some things to look at
      msg->blobs[i].area;   // blob area
      msg->blobs[i].x;      // blob center x
      msg->blobs[i].y;      // blob center y
      msg->blobs[i].left;   // blob left x
      msg->blobs[i].right;  // blob right x
      msg->blobs[i].top;    // blob top y
      msg->blobs[i].bottom; // blob bottom y

      // it's possible you will see multiple balls, or multiple detected
      // blobs within 1 ball... you may want to only use info from the biggest one
    }

    // TODO: decide what velocities to publish based on blob info
    // you probably want to set these to zero when you do not see any blobs
    // you may want to work on turning first, and then fwd/backward
    output.linear.x = 0;  // TODO: fill this in with some number for fwd velocity (meters/sec)
    output.angular.z = 0; // TODO: fill this in with some angular velocity (radians/sec)

    velocity_pub.publish(output); // publish this velocity message that we filled in
  }
}

int main(int argc, char **argv){

  ros::init(argc, argv, "follower");
  ros::NodeHandle n;

  // advertise that we will publish cmd_vel messages
  velocity_pub = n.advertise<geometry_msgs::Twist>("cmd_vel", 1000);

  // subscribe to blob messages and call blobCallback when they are received
  ros::Subscriber sub = n.subscribe("blobs", 1000, blobCallback);

  ros::spin();

  return 0;
}
- Compile the node by editing CMakeLists.txt. Specify the same dependencies under COMPONENTS as the ones you specified in package.xml. Refer to the ROS tutorial for more help on changing the CMakeLists.txt file; a sketch of the relevant lines is shown after the build commands below. You can compile the code by running the following commands in your catkin workspace:
cd ~/catkin_ws
catkin_make
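As a rough guide, here is a sketch of the relevant lines of the tutorial1 CMakeLists.txt (the file generated by catkin_create_pkg contains many commented-out sections around these; the path assumes your source file is at src/tutorial1.cpp):
find_package(catkin REQUIRED COMPONENTS cmvision geometry_msgs roscpp)
catkin_package()
include_directories(${catkin_INCLUDE_DIRS})
add_executable(tutorial1 src/tutorial1.cpp)
add_dependencies(tutorial1 ${catkin_EXPORTED_TARGETS})
target_link_libraries(tutorial1 ${catkin_LIBRARIES})
The add_dependencies line makes sure the cmvision message headers are generated before your node is compiled.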
- Run the code using the following command. This assumes you created an executable called tutorial1 when you modified the CMakeLists.txt file.
rosrun tutorial1 tutorial1
- Next, modify the template to produce code that moves the robot autonomously based on seeing the orange blobs. Here are some guidelines (a sketch of one possible control strategy follows this list):
- First, you may want to check out the file as is. You will notice at the bottom of the file, the node is subscribing to topic blobs, and passing in a callback method to be called whenever a blob message is received, called blobCallback. You’ll also see that the node is publishing to topic cmd_vel.
- Above this code, you'll see the blobCallback method. It takes in a pointer to the blob message published by cmvision. This message contains a count of all the blobs found in the latest image, as well as an array with info about each blob. Inside this blobCallback method, you'll see we've given examples of some of the variables you may want to access in deciding what velocities to send. Currently the code checks if there were any detected blobs, loops through them, printing the area of each, and then, if any blobs were detected, publishes a desired velocity of 0. This is the code you need to change.
- To start with, you may want to just work on turning, try to turn the robot so that the x and y of the blob are in the center of the camera image. Turning should be safer than moving forward and backward.
- Then, you may want to worry about the ball’s area and move the robot forward and backward to get to a desired distance from the ball.
- Finally, you may want to consider robustness to noise, other orange objects and the like. One option for this is to only consider detected orange blobs of some minimum size (smaller ones may simply be noise or random orange pixels).
- You will also probably want to decide what to do when multiple orange blobs are detected (ignore them all? pick one to track?)
- Another thing to think about is how to make the robot move smoothly rather than moving jerkily or randomly.
- When you don’t detect a ball, you probably want to send the robot a velocity of 0 to make it stop moving. Then it should only move if you place a ball in front of its camera.
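To make the guidelines above concrete, here is a minimal sketch of one possible control strategy, written against the same template. It is not the required solution, and every number in it is an assumption: the image width (640 pixels), the minimum blob area, the target area, the gains and the speed limits are all guesses that you will need to tune on the real robot.
#include <algorithm>
#include "ros/ros.h"
#include "geometry_msgs/Twist.h"
#include "cmvision/Blobs.h"

ros::Publisher velocity_pub;

// All of these constants are guesses -- tune them on the real robot.
const double IMAGE_WIDTH = 640.0;   // assumed width of the kinect image in pixels
const double MIN_BLOB_AREA = 200.0; // ignore blobs smaller than this (likely noise)
const double TARGET_AREA = 5000.0;  // blob area we try to maintain by driving fwd/back
const double TURN_GAIN = 0.002;     // radians/sec per pixel of horizontal error
const double DRIVE_GAIN = 0.00005;  // meters/sec per pixel^2 of area error
const double MAX_LINEAR = 0.3;      // speed limits, for safety
const double MAX_ANGULAR = 0.5;

void blobCallback(const cmvision::Blobs::ConstPtr& msg){
  // a freshly constructed Twist is all zeros, so if we see no ball we command a stop
  geometry_msgs::Twist output;

  // pick the biggest blob that is large enough not to be noise
  int best = -1;
  for (int i = 0; i < msg->blob_count; i++){
    if (msg->blobs[i].area >= MIN_BLOB_AREA &&
        (best == -1 || msg->blobs[i].area > msg->blobs[best].area)){
      best = i;
    }
  }

  if (best != -1){
    // turn so that the blob's center x moves toward the middle of the image
    double x_error = (IMAGE_WIDTH / 2.0) - msg->blobs[best].x;
    output.angular.z = TURN_GAIN * x_error;

    // drive forward if the ball looks small (far away), backward if it looks big (too close)
    double area_error = TARGET_AREA - msg->blobs[best].area;
    output.linear.x = DRIVE_GAIN * area_error;

    // clamp the commands so the robot never moves too fast
    output.linear.x = std::max(-MAX_LINEAR, std::min(MAX_LINEAR, output.linear.x));
    output.angular.z = std::max(-MAX_ANGULAR, std::min(MAX_ANGULAR, output.angular.z));
  }

  // publish on every message: zeros when no ball is seen, so the robot stops
  velocity_pub.publish(output);
}

int main(int argc, char **argv){
  ros::init(argc, argv, "follower");
  ros::NodeHandle n;
  velocity_pub = n.advertise<geometry_msgs::Twist>("cmd_vel", 1000);
  ros::Subscriber sub = n.subscribe("blobs", 1000, blobCallback);
  ros::spin();
  return 0;
}
Because the commands are proportional to the error, the robot naturally slows down as the ball approaches the center of the image and the target distance, which helps with the smooth-motion guideline above.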
- Whenever you edit the code, you will need to re-compile it (typically in a separate terminal) by running the following commands:
cd ~/catkin_ws
catkin_make
Make sure it compiles with no errors. You can then test your code by running the following command:
rosrun tutorial1 tutorial1
If the robot starts moving crazily or you want to stop it immediately, hitting Ctrl+C on your node should stop it from sending commands to the robot, so that the robot should stop. Work on your code until you are satisfied.
Have fun!!
If you finish the assignment or plan on working more at a different time, remember to power the robot down. Refer to the tutorial on Using the robot. Remember to lock the lab if you are the last one to leave!