
CS393R: Autonomous Robots -- Using the Nao and the UT Austin Villa Codebase


For Assignment 4, Assignment 5, and the final project we will be using humanoid Nao robots, the UT Austin Villa codebase, and the UTNaoTool (a tool developed by UT Austin Villa for developing and debugging code on the Nao robots).

Initial Setup on Lab Machines

You should use these steps to obtain, set up, and run code on the lab machines and robots for the first time. Note that these steps do not work on the departmental machines. See the bottom of this page for instructions if you want to try to install the codebase on your personal 32-bit Linux machine.

  1. Copy the code from my folder to wherever you are going to use it (I recommend your home directory):
    cp -r /home/katie/Public/Nao .
  2. In your home directory, edit your .bashrc file and put
    export NAO_HOME=~/Nao/trunk/
    export LD_LIBRARY_PATH=/usr/local/mesa-libs/lib
    export TEAM_NUM=5 (replacing 5 with your team number)
    at the end (assuming you copied the code into your home folder). Save the file, then type bash in any open terminal windows so the new variables take effect.
  3. In a new terminal, cd $NAO_HOME/tools/UTNaoTool. Type make clean, and then make.
  4. Get your robot out of its locker, and sit it on the ground near you. Plug it in. Turn it on by pushing its chest button for a second. Lights should come on in the eyes. Ping your robot by typing ping 192.168.1.2XX, where XX is your team number (01 for team 1, 02 for team 2, etc.).
  5. Once you have successfully pinged your robot, open another terminal and type cd $NAO_HOME/install/. Once here, run the setup_passwordless_ssh script.
  6. Now ssh into your robot by typing ssh nao@192.168.1.2XX. Once logged in, type nao stop to stop naoqi, and then killall naoqi to make sure no naoqi processes are left running.
  7. Now type cd $NAO_HOME/build in another terminal. Then type the following:
    rm -r linux/ robot/
    ./lua
    ./zlib
    ./compile linux swig
    ./compile all all
  8. Then copy everything to your robot by typing ./copy_robot all.
  9. Now you can go back to the terminal you used to ssh into your robot. Type nao start. The robot will eventually say interface and then vision. Once it has said vision, you know it has completely booted.
  10. In the terminal where you compiled the UTNaoTool, type ./UTNaoTool -f (the -f flag also opens the files window). Select your robot's IP address from the drop-down menu in the upper right corner of the skinny files window. Now you can push the Set button to make the robot stand, the Playing button to make the robot walk, the Penalized button to make the robot stop walking and stand still, and the Finished button to make the robot sit down.

Preparing for Assignment 5

You should complete these steps before beginning work on Assignment 5. These steps assume that you have already performed the Initial Setup described in the previous section.

  1. Copy the following files from /home/katie/Public/Nao/trunk/ to the appropriate folders under your $NAO_HOME directory. If you set things up as suggested, you should be able to just use the following commands:
    cp /home/katie/Public/Nao/trunk/tools/UTNaoTool/UTMainWnd.cpp $NAO_HOME/tools/UTNaoTool
    cp /home/katie/Public/Nao/trunk/tools/UTNaoTool/UTMainWnd.h $NAO_HOME/tools/UTNaoTool
    cp /home/katie/Public/Nao/trunk/core/localization/PFLocalization.cpp $NAO_HOME/core/localization
    cp /home/katie/Public/Nao/trunk/core/memory/LocalizationBlock.h $NAO_HOME/core/memory
    cp /home/katie/Public/Nao/trunk/core/common/Field.h $NAO_HOME/core/common
    cp /home/katie/Public/Nao/trunk/core/lua/init.lua $NAO_HOME/core/lua
  2. In a new terminal, type cd $NAO_HOME/tools/UTNaoTool. Then type make clean, then ./configure, and then make.
  3. Now type cd $NAO_HOME/build in another terminal. Then type ./compile all all to compile the changes. Then copy everything to your robot by typing ./copy_robot all.

How to Take a Log

Although you will work with the provided logs for much of this assignment, you will want to take more logs once you have tested your algorithms enough on the provided logs. Following are instructions to capture logs and download them from the robot to your $NAO_HOME/logs folder. Note: your robot should be on, naoqi should be running, and the desired code should already be uploaded before following these instructions.

  1. Go to $NAO_HOME/tools/UTNaoTool and open the tool by typing ./UTNaoTool. Click on the Log Select button.
  2. Select what you want to log in the Log Select window. For the vision assignment, click the box next to vision under the Send To button and the raw_image box. For the localization assignment, only click the box next to Localization under the Send To button.
  3. Click the Send To button to send the modules to log to the robot.
  4. Select the box next to Log on directly under the Send To button in the Log Select window. This starts the logging. As quickly as safely possible, start whatever behavior you want (probably walking with head panning for the assignments) by placing the robot in Playing. If your robot is walking, follow it and ensure that it does not fall.
  5. Once you have captured whatever you wanted, push the chest button on the robot to stop the robot from walking, and then uncheck the box next to Log on directly under the Send To button in the Log Select window to stop logging. Put the robot into a sitting stance as soon as possible to reduce risk of falling, overheating, and joint wear and tear.
  6. Go to $NAO_HOME/build and call the copy_logs script, giving your robot's IP address as an argument (e.g., ./copy_logs 192.168.1.2XX). This script will copy the log(s) from your robot and then remove them from the robot. This may take a while.

How to Use a Log in the Vision Assignment

You are provided two logs to work with. You will want to test your algorithms on these logs and on future logs you take. You will also need to show me that your algorithm runs correctly on a log you take during your demo. The instructions below explain how to use both the logs you are provided and the logs you take. Note: you should already have logs to view (either from me or already downloaded from the robot) before following these instructions.

  1. Go to $NAO_HOME/tools/UTNaoTool and open the tool by typing ./UTNaoTool.
  2. Open the log by going to File->Open Log in the UTNaoTool and selecting the appropriate log.
  3. Select View Log on the main UTNaoTool window to see what was logged (i.e., what the code on the robot was doing when the log was taken), or Run Core to see what your current code is doing (you will almost always use Run Core). When using Run Core to analyze your current code on logs, be sure to call make in $NAO_HOME/tools/UTNaoTool and reopen the tool to see how your latest code performs on the log.
  4. For vision logs, you will likely want to open the vision window by clicking on the Vision button on the UTNaoTool. You can look at the images and classifications for each frame by moving through the frames on the main UTNaoTool window. The smaller top left image is the camera image and the smaller top right image is the segmented image. Either of the smaller images can be seen in the larger image by clicking on the smaller image.

How to Use a Log in the Localization Assignment

The instructions below explain how to use logs that you take. Note: you should already have logs to view (i.e., logs that have already been downloaded from the robot) before following these instructions.

  1. Go to $NAO_HOME/tools/UTNaoTool and open the tool by typing ./UTNaoTool.
  2. Open the log by going to File->Open Log in the UTNaoTool and selecting the appropriate log.
  3. Select View Log on the main UTNaoTool window to see what was logged (i.e., what the code on the robot was doing when the log was taken), or Run Core to see what your current code is doing (you will almost always use Run Core). In either case, check the Localization Only box on the main UTNaoTool window (unless you want to run vision too, which you shouldn't need to). When using Run Core to analyze your current code on logs, be sure to call make in $NAO_HOME/tools/UTNaoTool and reopen the tool to see how your latest code performs on the log.
  4. For localization logs, you will want to open the world window by clicking on the World button on the UTNaoTool. With the world window open, push F3 to bring up a display of all the particles. The robot is shown at the weighted average of all the particles (a sketch of this computation is given just after this list). You can see where the robot thinks it is in each frame by moving through the frames on the main UTNaoTool window.
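
For reference, the weighted average the tool displays can be computed as in the following sketch. This is only an illustration: the Particle struct, its field names, and the weightedMean function below are hypothetical and are not the types or names used in the codebase.

    // Weighted mean of a particle set (illustrative only; the localization
    // code's actual types and names may differ).
    #include <cmath>
    #include <vector>

    struct Particle {          // hypothetical: pose on the field plus importance weight
      double x, y, theta, weight;
    };

    // Positions average directly; the heading is averaged through its sine and
    // cosine so that angles near +/- pi do not cancel each other out.
    Particle weightedMean(const std::vector<Particle>& particles) {
      Particle mean = {0.0, 0.0, 0.0, 0.0};
      double sumSin = 0.0, sumCos = 0.0, totalWeight = 0.0;
      for (size_t i = 0; i < particles.size(); i++) {
        const Particle& p = particles[i];
        mean.x += p.weight * p.x;
        mean.y += p.weight * p.y;
        sumSin += p.weight * std::sin(p.theta);
        sumCos += p.weight * std::cos(p.theta);
        totalWeight += p.weight;
      }
      if (totalWeight > 0.0) {
        mean.x /= totalWeight;
        mean.y /= totalWeight;
        mean.theta = std::atan2(sumSin, sumCos);
        mean.weight = totalWeight;
      }
      return mean;
    }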

How to Make a Color Table

Note: It looks like the logs I gave you, and the logs you take, may have been taken with the top camera. For now, replace 'bottom' with 'top' in the instructions below. This will likely work. I'll update you soon via email if you need to do something different. FYI: you can see which camera a log was taken with by opening the Vision window in the UTNaoTool and checking whether 'Bottom Cam' or 'Top Cam' is written below the larger image.

  1. Go to $NAO_HOME/tools/UTNaoTool and open the tool by typing ./UTNaoTool.
  2. Open a log by going to File->Open Log in the UTNaoTool and selecting the appropriate log.
  3. Click the Vision button in the UTNaoTool main window.
  4. Check the Classify box under the large image in the vision window. Uncheck overlay to get rid of detected beacons. What you see now in the segmented window is your current segmentation using your current color table.
  5. Click Bottom Table->New Bottom. Then move the frame slider on the main UTNaoTool window back and forth a bit until the segmented window goes black. This gives you a clean color table to work with.
  6. In the smaller classification window, select the color you want to define from the drop down menu.
  7. Now, make your color table by selecting regions of the large image to paint the selected color. Left click in the large image to make a temporary selection, and then right click to store the selection to your color table. Be careful, though: you can undo one (and only one) stored selection by clicking the undo button in the classification window. You can remove colors by setting the drop-down menu color to undefined, but this is slow and tedious.
  8. Once you have defined all of the colors you want using step 7, do Bottom Table -> Save As defaultbottom.col in the vision window. I highly recommend copying your current defaultbottom.col table elsewhere before making a new color table, just to be safe.

Vision Assignment Documentation

In $NAO_HOME/core/vision/VisionModule.h, segAt(i,j) returns the segmented pixel value (from the color enum defined in vision.h) for the pixel at (i,j).

The height and width of the segmented image are defined in $NAO_HOME/core/vision/VisionModule.h as SEG_IMAGE_HEIGHT and SEG_IMAGE_WIDTH (120 and 160, respectively).
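
As a quick illustration of how the accessor and the image dimensions fit together, the loop below counts how many pixels are classified as a given color value. It is written as if it were inside VisionModule (so segAt and the SEG_IMAGE_* constants are in scope), and it assumes segAt takes (row, column) order; verify the actual ordering against VisionModule.h.

    // Count the pixels classified as 'color' in the segmented image.
    // Assumes segAt(i, j) is (row, column); check VisionModule.h to be sure.
    int countColor(int color) {
      int count = 0;
      for (int i = 0; i < SEG_IMAGE_HEIGHT; i++)
        for (int j = 0; j < SEG_IMAGE_WIDTH; j++)
          if (segAt(i, j) == color)
            count++;
      return count;
    }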

You should implement detectBeacons in $NAO_HOME/core/vision/VisionModule.cpp to find beacons from the segmented image. Hint: Consider using the algorithm presented in Fast and Cheap Color Image Segmentation for Interactive Robots by James Bruce, Tucker Balch and Manuela Veloso. However, you can use any algorithm you want.
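
To make the hint concrete, below is a much-simplified sketch of grouping same-colored pixels into bounding boxes, in the spirit of (but far simpler than) the run-length/region-merging algorithm from the paper. This is not the codebase's implementation: the Blob struct, the findBlobs function, the UNDEFINED_COLOR value, and the (row, column) ordering assumed for segAt are all illustrative assumptions to verify against VisionModule.h and vision.h. From blobs like these you could then look for the color pattern that makes up a beacon.

    // Simplified blob former (illustrative only). Written as if inside
    // VisionModule, so segAt and the SEG_IMAGE_* constants are in scope.
    // Unlike the union-find merging in the Bruce et al. paper, each pixel is
    // merged only into the blob to its left or directly above it, so
    // oddly-shaped regions may end up split across several blobs.
    #include <vector>

    const int UNDEFINED_COLOR = 0;      // assumption: check the enum in vision.h

    struct Blob {                       // hypothetical connected-region summary
      int color;
      int minX, maxX, minY, maxY;
      int pixelCount;
    };

    std::vector<Blob> findBlobs() {
      std::vector<Blob> blobs;
      std::vector<std::vector<int> > label(SEG_IMAGE_HEIGHT,
                                           std::vector<int>(SEG_IMAGE_WIDTH, -1));
      for (int y = 0; y < SEG_IMAGE_HEIGHT; y++) {
        for (int x = 0; x < SEG_IMAGE_WIDTH; x++) {
          int c = segAt(y, x);          // assumes (row, column) ordering
          if (c == UNDEFINED_COLOR) continue;
          int id = -1;
          if (x > 0 && segAt(y, x - 1) == c) id = label[y][x - 1];
          else if (y > 0 && segAt(y - 1, x) == c) id = label[y - 1][x];
          if (id == -1) {               // start a new blob at this pixel
            Blob b;
            b.color = c;
            b.minX = b.maxX = x;
            b.minY = b.maxY = y;
            b.pixelCount = 1;
            blobs.push_back(b);
            id = (int)blobs.size() - 1;
          } else {                      // grow the neighboring blob's bounding box
            Blob& b = blobs[id];
            if (x < b.minX) b.minX = x;
            if (x > b.maxX) b.maxX = x;
            if (y < b.minY) b.minY = y;
            if (y > b.maxY) b.maxY = y;
            b.pixelCount++;
          }
          label[y][x] = id;
        }
      }
      return blobs;
    }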

You need to implement getDistanceToBeacon() in $NAO_HOME/core/vision/VisionModule.cpp.
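
One common way such a function can estimate distance is a pinhole-camera (similar-triangles) model on the beacon's apparent height. The sketch below is only an illustration: the function name, the beacon height, and the focal length are placeholders, not values from the codebase; the real constants would have to be measured or looked up.

    // Pinhole-model distance estimate from apparent height (illustrative only).
    const double BEACON_HEIGHT_MM = 400.0;  // placeholder physical beacon height
    const double FOCAL_LENGTH_PX  = 200.0;  // placeholder focal length, in pixels

    double estimateBeaconDistanceMM(int topRow, int bottomRow) {
      int pixelHeight = bottomRow - topRow;
      if (pixelHeight <= 0) return -1.0;     // degenerate or missing detection
      // similar triangles: distance / realHeight == focalLength / pixelHeight
      return FOCAL_LENGTH_PX * BEACON_HEIGHT_MM / pixelHeight;
    }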

Call setBeaconObject (implemented in $NAO_HOME/core/vision/VisionModule.cpp) for every detected beacon. This method causes the detected beacons to be displayed as world objects in the vision window of the tool. An example of how to call this function, including the parameters it takes, is given in the insertFakeBeacon function.


Other Things You Should Know


Instructions for Installing the Codebase on Your Personal 32-bit Linux Machine

To install our codebase on your laptop, try the following steps. We have successfully gotten it to work on multiple 32-bit laptops running Ubuntu 10.04 (Lucid). We do not recommend attempting to use our codebase on 64-bit laptops; no one has been successful at this yet. Remember that although you can do a lot of work on your laptop if you install the codebase, you will still occasionally need to use the lab computers to take new logs on the robots. Note: At least one person has gotten everything to compile and work in Ubuntu 10.04 (Lucid) running in VMware Player. From a clean Linux install, he only had to do one additional apt-get, for mesag-dev.

  1. Make a Nao directory in your home directory. Navigate into this directory, and copy the code from /home/katie/Public/Nao just as you would when installing on the lab computers (you can ssh into any of the lab machines and copy the code from there).
  2. Do the following:
    sudo apt-get install build-essential cmake ccache libqt4-dev libqt4-core subversion swig lua5.1 cmake-curses-gui libmpfr-dev libboost1.40-all-dev libqwt5-qt4-dev libdevil-dev libode-dev ruby1.8-dev libqglviewer-qt4-dev libqglviewer-qt4-2 fping g++
    sudo apt-get install libboost-system*
  3. Then
    cd ~/Nao/trunk/build
    rm -r linux/ robot/
    ./lua
    ./zlib
    ./compile linux swig
    ./compile all all
  4. Then
    cd ~/Nao/trunk/tools/UTNaoTool
    rm UTNaoTool
    ./configure
    make
  5. Finally, set
    export NAO_HOME=~/Nao/trunk
    export TEAM_NUM=11 (replacing 11 with your team number)
    at the end of your .bashrc file.



Page maintained by Katie Genter
Questions? Send me mail