Only released in EOL distros.

Motivation

  • This is the first package of the wpi_nao stack. More packages will hopefully be available soon for human-robot interaction and various robot-oriented machine learning applications.

  • The main purpose is to control humanoid robots (an Aldebaran Nao in this case) in the most natural way possible, without wearing any device, using arm and leg gestures. The second purpose is to make it easier to teach new tasks to robots, even for people with no robotics background. The ultimate aim is to contribute to the Learning from Demonstration field.

Functionality

  • This package aims to be more general than just controlling a Nao with a Microsoft Kinect. It basically queries a depth camera (a Microsoft Kinect in this case), interprets the user's gestures (angles between limbs), and publishes messages to the robot control node (nao_ctrl in this case). A rough sketch of this pipeline is shown below.
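
  • To illustrate the idea (this is not the actual teleop_nao_ni.cpp implementation), here is a minimal Python sketch of the pipeline: take two limb vectors from tracked joint positions (assumed here to be already available as 3D points), compute the joint angle, and publish a command for the robot control node. The joint positions, the topic name and the velocity mapping below are made-up placeholders.

     #!/usr/bin/env python
     # Illustrative sketch only -- not the actual teleop_nao_ni code.
     import rospy
     import numpy as np
     from geometry_msgs.msg import Twist

     def limb_angle(a, b, c):
         """Angle (radians) at joint b formed by points a-b-c (e.g. shoulder-elbow-hand)."""
         v1 = np.asarray(a) - np.asarray(b)
         v2 = np.asarray(c) - np.asarray(b)
         cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
         return np.arccos(np.clip(cos, -1.0, 1.0))

     if __name__ == "__main__":
         rospy.init_node("gesture_sketch")
         pub = rospy.Publisher("cmd_vel", Twist)   # placeholder topic
         rospy.sleep(1.0)                          # give the connection time to come up
         # Hypothetical joint positions (metres), e.g. from a skeleton tracker:
         shoulder, elbow, hand = (0.0, 0.3, 2.0), (0.0, 0.0, 2.0), (0.3, 0.0, 2.0)
         angle = limb_angle(shoulder, elbow, hand)
         cmd = Twist()
         cmd.linear.x = 0.5 if angle > 1.0 else 0.0   # toy mapping: bent arm -> walk forward
         pub.publish(cmd)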

How To Control the Robot

  • Figures: nao_ni_controller.png (left), nao_ni_limbs.png (right).


  • (Left) The location where the user switches the motors ON becomes the origin of the navigation controller. The blue zone shown in the figure is the safe zone, where the user's small movements won't affect the robot. When the user steps into the red zone, the robot steps in the +/-X or +/-Y direction accordingly. The initial pose of the user's shoulders becomes the zero for the robot's angular motion. Again, the angular blue zone shown in the figure is safe, but a rotation between the blue and red lines makes the robot rotate around the Z axis (see the sketch after this list).


  • (Right) The robot has two modes: Body Control mode and Gaze Direction Control mode. The user can switch between these modes with his/her right leg. In Body Control mode, the motions shown in the left figure make Nao navigate. In Gaze Direction Control mode, Nao pays attention to the user's right arm and rotates its head accordingly. The user can then use his/her left arm to give an additional command. For now, Nao rotates and walks towards the direction it is looking at; I hope to add more commands with the help of the community.
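
  • As a rough illustration of the safe-zone behaviour described above (the actual thresholds and logic live in teleop_nao_ni.cpp), the mapping from the user's displacement to a walk command could look like the following sketch; the zone sizes are made-up values.

     # Sketch of the safe-zone mapping; thresholds are made-up values.
     SAFE_RADIUS = 0.15   # metres around the origin where Nao stands still (blue zone)
     SAFE_ANGLE  = 0.30   # radians of shoulder rotation that are ignored (angular blue zone)

     def walk_command(dx, dy, dtheta):
         """dx, dy: user displacement from the origin; dtheta: shoulder rotation from the initial pose.
         Returns (vx, vy, vtheta) for the robot, zero inside the safe zones."""
         vx = dx if abs(dx) > SAFE_RADIUS else 0.0
         vy = dy if abs(dy) > SAFE_RADIUS else 0.0
         vtheta = dtheta if abs(dtheta) > SAFE_ANGLE else 0.0
         return vx, vy, vtheta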

Video

  • To appear in the HRI 2011 Videos Session.

Inside the Package

  • nao_openni/src/teleop_nao_ni.cpp is the main and, for now, the only source file. It has plenty of comments, but there is still room for more. I would be more than happy if you read it, find mistakes, improve the code, and help me work through the To Do list.

  • nao_openni/nao_ni_walker.py is the Python script that controls Nao's motion and speech. It should be copied into the folder PATH_TO_YOUR_ROS_STACKS/FREIBURGS_NAO_PACKAGE/nao_ctrl/scripts. Here, FREIBURGS_NAO_PACKAGE is the package developed by Armin Hornung at the University of Freiburg (see http://www.ros.org/wiki/nao). A sketch of the kind of NaoQi calls such a script makes is given below.
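
  • For orientation only, this is the kind of NaoQi call a script like nao_ni_walker.py makes. It is a generic sketch assuming the NaoQi Python SDK (the naoqi module) is on your PYTHONPATH; the real script lives in nao_ctrl/scripts and may use different calls.

     # Generic NaoQi sketch -- not the actual nao_ni_walker.py.
     from naoqi import ALProxy

     NAO_IP, NAO_PORT = "YOUR_NAOS_IP_HERE", 9559

     motion = ALProxy("ALMotion", NAO_IP, NAO_PORT)
     tts    = ALProxy("ALTextToSpeech", NAO_IP, NAO_PORT)

     motion.setStiffnesses("Body", 1.0)                 # turn the motors on
     tts.say("Ready")                                   # speech feedback
     motion.setWalkTargetVelocity(0.5, 0.0, 0.0, 0.5)   # walk forward at half speed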

How to Run the Code

  1. If you don't have them already, download the packages listed in the Dependencies section, along with their own dependencies, using,
     rosdep install PACKAGE_NAME
  2. Copy nao_openni/nao_ni_walker.py to nao_ctrl/scripts/

  3. Launch Microsoft Kinect nodes,
     roslaunch openni_camera openni_kinect.launch
  4. If you'd like to see yourself being tracked, which is useful when you control the robot,
     rosrun openni Sample-NiUserTracker
  5. Turn on Nao, make sure that it's connected to the network, and check its IP address,
     roscd nao_ctrl/scripts
     ./nao_ni_walker.py --pip="YOUR_NAOS_IP_HERE" --pport=9559
  6. Make and run nao_ni
     rosmake nao_openni
     rosrun nao_openni teleop_nao_ni
  7. Stand in the standard Psi pose and wait for the nao_ni code to print "Calibration complete, start tracking user".
  8. For a short tutorial and to get familiar with the commands, see the video.


IMPORTANT NOTE: Calibration in nao_ni currently takes much longer than in Sample-NiUserTracker, because the message publishing rate for Nao is relatively low. Don't get confused if you're watching yourself in "Sample-NiUserTracker": wait until you see the "Calibration Complete" message in the terminal where you're actually running "nao_ni". The plan is to make calibration independent of the publishing rate, and thus faster, in the next release; one possible approach is sketched below.
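
One way to decouple the two rates (a sketch of the plan mentioned above, not the current nao_ni code) is to keep reading and calibrating from the depth camera in a fast loop while publishing to the robot on a separate, slower timer; the rates and the topic name below are placeholders.

     # Sketch: fast sensor/calibration loop, slower publishing towards Nao.
     import rospy
     from geometry_msgs.msg import Twist

     rospy.init_node("rate_sketch")
     pub = rospy.Publisher("cmd_vel", Twist)   # placeholder topic
     latest_cmd = Twist()

     def publish_to_nao(event):
         pub.publish(latest_cmd)               # low-rate updates towards the robot

     rospy.Timer(rospy.Duration(0.2), publish_to_nao)   # ~5 Hz towards the robot

     rate = rospy.Rate(30)                     # ~30 Hz camera/calibration loop
     while not rospy.is_shutdown():
         # ... read depth frames, update calibration and latest_cmd here ...
         rate.sleep()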

More information

Dependencies

The gesture interpretation code is written in C++, and the following ROS packages are required:
