What's working:
* core utilities, minus rostopic list (some issue with roslz4 and Python)
* desktop
* desktop_full, minus gazebo (there's a conflict building Gazebo 5.1.0 right now - more on this later)
Just figured I'd let people know! Please email me if you have any questions/issues.
As announced a few days ago, we published episodes 2 and 3 of our Learning ROS series. Over these two new episodes we explain how to create a ROS package that allows a drone to autonomously take off, do stuff (or idle, as in our case) and land (source code). Check out the last part of the video (minute 8:58), where we show a live demo of the code developed during the session.
We've gotten good feedback so far, with several comments pointing out issues with the audio quality. We'll try to fix this in future videos we record. Feel free to throw out ideas on what kinds of topics you'd like to see explained that could be helpful for your research or classes.
At Erle Robotics we've been building Linux-based robotic brains for robots and drones for the last year. Our technology is being used in many universities and research centers around the world, and with the feedback obtained from our users and community, we've decided to put together a Learning ROS series oriented toward autonomous vehicles that use APM.
We wish to make this material as useful as possible to as many people as we can reach. Feedback is welcome either here or directly to victor [at] erlerobot.com.
Thanks to the efforts of Mark Silliman, Austin Meyers, and Melissa Eaton, we have a new set of tutorials for the TurtleBot targeted at making the TurtleBot, and ROS in general, accessible to web developers. The tutorials go all the way through setting up a web interface. Find them at learn.turtlebot.com
Here's his announcement.
Free tutorials bring robotics programming to the web developer masses
Learn.turtlebot.com provides an easy launch pad for anyone interested in robotics
If you've ever dreamed of learning to program a robot, but didn't know how or where to start, your
day just got a whole lot better -- and your next few weekends are booked.
This week, Learn.TurtleBot.com debuted its free, 30-session tutorial, which promises to teach developers how to use the Robot Operating System (ROS) to drive a TurtleBot, an open-source hardware and software platform that can autonomously navigate to objects and places.
"There have been so many technical barriers to robotics from the cost to the fact that most of the literature is aimed at academics," says the tutorial's creator Mark Silliman. "I want robotics to be
accessible to anyone who is interested."
The tutorials are written at a high-school level and can be completed over a couple of weekends. Each includes a video and takes about an hour to finish. By the series' end, developers will be able to direct their TurtleBot to bring coffee to their desk (view video).
While the chance to build a coffee-bot is undoubtedly exciting, Silliman developed and funded the tutorials with an even loftier aim: to help create the next generation of robotics programmers.
Silliman, a serial entrepreneur and CEO of Smartwaiver, has a long-held passion for the robotics industry and its potential.
"I want to live in a world with a robot in every home, and though we have the computing power to do so, we also need a critical mass of people studying the field," Silliman says.
He contends that robots are ready for the first wave of amateur developers to build off the early work of robotics pioneers and take the field to new heights. Software packages such as ROS as well as the relatively affordable TurtleBot, which you can buy for $1,000 or build yourself for less, have helped make robotics even more accessible.
And now the learn.turtlebot.com tutorials push the needle even further by teaching core robotics programming concepts in a fun and affordable way.
"We're standing on the shoulders of many brilliant people and engaged, innovative communities," Silliman says. "I hope this helps break down even more barriers and puts us that much closer to making the robot revolution a reality."
The iRobot Create 2 is a great resource for students and developers alike. With many programming methods, anyone from beginning programmers to advanced roboticists can utilize the Create 2 platform to grasp the fundamentals of robotics and build their own advanced solutions with sensors, video feeds, and other peripherals.
The Gumstix iRobot Create 2 tutorials focus on using basic ROS to control and steer a Create 2 while streaming video wirelessly, covering everything from building the package to installation and deployment. The video shows the final results of the project. Follow the iRobot Create 2 Gumstix tutorial series here: https://goo.gl/nZWtWW
A more specific tutorial showing how to add a real genetic algorithm planner is presented on this tutorial page.
It is possible to work with other path planning algorithms. We implemented the iPath C++ library, which provides implementations of several path planners including A*, GA, local search, and relaxed versions of A* and Dijkstra (much faster than A* and Dijkstra). More will be added soon on Google Code.
The iPath library is available as open source on Google Code under the GNU GPL v3 license. The library was extensively tested on different maps, including those provided in this benchmark and other randomly generated maps.
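To give a feel for the kind of planners iPath provides, here is a minimal A* sketch on a 4-connected occupancy grid. This is a generic textbook version in Python, not iPath's C++ implementation; the grid representation and Manhattan heuristic are assumptions for illustration.

```python
import heapq
import itertools

def astar(grid, start, goal):
    """Minimal A* on a 4-connected occupancy grid.
    grid: list of lists, 0 = free, 1 = obstacle.
    start, goal: (row, col) tuples.
    Returns the path as a list of cells, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])

    def h(cell):
        # Manhattan distance: admissible on a 4-connected grid
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    tie = itertools.count()          # tie-breaker so the heap never compares cells
    open_set = [(h(start), 0, next(tie), start, None)]
    came_from = {}                   # cell -> parent; doubles as the closed set
    g_best = {start: 0}

    while open_set:
        _, g, _, cell, parent = heapq.heappop(open_set)
        if cell in came_from:
            continue                 # already expanded with a better cost
        came_from[cell] = parent
        if cell == goal:
            # Walk parents back to the start to reconstruct the path
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_best.get(nxt, float("inf")):
                    g_best[nxt] = ng
                    heapq.heappush(open_set, (ng + h(nxt), ng, next(tie), nxt, cell))
    return None
```

The relaxed A* variants mentioned above trade optimality guarantees for speed, typically by weighting or simplifying the heuristic.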
A tutorial on how to use the iPath simulator is available at this link.
Credits particularly go to Maram Al-Ajlan (Master's student at Al-Imam Mohamed bin Saud University, Saudi Arabia) and Imen Chaari (PhD student at Manouba University, Tunisia) for their efforts in implementing the algorithms and integrating them with ROS.
If you have any suggestions or questions about tools or tutorials, please contact me.
For more information see the submission on the Matlab Central File Exchange: http://www.mathworks.com/matlabcentral/fileexchange/44853-use-matlab-ros-io-package-to-interact-with-the-turtlebot-simulator-in-gazebo
I have modified Rviz so it can render in stereo. This will work on Linux if you have a Quadro card, 3DVision glasses, and a stereo-capable display (it should also work on any system that supports quad-buffered stereo display).
For now this requires building Ogre and Rviz from source. If this does not scare you and you want to try it out, you can read how here (it's not really that hard if you have the necessary stereo graphics hardware):
One of the new features you may have noticed on the ROS website is the Robots portal pages, which are designed to help you get a new ROS-enabled robot up and running quickly.
But what do you do if you are trying to build your own robot?
Just head over to the new Sensors page, where there is a list of sensors supported by official ROS packages and many other sensors supported by the ROS community. The Sensors portal pages have detailed tutorials and information about different types of sensors, organized by category. Hopefully, the sensor portal pages can also become a resource for developers and inspire interoperability between similar sensors.
If your robots or sensors are not on the list, you can help improve the portals by adding your documented packages and tutorials.
While not yet complete, ROS is now one step closer to working on Windows using the MinGW (Minimalist GNU for Windows) compiler toolchain, thanks to the work of Daniel Stonier and Yujin Robot.
If you are a Windows user with experience using MinGW or cross compiling, there is a tutorial up showing how to use Qt with ROS on Windows, for people interested in testing and improving ROS support for Windows.
We've had a need to develop test and debugging apps for our test and factory engineers, who, unfortunately (for them!), only use Windows. While Service Robotics' patched ROS tree could give us MSVC apps, it wasn't patched into ROS mainstream, and it couldn't let us share our own testing apps on Linux with the test engineers on Windows without building two of each application.
So... enter mingw cross <http://mingw-cross-env.nongnu.org/>! We've now got this patched into eros/ros, up to being able to run a talker/listener and an add-int server/client, along with built-in support for Qt as well.
If you're interested in being a guinea pig to test this, or just curious,
you can find a tutorial on the ros wiki here:
If you come across any bugs (in the tutorial or the installation), reply to this email, or contact me on IRC in OFTC #ros so we can squash the buggers.
Hopefully, as time goes by, we can patch in support for other commonly used ROS packages, as well as adding rosdeps upstream to the mingw cross environment. We also aim to get ROS running on MSVC in a more complete way, but that will need to wait for the looming rebuild of the ROS build environment.
Patrick Goebel of PI Robot has put together an excellent tutorial on doing 3D head tracking with ROS. In Part 1 he covers configuring TFs, setting up the URDF model and configuration of Dynamixel AX-12+ servos for controlling the pan and tilt of a Kinect.
I have put together a little tutorial on using tf to point your robot's
head toward different locations relative to different frames of
reference. Eventually I'll get the tutorial onto the ROS wiki, but for
now it lives at:
The tutorial uses the ax12_controller_core package from the ua-ros-pkg
repository. Many thanks to Anton Rebguns for patiently helping me get
the launch files set up.
Please post any questions or bug reports to http://answers.ros.org or
email me directly.
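For readers curious about the geometry behind pointing a pan-tilt head at a target, here is a rough sketch of the trigonometry involved. This is plain Python for illustration only, not code from the tutorial (which uses tf and the ax12_controller_core package); the function name and frame convention are assumptions.

```python
import math

def pan_tilt_to_target(x, y, z):
    """Compute pan and tilt angles (radians) that aim a pan-tilt head
    at a point (x, y, z) expressed in the head's base frame, assuming
    the common ROS convention: x forward, y left, z up.
    In a real system, tf would first transform the target point from
    whatever frame it lives in into this frame."""
    pan = math.atan2(y, x)                    # rotate left (+) / right (-) toward target
    tilt = math.atan2(z, math.hypot(x, y))    # raise (+) / lower (-) toward target
    return pan, tilt
```

For example, a target one meter straight ahead needs zero pan and zero tilt, while a target directly to the left needs a pan of +90 degrees.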
Melonee Wise has put together a tutorial on Adding a Kinect to an iRobot Create, which we hope will help those of you interested in using the Kinect on inexpensive platforms. It walks you through two different methods of powering the Kinect directly off the Create (thanks Sparkfun!).
I Heart Robotics has a five-part, and ongoing, "Vision for Robotics" series that helps readers integrate OpenCV into their robotics applications. They also put together a useful review of ROS USB cam drivers, which might be a useful prerequisite.
Billy McCafferty of sharprobotica.com has put together a six-part series on design and testing concerns for developing ROS packages. This series tackles these challenges step-by-step, layer-by-layer: