[ros-users] [Discourse.ros.org] [TurtleBot] A do-it-yourself Turtlebot

Anfederman ros.discourse at gmail.com
Fri Feb 23 18:28:17 UTC 2018

I can send you the Botvac article plus YouTube videos of interest. I have to look for the original Roll Your Own Turtlebot article (about 3 laptops ago!). Sending raw odt:

Roll Your Own Turtlebot Part II

By Alan N. Federman (Dr. Bot)

Several years ago I wrote a Servo article showing how you could convert an old Roomba vacuum cleaner and a Microsoft Kinect into a robot training platform capable of teaching yourself ROS, the Robot Operating System from Willow Garage. Willow Garage has since closed, and Clearpath Robotics can sell you a complete brand-new Turtlebot for about $2100. Recently OSRF and Robotis announced a less expensive Turtlebot, but it is still a bit out of range for most serious amateurs. What if you could easily build the equivalent of a Turtlebot for under $300? I am going to show you how easy it is to convert a Neato robotic vacuum cleaner into a fully functional training platform in less than a day. Very little hardware skill is needed, and no special tools. Everything is available COTS, and the software is all open source.

What you need to get, if you don't already have them:

A laptop or WiFi-connected desktop running Ubuntu. This should be at least 14.04 running ROS Indigo, but it is better to run matching versions on both machines when working cross-platform; ROS versions are usually matched to Ubuntu releases.
A Neato Botvac or equivalent (I have seen used XV-12s for under $200, and new basic models for under $300)
A Raspberry Pi 3 (camera is optional but highly recommended) ~$50
A 16 GB SD card for the Pi ~$10
A rechargeable 5 V power pack (those for recharging cell phones are fine) ~$20
USB cables from the battery pack to the Pi (micro) and from the Pi to the Botvac (mini or micro, depending on model)
Small scraps of aluminum or tin from a can
Small scraps of flat plywood or acrylic
Velcro, double-sided tape, or other easy-to-remove adhesives

Step 1 Modifying the Botvac

Depending on your model, you may choose to skip hardware modifications entirely; then, if you mess up, you can still use it to clean your house! I removed the brushes and the dust bin, and used a strip of metal to disable the bin-detector switch (see Photo 1).

STEP 2 Preparing the Pi and attaching to Botvac

Artfully arrange the Pi, battery pack, and optional camera on a 6-by-6-inch flat piece of wood or plastic. Attach them with double-sided tape. On the bottom of the assembly, attach a piece of Velcro or similar quick-release fastener, and attach the matching Velcro to the top of the Botvac's LIDAR unit. Lastly, plug in the USB cables. You might want to charge your batteries now; it would be a shame to have all the software loaded and then have to wait to test it.

STEP 3 Loading the software onto the PI.

At the time of this writing, an official version of Ubuntu 16.04 was not available for the Pi 3, so I used the Ubuntu MATE (pronounced mah-tay) version. Instructions for loading MATE are found here:


And follow the instructions for 16.04, Raspberry Pi 2/3.

You can use an HDMI TV and an attached keyboard to set up the Pi initially, using the graphical environment. It also helps to have a direct Ethernet connection during initial setup, because you need to download a lot of software. From the desktop, it is pretty easy to get WiFi working.

I suggest writing an 8 GB image onto a 16 GB SD card. After the initial software is installed and you can bring up a graphical desktop, follow the intro screen and click on Raspberry Pi Information; it will let you expand the image to fill the 16 GB card, and will also let you configure your WiFi. I suggest you keep using the Ethernet connection, but at this point you can open a terminal, type 'sudo graphical disable', and then use ssh over WiFi to complete the installation.

Once Ubuntu is working, continue by loading ROS onto the Pi.

You may have to maintain your Ubuntu distributions; the following commands are useful:

sudo apt-get update
sudo apt-get upgrade  (must run both in sequence)

and sometimes to clear dpkg errors:

sudo dpkg --configure -a

Then add the ROS package repository and its key, and install ROS Kinetic:
sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu $(lsb_release -sc) main" > /etc/apt/sources.list.d/ros-latest.list'

sudo apt-key adv --keyserver hkp://ha.pool.sks-keyservers.net:80 --recv-key 0xB01FA116

sudo apt-get update
sudo apt-get install ros-kinetic-desktop-full

(if -full is not available, just get ros-kinetic-desktop)

sudo rosdep init
rosdep update
echo "source /opt/ros/kinetic/setup.bash" >> ~/.bashrc
source ~/.bashrc
sudo apt-get install python-rosinstall

ROS Catkin Workspace installation:

mkdir -p ~/catkin_ws/src
cd ~/catkin_ws/src
catkin_init_workspace
cd ..
catkin_make

Then edit ~/.bashrc and change the line "source /opt/ros/kinetic/setup.bash" to "source ~/catkin_ws/devel/setup.bash".
It also helps to add the following line if the Pi is hosting the robot:

export ROS_MASTER_URI=http://$HOSTNAME.local:11311

where $HOSTNAME is the name you have in the /etc/hostname file.
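Putting those pieces together, the tail of ~/.bashrc on the Pi might look like this (a sketch; "bv80bot" is a placeholder hostname, substitute the name in your /etc/hostname):

```shell
# Source the catkin workspace overlay (it chains to /opt/ros/kinetic):
source ~/catkin_ws/devel/setup.bash
# The Pi hosts the robot, so point ROS at the Pi's own .local name
# (resolved via mDNS/avahi on the LAN):
export ROS_MASTER_URI=http://bv80bot.local:11311
```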

Your /etc/hosts file should look like this (where yourhostname matches the name in /etc/hostname):	localhost	yourhostname

# The following lines are desirable for IPv6 capable hosts
::1     ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters

This will support ROS networking.

You also may wish to install chrony to keep the clocks of the different machines synchronized. The Pi doesn't have a real-time clock, so if your WiFi is not connected to the Internet, the Pi will have the wrong time and ROS nodes on different machines will disagree about timestamps:

sudo apt-get install chrony

Next I suggest you load the following ROS packages into your catkin_ws/src workspace:

ROS by Example part one (RBX1) from Patrick Goebel

https://github.com/pirobot/rbx1  (this should go on both your laptop and the Pi)


the SV-ROS Botvac nodes, courtesy mostly of Mr. Ralph Gnauck.

Follow the instructions in the README files to install and test.


cd ~/catkin_ws/src

git clone https://github.com/SV-ROS/intro_to_ros  (this also should go on both)

cd ..
catkin_make



On the laptop:

roscd teleop_twist_keyboard

If nothing is found:

sudo apt-get install ros-kinetic-teleop-twist-keyboard

On the Botvac, turn on the Pi and log in via ssh from a terminal window on the laptop.

I like to launch a custom base-only node on the Pi:

roslaunch bv80bot_node bv80bot_njoy.launch (code included at the end of this article)

Then you should hear the Neato Lidar unit start to spin.

On the laptop, open up another terminal window and set up the ROS_IP and ROS_MASTER_URI environment variables via the 'export' command.
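For example, on the laptop (a sketch; "bv80bot" is a placeholder for the Pi's hostname, and the laptop is assumed to be on the same WiFi network):

```shell
# Point this machine's ROS tools at the master running on the Pi:
export ROS_MASTER_URI=http://bv80bot.local:11311
# Advertise this machine's own address so nodes on the Pi can open
# connections back to publishers and subscribers running here:
export ROS_IP=$(hostname -I | awk '{print $1}')
echo "$ROS_MASTER_URI"
```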

Test to see if you are getting topics:

rostopic list

and that you are getting scans:

rostopic echo /scan

Finally, launch teleop:

rosrun teleop_twist_keyboard teleop_twist_keyboard.py

You should be able to drive your robot.

If you open RViz in another window, you should see the LIDAR returns.

What Next?

With just this simple robot, you can begin to learn how to accomplish advanced robotics tasks and begin to learn the subtleties of autonomous navigation. Because of the Neato's XV-11 LIDAR unit, you can do localization and obstacle avoidance simultaneously. Support for webcams and the Raspberry Pi camera is available through ROS nodes. I have gotten teleop via a Bluetooth joystick to work through the laptop, but not directly on the RPi. Please note that though the RPi has 4 USB ports, there is seldom enough power to run more than the Botvac interface and a WiFi dongle; a typical USB webcam will draw too much current and crash the RPi.

With a WiFi-connected phone and some ingenuity, you would be able to issue voice commands, so you could call up your home robot from the office and ask it to find the cat. The Botvac is a little underpowered for bringing you a snack from the kitchen, but when the next, more powerful platform is available, you'll know just how to program it.

Figures and Code

LISTING 1  /catkin_ws/src/intro_to_ros/bv80bot/bv80bot_node/launch/include/bv80bot_njoy.launch


<launch>

  <!-- Change this to use a different joystick controller -->
  <!-- Set the default value for the 'teleop_controler' arg below to the
       controller you are using to teleop the robot
       (no joystick: launch from the Pi, then use a joystick or keyboard
       on the remote machine) -->
  <arg name="teleop_controler"   default="xbox360" />

  <arg name="input_cmd_vel_topic"    default="/raw_cmd_vel" />
  <arg name="feedback_cmd_vel_topic" default="robot_cmd_vel" />
  <arg name="output_cmd_vel_topic"   default="smoothed_cmd_vel" />

  <!-- smooths inputs from cmd_vel_mux/input/teleop_raw to cmd_vel_mux/input/teleop -->
  <include file="$(find bv80bot_node)/launch/include/velocity_smoother.launch">
    <arg name="input_cmd_vel_topic"    value="$(arg input_cmd_vel_topic)" />
    <arg name="feedback_cmd_vel_topic" value="$(arg feedback_cmd_vel_topic)" />
    <arg name="output_cmd_vel_topic"   value="$(arg output_cmd_vel_topic)" />
  </include>

  <!-- velocity commands multiplexer -->
  <node pkg="nodelet" type="nodelet" name="cmd_vel_mux" args="load yocs_cmd_vel_mux/CmdVelMuxNodelet mobile_base_nodelet_manager">
    <param name="yaml_cfg_file" value="$(find bv80bot_node)/param/mux.yaml"/>
    <remap from="cmd_vel_mux/output" to="/robot_cmd_vel"/>
    <remap from="cmd_vel_mux/input/navi" to="/cmd_vel"/>
    <remap from="cmd_vel_mux/input/teleop" to="$(arg output_cmd_vel_topic)" />
  </node>

  <!-- create transform for laser (should be moved to the URDF) -->
  <node name="laser_to_base" pkg="tf2_ros" type="static_transform_publisher"  args="-0.090 0.0 0.037 0 0 0 1 base_link base_laser_link" />

  <!-- launch the main base driver node -->
  <node name="neato" pkg="neato_node" type="neato.py" output="screen">
    <param name="port" value="/dev/ttyACM0" />
    <remap from="cmd_vel" to="robot_cmd_vel" />
    <remap from="/base_scan" to="/scan" />
  </node>

  <!-- publish the URDF -->
  <param name="robot_description" command="$(find xacro)/xacro.py $(find neato_node)/urdf/neato.urdf.xacro" />

  <!-- publish the robot state transforms -->
  <node name="robot_state_publisher" pkg="robot_state_publisher" type="robot_state_publisher" output="screen" >
      <param name="use_gui" value="False"/>
  </node>

</launch>


LISTING 2  Terminal output from launching startup nodes:

roslaunch bv80bot_node bv80bot_njoy.launch &

rostopic list



https://drive.google.com/open?id=0B2Cigsr5YM6ldXZjNTAxeGdYWjg   Fig 1  XV-12 dustbin removed

https://drive.google.com/open?id=0B2Cigsr5YM6lcHh3di00dTFYNTQ  Fig 2  XV-12 Name Plate

https://drive.google.com/open?id=0B2Cigsr5YM6lSnAxT3JIc210U2c    Fig 3  XV-12  Brush Removed

https://drive.google.com/open?id=0B2Cigsr5YM6ldFhYZ3JEcXdUUnM   Fig 4 RPi 3 mounted

YouTube videos of Turtlebot I, built from a Create base, with voice control (~2013-2014?):




Other YouTube videos of interest:

https://www.youtube.com/watch?v=Tzk9020D5x4   Botvac 2015

https://www.youtube.com/watch?v=F-hNlJfmdPI&t=100s  Magni 2016

https://www.youtube.com/watch?v=w-Vk1Lcp6_0  Loki 2015

[Visit Topic](https://discourse.ros.org/t/a-do-it-yourself-turtlebot/3978/13) or reply to this email to respond.

More information about the ros-users mailing list