[ros-users] Single Laser dependency

Eric Perko wisesage5001 at gmail.com
Wed Oct 6 07:15:54 UTC 2010


Prasad,

On Wed, Oct 6, 2010 at 2:33 AM, Prasad Dixit <abhimohpra at gmail.com> wrote:

>
> Hello experts,
>  This question is an extension of a thread I raised some time back.
> 1.
> Please let me know what your experience says...
> I am working on an indoor robot application in which I am using only one
> Hokuyo URG-04LX-UG01 sensor. I would like to know: is it efficient to use
> only one sensor? Can I achieve the Localization, Planning and Navigation
> cycle using only a single sensor? How effective would the ROS algorithms be
> in such a case? Do I need any additional supporting sensors such as an IMU,
> external transceivers or vision sensors?
>
>
All you should need to do navigation is the single laser scanner and a good
source of odometry information to describe how the robot base is moving.
I've had good luck with just a gyro and wheel encoders on a differential
drive robot, fusing them with an Extended Kalman Filter and publishing the
result as an Odometry message. As long as your odometric reference frame does not
drift very quickly, navigation should be possible. By pairing those sensors
with a Sick LIDAR, I'm able to get accurate localization (1-2 cm error would
be my guess; I've never measured it directly, only seen algorithms that
depend on precise localization behave as though the error is in that range)
in a map of my lab generated with gmapping. One thing that may affect localization
using your Hokuyo (depending on environment) is the short range of the
URG-04LX series; they are good out to 4 meters or so if I recall correctly,
much lower than the max range of the Sicks I use (80 meters). I haven't done
much testing with such a short-range LIDAR, so I
don't know how it will affect AMCL's performance. As far as planning and
path following, you may want to refer to
http://ros-users.122217.n3.nabble.com/Nav-Stack-with-nonholonomic-robots-td1404191.html#none
which
describes my experience with base_local_planner and base_global_planner on
my differential drive robot; the navigation stack documentation will also be
useful for evaluating whether the algorithms used are optimal for your
situation and robot.
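To make the fusion idea concrete, here's a toy version of the kind of filter I mean. This is not the actual filter I run (mine carries the full x/y/theta state, and these noise values are made up for illustration), just a scalar sketch of fusing an encoder-predicted heading with a gyro heading measurement:

```python
# Toy scalar EKF sketch: predict heading from differential-drive wheel
# encoders, correct it with a gyro heading measurement. The noise values
# q and r are hypothetical; tune them from your sensors' datasheets/logs.

class HeadingEKF:
    def __init__(self, q=0.01, r=0.001):
        self.theta = 0.0   # fused heading estimate (rad)
        self.p = 1.0       # estimate variance
        self.q = q         # process noise added per encoder prediction
        self.r = r         # gyro measurement noise

    def predict(self, v_left, v_right, track, dt):
        # Differential-drive kinematics: heading rate from wheel speeds.
        omega = (v_right - v_left) / track
        self.theta += omega * dt
        self.p += self.q

    def update(self, gyro_heading):
        # Standard scalar Kalman correction toward the gyro reading.
        k = self.p / (self.p + self.r)
        self.theta += k * (gyro_heading - self.theta)
        self.p *= (1.0 - k)
        return self.theta
```

With r much smaller than p, the correction pulls the estimate most of the way toward the gyro, which matches the intuition that a decent gyro drifts far more slowly than wheel odometry slips.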

If you only want to use the laser, there are some packages (stack page:
http://www.ros.org/wiki/scan_tools) for doing odometry estimation from the
laser directly, but I cannot comment on their effectiveness as I've never
used them.


> Though I am using sonar sensors, they are not involved anywhere in the
> navigation process. They are just kept as a precautionary measure in case
> an obstacle is at some height above the laser scans.
>

> 2. Is it possible to change the range of the laser scanner dynamically? Is
> there a parameter in ROS to set that?
> For example: suppose my robot is running in a museum. The robot has a touch
> screen mounted on top (like the Texai monitor). A viewer can access the
> screen to get information. The laser will have a specific range for usual
> navigation, but if a user comes in front of it, the laser should reduce its
> range so as not to treat the user as an obstacle.
> Or I can have a camera capture the face and direct the laser to reduce its
> range.
>

I believe you _can_ change the laser scanner's parameters dynamically using
dynamic_reconfigure. I'm not familiar with it any further than using the
reconfigure_gui provided in the dynamic_reconfigure package, but that GUI
does not require you to manually restart the Hokuyo driver when you change
parameters. I suggest you examine the dynamic_reconfigure docs to see how to
change the parameters programmatically from another ROS node.
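For what it's worth, a sketch of doing that programmatically (untested; 'min_ang'/'max_ang' are the hokuyo_node parameter names as I recall them, so verify them against what reconfigure_gui shows for your driver):

```python
# Untested sketch: narrow the Hokuyo's scan window at runtime using the
# dynamic_reconfigure Python client. Assumes the driver node is named
# 'hokuyo_node' and exposes 'min_ang'/'max_ang' -- confirm both with
# reconfigure_gui before relying on this.
import rospy
import dynamic_reconfigure.client

rospy.init_node('laser_limiter')
client = dynamic_reconfigure.client.Client('hokuyo_node', timeout=10)
# Shrink the field of view while a user is standing in front of the robot.
client.update_configuration({'min_ang': -1.0, 'max_ang': 1.0})
```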

That said, in this situation, I doubt you want to reduce the laser's range
so much as filter out any pings that you can say for certain correspond to
the user and not the environment; I believe there is a node somewhere that
does this for the PR2's arms in the tilting laser scans, but I can't recall
its name.
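The core of such a filter is simple. Here's a hypothetical version, written as a plain function so it's easy to test; in ROS you would apply it to the ranges array of a sensor_msgs/LaserScan before republishing:

```python
# Hypothetical scan filter: blank out returns whose bearing falls inside a
# known angular sector (say, where the user is standing). NaN is commonly
# treated by laser consumers as "no return". This is an illustration of the
# idea, not an existing ROS node.

def mask_sector(ranges, angle_min, angle_increment, sector_start, sector_end):
    """Return a copy of `ranges` with returns inside
    [sector_start, sector_end] (radians) replaced by NaN."""
    out = list(ranges)
    for i in range(len(out)):
        angle = angle_min + i * angle_increment
        if sector_start <= angle <= sector_end:
            out[i] = float('nan')
    return out
```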

Also, filtering them out entirely would be a poor idea in my mind, as the
user _is_ still an obstacle that you don't want to drive through (at least
I'm assuming you don't want to drive through them :) ). Make sure you keep
in mind that the robot can only avoid things it has knowledge of, either
known a priori or gained through sensors while navigating.

Hope that helps.

- Eric


>
> - Prasad
>
> --
> View this message in context:
> http://ros-users.122217.n3.nabble.com/Single-Laser-dependency-tp1640828p1640828.html
> Sent from the ROS-Users mailing list archive at Nabble.com.
> _______________________________________________
> ros-users mailing list
> ros-users at code.ros.org
> https://code.ros.org/mailman/listinfo/ros-users
>