As long as you have a source of odometry, the navigation stack should work
fine with the URG-04. One thing to watch out for is that in large open
spaces, amcl will lose confidence because the shorter-range laser won't
provide much data for matching against the map.

brian.

On Oct 5, 2010 11:33 PM, "Prasad Dixit" wrote:
>
> Hello experts,
> This question is an extension of a thread I raised some time back.
>
> 1. Please let me know what your experience says.
> I am working on an indoor robot application in which I am using only one
> Hokuyo URG-04LX-UG01 sensor. Is it sufficient to use only this one sensor?
> Can I achieve the full localization, planning, and navigation cycle with a
> single sensor, and how effective are the ROS algorithms in that case? Do I
> need any additional supporting sensors such as an IMU, external
> transceivers, or vision sensors?
>
> I do have sonar sensors, but they are not involved anywhere in the
> navigation process; they are kept only as a precaution in case an obstacle
> sits above the plane of the laser scans.
>
> 2. Is it possible to change the range of the laser scanner dynamically? Is
> there a ROS parameter to set that?
> For example, suppose my robot is running in a museum with a touch screen
> mounted on top (like the Texai monitor). A viewer can use the screen to get
> information. The laser would have its usual range for normal navigation,
> but when a user steps in front of it, the range should be reduced so the
> user is not treated as an obstacle.
> Or I could use a camera to detect the user's face and direct the laser to
> reduce its range.
>
> - Prasad
>
> --
> View this message in context: http://ros-users.122217.n3.nabble.com/Single-Laser-dependency-tp1640828p1640828.html
> Sent from the ROS-Users mailing list archive at Nabble.com.
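
Regarding question 2: as far as I know the hokuyo_node driver does not expose
a parameter for shrinking the sensor's range on the fly, but the same effect
can be had in software by post-processing the scan before the navigation stack
sees it. Below is a minimal sketch (not from the thread) of a rospy node that
republishes the scan with a capped maximum range; the node, topic, and
parameter names are illustrative assumptions, not an existing ROS node.

```python
#!/usr/bin/env python
# Sketch of a scan "range limiter": republish the Hokuyo scan with a reduced
# maximum range so the navigation stack effectively sees a shorter laser.
# Topic and parameter names are made up for illustration.
import copy

import rospy
from sensor_msgs.msg import LaserScan


class ScanRangeLimiter(object):
    def __init__(self):
        # Cap in meters; a face detector (or any other trigger) could update
        # this value at runtime instead of reading it once at startup.
        self.max_range = rospy.get_param("~max_range", 4.0)
        self.pub = rospy.Publisher("scan_limited", LaserScan)
        rospy.Subscriber("scan", LaserScan, self.callback)

    def callback(self, scan):
        limited = copy.deepcopy(scan)
        limited.range_max = min(scan.range_max, self.max_range)
        # Readings beyond the cap are pushed past range_max, which marks them
        # invalid under sensor_msgs/LaserScan conventions, so downstream
        # consumers (amcl, the costmaps) will discard them.
        limited.ranges = [r if r <= self.max_range
                          else limited.range_max + 1.0
                          for r in scan.ranges]
        self.pub.publish(limited)


if __name__ == "__main__":
    rospy.init_node("scan_range_limiter")
    ScanRangeLimiter()
    rospy.spin()
```

Note that this only filters the data; the sensor's physical range is
unchanged. I believe the laser_filters package offers a similar range filter
if you would rather not write your own node, and amcl's laser_max_range
parameter can be set to match whatever cap you choose; amcl and the costmaps
would then be pointed at the filtered topic instead of the raw scan.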