[ros-users] Simulating RGB-D sensors in Gazebo
abm760 at gmail.com
Thu Feb 17 14:29:20 UTC 2011
This is an issue that we are addressing right now as well. In our
simulation we have a block laser working to simulate the Swiss Ranger 4000,
similar to the Kinect depth sensor but with lower resolution. It causes a
serious performance hit in Gazebo, so we are looking at getting the
stereo camera Gazebo plugin working again. We've found various issues in
getting it to run without crashing, but still have not managed to get the
valid data out of the depth buffer in OGRE. John Hsu has opened a ticket
with the code we've been working on at
https://code.ros.org/trac/ros-pkg/ticket/4795 Please feel free to take a
look, and hopefully someone will be able to make some headway in moving this
along. You can contact me if you have any questions or suggestions. We
will be picking this up again in a month or so, but for now it's been put on
hold.
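For what it's worth, getting metric depth out of a Z-buffer generally means linearizing the normalized buffer sample using the camera's near/far clip planes. A minimal sketch of that step (this is not the plugin's actual code; the clip-plane values in the docstring are just examples, and the formula assumes the usual perspective-projection depth mapping):

```python
def linearize_depth(z_ndc, near, far):
    """Convert a normalized depth-buffer sample in [0, 1] to metric depth.

    Assumes the buffer stores d = far * (z - near) / (z * (far - near)),
    the standard perspective mapping; solving for z gives the expression
    below. d = 0 maps to the near plane, d = 1 to the far plane.
    """
    return (near * far) / (far - z_ndc * (far - near))
```

Note the strong nonlinearity: with near = 0.1 and far = 100, half the buffer's precision is spent on the first ~0.2 m of range, which is one reason raw Z-buffer reads need this correction before they are useful as depth data.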
On Thu, Feb 17, 2011 at 7:00 AM, Stefan Kohlbrecher <
stefan.kohlbrecher at googlemail.com> wrote:
> Hi everyone,
> I recently started looking into possibilities to simulate RGB-D
> sensors in Gazebo. At least for the moment, I only need non-colored
> point cloud data, so using a GazeboRosBlockLaser is sufficient for now.
> I'm still wondering if that's currently the only possibility and what
> other people are using.
> John talked about the StereoCamera plugin available in Gazebo that
> reads out the Z-Buffer here:
> Is there a ROS plugin in the works for this?
> Regarding GazeboRosBlockLaser, to make it work in CTurtle, I had to
> modify the example given here:
> Among other changes, I had to add <resRange> or Gazebo would crash, as well
> as remove the <model:physical> and <body:empty> tags. Should I open a
> ticket regarding this?
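For anyone hitting the same crash, a hypothetical sketch of what the fixed sensor entry might look like in the old (CTurtle-era) Gazebo XML format. All element names, angles, and resolutions here are illustrative placeholders taken from the fixes described above, not a verified working config:

```xml
<!-- Sketch only: resRange added, model:physical / body:empty wrappers removed. -->
<sensor:ray name="depth_block_laser">
  <rayCount>640</rayCount>
  <rangeCount>640</rangeCount>
  <minAngle>-30</minAngle>
  <maxAngle>30</maxAngle>
  <verticalRayCount>480</verticalRayCount>
  <verticalRangeCount>480</verticalRangeCount>
  <verticalMinAngle>-25</verticalMinAngle>
  <verticalMaxAngle>25</verticalMaxAngle>
  <minRange>0.4</minRange>
  <maxRange>5.0</maxRange>
  <resRange>0.01</resRange>  <!-- omitting this element crashed Gazebo -->
  <controller:gazebo_ros_block_laser name="block_laser_controller">
    <topicName>/depth/points</topicName>
    <frameName>depth_frame</frameName>
    <updateRate>10</updateRate>
  </controller:gazebo_ros_block_laser>
</sensor:ray>
```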