[ros-users] Visual-Inertial SLAM Sensor
Janosch Nikolic
janosch.nikolic at mavt.ethz.ch
Fri Jan 13 13:02:10 UTC 2012
Dear ROS users and roboticists,
We (Swiss Federal Institute of Technology, ETH) are about to develop an
open, low-cost visual-inertial camera system for robotics. The goal is a
standard piece of hardware that combines a monocular/stereo/multi-camera
setup with inertial sensors (an IMU) and provides synchronized, calibrated
data in ROS.
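To make the intended ROS interface more concrete, below is a minimal sketch
of how a user might consume such hardware-synchronized camera and IMU data.
The topic names (/cam0/image_raw, /imu0) and the use of message_filters are
assumptions for illustration only; defining the actual driver interface is
exactly the kind of question we would like your feedback on.

#!/usr/bin/env python
# Minimal sketch (topic names are placeholders, not the final interface):
# pairing hardware-synchronized camera frames and IMU readings in ROS.
import rospy
import message_filters
from sensor_msgs.msg import Image, Imu

def callback(image_msg, imu_msg):
    # Both headers carry the same hardware timestamp, so a VSLAM front-end
    # can associate each frame with its inertial measurements directly.
    rospy.loginfo("frame/imu stamp: %s", image_msg.header.stamp)

if __name__ == "__main__":
    rospy.init_node("vi_sensor_listener")
    image_sub = message_filters.Subscriber("/cam0/image_raw", Image)
    imu_sub = message_filters.Subscriber("/imu0", Imu)
    # Exact-time pairing; relies on identical stamps from hardware sync.
    sync = message_filters.TimeSynchronizer([image_sub, imu_sub], queue_size=10)
    sync.registerCallback(callback)
    rospy.spin()

An approximate-time policy could be used instead if camera and IMU stamps
were not generated from the same hardware clock.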
Such a piece of hardware could allow the community to focus more on the
development of novel algorithms and less on hardware and integration
issues. Our hope is that this will motivate young PhD students and
researchers around the world to make their VSLAM frameworks
ROS-compatible, ultimately leading to 'plug-and-play VSLAM for everyone'.
The system will be based on the latest generation of XILINX FPGAs
(Artix/Kintex-7 / Zynq, that's fixed) as well as a user-programmable CPU,
and will provide at least a GigE port. We target real-time, low-power
processing, and the user should have access to raw data as well as
pre-processed information such as features, optical flow fields, etc.
We would like to invite you to give us feedback and thereby influence the
design of our system. We would be extremely happy about any feedback you
can provide; general comments are as welcome as technical details such as:
- Cameras (number, configuration, resolution, HDR, global shutter, lens
type, etc.)
- General specs such as weight, size, power consumption, etc.
- Inertial sensor grade
- Interface to the host, API, etc.
- On-camera features (feature detection, dense optical flow, dense
stereo, etc.)
Many thanks and best regards,
ETH - CVG
ETH - ASL