[ros-users] [Discourse.ros.org] [ROS Projects] IMcoders: easy-to-install (and cheap) odometry sensors

solosito ros.discourse at gmail.com
Mon Jul 30 10:12:17 UTC 2018





Hey there,



We would like to introduce you to our project: **IMcoders**.



A few months ago we started developing, in our spare time, sensors that provide odometry data for wheeled robots. One of the main objectives of the project is to develop sensors that are extremely easy to integrate into existing devices. The odometry data they provide can be fused with the output of other sensors to navigate autonomously.



We entered this project in the [Hackaday Prize](https://hackaday.io/prize) under the [Robotics Module Challenge](https://hackaday.com/2018/04/23/robotics-module-challenge-build-robot-win-prizes/) category, so if you want information about the development process, just take a look here: https://hackaday.io/project/158496-imcoders/





Here is a short introduction before we get to the problem:





If you want to prototype autonomous navigation on real vehicles, the options are quite limited.



Imagine you want to make a forklift in a warehouse, a tractor in a field, or a wheelchair at a school navigate autonomously. To prototype and get some first-hand input data, you could easily integrate cameras and compute visual odometry, or attach a GPS device for outdoor use cases. Perfect, you can navigate using that input, but the system is still not reliable enough: maybe the scene lacks the features needed for reliable visual odometry, or you are inside a building with no GPS signal. Wouldn't it be nice to have some encoder input? Let's add encoders to our vehicle!... hmmm, not so easy, right? If you are good at mechanics you could install encoders on your vehicle, but on off-the-shelf vehicles the hardware modifications needed to add these sensors are not a real option for most people. At the moment there is nothing that is both mechanically and financially affordable for everybody. The IMcoders were born to meet this need.





For that we are using IMUs, but not in a conventional way. The idea is to attach an IMU to each robot wheel and measure its spatial orientation. By tracking the change in the wheel's orientation we can infer how fast the wheel is spinning and, if needed, its direction of rotation. (Yes, as you probably already noticed, the idea is to provide an output very similar to a traditional encoder, just from a different source, hence the name IMcoder = IMU + encoder.)



You might think this approach suffers from a lot of error due to the nature of IMUs (and of course it is not the perfect solution for every use case!), but by adding some constraints based on the location of the IMUs on the robot, most of the error can be mitigated, so the output is stable enough for most use cases.





After some simulations, we developed wireless IMU boards that publish IMU data over the ROS interface:



![sensor1|292x321](upload://eCcmnGHfIBBM599P6AdpnbU4ALn.gif)



This means that by combining several of them and applying some differential drive steering theory, we should be able to compute a reliable odometry. And that is almost what is happening. To focus on the odometry calculations we created a simulation environment in Gazebo and attached one IMU (using the Gazebo IMU plugin) to each wheel of our simulated differential drive robot. It is almost working as expected: the odometry calculated from our sensors is quite similar to the one provided by the diff_drive plugin for Gazebo. We say almost because there is still a mismatch between the odometry output of the diff_drive plugin and ours. We guess there is something we are not considering in our odometry calculations, so our output is not as good as expected (it is our first time working with quaternions).
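If you want to quantify the mismatch numerically, a minimal comparison node could look roughly like the sketch below. The topic names here are placeholders, not the ones from our launch files:

```cpp
// Rough sketch: subscribe to both odometry topics and log the
// position difference between them. Topic names are assumptions.
#include <cmath>
#include <nav_msgs/Odometry.h>
#include <ros/ros.h>

nav_msgs::Odometry last_ref;

void refCallback(const nav_msgs::Odometry& msg) { last_ref = msg; }

void oursCallback(const nav_msgs::Odometry& msg)
{
  const double dx = msg.pose.pose.position.x - last_ref.pose.pose.position.x;
  const double dy = msg.pose.pose.position.y - last_ref.pose.pose.position.y;
  ROS_INFO("position error: %.3f m", std::hypot(dx, dy));
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "odom_compare");
  ros::NodeHandle nh;
  // Remap these to the actual topics published in the simulation.
  ros::Subscriber ref  = nh.subscribe("/diff_drive/odom", 10, refCallback);
  ros::Subscriber ours = nh.subscribe("/imcoders/odom", 10, oursCallback);
  ros::spin();
  return 0;
}
```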



Summarizing what we are doing:



* We get the IMU's absolute orientation as a quaternion at one time instant and at the next one.



* We compute the quaternion that defines the rotation between the first measurement and the second one.



* The rotation of the sensor is translated into a linear velocity (we know the diameter of the wheels); see the sketch right after this list.
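As a rough sketch of those three steps (using tf2 quaternion math; the function and variable names are illustrative, not taken from our code):

```cpp
// Sketch of the per-wheel velocity estimate; names are illustrative.
#include <cmath>
#include <tf2/LinearMath/Quaternion.h>

// axle_axis is the wheel's spin axis expressed in the sensor frame.
double wheelLinearVelocity(const tf2::Quaternion& q_prev,
                           const tf2::Quaternion& q_curr,
                           const tf2::Vector3& axle_axis,
                           double wheel_radius, double dt)
{
  // Rotation taking the previous orientation to the current one.
  tf2::Quaternion q_delta = q_prev.inverse() * q_curr;
  q_delta.normalize();

  // getAngle() returns a value in [0, 2*pi]; wrap to (-pi, pi] so
  // consecutive samples always take the shortest path.
  double angle = q_delta.getAngle();
  if (angle > M_PI) angle -= 2.0 * M_PI;

  // getAngle() alone loses the spin direction; recover the sign by
  // comparing the rotation axis against the axle direction.
  if (q_delta.getAxis().dot(axle_axis) < 0.0) angle = -angle;

  // Wheel angular velocity times radius (= diameter / 2) gives the
  // linear velocity at the contact point.
  return wheel_radius * (angle / dt);
}
```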





With this information (the linear velocity of each wheel) and a little bit of differential drive theory, we are able to compute the new position of the robot.
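By "differential drive theory" we mean the textbook pose update. A minimal sketch, with `wheel_separation` as the distance between the two wheels (again, names are illustrative):

```cpp
// Textbook differential-drive odometry update; names are illustrative.
#include <cmath>

struct Pose2D { double x, y, theta; };

Pose2D integrateDiffDrive(const Pose2D& p, double v_left, double v_right,
                          double wheel_separation, double dt)
{
  const double v = 0.5 * (v_left + v_right);              // forward velocity
  const double w = (v_right - v_left) / wheel_separation; // yaw rate

  Pose2D next;
  // Simple Euler step; integrating along the exact circular arc
  // reduces the error that accumulates during turns.
  next.x = p.x + v * std::cos(p.theta) * dt;
  next.y = p.y + v * std::sin(p.theta) * dt;
  next.theta = p.theta + w * dt;
  return next;
}
```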





![odom|690x389](upload://gSJ8esPbpHhe0ZlOjxDiYTmDafE.jpeg)





In the image there are three arrows: the orange one corresponds to the output of the diff_drive Gazebo plugin, and the red/green ones (they overlap) correspond to our output.



It is easy to observe that after some left/right turns the odometry we are computing accumulates some error.



**We assume it is because of our calculations. Do you have any idea how we can verify the correctness of our implementation?**
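So far the only sanity check we have in mind is a synthetic one along these lines, building on the hypothetical `wheelLinearVelocity` sketch above: rotate a wheel by a known angle and verify the recovered velocity.

```cpp
// Hypothetical sanity check for the sketch above: rotate a synthetic
// wheel by a known angle over dt and verify the recovered velocity.
#include <cassert>
#include <cmath>
#include <tf2/LinearMath/Quaternion.h>

int main()
{
  const tf2::Vector3 axle(0.0, 1.0, 0.0);  // wheel spins about its y axis
  tf2::Quaternion q_prev(axle, 0.0);       // identity orientation
  tf2::Quaternion q_curr(axle, 0.1);       // 0.1 rad turned during dt

  const double wheel_radius = 0.05;        // m
  const double dt = 0.01;                  // s

  const double v = wheelLinearVelocity(q_prev, q_curr, axle,
                                       wheel_radius, dt);
  // Expected: 0.05 m * (0.1 rad / 0.01 s) = 0.5 m/s
  assert(std::fabs(v - 0.5) < 1e-6);
  return 0;
}
```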



[Here you can find the link to our code](https://github.com/solosito/IMcoders/blob/devel/imcoders_diff_odom/src/imcoders_diff_odom/imcoders_diff_odom.cpp#L123). It points specifically to where the odometry is computed.



If you want to reproduce the problem, [just follow the readme](https://github.com/solosito/IMcoders/tree/devel) in the repository (more precisely, the *Differential wheeled robot with IMcoders* section) and you will be able to play around with our simulation environment.



Once the problem is solved, we will continue by integrating the sensors into a commercial RC car (a Parrot Jumping Sumo) and testing them with real data:





![45985ced-0df9-4b38-aef4-9132f438ae87|674x500](upload://pUvHGstYL8645ElXZY8Mecx8RD.jpeg)










