[ros-users] [Discourse.ros.org] [ROS Projects] Tractobots, my attempts at field robots

Max Schwarz ros.discourse at gmail.com
Thu Mar 30 15:51:06 UTC 2017

Very nice project! One remark on the kill switch: I hope it is implemented in a fail-safe way (no wireless connection -> stop) ;-)

I guess you are not using any localization filter at the moment. In that case, you don't need odometry/pose messages, just direct tf transforms.

Do you get attitude measurements (e.g. compass bearing) from the GPS? I assume so in the following.

The remaining question is how you fuse the two GPS measurements. Naively, I would simply average them in the UTM frame and then pretend there is a single GPS antenna in the center. You could then even keep using your old code, as you mentioned.
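A minimal sketch of that averaging idea, in plain Python (no ROS). The function name and all coordinates are made up for illustration; as a bonus, the baseline between the two antennas also gives you a heading, which ties into the attitude question above:

```python
import math

def fuse_antennas(east_a, north_a, east_b, north_b):
    """Midpoint of two antenna positions in the UTM frame (the
    'virtual' single antenna), plus the heading of the baseline
    from antenna A to antenna B (radians, 0 = east)."""
    east = (east_a + east_b) / 2.0
    north = (north_a + north_b) / 2.0
    heading = math.atan2(north_b - north_a, east_b - east_a)
    return east, north, heading

# Two antennas 2 m apart along the east axis:
print(fuse_antennas(0.0, 0.0, 2.0, 0.0))  # -> (1.0, 0.0, 0.0)
```

Whether the midpoint is actually the point you want as base_link depends on where the antennas are mounted, of course.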

Otherwise, you have already identified the steps you need to transform the data properly in ROS. Here is how I would fill in those steps in more detail:

* Model your robot using URDF, including coordinate frames for base_link (somewhere in the middle of your tractor), gps_link (the location of your GPS antenna), and the rear hitch. For starters, you can use fixed joints for everything.
* Start a robot_state_publisher, which takes care of publishing these transforms.
* Check that the transforms look okay using rviz (fixed frame: base_link, add a TF display).
* Write a node which converts GPS lat/lon fixes into some Cartesian system (UTM?) and publishes a transform world -> base_link. I'm not aware of a ready-to-use implementation for your use case, but it should not be hard to implement.
The tricky part is correcting for the mounting offset: ask tf for the transform gps_link -> base_link and multiply it from the right onto the world -> gps_link transform (in UTM) to obtain world -> base_link.
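The transform multiplication in that last step can be sketched with plain 2-D homogeneous matrices in numpy (in a real node you would get gps_T_base from tf and publish the result instead of printing it; all numbers here are illustrative):

```python
import numpy as np

def transform(x, y, yaw):
    """2-D homogeneous transform: rotation by yaw, then translation."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0.0, 0.0, 1.0]])

# world -> gps_link, e.g. from the UTM-converted GPS fix,
# with the heading pointing north (yaw = pi/2 from east):
world_T_gps = transform(500000.0, 4000000.0, np.pi / 2)

# gps_link -> base_link: the inverse of the URDF mounting offset.
# Here the antenna is assumed to sit 1 m in front of base_link,
# so base_link is 1 m behind the antenna in the gps_link frame.
gps_T_base = transform(-1.0, 0.0, 0.0)

# Multiply from the right, as described above:
world_T_base = world_T_gps @ gps_T_base
# Translation ends up 1 m south of the antenna, as expected
# for a north-facing tractor.
```

The same composition rule carries over unchanged to the 3-D transforms tf gives you.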

[Visit Topic](https://discourse.ros.org/t/tractobots-my-attempts-at-field-robots/1486/11) or reply to this email to respond.
