Hello everyone! I have already asked a question about tf and synchronization, which was successfully answered, so I will briefly repeat what I am doing: I am currently building a 3D map of a robot's environment. I am using a SwissRanger SR4000 camera and an Xsens MTi for the rotation of the camera. The camera is mounted on the robot, displaced from its center: more precisely, 0.8 m back and 1.4 m above, so when the robot turns, the camera describes a circle around the robot's center. For that, I have made this transformation:

    swissranger.header.frame_id = "base_link";
    swissranger.child_frame_id = "swissranger";
    swissranger.transform.translation.x = x_scan_offset_ - 0.8; // offset in x for the camera
    swissranger.transform.translation.y = y_scan_offset_;
    swissranger.transform.translation.z = 1.4;                  // height at which the camera is mounted

I have a 2D map of the same environment, which I use for localization with amcl. When a new message from amcl arrives, I take a point cloud from the SwissRanger. I also use the tf transform between /map and /swissranger to get the actual x-y-z position of the SwissRanger in the map:

    tlistener.lookupTransform("/map", pc_current.header.frame_id,
                              pc_current.header.stamp, bswisstransform);

My first question is: does the following expression represent the actual position of the SwissRanger in the map?

    bswisstransform.getOrigin();

I also have the gyroscope values in a transform called stransform. I publish it as a transform between "/map" and a new frame which I named "/rot_swissranger", because I only need the values. Is it OK to do it this way?
Finally, I want to combine the position of the SwissRanger with the gyroscope values from the Xsens to build a transform in this way:

    transform = tf::Transform(
        tf::Quaternion(stransform.getRotation()[0],
                       stransform.getRotation()[1],
                       stransform.getRotation()[2] - 0.41,
                       stransform.getRotation()[3]),
        tf::Vector3(bswisstransform.getOrigin()));

The "0.41" value is there because the gyroscope is mounted at a different orientation from the direction the robot moves in. The problem is that this works well (or at least not too badly) when the robot moves horizontally or vertically, but when it turns, the point clouds are not transformed properly: the rotation seems greater than desired. When the robot reaches a horizontal or vertical orientation again, the clouds appear in a more correct position. I read in the Xsens manual that the returned quaternions are unit quaternions, so I tried to normalize the resulting quaternion like this:

    double a, b, c, d;
    a = stransform.getRotation()[0];
    b = stransform.getRotation()[1];
    c = stransform.getRotation()[2] - 0.41;
    d = sqrt(1 - a*a - b*b - c*c);
    transform_rotation2 = tf::Transform(tf::Quaternion(a, b, c, d),
                                        tf::Vector3(bswisstransform.getOrigin()));

But this makes it worse, so the d value seems to be of great importance. What is my mistake in the way I am using the gyroscope values? Thanks in advance.