BTW, if anyone has an SR3K device and can test this, I would be
interested to hear the results. I have tested this on an SR4K on
several Ubuntu platforms, but I don't have an SR3K. Anyone interested
can get the camera_drivers_experimental stack, run 'rosdep install
sr4k', change a compile flag in sr4k.h for SR3K use (this will be
changed to a configuration option in the future), and you should be
able to view the output in rviz using the node name as the frame id.
A rough command sketch is at the bottom of this message.

Radu Bogdan Rusu wrote:
> Great work Patrick! Thanks.
>
> Btw, the swissranger in the TUM repository _was_ already handling both
> SR3k and SR4k, as we had both sensors there. If it didn't work
> properly, someone must have introduced a bug somewhere. I remember
> Morgan sent me some patches, but due to me moving overseas, I never
> had time to check them in.
>
> Cheers,
> Radu.
>
> Patrick Beeson wrote:
>> I have added a new sr4k package to the camera_drivers_experimental
>> stack. This extends the swissranger package in the TUM repository to
>> handle the newer Mesa Imaging libraries, and it supports both SR3000
>> and SR4000 devices.
>>
>> I do not have an SR3K device, so I could only test with an SR4K, but
>> the API is the same for both (an early call tells the Mesa libraries
>> which device you want to talk to, and differences are handled at the
>> Mesa library level).
>>
>> I have not yet included a standalone viewer, but users can view the
>> output in rviz. The current frame id is the node name (defaults to
>> /sr4k). The user can view Image and PointCloud types. There are three
>> 2D images: distance, intensity, and confidence.
>>
>> Patrick Beeson
>>
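
For anyone who wants to try this, here is roughly the sequence I have
in mind. It is only a sketch: the repository URL, the executable name,
and the exact SR3K flag in sr4k.h are not spelled out here, so check
your checkout and adjust as needed.

    # check out the stack (substitute the actual repository URL)
    svn co <camera_drivers_experimental URL> camera_drivers_experimental

    # pull in system dependencies (the Mesa Imaging library) and build
    rosdep install sr4k
    rosmake sr4k

    # for an SR3K, first edit sr4k.h and flip the SR3K compile flag
    # (flag name omitted here; see the header)

    # run the driver node; executable name assumed to match the package
    rosrun sr4k sr4k

    # view the output; add Image and PointCloud displays and use the
    # node name (/sr4k by default) as the frame id
    rosrun rviz rviz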