Hi Daniel,

On Mon, Nov 15, 2010 at 1:50 PM, Daniel Radev wrote:
> After examining the code, I believe there is an image_transport package (not a
> surprise) to publish camera data (either raw or compressed images, or Theora
> video).
>
> However, I need video streaming, i.e. Theora streamed over the network so
> that the device can show it. Currently it looks like this is not possible, or I
> can't find how to do it.
> Do you have any further info or ideas on how to achieve this?

I'm not even sure decoding Theora video (or anything other than H.264) on the iPhone is a solved problem. Are you able to run some version of libtheora on the phone and get good performance?

image_transport provides plugins for various compression schemes, but these (at least the existing ones) are limited to the transport protocols currently supported by ROS, namely TCP and UDP. For streaming video, something like RTP (Real-time Transport Protocol) is more suitable. Maybe you can extract something useful from theora_image_transport, or at least use it to create Theora packets on the robot side.

I'm an expert on neither streaming video nor iPhone development, so I'm afraid I can't be of much more help. Some folks at SIUE have worked on a ROS interface for the Parrot AR.Drone quadrotor, which uses H.263 for the video feed (ML archive). Maybe they have more experience interfacing ROS with streaming video.

> Also, is there a way to simulate a camera (even several images), just so I can
> continue with my work and tests, without actual hardware?

If you want images from a true simulated environment, you should investigate the ROS wrappers for the Gazebo simulator, http://www.ros.org/wiki/simulator_gazebo, in particular the GazeboRosCamera plugin in gazebo_plugins.

If just publishing some images from disk is sufficient, mocking up a ROS node that acts like a camera driver is pretty easy. There's some code here (for continually publishing a single image) that should get you started.
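For the "fake camera driver" idea, a node along these lines might work — this is just a sketch, assuming a ROS install with rospy, cv_bridge, and OpenCV available and a roscore running; the topic name and image path are placeholders you'd swap for your own:

```python
#!/usr/bin/env python
# Hypothetical minimal "fake camera" node: repeatedly publishes one image
# loaded from disk, so downstream nodes see a camera-like topic.
import rospy
import cv2  # assumes a cv_bridge version with the cv2-based API
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

def main():
    rospy.init_node('fake_camera')
    pub = rospy.Publisher('camera/image_raw', Image)
    bridge = CvBridge()
    # Placeholder path -- point this at any test image on disk.
    frame = cv2.imread('/path/to/test_image.png')
    msg = bridge.cv2_to_imgmsg(frame, encoding='bgr8')
    rate = rospy.Rate(10)  # pretend to be a 10 Hz camera
    while not rospy.is_shutdown():
        msg.header.stamp = rospy.Time.now()  # fresh timestamp per frame
        pub.publish(msg)
        rate.sleep()

if __name__ == '__main__':
    main()
```

Subscribers (image_view, your own processing nodes, etc.) shouldn't care that the frames all come from the same file.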
A final, and maybe the best, option is to record a bag file of actual data from your robot and use that while testing.

Cheers,
Patrick