[ros-users] Web introspection tools

Edwards, Shaun M. sedwards at swri.org
Wed Jun 25 22:20:57 UTC 2014


Vincent,

I’m not an expert in web development, but what you are describing is, I assume, a full port requiring a code rewrite in JavaScript. I believe this would be a large effort, especially if all of the rQt widgets were included.

The main benefit that I see is at the application level (i.e. user interfaces). In particular, we receive a lot of critical feedback about ROS being Ubuntu-only. Our “work-around” is to create web-based user interfaces. Once the interface is in a web browser, nobody questions what the server is running.
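As a concrete sketch of that work-around, here is roughly what such a page looks like with the Robot Web Tools stack (rosbridge_server on its default websocket port plus roslibjs); the hostname, topic, and element id are placeholders, not from any particular project:

// Assumes rosbridge_server is running on the robot (default port 9090)
// and that roslib.min.js has already been loaded in the page.
var ros = new ROSLIB.Ros({ url: 'ws://my-robot:9090' });  // placeholder hostname

ros.on('connection', function() {
  console.log('Connected to rosbridge');
});

// Subscribe to a topic and update the page; any OS with a browser works as the client.
var jointStates = new ROSLIB.Topic({
  ros: ros,
  name: '/joint_states',
  messageType: 'sensor_msgs/JointState'
});

jointStates.subscribe(function(message) {
  document.getElementById('status').textContent =
      'positions: ' + message.position.join(', ');
});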

I would propose that we focus on making application-level tools web-based. To a certain extent, the Robot Web Tools group has already done this. I do not see great value in moving introspection tools to a web-based environment. Introspection is largely a developer task, so why can’t developers use native applications? If the rQt tools hadn’t already been written, you might be able to make an argument for web-based introspection, but they are written, and there seems to be little value in porting them.

All that being said, I am interested to hear whether there are any application-level tools that are missing from Robot Web Tools.

Shaun Edwards
Senior Research Engineer
Manufacturing System Department


http://robotics.swri.org
http://rosindustrial.swri.org/
http://ros.swri.org
Join the ROS-Industrial Developers List: https://groups.google.com/group/swri-ros-pkg-dev/boxsubscribe
Southwest Research Institute
210-522-3277

From: ros-users-bounces at lists.ros.org [mailto:ros-users-bounces at lists.ros.org] On Behalf Of Vincent Rabaud
Sent: Wednesday, June 25, 2014 3:05 PM
To: User discussions
Subject: [ros-users] Web introspection tools

At ROS Kong, some rQt on the web was demoed, as well as a Gazebo web visualizer.

I am no expert in web programming, but what prevents us from moving rQt and RViz to the web/JavaScript? The obvious advantages would be cross-platform support, optimizations independent of the underlying libraries, easier code to debug, and no compilation. Plus, there are already great tools like Robot Web Tools and rosbridge.
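For reference, a minimal sketch of what rosbridge actually puts on the wire (assuming a rosbridge_server websocket on the default port 9090; the hostname is a placeholder). The protocol is plain JSON, which is part of why a browser client needs no compilation step and is easy to debug:

// Raw rosbridge v2 protocol over a plain WebSocket, no roslibjs required.
var ws = new WebSocket('ws://my-robot:9090');

ws.onopen = function() {
  // Subscribe to a topic by sending a JSON 'subscribe' operation.
  ws.send(JSON.stringify({
    op: 'subscribe',
    topic: '/rosout',
    type: 'rosgraph_msgs/Log'
  }));
};

ws.onmessage = function(event) {
  // Incoming messages look like {op: 'publish', topic: '/rosout', msg: {...}}.
  var msg = JSON.parse(event.data);
  console.log(msg.msg.name + ': ' + msg.msg.msg);
};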

Here are a few reasons I was told could prevent that move; are those really valid? Are there better reasons?
- hard to access local files (HTML is not aware of local files)
- no customizable shaders in WebGL (is that the case? do we need those?)
- slow JSON parsing libraries (aren't those optimized in recent browsers?)
- heavy bandwidth usage (how about binary JSON?)
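On the last two points, a hedged sketch: rosbridge subscriptions already support per-topic throttling and optional PNG compression through roslibjs, which is one way to reduce bandwidth without changing the JSON protocol itself. The endpoint, topic, and numbers below are illustrative only:

// Self-contained example; all values are placeholders.
var ros = new ROSLIB.Ros({ url: 'ws://my-robot:9090' });

var image = new ROSLIB.Topic({
  ros: ros,
  name: '/camera/image_raw',
  messageType: 'sensor_msgs/Image',
  throttle_rate: 200,   // ask the server for at most one message every 200 ms
  queue_length: 1,      // let the server drop stale frames instead of queueing them
  compression: 'png'    // PNG-compress the payload instead of sending raw JSON
});

image.subscribe(function(message) {
  // rosbridge encodes the uint8[] data field as a base64 string in JSON.
  console.log('got frame', message.width, 'x', message.height);
});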

