Author: tyagikunal via ros-users
Date:  
To: ros-users
CC: tyagikunal
Subject: [ros-users] [Discourse.ros.org] [Quality Assurance] What quality metrics do we need to make packages quality visible?


As discussed previously, the ROS package pages already list several of the following data points. It'd be great to bring them together under a single topic rather than scatter them everywhere. I've made a list of items that could be used to define the metrics; suggestions are welcome. The list will be dynamic: I'll edit it (and repost if required) based on future discussions. I've also created some groupings. Corrections welcome.

CI
* Build [Pass/Fail] ---> Basic data from the CI tool (see the aggregation sketch after this list)
* Unit Tests [Pass/Fail] --> This might need more granularity, since some tests matter more than others; that level of detail may only be feasible for some core packages
* Unit Test Crashes --> Apparently CI tools can already detect and report these; we just need to surface them
* Unit Test Coverage [%] --> A diagram showing code test coverage with pass/fail spots, like a heatmap (was it codecov that provided this?)
* Static Analysis
    * Code Quality (https://wiki.ros.org/code_quality)
    * Number of Coding Standard errors
* Dynamic Analysis
    * Clang AddressSanitizer and LeakSanitizer ([reference](https://discourse.ros.org/t/input-validation-as-a-metric-for-quality/3732/11?u=tyagikunal))
    * Fuzz testing by a "chaos node" ---> Being discussed along with contracts in ROS; maybe use pyros-dev or similar tools?
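
To make the CI items above concrete, here is a minimal sketch (Python, purely illustrative) of how the raw CI outputs might be collapsed into a single per-package record that a quality page could render. The `CiSummary` fields, the 50% coverage threshold, and the badge labels are my assumptions, not an existing buildfarm schema.

```python
# Purely illustrative: collapse raw CI outputs into one per-package record
# that a quality page could render. Field names, thresholds, and badge
# labels are assumptions, not an existing ROS buildfarm schema.
from dataclasses import dataclass
from typing import Optional


@dataclass
class CiSummary:
    package: str
    build_passed: bool              # basic pass/fail from the CI tool
    tests_passed: int
    tests_failed: int
    tests_crashed: int              # crashes reported separately from plain failures
    line_coverage: Optional[float]  # 0-100, or None if coverage wasn't collected

    @property
    def badge(self) -> str:
        """Reduce the record to a coarse label for display."""
        if not self.build_passed:
            return "build failing"
        if self.tests_crashed or self.tests_failed:
            return "tests failing"
        if self.line_coverage is not None and self.line_coverage < 50.0:
            return "passing, low coverage"
        return "passing"


if __name__ == "__main__":
    print(CiSummary("example_pkg", True, 120, 2, 0, 73.4).badge)  # "tests failing"
```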


Documentation
* Status (Maintained, Orphaned, etc.)
* README (not all packages have one; repository != package; see the presence-check sketch after this list)
* Wiki (GitHub/GitLab, etc., if the content isn't on the ROS wiki)
* Getting Started (Tutorials & Debugging Common Bugs)
* Sphinx/Doxygen links
* Link to tagged questions on answers.ros.org (as an FAQ)
* Other resources such as photos, hand-drawn sketches, or generated UML diagrams (e.g. https://www.planttext.com/), etc.
* User ratings/reviews (maybe for tutorials too, e.g. "How helpful was this?")
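
As a rough illustration of the README point above, a presence check over a package's source tree could already produce a simple documentation checklist. The file and directory names below (README variants, CHANGELOG.rst, Doxyfile, doc/, tutorials/) are assumptions about common layouts, not a fixed ROS convention.

```python
# Hypothetical sketch: check a package source tree for a few of the
# documentation artifacts listed above and report which are present.
import os

DOC_CHECKS = {
    "readme": ("README.md", "README.rst", "README.txt", "README"),
    "changelog": ("CHANGELOG.rst",),
    "doxygen_or_sphinx": ("Doxyfile", "doc/conf.py", "docs/conf.py"),
    "tutorials": ("doc/tutorials", "tutorials"),
}


def documentation_report(package_dir: str) -> dict:
    """Return {check_name: True/False} for one package directory."""
    return {
        name: any(os.path.exists(os.path.join(package_dir, c)) for c in candidates)
        for name, candidates in DOC_CHECKS.items()
    }


if __name__ == "__main__":
    for check, present in sorted(documentation_report(".").items()):
        print(("OK      " if present else "MISSING ") + check)
```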

For issues, we'll need to query the hosting platform's API (GitHub, Bugzilla, etc.) for the following (see the sketch after this list):
* Number of open issues
* Time to close issue
* Activity on issues
* Other statuses (e.g. won't-fix)
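
For GitHub-hosted packages, a sketch along these lines could compute the first two items via the public REST issues endpoint (GET /repos/{owner}/{repo}/issues). The aggregation choices (most recent 100 closed issues, median time-to-close) are my assumptions; a real collector would also need authentication, pagination, and rate-limit handling.

```python
# Hypothetical sketch: count open issues and estimate median time-to-close
# for a GitHub-hosted repository, using only the standard issues endpoint.
from datetime import datetime
from statistics import median

import requests

API = "https://api.github.com/repos/{owner}/{repo}/issues"


def issue_metrics(owner: str, repo: str) -> dict:
    def fetch(state):
        resp = requests.get(
            API.format(owner=owner, repo=repo),
            params={"state": state, "per_page": 100},
        )
        resp.raise_for_status()
        # The issues endpoint also returns pull requests; drop them.
        return [i for i in resp.json() if "pull_request" not in i]

    def days_to_close(issue):
        created = datetime.strptime(issue["created_at"], "%Y-%m-%dT%H:%M:%SZ")
        closed = datetime.strptime(issue["closed_at"], "%Y-%m-%dT%H:%M:%SZ")
        return (closed - created).total_seconds() / 86400.0

    open_issues = fetch("open")
    closed_issues = fetch("closed")
    return {
        "open_issue_count": len(open_issues),
        "median_days_to_close": (
            median(days_to_close(i) for i in closed_issues) if closed_issues else None
        ),
    }


if __name__ == "__main__":
    print(issue_metrics("ros", "ros_comm"))
```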

NB: This isn't an exhaustive list or even a final list. I've compiled it based on past discussions.

Current questions:
* Integration tests
* HAROS: Which data should be shown? The dependency graph, a package's dependencies, the packages that depend on this package, etc. There is a lot of data.
* Coverage for documentation?
* Low Priority: Model-in-loop or hardware-in-loop tests





---
[Visit Topic](https://discourse.ros.org/t/what-quality-metrics-do-we-need-to-make-packages-quality-visible/3985/2) or reply to this email to respond.

