I recently implemented a couple of deep learning algorithms that provide fast inference on embedded devices. One of them is called 'LCNN'; you can find the original paper here: https://arxiv.org/abs/1611.06473, and my TensorFlow implementation here: https://github.com/ildoonet/tf-lcnn

This code compresses AlexNet, which takes roughly 150ms or more per inference on a single CPU core, into a sparse convolutional network that takes 10~50ms in the same environment.

![AlexNet inference timeline](https://github.com/ildoonet/tf-lcnn/raw/master/images/timeline_alexnet.png)

Based on these techniques, I am looking for ideas to try, for example OpenPose on a robot: https://discourse.ros.org/t/human-pose-estimation-deep-learning-model-openpose-ros-package/2407

Any thoughts?
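For context, here is a minimal sketch of the lookup-based convolution idea from the LCNN paper, written with dense TensorFlow ops for readability. This is not the tf-lcnn code itself; the function name, parameter names, and shapes are illustrative assumptions, and a real implementation would use sparse kernels so the cost scales with the number of non-zero weights rather than the full kernel size.

```python
import tensorflow as tf

def lookup_conv2d(x, dictionary, sparse_combination):
    """Conceptual LCNN-style convolution (illustrative sketch only).

    x:                  input feature map, shape [N, H, W, C_in]
    dictionary:         [1, 1, C_in, S] -- a small shared dictionary of 1x1 filters
    sparse_combination: [k, k, S, C_out] -- mostly-zero weights combining the
                        dictionary responses into the output channels
    """
    # Step 1: project the input onto the small shared dictionary once.
    # This replaces the expensive dense C_in -> C_out convolution.
    lookup = tf.nn.conv2d(x, dictionary, strides=[1, 1, 1, 1], padding='SAME')

    # Step 2: combine the dictionary responses with a k x k convolution whose
    # kernel is mostly zeros. It is written as a dense conv2d here for clarity;
    # LCNN's speedup comes from evaluating only the few non-zero entries.
    return tf.nn.conv2d(lookup, sparse_combination,
                        strides=[1, 1, 1, 1], padding='SAME')
```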