Ugo,

The only explanation I can think of is that whatever module is doing the place
is not collision aware - the objects are clearly being represented in the
environment_server, and any collision-aware module would prevent this kind of
collision. If you are using non-collision-aware kinematics, that might be the
culprit - arm_kinematics doesn't check for collisions as far as I know, though
collision checking is definitely something we're interested in contributing to
that package eventually. I would try to figure out exactly what your code is
using for the final segment of placing objects, and then we can try to
determine the best way to make it collision aware. (For reference, minimal
sketches of attaching an object to the collision model and of querying the
environment server's state-validity service are appended at the end of this
thread.)

-Gil

On Fri, Feb 18, 2011 at 6:56 AM, Ugo Cupcic wrote:

> Ok, so I'm still stuck with my problem of undetected collisions between the
> attached object and the environment.
>
> I have one question: is this collision computed by the ik_constrained_planner?
> I had to set "self.perception_running = 0" in ik_utilities.py, as I don't
> have an ik_constrained_planner: I'm using the generic arm_kinematics module
> with my robot. Would this explain the behavior I'm seeing?
>
> Cheers,
>
> Ugo
>
> On Wed, Feb 16, 2011 at 11:47 AM, Ugo Cupcic wrote:
>
>> Thanks a lot!! This was the node I was still missing! I now have a first
>> draft of the manipulation stack for our hand / arm (it still needs lots of
>> improvement, but I'm already able to "theoretically" pick up and place a
>> coke can).
>>
>> Thank you all for your answers.
>>
>> Cheers,
>>
>> Ugo
>>
>> On Tue, Feb 15, 2011 at 7:51 PM, Gil Jones wrote:
>>
>>> Hi Ugo,
>>>
>>> To see the current state of the environment server you have to run another
>>> node from the planning_environment package. Here are the contents of the
>>> launch file in
>>> planning_environment/launch/display_planner_collision_model.launch:
>>>
>>> <launch>
>>>   <node pkg="planning_environment"
>>>         name="display_planner_collision_model_environment_server"
>>>         type="display_planner_collision_model"
>>>         respawn="false" output="screen" />
>>> </launch>
>>>
>>> This node calls services in the environment_server and broadcasts markers
>>> for the objects. After you launch it you should see a marker topic, and a
>>> collision map topic showing the state of the collision map the environment
>>> server uses.
>>>
>>> Are you sure you are processing the kinect data through the appropriate
>>> set of self filters? You need to filter the robot and the attached objects
>>> out of the point cloud for the grasping pipeline to function. I'm attaching
>>> a couple of very experimental launch files that should process the kinect
>>> data through the required pipeline to produce robot-free collision maps.
>>>
>>> Let me know how it goes.
>>>
>>> -Gil
>>>
>>> --
>>> E. Gil Jones (gjones@willowgarage.com)
>>> Research Engineer
>>> Willow Garage, Inc.
>>> 68 Willow Road
>>> Menlo Park, CA 94025
>>> 650.475.9772
>>>
>>> On Mon, Feb 14, 2011 at 9:34 AM, Ugo Cupcic wrote:
>>>
>>>> Hi,
>>>>
>>>> Thanks for your answer. I think that I still haven't figured out how to
>>>> use the environment server properly. I still can't view the attached
>>>> object.
>>>>
>>>> I pasted the rosnode info for my environment server at the end [1]. If
>>>> you have any hints as to what the problem may be, they're welcome!
>>>>
>>>> Cheers,
>>>>
>>>> Ugo
>>>>
>>>> [1]
>>>> rosnode info /environment_server
>>>> --------------------------------------------------------------------------------
>>>> Node [/environment_server]
>>>>
>>>> Publications:
>>>>  * /environment_server/state_validity [motion_planning_msgs/DisplayTrajectory]
>>>>  * /environment_server/allowed_contact_regions_array [visualization_msgs/MarkerArray]
>>>>  * /rosout [rosgraph_msgs/Log]
>>>>  * /environment_server/collision_pose [motion_planning_msgs/DisplayTrajectory]
>>>>  * /environment_server_contact_markers [visualization_msgs/Marker]
>>>>
>>>> Subscriptions:
>>>>  * /collision_object [mapping_msgs/CollisionObject]
>>>>  * /tf [tf/tfMessage]
>>>>  * /collision_map_occ [mapping_msgs/CollisionMap]
>>>>  * /attached_collision_object [mapping_msgs/AttachedCollisionObject]
>>>>  * /collision_map_occ_update [mapping_msgs/CollisionMap]
>>>>  * /joint_states [sensor_msgs/JointState]
>>>>
>>>> Services:
>>>>  * /environment_server/tf_frames
>>>>  * /environment_server/get_state_validity
>>>>  * /environment_server/get_current_allowed_collision_matrix
>>>>  * /environment_server/set_logger_level
>>>>  * /environment_server/get_environment_safety
>>>>  * /environment_server/revert_allowed_collisions
>>>>  * /environment_server/get_joints_in_group
>>>>  * /environment_server/get_loggers
>>>>  * /environment_server/get_group_info
>>>>  * /environment_server/get_execution_safety
>>>>  * /environment_server/get_trajectory_validity
>>>>  * /environment_server/get_collision_objects
>>>>  * /environment_server/get_robot_state
>>>>  * /environment_server/set_allowed_collisions
>>>>
>>>> On Fri, Feb 11, 2011 at 9:32 PM, Kaijen Hsiao wrote:
>>>>
>>>>> Hi Ugo,
>>>>>
>>>>> On Mon, Feb 7, 2011 at 10:20 AM, Ugo Cupcic wrote:
>>>>> > I'm not sure which topic I should subscribe to in order to visualize
>>>>> > the attached_object in rviz as well. (I looked at the tf, but couldn't
>>>>> > see any tf moving with the model once the hand has grasped the can.)
>>>>>
>>>>> Just noticed this last bit in your last email. Hopefully you've figured
>>>>> it out already, but in case you haven't: you visualize attached objects
>>>>> the same way you visualize all the collision models that you've added to
>>>>> the collision environment. In our setup the topic is
>>>>> /collision_model_markers/environment_server, but yours may vary if
>>>>> you've changed your setup significantly. There's no separate tf frame
>>>>> associated with the object being broadcast.
>>>>>
>>>>> -Kaijen
>>>>
>>>> --
>>>> Ugo Cupcic | Shadow Robot Company | ugo@shadowrobot.com
>>>> Software Engineer | 251 Liverpool Road |
>>>> need a Hand? | London N1 1LX | +44 20 7700 2487
>>>> http://www.shadowrobot.com/hand/ @shadowrobot
>>
>> --
>> Ugo Cupcic | Shadow Robot Company | ugo@shadowrobot.com
>> Software Engineer | 251 Liverpool Road |
>> need a Hand? | London N1 1LX | +44 20 7700 2487
>> http://www.shadowrobot.com/hand/ @shadowrobot
>
> --
> Ugo Cupcic | Shadow Robot Company | ugo@shadowrobot.com
> Software Engineer | 251 Liverpool Road |
> need a Hand? | London N1 1LX | +44 20 7700 2487
> http://www.shadowrobot.com/hand/ @shadowrobot
>
> _______________________________________________
> ros-users mailing list
> ros-users@code.ros.org
> https://code.ros.org/mailman/listinfo/ros-users
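For reference, here is a minimal sketch of the first step discussed above:
telling the environment server that a grasped object is attached to the hand,
so it is carried along with the robot and included in collision checks. It
publishes a mapping_msgs/AttachedCollisionObject on the
/attached_collision_object topic that appears in the rosnode output above.
The link names, object id, and cylinder dimensions are placeholders, and the
package/message names are the diamondback-era ones used elsewhere in this
thread.

#!/usr/bin/env python
# Sketch: attach a cylinder ("coke can") to the hand's palm link so the
# collision environment treats it as part of the robot.  Link names, object
# id, and dimensions are placeholders.
import roslib
roslib.load_manifest('mapping_msgs')
roslib.load_manifest('geometric_shapes_msgs')
import rospy
from mapping_msgs.msg import AttachedCollisionObject, CollisionObjectOperation
from geometric_shapes_msgs.msg import Shape
from geometry_msgs.msg import Pose

rospy.init_node('attach_object_sketch')
pub = rospy.Publisher('attached_collision_object', AttachedCollisionObject)

att = AttachedCollisionObject()
att.link_name = 'palm'                      # link the object is attached to
att.touch_links = ['ffdistal', 'thdistal']  # links allowed to touch the object

att.object.id = 'coke_can'
att.object.operation.operation = CollisionObjectOperation.ADD
att.object.header.frame_id = 'palm'
att.object.header.stamp = rospy.Time.now()

can = Shape()
can.type = Shape.CYLINDER
can.dimensions = [0.035, 0.12]              # [radius, length] in meters

pose = Pose()
pose.position.x = 0.10                      # rough offset in front of the palm
pose.orientation.w = 1.0

att.object.shapes = [can]
att.object.poses = [pose]

rospy.sleep(1.0)                            # give the publisher time to connect
pub.publish(att)

The grasping pipeline normally publishes this message itself when a grasp
succeeds; the sketch only shows what the environment server expects to see on
that topic.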
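And a minimal sketch of making the last segment of the place collision aware:
before executing the final motion, ask the environment server whether the
candidate arm configuration is in collision, via the
/environment_server/get_state_validity service listed in the rosnode output
above. The joint names and positions are placeholders for whatever IK solution
your place code produces, and the request/response fields are taken from the
diamondback-era planning_environment_msgs/GetStateValidity definition - check
the .srv file in your installation if yours differs.

#!/usr/bin/env python
# Sketch: collision-check a candidate arm configuration against the
# environment server before executing the final place motion.  Joint names
# and values are placeholders.
import roslib
roslib.load_manifest('planning_environment_msgs')
import rospy
from planning_environment_msgs.srv import GetStateValidity, GetStateValidityRequest

rospy.init_node('check_place_state_sketch')

srv_name = 'environment_server/get_state_validity'
rospy.wait_for_service(srv_name)
check_state = rospy.ServiceProxy(srv_name, GetStateValidity)

req = GetStateValidityRequest()
req.robot_state.joint_state.header.stamp = rospy.Time.now()
req.robot_state.joint_state.name = ['shoulder_pan', 'shoulder_lift', 'elbow',
                                    'wrist_1', 'wrist_2', 'wrist_3']
req.robot_state.joint_state.position = [0.0, -0.4, 1.2, 0.0, 0.6, 0.0]
req.check_collisions = True

res = check_state(req)
if res.error_code.val == res.error_code.SUCCESS:
    rospy.loginfo('place configuration is collision free')
else:
    rospy.logwarn('place configuration is in collision (error code %d)',
                  res.error_code.val)

If the state comes back in collision, the place can be rejected or re-planned
before the arm ever moves; the contacts field of the response (if your version
has it) tells you which links and objects are involved.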