Pick-and-Place Workflow in Gazebo Using Point-Cloud Processing and RRT Path Planning
Set up an end-to-end pick-and-place workflow for a robotic manipulator like the KINOVA® Gen3.
The pick-and-place workflow implemented in this example can be adapted to different scenarios, planners, simulation platforms, and object detection options. The example shown here uses RRT for planning and simulates the robot in Gazebo using the Robot Operating System (ROS). For other pick-and-place workflows, see the related examples.
Overview
This example identifies and recycles objects into two bins using a KINOVA Gen3 manipulator. The example uses tools from four toolboxes:
Robotics System Toolbox™ is used to model and simulate the manipulator.
ROS Toolbox™ is used for connecting MATLAB to Gazebo.
Image Processing Toolbox™ and Computer Vision Toolbox™ are used for object detection using point-cloud processing and a simulated depth camera in Gazebo.
This example builds on key concepts from the following related examples:
3-D Point Cloud Registration and Stitching (Computer Vision Toolbox)
Robot Simulation and Control in Gazebo
Start a ROS-based simulator for a KINOVA Gen3 robot and configure the MATLAB® connection with the robot simulator.
This example uses a virtual machine (VM) containing ROS Melodic, which is available for download.
Start the Ubuntu® virtual machine desktop.
In the Ubuntu desktop, click the Gazebo Recycling World - Depth Sensing icon to start the Gazebo world built for this example.
Specify the IP address and port number of the ROS master in Gazebo so that MATLAB® can communicate with the robot simulator. For this example, the ROS master in Gazebo uses the IP address 172.21.72.160 displayed on the desktop. Adjust the rosIP variable based on your VM. Start the ROS 1 network using rosinit.
rosIP = '172.16.34.129';   % IP address of ROS-enabled machine
rosshutdown;
rosinit(rosIP,11311); % Initialize ROS connection
Initializing global node /matlab_global_node_63627 with NodeURI http://172.16.34.1:35153/ and MasterURI http://172.16.34.129:11311.
After initializing the Gazebo world by clicking the icon, the VM loads a KINOVA Gen3 robot arm on a table with one recycling bin on each side. To simulate and control the robot arm in Gazebo, the VM contains the ROS packages provided by KINOVA. The packages use joint-trajectory controllers to drive the joints to the desired joint positions. For additional details on using the VM, refer to the ROS Toolbox documentation on the Gazebo virtual machine.
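As a quick sanity check (not part of the original example code), you can list the ROS interfaces that MATLAB sees once the connection is up; the /my_gen3 namespace is the one used by the gripper and controller actions later in this example.
% List the topics and action servers advertised by the Gazebo simulation.
topics = rostopic("list");
actions = rosaction("list");
% Show only the entries in the robot namespace (assumed to be /my_gen3).
disp(topics(contains(topics,"my_gen3")))
disp(actions(contains(actions,"my_gen3")))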
Pick-and-Place Tasks
The pick-and-place workflow is implemented in MATLAB and consists of basic initialization steps, followed by two main sections:
Identify parts and determine where to place them
Execute the pick-and-place workflow
For an implementation that uses Stateflow to schedule the tasks, see Pick-and-Place Workflow Using Stateflow for MATLAB.
Scanning the Environment to Build the Planning Scene for the RRT Path Planner
Before starting the pick-and-place job, the robot goes through a set of tasks to identify the planning scene in the exampleCommandBuildWorld function and detects the objects to pick using the exampleCommandDetectParts function.
First, the robot moves to predefined scanning poses one by one and captures a set of point clouds of the scene using an onboard depth sensor. At each of the scanning poses, the current camera pose is retrieved by reading the corresponding ROS transformation with the rostf and getTransform functions (ROS Toolbox). The scanning poses are visualized below:
Once the robot has visited all the scanning poses, the captured point clouds are transformed from the camera frame to the world frame using pctransform (Computer Vision Toolbox) and merged into a single point cloud using pcmerge (Computer Vision Toolbox). The final point cloud is segmented based on Euclidean distance using pcsegdist (Computer Vision Toolbox). The resulting point cloud segments are then encoded as collision meshes (see collisionMesh) so that they can be identified as obstacles during RRT path planning. The process from point cloud to collision meshes is shown one mesh at a time below.
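The following sketch outlines this pipeline under stated assumptions: ptClouds is a cell array of pointCloud objects captured at the scanning poses, camPoses is a cell array of corresponding 4-by-4 camera-to-world transforms, and the gridStep and minDistance values are illustrative rather than taken from the example helpers.
gridStep = 0.001;                                        % merging resolution in meters (assumed)
for k = 1:numel(ptClouds)
    % Convert the camera pose to the post-multiply convention expected by pctransform.
    tform = rigid3d(camPoses{k}(1:3,1:3)', camPoses{k}(1:3,4)');
    worldCloud = pctransform(ptClouds{k}, tform);        % camera frame -> world frame
    if k == 1
        mergedCloud = worldCloud;
    else
        mergedCloud = pcmerge(mergedCloud, worldCloud, gridStep);
    end
end

% Segment the merged cloud by Euclidean distance; each cluster is a candidate obstacle.
minDistance = 0.01;                                      % cluster separation in meters (assumed)
[labels, numClusters] = pcsegdist(mergedCloud, minDistance);

% Encode one cluster as a convex collision mesh for the RRT planning scene.
clusterPts = select(mergedCloud, find(labels == 1));
obstacle = collisionMesh(double(clusterPts.Location));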
Opening and Closing the Gripper
The command for activating the gripper, exampleCommandActivateGripper, sends an action request to open and close the gripper implemented in Gazebo. For example, to send a request to open the gripper, the following code is used.
[gripAct,gripGoal] = rosactionclient('/my_gen3/custom_gripper_controller/gripper_cmd');
gripperCommand = rosmessage('control_msgs/GripperCommand');
gripperCommand.Position = 0.0;
gripGoal.Command = gripperCommand;
sendGoal(gripAct,gripGoal);
Moving the Manipulator to a Specified Pose
Most of the task execution consists of instructing the robot to move between different specified poses. The exampleHelperMoveToTaskConfig function defines an RRT planner using the manipulatorRRT object, which plans paths from an initial to a desired joint configuration while avoiding collisions with the specified collision objects in the scene. The resulting path is first shortened and then interpolated at a desired validation distance. To generate a trajectory, the trapveltraj function is used to assign time steps to each of the interpolated waypoints following a trapezoidal velocity profile. Finally, the waypoints with their associated times are interpolated to a desired sample rate (every 0.1 seconds). The generated trajectories ensure that the robot moves slowly at the start and end of the motion, when it is approaching or placing an object.
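A minimal sketch of this sequence is shown below, assuming robot is the loaded rigidBodyTree, collisionObjects is the cell array of collision meshes built during scanning, and startConfig and goalConfig are joint-configuration row vectors; the planner settings and variable names are illustrative, not copied from exampleHelperMoveToTaskConfig.
planner = manipulatorRRT(robot, collisionObjects);
planner.ValidationDistance = 0.05;               % interpolation resolution (assumed value)

path = plan(planner, startConfig, goalConfig);   % collision-free joint-space path
path = shorten(planner, path, 20);               % prune redundant intermediate states
interpPath = interpolate(planner, path);         % densify at the validation distance

% Assign times to the waypoints with a trapezoidal velocity profile, then resample
% at a fixed 0.1 s period so the trajectory can be streamed to the controller.
numWaypoints = size(interpPath, 1);
[q,~,~,tWaypoints] = trapveltraj(interpPath', numWaypoints);
tSamples = 0:0.1:tWaypoints(end);
qSamples = interp1(tWaypoints, q', tSamples);    % one row of joint positions per 0.1 s sample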
The planned paths are visualized in MATLAB along with the planning scene.
This workflow is examined in detail in the related RRT pick-and-place example for manipulators. For more information about the RRT planner, see the manipulatorRRT documentation. For simpler trajectories where the paths are known to be obstacle-free, trajectories can be executed using trajectory generation tools and simulated using the manipulator motion models.
Joint Trajectory Controller in ROS
After generating a joint trajectory for the robot to follow, the exampleCommandMoveToTaskConfig function samples the trajectory at the desired sample rate, packages it into joint-trajectory ROS messages, and sends an action request to the joint-trajectory controller implemented in the KINOVA ROS package.
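The sketch below illustrates the general pattern for such a request; the action name and joint names are assumptions that must match the controllers actually running in the VM (check rosaction("list")). qSamples and tSamples stand for the resampled joint positions and times from the previous section.
% Build a FollowJointTrajectory goal from the sampled trajectory and send it.
[trajAct,trajGoal] = rosactionclient('/my_gen3/gen3_joint_trajectory_controller/follow_joint_trajectory');
waitForServer(trajAct);
trajGoal.Trajectory.JointNames = {'joint_1','joint_2','joint_3','joint_4','joint_5','joint_6','joint_7'};
trajPts = [];
for k = 1:numel(tSamples)
    pt = rosmessage('trajectory_msgs/JointTrajectoryPoint');
    pt.Positions = qSamples(k,:);                % joint positions at this sample
    pt.TimeFromStart = rosduration(tSamples(k)); % time along the trajectory
    trajPts = [trajPts, pt];                     % accumulate trajectory points
end
trajGoal.Trajectory.Points = trajPts;
sendGoal(trajAct,trajGoal);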
Detecting and Classifying Objects in the Scene
The exampleCommandDetectParts and exampleCommandClassifyParts functions use the simulated end-effector depth camera feed from the robot to detect the recyclable parts. Since a complete point cloud of the scene is available from the build-environment step, the iterative closest point (ICP) registration algorithm implemented in pcregistericp (Computer Vision Toolbox) identifies which of the segmented point clouds match the geometries of the objects that should be picked.
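As an illustration of that matching step, with assumed variable names: suppose clusterCloud is one segment from the scanning step and partModel is a reference pointCloud of a recyclable part. pcregistericp returns the registration transform and a root-mean-square error that can be thresholded to decide whether the segment is that part.
% Register the segmented cluster against the reference part model using ICP.
[tform, movingReg, rmse] = pcregistericp(clusterCloud, partModel);
if rmse < 0.01                   % acceptance threshold in meters (assumed)
    disp("Cluster matches the part model; tform gives the estimated part pose.")
end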
Start the Pick-and-Place Workflow
This simulation uses a KINOVA Gen3 manipulator with a gripper attached.
load('exampleHelperKINOVAGen3GripperGazeboRRTScene.mat');
rng(0)
Initialize the Pick-and-Place Application
Set the initial robot configuration and the name of the end-effector body.
initialRobotJConfig = [3.5797 -0.6562 -1.2507 -0.7008 0.7303 -2.0500 -1.9053];
endEffectorFrame = "gripper";
Initialize the coordinator by giving the robot model, initial configuration, and end-effector name.
coordinator = exampleHelperCoordinatorPickPlaceROSGazeboScene(robot,initialRobotJConfig, endEffectorFrame);
Specify pick-and-place coordinator properties.
coordinator.HomeRobotTaskConfig = getTransform(robot, initialRobotJConfig, endEffectorFrame);
coordinator.PlacingPose{1} = trvec2tform([0.2 0.55 0.26])*axang2tform([0 0 1 pi/2])*axang2tform([0 1 0 pi]);
coordinator.PlacingPose{2} = trvec2tform([0.2 -0.55 0.26])*axang2tform([0 0 1 pi/2])*axang2tform([0 1 0 pi]);
Run the Pick-and-Place Application Step by Step
% Task 1: Build world
exampleCommandBuildWorldROSGazeboScene(coordinator);
Moving to scanning pose 1
Searching for other config...
Now planning...
Waiting until robot reaches the desired configuration
Capturing point cloud 1
Getting camera pose 1
Moving to scanning pose 2
Now planning...
Waiting until robot reaches the desired configuration
Capturing point cloud 2
Getting camera pose 2
Moving to scanning pose 3
Searching for other config...
Now planning...
Waiting until robot reaches the desired configuration
Capturing point cloud 3
Getting camera pose 3
Moving to scanning pose 4
Now planning...
Waiting until robot reaches the desired configuration
Capturing point cloud 4
Getting camera pose 4
Moving to scanning pose 5
Now planning...
Waiting until robot reaches the desired configuration
Capturing point cloud 5
Getting camera pose 5
% Task 2: Move to home position
exampleCommandMoveToTaskConfigROSGazeboScene(coordinator,coordinator.HomeRobotTaskConfig);
Now planning...
Waiting until robot reaches the desired configuration
% Task 3: Detect objects in the scene to pick
exampleCommandDetectPartsROSGazeboScene(coordinator);
Bottle detected...
Can detected...
% Task 4: Select next part to pick
remainingParts = exampleCommandPickingLogicROSGazeboScene(coordinator);
1
while remainingParts==true
    % Task 5: [PICKING] Compute grasp pose
    exampleCommandComputeGraspPoseROSGazeboScene(coordinator);

    % Task 6: [PICKING] Move to picking pose
    exampleCommandMoveToTaskConfigROSGazeboScene(coordinator, coordinator.GraspPose);

    % Task 7: [PICKING] Activate gripper
    exampleCommandActivateGripperROSGazeboScene(coordinator,'on');
    % Part has been picked

    % Task 8: [PLACING] Move to placing pose
    exampleCommandMoveToTaskConfigROSGazeboScene(coordinator, ...
        coordinator.PlacingPose{coordinator.DetectedParts{coordinator.NextPart}.placingBelt});

    % Task 9: [PLACING] Deactivate gripper
    exampleCommandActivateGripperROSGazeboScene(coordinator,'off');
    % Part has been placed

    % Select next part to pick
    remainingParts = exampleCommandPickingLogicROSGazeboScene(coordinator);

    % Move to home position
    exampleCommandMoveToTaskConfigROSGazeboScene(coordinator,coordinator.HomeRobotTaskConfig);
end
Now planning...
Waiting until robot reaches the desired configuration
Gripper closed...
Now planning...
Waiting until robot reaches the desired configuration
Gripper open...
2
Now planning...
Waiting until robot reaches the desired configuration
Now planning...
Waiting until robot reaches the desired configuration
Gripper closed...
Now planning...
Waiting until robot reaches the desired configuration
Gripper open...
Now planning...
Waiting until robot reaches the desired configuration
% Shut down ROS when the pick-and-place application is done
rosshutdown;
Shutting down global node /matlab_global_node_63627 with NodeURI http://172.16.34.1:35153/ and MasterURI http://172.16.34.129:11311.
Visualize the Pick-and-Place Action in Gazebo
The Gazebo world shows the robot in the working area as it moves parts to the recycling bins. The robot continues working until all parts have been placed.
Copyright 2021 The MathWorks, Inc.