ROS by Example: Head Tracking in 3D (Part 1)

NOTE: This tutorial is several years out of date and the specific commands for installing software and using ROS have changed since then. An up-to-date version of this tutorial can be found in the book ROS By Example Volume 2: Packages and Programs for Advanced Robot Behaviors, available as a downloadable PDF and in paperback on Lulu.com.

In the previous tutorial we learned how to use ROS and OpenCV to do basic head tracking of a visual target as seen in a typical webcam. In this tutorial, we will learn how to perform a similar task using the ROS tf package for transforming target locations from one frame of reference to another. At the same time, we will trade in our webcam for a Microsoft Kinect, which provides an RGB-D point cloud that we can process using the ROS Point Cloud Library (PCL) or OpenNI. Part 1 of the tutorial will cover the use of tf to specify target locations and does not require that you have a Kinect. Part 2 will use the results of Part 1 together with the object recognition abilities of PCL and OpenNI to track 3D visual targets.

Part 1: Head Pointing Using tf

The ROS tf package allows us to specify the target in nearly any frame of reference. For example, we might know the location of an object relative to our robot's base (e.g. "1.2 meters directly ahead and on the floor") and we want the robot to pan and tilt its camera to look straight at that location. Conversely, we can use tf to translate a location relative to the camera frame into coordinates relative to any other frame, such as the robot's base or hand. In this way we can use vision to determine the direction and distance the robot would have to travel to reach a given object, or where to position its hand to grasp it.

When working with reference frames, keep in mind that ROS uses a right-hand convention for orienting the coordinate axes. Similarly, the direction of rotation about an axis is defined by the right-hand rule: if you point your thumb in the positive direction of any axis, your fingers curl in the direction of a positive rotation. For a mobile robot using ROS, the z-axis points upward, the x-axis points forward, and the y-axis points to the left. Under the right-hand rule, a positive rotation of the robot about the z-axis is counterclockwise while a negative rotation is clockwise.
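You can convince yourself of these sign conventions using the transformations module that ships with the tf package. The following sanity check is my own illustration rather than part of the tutorial code; it only assumes tf and numpy are installed:

#!/usr/bin/env python

import numpy as np
from tf.transformations import rotation_matrix

# Rotate the x-axis (forward) by +90 degrees about the z-axis (up).
R = rotation_matrix(np.pi / 2, (0, 0, 1))

# Forward [1, 0, 0] maps to left [0, 1, 0]: a positive rotation about z
# is counterclockwise when viewed from above, as the right-hand rule says.
print np.round(R[:3, :3].dot([1, 0, 0]), 3)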
In the earlier head tracking tutorial, we used two different packages for controlling Pi Robot's Dynamixel AX-12 servos: the ArbotiX package from Vanadium Labs, and the Robotis package from the Healthcare Robotics Lab (HRL) at Georgia Tech. In this tutorial, we will use a third package for managing a Dynamixel bus; namely, the dynamixel_controllers package from the Arizona Robotics Research Group (ARRG). (A special thanks to Anton Rebguns for helping me understand how to set up the controller launch files.) This package is part of the dynamixel_motor stack which in turn is part of the larger ua-ros-pkg repository that includes packages for controlling a multi-jointed arm as well as the use of ROS actions and trajectories. We will be looking at actions, arm trajectories and inverse kinematics for Pi Robot's arms in the near future, so the UA stack will come in quite handy.

As in the previous tutorial, we expect the reader to be familiar with the basics of ROS. You should be able to use either C-Turtle or Diamondback for your main ROS installation, although I recommend making the upgrade to Diamondback. Here again is a quick checklist of prerequisites before you can run the code we will create later on:

+ Install Ubuntu Linux (I am using version 10.04 on a machine that dual boots with Windows). If your robot does not have its own onboard computer, you can still run the code on your main computer and wire the Kinect and servo controller to a pair of USB ports.
+ Install either the Diamondback or Electric release of ROS on your Ubuntu machine.
+ If you are not already familiar with the ROS basics, work through the Beginner Tutorials. It is important to actually try the sample code on your own machine rather than just reading through the text. In fact, I ran through all the tutorials twice since a few concepts were a little shaky after a single pass.
+ In addition to the beginner tutorials, it is essential to also work through the ROS tf Tutorials.

To prepare our robot for the real 3-dimensional world, we will need a few key ingredients, all of which are well-tested parts of the ROS framework. These include:

+ A geometrically accurate model of our robot using the Unified Robot Description Format or URDF.
+ A way to visualize and test the joints of the robot before hooking up a real servo controller. This will be handled by the joint_state_publisher package by David Lu!!
+ A method for combining the geometry of the robot and the current joint angles to yield the 3D positions and orientations of the joints. This is taken care of by the all-powerful robot_state_publisher package which outputs the tf frames attached to each link on our robot and the transformations between them.
+ Drivers for our Kinect RGB-D camera. We will use the openni packages. (Not needed until Part 2.)
+ A controller package for our AX-12 servos and USB2Dynamixel controller. For this, we will use the dynamixel_controllers package as already stated.

The steps we will follow in this tutorial are:

+ Download the tutorial files from the Pi Robot repository.
+ Create a URDF model of our robot and test it in RViz.
+ Install and set up the dynamixel_controllers package to control our servos.
+ Learn how to point the head to an arbitrary location using tf.

Downloading the Tutorial Files

All the files needed for the tutorial can be downloaded via SVN. Move into your personal ROS path (e.g. ~/ros) and execute the commands:

$ svn co http://pi-robot-ros-pkg.googlecode.com/svn/trunk/pi_tutorials/pi_head_tracking_3d_part1

Now take a look at the manifest file:

$ more pi_head_tracking_3d_part1/manifest.xml

<package>
  <description brief="pi_head_tracking_3d_part1">
    Head Tracking in 3D Part 1
  </description>
  <license>BSD</license>
  <review status="unreviewed" notes=""/>
  <depend package="roscpp"/>
  <depend package="rospy"/>
  <depend package="std_msgs"/>
  <depend package="geometry_msgs"/>
  <depend package="tf"/>
  <depend package="rviz"/>
  <depend package="robot_state_publisher"/>
  <depend package="joint_state_publisher"/>
  <depend package="dynamixel_msgs"/>
  <depend package="dynamixel_controllers"/>
</package>

Note the dependency on the third-party packages joint_state_publisher as well as dynamixel_driver, dynamixel_msgs and dynamixel_controllers. Let's download and install these packages now so we can build the main head tracking project.

Installing the joint_state_publisher Package

The joint_state_publisher package was created by David Lu!! and we can install it as part of his urdf_tools stack.
First move into your personal ROS directory (e.g. ~/ros), then issue the commands:

$ svn co https://wu-ros-pkg.svn.sourceforge.net/svnroot/wu-ros-pkg/stacks/urdf_tools/trunk urdf_tools
$ cd urdf_tools
$ rosmake --rosdep-install

Installing the dynamixel_motor Packages

Move into a directory in your personal ROS path (e.g. ~/ros) and get the entire dynamixel_motor stack using one of the following two commands:

For Diamondback:

$ sudo apt-get install ros-diamondback-dynamixel-motor

For Electric:

$ sudo apt-get install ros-electric-dynamixel-motor

The dynamixel_controllers package included in the dynamixel_motor stack works in a manner similar to the controller used on the Willow Garage PR2: first a controller manager node is launched that connects to the Dynamixel bus (in our case a USB2Dynamixel controller). The controller node can then start, stop or restart one or more individual servo controllers. All of this is taken care of by a launch file that we will create later on.

Building the Head Tracking 3D Package

Now that we have our dependencies installed, we can build the main head tracking package using the command:

$ rosmake --rosdep-install pi_head_tracking_3d_part1

Setting Up the URDF/Xacro Robot Model

Before we can use tf to translate between frames of reference, we need a model of how our robot is put together. Even if you simply mount your Kinect onto a pair of pan and tilt servos, tf needs to know the geometry describing how these servos are connected to each other and to the camera, as well as how the whole setup is mounted relative to the robot base or table top.

To describe the configuration of links and joints making up your robot, ROS uses an XML file written in the Unified Robot Description Format (URDF). You can also use some simple macros written in the Xacro macro language for simplifying the URDF file.

Pi Robot's URDF model is fairly complex because of the number of joints (13 altogether) and some funky offsets of the joints due to the way the brackets are mounted. So we will use a simpler robot model that we will call the KinectBot for lack of a better name. You can also use your own robot model if it includes pan and tilt joints for the head. (It is assumed in this tutorial that the pan and tilt links are called head_pan_link and head_tilt_link.) An excellent set of URDF and Xacro tutorials for creating your own URDF model can be found in the URDF tutorials on the ROS wiki.

Otherwise, you can use the KinectBot model found in the tutorial package under the urdf directory. To see what the contents of this file look like, issue the commands:

$ roscd pi_head_tracking_3d_part1/urdf
$ more kinectbot.urdf.xacro

To check the validity of this file, run the following commands:

$ rosrun xacro xacro.py kinectbot.urdf.xacro > tmp.urdf

Then, for C-Turtle or Diamondback:

$ rosrun urdf check_urdf tmp.urdf

Or if you are using Electric:

$ rosrun urdf_parser check_urdf tmp.urdf

You should see an output that looks like this:

robot name is: kinectbot
---------- Successfully Parsed XML ----------
root Link: base_link has 1 child(ren)
    child(1):  torso_link
        child(1):  head_pan_link
            child(1):  pan_tilt_bracket
                child(1):  head_tilt_link
                    child(1):  neck_link
                        child(1):  head_base_link
                            child(1):  head_post_link
                                child(1):  head_link
                                    child(1):  hair_link
                                    child(2):  left_eye_link
                                    child(3):  right_eye_link

Feel free to modify the kinectbot.urdf.xacro file as you like, but be sure to verify any changes using the procedure described above.
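If you are building your own model, the essential ingredients are two revolute joints connecting the torso to the pan link and the pan link to the tilt link. The fragment below is a simplified sketch of what such a pair might look like in Xacro; the link names match the conventions used in this tutorial, but the joint origins and limits are placeholders rather than the actual KinectBot values:

<joint name="head_pan_joint" type="revolute">
  <parent link="torso_link"/>
  <child link="head_pan_link"/>
  <origin xyz="0 0 0.10" rpy="0 0 0"/>
  <axis xyz="0 0 1"/>  <!-- pan rotates about the z (up) axis -->
  <limit lower="-2.61" upper="2.61" effort="1.0" velocity="3.0"/>
</joint>

<joint name="head_tilt_joint" type="revolute">
  <parent link="head_pan_link"/>
  <child link="head_tilt_link"/>
  <origin xyz="0 0 0.05" rpy="0 0 0"/>
  <axis xyz="0 1 0"/>  <!-- tilt rotates about the y (left) axis -->
  <limit lower="-1.57" upper="1.57" effort="1.0" velocity="3.0"/>
</joint>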
(BTW, the left and right eye links do not correspond to the Kinect lenses; they are simply for decoration at this point. We will deal with the true optical frames of the Kinect in Part 2.)

Testing Your Model using joint_state_publisher and robot_state_publisher

To test our URDF model in RViz, we need a way to publish the joint positions even though we are not yet connected to a real robot. This is where David Lu!!'s joint_state_publisher comes in. We'll need a launch file that brings up our URDF model together with the joint_state_publisher node and one other key ROS node called the robot_state_publisher. The robot_state_publisher knows how to take the current joint angles of your robot and turn them into 3D poses of the links by propagating the joint angles through the kinematic tree defined by your URDF file. For example, if your robot's head is connected to the tilt servo joint by a 10cm long bracket, and the tilt joint angle is currently 90 degrees, then the robot_state_publisher can compute that the head is now 10cm forward of the servo joint and parallel to the ground. The end result is that both tf and RViz have access to the current configuration of your robot at any moment in time.

The tutorial package includes the launch file test_urdf.launch that looks like this:

<launch>
  <param name="robot_description" command="$(find xacro)/xacro.py '$(find pi_head_tracking_3d_part1)/urdf/kinectbot.urdf.xacro'" />

  <node name="joint_state_publisher" pkg="joint_state_publisher" type="joint_state_publisher">
    <param name="use_gui" value="True" />
  </node>

  <node name="robot_state_publisher" pkg="robot_state_publisher" type="state_publisher" />

  <node name="world_base_broadcaster" pkg="tf" type="static_transform_publisher"
        args="0 0 0.0325 0 0 0 /world /base_link 100" />
</launch>

The first line in the launch file loads our URDF model of the KinectBot onto the parameter server as the parameter /robot_description. The next block launches the joint_state_publisher node; setting the use_gui parameter to True will bring up a slider control that allows us to set the simulated joint angles manually. Next we launch the robot_state_publisher node, which takes the current joint angles of the robot and maps them into 3D poses of the links as defined by the URDF model. The final line provides a static transform between the robot's /base_link frame and the /world frame. In the case of the KinectBot, the middle of the base is 3.25 cm above the ground plane.
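The static_transform_publisher node takes its arguments in tf's standard order: x y z yaw pitch roll parent_frame child_frame period_in_ms. So the same transform can be published by hand for testing (the 100 ms period shown here is just a typical choice):

$ rosrun tf static_transform_publisher 0 0 0.0325 0 0 0 /world /base_link 100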
Let's fire up this set of parameters, nodes, and transforms by launching the test_urdf.launch file as follows:

$ roslaunch pi_head_tracking_3d_part1 test_urdf.launch

If all goes well, you should see the following output in your terminal window:

SUMMARY

PARAMETERS
 * /rosversion
 * /use_gui
 * /rosdistro
 * /robot_description
 * /robot_state_publisher/publish_frequency

NODES
  /
    joint_state_publisher (joint_state_publisher/joint_state_publisher)
    robot_state_publisher (robot_state_publisher/state_publisher)
    world_base_broadcaster (tf/static_transform_publisher)

ROS_MASTER_URI=http://localhost:11311

core service [/rosout] found
process[joint_state_publisher-1]: started with pid [11951]
process[robot_state_publisher-2]: started with pid [11952]
process[world_base_broadcaster-3]: started with pid [11953]

You should also see a little slider control pop up on your screen.

To visualize the KinectBot and test the joints, bring up RViz using the configuration file included in the tutorial package:

$ roscd pi_head_tracking_3d_part1
$ rosrun rviz rviz -d tutorial.vcg

You should see the RViz window come up with the KinectBot model displayed. Now bring the joint slider control back to the foreground and try out the controls for the pan and tilt servos. You should see the KinectBot's head move in RViz as you change the joint angles. Assuming everything works OK, we are ready to try things out on the real robot.

Servo Control Using the dynamixel_motor Packages

PLEASE NOTE: (Sept 7, 2011) The following section has been changed since the original tutorial was written. The old ax12_controller_core package has been replaced by the dynamixel_motor stack. The updated text below therefore uses the new dynamixel_motor stack.

The dynamixel_controllers package works in a manner similar to the one used on the Willow Garage PR2: first a controller manager node is launched that connects to the Dynamixel bus (in our case a USB2Dynamixel controller). The controller node then launches a number of individual controllers, one for each servo on the bus. Here is the launch file we will use to control our robot's pan and tilt servos:

<launch>
  <arg name="dynamixel_namespace" value="dynamixel_controller" />

  <!-- Load the URDF/Xacro model of our robot -->
  <param name="robot_description" command="$(find xacro)/xacro.py '$(find pi_head_tracking_3d_part1)/urdf/kinectbot.urdf.xacro'" />

  <node name="robot_state_publisher" pkg="robot_state_publisher" type="state_publisher" />

  <!-- Start the Dynamixel controller manager -->
  <node ns="$(arg dynamixel_namespace)" name="dynamixel_manager" pkg="dynamixel_controllers"
        type="controller_manager.py" required="true" output="screen">
    <rosparam>
      namespace: pi_dynamixel_manager
      serial_ports:
        dynamixel_ax12:
          port_name: /dev/ttyUSB0
          baud_rate: 1000000
          min_motor_id: 1
          max_motor_id: 2
          update_rate: 20
    </rosparam>
  </node>

  <!-- Load joint controller configuration from YAML file to parameter server -->
  <rosparam ns="$(arg dynamixel_namespace)"
            file="$(find pi_head_tracking_3d_part1)/params/dynamixel_params.yaml" command="load" />

  <!-- Start the head pan and tilt controllers -->
  <node ns="$(arg dynamixel_namespace)" name="dynamixel_controller_spawner_ax12" pkg="dynamixel_controllers"
        type="controller_spawner.py"
        args="--manager=pi_dynamixel_manager --port=dynamixel_ax12 head_pan_controller head_tilt_controller"
        output="screen" />

  <!-- Publish the joint states -->
  <node ns="$(arg dynamixel_namespace)" name="dynamixel_joint_states_publisher"
        pkg="pi_head_tracking_3d_part1" type="dynamixel_joint_state_publisher.py" output="screen" />

  <!-- Start the servos in a relaxed state -->
  <node name="relax_all_servos" pkg="pi_head_tracking_3d_part1" type="relax_all_servos.py" />

  <node name="world_base_broadcaster" pkg="tf" type="static_transform_publisher"
        args="0 0 0.0325 0 0 0 /world /base_link 100" />
</launch>

Looking at the launch file, we see that the USB2Dynamixel controller is assumed to be on port /dev/ttyUSB0 and the servo IDs are 1 and 2. Change these values if necessary for your setup. The launch file depends on the dynamixel_params.yaml file found in the params subdirectory.
That file looks like this:

dynamixels: ['head_pan', 'head_tilt']

head_pan_controller:
  controller:
    package: dynamixel_controllers
    module: joint_position_controller
    type: JointPositionController
  joint_name: head_pan_joint
  joint_speed: 2.0
  motor:
    id: 1
    init: 512
    min: 0
    max: 1024

head_tilt_controller:
  controller:
    package: dynamixel_controllers
    module: joint_position_controller
    type: JointPositionController
  joint_name: head_tilt_joint
  joint_speed: 2.0
  motor:
    id: 2
    init: 512
    min: 300
    max: 800

First we define a parameter called dynamixels that simply lists the names of our servos. Then we specify the type of controller that will control each servo as well as its hardware ID, initial position value and its min and max position values. The head tilt controller is given less than full range since it cannot go all the way forward or back without hitting the top of the torso. At this level, the init/min/max numbers are given in servo ticks, which vary from 0 to 1023 for the AX-12s. (We also specify limits on the head tilt joint in the robot's URDF file, giving a min/max of ±1.57 radians which is 90 degrees either way.)
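The conversion between ticks and radians is handled internally by the dynamixel_controllers package, but it is worth knowing roughly what it amounts to. The AX-12 spreads its 1024 positions over a 300 degree range, so with the zero position at tick 512 the conversion looks approximately like this (my own sketch, not code from the package):

import math

# AX-12: 1024 ticks spanning 300 degrees, with tick 512 at center.
RAD_PER_TICK = math.radians(300.0) / 1024

def ticks_to_radians(ticks, center=512):
    return (ticks - center) * RAD_PER_TICK

print ticks_to_radians(300)  # head_tilt min -> about -1.08 rad
print ticks_to_radians(800)  # head_tilt max -> about  1.47 rad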
Testing the Servos

To test the pan and tilt servos, first connect your servos and USB2Dynamixel to a power source, then make sure your USB2Dynamixel is connected to a USB port on your computer. Once connected, issue the following command to see what USB ports you have connected:

$ ls /dev/ttyUSB*

Hopefully you will see something like the following output:

/dev/ttyUSB0

If instead you get back the message:

ls: cannot access /dev/ttyUSB*: No such file or directory

then your USB2Dynamixel has not been recognized. Try plugging it in to a different USB port or use a different cable.

If you have no other USB devices attached, your USB2Dynamixel should be on /dev/ttyUSB0 and the default launch file will work without modification. If it is on a different numbered USB port, edit the dynamixels.launch file in the launch directory and change the port accordingly. Similarly, if your servos have IDs other than 1 and 2, edit the dynamixel_params.yaml and dynamixels.launch files as needed.

If you have RViz up during the following tests, you should see the virtual head move in sync with the real head. When this is all done, fire up the dynamixels.launch file:

$ roslaunch pi_head_tracking_3d_part1 dynamixels.launch

You should see a number of startup messages that look something like this:

process[robot_state_publisher-1]: started with pid [19398]
process[dynamixel_controller/dynamixel_manager-2]: started with pid [19395]
process[dynamixel_controller/dynamixel_controller_spawner_ax12-3]: started with pid [19396]
process[dynamixel_controller/dynamixel_joint_states_publisher-4]: started with pid [19397]
process[relax_all_servos-5]: started with pid [19399]
process[world_base_broadcaster-6]: started with pid [19404]
[INFO] [WallTime: 1313701889.865474] Pinging motor IDs 1 through 2...
[INFO] [WallTime: 1313701889.865755] dynamixel_ax12 controller_spawner: waiting for controller_manager pi_dynamixel_manager to startup in /dynamixel_controller/ namespace...
[INFO] [WallTime: 1313701889.869122] Found motors with IDs: [1, 2].
[INFO] [WallTime: 1313701889.872879] Starting Dynamixel Joint State Publisher at 10Hz
[INFO] [WallTime: 1313701889.968588] There are 2 AX-12+ servos connected
[INFO] [WallTime: 1313701889.968957] Dynamixel Manager on port /dev/ttyUSB0 initialized
[INFO] [WallTime: 1313701890.179728] dynamixel_ax12 controller_spawner: All services are up, spawning controllers...
[INFO] [WallTime: 1313701890.267504] Controller head_pan_controller successfully started.
[INFO] [WallTime: 1313701890.373233] Controller head_tilt_controller successfully started.

Once the dynamixel controllers are up and running, bring up a new terminal and send a couple of simple pan and tilt commands. The first command should pan the head to the left through 1 radian or about 57 degrees:

$ rostopic pub -1 /dynamixel_controller/head_pan_controller/command std_msgs/Float64 -- 1.0

Re-center the servo with the command:

$ rostopic pub -1 /dynamixel_controller/head_pan_controller/command std_msgs/Float64 -- 0.0

Now try tilting the head downward half a radian (about 28 degrees):

$ rostopic pub -1 /dynamixel_controller/head_tilt_controller/command std_msgs/Float64 -- 0.5

And bring it back up:

$ rostopic pub -1 /dynamixel_controller/head_tilt_controller/command std_msgs/Float64 -- 0.0

To change the speed of the head pan servo in radians per second, use the set_speed service:

$ rosservice call /dynamixel_controller/head_pan_controller/set_speed 1.0
$ rostopic pub -1 /dynamixel_controller/head_pan_controller/command std_msgs/Float64 -- 1.0

To relax a servo so that you can move it by hand, use the torque_enable service:

$ rosservice call /dynamixel_controller/head_pan_controller/torque_enable False
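Everything we just did from the command line can also be done from a node. Here is a minimal sketch (the node name and motion sequence are arbitrary) that assumes the dynamixels.launch file above is already running:

#!/usr/bin/env python

import rospy
from std_msgs.msg import Float64
from dynamixel_controllers.srv import SetSpeed, TorqueEnable

rospy.init_node('servo_test')

# Equivalent of the rostopic pub commands above.
pan_pub = rospy.Publisher('/dynamixel_controller/head_pan_controller/command', Float64)

# Equivalent of the rosservice calls above.
rospy.wait_for_service('/dynamixel_controller/head_pan_controller/set_speed')
set_speed = rospy.ServiceProxy('/dynamixel_controller/head_pan_controller/set_speed', SetSpeed)
torque_enable = rospy.ServiceProxy('/dynamixel_controller/head_pan_controller/torque_enable', TorqueEnable)

set_speed(0.5)        # slow the pan servo to 0.5 radians per second
rospy.sleep(1.0)      # give the publisher time to connect
pan_pub.publish(1.0)  # pan 1 radian to the left
rospy.sleep(3.0)
pan_pub.publish(0.0)  # re-center
rospy.sleep(3.0)
torque_enable(False)  # relax the servo so it can be moved by hand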
The Point Head Node

We are finally ready to introduce tf into the picture. Recall that at the start of this tutorial, we promised that we would be able to point the head to any location defined in any frame of reference. For example, if we say to the robot "look 1.2 meters forward, 0.5 meters up, and 2.1 meters to the right", where all measurements are relative to the base reference frame, how do we map these coordinates into a pair of pan and tilt angles at the head? To do this on your own, you would have to figure out all the 3-dimensional transformations (rotations and translations) that connect all the links and joints of your robot. Fortunately, that's exactly what tf does for us.

The ua-ros-pkg repository includes a node that uses tf to point the head toward a desired 3D location. However, that node (found in the wubble_actions package) uses ROS actions, which is a topic for another tutorial. So the following code is a less elegant way to do the same thing but without the fancier control that actions provide.

The Python script point_head.py found in the bin directory of the tutorial package does the work. First we'll show the whole listing, then we'll take a closer look at the more interesting parts.

#!/usr/bin/env python

import roslib; roslib.load_manifest('pi_head_tracking_3d_part1')
import rospy
import tf
from geometry_msgs.msg import PointStamped
from std_msgs.msg import Float64
import math

class PointHeadNode():
    def __init__(self):
        # Initialize new node
        rospy.init_node('point_head_node', anonymous=True)
        dynamixel_namespace = rospy.get_namespace()
        rate = rospy.get_param('~rate', 1)
        r = rospy.Rate(rate)

        # Initialize the target point
        self.target_point = PointStamped()
        self.last_target_point = PointStamped()

        # Subscribe to the target_point topic
        rospy.Subscriber('/target_point', PointStamped, self.update_target_point)

        # Initialize publisher for the pan servo
        self.head_pan_frame = 'head_pan_link'
        self.head_pan_pub = rospy.Publisher(dynamixel_namespace + 'head_pan_controller/command', Float64)

        # Initialize publisher for the tilt servo
        self.head_tilt_frame = 'head_tilt_link'
        self.head_tilt_pub = rospy.Publisher(dynamixel_namespace + 'head_tilt_controller/command', Float64)

        # Initialize tf listener
        self.tf = tf.TransformListener()

        # Make sure we can see at least the pan and tilt frames
        self.tf.waitForTransform(self.head_pan_frame, self.head_tilt_frame, rospy.Time(), rospy.Duration(5.0))

        rospy.sleep(1)

        # Reset the head position to neutral
        self.reset_head_position()
        rospy.loginfo("Ready to accept target point")

        while not rospy.is_shutdown():
            rospy.wait_for_message('/target_point', PointStamped)
            if self.target_point == self.last_target_point:
                continue
            try:
                target_angles = self.transform_target_point(self.target_point)
            except (tf.Exception, tf.ConnectivityException, tf.LookupException):
                rospy.loginfo("tf Failure")
                continue

            self.head_pan_pub.publish(target_angles[0])
            self.head_tilt_pub.publish(target_angles[1])

            self.last_target_point = self.target_point
            rospy.loginfo("Setting target point:\n" + str(self.target_point))

            r.sleep()

    def update_target_point(self, msg):
        self.target_point = msg

    def reset_head_position(self):
        self.head_pan_pub.publish(0.0)
        self.head_tilt_pub.publish(0.0)
        rospy.sleep(3)

    def transform_target_point(self, target):
        # Set the pan and tilt reference frames to the head_pan_frame
        # and head_tilt_frame defined above
        pan_ref_frame = self.head_pan_frame
        tilt_ref_frame = self.head_tilt_frame

        # Wait for tf info (time-out in 5 seconds)
        self.tf.waitForTransform(pan_ref_frame, target.header.frame_id, rospy.Time(), rospy.Duration(5.0))
        self.tf.waitForTransform(tilt_ref_frame, target.header.frame_id, rospy.Time(), rospy.Duration(5.0))

        # Transform target point to pan reference frame & retrieve the pan angle
        pan_target = self.tf.transformPoint(pan_ref_frame, target)
        pan_angle = math.atan2(pan_target.point.y, pan_target.point.x)

        # Transform target point to tilt reference frame & retrieve the tilt angle
        tilt_target = self.tf.transformPoint(tilt_ref_frame, target)
        tilt_angle = math.atan2(tilt_target.point.z,
                math.sqrt(math.pow(tilt_target.point.x, 2) + math.pow(tilt_target.point.y, 2)))

        return [pan_angle, tilt_angle]

if __name__ == '__main__':
    try:
        point_head = PointHeadNode()
        rospy.spin()
    except rospy.ROSInterruptException:
        pass

Now let's focus on some of the more important parts of the script beginning near the top:

import tf
from geometry_msgs.msg import PointStamped

Since the name of the game is frame transformations, we import the tf package. The PointStamped message type glues together a Point message type (x, y, z coordinates) with a Header message type (seq, stamp, frame_id), which therefore attaches the point to a particular frame of reference. Remember that you can always display the fields of a given message type using the rosmsg command:
$ rosmsg show geometry_msgs/PointStamped
Header header
  uint32 seq
  time stamp
  string frame_id
geometry_msgs/Point point
  float64 x
  float64 y
  float64 z

A PointStamped message is therefore exactly what we need if we want to specify a target location in a given frame, such as the base of the robot (/base_link), a current map (/map) or some other part of the robot (e.g. /left_wrist_joint).
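Filling out one of these messages in code is straightforward. The following throwaway node is just an illustration of the message structure, not part of the tutorial package; it publishes a single hypothetical target:

#!/usr/bin/env python

import rospy
from geometry_msgs.msg import PointStamped

rospy.init_node('target_sender')
pub = rospy.Publisher('/target_point', PointStamped)
rospy.sleep(1.0)  # give subscribers time to connect

target = PointStamped()
target.header.frame_id = 'base_link'
target.header.stamp = rospy.Time.now()
target.point.x = 1.0  # 1 meter forward of the base
target.point.y = 1.0  # 1 meter to the left
target.point.z = 0.0  # level with the base
pub.publish(target)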
Now we initialize the target point accordingly. We also keep track of the last target point so we can tell if the target has changed:

self.target_point = PointStamped()
self.last_target_point = PointStamped()

Next we subscribe to the /target_point topic on which we will publish PointStamped target locations:

rospy.Subscriber('/target_point', PointStamped, self.update_target_point)

The callback function update_target_point simply sets the local target_point variable to the published value:

def update_target_point(self, msg):
    self.target_point = msg

We use a pair of publishers to update the positions of the pan and tilt servos. We also select the two reference frames, head_pan_link and head_tilt_link, that will be used in transforming the target location into the appropriate motions:

# Initialize publisher for the pan servo
self.head_pan_frame = 'head_pan_link'
self.head_pan_pub = rospy.Publisher(dynamixel_namespace + 'head_pan_controller/command', Float64)

# Initialize publisher for the tilt servo
self.head_tilt_frame = 'head_tilt_link'
self.head_tilt_pub = rospy.Publisher(dynamixel_namespace + 'head_tilt_controller/command', Float64)

Next we initialize the tf listener and make sure that at least the pan and tilt frames are up and visible:

self.tf = tf.TransformListener()
self.tf.waitForTransform(self.head_pan_frame, self.head_tilt_frame, rospy.Time(), rospy.Duration(5.0))

Now we enter the main processing loop:

while not rospy.is_shutdown():
    rospy.wait_for_message('/target_point', PointStamped)
    if self.target_point == self.last_target_point:
        continue

First we wait on the /target_point topic to make sure we get a target message. If the current target is the same as the last, we skip the rest of the loop since there is nothing to do. Otherwise, we transform the target location into the pan and tilt link frames and get back the angles needed to rotate the head to point at the target:

try:
    target_angles = self.transform_target_point(self.target_point)
except (tf.Exception, tf.ConnectivityException, tf.LookupException):
    rospy.loginfo("tf Failure")
    continue

The function transform_target_point will be described shortly. Once we have our pan and tilt angles, we publish them to the dynamixel controllers to move the head:

self.head_pan_pub.publish(target_angles[0])
self.head_tilt_pub.publish(target_angles[1])

The real work is done by the transform_target_point function which, as mentioned earlier, is taken from the UA wubble head action script. Let's see how it works:

def transform_target_point(self, target):
    # Set the pan and tilt reference frames to the head_pan_frame
    # and head_tilt_frame defined above
    pan_ref_frame = self.head_pan_frame
    tilt_ref_frame = self.head_tilt_frame

The input to the function is the PointStamped target location. First we set our reference frames to the head_pan_frame and head_tilt_frame defined in the main script, namely /head_pan_link and /head_tilt_link. Since these are the frames in which head motion takes place, we have to transform the target's coordinates into these frames. It's a good idea to wait for tf to see both the reference frames (pan and tilt) and the target frame, so we do that next:

# Wait for tf info (time-out in 5 seconds)
self.tf.waitForTransform(pan_ref_frame, target.header.frame_id, rospy.Time(), rospy.Duration(5.0))
self.tf.waitForTransform(tilt_ref_frame, target.header.frame_id, rospy.Time(), rospy.Duration(5.0))

Finally, we put tf to work by using the transformPoint method to map the target point from its own frame into the pan and tilt frames. First the pan frame:

pan_target = self.tf.transformPoint(pan_ref_frame, target)
pan_angle = math.atan2(pan_target.point.y, pan_target.point.x)

In the first line above, pan_target is assigned the (x, y, z) coordinates of the target in the reference frame attached to the head_pan_link. The corresponding pan angle is then computed in the second line from the projection of these coordinates onto the horizontal x-y plane. In a similar fashion, the next two lines compute the tilt angle:

tilt_target = self.tf.transformPoint(tilt_ref_frame, target)
tilt_angle = math.atan2(tilt_target.point.z,
        math.sqrt(math.pow(tilt_target.point.x, 2) + math.pow(tilt_target.point.y, 2)))
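To make the trigonometry concrete, here is the same computation run on a hypothetical target 1 meter ahead of and 1 meter to the left of the pan/tilt frames, level with the camera:

import math

# Hypothetical target coordinates in the pan/tilt reference frame.
x, y, z = 1.0, 1.0, 0.0

pan_angle = math.atan2(y, x)                        # 0.785 rad: pan 45 degrees to the left
tilt_angle = math.atan2(z, math.sqrt(x**2 + y**2))  # 0.0 rad: the target is level, so no tilt
print pan_angle, tilt_angle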
With both angles computed, we return them to our main loop:

return [pan_angle, tilt_angle]

This completes the point_head node. Once launched, the node will listen on the /target_point topic and when a target message is received, it will move the pan and tilt servos to point the head in the appropriate direction. To test the node, first make sure you have launched the dynamixel controllers if they are not already running:

$ roslaunch pi_head_tracking_3d_part1 dynamixels.launch

Then move into another terminal and launch the point_head node:

$ roslaunch pi_head_tracking_3d_part1 point_head.launch

Shortly after launching the point_head node, the pan and tilt servos should move to their neutral positions and you should see the following messages on screen:

process[ax12_controller/point_head_node-1]: started with pid [878]
[INFO] [WallTime: 1298908182.741528] Ready to accept target point

At this point, bring up another terminal and try publishing a target on the /target_point topic. The following command sets the target location 1 meter forward, 1 meter to the left, and 0 meters upward relative to the base_link frame:

$ rostopic pub -1 /target_point geometry_msgs/PointStamped "{ header: { frame_id: base_link }, point: { x: 1.0, y: 1.0, z: 0.0 } }"

If all goes well, your robot's head should pan to the left and tilt downward. Now try the almost identical command, but change the frame_id from base_link to torso_link:

$ rostopic pub -1 /target_point geometry_msgs/PointStamped "{ header: { frame_id: torso_link }, point: { x: 1.0, y: 1.0, z: 0.0 } }"

You should have noticed the head tilt up ever so slightly. Why? Since the torso reference frame is slightly above the base reference frame, giving the same coordinates relative to this upward-shifted frame refers to a higher point in the world, and the robot's head tilts up accordingly.

You can move the head back to the neutral position using the command:

$ rostopic pub -1 /target_point geometry_msgs/PointStamped "{ header: { frame_id: head_pan_link }, point: { x: 100.0, y: 0.0, z: 0.0 } }"

Note how in the command above, we have set the target relative to the head_pan_link frame. Since this frame is fixed relative to the rest of the robot below the head, setting a target far away and straight ahead (y = 0) centers the head regardless of the head's current position.

On the other hand, note what happens if we repeatedly publish a fixed set of target coordinates relative to the head_link frame, which is attached to the head itself and therefore moves with the head. (Note that rostopic pub -r 1 repeats the message once per second.)

$ rostopic pub -r 1 /target_point geometry_msgs/PointStamped "{ header: { frame_id: head_link }, point: { x: 5.0, y: 0.0, z: 0.01 } }"

(Type Ctrl-C to stop the motion before it goes too far.) After issuing the command above, you should see the robot head tilt further and further up/back each time the message is published. Why? The target coordinates (x: 5.0, y: 0.0, z: 0.01) specify a location 5 meters forward and 0.01 meters (1 cm) upward from the frame attached to the head. When we first publish this target, the head tilts a little bit upward, but so does the head_link frame that is attached to the head. So when we publish the same coordinates again, they are relative to this new frame and again the head tilts "upward" relative to this new frame.

Head Pointer GUI

Pointing the head using the command line is a little tedious, so the tutorial package includes a simple GUI to make testing a little easier. The code for this GUI is based on Mike Ferguson's most excellent ArbotiX controller GUI. Small modifications were made to allow the selection of a reference frame and the specification of a target location relative to this frame.

Once you have launched the dynamixels.launch file and the point_head.launch file, you can fire up the GUI using the command:

$ rosrun pi_head_tracking_3d_part1 head_pointer_gui.py

Use the Select Frame pull-down menu to select the reference frame in which you want to specify the target location. Then enter the x, y and z coordinates of the target and click the Point Head button. To re-center the head, click the Reset Position button.

Looking Ahead to Part 2

In Part 2 of the tutorial, we will combine the head pointing nodes we have already developed with 3D visual targets using the Kinect RGB-D camera together with OpenNI skeleton tracking.

Copyright © 2006-2016 by Patrick Goebel
