
Mikkel Viager

Flexible Mission Execution for Mobile Robots


Individual course report, July 2012


Flexible Mission Execution for Mobile Robots

Report written by: Mikkel Viager
Advisors: Jens Christian Andersen, Nils Axel Andersen, Anders Billes Beck

DTU Electrical Engineering
Technical University of Denmark
2800 Kgs. Lyngby, Denmark
studieadministration@elektro.dtu.dk

Project period: 16/01/2012 - 31/05/2012
ECTS: 10 points
Education: M. Science
Field: Electrical Engineering
Class: Public
Remarks: This report is submitted as partial fulfillment of the requirements for graduation in the above education at the Technical University of Denmark.
Copyrights: Mikkel Viager, 2012


Abstract
This report is the documentation for an individual course undertaken during the spring semester of 2012. Through the project, a set of tools allowing flexible mission execution has been developed and demonstrated with an AGV demonstration. Focusing on the future usability of the developed tools, each chapter explains the information needed to make each tool run in any similar mission execution framework. Furthermore, all code for a functional demonstration is included and can be used as a reference example.

Mikkel Viager, July 2012


Table of contents
1. Introduction
1.1 Project goal
1.2 Project limitations
1.3 Documentation overview
2. Physical Equipment
2.1 The Guidebot
2.2 Joypad
2.3 Electromagnet
2.4 Carts
2.5 Kinect sensors
2.6 LCD screen
3. Kinect Sensor calibration
3.1 Kinect tilt calibration rule
4. Guidemark localization rule
4.1 Configuration
4.2 Rule procedure
5. Cart handling rule
5.1 Configuration
5.2 Rule procedure
6. Conclusion
6.1 Further work
Appendix A: Kinect accelerometer experiments
Appendix B: Code


1. Introduction
Live demonstrations of mobile robots often take a long time to develop, and cannot easily be modified once fully implemented. Since many such demonstrations are set in dynamic environments, or completely portable for exhibitions, it is desired to be able to adapt the robot to a new or changed environment without having to create a new implementation from scratch. The Guidebot mobile robot platform is used for research and demonstration purposes by DTU (the Technical University of Denmark) and DTI (the Danish Technological Institute). Both find it desirable to have the capability of creating and adapting new missions relatively easily.

1.1 Project goal


The goal of this project is to create and demonstrate a set of tools, with both software and hardware modules, directly usable in a simple flexible mission execution framework. Documentation focused on the future use of the tools should be made available, encouraging incorporation into new missions and making implementation as straightforward as possible.

1.2 Project limitations


DTI and DTU want the project to focus on the concept of a tow-type AGV (Automated Guided Vehicle), where one robot transports simple carts between predefined areas. Furthermore, it is requested that the tools are developed and tested on the Guidebot platform, based on the concept of rules for mobotware.

1.3 Documentation overview


In accordance with the project goals, this report focuses mainly on the details of the final implementation of the toolset, and not much on the development process. Each chapter is meant to be useful on its own, allowing lookup as in a manual.

Chapter 2: Primary physical equipment created and used.
Chapter 3: Kinect sensors and automatic calibration using the accelerometer.
Chapter 4: Guidemark localization rule details.
Chapter 5: Cart handling rule configuration and behavior details.
Chapter 6: Project conclusion.
Appendix A: Kinect accelerometer experiment details.
Appendix B: Code files.

2. Physical Equipment
Extending the existing platform with hardware add-ons has been necessary to meet the project goals. The following sections give a summary of the robot itself and individual descriptions of the major hardware add-ons created and used during this project.

2.1 The Guidebot


The Guidebot platform has a two-wheel centered differential drive configuration with castor wheels as third and fourth points of contact to prevent tipping. Because of its industrial-grade motors and encoders, as well as the firm rubber tires, the odometry precision of this robot type is remarkably high. Running on a dual-core Atom processor, the system is capable of handling the image data from the two Kinects decently. With fully charged batteries, the robot can run continuously for around 15 hours and recharge overnight, making it very useful for demonstrations at expos or conventions.

2.2 Joypad
An option for manually maneuvering the robot has proven to be highly desirable for e.g. moving the robot to its initial start position of a demo or simply general transportation. To make this possible, a wireless 2.4GHz joypad has been interfaced into the rhd (robot hardware daemon) of mobotware, allowing manual control of the robot directly after boot-up, activated by a button press on the joypad. A short instruction manual for remote operation with the joypad can be found in appendix C.

2.3 Electromagnet
Mounted on the lower backside of the robot, an electromagnet has been firmly fitted using only existing holes in the robot chassis, with a highly modified mounting bracket. The electromagnet itself can easily be removed for repair or replacement by removing the two unbraco screws on the top side and detaching the wiring behind the aluminum cover facing outwards.

The electromagnet is powered directly from the profibus with 24V, drawing 500mA current while activated. Originally designed as a secure door lock, the electromagnet has a carrying capacity of 500kg. Sets of one electromagnet and one metal anchor are available at danbit.dk [1], where the mounting bracket can be bought separately. A special arrangement was made to purchase additional metal anchors for use with the carts.

2.4 Carts
In order to interfere as little as possible with the kinematics of the robot, castor wheels were chosen for the mobile carts. Several professional-quality carts, rated at a 500 kg carrying capacity, were ordered through 3x34 [2] for use in this project. The metal anchors have been mounted on the bottom side of the carts, using modified mounting brackets and rubber shock absorbers that allow some flexibility in order to avoid stressing the connection too much. An example of a completely rigged cart is shown on the next page.


2.5 Kinect sensors


Two Kinect sensors, made for use with the Xbox 360, are mounted on top of the robot in a modular mounting system allowing easy reconfiguration of direction and tilt angles. The Kinect sensors are powered by an internal 12 V power supply and run at below ~5 W each, as determined in a previous project [3]. Due to limitations of the USB bus of the onboard computer, it is not possible to have several Kinect sensors running in depth-mode simultaneously. In this case it is chosen to use only the normal cameras, making it possible to receive uninterrupted image data from both Kinects simultaneously. Even though the depth function is not used in this particular project, the Kinect sensor was still chosen over standalone VGA cameras due to price, future expansion options, and the Kinect's internal accelerometer, which is already interfaced for use with mobotware. In the image to the left, the two Kinect sensors are configured as both pointing forwards. For the final version of the demonstration it has been chosen to have one Kinect sensor pointing forwards and the other backwards, mainly to prevent having the robot turn around completely whenever it is desired to look in a specific direction.

2.6 LCD screen


An 8-inch LCD display has been fitted to the top of the Guidebot, where it is attached to the aluminum pole similar to the Kinect sensors. This screen is meant for debugging or reconfiguration in areas where no wireless access is available for an SSH connection. The screen also has resistive touch capability, and can be set up to provide a graphical user interface in the future.

3. Kinect Sensor calibration


In order to allow robust positioning based on images of guidemarks, it is very important to know the pose of each camera as precisely as possible, as well as any distortion of the image caused by the camera lens. The problem of lens distortion is handled directly in the mobotware camera server by setting up a push command for rectifying all images to a separate image pool. Calibration is completed only once, as the properties are assumed to be non-changing, and set in the ucamserver.ini file as in this example:
var campool.device18.distortion="0.24042, -0.72509, -0.00018, -0.00022, 0.72285"
var campool.device18.intrinsic="525.829, 0.0, 322.205; 0.0, 525.829, 257.529; 0.0, 0.0, 1.0"

And the push command can be set as:


poolpush img=18 cmd="rectify device=18 src=18 dst=30 silent"

Along with this, it is necessary to specify the focal length, physical pose in robot coordinates and various settings for the Kinect plug-in. An example ucamserver.ini file with settings for running 2 Kinects simultaneously can be found in appendix B.

3.1 Kinect tilt calibration rule


When setting the camera pose in the ucamserver.ini file it is important to have as precise values as possible, since any error in these will cause considerable guidemark positioning errors. The position of the camera in robot coordinates can be considered relatively static, and any error will directly translate into the position measurements of guidemarks. However, when it comes to the angular direction of the camera, errors can have a high impact on the measurement calculation precision, leading to a critical decrease in reliability of a running demonstration depending on this data for positioning. To minimize this positioning error, an addition to the mobotware Kinect plug-in has been developed, along with a rule allowing recalibration of a Kinect sensor whenever the robot itself is not accelerating, usable at any time during a mission. An example of how to use this feature can be seen in the end part of carthandling.rule in appendix B. An instance of the calibration rule itself can be seen in tiltkinectcalibratefront.rule in appendix B. One instance of the rule should be configured for each Kinect by modifying the device number within the rule file itself. The detailed development documentation for the Kinect accelerometer value calculations can be found in appendix A.

4. Guidemark localization rule


With the capability of determining positions of guidemarks in relation to the robot, a rule has been created to reset the localizer plug-in from such relative poses. This requires all waypoint guidemarks to have unique IDs, each registered with world coordinates in the map.xml file. It is also important to set all waypoint guidemarks as stationary. An example can be seen in the file DTU326-map.xml in appendix B.

The gmklocalize rule requires that the ulmsserver, ucamserver and mrc are running:
ulmsserver for localization with the localizer plug-in.
ucamserver for getting guidemark information with the gmk plug-in.
mrc for handling acquired and calculated poses (mainly odometry).

The rule has 3 global variables which can be initialized to default values. The below defaults are recommended:
<!-- Global variables -->
global.gmklocalize.run = false
global.gmklocalize.silent = true
global.gmklocalize.successtime = -1

The run variable can be set to true to activate the rule and thereby have it keep running until one successful guidemark-localization has been made. The silent variable can be set to false to get a lot of debug information printed in the console. The successtime variable holds the timestamp for when the last gmklocalization was successfully completed. This is meant to serve as a way of detecting whether the rule has completed its task, in case this is a requirement for a higher task or program.
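In a mission rule, activation can then follow the same pattern as used for the cart handling rule later in this report: set run to true and wait for successtime to become recent. A minimal hypothetical usage fragment (not taken from the demonstration files):

```
# Hypothetical mission fragment: trigger one guidemark localization
global.gmklocalize.silent = true
global.gmklocalize.run = true
# block until the rule reports a localization completed within the last second
wait() : (now() - global.gmklocalize.successtime < 1.0)
```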

4.1 Configuration
All configuration of the rule is done by changing the values of the configuration constants:
<!-- Configuration constants -->
blocksize = 0.025
poolImg = 30
timeoutlimit = 4
linemapname = "data/DTU326-lines.xml"
cov[0] = 0.05
cov[1] = 0.05
cov[2] = 0.05

This makes it possible to configure the following:


blocksize = 0.025

The size (in meters) of the black squares in the guidemarks to be detected.

poolImg = 30

The image pool number containing a rectified image ready for detection of guidemarks.
timeoutlimit = 4

Limit on how many attempts should be made to find a guidemark in an image before giving up and requesting a new image instead. This should be set in accordance with how often the rulebase is run.
linemapname = "data/DTU326-lines.xml"

A string with the relative path to an xml file containing line-map localization data of the environment. This is only used for re-initializing the localizer and should not contain the definitions of guidemarks.
cov[0] = 0.05
cov[1] = 0.05
cov[2] = 0.05

Contains the desired covariance values for use in resetting the localizer upon successful gmklocalization. These should be determined by the expected precision of the gmklocalization.

4.2 Rule procedure


The rule can be activated at any time, and keeps running until a guidemark has been found and the localizer reset. This is made possible by taking into account the possibility that the robot is moving while trying to find a guidemark, calculating any position difference from odometry values. If the robot is moving along a path after activation of the rule at t0, and receives a picture from the camera at time t1, it will take some time to process the image and detect a guidemark. Once the detection is complete and the position is calculated, the robot will have moved to a new position at time t2. By using the timestamp of the image with the guidemark, it is possible to look back into the pose history and determine the difference between the positions at t1 and t2 from odometry data. The localizer is then reset to the position at t1 plus the difference between the positions at t1 and t2, which is still very accurate because of the high odometry precision and since the travelled distance is relatively short. In cases where the robot is stationary during the entire localization, the position difference between image capture and completed calculation is simply zero.

5. Cart handling rule


In order to make it easy to utilize the functionality of picking up carts with the magnet, a rule has been created to take care of this. This makes it possible to have a robust preprogrammed routine take over control of the robot while performing a pick-up or drop-off, and return to the mission upon completion of the task. Activation of the rule is done by setting one of the following values to true:
global.carthandling.pickup
global.carthandling.dropoff

And then waiting for the equivalent successtime variable to be updated, indicating that the procedure was completed and that the mission should now continue:
wait() : (now() - global.carthandling.pickupsuccesstime < 1.0)
wait() : (now() - global.carthandling.dropoffsuccesstime < 1.0)

An example implementing this rule can be seen in dockdemo.rule in appendix B.

5.1 Configuration
The rule is configured similarly to the localization rule, by modifying the indicated configuration variables in the top of the rule file. In this case it is also possible to reconfigure some of the values during the mission, allowing individual configuration for specific pick up points if desired, by setting some of the global variables. Descriptions for all configuration values are available directly in the carthandling.rule file in appendix B. In case of uncertainty about the impact of a configuration variable, a complete run-through of the rule procedure is given in the following section.

5.2 Rule procedure


Upon setting the variable global.carthandling.pickup=true, the following sequence of actions is completed. (A demonstration video incorporating this procedure can be found on the CD or via the following link: http://youtu.be/ppCNU5yhKz0)

5.2.1 Find guidemark on cart


The front Kinect is used for acquiring a new image of the environment in front of the robot, which is then searched for any guidemarks. All guidemarks found are noted and compared to known guidemarks from the map file in the order of discovery. If no guidemarks are recognized or accepted, a new image is requested and the procedure is run again. After a preconfigured number of retries (ruling out simple image noise as the sole problem), the robot initiates a search for the cart in its immediate surroundings (if configured to do so). Turning left and right in preconfigured intervals, the robot thoroughly searches its surroundings for the cart, eventually giving up after completing the predefined sweep angle interval if there is still no guidemark found to be acceptable. When a guidemark is matched in the list from the map file, it is verified that the guidemark is defined with the value of gmkstationary being false. This prevents the robot from accidentally trying to pick up guidemarks meant for localization. Once the guidemark has been verified, the position of the cart in relation to the robot is requested from the guidemark plug-in.

5.2.2 Moving in position and picking up cart


A distance is defined in the configuration variables, determining the first position the robot should move to (defined as a single value indicating how far directly in front of the cart the initial position should be). It is important to make sure that this position is not so close to the cart that the rear Kinect cannot see the guidemark on the cart. The robot moves to this position and turns to point the electromagnet directly towards the metal anchor of the cart.

Switching to the Kinect pointing backwards, the guidemark is re-detected in order to confirm that the heading angle is aligned properly, as well as to calculate the distance. Any angle offset is corrected and the robot slowly closes in on the cart after turning the electromagnet on. A configurable value indicates how much further than calculated the robot should move when moving backwards to dock with the cart. A small value of around 0.02-0.05 m has proven to provide robust pick-up maneuvers. At this point the electromagnet is hopefully touching the metal anchor, and a small twist to both left and right is completed to make sure that there is full surface contact.

After a very short move forward, the status of the micro switch below the electromagnet is evaluated to verify whether the pick-up attempt was a success. If the micro switch is not pressed, the robot moves back to the initial position (still directing the electromagnet towards the cart) and retries the alignment and docking procedure. After a predefined number of failed tries it moves even further away, turns around to face the cart with its front, and starts searching for the cart as in the beginning of the rule. When the micro switch is evaluated as pressed after a pick-up attempt, the global successtime value is updated to indicate the successful pickup, and the cart handling rule returns control of the robot to the mission procedure.


6. Conclusion
With the new hardware and software it is possible to write and easily reconfigure a demonstration, as shown in the video of the robot running the dockdemo.rule. The Kinect calibration functionality will most likely prove to be very useful in the near future, as most of the robots at Automation and Control at DTU will soon be fitted with Kinect sensors. Localization by guidemarks has proven to be a very good way of initializing starting positions for any mission in mobotware which utilizes the laser scanner localization plug-in, as it greatly reduces the initial positioning precision required for successful operation. Automated handling of picking up and dropping off carts has proven to be both robust and efficient. Only a very small percentage of the pickup attempts are unsuccessful, and in those cases the software quickly corrects any positioning error and makes further docking attempts automatically. The simple activation commands allow for future use with automated mission planner software for navigation, as well as other future software and functionalities.

6.1 Further work:


Some obvious approaches to further improve the tool set would be:

- Development of detection software making the robot aware of its surroundings, thereby greatly decreasing the safety risks of moving carts blindly.
- Development of path planning and navigation software capable of navigating with extra care whenever the robot is carrying a cart.
- Increasing the maximum allowed resolution in the guidemark detection plug-in, allowing full Kinect image resolution and robust use of the small guidemarks.

The goal of developing a set of tools allowing simple and flexible mission execution of an AGV demonstration has been achieved. A video of the demonstration example can be seen here: http://youtu.be/ppCNU5yhKz0 and all files used in the demonstration are available from here: http://blog.mivia.dk/files/dockdemo.zip

References and links


[1] http://www.danbit.dk/produkter/2906.phtml (17-07-2012)
[2] http://www.flyttebutik.dk/shop/flyttehund-191p.html (17-07-2012)
[3] M. Viager, Analysis of Kinect for Mobile Robots, 2011. http://blog.mivia.dk/2011/06/my-kinect-datasheet/


Appendix A: Kinect accelerometer experiments


The following details are from the experiments with the accelerometer of the Kinect.

Accelerometer orientation

[Figures 1-9, photos of the Kinect in nine different orientations, are not reproduced here.]

Image | Acc1 | Acc2 | Acc3
------+------+------+------
  1   |  0   |  10  |  0
  2   |  0   |  0   |  10
  3   |  10  |  0   |  0
  4   |  7.0 |  7.0 |  0
  5   | -7.0 | -7.0 |  0
  6   | -6.6 |  0   |  6.6
  7   |  7.0 |  0   |  7.0
  8   |  0   |  7.1 | -7.1
  9   |  0   |  7   |  7


From these accelerometer measurements it is possible to determine the orientation of the accelerometer, and to set up the corresponding left-handed Cartesian coordinate system. It is noticed that the orientation of this coordinate system is not ideal in terms of using the Kinect as a sensor for a mobile robot. A robot coordinate system, with the x-axis oriented along the direction of forward movement, is also defined. [Figure showing the orientation of the coordinate systems for accelerometer and robot not reproduced here.]

The accelerometer values (acceleration in m/s^2 along each axis) are given from the Kinect in the order (acc1, acc2, acc3), and can be converted to robot coordinates (acc.x, acc.y, acc.z), where subscript acc is for the accelerometer and subscript R is for the defined robot coordinates.

Unreliability during robot acceleration

The magnitude of the total resulting acceleration vector can then be calculated as:

|a| = sqrt(acc.x^2 + acc.y^2 + acc.z^2)

This must be a downwards-pointing vector of approximate magnitude 9.81 m/s^2 in all cases where the Kinect sensor is stationary or moving at constant speed. Thus, if the Kinect is mounted on a mobile robot, the acceleration vector will be affected by any acceleration or deceleration of the robot.

Kinect orientation in radians

When defining the angular orientation of the Kinect sensor, it is desired to do it in the form (Omega, Phi, Kappa), corresponding to rotation around its own x, y and z axes respectively. In most cases the Kinect will be pointing in the same direction as the robot, making only the up- and downwards tilt, defined by Phi, interesting. Defining zero tilt around x and y as being horizontal instead of vertically downwards, and choosing positive angles to indicate a downwards tilt of the camera side, the Kinect orientation (Omega, Phi, Kappa) in robot coordinates can then be described as:

Omega = arctan(-acc.y / -acc.z)
Phi   = arctan(acc.x / -acc.z)
Kappa = arctan(-acc.y / acc.x)

Keeping in mind that the calculation of Kappa is not to be trusted, these calculations are sufficient for cases where the Kinect sensor is mounted normally, but they will cause problems if the Kinect is tilted more than pi/2 around either xR or yR, e.g. if it is mounted upside down. This is taken care of by using the atan2 function in the implementation, such that:

Omega = atan2(-acc.y, -acc.z)
Phi   = atan2(acc.x, -acc.z)
Kappa = atan2(-acc.y, acc.x)

Using these equations, the following orientations have been calculated for the 9 tilt example figures:

Figure | Omega | Phi   | Kappa
-------+-------+-------+------
   1   |   0   |   0   |   0
   2   |   0   | -pi/2 |  pi/2
   3   | -pi/2 |   0   |  pi/2
   4   | -pi/4 |   0   |  pi/4
   5   |  pi/4 |   0   |  pi/4
   6   |  pi/4 | -pi/4 |  pi/2
   7   | -pi/4 | -pi/4 |  pi/2
   8   |   0   | -pi/4 |  pi/4
   9   |   0   |  pi/4 |  pi/4

Changes made to ufunckinect.h: Added around line 140:


UVariable * varOrient;

Changes made to ufunckinect.cpp: Added around line 323:


varOrient = addVarA("orient", "0 0 0", "3d", "(r) rotation around robot coordinate axes [Omega, Phi, Kappa]");

Added around line 692 (in function run()):


URotation orient;

Added later in function run() in end of accelerometer value update:


// calculate new orientation based on acc values
// (roll) rotation around x, based on values in the y,z-plane
orient.Omega = atan2(-acc.y, -acc.z);
// (tilt) rotation around y
orient.Phi = atan2(acc.x, -acc.z);
// kappa (yaw) can not be calculated safely
orient.Kappa = atan2(-acc.y, acc.x);
varOrient->setRot(&orient, &t);

Result

The calculated values are available in var kinect.orient as [Omega, Phi, Kappa], updated as often as the accelerometer data updates.


Appendix B: Code
This appendix contains all code from the files used in the final demonstration, and can be considered a viable example of how to use the tools. All the .ini and .conf files come first, followed by the rule files, in the order listed here:

ucamserver.ini
ulmsserver.ini
aunav.ini
robot.conf
tiltkinectcalibratefront.rule
gmklocalize.rule
carthandling.rule
dockdemo.rule
DTU326-map.xml

All of the code files are available from the CD or the following web address: http://blog.mivia.dk/files/dockdemo.zip

ucamserver.ini
module load="var"
module load="aukinect.so.0" alias="kinectfront"
module load="aukinect.so.0" alias="kinectback"
module load="gmk"
module load="auvarmrc.so.0"
module load=aucamrectify.so.0

camset device=18 focallength=510
camset device=28 focallength=510
var campool.device18.camPose="0.07 -0.015 0.805 0 0.53 0"
var campool.device28.camPose="-0.115 0.015 0.805 0 0.53 3.14159"
var campool.device18.distortion="0.24042, -0.72509, -0.00018, -0.00022, 0.72285"
var campool.device28.distortion="0.24042, -0.72509, -0.00018, -0.00022, 0.72285"
var campool.device18.intrinsic="525.829, 0.0, 322.205; 0.0, 525.829, 257.529; 0.0, 0.0, 1.0"
var campool.device28.intrinsic="525.829, 0.0, 322.205; 0.0, 525.829, 257.529; 0.0, 0.0, 1.0"
var kinectfront.kinectNumber=0
var kinectback.kinectNumber=1
var kinectfront.imagesC3D="18 19 20 22"
var kinectback.imagesC3D="28 29 26 27"
var kinectfront.camDeviceNum="18 19"
var kinectback.camDeviceNum="28 29"
var kinectfront.useDepth="0 0"
var kinectback.useDepth="0 0"
#var kinectfront.desiredResolution=2
#var kinectback.desiredResolution=2
var gmk.diagonal=0
kinectfront open=true
kinectback open=true
poolpush img=18 cmd="rectify device=18 src=18 dst=30 silent"
poolpush img=28 cmd="rectify device=28 src=28 dst=31 silent"


ulmsserver.ini
server datapath="data"
# script to load URG and simulated laser scanner module
module load="laserpool"
module load="odopose"
#module load="auv360.so.0"
module load="ulmsv360.so.0"
module load="var"
module load="mapPose"
scanset devtype=lms100 devname="lms100:2111"
scanset replaySubdir="log"
scanset def=lms100
# set scanner position in robot coordinates
scanset x=0.3 y=0.0 z=0.25
scanset mirror=true
scanset width=240
scanpush cmd="v360 update"
#module load="auefline.so.0"
module load="ulmspassable.so.0"
module load="aulobst.so.0"
module load="aurule.so.0"
module load="aulocalize.so.0"
module load="auplan.so.0"
module load="aupoly.so.0"
module load="mapbase.so.0"
module load="aulmsmrcobst.so.0"
module load="auvarmrc.so.0"
module load="aumapobst.so.0"

#load mapbase
mapbase graphload="./data/DTU326-map.xml"
mapbase mapload="./data/DTU326-lines.xml"
# load module to utilize mapped obstacles
var mapobst.marginSolidFactor=0.3
var mapobst.marginFluffyFactor = 1.1
mapobst map2localize
#load settings for guidebot
var localize.wheelBase=0.51
#set initial pose
setinitpose x=0.0 y=0.0 th=0.0
setinitcov Cx=0.01 Cy=0.01 Cth=0.01
scanget
scanget


aunav.ini
# test of passable interval
# open server command log
server imagepath="log"
server datapath="data"
server replayPath="log"
server port=24922
server serverlog
#
module load='var'
module load='imagepool'
module load=odoPose
module load=utmPose
module load=mappose

module load="mapbase.so.0"
module load="aulaserif.so.0"
module load="ausmr.so.0"
module load="auplanner.so.0"
module load="audrivepos.so.0"
module load="auroaddrive.so.0"
module load="aupoly.so.0"
module load="auavoid.so.0"
module load="aurule.so.0"
#module load="aurhdif.so.0"
module load=if alias=laser
# ensure laser interface gets the road and obstacle info
# add a variable handler to client
laser add=var
laserdata add=road
laserdata add=obst
# request a copy of all road variables
laserOnConnect cmd="laser varpush struct=obst flush"
laserOnConnect cmd="laser varpush struct=obst cmd='obst'"
laserOnConnect cmd="laser mapPosePush flush"
laserOnConnect cmd="laser mapPosePush cmd='mappose pose'"
laserOnConnect cmd="laser odoPosePush flush"
laserOnConnect cmd="laser odoPosePush cmd='odopose pose'"
# connect to this testserver, as it is laser server too.
laser connect=localhost:24919
# interface to camera server
module load=if alias=cam
module load=camdata
camdata add=gmk
camdata add=path
camdata add=cam
#request a copy of all gmk variables
camOnConnect cmd="cam var gmk copy"
camOnConnect cmd="cam varpush struct=gmk flush"
camOnConnect cmd="cam varpush struct=gmk cmd='var gmk copy'"
#request a copy of kinectfront.orient variables
camOnConnect cmd="cam var kinectfront copy"
camOnConnect cmd="cam varpush struct=kinectfront flush"
camOnConnect cmd="cam varpush struct=kinectfront cmd='var kinectfront copy'"
#request a copy of kinectback.orient variables
camOnConnect cmd="cam var kinectback copy"


camOnConnect cmd="cam varpush struct=kinectback flush"
camOnConnect cmd="cam varpush struct=kinectback cmd='var kinectback copy'"
cam connect=localhost:24920
#Load SMR Module and connect to
smrConnect cmd="smr do='laser "mapposepush cmd='localize getonly'"'"
smr connect=localhost:31001
#Load mapbase plugin
mapbase graphload="./data/DTU326-map.xml"
mapbase mapload="./data/DTU326-lines.xml"
#Add planner
planner verbose
#Reduce period time for debugging
var planner.period = 0.25
var planner.speed = 0.15
var planner.accel = 0.1
rule load="./gmklocalize.rule"
rule load="./tiltkinectcalibrateback.rule"
rule load="./tiltkinectcalibratefront.rule"
rule load="./carthandling.rule"
rule load="./dockdemo.rule"
rule resume

robot.conf
<?xml version="1.0" ?>
<!--
Configuration file for SMRdemo
This file is for the SMR
Robot types differential, ackerman
Name smr, mmr, ackerbot, hako
-->
<robotinfo type="differential" name="guidebot" >
<debug/>
<routeinput module="mrc" parlist="motionconnect" par="obstaclefront" var="l3" />
<odometry
  cl ="0.00001012"
  cr ="0.00001012"
  w ="0.51"
  ts ="0.01"


  maxtick ="10000"
  control ="0"
/>
<motioncontrol
  ts ="0.01"
  line_gain ="0.05"
  line_tau ="20"
  line_alfa ="0.3"
  wall_gain ="1"
  wall_tau ="0.07"
  wall_alfa ="0.02"
  drive_kdist ="4.8"
  drive_kangle="2.25"
  kp ="1.7"
  gain ="0"
  tau ="0"
  alfa ="0"
  w ="0.52"
  lim ="0.2"
  stopdist ="0.3"
  alarmdist ="0.1"
  velcmd ="0.3"
  acccmd ="1"
  nolinedist ="0.2"
/>
<motorcontrol
  velscalel ="10000"
  velscaler ="8350"
  kp ="66"
  ki ="10"
/>
<linesensor size ="16" k_filt ="0.5" />
<irsensor />
<cameraserver />
<gpsmouse hostname ="localhost" port ="9500" SBAS ="0" run ="1" use ="1" />
</robotinfo>


tiltkinectcalibratefront.rule
<?xml version="1.0" ?>
# Detects current Kinect tilt angle and sets device values accordingly
<rule name="tiltkinectcalibratefront" run="true">
<init>
# Global variables
global.tiltkinectcalibratefront.run = false
global.tiltkinectcalibratefront.silent = false
global.tiltkinectcalibratefront.successtime = -1
# Configuration constants
device = 18
# Local variables
rotOmega = 0.0
rotPhi = 0.0

<rule name="recalibrate" if="(global.tiltkinectcalibratefront.run) " >
  # Detect tilt angles and set variables
  rotOmega = getvar('ifcam.kinectfront.orient[0]')
  rotPhi = getvar('ifcam.kinectfront.orient[1]')
  core.send('cam camset device='device' rotOmega='rotOmega' rotPhi='rotPhi)
  if(!global.tiltkinectcalibratefront.silent)
    print("Tilt values set: rotOmega="rotOmega" rotPhi="rotPhi)
  global.tiltkinectcalibratefront.successtime = now()
  global.tiltkinectcalibratefront.run=false
</rule>
</init>
wait() : false
</rule>


gmklocalize.rule
<?xml version="1.0" ?>
<!-- Guidebot guidemark-localization rules -->
<!-- Use: Modify configuration constants below and load in server alongside ucamserver and mrc. -->
<!-- Upon global.gmklocalize.run=true runs until one successful gmklocalization is completed. -->
<!-- Silence via global variable if desired, and use the global "successtime" to verify completion -->
<rule name="gmklocalize" run="true">
<init>
<!-- Global variables -->
global.gmklocalize.run = false
global.gmklocalize.silent = false
global.gmklocalize.successtime = -1
<!-- Configuration constants -->
blocksize = 0.025
poolImg = 30
timeoutlimit = 4
linemapname = "data/DTU326-lines.xml"
cov[0] = 0.05
cov[1] = 0.05
cov[2] = 0.05
<!-- Local rule variables -->
gmkready = false
canrequestnewgmk = false
gmkname = 'guidemark'
gmkcrc = 0.0
testval = 0.0
gmkvalue = 0.0
IDindex = 0.0
gmkinmap = -1
gmkstationary = 0
robpose[0] = 0.0
robpose[1] = 0.0
robpose[2] = 0.0
robpose[3] = 0.0
robpose[4] = 0.0
robpose[5] = 0.0
gmkworldpose[0] = 0.0
gmkworldpose[1] = 0.0
gmkworldpose[2] = 0.0
ododeltastart[0] = 0.0
ododeltastart[1] = 0.0
ododeltastart[2] = 0.0
ododeltaend[0] = 0.0
ododeltaend[1] = 0.0
ododeltaend[2] = 0.0
ododeltaend[3] = 0.0
posetolocalize[0] = 0.0
posetolocalize[1] = 0.0
posetolocalize[2] = 0.0


gmktimestamp = 0.0 gmkimgserial = 0.0 lastgmkimgserial = ifcam.gmk.imageserial[1] timeoutcount = 0.0 x0 = 0.0 y0 = 0.0 th0= 0.0 x = 0.0 y = 0.0 th= 0.0 resx = 0.0 resy = 0.0 resth= 0.0 resxdelta = 0.0 resydelta = 0.0 resthdelta= 0.0 <rule name="askgmk" if="(global.gmklocalize.run & !gmkready) " > <!-- Take an image and get gmk information --> gmkvalue = -1 lastgmkimgserial = ifcam.gmk.imageserial[1] core.send('cam gmk img='poolImg' block='blocksize) gmkready = false global.gmklocalize.run = false canrequestnewgmk = true </rule> <rule name="verifygmk" if="!gmkready & canrequestnewgmk"> <!-- Evaluate if there is a new guidemark ready --> gmkimgserial = ifcam.gmk.imageserial[1] if(gmkimgserial != lastgmkimgserial & ifcam.gmk.count > 0) <block> lastgmkimgserial = gmkimgserial if(!global.gmklocalize.silent) print("Found " ifcam.gmk.count " new guidemark(s) with ID(s): "ifcam.gmk.IDs" in image with serial: " gmkimgserial) <!-- go through all detected guidemarks, or until one is defined in map --> IDindex = 0 gmkcrc = 0 gmkinmap = -1 gmkstationary = 0 while ((gmkcrc < 1 | gmkinmap < 0 | gmkstationary < 1) & IDindex <= ifcam.gmk.count-1) <block> gmkvalue = getvar('ifcam.gmk.IDs['IDindex']') gmkinmap = mapbase.connectorGlobal(gmkname gmkvalue,'id.value') gmkstationary = mapbase.connectorGlobal(gmkname gmkvalue,'stationary.bool') gmkcrc = getvar('ifcam.gmk.gmk' gmkvalue '.crcOK') IDindex = IDindex + 1 </block> if(gmkcrc > 0 & gmkinmap >=0)


<block>
robpose[0] = getvar('ifcam.gmk.gmk' gmkvalue '.robPose[0]')
robpose[1] = getvar('ifcam.gmk.gmk' gmkvalue '.robPose[1]')
robpose[2] = getvar('ifcam.gmk.gmk' gmkvalue '.robPose[2]')
robpose[3] = getvar('ifcam.gmk.gmk' gmkvalue '.robPose[3]')
robpose[4] = getvar('ifcam.gmk.gmk' gmkvalue '.robPose[4]')
robpose[5] = getvar('ifcam.gmk.gmk' gmkvalue '.robPose[5]')
gmktimestamp = getvar('ifcam.gmk.gmk' gmkvalue '.time')

gmkready = true canrequestnewgmk = false timeoutcount = 0 </block> </block> else <block> if(canrequestnewgmk) <block> if(!global.gmklocalize.silent) print("timeoutcount = " timeoutcount" / "timeoutlimit) timeoutcount = timeoutcount + 1 if(timeoutcount > timeoutlimit) <block> if(!global.gmklocalize.silent) print("Timed out while waiting for guidemark. New guidemark requested.") global.gmklocalize.run = true timeoutcount = 0 </block> </block> </block> </rule> <rule name="handlegmk" if="gmkready & gmkvalue >= 0"> <!-- Complete desired actions with this new guidemark --> <!-while (currpose[3] <= gmktimestamp) <block> if(!global.gmklocalize.silent) <block> print("mappose not yet updated. Waiting...") print("( "currpose[3]" <= "gmktimestamp" )") </block> currpose = mappose.pose </block> --> gmkworldpose[0] = mapbase.connectorGlobal(gmkname gmkvalue,'mappose.x') gmkworldpose[1] = mapbase.connectorGlobal(gmkname gmkvalue,'mappose.y') gmkworldpose[2] = mapbase.connectorGlobal(gmkname gmkvalue,'mappose.th') <!-- coordinate transformation for position --> x0 = gmkworldpose[0] y0 = gmkworldpose[1] th0= gmkworldpose[2] x = robpose[0] y = robpose[1] th= robpose[5]


resx = cos(th0)*x-sin(th0)*y+x0 resy = sin(th0)*x+cos(th0)*y+y0 resth= th + th0 odopose.poseattime(gmktimestamp) ododeltastart[0] = odopose.calcpose[0] ododeltastart[1] = odopose.calcpose[1] ododeltastart[2] = odopose.calcpose[2] ododeltaend = odopose.pose

x = ododeltaend[0] - ododeltastart[0]
y = ododeltaend[1] - ododeltastart[1]
th= ododeltaend[2] - ododeltastart[2]
resxdelta = cos(th0)*x-sin(th0)*y
resydelta = sin(th0)*x+cos(th0)*y
resthdelta= th
posetolocalize[0] = resx + resxdelta
posetolocalize[1] = resy + resydelta
posetolocalize[2] = limitToPi(resth + resthdelta)
if(!global.gmklocalize.silent)
<block>
print("--- printing poses for debug ---")
print("Odometry pose at time of image taken = "ododeltastart)
print("Odometry pose at current time = "ododeltaend)
print("Robot pose diff as a result (x y th) = "resxdelta" "resydelta" "resthdelta)
print("Guidemark pose in world coordinates = "gmkworldpose)
print("Robot pose relative to guidemark = "robpose)
print("Calculated pose to reset localizer = "posetolocalize)
print("Timestamp for image with guidemark = "gmktimestamp)
print("--- executing the following commands ---")
print("laser resetlocalizer")
print("laser mapobst map2localize")
print("laser setinitpose x="posetolocalize[0]" y="posetolocalize[1]" th="posetolocalize[2])
print("laser setinitcov Cx="cov[0]" Cy="cov[1]" Cth="cov[2])
print("laser localize")
print("")
</block>
core.send("laser resetlocalizer")
core.send("laser mapobst map2localize")
core.send("laser setinitpose x="posetolocalize[0]" y="posetolocalize[1]" th="posetolocalize[2])
core.send("laser setinitcov Cx="cov[0]" Cy="cov[1]" Cth="cov[2])
core.send("laser localize")
global.gmklocalize.successtime = now()
gmkready = false
</rule>
</init>
wait() : false
</rule>
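The core of gmklocalize.rule is a planar coordinate transformation: the robot pose measured relative to the guidemark is rotated and translated by the guidemark's world pose (x0, y0, th0) before the odometry delta is added. A minimal Python sketch of that step (the function name is mine; the math mirrors the resx/resy/resth lines in the rule):

```python
import math

def pose_in_world(gmk_world, rob_in_gmk):
    """Transform a robot pose given in guidemark coordinates into world
    coordinates, as done in gmklocalize.rule.
    gmk_world  = (x0, y0, th0): guidemark pose in the world frame
    rob_in_gmk = (x, y, th): robot pose relative to the guidemark"""
    x0, y0, th0 = gmk_world
    x, y, th = rob_in_gmk
    resx = math.cos(th0) * x - math.sin(th0) * y + x0
    resy = math.sin(th0) * x + math.cos(th0) * y + y0
    resth = th + th0
    return resx, resy, resth

# Example: guidemark at (1, 2) rotated 90 deg; robot 1 m in front of it.
rx, ry, rth = pose_in_world((1.0, 2.0, math.pi / 2), (1.0, 0.0, 0.0))
```

The same rotation (without the translation) is applied to the odometry delta accumulated since the image was taken, so the pose handed to the laser localizer is valid at the current time, not at the image timestamp.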


carthandling.rule
<?xml version="1.0" ?> # Procedure for picking up a cart with a guidemark on it # Cart with guidemark should be in front of the robot when rule is activated # Guidemark should be defined in mapbase as a mobile guidemark # Please set global configuration variables to fit needs <rule name="carthandling" run="true"> <init> # Global variables global.carthandling.pickup = false global.carthandling.dropoff = false global.carthandling.silent = false global.carthandling.carryingcartnow = 0 global.carthandling.pickupsuccesstime = -1 global.carthandling.dropoffsuccesstime = -1 # #########Configuration variables (some are global to allow change on the fly)######### # Do search of gmk and cart if not found from initial position (true/false) global.carthandling.searchforgmk = true # How many attempts to dock should be made before giving up global.carthandling.dockretries = 10 #how far directly in front of the cart should the robot move to initially (in meters, 1.0m is suggested) global.carthandling.initialpickupdist = 0.65 # Recalibrate kinects before each pickup using tiltkinectcalibratefront.rule and tiltkinectcalibrateback.rule (true/false) recalibratekinectsactive = true # Pool image numbers to look for guidemarks in, and respective device numbers poolimgfront = 30 gmkdevicefront = 18 poolimgback = 31 gmkdeviceback = 28 # Guidemark rectangle sizes in m blocksize = 0.025 # How many run through of the rule should be completed before requesting new guidemark (if none found)--> timeoutlimit = 3 # How many times to get and attempt to verify gmk before moving gmkgetattemptlimit = 6 # How many degrees should be rotated extra every turn rotateincrement = 5 # how large a total angle should the robot search in? (120 would mean a search from -60 to 60 degrees) searchwidthangle = 120


# Movement speed of all docking actions taken (double between 0 and 1) speedmodifier = 1.0 # The distance from the middle of the robot to the end of the magnet midtomagnet = 0.39 # The distance to move further when docking to the cart, than what is just precisely needed extradockdist = 0.02 # The speed at which the robot should approach the cart in the final stage of a docking dockspeed = 0.08 # How may times should the robot retry docking before doing a 180 turn and starting over? maxdockattempts = 2 # Seconds to wait after turning off magnet, before cart can be assumed to have disconnected dropwait = 2 ######### Local internal variables (do not modify)######## doinit = 1 dokinectcalibrations = 1 gmkready = false dosearchmove = false searchmove = 0 stage = 1 dockattempts = 0 ask4gmk = true canrequestnewgmk = false gmkname = 'guidemark' gmkvalue = -1 lastgmkimgserial = 0 gmkimgserial = 0 gmktimestamp = 0 poolimg = 0 lastpoolimg = 0 gmkdevice = 0 IDindex = 0 gmkcrc = 0 gmkinmap = -1 gmkstationary = 0 gmkrequests = 0 rotateval = 0 startcalibratetime = 0 maxspeed = 0.3 minspeed = 0.07 turnvel = (minspeed + (maxspeed-minspeed)*speedmodifier)*0.6 movevel = minspeed + (maxspeed-minspeed)*speedmodifier timeoutcount = 0.0


robpose[0] = 0.0
robpose[1] = 0.0
robpose[2] = 0.0
robpose[3] = 0.0
robpose[4] = 0.0
robpose[5] = 0.0

xpose1 = 0 xrob = 0 yrob = 0 pickupdist1 = 0 pickupturn1 = 0 pickupturn2 = 0 alpha = 0 theta = 0 Pi = 3.1416 atan2angle = 0 atan2angle2 = 0 thetaoffset = 0 dockstatus = 0 dockattempts = 1 dockdisttomove = 0 # Make sure magnet is turned off smr.do("setoutput \digitalout\ 0") <rule name="askgmk" if="global.carthandling.pickup & !gmkready & ask4gmk" > # Take an image and get gmk information gmkvalue = -1 lastgmkimgserial = ifcam.gmk.imageserial[1] lastpoolimg = poolimg if(stage > 1) <block> poolimg = poolimgback gmkdevice = gmkdeviceback </block> else <block> poolimg = poolimgfront gmkdevice = gmkdevicefront </block> print("Asked for new guidemark from "gmkdevice) core.send('cam gmk img='poolImg' block='blocksize) if(lastpoolimg == poolimg) <block> ask4gmk=false canrequestnewgmk = true </block> else <block> smr.do("wait 0.5") smr.putEvent(100) wait() : smr.gotEvent(100) </block> </rule> <rule name="verifygmk" if="!gmkready & canrequestnewgmk" >


# Evaluate if there is a new guidemark ready gmkimgserial = ifcam.gmk.imageserial[1] if(gmkimgserial != lastgmkimgserial & gmkdevice == ifcam.gmk.imageserial[0] & ifcam.gmk.count > 0) <block> print("accepted new guidemark from "gmkdevice) lastgmkimgserial = gmkimgserial if(!global.carthandling.silent) print("Found " ifcam.gmk.count " new guidemark(s) with ID(s): "ifcam.gmk.IDs" in image with serial: " gmkimgserial) # go through all detected guidemarks, or until one is defined in map IDindex = 0 gmkcrc = 0 gmkinmap = -1 gmkstationary = 1 while ((gmkcrc < 1 | gmkinmap < 0 | gmkstationary > 0) & IDindex <= ifcam.gmk.count-1) <block> gmkvalue = getvar('ifcam.gmk.IDs['IDindex']') gmkinmap = mapbase.connectorGlobal(gmkname gmkvalue,'id.value') gmkstationary = mapbase.connectorGlobal(gmkname gmkvalue,'stationary.bool') gmkcrc = getvar('ifcam.gmk.gmk' gmkvalue '.crcOK') IDindex = IDindex + 1 </block> if(gmkcrc > 0 & gmkinmap >=0 & gmkstationary < 1) <block> robpose[0] = getvar('ifcam.gmk.gmk' gmkvalue '.robPose[0]') robpose[1] = getvar('ifcam.gmk.gmk' gmkvalue '.robPose[1]') robpose[2] = getvar('ifcam.gmk.gmk' gmkvalue '.robPose[2]') robpose[3] = getvar('ifcam.gmk.gmk' gmkvalue '.robPose[3]') robpose[4] = getvar('ifcam.gmk.gmk' gmkvalue '.robPose[4]') robpose[5] = getvar('ifcam.gmk.gmk' gmkvalue '.robPose[5]') gmktimestamp = getvar('ifcam.gmk.gmk' gmkvalue '.time') if(!global.carthandling.silent) print("chose guidemark with ID: "gmkvalue) gmkready = true canrequestnewgmk = false timeoutcount = 0 rotateval = 0 </block> </block> else <block> if(canrequestnewgmk) <block> if(!global.carthandling.silent) print("timeoutcount = " timeoutcount" / "timeoutlimit) timeoutcount = timeoutcount + 1 if(timeoutcount > timeoutlimit) <block> if(!global.carthandling.silent) print("Timed out while waiting for guidemark. New


guidemark requested.")
timeoutcount = 0
ask4gmk = true
gmkrequests = gmkrequests + 1
</block>
if(gmkrequests > gmkgetattemptlimit & global.carthandling.searchforgmk)
<block>
if(!global.carthandling.silent)
print("Max get-guidemark tries reached. Moving to look in new direction.")
dosearchmove = true
gmkrequests = 0
ask4gmk = false
canrequestnewgmk = false
</block>
</block>
</block>
</rule>

<rule name="searchforgmk" if="dosearchmove & global.carthandling.searchforgmk & stage == 1" >
if(rotateval < 0)
rotateval = (rotateval - rotateincrement ) * -1
else
rotateval = (rotateval + rotateincrement ) * -1
if(rotateval >= searchwidthangle)
<block>
print("Could not find any carts to pick up. Moving on...")
dosearchmove = false
global.carthandling.pickup = false
global.carthandling.failedtime = now()
break
</block>
if(!global.carthandling.silent)
print("Turning " rotateval " degrees")
smr.do("turn " rotateval " @v" turnvel "")
smr.do("wait 1")
smr.putEvent(101)
wait() : smr.gotEvent(101)
dosearchmove = false
ask4gmk = true
</rule>

<rule name="moveinposition" if="gmkready & stage == 1" >
# move in front of cart and point magnet at gmk
# Make sure magnet is turned off
smr.do("setoutput \digitalout\ 0")
xpose1 = global.carthandling.initialpickupdist
theta = robpose[5]
xrob = robpose[0]
yrob = robpose[1]
alpha = atan((xrob-xpose1)/yrob)


pickupdist1 = sqrt((xrob-xpose1)*(xrob-xpose1)+yrob*yrob) #calculate the first and second turn angle if(theta >=0) <block> pickupturn1 = (theta - pi/2 + alpha) * -1 pickupturn2 = (theta + alpha) * -1 </block> else <block> pickupturn1 = (theta + pi/2 - alpha) * -1 pickupturn2 = (theta - alpha) * -1 </block> atan2angle = atan2(yrob, xrob-xpose1) if(theta < 0) atan2angle2 = atan2angle - pi-theta else atan2angle2 = atan2angle + pi-theta if(!global.carthandling.silent) <block> print("Robot pose in gmk coordinates = "robpose) print("Distance to initial position = "pickupdist1" m") print("Angle to turn = "atan2angle2" radians") </block> if(pickupdist1 > 0.05) <block> smr.do("turn "atan2angle2" \rad\ @v"turnvel) smr.do("ignoreobstacles") smr.do("fwd "pickupdist1" @v"movevel) </block> smr.do("turn "-theta-atan2angle2" \rad\ @v"turnvel) smr.putEvent(102) wait() : smr.gotEvent(102) gmkready = false ask4gmk = true stage = 2 </rule> <rule name="dock" if="gmkready & stage == 2" > xpose1 = global.carthandling.initialpickupdist dockdisttomove = robpose[0] - midtomagnet + extradockdist print("robpose[0] = "robpose[0]) thetaoffset = -robpose[5] if(!global.carthandling.silent) print("thetaoffset = "thetaoffset" (max = 0.03)") if(abs(thetaoffset) > 0.05) #large offset <block> smr.do("turn "thetaoffset" \rad\ @v"minspeed/3) smr.putEvent(103) wait() : smr.gotEvent(103) gmkready = false ask4gmk = true </block> else


<block> if(!global.carthandling.silent) print("Distance between cart and magnet (including extradockdist) = "dockdisttomove) smr.do("setoutput \digitalout\ 1") smr.do("fwd "-dockdisttomove" @v"dockspeed) smr.do("wait 0.5") smr.do("turn "-0.03" \rad\ @v"minspeed/2) smr.do("turn "0.06" \rad\ @v"minspeed/2) smr.do("turn "-0.03" \rad\ @v"minspeed/2) smr.do("fwd 0.02") smr.putEvent(104) wait() : smr.gotEvent(104) dockstatus = smr.eval("$digital[1]") if(dockstatus < 1) <block> if(!global.carthandling.silent) print("Docking failed - microswitch not pressed. Trying again") smr.do("setoutput \digitalout\ 0") smr.do("wait 1") smr.do("fwd "dockdisttomove" @v"movevel) smr.putEvent(105) wait() : smr.gotEvent(105) if(dockattempts >= maxdockattempts) <block> smr.do("turn 180 @v"turnvel) smr.do("fwd -0.1 @v"dockspeed) smr.putEvent(106) wait() : smr.gotEvent(106) dockattempts = 1 print("SETTING STAGE=1") stage = 1 </block> else dockattempts = dockattempts + 1 gmkready = false ask4gmk = true print("flags are now: gmkready = "gmkready" , ask4gmk = "ask4gmk) </block> else <block> if(!global.carthandling.silent) print("Docking success! - global variable 'carryingcartnow' updated") global.carthandling.carryingcartnow = 1 dockattempts = 1 stage = 3 </block> </block> </rule> <rule name="finished" if="gmkready & stage == 3" > global.carthandling.pickup = false gmkready = false ask4gmk = true dosearchmove = false


stage = 1 global.carthandling.pickupsuccesstime = now() </rule> <rule name="dropcart" if="global.carthandling.dropoff" > smr.do("setoutput \digitalout\ 0") smr.do("wait "dropwait) smr.putEvent(107) wait() : smr.gotEvent(107) if(!global.carthandling.silent) print("Dropoff successful!") global.carthandling.carryingcartnow = 0 global.carthandling.dropoffsuccesstime = now() global.carthandling.dropoff = false </rule> <rule name="calibratekinects" if="dokinectcalibrations & recalibratekinectsactive" > startcalibratetime = now() global.tiltkinectcalibratefront.run = true wait() : (now() - global.tiltkinectcalibratefront.successtime < 1.0 | now()startcalibratetime > 3) # wait for calibration if(now() - global.tiltkinectcalibratefront.successtime < 1.0) <block> if(!global.carthandling.silent) print("front-kinect successfully calibrated!") </block> else <block> if(!global.carthandling.silent) print("front-kinect calibration failed!") </block> startcalibratetime = now() global.tiltkinectcalibrateback.run = true wait() : (now() - global.tiltkinectcalibrateback.successtime < 1.0 | now()startcalibratetime > 3) # wait for calibration if(now() - global.tiltkinectcalibrateback.successtime < 1.0) <block> if(!global.carthandling.silent) print("back-kinect successfully calibrated!") </block> else <block> if(!global.carthandling.silent) print("back-kinect calibration failed!") </block> dokinectcalibrations = 0 </rule> <rule name="printsettings" if="doinit" > #Print init data print("--------------------------carthandling.rule-------------------------") print("Rule initialized with settings:") print("") print(" front kinect img: "poolimgfront)


print(" back kinect img: "poolimgback) print(" drive velocity: "movevel" m/s") print(" turn velocity: "turnvel" m/s") print(" pickup retries: "global.carthandling.dockretries) print(" initial distance: "global.carthandling.initialpickupdist" m") if(global.carthandling.searchforgmk) <block> print(" gmk-search: ON - "gmkgetattemptlimit" attempts to get guidemark before moving") print(" - search width: +-"searchwidthangle/2" degrees") print(" - increment size: "rotateincrement" degrees") </block> else <block> print(" gmk-search: OFF") </block> print("") if(now() - global.tiltkinectcalibratefront.successtime + now() - global.tiltkinectcalibrateback.successtime < 10) print(" Front and back kinects successfully calibrated!") print("--------------------------------------------------------------------") doinit = 0 </rule> </init> wait() : false </rule>
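The "moveinposition" rule above reduces the approach to a turn/drive/turn sequence: from the robot pose measured in guidemark coordinates it computes the distance to a point `initialpickupdist` metres straight in front of the cart and the heading corrections via atan2. A Python sketch of that geometry (function name mine; sign conventions copied from the rule, not re-verified on the robot):

```python
import math

def approach_plan(xrob, yrob, theta, xpose1=0.65):
    """Mirror of the atan2-based plan in carthandling.rule's 'moveinposition':
    xrob, yrob, theta = robot pose in guidemark coordinates
    xpose1            = desired stand-off distance in front of the cart
    Returns (first turn, drive distance, second turn)."""
    dist = math.hypot(xrob - xpose1, yrob)          # pickupdist1 in the rule
    a = math.atan2(yrob, xrob - xpose1)             # atan2angle in the rule
    if theta < 0:
        turn1 = a - math.pi - theta                 # atan2angle2 (theta < 0)
    else:
        turn1 = a + math.pi - theta                 # atan2angle2 (theta >= 0)
    turn2 = -theta - turn1                          # final align towards the gmk
    return turn1, dist, turn2

t1, d, t2 = approach_plan(1.0, 0.0, 0.0)
```

Note that the two turns always sum to -theta, so after the sequence the robot faces the guidemark again regardless of its initial heading.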


dockdemo.rule
<?xml version="1.0" ?> <!-- simple docking demonstration for 326 <rule name="dockdemo" run="true"> <init> global.dockdemo.run = false global.dockdemo.loops = 0 global.dockdemo.done = true highspeed = 0.2 lowspeed = 0.1 speed = 0 currx = 0 curry = 0 currth = 0 pi = 3.1416 smr.do("laser \push cmd='localize' t=1\ ") smr.do("laser \push cmd='mrcobst width=0.8' t=0.5\ ") <rule name="speedadjust" if="true" > if(global.carthandling.carryingcartnow) speed = lowspeed else speed = highspeed </rule> <rule name="loop" if="(global.dockdemo.done) and global.dockdemo.run" > global.dockdemo.done = false global.dockdemo.loops = global.dockdemo.loops + 1 global.gmklocalize.run=1 wait() : (now() - global.gmklocalize.successtime < 1.0) # wait for localization smr.do("turn -90 @v"speed) smr.do("drivew 3.2 0.9 180 @v"speed" : ($targetdist < 0.15)") smr.do("fwd 0.15 @v"speed) smr.do("turn -90 @v"speed) smr.putEvent(15) wait() : smr.gotEvent(15) global.carthandling.pickup = true wait() : (now() - global.carthandling.pickupsuccesstime < 1.0) #smr.do("drivew 3.2 0.6 -90 @v0.1 : ($targetdist < 0.15)") currx = mappose.pose[0] curry = mappose.pose[1] currth = mappose.pose[2] smr.do("turn "-pi/2-currth" \rad\ @v"speed) smr.do("fwd "curry-1.10" @v"speed) smr.do("turn "-currth" \rad\ @v"speed*0.6) smr.do("fwd "-(currx-1.30)" @v"speed) smr.putEvent(16) wait() : smr.gotEvent(16) global.carthandling.dropoff = true wait() : (now() - global.carthandling.dropoffsuccesstime < 1.0) -->


smr.do("drive 0.3 @v"speed" : ($drivendist > 0.3)") smr.do("turnr 0.5 -160 @v"speed) smr.do("drive 0.1 @v"speed" : ($drivendist > 0.1)") smr.do("fwd 0.1") smr.putEvent(17) wait() : smr.gotEvent(17) global.gmklocalize.run=1 wait() : (now() - global.gmklocalize.successtime < 1.0) # wait for localization #smr.do("turn 130 @v" -speed) smr.do("drivew 2.2 0.9 90 @v"speed" : ($targetdist < 0.15)") smr.do("fwd 0.15 @v"speed) smr.do("turn 90 @v"speed) smr.putEvent(18) wait() : smr.gotEvent(18) global.carthandling.pickup = true wait() : (now() - global.carthandling.pickupsuccesstime < 1.0) currx = mappose.pose[0] curry = mappose.pose[1] currth = mappose.pose[2] smr.do("turn "-currth" \rad\ @v"speed) smr.do("fwd "3.25-currx" @v"speed) smr.do("turn "-pi/2-currth" \rad\ @v"speed*0.6) smr.do("fwd "-(1.45-curry)" @v"speed) smr.putEvent(19) wait() : smr.gotEvent(19) global.carthandling.dropoff = true wait() : (now() - global.carthandling.dropoffsuccesstime < 1.0) smr.do("fwd 0.2 @v"speed) smr.do("turn 110 @v"speed) smr.do("fwd 1.2 @v"speed) smr.do("wait 1") smr.putEvent(20) wait() : smr.gotEvent(20) global.carthandling.dropoff = true global.dockdemo.done = true print("Mission done - loop " global.dockdemo.loops) </rule> print("---------Dockdemo initialized----------") </init> wait() : false </rule>


DTU326-map.xml
<?xml version="1.0" ?> <!-Graph describing room 011 in building 326 Coordinate system: Origo is defined at guidemark 30 located on the east wall exact is right below the guidemark, close to the wall. The x-axis goes straight across the room to the west wall The y-axis follows the wall on which the guidemark is placed (positive away from the windows) The names of lines are given from actual geographical direction eg. the windows are are placed in the north wall In the primary map, only wall elements are defined. Lines from equipment and artificial ones can be added if needed, but we hope not to (as they are moved all the time...) --> <graph name="DTU byg. 326"> <!-- Waypoint nodetype --> <nodetype name="waypoint"> <dimension shape="circle" radius="0.20"/> <connector name="n" x="0.10" y="0.0" th="0.0" autoconnect="1"/> <connector name="w" x="0.0" y="0.10" th="90.0" autoconnect="1"/> <connector name="s" x="-0.10" y="0.00" th="180.0" autoconnect="1"/> <connector name="e" x="-0.00" y="-0.10" th="270.0" autoconnect="1"/> <connector name="nw" x="0.075" y="0.075" th="45.0" autoconnect="1"/> <connector name="sw" x="-0.075" y="0.075" th="135.0" autoconnect="1"/> <connector name="se" x="-0.075" y="-0.075" th="225.0" autoconnect="1"/> <connector name="ne" x="0.075" y="-0.075" th="315.0" autoconnect="1"/> </nodetype> <!-- Nodes definitions --> <node name="room011node" x="0.0" y="0.0" th="0.0"> <dimension shape="rect" left="9.5" top="3.3"/> <!-- Door connectors --> <!-- Guidemark definitions --> <connector name="guidemark30" x="0.0" y="0.0" z="0.30" th="0.0"> <id value="30"/> <stationary bool="1"/> </connector> <connector name="guidemark32" x="5.80" y="2.66" z="0.32" th="3.14159"> <id value="32"/> <stationary bool="1"/> </connector> <connector name="guidemark81" x="0.0" y="0.0" z="0.0" th="0.0"> <id value="81"/> <stationary bool="0"/> </connector>


<!-- Nodes definitions -->
<node name="N1" x="2.6" y="0.0" th="0.0" type="waypoint"/>
<node name="N2" x="1.4" y="0.0" th="0.0" type="waypoint"/>
<node name="N3" x="3.0" y="1.9" th="0.0" type="waypoint"/>
<node name="N4" x="4.5" y="2.0" th="0.0" type="waypoint"/>

<!-- Route definitions -->
<edge start="N1.w" end="N2.e"/>
<edge start="N2.ne" end="N3.sw"/>
<edge start="N3.e" end="N4.w"/>
<edge start="N4.sw" end="N1.ne"/>
</node> <!-- End of building testmap node -->
</graph>


Guidebot Joystick Navigation Guide v1.0
[Figure: joystick button-mapping diagram; remaining caption text unrecoverable]
(A&B) Toggle joystick control
(B) Modify direction
(A) Move and turn
(B) Move forward
(B) Move backwards
17/08-11, Mikkel Viager