
DESIGN, SIMULATION AND FABRICATION

OF A 5-DOF ROBOTIC ARM

(with implementation of inverse kinematics)

BY

ORIDATE Ademola Ayodeji


(MEE/2008/071)

A THESIS SUBMITTED IN PARTIAL FULFILMENT

OF THE REQUIREMENTS FOR THE AWARD OF

BACHELOR OF SCIENCE DEGREE

IN MECHANICAL ENGINEERING

OBAFEMI AWOLOWO UNIVERSITY ILE-IFE, NIGERIA.

May, 2014

CERTIFICATION

This is to certify that this report on Design, Simulation and Fabrication of a

5-DoF Robotic Arm (with implementation of inverse kinematics) was submitted by

ORIDATE Ademola Ayodeji (MEE/2008/071) in partial fulfilment of the

requirements for the award of Bachelor of Science Degree in Mechanical

Engineering, Obafemi Awolowo University Ile-Ife, Nigeria.

Dr. A.O. Oluwajobi

(Supervisor)

............................

Date

Dr. A.O. Oluwajobi

(Head of Department)

.....................................

Date

DEDICATION

This work is dedicated to the saviour of my soul.

ACKNOWLEDGEMENTS

Firstly, my appreciation goes to God Almighty for seeing me thus far and for His

unending support throughout the course of this project.

I would like to appreciate my mother, Mrs. M.O. Oridate, for her moral and

financial support.

My appreciation also goes to my supervisor, Dr. A.O. Oluwajobi for his counsel and

encouragement.

I would like to express my gratitude to my uncles, Mr. Soretire and Mr. Soretire for

their technical support during the fabrication stage of this project.

I would also like to appreciate Mr. Akinwale of the Department of Electronic and

Electrical Engineering for his thorough work on the kinematics of a robotic arm,

which proved very useful in the course of this project.

Lastly, I would like to appreciate my good friend, Itimi Richard for his assistance with

the programming of the robotic arm.

Table of Contents
CERTIFICATION ......................................................................................................... ii

DEDICATION .............................................................................................................. iii

ACKNOWLEDGEMENTS .......................................................................................... iv

LIST OF FIGURES .................................................................................................... viii

LIST OF TABLES ......................................................................................................... x

ABSTRACT .................................................................................................................. xi

CHAPTER ONE ............................................................................................................ 1

INTRODUCTION ......................................................................................................... 1

1.1 Background .......................................................................................................... 1

1.2 Classification and uses of robots .......................................................................... 1

1.3 Robotic Manipulators (The Robotic Arm) ........................................................... 3

1.3.1 Links and joints ............................................................................................. 4

1.3.2 Actuators ....................................................................................................... 4

1.3.3 Controller ...................................................................................................... 5

1.3.4 End-effector .................................................................................................. 6

1.3.5 Sensor ............................................................................................................ 6

1.4 Applications of the Robotic Arm ......................................................................... 7

1.5 Project Aim .......................................................................................................... 8

1.6 Project Objectives ................................................................................................ 9

1.7 Justification for the Project .................................................................................. 9

1.8 Scope .................................................................................................................. 10

CHAPTER TWO ......................................................................................................... 12

LITERATURE REVIEW ............................................................................................ 12

2.1 The Field of Robotics......................................................................................... 12

2.2 Robots ................................................................................................................ 13

2.3 The Robotic Arm ............................................................................................... 15

2.4 The Denavit-Hartenberg (D-H) Convention ..................................................... 36

CHAPTER THREE ..................................................................................................... 39

METHODOLOGY ...................................................................................................... 39

3.1 Choice of Material ............................................................................................. 39

3.2 Implementation Plan .......................................................................................... 40

3.3 Robotic Arm Structure ....................................................................................... 40

3.4 Kinematic Analysis ............................................................................................ 40

3.4.1 Kinematic analysis using Denavit-Hartenberg convention ......................... 45

3.4.2 Kinematic Analysis and Simulation using MATLAB® .............................. 52

3.5 Determination of Torque Requirement of Joints ............................................... 64

3.6 Motor Selection .................................................................................................. 76

3.6.1 Gripper joint ................................................................................................ 77

3.6.2 Wrist joint ................................................................................................... 79

3.6.3 Elbow Joint ................................................................................................. 79

3.6.4 Shoulder Joint ............................................................................................. 79

3.6.5 Base Joint .................................................................................................... 80

3.7 Servo Controller Selection ................................................................................. 80


3.8 Interfacing Software........................................................................................... 84

3.9 Final Arm Model................................................................................................ 86

3.10 Cost .................................................................................................................. 86

CHAPTER FOUR ...................................................................................................... 100

RESULTS AND DISCUSSION ................................................................................ 100

CHAPTER FIVE ....................................................................................................... 112

CONCLUSION AND RECOMMENDATIONS ...................................................... 112

REFERENCES .......................................................................................................... 113

APPENDIX 1 ............................................................................................................. 117

APPENDIX 2 .............................................................................................................. 132

APPENDIX 3 .............................................................................................................. 147

LIST OF FIGURES

Figure 2. 1: ASIMO demonstrates its ability to climb stairs at Auto Zurich Car Show

...................................................................................................................................... 14

Figure 2. 2: Omnidirectional mobile home-care robot ................................................ 23

Figure 2. 3: Model of the robotic arm and the complete robot system by Jenne ......... 25

Figure 2. 4: Final TETRIX robot arm and hand assembly by Blackmore et al. .......... 28

Figure 2. 5: RA-01 robotic arm .................................................................................... 30

Figure 2. 6: Adebola's Improvised robotic arm in operation ....................................... 33

Figure 2. 7: Herman's robotic arm built with standard bricks, before being mounted on

a Turtlebot™ prototype ................................................................................................ 35

Figure 2. 8: Robotic arm for picking beams for static bending test ............................. 37

Figure 3. 1: Implementation flow chart of robotic arm................................................ 41

Figure 3. 2: Software flowchart for the design of the robotic arm............................... 42

Figure 3. 3: Schematic diagram of robotic arm setup .................................................. 43

Figure 3. 4: CAD model of proposed robotic arm ....................................................... 44

Figure 3. 5: Attachment of frames to the links of a robotic arm ................................. 46

Figure 3. 6: Schematic of robotic arm showing chosen joint axes .............................. 48

Figure 3. 7: Robotic arm pose corresponding to joint angle set qn ............................. 55

Figure 3. 8: Robotic arm pose corresponding to joint angle set qs .............................. 56

Figure 3. 9: Robotic arm pose corresponding to joint angle set qz.............................. 57

Figure 3. 10: Robotic arm pose corresponding to joint angle set qr ............................ 58

Figure 3. 11: Illustration of the use of teach method for simulating a robotic arm ..... 60

Figure 3. 12: Labelling of lengths of the robotic arm links ......................................... 65

Figure 3. 13: Free-body-diagram of robot arm in stretched out pose .......................... 66

Figure 3. 14: Autodesk® Inventor® Professional iproperties™ displaying the moments

of inertia of link L2 ....................................................................................................... 70

Figure 3. 15: Diagram to illustrate the Parallel Axis Theorem .................................... 73

Figure 3. 16: Components of a standard servo ............................................................ 78

Figure 3. 17 : Pictures of selected servos for the robot arm......................................... 82

Figure 3. 18: Pololu's® Micro Maestro 6-channel servo controller ............................ 85

Figure 3. 19: Final model of the robotic arm ............................................................... 87

Figure 3. 20: Final model of the robotic arm (upright pose)........................................ 88

Figure 3. 21: Components of the robotic arm .............................................................. 98

Figure 4. 1 : Robotic arm in upright pose .................................................................. 101

Figure 4. 2 : Set-up for the centring of servo horn ..................................................... 102

Figure 4. 3 : GUI of interfacing software for controlling robotic arm ....................... 104

Figure 4. 4 : Setup for measuring coordinates of gripper position ............................ 105

Figure 4. 5 : GUI of interfacing software for controlling robotic arm (pick-and-place

task) ............................................................................................................................ 108

Figure 4. 6 : Robotic arm performing a pick-and-place task (a) ................................ 109

Figure 4. 7 : Robotic arm performing a pick-and-place task (b)................................ 110

Figure 4. 8 : GUI of alternate interfacing software for controlling robotic arm (using

sliders) ........................................................................................................................ 111

LIST OF TABLES

Table 1. 1 : Rough Cost Estimate of Robotic Arm ...................................................... 11

Table 2. 1 : Linguistic rules for fuzzy logic control of robotic arm ............................. 21

Table 3. 1 : D-H parameters for the Robotic Arm ....................................................... 49

Table 3. 2 : List of selected servos and their specifications ......................................... 81

Table 3. 3 : Cost Estimate of Robotic Arm .................................................................. 99

Table 4. 1 : Coordinate input test result for robotic arm ............................................ 106

ABSTRACT

This report contains the design and implementation of a 5-DoF (Degree of Freedom)

robotic arm. The robotic arm is intended for educational purposes. Bearing in mind

that a similar project has been previously undertaken in the department, this project

work goes a step further by redesigning the robotic arm for improved accuracy by

using servos to power the joints as well as implementing the inverse kinematics of the

robotic arm.

A simulation of the robotic arm to visualise the joint movements using Robotics

Toolbox for Matlab® is also covered in detail.

The report covers the procedure for selection of the servos used to power each joint of

the arm in detail. Polymethyl methacrylate was selected to fabricate the components

of the robotic arm. The torque exerted at each of the joints was calculated and a servo

with the required torque rating was selected for each joint.

A suitable servo controller was selected for the project and control software for the

robotic arm was developed using Microsoft's C# programming language. The software

allows the gripper of the robotic arm to be positioned in space by specifying the

coordinates of its centre position.

The software was tested by specifying various coordinates for the gripper with

reference to the robot's base and measuring the corresponding coordinates of the

centre position of the gripper. Satisfactory results were obtained from the project.

CHAPTER ONE

INTRODUCTION

1.1 Background

The first usage of the term 'robot' was in a 1921 Czech science fiction play -

'Rossum’s Universal Robots' - by Karel Capek. The robots were artificial people or

androids, and the word was derived from 'robota', a Czech word for slave

(Corke, 2011).

A question of perpetual interest is how to define a robot. Since the beginning of the study

of robotics, there has been some controversy in the definition of a robot. So long as

the evolution of robotics continues, the definition of the robot will change from time

to time, depending on the technological advances in its sensory capability and level of

intelligence (Deb, 2010).

However, the most widely accepted definition of a robot was given by the Robotic

Institute of America (RIA) in 1979. The RIA defines a robot as “a reprogrammable,

multifunctional manipulator designed to move materials, parts, tools, or specialized

devices through various programmed motions for the performance of a variety of

tasks” (Diegel et al., 2005).

1.2 Classification and uses of robots

Various criteria could be used as bases for the classification of robots, but two modes

of classification are important from the point of view of this project.



Robots could be classified based on their kinematics and locomotion method. Based

on this mode of classification, the two major types of robots are stationary robots

(including robotic arms with a global axis of movement) and mobile robots. Mobile

robots can be further classified as wheeled robots, tracked robots, legged robots,

swimming robots and flying robots (International Federation of Robotics, 2003).

Robots could also be classified based on their uses or applications. Based on this

mode of classification, robots could be classified as:

• Industrial robots – robots employed in an industrialized manufacturing

environment. Typically, they are robotic arms particularly created for

applications like material handling, painting and welding.

• Medical robots – robots employed in medicine and medicinal institutes. A

typical example is the da Vinci Surgical System - a robotic surgical system.

• Service robots – robots which operate semi- or fully autonomously to perform

services useful to the well-being of humans and equipment, excluding

manufacturing operations. Examples include lawn mowing robots, vacuum

cleaning robots and sewer robots.

• Military robots – robots employed by the military and armed forces. They

consist of bomb discarding robots, shipping robots and exploration drones.

• Entertainment robots – robots employed for entertainment. They include

model robots such as robosapien or the running photo frames as well as

articulated robotic arms employed as movement simulators.

• Space robots – robots employed in space exploration and other space activities.

A common example is the Mars Exploration Rover (International Federation of

Robotics, 2003).

1.3 Robotic Manipulators (The Robotic Arm)

Of all robotic mechanical systems, manipulators have always been given special

attention for various reasons. One reason is that, in their simplest form, as robotic

arms, they occur most frequently in industry. Another is that the architecture of

robotic arms constitutes the simplest of all robotic architectures and hence appears

as a constituent of other, more complex robotic mechanical systems (Angeles, 2007).

A manipulator, in general, is a mechanical system aimed at manipulating objects.

Manipulating, in turn, means to move something with one's hands, as the word is

derived from the Latin word 'manus', meaning hand. The basic idea behind the

foregoing concept is that hands are among the organs that the human brain can control

mechanically with the highest accuracy, as the work of an accomplished guitar player

or a surgeon can attest (Angeles, 2007).

Robotic manipulators resembling the human arm are known as robotic arms. They

consist of structurally robust links coupled by either

rotational joints (also referred to as revolute joints) or translating joints (also referred

to as prismatic joints). These structures are a concatenation of links, thereby forming

an open kinematic chain (Angeles, 2007).

A robotic arm is thus a type of mechanical arm, usually programmable, with similar

functions to a human arm. It may be the sum total of the mechanism or may be part of

a more complex robot (Patil et al., 2009).

Many have onboard controllers or translators to simplify communication, though they

may be controlled directly or in any number of ways. Due to this fact, standalone arms

are often classified as full robots. Notable examples of robotic arms are the SCARA -

Selective Compliance Articulated Robot Arm and the Unimation Puma 560.

A typical robotic arm has the following components:

• Links and joints

• Actuators

• Controller

• End-effector

• Sensor (not present in some robots)

1.3.1 Links and joints

A link is considered as a rigid body that defines the relationship between two

neighbouring joint axes of a manipulator. Manipulators consist of rigid links, which

are connected by joints that allow relative motion of neighbouring links. The links

move to position the end-effector (Craig, 2005).

1.3.2 Actuators

Actuators play the same role the muscles play in the human arm - they convert stored

energy into movement. Actuators are used to move a robot’s manipulator joints. The

three basic types of actuators currently used in contemporary robots are

pneumatic, hydraulic, and electrical actuators. Pneumatic actuators employ a

pressurized gas to move the manipulator joint. They are inexpensive and simple, but

their movement is not precise (Angelo, 2012).

Hydraulic actuators on the other hand employ a pressurized liquid (typically oil) to

move the manipulator joint. They are quite common and offer very large force

capability as well as high power-to-weight ratios. The main disadvantages of

hydraulic actuators are their accompanying apparatus (fluid pumps and storage tanks)

and problems with fluid leaks (Siciliano and Khatib, 2008).



Electric motor-driven actuators provide smoother movements, can be controlled very

accurately, and are very reliable. However, these actuators cannot deliver as much

power as hydraulic actuators of comparable mass. Nevertheless, for modest power

actuator functions, electrical actuators are often preferred. The various types of

electric motors used as actuators for robotic applications are direct current (dc)

motors, stepper motors and servo motors (Angelo, 2012).

1.3.3 Controller

The controller is the main device that processes information and carries out

instructions in a robot. It is the robot's 'brain' and controls the robot's movements. It is

usually a computer of some type which is used to store information about the robot

and the work environment and to store and execute programs which operate the robot.

It contains programs, data, algorithms, logic analysis and various other processing

activities which enable the robot to perform its intended function (Carlson et al.,

2013).

Most robots incorporate computer or microprocessor-based controllers. These

controllers perform computational functions and interface with and control sensors,

grippers, tooling, and other peripheral equipment (Angelo, 2012).

Most modern servomotor actuators are designed and supplied around a dedicated

controller module from the same manufacturer. Such controllers are usually developed

around microcontrollers and common examples are the Pololu Mini Maestro 12-

Channel USB Servo Controller and Lynxmotion's SSC-32 servo controller.
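As an illustration of how such a controller is driven in practice, the following C# sketch sends a single position command using Pololu's documented compact protocol, in which a 'Set Target' command is the byte 0x84, the channel number, and the target pulse width (in quarter-microseconds) split into two 7-bit bytes. The port name and channel number here are assumptions for illustration only.

    using System;
    using System.IO.Ports;

    class MaestroDemo
    {
        // Pololu compact protocol "Set Target": 0x84, channel, then the
        // target (in quarter-microseconds) as low and high 7-bit bytes.
        static void SetTarget(SerialPort port, byte channel, ushort quarterMicros)
        {
            byte[] cmd = {
                0x84,
                channel,
                (byte)(quarterMicros & 0x7F),
                (byte)((quarterMicros >> 7) & 0x7F)
            };
            port.Write(cmd, 0, cmd.Length);
        }

        static void Main()
        {
            // "COM3" is assumed; the Maestro enumerates as a virtual COM port.
            using (var port = new SerialPort("COM3", 9600))
            {
                port.Open();
                SetTarget(port, 0, 6000);  // centre the servo on channel 0 (1500 us)
            }
        }
    }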



1.3.4 End-effector

In robotics, an end-effector is the device at the end of a robotic arm, designed to

interact with the environment. The exact nature of this device depends on the

application of the robot. Typical functions of the end-effector include grasping,

pushing and pulling, twisting, using tools, performing insertions, welding and various

types of assembly activities. Thus, the major types of robot end-effectors are:

• grippers - Grippers are the most common type of end-effectors. They can use

different gripping methods (such as vacuum or use of fingers).

• material removal tools - These include cutting, drilling and deburring tools

installed as robot tools.

• welding torches - Welding is a very popular robotic application. Welding

torches have thus become very efficient end-effectors that can be controlled in

a sophisticated way for optimized welding.

• tool changers - Tool changers are used when many different end effectors need

to be used in sequence by one robot. They are used to standardize the

interface between the robot flange and the base of the tool. They can be

manual or automatic.

Surgical robots have end-effectors that are specifically manufactured for performing

surgeries (Angelo, 2012).

1.3.5 Sensor

Sensors are physical devices that enable a robot to perceive its physical environment

in order to get information about itself and its surroundings. They allow the robotic

arm to receive feedback about its environment. They can give the robot a limited

sense of sight and sound. The sensor collects information and sends it electronically to

the robot controller.



One use of these sensors is to prevent collision with obstacles (collision detection and

avoidance). Another common application of sensors is the use of vision sensors to

give a pick and place robotic arm the capability to differentiate between items to

choose and items to ignore (Mataric, 2007).

1.4 Applications of the Robotic Arm

According to Angelo (2007), robots now play a prominent and indispensable role in

modern manufacturing. The use of robots has led to increased productivity and

improved product quality. It has helped to keep manufacturing industries viable in

high-labour cost countries.

Today, many products we buy have been assembled or handled by a robot (Corke,

2011). The most common manufacturing robot is the robotic arm.

Robotic arms are typically used in industry. Repetitive autonomous robots perform

one task repeatedly based upon predetermined movements and specifically located

objects. Start and stop commands are determined by position, acceleration,

deceleration, distance, and direction. More complex actions are executed based upon

sensor processing. If object orientation or position is unknown, arms are often paired

with machine vision and artificial intelligence to identify the object and subsequently

control the arm.

Contemporary applications of the robotic arm range from doing an accurate and

reliable job of spray-painting an automobile on an assembly line, to robotic surgery.

The da Vinci surgical robot uses robotic arms equipped with scalpels and other

instruments to more accurately target surgical objectives, allowing surgeons to use

smaller, less invasive incisions (Angelo, 2012).



Other applications of the robotic arm include:

• welding - arc welding, laser welding, plasma welding, welding automation,

resistance welding and plasma cutting;

• material handling - dispensing, injection moulding, machine loading, pick and

place, packaging, parts transfer and press tending;

• painting automation;

• fibreglass cutting;

• assembly operations;

• foundry operations.

In the automobile industry, robotic arms are used in diverse manufacturing processes

including assembly, spot welding, arc welding, machine tending, part transfer, laser

processing, cutting, grinding, polishing, deburring, testing, painting and dispensing.

Robots have proved to help automakers to be more agile, flexible and to reduce

production lead times. They can be programmed to perform precise intricate duties at

much faster speeds than any human could be expected to. They also have the

advantage of being able to withstand exposure to conditions adverse to humans such

as extreme heat and presence of dangerous chemicals in the air (Angelo, 2012).

1.5 Project Aim

The aim of the project is to design and fabricate a robotic arm that can be used for

demonstrative and educational purposes.

Bearing in mind that a similar project has been previously undertaken in the

Department by Adebola (2012), this project work goes a step further by redesigning

the robotic arm for improved accuracy by using servos to power the joints.

In addition to this, a demonstration of the implementation of the inverse kinematics of

the arm will be made. This will make it possible to position the gripper of the robotic

arm by specifying the coordinates of its centre position in a Graphical User Interface

(GUI) control software. The derivation of the inverse kinematics equations is, however,

not undertaken here due to its tedious nature. As such, the structure of the robotic arm will be

similar to the RA-01 robotic arm used by Akinwale (2011) for his M.Sc. project. The

kinematics of the robotic arm were extensively covered in his report, and the equations

he derived will be used in the course of this project.
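To give a flavour of what such inverse kinematics equations look like, the C# sketch below solves the closed-form inverse kinematics of a simple two-link planar arm. It is a minimal illustrative example with assumed link lengths, not the 5-DoF equations derived by Akinwale (2011) that are actually used in this project.

    using System;

    class PlanarIK
    {
        // Closed-form inverse kinematics of a 2-link planar arm with link
        // lengths l1 and l2: given a target (x, y), return the joint angles
        // (theta1, theta2) of one of the two elbow solutions.
        static (double theta1, double theta2) Solve(double x, double y,
                                                    double l1, double l2)
        {
            double c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2);
            if (c2 < -1 || c2 > 1)
                throw new ArgumentException("Target is out of reach.");
            double theta2 = Math.Acos(c2);                      // elbow angle
            double theta1 = Math.Atan2(y, x)
                          - Math.Atan2(l2 * Math.Sin(theta2),
                                       l1 + l2 * Math.Cos(theta2));
            return (theta1, theta2);
        }

        static void Main()
        {
            var (t1, t2) = Solve(0.15, 0.10, 0.12, 0.12); // lengths in metres (assumed)
            Console.WriteLine($"theta1 = {t1:F3} rad, theta2 = {t2:F3} rad");
        }
    }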

1.6 Project Objectives

The objectives of this project are to:

• select a suitable material for the fabrication of a 5-DoF robotic arm;

• obtain suitable design parameters for the robotic arm;

• create a 3-D model of the robotic arm based on the design parameters;

• simulate the movement of the robotic arm;

• fabricate the robotic arm;

• create and deploy suitable software to implement the inverse kinematics of

the robotic arm, making it possible to position the robot end-effector by

specifying desired coordinates.

1.7 Justification for the Project

The field of robotics has become indispensable to mankind. Robotics technologies are

helping to improve healthcare (through the use of the da Vinci surgical system),

national defence (through the use of robotic bomb detonators), manufacturing (as

witnessed in the automobile manufacturing industry), consumer goods sectors (for

packaging finished goods) and so on.



The availability of a robotic arm that can be used for demonstrative and educational

purposes in the Department of Mechanical Engineering will go a long way in

stimulating the interest of students in robotics. It will provide a tool to use for

learning and experimenting with robotics. Students with a flair for programming can

reprogram the robot to adapt it to different tasks.

An off-the-shelf robotic arm could suffice, but the cost of such robotic arms is usually

very prohibitive. Lynxmotion's AL5B robot, a 6-DoF (Degree of Freedom)

educational robotic arm, costs $548.03 (₦87,630.00) with control software and

shipping cost included (Lynxmotion, 2013); Images SI Inc.'s RA-02 robotic arm - a 6-

DoF robotic arm - costs $599.95 (₦95,932.01) without inclusion of shipping cost

(Images SI Inc., 2013); while KUKA's youBot™ - a 5-DoF arm with a 2-finger gripper -

costs a whopping £15,999.00 (₦4,134,116.00) without inclusion of shipping cost

(KUKA, 2013). The proposed robotic arm, on the other hand, has an estimated cost of

about ₦39,530.00, as shown in Table 1.1. Thus, building a robotic arm is a more

economically feasible and viable option.

1.8 Scope

This work covers material selection, design, simulation, programming and fabrication

of a basic robotic arm system. It also covers the implementation of the kinematics of

the arm but does not consider the details of the derivation of the kinematic equations.

Table 1. 1 : Rough Cost Estimate of Robotic Arm

Source: www.robotshop.com

Component                          Quantity    Price ($)    Price (₦)
Servo motor                        5           103.45       16,583.04
USB servo controller (assembled)   1           19.95        3,197.99
USB A to mini-B cable              1           1.95         317.39
Wall adapter (6 V, 2 A)            1           19.95        3,197.99
Perspex sheet                      2 m x 2 m   -            1,750.00
Shipping                           -           77.88        12,484.16
Miscellaneous                      -           -            2,000.00
Total cost                                                  39,530.57



CHAPTER TWO

LITERATURE REVIEW

2.1 The Field of Robotics

A brief and clear definition of robotics is given in The Chambers Dictionary (2011). It

defines robotics as the branch of technology dealing with the design, construction and

use (operation and application) of robots. A more technical definition was given by

Craig (2005). He defined robotics as a field that concerns itself with the desire to

synthesize some aspects of human function by use of mechanisms, sensors, actuators

and computers.

The field of robotics is divided into different aspects, and robotics research is carried

out by experts in various fields. It is usually not the case that any single individual has

the entire area of robotics in his or her grasp. A partitioning of the field of robotics is

hence necessary. The four major sub-disciplines of robotics are: mechanical

manipulation, locomotion, computer vision and artificial intelligence (Craig, 2005).

By and large, the field of robotics is not a new science, but merely a collection of

topics from classical fields in science and technology. Mechanical engineering

contributes methodologies for the study of the kinematics and dynamics of robots.

Mathematics supplies tools for describing spatial motions and other attributes of

robots. Control theory provides tools for the design and evaluation of algorithms to

realize desired motions or force applications. Electrical and Electronic engineering

techniques are brought to bear in the design of sensors and interfaces for industrial

robots, and computer science contributes a basis for programming these devices to

perform a desired task (Craig, 2005).



2.2 Robots

The term robot was first used in a 1921 Czech science fiction play - “Rossum’s

Universal Robots” - by Karel Capek. It was coined from the word "robota", which is

the Czech word for forced labor or servitude (Angelo, 2007).

The concept of artificial creatures has been around for a long time. Human beings

have always been fascinated with the concept of artificial life and the construction of

machines that look and behave like people. The medieval legend of the Golem (an

animated anthropomorphic being, created entirely from inanimate matter) is a

common example (Angelo, 2007).

However, the first patent for what we would now consider a robot was filed in 1954

by George C. Devol and issued in 1961. Devol's robot was the first industrial robot, a

system designed specifically to pick and place objects in a factory environment.

(Angelo, 2007)

Today, more advanced and highly sophisticated robots have been created. An example

is a humanoid robot manufactured by Honda Robotics, named ASIMO (acronym for

Advanced Step in Innovative Mobility).

ASIMO has 34 degrees of freedom (34-DOF) that allow it to do things like turn on a

light switch, open doors, carry objects and push carts. It has the ability to recognize

moving objects, postures, gestures, its surrounding environment, sounds and faces,

which enables it to interact with humans. Also, its Intelligent Real-Time Flexible

Walking (codenamed i-WALK) as well as its environment recognition technologies

allow it to take on complex tasks such descending stairs, asccending stairs (shown in

Figure 2.1) and navigating a slope. Honda Robotics refers to ASIMO as "the world's

most advanced humanoid robot" (Honda Robotics, 2013).



Figure 2. 1: ASIMO demonstrates its ability to climb stairs at Auto Zurich Car Show
Source: http://asimo.honda.com/gallery/

The use of robots has led to increased productivity and improved product quality. It

has helped to keep manufacturing industries viable in high-labour cost countries.

Today many products we buy have been assembled or handled by a robot (Corke,

2011).

2.3 The Robotic Arm

George Devol received the first patents for robotics in 1954. He invented the

first robotic arm - a device that comprised a mechanical arm with a gripper that was

mounted on tracks; the sequence of motions was encoded as magnetic patterns

stored on a rotating drum. Devol originally called the device 'Universal Automation',

but later shortened the term to 'Unimation'. Devol's Unimation was the first industrial

robotic arm, a system designed specifically to pick and place objects in a factory

environment. A U.S. patent was issued for his design in 1961 (Angelo, 2007).

In 1973, the wide range of technological experience gained by KUKA, combined with

demands from the automotive industry for powerful and reliable robots, culminated in

the development of KUKA's own industrial robot. Known as FAMULUS, it was the

first robotic arm to have six electric motor-driven axes (KUKA, 2013).

Interest in industrial robotics increased in the late 1970s and many companies

entered the field, including large firms like General Electric and General

Motors (GM), which formed the joint venture GMFanuc Robotics Corporation with

FANUC (Factory Automation Numerical Control) Ltd. of Japan in 1982 (FANUC,

2013).

At the height of the robot boom in 1984, Unimation was acquired by Westinghouse

Electric Corporation for 107 million U.S. dollars.

Westinghouse sold Unimation to Stäubli Faverges SCA of France in 1988, which is

still making articulated robots for general industrial and cleanroom applications and

even bought the robotic division of Bosch in late 2004 (Ryan, 2012).

Robots are still widely in use in today's economy. According to worldrobotics.org

(2013) - a website of the International Federation of Robotics Statistical Department -

about 165,000 industrial robots were sold worldwide in the year 2011 alone.

Academia has also contributed immensely to the advancement of the field of

robotics.

In 1969, Victor Scheinman at Stanford University invented the Stanford arm, an all-

electric, 6-axis articulated robotic arm designed to permit an arm solution in closed form. This

allowed the robot to accurately follow arbitrary paths in space and widened the

potential use of the robot to more sophisticated applications such as assembly and

arc welding (Ryan, 2012).

Scheinman then designed a second arm for the Massachusetts Institute of Technology

(MIT) AI Laboratory, called the "MIT arm". Scheinman sold his designs to

Unimation, which further developed them with support from General Motors and later

sold it as the famous PUMA - Programmable Universal Machine for Assembly

(Ryan, 2012) .

The PUMA was a very versatile pick-and-place industrial robotic arm that could

perform a variety of tasks in assembly, material handling and transfer, joining,

inspection, and palletizing (Angelo, 2007).



In recent times, various research works on the robotic arm have been carried out by

undergraduate and post-graduate students of numerous tertiary institutions, both at home

and abroad. They are discussed below.

Okedeyi (1998), of the Department of Mechanical Engineering, Obafemi Awolowo

University in his B.Sc. final year project, developed a serial algorithm for the dynamic

simulation of a two-link constrained robotic manipulator.

The manipulator was assumed to be made up of rigid bodies connected by ideal

revolute joints and powered by ideal actuators. It was also assumed to be non-

redundant and its end-effector was assumed to be in contact with some rigid body in

its environment so as to form a closed kinematic chain configuration. In addition to

these, effects of backlash, elasticity and actuator dynamics were neglected.

The Euler-Lagrangian formulation was used to derive the equations describing the

dynamics of the manipulator. The equation related the torques acting at the joints due

to the motion of the manipulator links, inertia effects, centripetal effects (due to

rotation of the links around a joint), gravitational effects and the velocities of the links

(represented by the Jacobian matrix). The joint accelerations were then solved for, with

the assumption that the torques were given, after which a numerical integration

technique was used to solve for the next state joint positions and rates of the

manipulator.
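For reference, manipulator dynamics of this kind are conventionally written in the standard Euler-Lagrange form (shown here in generic textbook notation, not as the specific equations derived in the report):

    \tau = \mathbf{M}(\mathbf{q})\,\ddot{\mathbf{q}} + \mathbf{C}(\mathbf{q},\dot{\mathbf{q}})\,\dot{\mathbf{q}} + \mathbf{g}(\mathbf{q}), \qquad \dot{\mathbf{x}} = \mathbf{J}(\mathbf{q})\,\dot{\mathbf{q}}

where M(q) is the joint-space inertia matrix, the C term collects the centripetal and Coriolis effects, g(q) holds the gravitational torques, and the Jacobian J(q) maps joint rates to link velocities.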

A two-link manipulator was used as a case study for the simulation. The step-by-step

serial algorithm was used to develop a computer program for the simulation.

FORTRAN 77 (FORmula TRANslator) was employed in writing the computer

program.

The report showed that the algorithm derived was practical and result-oriented. The

approach used for the simulation is, however, only efficient for a few links but

computationally intensive for manipulators with a higher number of links, such as five

and above.

Fadahunsi (1998), of Mechanical Engineering Department, Obafemi Awolowo

University in his B.Sc. final year project, applied geometric modelling to the

simulation of a robot's manipulator.

Modelling is the art of representing a process, system, concept or an object in another

form. The model form, though different from the original object or concept, simulates

all the qualities of the original. In geometric modelling, the object shape is accurately

described by a set of mathematical equations and logical relationships based on

constraints derived from the properties of the original object.

The objective of the project was to demonstrate the application of geometric

modelling in the design of a robot's manipulator by generating and animating its

wireframe model. The Bezier curve-defining technique was used to create the

wireframe model of the manipulator. The technique was chosen because it allowed

easy control of the model shape by changing a few parameters and because Bezier curves

remain stable and invariant under affine transformations such as translation and

rotation.
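For concreteness, a cubic Bezier curve is evaluated from four control points using the Bernstein weights, which is what makes the shape controllable through a few parameters. The C# sketch below, with hypothetical control points, illustrates the idea.

    using System;

    class BezierDemo
    {
        // Evaluate a cubic Bezier curve at parameter t in [0, 1] from four
        // 2-D control points; moving the control points reshapes the curve.
        static (double x, double y) Cubic((double x, double y) p0,
                                          (double x, double y) p1,
                                          (double x, double y) p2,
                                          (double x, double y) p3,
                                          double t)
        {
            double u = 1 - t;
            double b0 = u * u * u, b1 = 3 * u * u * t,   // Bernstein weights
                   b2 = 3 * u * t * t, b3 = t * t * t;
            return (b0 * p0.x + b1 * p1.x + b2 * p2.x + b3 * p3.x,
                    b0 * p0.y + b1 * p1.y + b2 * p2.y + b3 * p3.y);
        }

        static void Main()
        {
            // Hypothetical control points for one edge of a link's outline.
            var p = Cubic((0, 0), (1, 2), (3, 2), (4, 0), 0.5);
            Console.WriteLine($"({p.x}, {p.y})");   // prints (2, 1.5)
        }
    }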

The generated wireframe model of the manipulator was then used to simulate the

rotation of its links using a suitable algorithm. The algorithm was used to write a

computer program to handle the tedious and complex mathematical computations that

the simulation entailed. FORTRAN 77 (FORmula TRANslator) was employed in

writing the computer program.



Although the objectives of the project were met, it had some shortcomings. The

wireframe models did not represent the robot manipulator accurately. A surface or

solid modelling technique would have been more appropriate, though it would have

involved a more tedious programming task.

Adekoya (2001), of Mechanical Engineering Department, Obafemi Awolowo

University in his B.Sc. final year project, investigated the application of Fuzzy

Control Systems in Mechanical Engineering and utilised it in designing a controller

for a robotic arm. The design was configured for a two-degree of freedom system in

which the robotic arm could move on a plane (i.e. displacement in two perpendicular

directions) with the aid of motors placed at the joint locations of the arm.

Fuzzy logic is a form of many-valued logic reasoning that recognizes more than

simple true and false values. Compared to traditional binary sets (where variables may

take on true or false values), fuzzy logic variables may have a truth value that ranges

in degree between 0 and 1.

Fuzzy logic was used to design a control for the robotic arm based on results obtained

from sensors placed at strategic locations of the arm. The major reason for

incorporating fuzzy control into the arm was to implement collision detection and

avoidance during motion.

The sensors enabled the determination of relative distances between the robotic arm,

the object it was transporting and obstacles in the environment.

The design was based on the Model Approach. Objective information was represented

using mathematical models while subjective information was represented by linguistic

statements that were converted to rules that were then quantised using fuzzy logic.

Based on the distances from obstacles determined by the four sensors (which formed

the inputs) and two membership functions each, a total of sixteen (16) linguistic rules

were obtained to implement the control of the robotic arm.

The linguistic rules were used to instruct the motors to rotate at a certain speed and in

a direction based on the conditions of the robotic arm's environment.

The linguistic rules obtained are represented in Table 2.1.

The speed and direction of the motors were represented with the following acronyms;

PF - Positive Fast

PM - Positive Medium

PS - Positive Slow

NS - Negative Slow

NM - Negative Medium

NF - Negative Fast

Using Rule 2 as an example, the linguistic rules can be read as follows;

If the distance to the left is Big and the distance in front is Big and the distance to the

right is Small and the distance at the back is Big, then the Forward-Backward motor

speed is Positive Slow and the Left-Right motor speed is Positive Slow.
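As a sketch of how a rule of this kind might be evaluated in code, the C# fragment below computes the firing strength of Rule 2 using the common minimum operator for the fuzzy AND. The trapezoidal membership functions and distance readings are assumptions for illustration, not Adekoya's actual design.

    using System;

    class FuzzyRuleDemo
    {
        // Assumed membership for a "Big" distance: 0 below 20 cm, 1 above
        // 60 cm, linear in between. "Small" is taken as its complement.
        static double Big(double d) =>
            Math.Max(0.0, Math.Min(1.0, (d - 20.0) / 40.0));
        static double Small(double d) => 1.0 - Big(d);

        static void Main()
        {
            double left = 80, front = 75, right = 15, back = 90; // cm (assumed)

            // Rule 2: IF left is Big AND front is Big AND right is Small
            // AND back is Big, THEN both motor speeds are Positive Slow.
            // The fuzzy AND is the minimum of the memberships.
            double firing = Math.Min(Math.Min(Big(left), Big(front)),
                                     Math.Min(Small(right), Big(back)));
            Console.WriteLine($"Rule 2 firing strength: {firing:F2}");
        }
    }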

The design was able to achieve the aim of ensuring the robotic arm reached its

destination while effectively avoiding collision with obstacles that got into its

predetermined path of motion.



Table 2. 1 : Linguistic rules for fuzzy logic control of robotic arm

Source: Adekoya (2001)

           Distances                          Motor Speeds
           Left    Front   Right   Back       Forward-Backward   Left-Right
Rule 1     Big     Big     Big     Big        PM                 Stop
Rule 2     Big     Big     Small   Big        PS                 PS
Rule 3     Small   Big     Big     Big        PS                 NS
Rule 4     Small   Big     Small   Big        PM                 Stop
Rule 5     Big     Big     Big     Small      PM                 Stop
Rule 6     Big     Big     Small   Small      PF                 PF
Rule 7     Small   Big     Big     Small      PF                 NF
Rule 8     Small   Big     Small   Small      PM                 Stop
Rule 9     Big     Small   Big     Big        NS                 PM
Rule 10    Big     Small   Small   Big        NF                 PS
Rule 11    Small   Small   Big     Big        NF                 NS
Rule 12    Small   Small   Small   Big        NM                 Stop
Rule 13    Big     Small   Big     Small      Stop               NM
Rule 14    Big     Small   Small   Small      Stop               PM
Rule 15    Small   Small   Big     Small      Stop               NM
Rule 16    Small   Small   Small   Small      Stop               Stop



The controller can be further improved by increasing the number of sensors, using

sensors with higher sensitivities and increasing the number of membership functions

for both distances and motor speeds. Also, increasing the degrees of freedom of the

robotic arm will enhance its capability for collision avoidance due to its increased ability to

make more complex movements.

Chen et al. (2006) of the Department of Electrical Engineering, National Chung-Hsing

University designed and implemented an Omnidirectional Mobile Home Care Robot

for the Nios II Embedded Processor Design Contest. The project involved the design

of a smart robot to expand the applications for which robotic arms are traditionally

used. The robotic arm was installed on an omnidirectional mobile vehicle and a

network camera was installed on the front of the arm.

The omnidirectional mobile home-care robot provides home care for the disabled and

improves their quality of life. The smart robot can perform various tasks such as

reaching for an item with the help of its network camera, the platform, and the robotic

arm. The user grabs the article’s image on the camera LCD (Liquid Crystal Display)

screen and the system directs the platform and arm to pick up the object. The robot

helps the disabled person pick up objects and perform household tasks. Figure 2.2

shows the omnidirectional mobile home care robot. The robotic arm and the

omnidirectional mobile vehicle system use two Altera® Nios® II test boards. Stratix®

and Stratix II devices control the system, motor position, and speed while a PC

performs image processing. The robot arm has five servomotors and the platform has

three.

The user uses the network camera to select image information, which is

delivered to the PC for coordinate processing.



Figure 2. 2: Omnidirectional mobile home-care robot


Source: Chen et al. (2006)

Upon completion of the image processing, the system determines the object’s three-

dimensional (3-D) coordinates relative to the robot’s. Then, the system substitutes

the coordinates in an algorithm to obtain the rotation angle of the robot arm’s five-axis

motor as well as the platform’s best rotating X and Y coordinates. The system then

transmits the control signal to the hub via TCP (Transmission Control Protocol) and

Winsockets and delivers the rotation angle and the XY coordinates to the development

boards, which control the speed and position of the robotic arm and platform in their

attempt to reach the object's location.

Jenne (2009) of the Swiss Federal Institute of Technology, Zurich designed and

implemented an adaptive motion control for a complex robot system consisting of a

mobile platform, realized with a differential drive and a 6-DoF robotic arm (shown in

Figure 2.3). Firstly, the kinematic models of the system were analyzed separately.

With the theory of the homogeneous matrix exponential method, a combination of the

kinematic models was established. Further on, the motion of the mobile platform was

realized with a kinematic position controller.
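The homogeneous matrix exponential description mentioned here is conventionally written as a product of exponentials (the generic form, not Jenne's specific model):

    T(\theta_1,\ldots,\theta_n) = e^{[S_1]\theta_1}\, e^{[S_2]\theta_2} \cdots e^{[S_n]\theta_n}\, M

where each [S_i] is the matrix form of the i-th joint's screw axis and M is the home configuration of the end-effector.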

Also, an algorithm was developed to control the performance of the robotic arm and

its mobile platform together. With a priority distribution, the differential drive could

manoeuvre the arm to a convenient position and the arm's end-effector could execute

its motion following a trajectory to the desired goal position.

To implement the controller on the real robot, data structures and a transfer protocol

were created to communicate with the robot. The software did not take the robot's

inertia into account, so the speed controller had to be modified. It proved to be highly

suitable to control the combined system with a standardized control algorithm that

could be added to any similarly designed robot.



Figure 2. 3: Model of the robotic arm and the complete robot system by Jenne
Source: Jenne (2009)

Blackmore et al. (2009) from Portland State University worked on a project that

involved the design and building of a mobile robot capable of human-like

behaviours. The means of achieving the project objectives were contained in three

aspects of the robot's design. The first was a mobile base that allowed the robot to

navigate through an indoor space. The second was a head and neck capable of

producing several human-like gestures coordinated with speech. The third was an arm

and hand assembly used to supplement the robot's gestures and allow it to grasp and

move objects.

The robotic arm had the ability to grasp an object and place it in a different location. It

was similar in scale to that of a human arm and could replicate similar motions. The

arm was made with standard components, such that it could be easily reproduced and

mirrored to create left and right versions. Also, the arm could be easily mounted to

the mobile base.

The ability of the arm to move an object was given attention. With this in

consideration, an object with a negligible mass was chosen. This reduced the torque

requirements for the arm's actuators and allowed the overall design to be built to a

larger scale.

To simplify the software design, each joint on the arm was manually controlled.

Although the developers intended to make provision for more sophisticated motion

control in future phases of the project, the simplified approach allowed the robot's

actuators and ranges of motion to be controlled and tested in a straightforward way.

As a first step to higher levels of motion control, the forward kinematic equations for

position were carried out by the team.



The robotic arm was built with TETRIX robot kit components. In addition to the stock

TETRIX components, welding rod was used to fashion the fingers of the hand. The

plate that the fingers attached to was also fabricated. Figure 2.4 shows the complete

TETRIX assembly and hand.

The electronic components used to operate the arm consisted of two direct current

(dc) motors, four servomotors, one dc motor controller, one servo controller, and the

NXT brick. The dc motors were used on the elbow and shoulder joints to provide

more torque and stability while servomotors were used to control the hand, wrist, and

arm rotation. All these components were part of the Lego Tetrix Robotics toolkit.

Using the Tetrix parts along with the NXT brick allowed for less time spent

integrating and developing drivers, because when programmed with RobotC, the

drivers and control functions are already integrated into the system allowing for more

of a plug-and-play environment. This saved time in developing code for controlling

the arm.

The main control of the arm was done by the NXT brick (a control unit run by a 32-bit

ARM7 microprocessor and an 8-bit AVR microcontroller, mainly programmed using

the NXT graphical interface language, LabVIEW, RobotC, or NXT++).

Adhikari et al. (2010) of the Worcester Polytechnic Institute, USA, designed,

simulated and implemented a mobile robotic base with an attached robotic arm. The

aim of the project was to create a robot capable of assisting elderly people with tasks

in their everyday lives.

One of the project aims was to design a robot that could come to the aid of the user.

Thus, the robot had to run in an autonomous mode because it was intended to be used

by an aged person. The robot could travel to the user when it was called for help.

Figure 2. 4: Final TETRIX robot arm and hand assembly by Blackmore et al.
Source: Blackmore et al. (2009)

Thus, the robot had the ability to know the position of the user or at least get a general

direction of where the user was located.

The robot was designed to perform user location indoors, where the accuracy of

common location technologies such as GPS (Global Positioning System) degrades,

necessitating the use of indoor tracking methods such as a WLAN (Wireless Local

Area Network)-based tracking system. The location of the user was accomplished by

trilateration, which required three wireless base stations.
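A minimal sketch of 2-D trilateration under assumed station positions and measured ranges: subtracting pairs of circle equations eliminates the quadratic terms and leaves a 2x2 linear system that can be solved directly.

    using System;

    class Trilateration
    {
        // Solve for (x, y) from three stations at (xi, yi) with measured
        // ranges ri. All coordinates and ranges here are hypothetical.
        static (double x, double y) Locate(
            double x1, double y1, double r1,
            double x2, double y2, double r2,
            double x3, double y3, double r3)
        {
            double a11 = 2 * (x2 - x1), a12 = 2 * (y2 - y1);
            double a21 = 2 * (x3 - x1), a22 = 2 * (y3 - y1);
            double b1 = r1 * r1 - r2 * r2 - x1 * x1 + x2 * x2 - y1 * y1 + y2 * y2;
            double b2 = r1 * r1 - r3 * r3 - x1 * x1 + x3 * x3 - y1 * y1 + y3 * y3;
            double det = a11 * a22 - a12 * a21; // zero if stations are collinear
            return ((b1 * a22 - b2 * a12) / det,
                    (a11 * b2 - a21 * b1) / det);
        }

        static void Main()
        {
            // Stations at three room corners (metres) and ranges to the user.
            var p = Locate(0, 0, 5, 10, 0, Math.Sqrt(65), 0, 10, Math.Sqrt(45));
            Console.WriteLine($"User at ({p.x:F2}, {p.y:F2})");  // (3.00, 4.00)
        }
    }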

To be able to perform tracking, the robot had sensors that could carry out dead

reckoning. Dead reckoning is a simple mathematical procedure for determining the

present location of a vehicle by advancing some previous position through known

course and velocity information over a given length of time.
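A minimal sketch of such a dead-reckoning update, with illustrative speed and turn-rate figures:

    using System;

    class DeadReckoning
    {
        // Advance a previous pose (x, y, heading) by a measured speed v and
        // turn rate w over a time step dt -- the "known course and velocity"
        // update described above.
        static (double x, double y, double h) Step(
            double x, double y, double h, double v, double w, double dt)
        {
            return (x + v * Math.Cos(h) * dt,
                    y + v * Math.Sin(h) * dt,
                    h + w * dt);
        }

        static void Main()
        {
            double x = 0, y = 0, h = 0;          // start at the origin
            for (int i = 0; i < 10; i++)         // ten 0.1 s steps
                (x, y, h) = Step(x, y, h, 0.5, 0.1, 0.1);
            Console.WriteLine($"Pose: ({x:F3}, {y:F3}), heading {h:F3} rad");
        }
    }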

Object avoidance was also incorporated into the robot to enable it to navigate properly

through the room. This was achieved using an infrared transmitter and receivers that

could detect an obstacle. The data was then processed in the microprocessor to

calculate distances and perform obstacle avoidance techniques.

In order for the robot to find and pick up objects, it was equipped with the ability to

identify and locate them. The method used to achieve this involved the translation of

visual sensory input to actuation, called visual servoing.

Akinwale (2011) of the Department of Electronic and Electrical Engineering, Obafemi

Awolowo University, in his M.Sc. project, developed a robust remote laboratory

platform for conducting robotic arm experiments.

An RA-01 robotic arm with five degrees of freedom from Images Incorporated, New

York (shown in Figure 2.5) was selected for use in his study.



Figure 2. 5: RA-01 robotic arm (joint annotations in the original figure: base rotation, shoulder movement, elbow movement, wrist rotation, gripper open-close)

Source: Akinwale (2011)

The robotic arm platform was set up and the robotic arm was interfaced with a

computer’s RS-232 port via an SMC-05 servomotor controller, which employed

pulse width modulation using a PIC16F873 microcontroller. He computed the

forward and inverse kinematics of the arm using the Denavit-Hartenberg (D-H)

parameters.

The online laboratory platform was implemented using the Massachusetts Institute

of Technology interactive iLab architecture. The successful control of the robotic

arm and the integration of the experiment engines into the interactive iLab

architecture were achieved. Error-checking algorithms were implemented to

ensure laboratory safety without human supervision and hence, ensure robustness

of the setup.

Adebola (2012) of the Department of Mechanical Engineering, Obafemi Awolowo

University designed an improvised robotic arm system for educational instructional

purposes. The system was implemented using improvised materials while the

software was developed using the C# (C Sharp) programming language.

He fabricated two models (A and B), evaluated and compared their results. Test

results showed that Model A failed due to the inability of its electronic circuit to

supply the power required by its motors. Further tests showed that Model B

was fully functional, with all of its joints (apart from the gripper) capable of

360° movement.

In line with emphasizing local content, he obtained motors from old antennas and

from children's toys. Perspex was used for each link and the gripper of the successful

model.

Forward and inverse kinematics of the robotic design were given in his report. The

results of the kinematic analysis were, however, not implemented in the design of the

robot; doing so would have been difficult given its simplistic method of control.

Instead, the robot relied on the operator's visual judgement for control:

if the student controlling the robot wanted the end-effector to move from

position A to position B, he had to move each link of the robot until the end-effector

reached position B.

His study showed that an educational instructional robotic arm made using local

materials was feasible and viable for teaching robotics, especially at early stages. The

robotic arm is shown in Figure 2.6.

Herman et al. (2012), of the University of Zaragoza (Spain), designed and built a low-

cost mobile platform equipped with a robotic arm. The low-cost mobile platform was

a Turtlebot™ prototype. The robotic arm was built with off-the-shelf parts and

controlled from an Arduino board. A Kinect sensor was used for object detection.

Two ROS (Robot Operating System) nodes managed all the elements, commanding

the mobile base and defining targets to be grasped.

A demonstration program, which simulated a simple cleaning task in which soft

objects were moved out of the way, was described. The initial prototype had a simple

pincer that enabled grasping of easy objects only.

The project augmented a standard vacuum cleaner base with a custom arm capable of

grasping simple items, like small pieces of clothing, that were in the way of the robot.

The goal was to allow the robot to identify objects in its path and, if they were suitable

targets, remove them using the arm, in order to proceed with a hypothetical vacuum-

cleaning task.
Figure 2. 6: Adebola's improvised robotic arm in operation

Source: Adebola (2012)

The prototype, shown in Figure 2.7, was capable of detecting and grasping small soft

objects, like light clothes. It had trouble with larger or rigid objects, since only a basic

centroid grasping strategy was implemented.

Ishola (2012), of the Department of Electronic and Electrical Engineering, Obafemi

Awolowo University, in his B.Sc. final year project, developed a remote online

Strength of Materials laboratory, a platform for performing static bending

experiments over the internet.

The experiment involved applying a known force at any point between the ends of a

beam supported on a pivot at both ends and thereafter measuring the deflection of the

beam using a suitable instrument. The value of the deflection was then used to calculate

the modulus of elasticity, which is a characteristic property of every material.

The design took into consideration the possibility of permanent deformation or

damage of the beam in the event that a student applied an extremely high force to it.

Since there would be no laboratory attendant to replace the beam (it was a remote

laboratory where everything was expected to be automated) and the experiment had to

continue, a decision was made to implement a configuration of automated systems

that included a robotic arm for replacing the beams.

Thus, a robotic arm with four degrees of freedom (4-DoF) was designed and

implemented. The joints of the arm were powered by AX-12™ Dynamixel™ servos

manufactured by Robotis®.

A continuous rotation servo was used to effect a 360° rotation at the base joint. The

end-effector was a gripper used for picking the beams.


Figure 2. 7: Herman's robotic arm built with standard bricks, before being mounted on a Turtlebot™ prototype

Source: Herman et al. (2012)

Control of the servos from a PC was achieved using the USB2Dynamixel™ - a TTL

(Transistor–transistor logic) level converter. The USB2Dynamixel is basically a

“Virtual Serial Port” that attaches to the USB port on a PC. It was used as the

connector between the dynamixel servos and the computer, thus allowing the servos to

receive instructions from the PC.

A 9V, 4.5A power supply was used to power the dynamixel actuators with each

actuator needing about 900mA current at 9V. The power supply circuit used a pass

transistor to deliver the high current needed.

The interfacing software for communicating with the robotic arm was developed using

Microsoft's C# programming language.

Also, the inverse kinematics equations of the arm were derived and implemented so as

to give it the ability to pick the beam at the specified location. The robotic arm is

shown in Figure 2.8.

2.4 The Denavit-Hartenberg (D-H) Convention

In order to compute the forward kinematics equation for an open-chain manipulator

(robotic arm), a systematic, general method is needed to define the relative position

and orientation of two consecutive links. The problem is that of attaching two frames to

the two links and computing the coordinate transformation between them. In general,

the frames can be arbitrarily chosen as long as they are attached to the link they are

referred to. Nevertheless, it is convenient to set some rules for the definition of the

link frames (Siciliano et al., 2009).


Figure 2. 8: Robotic arm for picking beams for static bending test

Source: Ishola (2012)

The D-H convention is a representation technique used to achieve this. Jacques

Denavit and Richard Hartenberg introduced this convention in 1955 in order to

standardize the coordinate frames for spatial linkages (Denavit and Hartenberg, 1955).

The D-H representation will be used in carrying out various analyses in the design

process of the proposed robotic arm. Thus, the procedure for determining the Denavit-

Hartenberg parameters of a serial link manipulator is discussed below.

It should be noted that there are two quite different forms of the Denavit-Hartenberg

representation for serial-link manipulator kinematics:

• Classical, as per the original 1955 paper of Denavit and Hartenberg, and used

in textbooks such as those by Paul (1981) and Spong and Vidyasagar (1989).

• Modified form, as introduced by Craig (1986) in his textbook.

Both notations represent a joint as two translations and two rotation angles.

However, the expressions for the link transform matrices of the two notations are quite

different. Unfortunately, much of the robotics literature does not specify this crucial

piece of information; most textbooks cover only one form and do not even allude to the

existence of the other. The modified form will be adhered to in the course of the

analyses in this report (Corke, 2011).



CHAPTER THREE

METHODOLOGY

This chapter covers the detailed design and method of construction of the robotic arm

and its controller.

3.1 Choice of Material

The following material properties were put into consideration during the material

selection process:

• strength

• lightness

• availability

• ease of cutting.

The material should possess sufficient strength so as to ensure that each link of the arm

is able to bear the load imposed on it by motors, other attached links and the payload.

Lightness of the material reduces the torque requirement of the robot actuators,

thereby minimising cost. The material also had to be readily available and easy to cut

because the fabrication of some parts of the robotic arm involved the cutting of

intricate shapes.

Perspex (polymethyl methacrylate or PMMA plastic) satisfies all the above criteria and

was thus selected for the project. Perspex can be easily cut with readily available hand

tools such as a fret saw. In addition to this, it offers outstanding strength and stiffness

(Polymech, 2012) which made it suitable for the project.



It is also cheap and readily available. With a density of 1.18 g/cm³ (Boedeker.com,

2013), its lightness is a key advantage for any demonstrative or educational robotics

project, because it reduces the torque requirement of the robot actuators.

3.2 Implementation Plan

The entire implementation plan of the robotic arm from start to finish is represented

by a block flow diagram shown in Figure 3. 1. The software used for each stage is

indicated in Figure 3. 2 while Figure 3. 3 shows a schematic diagram of the robotic

arm setup.

3.3 Robotic Arm Structure

As earlier stated in the introductory chapter of this report, the robotic arm has five

degrees of freedom (5-DOF) with the inclusion of the gripper. A CAD (Computer

Aided Drafting) model of the robotic arm structure was designed and rendered using

Autodesk's AutoCAD® software. The CAD model is shown in Figure 3. 4.

3.4 Kinematic Analysis

Kinematics is the branch of mechanics that studies the motion of a body or a system of

bodies without giving consideration to its mass or the forces acting on it. The

kinematics of a robotic arm can be treated under two categories:

• Forward kinematics

• Inverse kinematics

Forward Kinematics

The goal of the kinematic analysis of a robotic arm is to develop equations that

can be used to determine the position of the end-effector given any set of joint

values. The equations developed relate the end-effector position to the joint

variables of the robotic arm (Corke, 2011).


Figure 3. 1: Implementation flow chart of robotic arm

(Summary of the flow: model the arm structure in 3D with AutoCAD®; import the 3D model into Autodesk® Inventor® and determine the weight and inertia of each link using iProperties™; calculate the torque required at each joint from the obtained weights and inertias; select servos with the required torque ratings and select a servo controller; determine the D-H parameters of the robot arm and input them into the MATLAB® Robotics Toolbox; carry out the forward and inverse kinematic analyses and simulate them graphically to obtain the kinematic equations of the arm; develop and deploy interfacing software for communicating with the servo controller using Microsoft's Visual C#; fabricate the robotic arm.)

Figure 3. 2: Software flowchart for the design of the robotic arm

(Summary: the robotic arm structure is modelled in 3D according to the intended design; the model, in .dwg format, is imported into Autodesk® Inventor® to determine the weight and inertia of each link (using iProperties™) for the torque calculation, whose purpose is to facilitate the selection of appropriate motors for each joint; based on the structure, a schematic diagram of the arm is drawn manually with a frame attached to each joint according to the Denavit-Hartenberg convention; the D-H parameters are input into the MATLAB® Robotics Toolbox; equations for the forward and inverse kinematics are obtained and simulated graphically; control software implementing the inverse kinematic equations is then developed in C#.)

Figure 3. 3: Schematic diagram of robotic arm setup

(Flow: the interfacing software takes as user input the coordinates of the desired gripper-centre position and sends serial bytes to the servo controller, which generates the five PWM signals that control servo rotation.)

Figure 3. 4: CAD model of proposed robotic arm


(Drafted and rendered with AutoCAD® Mechanical 2011 Student Version)

Forward kinematics refers to the use of the kinematic equations of the robotic arm to

compute the position of the end-effector from specified values for the joint parameters

(Paul, 1981).

Inverse Kinematics

A method of determining the pose of the end-effector of the robotic arm, given the

joint coordinates has been discussed. "A problem of real practical interest is the

inverse problem: given the desired pose of the end-effector, what are the required joint

coordinates? For example, if we know the Cartesian pose of an object, what joint

coordinates does the robot need in order to reach it? " (Corke, 2011).

3.4.1 Kinematic analysis using Denavit-Hartenberg convention

The Denavit-Hartenberg (D-H) convention was used in computing the forward

kinematics of the robotic arm. Based on the procedure for determining the Denavit-

Hartenberg parameters of a serial link manipulator, the D-H parameters of the robot

were determined.

Refer to Figure 3.5. Firstly, frames were attached to each of the links of the

manipulator at the joint positions. The convention for locating frames on the links is

as follows: The Z-axis of frame {i}, Zi , is coincident with the joint axis i. The origin

of frame {i} is located where ai perpendicularly intersects with the joint i axis. Xi

points along ai in the direction from joint i to joint i+1.

In the case of ai = 0, Xi is normal to the plane of Zi and Zi+1. We define αi as being

measured in the right-hand sense about Xi, so we see that the freedom of choosing the

sign of α in this case corresponds to two choices for the direction of Xi. Yi is formed by

the right-hand rule to complete the ith frame.



Figure 3. 5: Attachment of frames to the links of a robotic arm (the figure shows axes i-1 and i, links i-1 and i, the frame axes Xi-1, Yi-1, Zi-1 and Xi, Yi, Zi, and the parameters ai-1, ai, di, θi and αi-1)

Source: Craig (2005)

If the link frames have been attached to the links according to the above convention,

the following definitions of the link parameters are valid:

αi-1 = angle between zi-1 and zi measured about xi-1

ai-1 = distance between zi-1 and zi measured along xi-1

di = distance between xi-1 and xi measured along zi

θi = angle between xi-1 and xi measured about zi

where 'α' is called the link twist, 'a' is called the link length, 'd' is called the link offset

and 'θ' is called the joint variable (in case of revolute joints).

(Craig, 2005).
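For reference, the homogeneous transform that relates frame {i} to frame {i-1} under this modified convention is (Craig, 2005):

[ cosθi             -sinθi              0           ai-1        ]
[ sinθi cosαi-1      cosθi cosαi-1     -sinαi-1    -di sinαi-1  ]
[ sinθi sinαi-1      cosθi sinαi-1      cosαi-1     di cosαi-1  ]
[ 0                  0                  0           1           ]

Multiplying these link transforms for i = 1 to 5 gives the overall forward kinematic transform of the arm.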

A schematic diagram of the robotic arm showing the chosen joint axes is shown in

Figure 3. 6. The D-H parameters of the robotic arm (shown in Table 3.1) were

obtained from the schematic diagram as follows:

for i = 1 : αi-1 = α0 = angle between z0 and z1 measured about x0 = 0°

ai-1 = a0 = distance between z0 and z1 measured along x0 = 0mm

di = d1 = distance between x0 and x1 measured along z1 = 0mm

θi = θ1 = angle between x0 and x1 measured about z1 = θ1

for i = 2 : αi-1 = α1 = angle between z1 and z2 measured about x1 = 90°

ai-1 = a1 = distance between z1 and z2 measured along x1 = 0mm

di = d2 = distance between x1 and x2 measured along z2 = 0mm

θi = θ2 = angle between x1 and x2 measured about z2 = 90°-θ2



Note: Oi is the origin of the ith frame

Figure 3. 6: Schematic of robotic arm showing chosen joint axes



Table 3. 1 : D-H parameters for the Robotic Arm

i αi-1 ai-1 di θi

1 0° 0 0 θ1

2 90° 0 0 90°- θ2

3 0° L2 0 θ3

4 90° L3 0 180° - θ4

5 0° 0 0 180°

for i = 3 : αi-1 = α2 = angle between z2 and z3 measured about x2 = 0°

ai-1 = a2 = distance between z2 and z3 measured along x2 = L2

di = d3 = distance between x2 and x3 measured along z3 = 0mm

θi = θ3 = angle between x2 and x3 measured about z3 = θ3

for i = 4 : αi-1 = α3 = angle between z3 and z4 measured about x3 = 90°

ai-1 = a3 = distance between z3 and z4 measured along x3 = L3

di = d4 = distance between x3 and x4 measured along z4 = 0mm

θi = θ4 = angle between x3 and x4 measured about z4 = 180°-θ4

for i = 5 : αi-1 = α4 = angle between z4 and z5 measured about x4 = 0°

ai-1 = a4 = distance between z4 and z5 measured along x4 = 0mm

di = d5 = distance between x4 and x5 measured along z5 = 0mm

θi = θ5 = angle between x4 and x5 measured about z5 = 180°

The D-H parameters of the robot arm are identical to those of the RA-01 robotic

arm used by Akinwale (2011) for his M.Sc. thesis. The kinematic analysis of the robot

arm using transformation matrices was extensively discussed in Appendices 1 and 2 of

the thesis. According to Akinwale (2011), the location of the central point of the end

effector (gripper) of the robotic arm is given as (x, y, z), where:

x = cosθ1 [L4 cos(θ2 + θ3) - L3 sin(θ2 + θ3) - L2 sin θ2 ]

y = sinθ1 [L4 cos(θ2 + θ3) - L3 sin(θ2 + θ3) - L2 sin θ2 ]

z = L4 sin(θ2 + θ3) + L3 cos(θ2 + θ3) + L2 cos θ2



The above equations represent the forward kinematics of the robotic arm - given the

values of the joint angles, they can be used to obtain the location of the end effector.

The inverse kinematics problem was to solve this equation for θ1, θ2 and θ3 given the

values of x, y and z.
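As an illustration, a minimal C# sketch of these forward kinematic equations is given below. The class and method names are hypothetical and the link lengths are placeholders; angles are in radians.

using System;

static class ArmKinematics
{
    // Link lengths (placeholder values; substitute the measured lengths).
    public const double L2 = 0.118, L3 = 0.1576, L4 = 0.05;

    // Forward kinematics: joint angles t1..t3 (rad) -> gripper-centre position.
    public static (double x, double y, double z) Forward(double t1, double t2, double t3)
    {
        // Common radial term: L4 cos(t2 + t3) - L3 sin(t2 + t3) - L2 sin(t2)
        double r = L4 * Math.Cos(t2 + t3) - L3 * Math.Sin(t2 + t3) - L2 * Math.Sin(t2);
        return (Math.Cos(t1) * r,
                Math.Sin(t1) * r,
                L4 * Math.Sin(t2 + t3) + L3 * Math.Cos(t2 + t3) + L2 * Math.Cos(t2));
    }
}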

According to Akinwale (2011);

θ1 = tan⁻¹(y/x)

θ2 = sin⁻¹[(z·k2 ± k1·√(k1² + k2² - z²)) / (k1² + k2²)]

where k1 = L4 sinθ3 + L3 cosθ3 + L2

k2 = L4 cosθ3 - L3 sinθ3

θ3 = sin⁻¹[(2L2L4·ρ ± 2L2L3·√((2L2L4)² + (2L2L3)² - ρ²)) / ((2L2L4)² + (2L2L3)²)]

where ρ = x² + y² + z² - (L2² + L3² + L4²)

"The three equations above together make up the joint configuration of the robotic

arm to locate the gripper at a particular point in space. With each computation of the

inverse kinematics, a maximum of one solution is possible for θ1, two results for θ3

and four results for θ2. If the values of x, y and z used for computing the inverse

kinematics lie within the workspace of the robotic arm, then there would be four

solutions in each computation. Each solution is called a configuration of the robotic

arm. One of these configurations will definitely place the end-effector of the robotic

arm at the specified location. The other configurations may or may not place the

robotic arm at the specified location." Akinwale(2011).

Thus, a decision was made to implement a feedback algorithm to check if a particular

solution would successfully place the robotic arm at the desired location after finding

the inverse kinematics of the robotic arm during the implementation.

The feedback algorithm runs the obtained inverse kinematics through the forward

kinematics to obtain a location in space after which the location is compared against

the target location specified. The first solution to yield the correct location in space is

chosen. The algorithm is shown in Appendix 1.
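A minimal C# sketch of this feedback check is given below (the method name and tolerance are hypothetical; the full implementation appears in Appendix 1). It uses the Forward method from the earlier sketch.

// Run each candidate joint solution through the forward kinematics and
// return the first one that actually lands on the target point.
static (double t1, double t2, double t3)? SelectSolution(
    (double t1, double t2, double t3)[] candidates,
    double x, double y, double z, double tol = 1e-3)
{
    foreach (var c in candidates)
    {
        var p = ArmKinematics.Forward(c.t1, c.t2, c.t3);
        double err = Math.Sqrt((p.x - x) * (p.x - x)
                             + (p.y - y) * (p.y - y)
                             + (p.z - z) * (p.z - z));
        if (err < tol) return c;   // this configuration reaches the target
    }
    return null;                   // no configuration reaches the target
}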

3.4.2 Kinematic Analysis and Simulation using MATLAB®

MATLAB® was used to graphically illustrate the kinematic analysis of the robotic

arm. A strength of MATLAB® is its support for Toolboxes which are collections of

functions targeted at particular topics. To simplify the task of carrying out the

kinematic analysis with MATLAB, a third party toolbox developed by Peter Corke

was used. The toolbox, Robotics Toolbox for Matlab (release 9.8) simplifies such

tasks as kinematics, dynamics, and trajectory generation for serial arm-type robots

(petercorke.com, 2013).

The Toolbox is based on a very general method of representing the kinematics and

dynamics of serial-link manipulators, known as the Denavit-Hartenberg (D-H)

notation (petercorke.com, 2013).



The D-H parameters (given in Table 3.1) were used to create a vector of 'Link' objects

as shown below:

>> % theta d a alpha joint type D-H type

L(1) = Link ( [0 0 0 0 0 ], 'modified');

L(2) = Link ( [(pi/2) 0 0 pi/2 0 ], 'modified');

L(3) = Link ( [0 0 0.117994 0 0 ], 'modified');

L(4) = Link ( [pi 0 0.157588 pi/2 0 ], 'modified');

L(5) = Link ( [pi 0 0 0 0 ], 'modified');

The joint angles θ are the variables since all the joints are revolute; the values supplied

in the θ column merely act as placeholders for the joint angle variables.

MATLAB returned the following;

>>L
L=
theta=q1, d= 0, a= 0, alpha= 0 (R,modDH)

theta=q2, d= 0, a= 0, alpha= 1.571 (R,modDH)

theta=q3, d= 0, a= 0.118, alpha= 0 (R,modDH)

theta=q4, d= 0, a= 0.1576, alpha= 1.571 (R,modDH)

theta=q5, d= 0, a= 0, alpha= 0 (R,modDH)

The link objects were then passed to the constructor 'SerialLink' as shown below:

>>Ademola01 = SerialLink (L, 'name', 'Ademola01')

It returned a SerialLink object that was displayed thus;



This provides a concise description of the robot. The robot has 5 revolute joints as

indicated by the structure string 'RRRRR', it is defined in terms of modified Denavit-

Hartenberg parameters and gravity is acting in the default z-direction.

The kinematic parameters of the link objects are also listed and the joint variables are

shown as variables q1, q2, q3, q4 and q5. The robot was also assigned a name

(Ademola01) which is shown whenever the robot is displayed graphically.

Although a value was given for θ, it is not displayed: that value simply served as a

placeholder in the list of kinematic parameters. θ is substituted by the joint variable q,

and the value stored for it in the Link object is ignored.

To illustrate different poses of the robotic arm, four different sets of joint angles were

defined and the plot method was invoked. By invoking the plot method, the robot

could be visualized graphically as stick figures. The graphical representation included

the robot's name, the joints and their axes, and a shadow on the ground plane. The

various poses are shown in Figure 3. 7 to Figure 3. 10.


Figure 3. 7: Robotic arm pose corresponding to joint angle set qn

Figure 3. 8: Robotic arm pose corresponding to joint angle set qs

Figure 3. 9: Robotic arm pose corresponding to joint angle set qz

Figure 3. 10: Robotic arm pose corresponding to joint angle set qr

The joint angle set qn was defined as follows;


>> qn = [0 pi/4 pi/2 0 pi/4]

MATLAB returned;
qn =

0 0.7854 1.5708 0 0.7854

The plot method was then invoked as follows;


>> Ademola01.plot (qn)

MATLAB returned a graphical representation of the robot pose shown in Figure 3.7.

The same procedure was repeated for qs = [0 0 -pi/2 0 0] , qz = [0 0 0 0 0] and

qr = [0 pi/2 -pi/2 0 0] and MATLAB returned a graphical representation of the

robot pose corresponding to each set of joint angles as shown in Figures 3.8 to 3.10.

Also, the effect of changing set of joint angles on the pose of the robotic arm was

simulated using the "teach" method in the Robotics Toolbox. The teach method is a

useful tool in the MATLAB® Robotics Toolbox for simulating robots. It is used to

drive (the joints of) a graphical robot by means of a graphical slider panel. The

graphical slider panel provides a means of changing the set of joint angles of the

robotic arm (Corke, 2011). Figure 3. 11 shows a screenshot of the teach method in

use.

Each pose of the robotic arm positions the frame of the end effector in a point in space

that can be represented by a transformation matrix. The goal of the forward kinematic

analysis of the arm using the Robotics toolbox is to obtain this transformation matrix

for any set of joint values. The transformation matrix is a representation of the

position of the end effector of the arm in space relative to a reference frame. The

"fkine" method was used to illustrate this.



Figure 3. 11: Illustration of the use of teach method for simulating a robotic arm

>> T = Ademola01.fkine (qn)

MATLAB returned;

T=

-0.5000 0.5000 0.7071 -0.0280

-0.7071 -0.7071 -0.0000 0.0000

0.5000 -0.5000 0.7071 0.1949

0 0 0 1.0000

The method returns the homogeneous transform that represents the pose of the end-

effector.

The rotation matrix component of T is:


>> R=t2r(T)

R=

-0.5000 0.5000 0.7071

-0.7071 -0.7071 -0.0000

0.5000 -0.5000 0.7071

The significance of this rotation matrix can be further explained by using Euler’s

Three-Angle Representation. It contains information about the manner and sequence

in which a frame having a given orientation was rotated to obtain a different

orientation. Thus;

>> tr2eul(R)

ans =

-0.0000 0.7854 -2.3562



This implies successive rotations of the end effector frame about the z-axis, the

y-axis and the z-axis again.

The translation component of T is a vector:


>> transl(T)'

ans =

-0.0280 0.0000 0.1949

This represents translations along the original x, y and z-axes respectively.

Thus, the position and orientation of the end effector frame corresponding to the final

pose of arm was achieved by;

• a translation of 0.0280 units along the original x-axis in the negative direction

from the origin,

• a translation of 0.0000 units along the original y-axis,

• a translation of 0.1949 units along the original z-axis in the positive direction

from the origin,

• a rotation of the frame by 0.0000 radians about the original z-axis,

• a counter-clockwise rotation of the frame by 0.7854 radians about the y-axis,

and

• a clockwise rotation of the frame by 2.3562 radians about the z-axis.

The same procedure can be repeated for any set of joint angles.
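For completeness, the ZYZ Euler angles returned by tr2eul can be extracted from a rotation matrix directly. A C# sketch (for the non-degenerate case, with zero-based element indices r[row, col]) that reproduces the values above:

using System;

static class Rotations
{
    // Extract ZYZ Euler angles (phi, theta, psi) from a 3x3 rotation matrix,
    // i.e. R = Rz(phi) * Ry(theta) * Rz(psi).
    public static (double phi, double theta, double psi) ToEulerZyz(double[,] r)
    {
        double phi   = Math.Atan2(r[1, 2], r[0, 2]);
        double theta = Math.Atan2(Math.Sqrt(r[0, 2] * r[0, 2] + r[1, 2] * r[1, 2]), r[2, 2]);
        double psi   = Math.Atan2(r[2, 1], -r[2, 0]);
        return (phi, theta, psi);
    }
}

Applied to the matrix R above, this yields (-0.0000, 0.7854, -2.3562), matching the tr2eul output.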

The inverse kinematics of the arm was illustrated and visualised using the ikine

method of the Robotics Toolbox. Assuming the robot end effector is initially located

at a point corresponding to the set of joint angles qr and it is desired to position it at a

point in space described by the homogeneous transform T1, where T1 is given as;

>> T1 = Ademola01.fkine (qs)

T1 =

0.0000 0.0000 -1.0000 0.1180

-0.0000 -1.0000 -0.0000 -0.0000

-1.0000 0.0000 -0.0000 -0.1576

0 0 0 1.0000

The inverse kinematic problem involves obtaining the joint angles (qs2) that

correspond to this position of the end effector. This is effected as follows:

qs2 = Ademola01.ikine ( T1, qr, [1 1 1 1 1 0], 'pinv' )

MATLAB returned:

Warning: Initial joint angles results in near-singular configuration, this may slow

convergence

> In SerialLink.ikine at 139

qs2 =

0.0000 0.0000 -1.5708 0.0000 0.0000

The value of qs2 computed by the ikine method is equal to the original value of qs

(qs = 0 0 -1.5708 0 0).

The argument '[1 1 1 1 1 0]' is a mask matrix which instructs the 'ikine' method to

ignore the axis with zero value in arriving at the solution for the inverse kinematics.

This was necessary because the robotic arm has five degrees of freedom as opposed to

six degrees of freedom for standard robotic arms.

The argument 'pinv' instructs the toolbox to use pseudo-inverse method instead of

Jacobian transpose method in arriving at the solution.

3.5 Determination of Torque Requirement of Joints

To effect motion of the links of a robotic arm, an actuator is fixed at each joint of the

arm. The actuator applies torque at the joint to overcome the resistance of the link to

motion. Resistance of any link to motion is due to gravity and inertia effects.

The effect of gravity on any link of the robot is to pull and accelerate it towards the

centre of the earth due to its own weight, thus exerting a resistive torque on it.

Therefore, a portion of the torque generated by the actuator is needed to overcome this

resistive torque (due to gravity). It was necessary to calculate the value of the gravity-

induced resistive torque acting on each link of the arm so that an actuator with

sufficient torque rating could be selected for each joint.

The resistive torque exerted on a joint due to gravity acting on the robot depends very

strongly on the robot’s pose. Intuitively the torque on the shoulder joint is much

greater when the arm is stretched out horizontally. Thus, in order to calculate the

torque required at each joint, the worst case scenario (arm completely stretched out

horizontally) was chosen, since the joints are subjected to the highest torque in this

pose (Corke, 2011).

Figure 3. 12 shows a labelling of the lengths of the links of the robot arm while a free-

body-diagram (FBD) of the stretched-out arm is shown in Figure 3. 13.



Figure 3. 12: Labelling of lengths of the robotic arm links



Figure 3. 13: Free-body-diagram of robot arm in stretched out pose



From Figure 3. 13:

J1 = Waist or base joint

J2 = shoulder joint

J3 = elbow joint

J4 = wrist joint

J5 = gripper joint

W1 = Weight of link L1

W2 = Weight of link L2

W3 = Weight of link L3

W4 = Weight of link L4

Wgripper = Weight of gripper

Wpayload = Weight of payload

Wj2 = Weight of joint 2

Wj3 = Weight of joint 3

Wj4 = Weight of joint 4

Wj5 = Weight of joint 5

The calculation of the resistive torque exerted on each joint due to gravity effect is as

follows:

Let resistive torque at joint 1 due to gravity = T1g

Let resistive torque at joint 2 due to gravity = T2g

Let resistive torque at joint 3 due to gravity = T3g

Let resistive torque at joint 4 due to gravity = T4g

Let resistive torque at joint 5 (gripper joint) due to gravity = T5g



T1g = 0 N.m since waist rotation does not cause motion of any link in the vertical

plane (i.e. against gravity).

T2g = W2(L2/2) + Wj3·L2 + W3(L2 + L3/2) + (Wj4 + W4 + Wj5 + Wgripper + Wpayload)(L2 + L3)

T3g = W3(L3/2) + (Wj4 + W4 + Wj5 + Wgripper + Wpayload)·L3

T4g = 0 N-m since wrist rotation does not result in motion against gravity

T5g = 0 N-m since opening and closing of gripper jaws does not result in motion
against gravity
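As a sketch of how these expressions translate into code, the elbow-joint gravity torque T3g might be computed as follows in C#. The method name is hypothetical, the mass and length arguments are placeholders, and g = 9.81 m/s² is assumed.

// Gravity-induced resistive torque at the elbow joint:
// T3g = W3(L3/2) + (Wj4 + W4 + Wj5 + Wgripper + Wpayload) * L3
static double ElbowGravityTorque(
    double mL3, double mJ4, double mL4, double mJ5,
    double mGripper, double mPayload, double l3)
{
    const double g = 9.81;                        // m/s^2
    double w3 = mL3 * g;                          // weight of link L3
    double wBeyond = (mJ4 + mL4 + mJ5 + mGripper + mPayload) * g;
    return w3 * (l3 / 2.0) + wBeyond * l3;        // torque in N.m
}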

The resistance a body offers to a change in its rotational motion is called its moment of

inertia (I). Thus, the difficulty of changing the angular motion of a body about an

axis is measured by calculating its moment of inertia about that particular axis. The

moment of inertia plays the same role in rotation as mass does in translational

motion: it determines the angular acceleration produced by an applied torque (Jha,

2006).

Thus, for a given angular acceleration of any link of the robot, the inertia of the link

results in a resistive torque applied at the joint where the lower end of the link is

connected. The value of this torque (Ti) is given by the product of the moment of

inertia of the link and its angular acceleration, α (Jha, 2006):

Ti = Iα

According to Benenson et al. (2006), the moment of inertia of a body depends on its

shape and mass distribution as well as the orientation of its rotational axis, thus

necessitating the use of a different formula to calculate the moment of inertia of each

link of the robotic arm.

Considering the fact that the links of the robot are not simple geometrical shapes, thus

resulting in complex mathematical formulae, a decision was made to determine the

moments of inertia using CAD (computer-aided design) software. Thus, the

3-D model of the robot arm (created with AutoCAD®) was exported into Autodesk®

Inventor® Professional. Using the "iproperties" feature of Autodesk® Inventor®

Professional, the moments of inertia of the various parts of the robot were determined.

Figure 3. 14 illustrates the use of the CAM software to determine the moments of

inertia of link L2 (in an upright position) with respect to X, Y and Z-axes (i.e. IXX, IYY

and IZZ).

From the figure;

IXX = 0.164 lbm·in² = 0.164 × 2926.4 g·cm² = 479.929 g·cm²

IYY = 0.226 lbm·in² = 0.226 × 2926.4 g·cm² = 661.366 g·cm²

IZZ = 0.072 lbm·in² = 0.072 × 2926.4 g·cm² = 210.701 g·cm²

Using the same procedure, IXX, IYY and IZZ were evaluated for the various components

of the robotic arm.

A decision was made to design the robot such that each motor rotates in one-degree

steps every 20 milliseconds, which is equivalent to an angular velocity of 50

degrees/second or 0.873 rad/s. Although the motors are capable of rotating at a higher

speed, the moderate speed selected reduces the rate of wear and tear on the joints

of the robotic arm.

Figure 3. 14: Autodesk® Inventor® Professional iproperties™ displaying the moments


of inertia of link L2

Also, a decision was made to design the robot such that each link accelerates from rest

(0 rad/s) to the design angular velocity of 0.873 rad/s in 0.5 seconds, which

translates into an angular acceleration of 1.746 rad/s². This is the acceleration

required for the arm to move from a rest position.

Thus, the calculation of the resistive torque exerted on each joint due to inertia effect

is as follows:

Let resistive torque at joint 1 due to inertia effect = T1i

Let resistive torque at joint 2 due to inertia effect = T2i

Let resistive torque at joint 3 due to inertia effect = T3i

Let resistive torque at joint 4 due to inertia effect = T4i

Let resistive torque at joint 5 (gripper joint) due to inertia effect = T5i

Recall from Figure 3. 13:

J1 = Waist or base joint

J2 = shoulder joint

J3 = elbow joint

J4 = wrist joint

J5 = gripper joint

A motor (actuator) is placed at each of these joints. Thus, let;

msm = mass of shoulder motor

mem = mass of elbow motor

mwm = mass of wrist motor

mgm = mass of gripper motor



mL1 = mass of link L1

mL2 = mass of link L2

mL3 = mass of link L3

mL4 = mass of link L4

mgripper = mass of gripper assembly

Ism = moment of inertia of shoulder motor

Iem = moment of inertia of elbow motor

Iwm = moment of inertia of wrist motor

Igm = moment of inertia of gripper motor

Igripper = moment of inertia of gripper assembly

Ibase = moment of inertia of rotating base and shoulder motor bracket assembly

IL1 = moment of inertia of link L1

IL2 = moment of inertia of link L2

IL3 = moment of inertia of link L3

The values of the moments of inertia obtained from Autodesk® Inventor® Professional

are about the axes passing through each body's centre of mass. In the case of a robotic

arm, the moment of inertia must take into consideration that the part is being

rotated about a pivot point (the joint in this case) located a distance away from the

centre of mass. To evaluate the moments of inertia about the joint axes, the parallel

axis theorem was applied.

Consider a body of mass m, which is made to rotate about an axis z passing through

the body's centre of mass (as shown in Figure 3. 15). Let the body have a moment of

inertia Icm with respect to this axis.



Figure 3. 15: Diagram to illustrate the Parallel Axis Theorem



The parallel axis theorem states that if the body is made to rotate instead about a new

axis z′ which is parallel to the first axis and displaced from it by a distance h, then the

moment of inertia I with respect to the new axis is related to Icm by;

I = Icm + mh2

...where h is the perpendicular distance between the axes z and z′ (Jha, 2006).
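Two one-line C# helpers capture this (a sketch; I in kg·m², m in kg, h in m, alpha in rad/s²):

// Parallel axis theorem: moment of inertia about an axis displaced a
// distance h from the parallel axis through the centre of mass.
static double AboutJointAxis(double iCm, double m, double h) => iCm + m * h * h;

// Resistive torque due to inertia at angular acceleration alpha:
static double InertiaTorque(double iJoint, double alpha) => iJoint * alpha;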

Thus, for each joint, the moment of inertia of each body pivoted about it was calculated

by adding to the body's centroidal moment of inertia the product of its mass (m) and

the square of the perpendicular distance (h) between the body's centre of mass and the

joint axis.

Thus;

T1i = [(Izz,base + mbase·h11²) + (Izz,sm + msm·h12²) + (Izz,L2 + mL2·h13²) + (Izz,em + mem·h14²) + (Izz,L3 + mL3·h15²) + (Izz,wm + mwm·h16²) + (Izz,gripper + mgripper·h17²) + (Izz,gm + mgm·h18²)]α

NB: T1i was calculated with the arm in a stretched out pose (worst case scenario).

h11 = perpendicular distance between z-axes through the centroids of base motor horn

and rotating base-shoulder servo bracket assembly

h12 = perpendicular distance between z-axes through the centroids of base motor horn

and shoulder motor

h13 = perpendicular distance between z-axes through the centroids of base motor horn

and link L2 (stretched out)

h14 = perpendicular distance between z-axes through the centroids of base motor horn

and elbow motor



h15 = perpendicular distance between z-axes through the centroids of base motor horn

and link L3 (stretched out)

h16 = perpendicular distance between z-axes through the centroids of base motor horn

and wrist motor

h17 = perpendicular distance between z-axes through the centroids of base motor horn

and gripper assembly (pointing downwards)

h18 = perpendicular distance between z-axes through the centroids of base motor horn

and gripper motor

T2i = [(Ixx,L2 + mL2·h21²) + (Ixx,em + mem·h22²) + (Ixx,L3 + mL3·h23²) + (Ixx,wm + mwm·h24²) + (Ixx,gripper + mgripper·h25²) + (Ixx,gm + mgm·h26²)]α

h21 = perpendicular distance between x-axes through the centroids of shoulder motor

horn and link L2

h22 = perpendicular distance between x-axes through the centroids of shoulder motor

horn and elbow motor

h23 = perpendicular distance between x-axes through the centroids of shoulder motor

horn and link L3

h24 = perpendicular distance between x-axes through the centroids of shoulder motor

horn and wrist motor

h25 = perpendicular distance between x-axes through the centroids of shoulder motor

horn and gripper assembly (pointing forward)

h26 = perpendicular distance between x-axes through the centroids of shoulder motor

horn and gripper motor



T3i = [(Ixx,L3 + mL3·h31²) + (Ixx,wm + mwm·h32²) + (Ixx,gripper + mgripper·h33²) + (Ixx,gm + mgm·h34²)]α

h31 = perpendicular distance between x-axes through the centroids of elbow motor

horn and link L3

h32 = perpendicular distance between x-axes through the centroids of elbow motor

horn and wrist motor

h33 = perpendicular distance between x-axes through the centroids of elbow motor

horn and gripper

h34 = perpendicular distance between x-axes through the centroids of elbow motor

horn and gripper motor

T4i = [(Ixx,gripper + mgripper·h41²) + (Ixx,gm + mgm·h42²)]α

h41 = perpendicular distance between y-axes through the centroids of wrist motor

horn and gripper assembly

h42 = perpendicular distance between y-axes through the centroids of wrist motor

horn and gripper motor

3.6 Motor Selection

A decision was made to use servomotors (commonly called servos) to provide the

required torque at each joint location. The decision was made due to the ability of

servos to control the angular position, velocity and acceleration of the output shaft in a

precise manner.

With ordinary DC (Direct Current) motors, it is possible to control whether they are

on or off, as well as their speed, but controlling the direction or position takes more

work and more components. RC servo motors have built into them the components

needed to control the motor's position precisely. Inside the servo is a combination of

gears and a potentiometer, with circuitry that makes it possible to set its position fairly

quickly and precisely within a 180-degree range of travel (Sullivan and Igoe,

2004).

The various parts of a standard servomotor are shown in Figure 3. 16.

Using the formulae derived in the previous section, the resistive torque expected to act

at each joint was evaluated and a motor (servo) with an appropriate torque rating was

selected for each joint.

3.6.1 Gripper joint

A standard mini servo gripper template from Lynxmotion® Robotics was used to

implement the gripper design. It is based around two counter-rotating gears which use

levers to make the grippers open and close laterally (Lynxmotion.com, 2013).

The design uses a Hitec® HS-225MG servo. Thus, a Hitec® HS-225MG servo was

selected to provide the needed torque for the opening and closing of the gripper jaws.

Note: The 'MG' in the servo name is an abbreviation for 'Metal Gear'. It

indicates that the drive gears of the servo are made of metal, as opposed to the plastic

used in most standard servos.


Figure 3. 16: Components of a standard servo (output spline, drive gears, servo case, control circuit, potentiometer and motor)

Source: induino.com

3.6.2 Wrist joint

Total resistive torque at wrist joint = T4g + T4i = 6.012 × 10⁻⁵ N.m or 6.012 × 10⁻⁴ kg.cm

Thus a Hitec® HS-225MG servo with a torque rating of 3.9kg.cm was selected to

provide sufficient torque at the joint.

3.6.3 Elbow Joint

Total resistive torque at elbow joint = T3g + T3i

T3g = 0.2707N.m

T3i = 0.0049N.m

Hence;

Resistive torque at elbow joint = 0.2707N.m +0.0049N.m = 0.2756N.m or

2.756kg.cm. Thus a Hitec® HS-422 standard servo with a torque rating of 3.30kg.cm

at 4.8volts was selected to provide the torque needed at the joint location.

The extra torque of 0.544kg.cm caters for the extra resistive torque due to inertia of

the payload, servo bearing friction, irregularities in the robot frame design and future

modifications of the arm design.

3.6.4 Shoulder Joint

Total resistive torque at shoulder joint = T2g + T2i

T2g = 0.5903N.m

T2i = 0.0152N.m

Hence;

Resistive torque at shoulder joint = 0.6055N.m or 6.055kg.cm



Thus a Hitec® HS-645MG Ultra Torque servo with a torque rating of 7.70kg.cm at

4.8volts was selected to provide the torque needed at the joint location. The extra

torque of 1.645kg.cm caters for the extra resistive torque due to inertia of the payload,

servo bearing friction and irregularities in the robot frame design.

3.6.5 Base Joint

Total resistive torque at base joint = T1g + T1i = 0.0304N.m or 0.3040kg.cm

Thus a Hitec® HS-422 servo with a torque rating of 3.30kg.cm at 4.8volts was

selected to provide the torque needed at the joint location.

The properties of the selected servos are highlighted in Table 3.2 while their pictures

are shown in Figure 3. 17.

3.7 Servo Controller Selection

A servo motor is controlled by sending a series of pulses to it, known as a Pulse

Width Modulation (PWM) signal. A pulse is sent to the servo every 20 milliseconds.

Depending on the pulse width (duration of the pulse), the motor spline or shaft moves

to a position within a 180-degree range. For a Hitec® servo, a 0.9 millisecond (900 µs)

pulse moves the motor to 0 degrees and a 2.1 millisecond (2100 µs) pulse moves it to 180

degrees. Any pulse between 900 µs and 2100 µs moves the motor to a position

proportionally between 0 and 180 degrees (Sullivan and Igoe, 2004).
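This linear mapping is straightforward to express in code. A C# sketch for the quoted Hitec range (actual servos vary slightly):

// Map a servo angle in degrees (0..180) to a pulse width in microseconds:
// 900 us -> 0 degrees, 2100 us -> 180 degrees, linear in between.
static double AngleToPulseUs(double degrees) =>
    900.0 + (degrees / 180.0) * (2100.0 - 900.0);

// e.g. AngleToPulseUs(90) returns 1500, the centring pulse used in Chapter 4.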

A servo controller is a circuit arrangement that generates the PWM signals that control

the servo motors based on instructions from a micro-controller or a computer.



Table 3. 2 : List of selected servos and their specifications


Source: hitecrcd.com

Joint    Servo    Torque at 4.8 V (kg.cm)    Mass (g)    Dimensions, L x W x H (mm)

Gripper HS-225MG 2.60 27.94 32.26 x 16.76 x 31.00

Wrist HS-225MG 2.60 27.94 32.26 x 16.76 x 31.00

Elbow HS-422 3.30 45.36 40.39 x 19.56 x 36.58

Shoulder HS-645MG 7.70 55.00 40.39 x 19.56 x 37.59

Base HS-422 3.30 45.36 40.39 x 19.56 x 36.58



Figure 3.17a Hitec HS-225MG servo

Figure 3.17b Hitec HS-422 servo Figure 3.17c Hitec HS-645MG servo

Figure 3. 17 : Pictures of selected servos for the robot arm


Source: servodatabase.com

Three servo controllers were considered for the implementation of this project - the

Pololu® Micro Maestro 6-Channel USB (Universal Serial Bus) servo controller,

Images Company's® SMC-05A servomotor controller and Lynxmotion's® SSC-32

servo controller.

The Pololu® Micro Maestro is a highly versatile servo controller. It supports three

control methods: USB for direct connection to a computer, TTL (transistor-

transistor logic) serial for use with embedded systems, and internal scripting for self-

contained, host controller-free applications (Pololu.com, 2013).

Its extremely precise, high-resolution servo pulses have a jitter of less than 200 ns,

making it well suited for high-performance robotics applications, and its in-built speed

and acceleration control for each channel make it easy to achieve smooth, seamless

movements without requiring the control source to constantly compute and stream

intermediate position updates to it (Pololu.com, 2013) . Thus, it is ideal for this

project.

Also, an e-mail was sent to the manufacturer to ascertain its compatibility with Hitec®

servo motors (which had been selected for this project), and the company's technical

support team verified its compatibility.

Images Company's® SMC-05A servomotor controller on the other hand was

ascertained from the manufacturer's website to be compatible with Hitec® servos. It

allows manual and PC (Personal Computer) control of five servomotors (Hitec® and

Futaba® brands). Servomotors may be controlled manually, via on-board switches, or

by a Windows program through a serial communication port (RS232) on a Windows

PC. Universal three position headers make it easy to connect servomotors by just

plugging them into the board (Imagesco.com, 2013).



The SSC-32 (serial servo controller) is a small preassembled servo controller with 32

channels of 1 µs resolution servo control, thus enabling accurate positioning of the

servo output shaft. The range of its generated Pulse Width Modulation (PWM) signal

is 0.50ms to 2.50ms for a range of about 180° and it features an Atmel ATMEGA168-

20PU microcontroller. It also supports Futaba® and Hitec® servos. (Lynxmotion.com,

2013).

The three servo controllers could suffice for this project but due to the high cost of

the SMC-05A ($165.95 as opposed to $39.95 for the SSC-32 servo controller and

$19.95 for the Pololu® Maestro), a decision was made not to use it for the project

implementation. The Pololu® Micro Maestro was preferred to the SSC-32 because it

has a USB Software Development Kit that allows a less tedious programming with

Microsoft's C# programming language. Also, it can be directly connected to the host

PC (using the USB connection feature) without the need for a USB-to-serial adapter.

Thus, a decision was made to implement the robotic arm with the Pololu® Micro

Maestro servo controller (shown in Figure 3. 18).

3.8 Interfacing Software

To enable control of a robot from a PC (Personal Computer), interfacing software

that enables the operator to send instructions to the servo controller is needed. As

earlier stated, the Pololu® Micro Maestro has a USB Software Development Kit that

allows easy programming with Microsoft's C# programming language.

Thus, a decision was made to interface the servo controller with the host PC and also

to implement the inverse kinematics of the robotic arm (thus making it possible to

position the gripper by specifying X-Y-Z coordinates of a desired position) using C#



Figure 3. 18: Pololu's® Micro Maestro 6-channel servo controller


Source: http://www.pololu.com/picture/view/0J1951

programming language. The flowchart and C# code for the interfacing software and

the implementation of the inverse kinematics are shown in Appendix 1.
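For orientation, a fragment in the style of Pololu's MaestroEasyExample (on which the appendix code is modelled) is sketched below; it is not the full program of Appendix 1. Maestro targets are specified in units of quarter-microseconds, so a 1500 µs pulse corresponds to a target value of 6000.

using System;
using Pololu.Usc;           // Pololu USB Software Development Kit
using Pololu.UsbWrapper;

static class MaestroHelper
{
    // Command one Maestro channel to a given pulse width (in microseconds).
    public static void SetPulseUs(byte channel, double pulseUs)
    {
        var devices = Usc.getConnectedDevices();
        if (devices.Count == 0)
            throw new Exception("No Maestro servo controller found.");

        using (var usc = new Usc(devices[0]))
        {
            // setTarget expects quarter-microsecond units.
            usc.setTarget(channel, (ushort)(pulseUs * 4));
        }
    }
}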

3.9 Final Arm Model

The final robotic arm model with the selected servos included is shown in Figures 3. 19

and 3. 20, while Figure 3. 21 shows the components that make up the arm structure.

The model was created with Autodesk® Inventor® Professional 2013 (Student

Version).

3.10 Cost

Table 3.3 gives a breakdown of the cost estimate of the robotic arm.

Figure 3. 19: Final model of the robotic arm
(Modelled and rendered with Autodesk® Inventor® Professional)

Figure 3. 20: Final model of the robotic arm (upright pose)
(Modelled and rendered with Autodesk® Inventor® Professional)

Figure 3.21a: Base board
Figure 3.21b: Round base top
Figure 3.21c: Rotating base
Figure 3.21d: Shoulder servo bracket (left hand side)
Figure 3.21e: Shoulder servo bracket (right hand side)
Figure 3.21f: Link L2
Figure 3.21g: Link L3
Figure 3.21h: Wrist rotate servo holder
Figure 3.21i: Gripper
Figure 3.21j: Gripper (cont'd)

Figure 3. 21: Components of the robotic arm
(Modelled and rendered with Autodesk® Inventor® Professional)

Table 3. 3 : Cost Estimate of Robotic Arm

Source: www.robotshop.com

Component                                        Quantity   Price ($)   Amount (₦)
HS-645MG servo motor                                 1        31.49      5,047.85
HS-422 servo motor                                   2        19.98      3,202.79
Servo extender cable (30 cm)                         3         5.25        841.58
Pololu® micro maestro 6-channel USB
  servo controller (assembled)                       1        19.95      3,197.99
Lynxmotion injection-molded servo hinge              2        11.00      1,763.30
HS-225MG high performance mini servo                 2        51.98      8,332.39
USB A to mini-B cable (1.3 m)                        1         1.95        317.39
Wall adapter (6 V, 2 A)                              1        19.95      3,197.99
Wiring harness with wall pack connector              1         4.95        793.49
Perspex sheet (2 m x 2 m)                            1                   1,750.00
Shipping                                                      77.88     12,484.16
Total cost                                                              40,928.93

CHAPTER FOUR

RESULTS AND DISCUSSION

The robotic arm was set up based on the procedure outlined in the preceding chapter.

The components of the arm were cut from acrylic plastic sheet, as earlier decided, with

the use of a fret saw and a hand-drilling machine. However, the acrylic plastic used

had a thickness of 3 mm as opposed to the 4 mm stated in the preceding chapter. This was

due to the non-availability of 4 mm thick acrylic sheet in the nearby market. It was

however an advantage, since the 3 mm sheet still had adequate strength and stiffness to

bear the imposed load, and the increased lightness reduced the load the servos

had to bear. To further reduce the load imposed on the shoulder and elbow servos, a

load-balancing spring was added to the elbow joint of the robotic arm.

The fabricated robotic arm components were assembled using the 3D model as guide.

The servos were inserted at the various joints in such a way that centring all the servo

horns would make the arm assume an upright pose, as shown in Figure 4.1. The

centring of the servo horns was achieved by connecting the servos to the Pololu®

Micro Maestro™ servo controller and sending PWM (Pulse Width Modulation) signals

of 1500 µs to them from the Pololu® Maestro™ Control Centre. The set-up is

illustrated in Figure 4.2.

The servo controller was connected to the PC running the Pololu® Maestro™ Control

Centre via a USB cable which also supplied enough current to power the servo

controller board. The servos on the other hand were powered using a 3-Ampere

regulated dc power pack.



Figure 4. 1 : Robotic arm in upright pose



Figure 4. 2 : Set-up for the centring of servo horn



After the robotic arm was assembled, each servo powering the arm was connected to

the Pololu® Micro Maestro™ servo controller. This was achieved by inserting the

female outlet of each servo cable into the 3-pin headers on the servo controller. Servo

extender cables were used to increase the length of the wrist-rotate and gripper servo

cables because those servos were far from the servo controller. The control interface

software was then deployed.

The interfacing software (shown in Figure 4.3) requires the user to enter the desired

coordinates of the centre of the gripper, after which the user clicks a button labelled "click to

compute inverse kinematics" to compute the inverse kinematics of the robotic arm.

On clicking the button, the angle through which each joint will rotate in order to

position the gripper at the specified coordinate is displayed if the specified coordinates

are within the robotic arm workspace. If the specified coordinates are not within the

workspace of the robotic arm, the user is notified on clicking the button. The user then

clicks the button labelled "click to position gripper" to move the gripper to the

specified coordinates.

The robotic arm set-up was tested by inputting a series of coordinates into the control

software. The resulting position of the gripper centre for each set of coordinates was

measured. To ensure ease of measurement, a calibrated cardboard was placed behind

the robotic arm to measure the y and z coordinates, while another one was placed

below the robotic arm to measure the x coordinates. A picture of the set-up is shown

in Figure 4.4 while the test results are presented in Table 4.1.

Figure 4. 3 : GUI of interfacing software for controlling robotic arm



Figure 4. 4 : Setup for measuring coordinates of gripper position



Table 4. 1 : Coordinate input test result for robotic arm

Input Coordinates (cm) Output Coordinates (cm) Deviation (cm)


x y z x y z |Δx| |Δ y| |Δ z|

10 4 19 11 5 19 1 1 0

5 4 23 5 3 25 0 1 2

8 -7 19 10 -7 19 2 0 0

8.5 0 33.5 9 0 34 0.5 0 0.5

14.9 0 10.1 17 0 11 2.1 0 0.9

8.5 -23 9 12 -21 7 3.5 2 2

10 12 18 12 13 17 2 1 1

9 -7 27 10 -7 25 1 0 2

8 -23 10 9 -23 9 1 0 1

10 7 28 11 7 27 1 0 1

15 -7 15 15 -6 15 0 1 0

15 15 15 14 16 14 1 1 1

10 -20 8 11 -21 7 1 1 1

12 -10 13 13 -11 12 1 1 1

11 12 14 13 11 14 2 1 0

12 16 10 14 15 9 2 1 1

To demonstrate an area of application of the robotic arm in manufacturing industries,

software that enables the user of the robotic arm to easily instruct it to perform a

repeated pick-and-place task was developed. A snapshot of the software GUI (Graphic

User Interface) is shown in Figure 4.5, while the source code, written in C#, is shown in

Appendix 2. Figures 4.6 and 4.7 show the robotic arm performing a pick-and-place

task.

Another program, which allows the user to control the robotic arm by using sliders to

move the joints, was also developed. A snapshot of its GUI (Graphic User

Interface) is shown in Figure 4.8, while the source code, written in C#, is shown in

Appendix 3.

Figure 4. 5 : GUI of interfacing software for controlling robotic arm (pick-and-place


task)

Figure 4. 6 : Robotic arm performing a pick-and-place task (a)



Figure 4. 7 : Robotic arm performing a pick-and-place task (b)



Figure 4. 8 : GUI of alternate interfacing software for controlling robotic arm (using
sliders)

CHAPTER FIVE

CONCLUSION AND RECOMMENDATIONS

The objective of this study - the fabrication of a robotic arm that can be used for

demonstrative and educational purposes - has been achieved. The study has further

shown that the robotic arm can be reprogrammed in order to adapt it to a variety of

tasks. This could prove to be very useful in an industrial environment, especially in

manufacturing, packaging and mining industries. It is believed that the use of the arm

for educational and instructional purposes will stimulate the interest of students in the

field of robotics, especially in a developing country like Nigeria, where robotics is yet

to enter an advanced stage.

There is however room for improvement in the design and implementation of the

robotic arm system as outlined below:

• The number of degrees of freedom of the robotic arm can be increased in

order to expand its workspace, thereby making it more versatile.

• Motors with higher torque ratings can be used to power the joints so as to

ensure that the robotic arm remains in position even when electric current is

not supplied to the motors.

• Wireless control of the robotic arm using Bluetooth™ or Wi-Fi™ (IEEE

802.11x) technology can be implemented.

• Object detection and collision avoidance can be implemented by adding

proximity sensors to the robotic arm.

The inclusion of coursework in robotics in the department's curriculum is also

highly recommended.

REFERENCES

Adebola, S. O. (2012). 'Design of an improvised robot arm system'. BSc. thesis,


Obafemi Awolowo University, Ile-Ife.

Adekoya, A. A. (2001). Fuzzy Control Design of a Robotic Arm. BSc. Thesis,


Obafemi Awolowo University, Department of Mechanical Engineering, Ile-
Ife.

Adhikari, U., Islam, M. M., Noser, J., & Puliafico, K. (2010). 'Elderly Assist Robot'.
Worcester Polytechnic Institute, Worcester.

Akinwale, O. B. (2011). 'Development of a robust iLab platform for conducting

robotic arm experiments'. M.Sc. thesis, Obafemi Awolowo University,
Department of Electronic and Electrical Engineering, Ile-Ife.

Angelo, J. A. (2007). Robotics: A Reference Guide to the New Technology. Westport:


Greenwood Press.

Benenson, W., Harris, J. W., Stöcker, H., & Lutz, H. (2006). Handbook of Physics.
New York: Springer Publishing.

Blackmore, M., Furniss, J., & Ochsner, S. (2009). '5 DOF Robotic Arm'. Portland State
University, Embedded Robotics, Portland.

Boedeker.com. (2013). Acrylic PMMA (Polymethyl-Methacrylate) Specifications.


Boedeker Plastics, Inc.: http://www.boedeker.com/acryl_p.htm. Accessed
August 19, 2013.

Carlson, M., Donley, E., Graf, K., & Jones, T. (2013). Helping Hand : Senior Design
Final Documentation. Orlando: University of Central Florida.

Chen, C., Huang, H., & Wan, T. (2006). 'Omnidirectional Mobile Home Care Robot'.
National Chung-Hsing University, Department of Electrical Engineering,
Taichung.

Corke, P. (2011). Robotics, Vision and Control Fundamental Algorithms in MATLAB.


(B. Siciliano, O. Khatib, & F. Groen, Eds.) Berlin: Springer-Verlag.

Craig, J. J. (1986). Introduction to Robotics-Mechanics and Control (1st ed.). MA:


Addison-Wesley.

Craig, J. J. (2005). Introduction to Robotics-Mechanics and Control (3rd ed.). (M. J.


Horton, Ed.) Upper Saddle River, USA: Pearson Prentice Hall.

Denavit, J., & Hartenberg, R. S. (1955). Kinematic notation for lower-pair


mechanisms based on matrices. Journal of applied mechanics , 23, 215-221.

Fadahunsi, O. O. (1998). Geometric Modelling : Application to the Simulation of a


Robot's Manipulator. BSc. Thesis, Obafemi Awolowo University, Department
of Mechanical Engineering, Ile-Ife.

FANUC. (2013). FANUC's History. FANUC: http://www.fanuc.co.jp/en/profile/history/.
Accessed October 14, 2013.

Herman, P., Mosteoyz, A. R., & Muri, A. C. (2012). 'A low-cost custom-built robotic
arm on the TurtleBot platform'. University of Zaragoza, Dpt. de Informática e
Ingeniería de Sistemas, Zaragoza.

Honda Robotics. (2013). Honda Robotics: http://asimo.honda.com/Inside-ASIMO/.
Accessed July 13, 2013.

Images SI Inc. (2013). Servobotics Robotic Arm Model RA-02. Images Scientific
Instruments: http://www.imagesco.com/robotics/arm.html. Accessed
October 19, 2013.

Imagesco.com. (2013). SMC-05 Servo Motor Motion Controller. Images Scientific


Instruments: http://www.imagesco.com/servo/smc05.html. Accessed August
24, 2013.

Ishola, B. I. (2012). Final Report on The Development of Strength of Materials


Remote Laboratory: A Non-EE/CS Ilab For Static Bending Test. BSc.
Thesis, Obafemi Awolowo University, Department of Electronic and
Electrical Engineering, Ile-Ife.

Jenne, F. (2009). 'Combined control of a mobile robot and a robot arm'. Swiss Federal
Institute of Technology , Autonomous Systems Laboratory, Zurich.

Jha, D. K. (2006). Dynamics of Rigid Bodies. In D. K. Jha, Text Book Of Rotational


Mechanics (pp. 12-17). Delhi: Discovery Publishing Pvt.Ltd.

KUKA. (2013). History. KUKA: http://www.kuka-robotics.com/usa/en/company/


group/milestones/1973.htm. Accessed September 5, 2013.

KUKA. (2013). KUKA youBot 5-degree-of-freedom arm with 2-finger gripper.

KUKA youBot store: http://www.youbot-store.com/youbotstore/youbots/products/kuka-youbot-5-degree-of-freedom-arm-with-2-finger-gripper.
Accessed October 19, 2013.

Lynxmotion. (2013). AL5B Robotic Arm Combo Kit (with RIOS Software). Lynxmotion:
Imagine it. Build it. Control it.: http://www.lynxmotion.com/p-673-al5b-roboticarm-combo-kit-with-rios-software.aspx. Accessed October 19, 2013.

Lynxmotion.com. (2013). SSC-32 Servo Controller. Lynxmotion-Imagine it, build it,


control it: http://www.lynxmotion.com/p-395-ssc-32-servo-controller.aspx.
Accessed August 24, 2013.

Mataric, M. J. (2007). The Robotics Primer (Intelligent Robotics and Autonomous


Agent series) . London: MIT Press.

Okedeyi, S. B. (1998). Dynamic Simulation of a Two-Link Constrained Robotic


Manipulator. BSc.Thesis, Obafemi Awolowo University, Department of
Mechanical Engineering, Ile-Ife.

Patil, C., Sachan, S., Singh, R. K., Ranjan, K., & Kumar, V. (2009). Self and Mutual
learning in Robotic Arm, based on Cognitive systems. West Bengal: Indian
Institute of Technology Kharagpur.

Paul, R. P. (1981). Robot Manipulators: Mathematics, Programming, and Control.


Cambridge: MIT Press.

petercorke.com. (2013). Robotics Toolbox. PeterCorke.com:http://petercorke.com/


Robotics_Toolbox.html. Accessed August 24, 2013.

Pololu.com. (2013). Micro Maestro 6-Channel USB Servo Controller (Assembled).


Pololu Robotics and Electronics: http://www.pololu.com/catalog/product/1350
Accessed August 24, 2013.
116

Polymech. (2012). Acrylic. Polymech Liquid and Plastic Solutions:http://www.poly


mech.com.au/engineering-plastics/66-acrylic. Accessed August19, 2013.

Ryan, D. (2012). History MV for Industrial Robots. In D. Ryan, E - Learning


Modules: Dlr Associates Series (pp. 28-29). Bloomington: AuthorHouse.

Siciliano, B., & Khatib, O. (2008). Springer Handbook of Robotics (illustrated Edition
ed.). New York: Springer.

Siciliano, B., Sciavicco, L., Villani, L., & Oriolo, G. (2009). Robotics : Modelling,
Planning and Control. London: Springer-Verlag London Ltd.

Spong, M. W., & Vidyasagar, M. (1989). Robot Dynamics and Control (1st ed.).
Hoboken: John Wiley and Sons.

Sullivan, D. O., & Igoe, T. (2004). Physical Computing : Sensing and Controlling the
Physical World with Computers. Boston: Thomson Course Technology PTR.

The Chambers Dictionary (12th ed.). (2011). Chambers Harrap Publishers Ltd.
117

APPENDIX 1

Flowchart for Inverse Kinematics Program

Start

1. Input the desired gripper coordinates x, y and z.
2. Compute the base rotation angle a using the inverse kinematics equations.
3. If a > 90 or a < -90, display an error message; otherwise continue.
4. Compute the possible values of the elbow rotation angle (matrix c) using the inverse kinematics equations.
5. If the values of c do not lie within the workspace, display an error message; otherwise continue.
6. Compute the possible values of the shoulder rotation angle (matrix b) using the inverse kinematics equations.
7. If the values of b do not lie within the workspace, display an error message; otherwise continue.
8. Compute the forward kinematics of each possible set of joint angles.
9. Compute the differences between the x, y and z coordinates of the forward kinematics results and the desired gripper position; denote the differences Error1, Error2, Error3 and Error4.
10. Select the joint set with the smallest error:
    - If Error1 ≤ Error2, Error3 and Error4: base angle = a, shoulder angle = b{1}, elbow angle = c{1}.
    - Else if Error2 ≤ Error1, Error3 and Error4: base angle = a, shoulder angle = b{2}, elbow angle = c{1}.
    - Else if Error3 ≤ Error1, Error2 and Error4: base angle = a, shoulder angle = b{3}, elbow angle = c{2}.
    - Else if Error4 ≤ Error1, Error2 and Error3: base angle = a, shoulder angle = b{4}, elbow angle = c{2}.
    - Otherwise, display an error message.
11. Convert the selected angles to PWM signals.
12. Send the PWM signals to the servo controller.

Stop
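The selection stage above amounts to choosing, among the four candidate (shoulder, elbow) pairs, the set whose forward kinematics lands closest to the desired position. The fragment below is a condensed sketch of that stage, not part of the original listing: it assumes Fkine, the candidate arrays b and c, and the fields a, x, y, z, L0 and L1 exactly as declared in the listing that follows, and folds the flowchart's four explicit comparisons into a loop.

    // Sketch only: candidate i pairs b[i] with c[i / 2], matching the four
    // flowchart branches (b{1},c{1}), (b{2},c{1}), (b{3},c{2}), (b{4},c{2}).
    double d = 0.0, bestErr = double.MaxValue, bo = b[0], co = c[0];
    for (int i = 0; i < 4; i++)
    {
        double bi = b[i], ci = c[i / 2];
        double[] p = Fkine(ref a, ref bi, ref ci, ref d);
        double err = Math.Sqrt(Math.Pow(p[0] - x, 2) + Math.Pow(p[1] - y, 2) +
                               Math.Pow(p[2] - (z + L0 + L1), 2));
        if (err < bestErr) { bestErr = err; bo = bi; co = ci; }
    }
    // (a, bo, co) is the joint set passed on for PWM conversion.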

C# Code for GUI Controller and Inverse Kinematics Computation of Robotic Arm Using Pololu® Software Development Kit

/* Robotic Arm Control Interface with Inverse Kinematics Engine:
 * written in Visual C#
 * by Oridate Ademola Ayodeji (MEE/2008/071)
 *
 * Acknowledgment:
 * The inverse kinematics algorithm written by Akinwale (2009) of the
 * Department of Electronic and Electrical Engineering, Obafemi Awolowo
 * University, for the RA-01 robotic arm (in MATLAB) for his M.Sc. thesis
 * was used as a guide in writing this program.
 */

using Pololu.Usc;
using Pololu.UsbWrapper;

using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Text;
using System.Windows.Forms;
using System.Numerics;

namespace Pololu.Usc.MaestroEasyExample
{
public partial class MainWindow : Form
{

string xinput = ""; double x;


string yinput = ""; double y;
string zinput = ""; double z;

const double L0 = 1.95;


const double L1 = 2.25;
const double L2 = 2.75;
const double L3 = 2.75;
const double L4 = 3.00;
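        // The link lengths above feed the kinematic model: L0 + L1 give the
        // base-to-shoulder height, L2 and L3 are the upper-arm and forearm
        // links, and L4 is the wrist-to-gripper-centre offset (see Fkine).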

double a;
double [] b = new double[4];
double [] c = new double[2];

double [] angleConfig = new double[4];

double ao;
double bo;
double co;

Complex[] bInter = new Complex[4];


Complex[] cInter = new Complex[2];

public MainWindow()
{

InitializeComponent();
}

public double computeBaseAngle()


{
xinput = xInput_textBox.Text;
x = Convert.ToDouble(xinput);

yinput = yInput_textBox.Text;
y = Convert.ToDouble(yinput);

Complex acmplx = Math.Atan(y / x) * (180 / Math.PI);


a = acmplx.Real;

            //Specify what the sign of a should be when x = 0 (a = ±90)
            if ((y < 0) && (x == 0))
            {
                a = -90.0;
            }
            else if ((y > 0) && (x == 0))
            {
                a = 90.0;
            }

            //Ensure that a lies in the first or third quadrants
            if (a > 90.0 || a < -90.0)
            {
                displayMeassage("The specified position is out of the robotic arm workspace (error a)");
}
return a;
}

//Calculate the value of c


public void computeElbowAngle()
{
xinput = xInput_textBox.Text;
x = Convert.ToDouble(xinput);

yinput = yInput_textBox.Text;
y = Convert.ToDouble(yinput);

zinput = zInput_textBox.Text;
z = Convert.ToDouble(zinput)-L0-L1;
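            // z is now measured from the shoulder joint (the base height L0 + L1
            // has been subtracted); the forward-kinematics check adds it back.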

            double k = (Math.Pow(x, 2) + Math.Pow(y, 2) + Math.Pow(z, 2)) -
                       (Math.Pow(L2, 2) + Math.Pow(L3, 2) + Math.Pow(L4, 2));

            // Solve the quadratic ea*s^2 + eb*s + ec = 0, where s = sin(elbow angle)
            double ea = (4 * Math.Pow(L2, 2) * Math.Pow(L4, 2)) + (4 * Math.Pow(L2, 2) * Math.Pow(L3, 2));
            double eb = -4 * k * L2 * L4;
            double ec = Math.Pow(k, 2) - (4 * Math.Pow(L2, 2) * Math.Pow(L3, 2));

            Complex cInter1 = (-eb + Math.Sqrt(Math.Pow(eb, 2) - (4 * ea * ec))) / (2 * ea);
            Complex cInter2 = (-eb - Math.Sqrt(Math.Pow(eb, 2) - (4 * ea * ec))) / (2 * ea);

            //Check if the value of cInter lies within limits
            // (a zero discriminant yields one repeated root)
            double disc = Math.Pow(eb, 2) - (4 * ea * ec);
            int ii = 0;

            if (disc >= 0.0)
            {
                cInter[0] = cInter1;
                ii++;
            }
            if (disc > 0.0)
            {
                cInter[1] = cInter2;
                ii++;
            }
            if (disc < 0.0)
            {
                MessageBox.Show("The specified position is out of the robotic arm workspace (error c)");
            }

            //Calculate the value of c from each valid root
            for (int iii = 0; iii < ii; iii++)
            {
                c[iii] = Math.Asin(cInter[iii].Real) * (180 / Math.PI);

                if (Math.Abs(c[iii]) < 0.0001)
                {
                    c[iii] = 0.0;
                }
            }
        }

//Calculate the values of b


public void computeShoulderAngle()
{
xinput = xInput_textBox.Text;
x = Convert.ToDouble(xinput);

yinput = yInput_textBox.Text;
y = Convert.ToDouble(yinput);

zinput = zInput_textBox.Text;
z = Convert.ToDouble(zinput) - L0 - L1;

int kk = 0;

for (int jj = 0; jj < c.Length; jj++)


{
double k1 = (L4 * Math.Sin(c[jj] * (Math.PI / 180))) + (L3 *
Math.Cos(c[jj] * (Math.PI / 180))) + L2;

double k2 = (L4 * Math.Cos(c[jj] * (Math.PI / 180))) - (L3 *


Math.Sin(c[jj] * (Math.PI / 180)));

double eaa = Math.Pow(k1, 2) + Math.Pow(k2, 2);

double ebb = -2 * z * k2;

double ecc = Math.Pow(z, 2) - Math.Pow(k1, 2);

Complex bInter1 = (-ebb + Math.Sqrt(Math.Pow(ebb, 2) - (4 * eaa


* ecc))) / (2 * eaa);

Complex bInter2 = (-ebb - Math.Sqrt(Math.Pow(ebb, 2) - (4 * eaa


* ecc))) / (2 * eaa);

                // Solve eaa*s^2 + ebb*s + ecc = 0 for s = sin(shoulder angle);
                // a zero discriminant yields one repeated root
                double disc = Math.Pow(ebb, 2) - (4 * eaa * ecc);
                int ii = 0;

                if (disc >= 0.0)
                {
                    bInter[0] = bInter1;
                    ii++;
                }
                if (disc > 0.0)
                {
                    bInter[1] = bInter2;
                    ii++;
                }
                if (disc < 0.0)
                {
                    MessageBox.Show("The specified position is out of the robotic arm workspace (error b)");
                }

                for (int iii = 0; iii < ii; iii++)
                {
                    b[kk] = Math.Asin(bInter[iii].Real) * (180 / Math.PI);

                    // zero-snap the angle just stored at index kk
                    if (Math.Abs(b[kk]) < 0.0001)
                    {
                        b[kk] = 0.0;
                    }
                    kk = kk + 1;
                }

}
}

public void displayMeassage(string msg)


        {
            MessageBox.Show(msg);
        }

private static double[,] MatMulp(ref double[,] Ma, ref double[,] Mb)


        {
            // Multiply two 4x4 matrices: returns abab = Ma * Mb
            double[,] abab = new double[4, 4];

for (int i = 0; i < abab.GetLength(0); i++)


{
for (int j = 0; j < abab.GetLength(1); j++)
{
abab[i, j] = 0;
for (int k = 0; k < Ma.GetLength(1); k++)

abab[i, j] = abab[i, j] + Ma[i, k] * Mb[k, j];


}
}
return abab;
}

private static double[] Fkine(ref double a, ref double b, ref double c,


ref double d)
{
double L0=1.95; double L1=2.25; double L2=2.75; double L3=2.75;
double L4 = 3.0;

double sinda = Math.Sin(a * (Math.PI / 180));


double cosda = Math.Cos(a * (Math.PI / 180));
double sindb = Math.Sin(b * (Math.PI / 180));
double cosdb = Math.Cos(b * (Math.PI / 180));
double sindc = Math.Sin(c * (Math.PI / 180));
double cosdc = Math.Cos(c * (Math.PI / 180));
double sindd = Math.Sin(d * (Math.PI / 180));
double cosdd = Math.Cos(d * (Math.PI / 180));

//Compute D-H Transformation Matrices


double[,] T01 = new double[4, 4] {{ cosda, -sinda, 0.0, 0.0},
{ sinda, cosda, 0.0, 0.0},
{ 0.0, 0.0, 1.0, 0.0},
{ 0.0, 0.0, 0.0, 1.0,}};

double[,] T12 = new double[4, 4] {{-sindb, -cosdb, 0.0, 0.0},


{ 0.0, 0.0 , -1.0, 0.0},
{ cosdb, -sindb, 0.0, 0.0},
{ 0.0, 0.0, 0.0, 1.0}};

double[,] T23 = new double[4, 4] {{ cosdc, -sindc, 0.0, L2},


{ sindc, cosdc, 0.0, 0.0},
{ 0.0, 0.0, 1.0, 0.0},
{ 0.0, 0.0, 0.0, 1.0}};

            double[,] T34 = new double[4, 4] {{-cosdd, sindd, 0.0, L3},
                                              { 0.0, 0.0, -1.0, 0.0},
                                              {-sindd, -cosdd, 0.0, 0.0},
                                              { 0.0, 0.0, 0.0, 1.0}};

double[,] T45 = new double[4, 4] {{-1.0, 0.0, 0.0, 0.0},


{ 0.0, -1.0, 0.0, 0.0},
{ 0.0, 0.0, 1.0, 0.0},
{ 0.0, 0.0, 0.0, 1.0}};

double[,] Tint1 = MatMulp(ref T01, ref T12);

double[,] Tint2 = MatMulp(ref Tint1, ref T23);

double[,] Tint3 = MatMulp(ref Tint2, ref T34);

double[,] T05 = MatMulp(ref Tint3, ref T45);

//Calculate gripper centre position relative to base frame


double[,] V5 = new double[4, 1] { { 0.0 }, { 0.0 }, { L4 }, { 1.0 }
};

double[,] V50t = new double[4, 1];

V50t[0, 0] = (T05[0, 0] * V5[0, 0]) + (T05[0, 1] * V5[1, 0]) +


(T05[0, 2] * V5[2, 0]) + (T05[0, 3] * V5[3, 0]);

V50t[1, 0] = (T05[1, 0] * V5[0, 0]) + (T05[1, 1] * V5[1, 0]) +


(T05[1, 2] * V5[2, 0]) + (T05[1, 3] * V5[3, 0]);

V50t[2, 0] = (T05[2, 0] * V5[0, 0]) + (T05[2, 1] * V5[1, 0]) +


(T05[2, 2] * V5[2, 0]) + (T05[2, 3] * V5[3, 0]);

V50t[3, 0] = (T05[3, 0] * V5[0, 0]) + (T05[3, 1] * V5[1, 0]) +


(T05[3, 2] * V5[2, 0]) + (T05[3, 3] * V5[3, 0]);

double[] position = { V50t[0, 0], V50t[1, 0],(V50t[2, 0] + L0 + L1)


};
return position;
}

private void computeInvKin(object sender, EventArgs e)


{
computeBaseAngle();
computeElbowAngle();
computeShoulderAngle();

ao = a;
bo = b[3];
co = c[1];

double d = 0.0;
double b0 = b[0]; double b1 = b[1];double b2=b[2];double b3 = b[3];
double c0 = c[0]; double c1 = c[1];
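            // Forward kinematics of each of the four candidate (shoulder, elbow)
            // pairs; the pair whose position error is smallest is selected below.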

double[] check1 = Fkine(ref a, ref b0, ref c0, ref d);


double[] check2 = Fkine(ref a, ref b1, ref c0, ref d);
double[] check3 = Fkine(ref a, ref b2, ref c1, ref d);
double[] check4 = Fkine(ref a, ref b3, ref c1, ref d);

double[] coord = { x, y, z + L0 + L1 };

double[] err1Inter = new double[3];


for (int j = 0; j < check1.Length; j++)
{
err1Inter[j] = Math.Abs(check1[j] - coord[j]);
}
double error1 = Math.Sqrt(Math.Pow(err1Inter[0], 2) +
Math.Pow(err1Inter[1], 2)+ Math.Pow(err1Inter[2],2));

double[] err2Inter = new double[3];


for (int j = 0; j < check2.Length; j++)
{
err2Inter[j] = Math.Abs(check2[j] - coord[j]);
}
double error2 = Math.Sqrt(Math.Pow(err2Inter[0], 2) +
Math.Pow(err2Inter[1], 2) + Math.Pow(err2Inter[2], 2));

double[] err3Inter = new double[3];


for (int j = 0; j < check3.Length; j++)
{
err3Inter[j] = Math.Abs(check3[j] - coord[j]);
}
double error3 = Math.Sqrt(Math.Pow(err3Inter[0], 2) +
Math.Pow(err3Inter[1], 2) + Math.Pow(err3Inter[2], 2));

double[] err4Inter = new double[3];


for (int j = 0; j < check4.Length; j++)
{
err4Inter[j] = Math.Abs(check4[j] - coord[j]);
}
double error4 = Math.Sqrt(Math.Pow(err4Inter[0], 2) +
Math.Pow(err4Inter[1], 2) + Math.Pow(err4Inter[2], 2));

if ((error1 <= error2) && (error1 <= error3) && (error1 <= error4))
{
ao = a; bo = b[0]; co = c[0];
}

else if ((error2 <= error1) && (error2 <= error3) && (error2 <=
error4))
{

ao = a; bo = b[1]; co = c[0];
}

else if ((error3 <= error1) && (error3 <= error2) && (error3 <=
error4))
{
ao = a; bo = b[2]; co = c[1];
}

else if ((error4 <= error1) && (error4 <= error2) && (error4 <=
error3))
{
ao = a; bo = b[3]; co = c[1];
}

angleConfig[0]=ao;
angleConfig[1]=bo;
angleConfig[2]=co;

BaseOutput_textBox.Text = String.Format("{0:N4}",ao);
ShoulderOutput_textBox.Text = String.Format("{0:N4}", bo);
ElbowOutput_textBox.Text = String.Format("{0:N4}", co);

        }

        /// <summary>
        /// Runs when the user clicks the button that positions the gripper:
        /// converts the selected joint angles to pulse widths and sends them
        /// to the servo controller.
        /// </summary>
        void positionGripper_Click(object sender, EventArgs e)
        {
            displayMeassage("ao is: " + ao + " ,bo is: " + bo + ", co is: " + co);

            // 180 degrees of joint travel spans 1200 us of pulse width about the
            // 1500 us neutral; the Maestro expects targets in quarter-microseconds.
            double ratio = 1200.0 / 180.0;

            UInt16 basePWM = Convert.ToUInt16((((angleConfig[0] * ratio) + 1500) * 4));
            UInt16 shoulderPWM = Convert.ToUInt16((((angleConfig[1] * ratio) + 1500) * 4));
            UInt16 elbowPWM = Convert.ToUInt16((((angleConfig[2] * ratio) + 1500) * 4));
            UInt16 wristPWM = Convert.ToUInt16((((angleConfig[3] * ratio) + 1500) * 4));

            MessageBox.Show("the UInt16 PWM's are :" + basePWM + " ," + shoulderPWM + " ," + elbowPWM);

            TrySetTarget(0, basePWM);
            TrySetTarget(1, shoulderPWM);
            TrySetTarget(2, elbowPWM);
            TrySetTarget(3, wristPWM);
        }

        void ButtonDisable_Click(object sender, EventArgs e)
        {
// Set target of channel 0 to 0. This tells the Maestro to stop
// transmitting pulses on that channel. Any servo connected to it
// should stop trying to maintain its position.
TrySetTarget(0, 0);
TrySetTarget(1, 0);
TrySetTarget(2, 0);
TrySetTarget(3, 0);
TrySetTarget(4, 0);
}

/// <summary>
/// Attempts to set the target (width of pulses sent) of a channel.
/// </summary>
/// <param name="channel">Channel number from 0 to 23.</param>
/// <param name="target">
/// Target, in units of quarter microseconds. For typical servos,
/// 6000 is neutral and the acceptable range is 4000-8000.
/// </param>
void TrySetTarget(Byte channel, UInt16 target)
{
try
{
                using (Usc device = connectToDevice()) // Find a device and temporarily connect.
{
device.setTarget(channel, target);

// device.Dispose() is called automatically when the


//"using" block ends,
// allowing other functions and processes to use the
//device.
}
}
catch (Exception exception) // Handle exceptions by displaying
//them to the user.
{
displayException(exception);
}
}

/// <summary>
/// Connects to a Maestro using native USB and returns the Usc object
/// representing that connection. When you are done with the
/// connection, you should close it using the Dispose() method so that
/// other processes or functions can connect to the device later. The
/// "using" statement can do this automatically for you.
/// </summary>
Usc connectToDevice()
{
// Get a list of all connected devices of this type.
List<DeviceListItem> connectedDevices = Usc.getConnectedDevices();

foreach (DeviceListItem dli in connectedDevices)


{
// If you have multiple devices connected and want to select a
//particular
// device by serial number, you could simply add a line like
//this:
// if (dli.serialNumber != "00012345"){ continue; }

Usc device = new Usc(dli); // Connect to the device.



return device; // Return the device.


}
            throw new Exception("Could not find device. Make sure it is plugged in to USB " +
                "and check your Device Manager (Windows) or run lsusb (Linux).");
}

/// <summary>
/// Displays an exception to the user by popping up a message box.
/// </summary>
void displayException(Exception exception)
{
StringBuilder stringBuilder = new StringBuilder();
do
{
stringBuilder.Append(exception.Message + " ");
if (exception is Win32Exception)
{
stringBuilder.Append("Error code 0x" +
((Win32Exception)exception).NativeErrorCode.ToString("x") +
". ");
}
exception = exception.InnerException;
}
while (exception != null);
MessageBox.Show(stringBuilder.ToString(), this.Text,
MessageBoxButtons.OK, MessageBoxIcon.Error);
}

private void controlGripperJaws(object sender, EventArgs e)


{
UInt16 trackBarPos = Convert.ToUInt16(trackBar1.Value*4);
//MessageBox.Show("trackbar PWM is" + trackBarPos);

TrySetTarget(4, trackBarPos);
}

private void xInput_textBox_TextChanged(object sender,


KeyPressEventArgs e)
{
if (!char.IsControl(e.KeyChar) && !char.IsDigit(e.KeyChar) &&
e.KeyChar != '.' && e.KeyChar != '-')
{
e.Handled = true;
}
}

private void yInput_textBox_TextChanged(object sender,


KeyPressEventArgs e)
{
if (!char.IsControl(e.KeyChar) && !char.IsDigit(e.KeyChar) &&
e.KeyChar != '.' && e.KeyChar != '-')
{
e.Handled = true;
}
}

private void zInput_textBox_TextChanged(object sender,


KeyPressEventArgs e)
{

if (!char.IsControl(e.KeyChar) && !char.IsDigit(e.KeyChar) &&


e.KeyChar != '.' && e.KeyChar != '-')
{
e.Handled = true;
}
}

}
}

APPENDIX 2

/* Robotic Arm Control Interface with Inverse Kinematics Engine for repeated
 * pick-and-place tasks:
 * written in Visual C#
 * by Oridate Ademola Ayodeji (MEE/2008/071)
 *
 * Acknowledgment:
 * The inverse kinematics algorithm written by Akinwale (2009) of the
 * Department of Electronic and Electrical Engineering, Obafemi Awolowo
 * University, for the RA-01 robotic arm (in MATLAB) for his M.Sc. thesis
 * was used as a guide in writing this program.
 */

using Pololu.Usc;
using Pololu.UsbWrapper;

using System;
using System.Linq;
using System.Threading;
using System.Collections.Generic;
using System.ComponentModel;
using System.Text;
using System.Windows.Forms;
using System.Numerics;

namespace Pololu.Usc.MaestroEasyExample
{
public partial class MainWindow : Form
{

string xinput = ""; double x;


string yinput = ""; double y;
string zinput = ""; double z;

const double L0 = 5.3;


const double L1 = 1.7;
const double L2 = 11.6;
const double L3 = 14.9;
const double L4 = 8.5;
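        // The link lengths above play the same roles as in Appendix 1: L0 + L1
        // give the base-to-shoulder height, L2 and L3 are the upper-arm and
        // forearm links, and L4 is the wrist-to-gripper-centre offset.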

double a;
double [] b = new double[4];
double [] c = new double[2];

double [] angleConfigPick = new double[4];


double [] angleConfigPlace = new double[4];

double ao;
double bo;
double co;

Complex[] bInter = new Complex[4];


Complex[] cInter = new Complex[2];

UInt16 trackBarPos = 0;

UInt16 speed = 5;

public MainWindow()
{
InitializeComponent();
}

public double computeBaseAngle(string xInput_textBox, string


yInput_textBox)
{
xinput = xInput_textBox;
x = Convert.ToDouble(xinput);

yinput = yInput_textBox;
y = Convert.ToDouble(yinput);

Complex acmplx = Math.Atan(y / x) * (180 / Math.PI);


a = acmplx.Real;

            //Specify what the sign of a should be when x = 0 (a = ±90)
            if ((y < 0) && (x == 0))
            {
                a = -90.0;
            }
            else if ((y > 0) && (x == 0))
            {
                a = 90.0;
            }

            //Ensure that a lies in the first or third quadrants
            if (a > 90.0 || a < -90.0)
            {
                displayMeassage("The specified position is out of the robotic arm workspace (error a)");
            }
return a;
}

//Calculate the value of c


public void computeElbowAngle(string xInput_textBox, string
yInput_textBox, string zInput_textBox)
{
xinput = xInput_textBox;
x = Convert.ToDouble(xinput);

yinput = yInput_textBox;
y = Convert.ToDouble(yinput);

zinput = zInput_textBox;
z = Convert.ToDouble(zinput)-L0-L1;

            double k = (Math.Pow(x, 2) + Math.Pow(y, 2) + Math.Pow(z, 2)) -
                       (Math.Pow(L2, 2) + Math.Pow(L3, 2) + Math.Pow(L4, 2));

            // Solve the quadratic ea*s^2 + eb*s + ec = 0, where s = sin(elbow angle)
            double ea = (4 * Math.Pow(L2, 2) * Math.Pow(L4, 2)) + (4 * Math.Pow(L2, 2) * Math.Pow(L3, 2));
            double eb = -4 * k * L2 * L4;
            double ec = Math.Pow(k, 2) - (4 * Math.Pow(L2, 2) * Math.Pow(L3, 2));

            Complex cInter1 = (-eb + Math.Sqrt(Math.Pow(eb, 2) - (4 * ea * ec))) / (2 * ea);
            Complex cInter2 = (-eb - Math.Sqrt(Math.Pow(eb, 2) - (4 * ea * ec))) / (2 * ea);

            //Check if the value of cInter lies within limits
            // (a zero discriminant yields one repeated root)
            double disc = Math.Pow(eb, 2) - (4 * ea * ec);
            int ii = 0;

            if (disc >= 0.0)
            {
                cInter[0] = cInter1;
                ii++;
            }
            if (disc > 0.0)
            {
                cInter[1] = cInter2;
                ii++;
            }
            if (disc < 0.0)
            {
                MessageBox.Show("The specified position is out of the robotic arm workspace (error c)");
            }

            //Calculate the value of c from each valid root
            for (int iii = 0; iii < ii; iii++)
            {
                c[iii] = Math.Asin(cInter[iii].Real) * (180 / Math.PI);

                if (Math.Abs(c[iii]) < 0.0001)
                {
                    c[iii] = 0.0;
                }
            }
        }

//Calculate the values of b


public void computeShoulderAngle(string xInput_textBox, string
yInput_textBox, string zInput_textBox)
{
xinput = xInput_textBox;
x = Convert.ToDouble(xinput);

yinput = yInput_textBox;
y = Convert.ToDouble(yinput);

zinput = zInput_textBox;
z = Convert.ToDouble(zinput) - L0 - L1;

int kk = 0;

for (int jj = 0; jj < c.Length; jj++)


{
double k1 = (L4 * Math.Sin(c[jj] * (Math.PI / 180))) + (L3 *
Math.Cos(c[jj] * (Math.PI / 180))) + L2;
double k2 = (L4 * Math.Cos(c[jj] * (Math.PI / 180))) - (L3 *
Math.Sin(c[jj] * (Math.PI / 180)));

double eaa = Math.Pow(k1, 2) + Math.Pow(k2, 2);


double ebb = -2 * z * k2;
double ecc = Math.Pow(z, 2) - Math.Pow(k1, 2);

Complex bInter1 = (-ebb + Math.Sqrt(Math.Pow(ebb, 2) - (4 * eaa


* ecc))) / (2 * eaa);
Complex bInter2 = (-ebb - Math.Sqrt(Math.Pow(ebb, 2) - (4 * eaa
* ecc))) / (2 * eaa);

                // Solve eaa*s^2 + ebb*s + ecc = 0 for s = sin(shoulder angle);
                // a zero discriminant yields one repeated root
                double disc = Math.Pow(ebb, 2) - (4 * eaa * ecc);
                int ii = 0;

                if (disc >= 0.0)
                {
                    bInter[0] = bInter1;
                    ii++;
                }
                if (disc > 0.0)
                {
                    bInter[1] = bInter2;
                    ii++;
                }
                if (disc < 0.0)
                {
                    MessageBox.Show("The specified position is out of the robotic arm workspace (error b)");
                }

                for (int iii = 0; iii < ii; iii++)
                {
                    b[kk] = Math.Asin(bInter[iii].Real) * (180 / Math.PI);

                    // zero-snap the angle just stored at index kk
                    if (Math.Abs(b[kk]) < 0.0001)
                    {
                        b[kk] = 0.0;
                    }
                    kk = kk + 1;
                }

}
}

public void displayMeassage(string msg)


        {
            MessageBox.Show(msg);

            // Forward-kinematics test harness kept for reference:
            //double aa = 70.0; double bb = -75.7; double cc = 7.0; double dd = 3.0;
            //double[] forkine = Fkine(ref aa, ref bb, ref cc, ref dd);
            //Console.WriteLine("x =" + forkine[0] + ", y =" + forkine[1] + ", z =" + forkine[2]);
        }

private static double[,] MatMulp(ref double[,] Ma, ref double[,] Mb)


        {
            // Multiply two 4x4 matrices: returns abab = Ma * Mb
            double[,] abab = new double[4, 4];

for (int i = 0; i < abab.GetLength(0); i++)


{
for (int j = 0; j < abab.GetLength(1); j++)
{
abab[i, j] = 0;
                    for (int k = 0; k < Ma.GetLength(1); k++) // or k < Mb.GetLength(0)
                        abab[i, j] = abab[i, j] + Ma[i, k] * Mb[k, j];
}
}
return abab;
}

private static double[] Fkine(ref double a, ref double b, ref double c,


ref double d)
{
double sinda = Math.Sin(a * (Math.PI / 180));
double cosda = Math.Cos(a * (Math.PI / 180));
double sindb = Math.Sin(b * (Math.PI / 180));
double cosdb = Math.Cos(b * (Math.PI / 180));
double sindc = Math.Sin(c * (Math.PI / 180));
double cosdc = Math.Cos(c * (Math.PI / 180));
double sindd = Math.Sin(d * (Math.PI / 180));
double cosdd = Math.Cos(d * (Math.PI / 180));

//Compute D-H Transformation Matrices


double[,] T01 = new double[4, 4] {{ cosda, -sinda, 0.0, 0.0},
{ sinda, cosda, 0.0, 0.0},
{ 0.0, 0.0, 1.0, 0.0},
{ 0.0, 0.0, 0.0, 1.0,}};

double[,] T12 = new double[4, 4] {{-sindb, -cosdb, 0.0, 0.0},


{ 0.0, 0.0 , -1.0, 0.0},
{ cosdb, -sindb, 0.0, 0.0},
{ 0.0, 0.0, 0.0, 1.0}};

            double[,] T23 = new double[4, 4] {{ cosdc, -sindc, 0.0, L2},
                                              { sindc, cosdc, 0.0, 0.0},
                                              { 0.0, 0.0, 1.0, 0.0},
                                              { 0.0, 0.0, 0.0, 1.0}};

double[,] T34 = new double[4, 4] {{-cosdd, sindd, 0.0, L3},


{ 0.0, 0.0, -1.0, 0.0},
{-sindd, -cosdd, 0.0, 0.0},
{ 0.0, 0.0, 0.0, 1.0}};

double[,] T45 = new double[4, 4] {{-1.0, 0.0, 0.0, 0.0},


{ 0.0, -1.0, 0.0, 0.0},
{ 0.0, 0.0, 1.0, 0.0},
{ 0.0, 0.0, 0.0, 1.0}};

double[,] Tint1 = MatMulp(ref T01, ref T12);

double[,] Tint2 = MatMulp(ref Tint1, ref T23);

double[,] Tint3 = MatMulp(ref Tint2, ref T34);

double[,] T05 = MatMulp(ref Tint3, ref T45);

//Calculate gripper centre position relative to base frame


double[,] V5 = new double[4, 1] { { 0.0 }, { 0.0 }, { L4 }, { 1.0 }
};

double[,] V50t = new double[4, 1];

V50t[0, 0] = (T05[0, 0] * V5[0, 0]) + (T05[0, 1] * V5[1, 0]) +


(T05[0, 2] * V5[2, 0]) + (T05[0, 3] * V5[3, 0]);

V50t[1, 0] = (T05[1, 0] * V5[0, 0]) + (T05[1, 1] * V5[1, 0]) +


(T05[1, 2] * V5[2, 0]) + (T05[1, 3] * V5[3, 0]);

V50t[2, 0] = (T05[2, 0] * V5[0, 0]) + (T05[2, 1] * V5[1, 0]) +


(T05[2, 2] * V5[2, 0]) + (T05[2, 3] * V5[3, 0]);

V50t[3, 0] = (T05[3, 0] * V5[0, 0]) + (T05[3, 1] * V5[1, 0]) +


(T05[3, 2] * V5[2, 0]) + (T05[3, 3] * V5[3, 0]);

double[] position = { V50t[0, 0], V50t[1, 0], (V50t[2, 0] + L0 +


L1) };
return position;
}

private void computeInvKinPick_click(object sender, EventArgs e)


{
computeBaseAngle(xInput_textBox1.Text, yInput_textBox1.Text);

computeElbowAngle(xInput_textBox1.Text, yInput_textBox1.Text,
zInput_textBox1.Text);
computeShoulderAngle(xInput_textBox1.Text, yInput_textBox1.Text,
zInput_textBox1.Text);

ao = a;
bo = b[3];
co = c[1];

double d = 0.0;
double b0 = b[0]; double b1 = b[1]; double b2 = b[2]; double b3 =
b[3];
double c0 = c[0]; double c1 = c[1];
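            // Forward kinematics of each of the four candidate (shoulder, elbow)
            // pairs; the pair whose position error is smallest is selected below.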

double[] check1 = Fkine(ref a, ref b0, ref c0, ref d);


double[] check2 = Fkine(ref a, ref b1, ref c0, ref d);
double[] check3 = Fkine(ref a, ref b2, ref c1, ref d);
double[] check4 = Fkine(ref a, ref b3, ref c1, ref d);

double[] coord = { x, y, z + L0 + L1 };

double[] err1Inter = new double[3];


for (int j = 0; j < check1.Length; j++)
{
err1Inter[j] = Math.Abs(check1[j] - coord[j]);
}
double error1 = Math.Sqrt(Math.Pow(err1Inter[0], 2) +
Math.Pow(err1Inter[1], 2) + Math.Pow(err1Inter[2], 2));
//Console.WriteLine("error1 is:" + error1);

double[] err2Inter = new double[3];


for (int j = 0; j < check2.Length; j++)
{
err2Inter[j] = Math.Abs(check2[j] - coord[j]);
}
double error2 = Math.Sqrt(Math.Pow(err2Inter[0], 2) +
Math.Pow(err2Inter[1], 2) + Math.Pow(err2Inter[2], 2));
//Console.WriteLine("error2 is:" + error2);

double[] err3Inter = new double[3];


for (int j = 0; j < check3.Length; j++)
{
err3Inter[j] = Math.Abs(check3[j] - coord[j]);
}
double error3 = Math.Sqrt(Math.Pow(err3Inter[0], 2) +
Math.Pow(err3Inter[1], 2) + Math.Pow(err3Inter[2], 2));
//Console.WriteLine("error3 is:" + error3);

double[] err4Inter = new double[3];


for (int j = 0; j < check4.Length; j++)
{
err4Inter[j] = Math.Abs(check4[j] - coord[j]);
}
double error4 = Math.Sqrt(Math.Pow(err4Inter[0], 2) +
Math.Pow(err4Inter[1], 2) + Math.Pow(err4Inter[2], 2));
//Console.WriteLine("error4 is:" + error4);

if ((error1 <= error2) && (error1 <= error3) && (error1 <= error4))
{
ao = a; bo = b[0]; co = c[0];
}

else if ((error2 <= error1) && (error2 <= error3) && (error2 <=
error4))
{
ao = a; bo = b[1]; co = c[0];
}

else if ((error3 <= error1) && (error3 <= error2) && (error3 <=
error4))
{
ao = a; bo = b[2]; co = c[1];
}

else if ((error4 <= error1) && (error4 <= error2) && (error4 <=
error3))
{
ao = a; bo = b[3]; co = c[1];
}

double[] check = Fkine(ref ao, ref bo, ref co, ref d);
double[] error = new double[] { (coord[0] - check[0]), (coord[1] -
check[1]), (coord[2] - check[2]) };

if (Math.Abs(error[0]) > 0.0001 || Math.Abs(error[1]) > 0.0001 ||


Math.Abs(error[2]) > 0.0001)
{
displayMeassage("desired gripper postion is out of robotic arm
workspace");
}
else
{
angleConfigPick[0] = ao;
angleConfigPick[1] = bo;
angleConfigPick[2] = co;

BaseOutput_textBox.Text = String.Format("{0:N4}",
angleConfigPick[0]);
ShoulderOutput_textBox.Text = String.Format("{0:N4}",
angleConfigPick[1]);
ElbowOutput_textBox.Text = String.Format("{0:N4}",
angleConfigPick[2]);
}

            //displayMeassage("diff1 is: " + error[0] + " ,diff2 is: " + error[1] + ", diff3 is: " + error[2]);
        }

private void computeInvKinPlace_click(object sender, EventArgs e)


{
computeBaseAngle(xInput_textBox2.Text, yInput_textBox2.Text);
computeElbowAngle(xInput_textBox2.Text, yInput_textBox2.Text,
zInput_textBox2.Text);

computeShoulderAngle(xInput_textBox2.Text, yInput_textBox2.Text,
zInput_textBox2.Text);

ao = a;
bo = b[3];
co = c[1];

double d = 0.0;
double b0 = b[0]; double b1 = b[1]; double b2 = b[2]; double b3 =
b[3];
double c0 = c[0]; double c1 = c[1];
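            // Forward kinematics of each of the four candidate (shoulder, elbow)
            // pairs; the pair whose position error is smallest is selected below.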

double[] check1 = Fkine(ref a, ref b0, ref c0, ref d);


double[] check2 = Fkine(ref a, ref b1, ref c0, ref d);
double[] check3 = Fkine(ref a, ref b2, ref c1, ref d);
double[] check4 = Fkine(ref a, ref b3, ref c1, ref d);

double[] coord = { x, y, z + L0 + L1 };

double[] err1Inter = new double[3];


for (int j = 0; j < check1.Length; j++)
{
err1Inter[j] = Math.Abs(check1[j] - coord[j]);
}
double error1 = Math.Sqrt(Math.Pow(err1Inter[0], 2) +
Math.Pow(err1Inter[1], 2) + Math.Pow(err1Inter[2], 2));
//Console.WriteLine("error1 is:" + error1);

double[] err2Inter = new double[3];


for (int j = 0; j < check2.Length; j++)
{
err2Inter[j] = Math.Abs(check2[j] - coord[j]);
}
double error2 = Math.Sqrt(Math.Pow(err2Inter[0], 2) +
Math.Pow(err2Inter[1], 2) + Math.Pow(err2Inter[2], 2));
//Console.WriteLine("error2 is:" + error2);

double[] err3Inter = new double[3];


for (int j = 0; j < check3.Length; j++)
{
err3Inter[j] = Math.Abs(check3[j] - coord[j]);
}
double error3 = Math.Sqrt(Math.Pow(err3Inter[0], 2) +
Math.Pow(err3Inter[1], 2) + Math.Pow(err3Inter[2], 2));
//Console.WriteLine("error3 is:" + error3);

double[] err4Inter = new double[3];


for (int j = 0; j < check4.Length; j++)
{
err4Inter[j] = Math.Abs(check4[j] - coord[j]);
}
double error4 = Math.Sqrt(Math.Pow(err4Inter[0], 2) +
Math.Pow(err4Inter[1], 2) + Math.Pow(err4Inter[2], 2));
//Console.WriteLine("error4 is:" + error4);

if ((error1 <= error2) && (error1 <= error3) && (error1 <= error4))
{
ao = a; bo = b[0]; co = c[0];
}

else if ((error2 <= error1) && (error2 <= error3) && (error2 <=
error4))
{
ao = a; bo = b[1]; co = c[0];
}

else if ((error3 <= error1) && (error3 <= error2) && (error3 <=
error4))
{
ao = a; bo = b[2]; co = c[1];
}

else if ((error4 <= error1) && (error4 <= error2) && (error4 <=
error3))
{
ao = a; bo = b[3]; co = c[1];
}

double[] check = Fkine(ref ao, ref bo, ref co, ref d);
double[] error = new double[] { (coord[0] - check[0]), (coord[1] -
check[1]), (coord[2] - check[2]) };

if (Math.Abs(error[0]) > 0.0001 || Math.Abs(error[1]) > 0.0001 ||


Math.Abs(error[2]) > 0.0001)
{
displayMeassage("desired gripper postion is out of robotic arm
workspace");
}
else
{
angleConfigPlace[0] = ao;
angleConfigPlace[1] = bo;
angleConfigPlace[2] = co;

BaseOutput_textBox2.Text = String.Format("{0:N4}",
angleConfigPlace[0]);
ShoulderOutput_textBox2.Text = String.Format("{0:N4}",
angleConfigPlace[1]);
ElbowOutput_textBox2.Text = String.Format("{0:N4}",
angleConfigPlace[2]);
}

            //displayMeassage("diff1 is: " + error[0] + " ,diff2 is: " + error[1] + ", diff3 is: " + error[2]);
        }

        /// <summary>
        /// Runs when the user clicks the Rest button: sends every joint servo
        /// to its predefined rest position.
        /// </summary>
        void rest_Click(object sender, EventArgs e)
        {
            TrySetTarget(0, 6000);
            TrySetTarget(1, (1560 * 4));
            TrySetTarget(2, 6000);
            TrySetTarget(3, 1530 * 4);
        }

/// <summary>
/// Attempts to set the target (width of pulses sent) of a channel.
/// </summary>
/// <param name="channel">Channel number from 0 to 23.</param>
/// <param name="target">
/// Target, in units of quarter microseconds. For typical servos,
/// 6000 is neutral and the acceptable range is 4000-8000.
/// </param>
void TrySetTarget(Byte channel, UInt16 target)
{
            try
            {
                using (Usc device = connectToDevice()) // Find a device and temporarily connect.
                {
                    device.setSpeed(channel, speed);
                    device.setTarget(channel, target);

                    // device.Dispose() is called automatically when the "using"
                    // block ends, allowing other functions and processes to use
                    // the device.
                }
            }
            catch (Exception exception) // Handle exceptions by displaying them to the user.
            {
                displayException(exception);
            }
        }

/// <summary>
/// Connects to a Maestro using native USB and returns the Usc object
/// representing that connection. When you are done with the
/// connection, you should close it using the Dispose() method so that
/// other processes or functions can connect to the device later. The
/// "using" statement can do this automatically for you.
/// </summary>
Usc connectToDevice()
{
// Get a list of all connected devices of this type.
List<DeviceListItem> connectedDevices = Usc.getConnectedDevices();

            foreach (DeviceListItem dli in connectedDevices)
            {
                // If you have multiple devices connected and want to select a
                // particular device by serial number, you could simply add a
                // line like this:
                // if (dli.serialNumber != "00012345"){ continue; }

                Usc device = new Usc(dli); // Connect to the device.

                return device; // Return the device.
            }
            throw new Exception("Could not find device. Make sure it is plugged in to USB " +
                "and check your Device Manager (Windows) or run lsusb (Linux).");
        }

/// <summary>
/// Displays an exception to the user by popping up a message box.
/// </summary>
void displayException(Exception exception)
{
StringBuilder stringBuilder = new StringBuilder();
do
{
stringBuilder.Append(exception.Message + " ");
if (exception is Win32Exception)
{
stringBuilder.Append("Error code 0x" +
((Win32Exception)exception).NativeErrorCode.ToString("x") + ". ");
}
exception = exception.InnerException;
}
while (exception != null);
MessageBox.Show(stringBuilder.ToString(), this.Text,
MessageBoxButtons.OK, MessageBoxIcon.Error);
}

private void controlGripperJaws(object sender, EventArgs e)


{
trackBarPos = Convert.ToUInt16(trackBar1.Value*4);
//MessageBox.Show("trackbar PWM is" + trackBarPos);
TrySetTarget(4, trackBarPos);
}

private void xInput_textBox_TextChanged(object sender,


KeyPressEventArgs e)
{
if (!char.IsControl(e.KeyChar) && !char.IsDigit(e.KeyChar) &&
e.KeyChar != '.' && e.KeyChar != '-')
{
e.Handled = true;
}
}

private void yInput_textBox_TextChanged(object sender,


KeyPressEventArgs e)
{
if (!char.IsControl(e.KeyChar) && !char.IsDigit(e.KeyChar) &&
e.KeyChar != '.' && e.KeyChar != '-')
{
e.Handled = true;
}
}

private void zInput_textBox_TextChanged(object sender,


KeyPressEventArgs e)
{
if (!char.IsControl(e.KeyChar) && !char.IsDigit(e.KeyChar) &&
e.KeyChar != '.' && e.KeyChar != '-')

{
e.Handled = true;
}
}

private void xInput_textBox2_TextChanged(object sender,


KeyPressEventArgs e)
{
if (!char.IsControl(e.KeyChar) && !char.IsDigit(e.KeyChar) &&
e.KeyChar != '.' && e.KeyChar != '-')
{
e.Handled = true;
}
}

private void yInput_textBox2_TextChanged(object sender,


KeyPressEventArgs e)
{
if (!char.IsControl(e.KeyChar) && !char.IsDigit(e.KeyChar) &&
e.KeyChar != '.' && e.KeyChar != '-')
{
e.Handled = true;
}
}

private void zInput_textBox2_TextChanged(object sender,


KeyPressEventArgs e)
{
if (!char.IsControl(e.KeyChar) && !char.IsDigit(e.KeyChar) &&
e.KeyChar != '.' && e.KeyChar != '-')
{
e.Handled = true;
}
}

private void numberOfObjects_TextChanged(object sender,


KeyPressEventArgs e)
{
if (!char.IsControl(e.KeyChar) && !char.IsDigit(e.KeyChar))
{
e.Handled = true;
}
}

private void disable_Click(object sender, EventArgs e)


{
TrySetTarget(0, 0);
TrySetTarget(1, 0);
TrySetTarget(2, 0);
TrySetTarget(3, 0);
            TrySetTarget(4, 0);
        }

private void start_Click(object sender, EventArgs e)


{
            // 10 us of pulse width per degree (900 us across 90 degrees), with
            // per-joint neutral offsets; the Maestro expects quarter-microseconds.
            double ratio = (2400.0 - 1500.0) / (90.0 - 0.0);

            UInt16 basePWMpick = Convert.ToUInt16((((angleConfigPick[0] * (-ratio)) + 1500) * 4));
            UInt16 shoulderPWMpick = Convert.ToUInt16((((angleConfigPick[1] * (ratio)) + 1560) * 4));
            UInt16 elbowPWMpick = Convert.ToUInt16((((angleConfigPick[2] * (ratio)) + 1470) * 4));
            //UInt16 wristPWMpick = Convert.ToUInt16((((angleConfigPick[3] * (1200 / 180)) + 1500) * 4));

            UInt16 basePWMplace = Convert.ToUInt16((((angleConfigPlace[0] * (-ratio)) + 1500) * 4));
            UInt16 shoulderPWMplace = Convert.ToUInt16((((angleConfigPlace[1] * (ratio)) + 1560) * 4));
            UInt16 elbowPWMplace = Convert.ToUInt16((((angleConfigPlace[2] * (ratio)) + 1500) * 4));
            //UInt16 wristPWMplace = Convert.ToUInt16((((angleConfigPlace[3] * (1200 / 180)) + 1500) * 4));

for (int n = 0; n < Convert.ToInt64(numberOfObjects.Text); n++)


{
TrySetTarget(0, basePWMpick);
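                // Sleep while the servo travels. With the Maestro's speed unit of
                // 0.25 us per 10 ms, a setting of 'speed' moves the pulse width at
                // 25 * speed us per second; dividing the remaining distance (in
                // quarter-us) by that rate and by 4 gives seconds, and 2000 ms is
                // added as a safety margin.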
Thread.Sleep((((Math.Abs(basePWMplace - (1500*4)) / (25 *
speed)) / 4) * 1000) + 2000);
TrySetTarget(1, shoulderPWMpick);
TrySetTarget(2, elbowPWMpick);
TrySetTarget(4, 1900*4);
//............................
Int32[] sleep1 = {Math.Abs(basePWMpick - 1500 * 4),
Math.Abs(shoulderPWMpick - 1560 * 4), Math.Abs(elbowPWMpick - 1500 * 4)};
Thread.Sleep( (( (sleep1.Max() / (25 * speed) ) /4)*1000) +
2000);

TrySetTarget(4, trackBarPos);
Int32 sleep2 = Math.Abs((1900*4) - trackBarPos);
Thread.Sleep((((sleep2 / (25 * speed)) / 4) * 1000) + 2000);

TrySetTarget(2, 1500 * 4);//get back up


TrySetTarget(1, 1560 * 4);
Int32[] sleep3 = { Math.Abs(shoulderPWMpick - 1560 * 4),
Math.Abs(elbowPWMpick - 1500 * 4) };
Thread.Sleep((((sleep3.Max() / (25 * speed)) / 4) * 1000) +
2000);//get back up time

TrySetTarget(0, basePWMplace);
Thread.Sleep((((Math.Abs(basePWMplace - basePWMpick) / (25 *
speed)) / 4) * 1000) + 2000);
TrySetTarget(1, shoulderPWMplace);
TrySetTarget(2, elbowPWMplace);
//TrySetTarget(3, wristPWMplace);
Int32[] sleep4 = { Math.Abs(basePWMplace - basePWMpick),
Math.Abs(shoulderPWMplace - 1560 * 4), Math.Abs(elbowPWMpick - 1500 * 4) };
Thread.Sleep((((sleep4.Max() / (25 * speed)) / 4) * 1000) +
2000);

TrySetTarget(4, 1900*4);//open gripper to release object


System.Threading.Thread.Sleep(2500);//time for gripper to open

TrySetTarget(2, 1500 * 4);


TrySetTarget(1, 1560 * 4);//get back up
Int32[] sleep5 = { Math.Abs(shoulderPWMplace - 1560 * 4),
Math.Abs(elbowPWMplace - 1500 * 4) };

Thread.Sleep((((sleep5.Max() / (25 * speed)) / 4) * 1000) +


2000);//get back up time

                TrySetTarget(0, 1500 * 4); // take base servo back to rest in
                                           // preparation for the next iteration of the loop
Int32 sleep6 = Math.Abs((1500 * 4) - basePWMplace);
Thread.Sleep((((sleep6 / (25 * speed)) / 4) * 1000) + 2000);

}
}

private void numericUpDown_servo_speed_ValueChanged(object sender,


EventArgs e)
{
speed =
Convert.ToUInt16((double)numericUpDown_servo_speed.Value/2.5);
//textBox1.Text = speed.ToString();

}
}
}

APPENDIX 3

/* Robotic Arm Control Interface (using sliders):
 * written in Visual C#
 * by Oridate Ademola Ayodeji (MEE/2008/071)
 *
 * Acknowledgment:
 * The inverse kinematics algorithm written by Akinwale (2009) of the
 * Department of Electronic and Electrical Engineering, Obafemi Awolowo
 * University, for the RA-01 robotic arm (in MATLAB) for his M.Sc. thesis
 * was used as a guide in writing this program.
 */

using Pololu.Usc;
using Pololu.UsbWrapper;

using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Text;
using System.Windows.Forms;
using System.Numerics;

namespace Pololu.Usc.MaestroEasyExample
{
public partial class MainWindow : Form
{
public MainWindow()
{
InitializeComponent();
}

/// <summary>
/// Attempts to set the target (width of pulses sent) of a channel.
/// </summary>
/// <param name="channel">Channel number from 0 to 23.</param>
/// <param name="target">
/// Target, in units of quarter microseconds. For typical servos,
/// 6000 is neutral and the acceptable range is 4000-8000.
/// </param>
void TrySetTarget(Byte channel, UInt16 target)
{
            try
            {
                using (Usc device = connectToDevice()) // Find a device and temporarily connect.
                {
                    device.setTarget(channel, target);

                    // device.Dispose() is called automatically when the "using"
                    // block ends, allowing other functions and processes to use
                    // the device.
                }
            }
            catch (Exception exception) // Handle exceptions by displaying them to the user.
            {
                displayException(exception);
            }
        }

/// <summary>
/// Connects to a Maestro using native USB and returns the Usc object
/// representing that connection. When you are done with the
/// connection, you should close it using the Dispose() method so that
/// other processes or functions can connect to the device later. The
/// "using" statement can do this automatically for you.
/// </summary>
Usc connectToDevice()
{
// Get a list of all connected devices of this type.
List<DeviceListItem> connectedDevices = Usc.getConnectedDevices();

            foreach (DeviceListItem dli in connectedDevices)
            {
                // If you have multiple devices connected and want to select a
                // particular device by serial number, you could simply add a
                // line like this:
                // if (dli.serialNumber != "00012345"){ continue; }

                Usc device = new Usc(dli); // Connect to the device.

                return device; // Return the device.
            }
            throw new Exception("Could not find device. Make sure it is plugged in to USB " +
                "and check your Device Manager (Windows) or run lsusb (Linux).");
        }

/// <summary>
/// Displays an exception to the user by popping up a message box.
/// </summary>
void displayException(Exception exception)
{
StringBuilder stringBuilder = new StringBuilder();
do
{
stringBuilder.Append(exception.Message + " ");
if (exception is Win32Exception)
{
stringBuilder.Append("Error code 0x" +
((Win32Exception)exception).NativeErrorCode.ToString("x") + ". ");
}
exception = exception.InnerException;
}
while (exception != null);
MessageBox.Show(stringBuilder.ToString(), this.Text,
MessageBoxButtons.OK, MessageBoxIcon.Error);
}

        void Rest_Click(object sender, EventArgs e)
        {
            // Send every joint servo to its predefined rest position.
            TrySetTarget(0, 6000);
            TrySetTarget(1, (1560 * 4));
            TrySetTarget(2, 6000);
            TrySetTarget(3, 1530 * 4);
        }

private void disable_Click(object sender, EventArgs e)


{
TrySetTarget(0, 0);
TrySetTarget(1, 0);
TrySetTarget(2, 0);
TrySetTarget(3, 0);
TrySetTarget(4, 0);
}

private void controlGripperJaws(object sender, EventArgs e)


{
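            // Track-bar values are pulse widths in microseconds; the factor of 4
            // converts to the quarter-microsecond units the Maestro expects.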
UInt16 trackBarPos = Convert.ToUInt16(trackBar_gripper.Value * 4);
//MessageBox.Show("trackbar PWM is" + trackBarPos);

TrySetTarget(4, trackBarPos);
}

private void trackBar_wrist_Scroll(object sender, EventArgs e)


{
UInt16 trackBarPos = Convert.ToUInt16(trackBar_wrist.Value * 4);
//MessageBox.Show("trackbar PWM is" + trackBarPos);

TrySetTarget(3, trackBarPos);
}

private void trackBar_elbow_Scroll(object sender, EventArgs e)


{
UInt16 trackBarPos = Convert.ToUInt16(trackBar_elbow.Value * 4);
//MessageBox.Show("trackbar PWM is" + trackBarPos);

TrySetTarget(2, trackBarPos);
}

private void shoulder_Scroll(object sender, EventArgs e)


{
UInt16 trackBarPos = Convert.ToUInt16(trackBar_shoulder.Value * 4);
//MessageBox.Show("trackbar PWM is" + trackBarPos);

TrySetTarget(1, trackBarPos);
}

private void trackBar_base_Scroll(object sender, EventArgs e)


{
UInt16 trackBarPos = Convert.ToUInt16(trackBar_base.Value * 4);
//MessageBox.Show("trackbar PWM is" + trackBarPos);

TrySetTarget(0, trackBarPos);
}

}
}
