BY
IN MECHANICAL ENGINEERING
May, 2014
CERTIFICATION
(Supervisor)
............................
Date
(Head of Department)
.....................................
Date
DEDICATION
ACKNOWLEDGEMENTS
Firstly, my appreciation goes to God Almighty for seeing me thus far.
I would like to appreciate my mother, Mrs. M.O. Oridate, for her support, morally and
financially.
My appreciation also goes to my supervisor, Dr. A.O. Oluwajobi, for his counsel and
encouragement.
I would like to express my gratitude to my uncles, Mr. Soretire and Mr. Soretire, for
their support.
I would also like to appreciate Mr. Akinwale of the Department of Electronic and
Electrical Engineering for his thorough work on the kinematics of a robotic arm,
which this project builds upon.
Lastly, I would like to appreciate my good friend, Itimi Richard, for his assistance.
Table of Contents
CERTIFICATION ......................................................................................................... ii
ACKNOWLEDGEMENTS .......................................................................................... iv
ABSTRACT .................................................................................................................. xi
INTRODUCTION ......................................................................................................... 1
CHAPTER TWO ......................................................................................................... 12
METHODOLOGY ...................................................................................................... 39
LIST OF FIGURES
Figure 2. 1: ASIMO demonstrates its ability to climb stairs at Auto Zurich Car Show
...................................................................................................................................... 14
Figure 2. 3: Model of the robotic arm and the complete robot system by Jenne ......... 25
Figure 2. 4: Final TETRIX robot arm and hand assembly by Blackmore et al. .......... 28
Figure 2. 7: Herman's robotic arm built with standard bricks, before being mounted on
a Turtlebot™ prototype
Figure 2. 8: Robotic arm for picking beams for static bending test ............................. 37
Figure 3. 10: Robotic arm pose corresponding to joint angle set qr ............................ 58
Figure 3. 11: Illustration of the use of teach method for simulating a robotic arm ..... 60
Figure 3. 14: Autodesk® Inventor® Professional iProperties™ displaying the moments
of inertia
Figure 4. 3 : GUI of interfacing software for controlling robotic arm ....................... 104
Figure 4. 8 : GUI of alternate interfacing software for controlling robotic arm (using
LIST OF TABLES
Table 2. 1 : Linguistic rules for fuzzy logic control of robotic arm ............................. 21
Table 4. 1 : Coordinate input test result for robotic arm ............................................ 106
ABSTRACT
This report contains the design and implementation of a 5-DoF (Degree of Freedom)
robotic arm. The robotic arm is intended for educational purposes. Bearing in mind
that a similar project has been previously undertaken in the department, this project
work goes a step further by redesigning the robotic arm for improved accuracy by
using servos to power the joints as well as implementing the inverse kinematics of the
robotic arm.
A simulation of the robotic arm to visualise the joint movements was carried out
using the Robotics Toolbox.
The report covers the procedure for selection of the servos used to power each joint of
the arm in detail. Polymethyl methacrylate was selected to fabricate the components
of the robotic arm. The torque exerted at each of the joints was calculated and a servo
with the required torque rating was selected for each joint.
A suitable servo controller was selected for the project and a control software for the
robotic arm was developed using Microsoft's C# programming language. The software
allows the gripper of the robotic arm to be positioned in space by specifying the
coordinates of its centre position.
The software was tested by specifying various coordinates for the gripper with
reference to the robot's base and measuring the corresponding coordinates of the
centre position of the gripper. Satisfactory results were obtained from the project.
CHAPTER ONE
INTRODUCTION
1.1 Background
The first usage of the term 'robot' was in a 1921 Czech science fiction play -
'Rossum’s Universal Robots' - by Karel Capek. The robots were artificial people or
androids and the word was derived from the word 'robota', a Czech word for slave
(Corke, 2011).
A question of perpetual interest is how to define a robot. Since the beginning of the
study of robotics, there has been some controversy in the definition of a robot. So
long as the evolution of robotics continues, the definition of the robot will change
from time to time, depending on the technological advances in its sensory capability
and level of intelligence.
However, the most widely accepted definition of a robot was given by the Robotic
Industries Association (RIA): a robot is a reprogrammable, multifunctional
manipulator designed to move material, parts, tools or specialized devices through
variable programmed motions for the performance of a variety of tasks.
Various criteria could be used as bases for the classification of robots, but two modes
of classification are common.
Robots could be classified based on their kinematics and locomotion method. Based
on this mode of classification, the two major types of robots are stationary robots
(including robotic arms with a global axis of movement) and mobile robots. Mobile
robots can be further classified as wheeled robots, tracked robots, legged robots,
among others.
Robots could also be classified based on their uses or applications. Based on this
mode of classification, examples include:
• Military robots – robots brought into play in military and armed forces.
• Space robots – robots employed in space exploration and other space activities
(Robotics, 2003).
Of all robotic mechanical systems, manipulators have always been given special
attention for various reasons. One reason is that, in their simplest form, as robotic
arms, they occur most frequently in industry. Another is that the architecture of
robotic arms constitutes the simplest of all robotic architectures.
A manipulator, in general, is a mechanical system aimed at manipulating objects.
Manipulating, in turn, means to move something with one's hands, as the word is
derived from the Latin word 'manus', meaning hand. The basic idea behind the
foregoing concept is that hands are among the organs that the human brain can control
mechanically with the highest accuracy, as the work of an accomplished guitar player
reveals.
Robotic manipulators resembling the human arm are known as robotic arms. They are
made up of links connected by rotational joints (also referred to as revolute joints) or
translating joints (also referred to as prismatic joints).
A robotic arm is thus a type of mechanical arm, usually programmable, with similar
functions to a human arm. It may be the sum total of the mechanism or may be part of
a more complex robot, and it may be controlled directly or in any number of ways.
Due to this fact, standalone arms are often classified as full robots. Notable examples
of robotic arms are the SCARA - Selective Compliance Articulated Robot Arm - and
the Unimation Puma 560.
1.3 Components of a Robotic Arm
A robotic arm is made up of the following basic components:
• Links and joints
• Actuators
• Controller
• End-effector
• Sensors
1.3.1 Links and Joints
A link is considered as a rigid body that defines the relationship between two
neighbouring joint axes of a manipulator. Links are connected by joints that allow
relative motion of neighbouring links.
1.3.2 Actuators
Actuators play the same role the muscles play in the human arm - they convert stored
energy into movement. Actuators are used to move a robot's manipulator joints. The
three basic types of actuators currently used in contemporary robots are pneumatic,
hydraulic and electric actuators.
Pneumatic actuators use pressurized gas to move the manipulator joint. They are
inexpensive and simple, but their motion is difficult to control precisely.
Hydraulic actuators on the other hand employ a pressurized liquid (typically oil) to
move the manipulator joint. They are quite common and offer very large force
capability, but the main drawbacks of hydraulic actuators are their accompanying
apparatus (fluid pumps and storage tanks) and the possibility of leaks.
Electric actuators are compact, can be controlled accurately, and are very reliable.
However, these actuators cannot deliver as much force as hydraulic actuators of
comparable size. Since accurate control matters more than sheer force in most
actuator functions, electrical actuators are often preferred. The various types of
electric motors used as actuators for robotic applications are direct current (dc)
motors, stepper motors and servomotors.
1.3.3 Controller
The controller is the main device that processes information and carries out
instructions in a robot. It is the robot's 'brain' and controls the robot's movements. It is
usually a computer of some type which is used to store information about the robot
and the work environment and to store and execute programs which operate the robot.
It contains programs, data algorithms, logic analysis and various other processing
activities which enable the robot to perform its intended function (Carlson et al.,
2013).
Robot controllers perform computational functions and interface with and control
sensors, actuators and other peripheral devices.
Most modern servomotor actuators are designed and supplied around a dedicated
controller module from the same manufacturer. Such controllers are usually developed
around microcontrollers, and a common example is the Pololu Mini Maestro 12-
channel USB servo controller.
1.3.4 End-effector
The end-effector is the device at the end of the robotic arm that enables the arm to
interact with the environment. The exact nature of this device depends on the
application of the robot. Typical tasks include grasping, pushing and pulling, twisting,
using tools, performing insertions, welding and various types of assembly activities.
Thus, the major types of robot end-effectors are:
• grippers - Grippers are the most common type of end-effectors. They can use
different gripping mechanisms such as fingers, jaws or suction;
• material removal tools - These include cutting, drilling and deburring tools;
• welding torches - Robotic arc welding is widespread in industry, and welding
torches have thus become very efficient end-effectors that can be controlled in
multiple axes;
• tool changers - Tool changers are used when many different end-effectors need
to be used by one robot. They form an interface between the robot flange and
the base of the tool. They can be manual or automatic.
Surgical robots have end-effectors that are specifically manufactured for performing
surgical procedures.
1.3.5 Sensor
Sensors are physical devices that enable a robot to perceive its physical environment
in order to get information about itself and its surroundings. They allow the robotic
arm to receive feedback about its environment. They can give the robot a limited
sense of sight and sound. The sensor collects information and sends it electronically
to the controller.
One use of these sensors is to prevent collision with obstacles (collision detection and
avoidance). Image sensors can, for example, give a pick and place robotic arm the
capability to differentiate between items to be picked.
According to Angelo (2007), robots now play a prominent and indispensable role in
modern manufacturing. The use of robots has led to increased productivity and
improved product quality.
Today, many products we buy have been assembled or handled by a robot (Corke,
2011).
Robotic arms are typically used in industry. Repetitive autonomous robots perform
one task repeatedly based upon predetermined movements and specifically located
objects, with precisely specified speed, acceleration,
deceleration, distance, and direction. More complex actions are executed based upon
sensor processing. If object orientation or position is unknown, arms are often paired
with machine vision and artificial intelligence to identify the object and subsequently
manipulate it.
Contemporary applications of the robotic arm range from performing accurate and
minimally invasive surgery to routine industrial tasks.
The da Vinci surgical robot uses robotic arms equipped with scalpels and other
surgical instruments. Typical industrial applications of robotic arms include:
• painting automation;
• fibreglass cutting;
• assembly operations;
• foundry operations.
In the automobile industry, robotic arms are used in diverse manufacturing processes
including assembly, spot welding, arc welding, machine tending, part transfer, laser
Robots have proved to help automakers to be more agile, flexible and to reduce
production lead times. They can be programmed to perform precise, intricate duties at
much faster speeds than any human could be expected to. They also have the
ability to work in conditions that are hazardous to humans, such
as extreme heat and presence of dangerous chemicals in the air (Angelo, 2012).
The aim of the project is to design and fabricate a robotic arm that can be used for
educational purposes.
Bearing in mind that a similar project has been previously undertaken in the
Department by Adebola (2012), this project work goes a step further by redesigning
the robotic arm for improved accuracy by using servos to power the joints.
An implementation of the inverse kinematics of the arm will be made. This will make
it possible to position the gripper of the robotic arm by specifying the coordinates of
its centre position in a Graphical User Interface (GUI) control software. The
derivation of the inverse kinematics equations is however not intended due to its
tedious nature. As such, the structure of the robotic arm will be similar to the RA-01
robotic arm used by Akinwale (2011) for his M.Sc. project. The kinematics of the
robotic arm were extensively covered in his report and the equations will be adopted
for this work.
The specific objectives of the project are to:
• create a 3-D model of the robotic arm based on the design parameters;
The field of robotics has become indispensable to mankind. Robotics technologies are
helping to improve healthcare (through the use of the da Vinci surgical system),
national defence (through the use of robotic bomb detonators) and manufacturing (as
in the industrial applications discussed above).
The availability of a robotic arm that can be used for demonstrative and educational
purposes will go a long way in stimulating the interest of students in robotics. It will
provide a tool for learning and experimenting with robotics. Students with a flair for
programming can write and test programs to control it.
An off-the-shelf robotic arm could suffice, but the cost of such robotic arms is usually
high. For instance, Lynxmotion's educational robotic arm costs $548.03 (₦87,630.00)
with control software and shipping cost included (Lynxmotion, 2013); Images SI
Inc.'s RA-02 robotic arm - a 6-DoF robotic arm - costs $599.95 (₦95,932.01) without
inclusion of shipping cost (Images SI Inc., 2013); while KUKA's youBot™ - a 5-DoF
arm with 2-finger gripper - costs considerably more (KUKA, 2013). The proposed
robotic arm on the other hand has an estimated cost of about ₦39,530.00 as shown in
Table 1.1. Thus, building a robotic arm is a more economical option.
1.8 Scope
This work covers material selection, design, simulation, programming and fabrication
of a basic robotic arm system. It also covers the implementation of the kinematics of
the arm but does not consider the details of the derivation of the kinematic equations.
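Although the full 5-DoF inverse-kinematics equations are adopted rather than derived in this work, the idea behind positioning the gripper from specified coordinates can be illustrated with a planar two-link sketch. Python is used here for brevity (the project's control software is in C#), and the link lengths and elbow-down branch are illustrative assumptions, not the project's values:

```python
import math

def ik_2link(x, y, l1=1.0, l2=1.0):
    """Closed-form inverse kinematics of a planar two-link arm.

    Returns joint angles (q1, q2) placing the end-effector at (x, y).
    """
    r2 = x * x + y * y
    # Law of cosines gives the elbow angle.
    c2 = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if abs(c2) > 1:
        raise ValueError("target out of reach")
    q2 = math.acos(c2)  # elbow-down solution
    q1 = math.atan2(y, x) - math.atan2(l2 * math.sin(q2),
                                       l1 + l2 * math.cos(q2))
    return q1, q2

def fk_2link(q1, q2, l1=1.0, l2=1.0):
    """Forward kinematics, used here to verify the IK solution."""
    x = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    y = l1 * math.sin(q1) + l2 * math.sin(q1 + q2)
    return x, y

q1, q2 = ik_2link(1.2, 0.8)
x, y = fk_2link(q1, q2)  # recovers (1.2, 0.8)
```

Substituting the solved angles back into the forward kinematics is a convenient self-check, and the same check underlies the coordinate-input tests reported in Chapter 4.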
CHAPTER TWO
LITERATURE REVIEW
A brief and clear definition of robotics is given in The Chambers Dictionary (2011). It
defines robotics as the branch of technology dealing with the design, construction and
use (operation and application) of robots. A more technical definition was given by
Craig (2005). He defined robotics as a field that concerns itself with the desire to
synthesize some aspects of human function by the use of mechanisms, sensors, actuators
and computers.
The field of robotics is divided into different aspects and robotics research is carried
out by experts in various fields. It is usually not the case that any single individual has
the entire area of robotics in his or her grasp. A partitioning of the field along
disciplinary lines is therefore common.
By and large, the field of robotics is not a new science, but merely a collection of
topics taken from classical fields. Mechanical engineering
contributes methodologies for the study of the kinematics and dynamics of robots.
Mathematics supplies tools for describing spatial motions and other attributes of
robots. Control theory provides tools for the design and evaluation of algorithms to
realize desired motions or force applications. Electrical engineering
techniques are brought to bear in the design of sensors and interfaces for industrial
robots, and computer science contributes a basis for programming these devices to
perform a desired task.
2.2 Robots
The term robot was first used in a 1921 Czech science fiction play - “Rossum’s
Universal Robots” - by Karel Capek. It was coined from the word "robota", which is
Czech for slave.
The concept of artificial creatures has been around for a long time. Human beings
have always been fascinated with the concept of artificial life and the construction of
machines that look and behave like people. The medieval legend of the Golem (an
artificial man made of clay and brought to life) is one such example.
However, the first patent for what we would now consider a robot was filed in 1954
by George C. Devol and issued in 1961. Devol's robot was the first industrial robot, a
system designed to pick and place objects in a factory environment (Angelo, 2007).
Today, more advanced and highly sophisticated robots have been created. An example
is Honda's ASIMO.
ASIMO has 34 degrees of freedom (34-DOF) that allow it to do things like turn on a
light switch, open doors, carry objects and push carts. It has the ability to recognize
moving objects, postures, gestures, its surrounding environment, sounds and faces,
which enables it to interact with humans. Also, its Intelligent Real-Time Flexible
Walking (i-WALK) technology allows it to take on complex tasks such as descending
stairs, ascending stairs (shown in Figure 2.1) and navigating a slope. Honda Robotics
refers to ASIMO as "the world's most advanced humanoid robot".
Figure 2. 1: ASIMO demonstrates its ability to climb stairs at Auto Zurich Car Show
Source: http://asimo.honda.com/gallery/
The use of robots has led to increased productivity and improved product quality.
Today, many products we buy have been assembled or handled by a robot (Corke,
2011).
George Devol received the first patents for robotics in 1954. He invented the
first robotic arm - a device that comprised a mechanical arm with a gripper that was
mounted on tracks and the sequence of motions was encoded as magnetic patterns
stored on a rotating drum. Devol originally called the device 'Universal Automation',
but later shortened the term to 'Unimation'. Devol's Unimation was the first industrial
robotic arm, a system designed specifically to pick and place objects in a factory
environment. A U.S. patent was issued for his design in 1961 (Angelo, 2007).
In 1973, the wide range of technological experience gained by KUKA, combined with
demands from the automotive industry for powerful and reliable robots culminated in
the development of KUKA's own industrial robot. Known as FAMULUS, it was the
first robotic arm to have six electric motor-driven axes (KUKA, 2013).
Interest in industrial robotics increased in the late 1970s and many companies
entered the field, including large firms like General Electric, and General
Motors (GM), which formed the joint venture GMFanuc Robotics Corporation with
FANUC of Japan (2013).
At the height of the robot boom in 1984, Unimation was acquired by Westinghouse
Electric Corporation and was later sold to Stäubli Faverges SCA of France, which is
still making articulated robots for general industrial and cleanroom applications and
even bought the robotic division of Bosch in late 2004 (Ryan, 2012).
It is estimated that about 165,000 industrial robots were sold worldwide in the year
2011 alone.
Academia has also contributed immensely to the advancement of the field of
robotics.
In 1969, Victor Scheinman at Stanford University invented the Stanford arm, an all-
electric, 6-axis articulated robotic arm designed to permit an arm solution in closed
form. This allowed the robot to accurately follow arbitrary paths in space and widened
the potential use of the robot to more sophisticated applications such as assembly and
welding.
Scheinman then designed a second arm for the Massachusetts Institute of Technology
(MIT) AI Laboratory, called the "MIT arm". Scheinman sold his designs to
Unimation, which further developed them with support from General Motors and
later marketed the arm as the Programmable Universal Machine for Assembly
(PUMA) (Ryan, 2012).
The PUMA was a very versatile pick-and-place industrial robotic arm that could
be applied to a wide range of industrial tasks.
In recent times, various research works on the robotic arm have been carried out by
students and researchers. One B.Sc. final year project at the Obafemi Awolowo
University developed a serial algorithm for the dynamic simulation of a robot
manipulator. The manipulator was assumed to have rigid links connected by
revolute joints and powered by ideal actuators. It was also assumed to be non-
redundant and its end-effector was assumed to be in contact with some rigid body in
its workspace.
The Euler-Lagrangian formulation was used to derive the equations describing the
dynamics of the manipulator. The equation related the torques acting at the joints due
to the motion of the manipulator links, inertia effects, centripetal effects (due to
rotation of the links around a joint), gravitational effect and the velocities of the links
(represented by the Jacobian Matrix). The joint accelerations were then solved, with
the assumption that the torques were given after which numerical integration
technique was used to solve for the next state joint positions and rates of the
manipulator.
A two-link manipulator was used as a case study for the simulation. The step-by-step
serial algorithm was used to develop a computer program for the simulation.
The report showed that the algorithm derived was practical and result-oriented. The
approach used for the simulation is however only efficient for a few links but
computationally intensive for manipulators with a higher number of links, such as
five and above.
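The dynamics step described above (solve the joint accelerations from the applied torques, then advance the state by numerical integration) can be sketched as follows. The planar two-link, point-mass model and all parameter values are illustrative assumptions, and Python is used for brevity rather than the formulation in the original report:

```python
import math

def dynamics(q, qd, tau, m1=1.0, m2=1.0, l1=1.0, l2=1.0, g=9.81):
    """Solve M(q)*qdd = tau - C(q, qd) - G(q) for the joint accelerations
    of a planar two-link arm with point masses at the link ends."""
    cos2 = math.cos(q[1])
    # Inertia matrix M(q)
    m11 = (m1 + m2) * l1**2 + m2 * l2**2 + 2 * m2 * l1 * l2 * cos2
    m12 = m2 * l2**2 + m2 * l1 * l2 * cos2
    m22 = m2 * l2**2
    # Centripetal and Coriolis terms C(q, qd)
    h = m2 * l1 * l2 * math.sin(q[1])
    c1 = -h * qd[1]**2 - 2 * h * qd[0] * qd[1]
    c2 = h * qd[0]**2
    # Gravity terms G(q)
    g1 = (m1 + m2) * g * l1 * math.cos(q[0]) + m2 * g * l2 * math.cos(q[0] + q[1])
    g2 = m2 * g * l2 * math.cos(q[0] + q[1])
    # Solve the 2x2 system qdd = M^-1 * (tau - C - G)
    b1, b2 = tau[0] - c1 - g1, tau[1] - c2 - g2
    det = m11 * m22 - m12 * m12
    return ((m22 * b1 - m12 * b2) / det, (m11 * b2 - m12 * b1) / det)

def step(q, qd, tau, dt=1e-3):
    """One explicit-Euler integration step of the joint state."""
    qdd = dynamics(q, qd, tau)
    qd = (qd[0] + qdd[0] * dt, qd[1] + qdd[1] * dt)
    q = (q[0] + qd[0] * dt, q[1] + qd[1] * dt)
    return q, qd
```

Note that the inertia matrix must be re-evaluated at every step because it depends on the elbow angle; this per-step cost is what makes the serial approach expensive as the number of links grows.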
Another B.Sc. final year project at the Obafemi Awolowo University applied
geometric modelling to the simulation of a robot manipulator. In geometric
modelling, an object or concept is represented in a model form. The model form,
though different from the original object or concept, simulates all the qualities of the
original, and the object shape is accurately described using techniques such as the
wireframe model. The Bezier curve-defining technique was used to create the
wireframe model of the manipulator. The technique was chosen because it allowed for
easy control of the model shape by changing a few parameters as well as its ability to
remain stable and invariant under affine transformations such as translation and
rotation.
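A minimal sketch of the Bezier technique (illustrative control points; Python is used for brevity): a cubic curve is evaluated from four control points via the Bernstein polynomials, and translating the control points translates the evaluated curve, which demonstrates the affine invariance mentioned above.

```python
def bezier(p0, p1, p2, p3, t):
    """Point on the cubic Bezier curve at parameter t in [0, 1],
    using the Bernstein polynomial form."""
    b0 = (1 - t) ** 3
    b1 = 3 * t * (1 - t) ** 2
    b2 = 3 * t ** 2 * (1 - t)
    b3 = t ** 3
    return tuple(b0 * a + b1 * b + b2 * c + b3 * d
                 for a, b, c, d in zip(p0, p1, p2, p3))

pts = [(0, 0), (1, 2), (3, 2), (4, 0)]
mid = bezier(*pts, 0.5)  # curve midpoint: (2.0, 1.5)
# Affine invariance: translating the control points translates the curve.
shifted = [(x + 1, y + 1) for x, y in pts]
mid2 = bezier(*shifted, 0.5)  # (3.0, 2.5)
```

Moving one control point reshapes only the nearby portion of the curve, which is the "easy control of the model shape by changing a few parameters" that motivated the choice.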
The generated wireframe model of the manipulator was then used to simulate the
rotation of its links using a suitable algorithm. The algorithm was used to write a
computer program to handle the tedious and complex mathematical computations that
the simulation involved.
Although the objectives of the project were met, it had some shortcomings. The
wireframe models did not represent the robot manipulator accurately. A surface or
solid modelling technique would have been more appropriate, though it would have
been more computationally demanding.
Yet another B.Sc. final year project at the Obafemi Awolowo University investigated
the application of Fuzzy Logic Control for a robotic arm. The design was configured
for a two-degree of freedom system in which the robotic arm could move on a plane
(i.e. displacement in two perpendicular directions) with the aid of motors placed at
the joint locations of the arm.
Fuzzy logic is a form of many-valued logic reasoning that recognizes more than
simple true and false values. Compared to traditional binary sets (where variables may
take on true or false values), fuzzy logic variables may have a truth value that ranges
in degree between 0 and 1.
Fuzzy logic was used to design a control for the robotic arm based on results obtained
from sensors placed at strategic locations of the arm. The major reason for
incorporating fuzzy control into the arm was to implement collision detection and
avoidance.
The sensors enabled the determination of relative distances between the robotic arm
and obstacles around it.
The design was based on the Model Approach. Objective information was represented
as linguistic statements that were converted to rules that were then quantised using
fuzzy logic.
Based on the distances from obstacles determined by the four sensors (which formed
the inputs) and two membership functions each, a total of sixteen (16) linguistic rules
was formulated.
The linguistic rules were used to instruct the motors to rotate at a certain speed and in
a certain direction.
The speed and direction of the motors were represented with the following acronyms:
PF - Positive Fast
PM - Positive Medium
PS - Positive Slow
NS - Negative Slow
NM - Negative Medium
NF - Negative Fast
An example of such a rule is: if the distance to the left is Big and the distance in front
is Big and the distance to the right is Small and the distance at the back is Big, then
the Forward-Backward motor speed is Positive Slow and the Left-Right motor speed
is Positive Slow.
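Such a rule can be evaluated numerically by combining membership degrees with a fuzzy AND, commonly taken as the minimum. The ramp-shaped membership functions and their break-points below are illustrative assumptions, not the values used in that project:

```python
def mu_small(d, lo=0.0, hi=50.0):
    """Degree to which a distance (in cm) is 'Small': 1 at lo, ramping to 0 at hi."""
    if d <= lo:
        return 1.0
    if d >= hi:
        return 0.0
    return (hi - d) / (hi - lo)

def mu_big(d, lo=0.0, hi=50.0):
    """'Big' is taken as the complement of 'Small'."""
    return 1.0 - mu_small(d, lo, hi)

def rule_strength(left, front, right, back):
    """Fire the rule: left Big AND front Big AND right Small AND back Big.
    The fuzzy AND is the minimum of the membership degrees."""
    return min(mu_big(left), mu_big(front), mu_small(right), mu_big(back))

w = rule_strength(45, 40, 10, 48)  # firing strength 0.8
# w then scales the 'Positive Slow' output assigned to both motors.
```

The firing strengths of all sixteen rules would be aggregated (e.g. by a weighted average over the output labels) to produce the final crisp motor speeds.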
The design was able to achieve the aim of ensuring the robotic arm reached its
destination while effectively avoiding collision with obstacles that get into its path.
The controller can be further improved by increasing the number of sensors, using
sensors with higher sensitivities and increasing the number of membership functions
for both distances and motor speeds. Also, increasing the degrees of freedom of the
robotic arm will enhance its capability for collision avoidance due to increased ability
to manoeuvre around obstacles.
One notable project was an entry for the Nios II Embedded Processor Design
Contest. The project involved the design of a smart robot to expand the applications
for which robotic arms are traditionally used. The robotic arm was installed on an
omnidirectional mobile vehicle and a network camera was used for vision.
The omnidirectional mobile home-care robot provides home care for the disabled and
improves their quality of life. The smart robot can perform various tasks such as
reaching for an item with the help of its network camera, the platform, and the robotic
arm. The user grabs the article’s image on the camera LCD (Liquid Crystal Display)
screen and the system directs the platform and arm to pick up the object. The robot
helps the disabled person pick up objects and perform household tasks. Figure 2.2
shows the omnidirectional mobile home care robot. The robotic arm and the
omnidirectional mobile vehicle system use two Altera® Nios® II test boards. Stratix®
and Stratix II devices control the system, motor position, and speed while a PC
performs image processing. The robot arm has five servomotors and the platform has
three.
The user uses the webcam network camera to select image information, which is
transmitted to the PC for processing.
Upon completion of the image processing, the system determines the object’s three-
dimensional (3-D) coordinates compared to the robot’s. Then, the system substitutes
the coordinates in an algorithm to obtain the rotation angle of the robot arm’s five-axis
motor as well as the platform’s best rotating X and Y coordinates. The system then
transmits the control signal to the hub via TCP (Transmission Control Protocol) and
Winsockets and delivers the rotation angle and the XY coordinates to the development
boards, which control the speed and position of the robotic arm and platform in their
respective coordinate systems.
Jenne (2009) of the Swiss Federal Institute of Technology, Zurich designed and
implemented a robotic system consisting of a mobile platform, realized with a
differential drive, and a 6-DoF robotic arm (shown in Figure 2.3). Firstly, the
kinematic models of the system were analyzed separately. With the theory of the
homogeneous matrix exponential method, a combination of the kinematic models was
established. Further on, the motion of the mobile platform was incorporated into the
combined kinematic model.
Also, an algorithm was developed to control the performance of the robotic arm and
its mobile platform together. With a priority distribution, the differential drive could
manoeuvre the arm to a convenient position and the arm's end-effector could execute
the desired task.
To implement the controller on the real robot, data structures and a transfer protocol
were created to communicate with the robot. The software did not take the robot's
inertia into account, so the speed controller had to be modified. It proved to be highly
suitable to control the combined system with a standardized control algorithm.
Figure 2. 3: Model of the robotic arm and the complete robot system by Jenne
Source : Jenne (2009)
Blackmore et al. (2009) from Portland State University worked on a project that
aimed to build a robot exhibiting human-like behaviours. The means of achieving the
project objectives were contained in three
aspects of the robot's design. The first was a mobile base that allowed the robot to
navigate through an indoor space. The second was a head and neck capable of
producing several human-like gestures coordinated with speech. The third was an arm
and hand assembly used to supplement the robot's gestures and allow it to grasp and
move objects.
The robotic arm had the ability to grasp an object and place it in a different location. It
was similar in scale to that of a human arm and could replicate similar motions. The
arm was made with standard components, such that it could be easily reproduced and
mirrored to create left and right versions. Also, the arm could be easily mounted to
the body of the robot.
The ability of the arm to move an object was given attention. With this in
consideration, an object with a negligible mass was chosen. This reduced the torque
requirements for the arm's actuators and allowed the overall design to be built to a
larger scale.
To simplify the software design, each joint on the arm was manually controlled.
Although the developers intended to make provision for more sophisticated motion
control in future phases of the project, the simplified approach allowed the robot's
arm to be operated early in its development.
As a first step to higher levels of motion control, the forward kinematic equations for
the arm were derived.
The robotic arm was built with TETRIX robot kit components. In addition to the stock
TETRIX components, welding rod was used to fashion the fingers of the hand. The
plate that the fingers attached to was also fabricated. Figure 2.4 shows the final
TETRIX robot arm and hand assembly.
The electronic components used to operate the arms consisted of two direct current
(dc) motors, four servomotors, one dc motor controller, one servo controller, and the
NXT brick. The dc motors were used on the elbow and shoulder joints to provide
more torque and stability while servomotors were used to control the hand, wrist, and
arm rotation. All these components were part of the Lego Tetrix Robotics toolkit.
Using the Tetrix parts along with the NXT brick allowed for less time spent
integrating and developing drivers, because when programmed with RobotC, the
drivers and control functions are already integrated into the system, allowing for more
time to be spent on the functionality of the arm.
The main control of the arm was done by the NXT brick (a control unit run by a
32-bit ARM7 microprocessor).
Another project simulated and implemented a mobile robotic base with an attached
robotic arm. The aim of the project was to create a robot capable of assisting elderly
people with tasks in the home.
One of the project aims was to design a robot that could come to the aid of the user.
Thus, the robot had to run in an autonomous mode because it was intended to be used
by an aged person. The robot could travel to the user when it was called for help.
Figure 2. 4: Final TETRIX robot arm and hand assembly by Blackmore et al.
Source : Blackmore et al. (2009)
Thus, the robot had the ability to know the position of the user or at least get a general
idea of the user's location.
The robot was designed to perform indoor user location; indoor operation decreases
the accuracy of satellite-based positioning systems, necessitating the use of indoor
tracking methods such as a WLAN (Wireless Local Area Network) - based tracking
system. The location of the user was accomplished by this means.
To be able to perform tracking, the robot had sensors that could carry out dead
reckoning.
Object avoidance was also incorporated into the robot to enable it to navigate properly
through the room. This was achieved using an infrared transmitter and receivers that
could detect an obstacle. Then the data was processed in the micro-processor to
determine how to steer around the obstacle.
In order for the robot to find and pick up objects, it was equipped with the ability to
identify and locate them. The method used to achieve this involved the translation of
sensor data into object positions.
A RA-01 robotic arm with five degrees of freedom from Images Scientific
Instruments Incorporated, New York was used by Akinwale (2011) for his M.Sc.
project. The five degrees of freedom of the arm are base rotation, shoulder
movement, elbow movement, wrist rotation and gripper open-close movement.
The robotic arm platform was set up and the robotic arm was interfaced with a
computer. He implemented the forward and inverse kinematics of the arm using the
Denavit-Hartenberg (D-H) parameters.
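As a sketch of how D-H parameters yield a manipulator's forward kinematics, the per-link homogeneous transforms can be composed in sequence. The two-joint D-H table below is a made-up planar example, not the RA-01's parameters, and Python is used for brevity:

```python
import math

def dh_matrix(theta, d, a, alpha):
    """Standard D-H homogeneous transform from frame i-1 to frame i."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [[ct, -st * ca,  st * sa, a * ct],
            [st,  ct * ca, -ct * sa, a * st],
            [0.0,      sa,       ca,      d],
            [0.0,     0.0,      0.0,    1.0]]

def matmul(A, B):
    """4x4 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def forward(dh_rows):
    """Compose the link transforms; the last column of the result
    gives the end-effector position in the base frame."""
    T = [[float(i == j) for j in range(4)] for i in range(4)]
    for row in dh_rows:
        T = matmul(T, dh_matrix(*row))
    return T

# Illustrative planar 2R arm, one (theta, d, a, alpha) row per joint.
T = forward([(math.pi / 2, 0.0, 1.0, 0.0),
             (-math.pi / 2, 0.0, 1.0, 0.0)])
x, y = T[0][3], T[1][3]  # end-effector position (1.0, 1.0)
```

Each joint contributes one row of the D-H table, so extending the sketch to a 5-DoF arm only means supplying five rows with the arm's measured link parameters.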
The online laboratory platform was implemented using the Massachusetts Institute
of Technology (MIT) iLab framework, involving the development of experiment
engines for the arm and integration of the experiment engines into the iLab
Interactive Services Architecture. Measures were also put in place to
ensure laboratory safety without human supervision and hence, ensure robustness
of the setup.
Another researcher developed an instructional robotic arm for educational
purposes. The system was implemented using improvised materials while the
cost was kept low.
He fabricated two models (A and B), evaluated and compared their results. Test
results showed that Model A failed due to the inability of its electronic circuit to
supply the power required by its motors. Further tests showed that Model B
was fully functional, having all of its joints (apart from the gripper) capable of
360° movement.
In line with emphasizing local content, he obtained motors from old antennas and
from children's toys. Perspex was used for each link and the gripper of the successful
model.
Forward and inverse kinematics of the robotic design were given in his report. The
results of the kinematic analysis were however not implemented in the design of the
robot. Doing that would have been difficult due to its simplistic method of control.
Instead, the robot used the simple mechanism of the human eye as its control.
Supposing the student controlling the robot wanted the end-effector to move from
position A to position B, he had to move each link of the robot until the end-effector
reached position B.
His study showed that an educational instructional robotic arm made using local
materials was feasible and viable for teaching robotics, especially at early stages.
Herman et al. (2012) ,of the University of Zaragoza (Spain), designed and built a low-
cost mobile platform equipped with a robotic arm. The low-cost mobile platform was
a TurtlebotTM prototype. The robotic arm was built with off-the-shelf parts and
controlled from an Arduino board. A Kinect sensor was used for object detection.
Two ROS (Robot Operating System) nodes managed all the elements; a scenario in
which objects were moved out of the way was described. The initial prototype had a simple
The project augmented a standard vacuum cleaner base with a custom arm capable of
grasping simple items, like small pieces of clothing, that were in the way of the robot.
The goal was to allow the robot to identify objects in its path and, if they were suitable
targets, remove them using the arm, in order to proceed with a hypothetical vacuum-
cleaning task.
The prototype, shown in Figure 2.7, was capable of detecting and grasping small soft
objects, like light clothes. It had trouble with larger or rigid objects, since only a basic
gripper was fitted.
Ishola (2012) of Obafemi Awolowo University, in his B.Sc. final year project,
developed a remote online laboratory experiment on the static bending of a beam.
The experiment involved applying a known force at any point between the ends of a
beam supported on a pivot at both ends and thereafter measuring the deflection of the
beam using a suitable instrument. The value of the deflection was then used to
calculate the elastic property of the beam material.
The design took into consideration the case of permanent deformation or damage of the
beam in the event that a student applied an extremely high force on the beam.
Considering the absence of a lab attendant to replace the beam, as it was a remote
laboratory where everything was expected to be automated, a means of repositioning
the beam automatically was required.
Thus, a robotic arm with four degrees of freedom (4-DoF) was designed and
implemented. The joints of the arm were powered by AX-12™ dynamixel™ servos
manufactured by Robotis®.
A continuous rotation servo was used to effect a 360° rotation at the base joint.
Figure 2. 7: Herman's robotic arm built with standard bricks, before being mounted on
a Turtlebot™ prototype
Control of the servos from a PC was achieved using the USB2Dynamixel™ - a TTL
“Virtual Serial Port” that attaches to the USB port on a PC. It was used as the
connector between the dynamixel servos and the computer, thus allowing the servos to
be controlled directly from the PC.
A 9V, 4.5A power supply was used to power the dynamixel actuators with each
actuator needing about 900 mA at 9 V. The power supply circuit used a pass transistor.
The interfacing software for communicating with the robotic arm was developed.
Also, the inverse kinematics equations of the arm were derived and implemented so as
to give it the ability to pick the beam at the specified location. The robotic arm is
shown in Figure 2.8.
For the kinematic analysis of a serial-link manipulator (robotic arm), a systematic,
general method is needed to define the relative position and orientation of two
consecutive links. The approach is to attach a frame to each of the two links and
compute the coordinate transformation between them. In general, the frames can be
arbitrarily chosen as long as they are attached to the links they refer to. Nevertheless,
it is convenient to set some rules for the definition of the link frames.
Figure 2. 8: Robotic arm for picking beams for static bending test
Source : Ishola (2012)
The Denavit-Hartenberg (D-H) convention was introduced to standardize the
coordinate frames for spatial linkages (Denavit and Hartenberg, 1955).
The D-H representation will be used in carrying out various analyses in the design
process of the proposed robotic arm; thus, the procedure for determining the Denavit-
Hartenberg parameters is outlined below.
It should be noted that there are two quite different forms of the Denavit-Hartenberg
notation:
• Classical, as per the original 1955 paper of Denavit and Hartenberg, and
• Modified, as used by Craig (2005).
Both notations represent a joint as two translations and two rotation angles.
However, the expressions for the link transform matrices of the two notations are quite
different. Unfortunately, much of the robotics literature does not specify this crucial
piece of information. Most textbooks cover only one and do not even allude to the
existence of the other. The modified form will be adhered to in the course of the
analyses in this report.
CHAPTER THREE
METHODOLOGY
This chapter covers the detailed design and the method of construction of the robotic arm.
The following material properties were put into consideration during the material
selection process:
• strength
• lightness
• availability
• ease of cutting.
The material should possess sufficient strength so as to ensure that each link of the arm
is able to bear the load imposed on it by motors, other attached links and the payload.
Lightness of the material reduces the torque requirement of the robot actuators,
thereby minimising cost. The material also had to be readily available and easy to cut
because the fabrication of some parts of the robotic arm involved the cutting of
intricate shapes.
Perspex (polymethyl methacrylate or PMMA plastic) satisfies all the above criteria and
was thus selected for the project. Perspex can be easily cut with readily available hand
tools such as a fret saw. In addition to this, it offers outstanding strength and stiffness
for its weight; its lightness is a key advantage for any demonstrative or educational
robotics project.
The entire implementation plan of the robotic arm from start to finish is represented
by a block flow diagram shown in Figure 3. 1. The software used for each stage is
arm setup.
As earlier stated in the introductory chapter of this report, the robotic arm has five
degrees of freedom (5-DOF) with the inclusion of the gripper. A CAD (Computer
Aided Drafting) model of the robotic arm structure was designed and rendered using
AutoCAD®.
Kinematics is the branch of mechanics that studies the motion of a body or a system of
bodies without giving consideration to its mass or the forces acting on it. The
kinematic analysis of the robotic arm comprises:
• Forward kinematics
• Inverse kinematics
Forward Kinematics
The goal of the kinematic analysis of a robotic arm is to develop equations that
can be used to determine the position of the end-effector given any set of joint
values. The equations developed relate the end-effector position to the joint
variables.
(Figure 3.1 block flow diagram: obtain the kinematic equations of the robotic arm;
simulate the forward and inverse kinematics of the robot arm graphically; import the
3D model (in .dwg format) into Autodesk® Inventor® to determine the weight and
inertia of each link (using iProperties™) for the torque calculation, the purpose of
which is to facilitate the selection of appropriate motors, i.e. servos with the required
torque rating, for each joint of the arm; send serial bytes that are converted into five
PWM signals to control servo rotation.)
Forward kinematics refers to the use of the kinematic equations of the robotic arm to
compute the position of the end-effector from specified values for the joint parameters
(Paul,1981).
Inverse Kinematics
A method of determining the pose of the end-effector of the robotic arm, given the
joint coordinates has been discussed. "A problem of real practical interest is the
inverse problem: given the desired pose of the end-effector, what are the required joint
coordinates? For example, if we know the Cartesian pose of an object, what joint
coordinates does the robot need in order to reach it? " (Corke, 2011).
Based on the procedure for determining the Denavit-Hartenberg parameters of a
serial link, the D-H parameters of the robot were determined for the analysis of the
kinematics of the robotic arm.
Refer to Figure 3.5. Firstly, frames were attached to each of the links of the
manipulator at the joint positions. The convention for locating frames on the links is
as follows: the Z-axis of frame {i}, Zi, is coincident with the joint axis i. The origin
of frame {i} is located where ai perpendicularly intersects the joint i axis, and Xi
points along ai. The link twist α is measured in the right-hand sense about Xi, so the
freedom in choosing the sign of α corresponds to the two choices for the direction of
Xi. Yi is then formed by the right-hand rule to complete frame {i}.
(Figure 3.5: link frames {i-1} and {i} attached to links i-1 and i, showing the axes
Xi-1, Yi-1, Zi-1 and Xi, Yi, Zi and the parameters ai-1, ai, di, θi and αi-1.)
If the link frames have been attached to the links according to the above convention,
the transformation relating frame {i} to frame {i-1} is:

          | cosθi            -sinθi            0         ai-1        |
  i-1iT = | sinθi cosαi-1    cosθi cosαi-1    -sinαi-1   -di sinαi-1 |
          | sinθi sinαi-1    cosθi sinαi-1    cosαi-1    di cosαi-1  |
          | 0                0                0          1           |

where 'α' is called the link twist, 'a' is called the link length, 'd' is called the link offset
and 'θ' is called the joint variable (in the case of revolute joints)
(Craig, 2005).
A schematic diagram of the robotic arm showing the chosen joint axes is shown in
Figure 3. 6. The D-H parameters of the robotic arm, shown in Table 3.1, were
determined accordingly.
i αi-1 ai-1 di θi
1 0° 0 0 θ1
2 90° 0 0 90°- θ2
3 0° L2 0 θ3
4 90° L3 0 180° - θ4
5 0° 0 0 180°
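As an illustration only (the thesis' own analysis uses MATLAB®), the modified D-H parameters of Table 3.1 can be chained into a forward-kinematics sketch in Python. The link lengths passed in below are placeholder values, not the fabricated arm's dimensions.

```python
import math

def dh_modified(alpha_prev, a_prev, d, theta):
    """Single-link transform in Craig's modified D-H convention (4x4, row-major)."""
    ca, sa = math.cos(alpha_prev), math.sin(alpha_prev)
    ct, st = math.cos(theta), math.sin(theta)
    return [
        [ct,      -st,      0.0,  a_prev],
        [st * ca,  ct * ca, -sa,  -d * sa],
        [st * sa,  ct * sa,  ca,   d * ca],
        [0.0,      0.0,      0.0,  1.0],
    ]

def matmul4(A, B):
    """Multiply two 4x4 matrices stored as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def forward_kinematics(q, L2, L3):
    """Chain the five link transforms of Table 3.1; q = (theta1..theta4) in radians."""
    t1, t2, t3, t4 = q
    d90, d180 = math.pi / 2, math.pi
    links = [
        dh_modified(0.0, 0.0, 0.0, t1),          # row 1:  0°,  0, 0, θ1
        dh_modified(d90, 0.0, 0.0, d90 - t2),    # row 2: 90°,  0, 0, 90° - θ2
        dh_modified(0.0, L2,  0.0, t3),          # row 3:  0°, L2, 0, θ3
        dh_modified(d90, L3,  0.0, d180 - t4),   # row 4: 90°, L3, 0, 180° - θ4
        dh_modified(0.0, 0.0, 0.0, d180),        # row 5:  0°,  0, 0, 180°
    ]
    T = links[0]
    for M in links[1:]:
        T = matmul4(T, M)
    return T
```

At the all-zero joint configuration this places the end-effector at (0, 0, L2 + L3) relative to the base frame, consistent with an upright rest pose.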
The D-H parameters of the robot arm are identical to those of the RA-01 robotic
arm used by Akinwale (2011) for his M.Sc. thesis, and the kinematic analysis of the
robot follows that thesis. Akinwale (2011) gives equations for the location of the
central point of the end effector; these equations represent the forward kinematics of
the robotic arm - given the values of the joint angles, they can be used to obtain the
location of the end effector.
The inverse kinematics problem was to solve this equation for θ1, θ2 and θ3 given the
values of x, y and z.
θ1 = tan-1(y/x)
with θ2 and θ3 each given by a sin-1 expression carrying a ± sign choice, in terms of
x, y, z and the link lengths, where
k2 = L4 cosθ3 - L3 sinθ3
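Since the full closed-form expressions are specific to Akinwale's derivation, the following Python sketch illustrates the same idea with a standard geometric (law-of-cosines) solution for a waist-plus-two-link arm. The function names, link lengths and the simplified geometry are illustrative assumptions, not the author's equations.

```python
import math

def ik_planar(x, y, z, a1, a2, elbow_up=True):
    """Geometric inverse kinematics for a waist joint plus a 2-link planar arm.

    Returns (theta1, theta2, theta3): waist, shoulder and elbow angles in radians.
    Illustrative sketch only; the thesis' own arm has additional wrist geometry.
    """
    theta1 = math.atan2(y, x)            # waist rotation toward the target
    r = math.hypot(x, y)                 # horizontal reach in the arm's plane
    # Law of cosines for the elbow angle:
    D = (r * r + z * z - a1 * a1 - a2 * a2) / (2 * a1 * a2)
    if abs(D) > 1.0:
        raise ValueError("target outside workspace")
    s = math.sqrt(1.0 - D * D)
    theta3 = math.atan2(-s if elbow_up else s, D)
    theta2 = math.atan2(z, r) - math.atan2(a2 * math.sin(theta3),
                                           a1 + a2 * math.cos(theta3))
    return theta1, theta2, theta3

def fk_planar(theta1, theta2, theta3, a1, a2):
    """Matching forward kinematics, used to verify an IK solution."""
    r = a1 * math.cos(theta2) + a2 * math.cos(theta2 + theta3)
    z = a1 * math.sin(theta2) + a2 * math.sin(theta2 + theta3)
    return r * math.cos(theta1), r * math.sin(theta1), z
```

Running a reachable target through `ik_planar` and back through `fk_planar` recovers the original coordinates, which is the same consistency check the thesis applies.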
"The three equations above together make up the joint configuration of the robotic
arm to locate the gripper at a particular point in space. With each computation of the
inverse kinematics, a maximum of one solution is possible for θ1, two results for θ3
and four results for θ2. If the values of x, y and z used for computing the inverse
kinematics lie within the workspace of the robotic arm, then there would be four
possible configurations of the robotic arm. One of these configurations will definitely
place the end-effector of the robotic arm at the specified location. The other
configurations may or may not do so, hence a feedback algorithm is used to find the
solution that successfully places the robotic arm at the desired location.
The feedback algorithm runs each solution obtained from the inverse kinematics
through the forward kinematics to obtain a location in space, which is then compared
against the specified target location. The first solution to yield the correct location in
space is selected.
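The feedback algorithm described above can be sketched as a simple selection loop; the forward-kinematics function and the candidate joint sets passed in are assumptions for illustration.

```python
import math

def select_solution(candidates, forward_kin, target, tol=1e-6):
    """Run each candidate joint set through forward kinematics and return the
    first one whose end-effector location matches the target within tol."""
    tx, ty, tz = target
    for q in candidates:
        x, y, z = forward_kin(q)
        if math.dist((x, y, z), (tx, ty, tz)) <= tol:
            return q
    return None  # no candidate reaches the target

# Toy demonstration with a stand-in forward-kinematics function:
fk = lambda q: (q[0], q[1], q[0] + q[1])
best = select_solution([(1.0, 2.0), (3.0, 4.0)], fk, (3.0, 4.0, 7.0))
```

Here the second candidate is returned because it is the first whose forward kinematics reproduces the target.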
MATLAB® was used to graphically illustrate the kinematic analysis of the robotic
arm. A strength of MATLAB® is its support for Toolboxes which are collections of
functions targeted at particular topics. To simplify the task of carrying out the
kinematic analysis with MATLAB, a third party toolbox developed by Peter Corke
was used. The toolbox, Robotics Toolbox for Matlab (release 9.8) simplifies such
tasks as kinematics, dynamics, and trajectory generation for serial arm-type robots
(petercorke.com, 2013).
The Toolbox is based on a very general method of representing the kinematics and
dynamics of serial-link manipulators.
The D-H parameters (given in Table 3.1) were used to create a vector of 'Link' objects
as shown below:
The joint angles theta are the variables since all the joints are revolute; the zeros
serve only as placeholder values.
>>L
L=
theta=q1, d= 0, a= 0, alpha= 0 (R,modDH)
The link objects were then passed to the constructor 'SerialLink' as shown below:
This provides a concise description of the robot, which has 5 revolute joints. The
kinematic parameters of the link objects are also listed, and the joint variables are
shown as variables q1, q2, q3, q4 and q5. The robot was also assigned a name,
'Ademola01'.
Although a value was given for θ, it is not displayed; since θ is the joint variable for a
revolute joint, the value stored in the Link object is ignored, and the joint angle is
instead supplied when a toolbox method is invoked.
To illustrate different poses of the robotic arm, four different sets of joint angles were
defined and the plot method was invoked. Invoking the plot method displays the
robot graphically, showing the robot's name, the joints and their axes, and a shadow
on the ground plane. For the first set of joint angles,
MATLAB returned;
qn =
MATLAB returned a graphical representation of the robot pose shown in Figure 3.5.
robot pose corresponding to each set of joint angles as shown in Figures 3.6 to 3.8.
Also, the effect of changing the set of joint angles on the pose of the robotic arm was
simulated using the "teach" method in the Robotics Toolbox. The teach method is a
useful tool in the MATLAB® Robotics Toolbox for simulating robots. It is used to
drive (the joints of) a graphical robot by means of a graphical slider panel. The
graphical slider panel provides a means of changing the set of joint angles of the
robotic arm (Corke, 2008). Figure 3. 11 shows a screenshot of the teach method in
use.
Each pose of the robotic arm positions the frame of the end effector in a point in space
that can be represented by a transformation matrix. The goal of the forward kinematic
analysis of the arm using the Robotics toolbox is to obtain this transformation matrix
for any set of joint values. The transformation matrix is a representation of the
position of the end effector of the arm in space relative to a reference frame. The
Figure 3. 11: Illustration of the use of teach method for simulating a robotic arm
MATLAB returned;
T=
0 0 0 1.0000
The method returns the homogenous transform that represents the pose of the end-
effector.
R=
The significance of this rotation matrix can be further explained by using Euler
angles to describe the end-effector orientation. Thus;
>> tr2eul(R)
ans =
This implies successive rotations of the end effector frame about the z-axis, the
y-axis and again the z-axis (the ZYZ Euler convention).
ans =
Thus, the position and orientation of the end effector frame corresponding to the final
set of joint angles were obtained.
The same procedure can be repeated for any set of joint angles.
The inverse kinematics of the arm was illustrated and visualised using the ikine
method of the Robotics Toolbox. Assume the robot end effector is initially located at a
point in space described by the homogenous transform T1, where T1 is given as;
>> T1 = Ademola01.fkine
T1 =
0 0 0 1.0000
The inverse kinematic problem involves obtaining the joint angles (qs2) that
correspond to this pose.
MATLAB returned:
Warning: Initial joint angles results in near-singular configuration, this may slow
convergence
qs2 =
The value of qs2 computed by the ikine method is equal to the original value of qs.
The argument '[1 1 1 1 1 0]' is a mask matrix which instructs the 'ikine' method to
ignore the axis with zero value in arriving at the solution for the inverse kinematics.
This was necessary because the robotic arm has five degrees of freedom as opposed to
the six that the toolbox assumes by default.
The argument 'pinv' instructs the toolbox to use the pseudo-inverse method when
iterating towards the solution.
To effect motion of the links of a robotic arm, an actuator is fixed at each joint of the
arm. The actuator applies torque at the joint to overcome the resistance of the link to
motion. Resistance of any link to motion is due to gravity and inertia effects.
The effect of gravity on any link of the robot is to pull and accelerate it towards the
centre of the earth due to its own weight, thus exerting a resistive torque on it.
Therefore, a portion of the torque generated by the actuator is needed to overcome this
resistive torque (due to gravity). It was necessary to calculate the value of the gravity-
induced resistive torque acting on each link of the arm so that an actuator with
The resistive torque exerted on a joint due to gravity acting on the robot depends very
strongly on the robot’s pose. Intuitively the torque on the shoulder joint is much
greater when the arm is stretched out horizontally. Thus, in order to calculate the
torque required at each joint, the worst case scenario (arm completely stretched out)
was chosen; the arm is subjected to the highest torque when it is stretched out
horizontally.
Figure 3. 12 shows a labelling of the lengths of the links of the robot arm, while the
free-body analysis uses the following notation:
J2 = shoulder joint
J3 = elbow joint
J4 = wrist joint
J5 = gripper joint
W1 = Weight of link L1
W2 = Weight of link L2
W3 = Weight of link L3
W4 = Weight of link L4
The calculation of the resistive torque exerted on each joint due to gravity effect is as
follows:
T1g = 0 N.m, since waist rotation does not cause motion of any link in the vertical
plane;
T4g = 0 N.m, since wrist rotation does not result in motion against gravity;
T5g = 0 N.m, since opening and closing of the gripper jaws does not result in motion
against gravity.
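For the remaining joints, the gravity-torque computation reduces to summing weight times moment arm over everything the joint carries in the stretched-out pose. The weights and distances below are hypothetical placeholders, not the measured values used later in this chapter.

```python
import math

def gravity_torque(weights_and_arms):
    """Resistive torque about a joint due to gravity, arm fully stretched:
    sum of W * d over every (weight in N, horizontal moment arm in m) pair
    of the links, motors and payload carried by the joint."""
    return sum(w * d for w, d in weights_and_arms)

# Hypothetical values for a shoulder joint (NOT the thesis' measured data):
# three carried masses weighing 0.8 N, 0.6 N and 0.4 N, with centres of mass
# at 0.07 m, 0.21 m and 0.30 m from the joint axis respectively.
T_shoulder = gravity_torque([(0.8, 0.07), (0.6, 0.21), (0.4, 0.30)])
```

The design then sizes the servo so its rated torque exceeds this worst-case sum with margin.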
The resistance that is shown by a body to change its rotation is called moment of
inertia (I). Thus, the difficulty faced to change the angular motion of a body about an
axis is measured by calculating its moment of inertia about that particular axis. The
moment of inertia plays the same role for rotation as the mass does for translational
motion (Benenson et al., 2006).
Thus, for a given angular acceleration of any link of the robot, the inertia of the link
results in a resistive torque applied at the joint where the lower end of the link is
connected. The value of this torque (Ti) is given by the product of the moment of
inertia (I) and the angular acceleration (α):
Ti = Iα
According to Benenson et al. (2006), the moment of inertia of a body depends on its
shape and mass distribution as well as the orientation of its rotational axis, thus
necessitating the use of a different formula to calculate the moment of inertia of each
Considering the fact that the links of the robot are not simple geometrical shapes, it
was decided to determine their moments of inertia using CAM (Computer Aided
Modelling) software. Thus, the 3-D model of the robot arm (created with AutoCAD®)
was exported into Autodesk® Inventor® Professional, with which the moments of
inertia of the various parts of the robot were determined.
Figure 3. 14 illustrates the use of the CAM software to determine the moments of
inertia of link L2 (in an upright position) with respect to X, Y and Z-axes (i.e. IXX, IYY
and IZZ).
Using the same procedure, IXX, IYY and IZZ were evaluated for the various components
of the arm.
A decision was made to design the robot such that each joint moves at a moderate
angular velocity of 0.873 rad/s; besides being adequate for a demonstrative arm, the
moderate speed selected will reduce the rate of wear and tear on the joints of the
robotic arm.
Also, a decision was made to design the robot such that each link moves from the rest
position (0 rad/s) to the design angular velocity of 0.873 rad/s in 0.5 seconds, which
translates into an angular acceleration of 1.746 rad/s². The torque needed to produce
this acceleration against each link's inertia was then evaluated.
Thus, the calculation of the resistive torque exerted on each joint due to inertia effect
is as follows:
Let resistive torque at joint 5 (gripper joint) due to inertia effect = T5i
J2 = shoulder joint
J3 = elbow joint
J4 = wrist joint
J5 = gripper joint
Ibase = moment of inertia of rotating base and shoulder motor bracket assembly
The values of the moments of inertia obtained from Autodesk® Inventor® Professional
are about the axes passing through each body's centre of mass. In the case of a robotic
arm, the moment of inertia must take into consideration that each part is being
rotated about a pivot point (the joint) located a distance away from the
centre of mass. To evaluate the moments of inertia about the joint axes, the parallel
axis theorem was applied.
Consider a body of mass m, which is made to rotate about an axis z passing through
the body's centre of mass (as shown in Figure 3. 15), and let the body have a moment
of inertia Icm about that axis.
The parallel axis theorem states that if the body is made to rotate instead about a new
axis z′ which is parallel to the first axis and displaced from it by a distance h, then the
moment of inertia I with respect to the new axis is related to Icm by;
I = Icm + mh²
...where h is the perpendicular distance between the axes z and z′ (Jha, 2006).
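The theorem can be checked numerically and then combined with Ti = Iα using the design angular acceleration of 1.746 rad/s² chosen earlier; the rod mass and length below are arbitrary illustrative values, not properties of the arm.

```python
import math

def parallel_axis(I_cm, m, h):
    """Moment of inertia about an axis parallel to one through the centre of
    mass and offset from it by perpendicular distance h: I = Icm + m*h^2."""
    return I_cm + m * h * h

# Sanity check with a uniform rod of mass m and length L:
# about its centre, Icm = m*L^2/12; about one end (h = L/2) the theorem
# gives m*L^2/12 + m*(L/2)^2 = m*L^2/3, the textbook end-axis value.
m, L = 0.2, 0.3
I_end = parallel_axis(m * L**2 / 12, m, L / 2)
assert math.isclose(I_end, m * L**2 / 3)

# Inertia-induced resistive torque at the joint, using the design
# angular acceleration alpha = 1.746 rad/s^2:
T_inertia = I_end * 1.746
```

The same two steps (shift each component's inertia to the joint axis, then multiply by α) are what the chapter carries out for every joint of the arm.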
Thus, for each joint, the moment of inertia of each link pivoted about it was calculated
by adding to the link's Icm the product of its mass (m) and the square of the
perpendicular distance (h) between the link's centre of mass and the joint axis.
Thus;
NB: T1i was calculated with the arm in a stretched out pose (worst case scenario).
h11 … h18 = perpendicular distances between the z-axis through the centroid of the
base motor horn and the z-axes through the centroids of each of the eight components
that rotate about the base joint;
h21 … h26 = perpendicular distances between the x-axis through the centroid of the
shoulder motor horn and the x-axes through the centroids of each of the six
components that rotate about the shoulder joint;
h31 … h34 = perpendicular distances between the x-axis through the centroid of the
elbow motor horn and the x-axes through the centroids of each of the four components
that rotate about the elbow joint;
h41, h42 = perpendicular distances between the y-axis through the centroid of the
wrist motor horn and the y-axes through the centroids of the two components that
rotate about the wrist joint.
A decision was made to use servomotors (commonly called servos) to provide the
required torque at each joint location. The decision was made due to the ability of
servos to control the angular position, velocity and acceleration of the output shaft in a
precise manner.
With ordinary DC (Direct Current) motors, it is possible to control whether or not they
are on or off, and the speed, but controlling the direction or position takes more work
and more components. RC servo motors have built into them the components needed
to control the motor's position precisely. Inside the servo is a combination of gears
and a potentiometer with circuitry that makes it possible to set its position fairly
quickly and precisely, within a 180-degree range of travel (Sullivan and Igoe,
2004).
Using the formulae derived in the previous section, the resistive torque expected to act
at each joint was evaluated, and a motor (servo) with an appropriate torque rating was
selected for it.
A standard mini servo gripper template from Lynxmotion® Robotics was used to
implement the gripper design. It is based around two counter-rotating gears which use
levers to make the grippers open and close laterally (Lynxmotion.com, 2013).
The design uses a Hitec® HS-225MG servo. Thus, a Hitec® HS-225MG servo was
selected to provide the needed torque for the opening and closing of the gripper jaws.
Note: the 'MG' added to the servo name is an abbreviation for 'Metal Gear'. It
indicates that the drive gears of the servo are made of metal as opposed to plastic.
(Figure: cutaway of an RC servo showing the output spline, drive gears, servo case,
control circuit, potentiometer and motor.)
Total resistive torque at wrist joint = T4g + T4i = 6.012 x 10-5 N.m or 6.012 x 10-4 kg.cm.
Thus a Hitec® HS-225MG servo with a torque rating of 3.9 kg.cm was selected to
provide the torque needed at the wrist joint.
T3g = 0.2707N.m
T3i = 0.0049N.m
Hence;
Total resistive torque at elbow joint = T3g + T3i = 0.2756 N.m or approximately
2.756 kg.cm. Thus a Hitec® HS-422 standard servo with a torque rating of 3.30 kg.cm
at 4.8 volts was selected to provide the torque needed at the joint location.
The extra torque of 0.544 kg.cm caters for the extra resistive torque due to inertia of
the payload, servo bearing friction, irregularities in the robot frame design and future
modifications.
T2g = 0.5903N.m
T2i = 0.0152N.m
Hence;
Total resistive torque at shoulder joint = T2g + T2i = 0.6055 N.m or 6.055 kg.cm.
Thus a Hitec® HS-645MG Ultra Torque servo with a torque rating of 7.70kg.cm at
4.8volts was selected to provide the torque needed at the joint location. The extra
torque of 1.645 kg.cm caters for the extra resistive torque due to inertia of the payload,
servo bearing friction, irregularities in the robot frame design and future modifications.
For the base joint, a Hitec® HS-422 servo with a torque rating of 3.30 kg.cm at
4.8 volts was selected.
A servo motor is controlled by sending a series of pulses to it. This is called Pulse
Width Modulation (PWM) signal. A pulse is sent to the servo every 20 milliseconds.
Depending on the pulse width (duration of the pulse), the motor spline or shaft moves
to a position within a 180-degree range. For a Hitec® servo, a 0.9 millisecond (900 µs)
pulse moves the motor to 0 degrees and a 2.1 millisecond (2100 µs) pulse moves it to
180 degrees. Any pulse between 900 µs and 2100 µs moves the motor to a
proportionate position in between.
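The pulse-width-to-angle relation described above is linear, and can be sketched as:

```python
def angle_to_pulse_us(angle_deg, min_us=900, max_us=2100):
    """Map a 0-180 degree servo position to a Hitec-style pulse width in
    microseconds, interpolating linearly between the two end pulses."""
    if not 0 <= angle_deg <= 180:
        raise ValueError("angle out of range")
    return min_us + (max_us - min_us) * angle_deg / 180
```

For example, the 90-degree mid position corresponds to the 1500 µs neutral pulse.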
A servo controller is a circuit arrangement that generates the PWM signals that control
Figure 3.17b Hitec HS-422 servo Figure 3.17c Hitec HS-645MG servo
Three servo controllers were considered for the implementation of this project - the
Pololu® Micro Maestro 6-Channel USB (Universal Serial Bus) servo controller, the
SMC-05A servo motor controller and the Lynxmotion® SSC-32 serial servo controller.
The Pololu® Micro Maestro is a highly versatile servo controller. It supports three
control methods: USB, TTL (transistor-transistor logic) serial for use with embedded
systems, and internal scripting for self-contained applications.
Its extremely precise, high-resolution servo pulses have a jitter of less than 200 ns,
making it well suited for high-performance robotics applications, and its in-built speed
and acceleration control for each channel make it easy to achieve smooth, seamless
movements without requiring the control source to constantly compute and stream
intermediate position updates. These features made it a strong candidate for this
project.
Also, an e-mail was sent to the manufacturer to ascertain its compatibility with Hitec®
servo motors (which had been selected for this project), and the company's technical
support confirmed the compatibility. The SMC-05A, on the other hand,
allows manual and PC (Personal Computer) control of five servomotors (Hitec® and
compatible types), and its universal three-position headers make it easy to connect
servomotors by just plugging them in.
The SSC-32 (serial servo controller) is a small preassembled servo controller with 32
channels of 1µS resolution servo control, thus enabling accurate positioning of the
servo output shaft. The range of its generated Pulse Width Modulation (PWM) signal
is 0.50 ms to 2.50 ms for a range of about 180°, and it features an Atmel ATMEGA168
microcontroller (Lynxmotion.com, 2013).
The three servo controllers could suffice for this project but due to the high cost of
the SMC-05A ($165.95 as opposed to $39.95 for the SSC-32 servo controller and
$19.95 for the Pololu® Maestro), a decision was made not to use it for the project
implementation. The Pololu® Mini Maestro was preferred to the SSC-32 because it
has a USB Software Development Kit that allows a less tedious programming with
PC (using the USB connection feature) without the need for a USB-to-serial adapter.
Thus, a decision was made to implement the robotic arm with the Pololu® Micro
Maestro. An interfacing software which enables the operator to send instructions to
the servo controller is needed. As earlier stated, the Pololu® Micro Maestro has a USB
Software Development Kit that simplifies the development of such software.
Thus, a decision was made to interface the servo controller with the host PC, and as
well implement the inverse kinematics of the robotic arm (thus making it possible to
position the gripper at specified coordinates), using the C# programming language.
The flowchart and C# code for the interfacing software are presented in the
appendices.
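The thesis' interfacing software is written in C#; purely as an illustration, the Maestro's documented compact serial protocol (the 'set target' command, with targets expressed in quarter-microseconds and split into two 7-bit bytes) can be sketched in Python:

```python
def maestro_set_target(channel, pulse_us):
    """Build the Pololu Maestro compact-protocol 'set target' command.

    The command byte is 0x84, followed by the channel number and the target
    pulse width in quarter-microseconds, low 7 bits first then high 7 bits.
    Writing these four bytes to the Maestro's serial port moves the servo.
    """
    target = int(round(pulse_us * 4))  # quarter-microsecond units
    return bytes([0x84, channel, target & 0x7F, (target >> 7) & 0x7F])

# Example: command channel 0 to the 1500 us neutral position.
cmd = maestro_set_target(0, 1500)
```

A host program would open the Maestro's virtual serial port and write `cmd` to it; the byte-packing above follows Pololu's published protocol description.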
The final robotic arm model with the selected servos included is shown in Figure 3. 19
and Figure 3. 20, while Figure 3. 21 shows the components that make up the arm
structure.
The model was created with Autodesk® Inventor® Professional 2013 (Student
Version).
3.10 Cost
Table 3.3 gives a breakdown of the cost estimate of the robotic arm.
HS-225MG high performance mini servo - quantity: 2; unit cost: 51.98; amount: 8,332.39
CHAPTER FOUR
The robotic arm was set up based on the procedure outlined in the preceding chapter.
The components of the arm were cut from acrylic plastic sheet as earlier decided with
the use of a fret saw and a hand-drilling machine. However, the acrylic plastic used
has a thickness of 3 mm as opposed to the 4 mm stated in the preceding chapter. This
was due to the non-availability of the 4 mm thick acrylic sheet in the nearby market. It
was however an advantage, since the 3 mm sheet still had adequate strength and
stiffness to bear the imposed load, and the increased lightness led to a reduction in the
load the servos
had to bear. To further reduce the load imposed on the shoulder and elbow servos, a
load-balancing spring was added to the elbow joint of the robotic arm.
The fabricated robotic arm components were assembled using the 3D model as guide.
The servos were inserted at the various joints in such a way that centring all the servo
horns would make the arm assume an upright pose as shown in Figure 4.1. The
centring of the servo horns was achieved by connecting the servos to the Pololu®
Micro Maestro™ servo controller and sending PWM (Pulse Width Modulation) signals
of 1500 µs to them from the Pololu® Maestro™ Control Centre. The set-up is
The servo controller was connected to the PC running the Pololu® Maestro™ Control
Centre via a USB cable which also supplied enough current to power the servo
controller board. The servos, on the other hand, were powered using a 3-Ampere
power supply.
After the robotic arm was assembled, each servo powering the arm was connected to
the Pololu® Micro Maestro™ servo controller. This was achieved by inserting the
female outlet of each servo cable onto the 3-pin headers on the servo controller. Servo
extender cables were used to increase the length of the wrist-rotate and gripper servo
cables because those servos were far away from the servo controller.
The interfacing software (shown in Figure 4.3) requires the user to enter the desired
coordinates of the centre of the gripper, after which the user clicks a button labelled
"click to compute inverse kinematics" to compute the inverse kinematics of the robotic
arm.
On clicking the button, the angle through which each joint will rotate in order to
position the gripper at the specified coordinate is displayed if the specified coordinates
are within the robotic arm workspace. If the specified coordinates are not within the
workspace of the robotic arm, the user is notified on clicking the button. The user then
clicks the button labelled "click to position gripper" to move the gripper to the
specified coordinates.
The robotic arm set-up was tested by inputting a series of coordinates in the control
software. The resulting position of the gripper centre for each set of coordinates was
then measured with rulers: one ruler was placed beside the robotic arm to measure the
y and z coordinates, while another one was placed below the robotic arm to measure
the x coordinates. A picture of the set-up is shown
in Figure 4.4 while the test results are presented in Table 4.1.
Desired (x, y, z) | Measured (x, y, z) | Absolute error (x, y, z)
10   4  19        | 11   5  19         | 1  1  0
 5   4  23        |  5   3  25         | 0  1  2
 8  -7  19        | 10  -7  19         | 2  0  0
10  12  18        | 12  13  17         | 2  1  1
 9  -7  27        | 10  -7  25         | 1  0  2
 8 -23  10        |  9 -23   9         | 1  0  1
10   7  28        | 11   7  27         | 1  0  1
15  -7  15        | 15  -6  15         | 0  1  0
15  15  15        | 14  16  14         | 1  1  1
10 -20   8        | 11 -21   7         | 1  1  1
12 -10  13        | 13 -11  12         | 1  1  1
11  12  14        | 13  11  14         | 2  1  0
12  16  10        | 14  15   9         | 2  1  1
A software application that enables the user of the robotic arm to easily instruct it to
perform a repeated pick-and-place task was developed. A snapshot of the software
GUI (Graphical User Interface) is shown in Figure 4.5, while the source code, written
in C#, is given in
Appendix 2. Figures 4.6 and 4.7 show the robotic arm performing a pick-and-place
task.
Another software application, which allows the user to control the robotic arm by
using sliders to move the joints, was also developed. A snapshot of its GUI (Graphical
User Interface) is shown in Figure 4.8, while the source code, written in C#, is given in
Appendix 3.
Figure 4. 8 : GUI of alternate interfacing software for controlling robotic arm (using
sliders)
CHAPTER FIVE
The objective of this study - the fabrication of a robotic arm that can be used for
demonstrative and educational purposes - has been achieved. The study has further
shown that the robotic arm can be reprogrammed in order to adapt it to a variety of
tasks in the manufacturing, packaging and mining industries. It is believed that the use
of the arm for educational and instructional purposes will stimulate the interest of
students in the field of robotics, especially in a developing country like Nigeria, where
robotics is yet to become widespread.
There is however room for improvement in the design and implementation of the
• Motors with higher torque ratings can be used to power the joints, so as to
ensure that the robotic arm remains in position even when electric power is
interrupted; this improvement is highly recommended.
REFERENCES
Adhikari, U., Islam, M. M., Noser, J., & Puliafico, K. (2010). 'Elderly Assist Robot'.
Worcester Polytechnic Institute, Worcester.
Benenson, W., Harris, J. W., Stöcker, H., & Lutz, H. (2006). Handbook of Physics.
New York: Springer Publishing.
Blackmore, M., Furniss, J., & Ochsner, S. (2009). '5 DOF Robotic Arm'. Portland State
University, Embedded Robotics, Portland.
Carlson, M., Donley, E., Graf, K., & Jones, T. (2013). Helping Hand: Senior Design
Final Documentation. Orlando: University of Central Florida.
Chen, C., Huang, H., & Wan, T. (2006). 'Omnidirectional Mobile Home Care Robot'.
National Chung-Hsing University, Department of Electrical Engineering,
Taichung.
Herman, P., Mosteoyz, A. R., & Muri, A. C. (2012). 'A low-cost custom-built robotic
arm on the TurtleBot platform'. University of Zaragoza, Dpt. de Informática e
Ingeniería de Sistemas, Zaragoza.
Images SI Inc. (2013). Servobotics Robotic Arm Model RA-02. Images Scientific
Instruments: http://www.imagesco.com/robotics/arm.html. Accessed
October 19, 2013.
Jenne, F. (2009). 'Combined control of a mobile robot and a robot arm'. Swiss Federal
Institute of Technology, Autonomous Systems Laboratory, Zurich.
Lynxmotion. (2013). AL5B Robotic Arm Combo Kit (with RIOS Software). Lynxmotion:
Imagine it. Build it. Control it.: http://www.lynxmotion.com/p-673-al5b-
roboticarm-combo-kit-with-rios-software.aspx. Accessed October 19, 2013.
O'Sullivan, D., & Igoe, T. (2004). Physical Computing: Sensing and Controlling the
Physical World with Computers. Boston: Thomson Course Technology PTR.
Patil, C., Sachan, S., Singh, R. K., Ranjan, K., & Kumar, V. (2009). Self and Mutual
Learning in Robotic Arm, Based on Cognitive Systems. West Bengal: Indian
Institute of Technology Kharagpur.
Siciliano, B., & Khatib, O. (2008). Springer Handbook of Robotics (illustrated ed.).
New York: Springer.
Siciliano, B., Sciavicco, L., Villani, L., & Oriolo, G. (2009). Robotics: Modelling,
Planning and Control. London: Springer-Verlag London Ltd.
Spong, M. W., & Vidyasagar, M. (1989). Robot Dynamics and Control (1st ed.).
Hoboken: John Wiley and Sons.
The Chambers Dictionary (12th ed.). (2011). Chambers Harrap Publishers Ltd.
APPENDIX 1
[Flowchart of the inverse kinematics routine]

1. Start; read the target coordinates x, y and z.
2. If the base angle a does not lie between -90° and 90°, display an error message.
3. If no computed value of the elbow angle c lies within the workspace, display an
   error message.
4. If no computed value of the shoulder angle b lies within the workspace, display
   an error message.
5. Compute the forward kinematics of each possible set of joint angles.
6. Select the set of joint angles whose forward kinematics result is closest to the
   target:
   If Error1 ≤ Error2, Error3 and Error4: base angle = a, shoulder angle = b{1},
   elbow angle = c{1}.
   Else if Error2 ≤ Error1, Error3 and Error4: base angle = a, shoulder angle = b{2},
   elbow angle = c{1}.
   Else if Error3 ≤ Error1, Error2 and Error4: base angle = a, shoulder angle = b{3},
   elbow angle = c{2}.
   Else: base angle = a, shoulder angle = b{4}, elbow angle = c{2}.
7. If no valid set of joint angles was found, display an error message; otherwise,
   convert the selected angles to PWM signals and send them to the servo controller.
8. Stop.
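The angle-selection step in the flowchart computes the forward kinematics of every candidate set of joint angles and keeps the set with the smallest error. That pattern can be sketched as follows; this is written in Python for brevity (the thesis software is in C#), and `fkine` here is a caller-supplied stand-in, not the thesis's actual forward-kinematics model:

```python
import math

def choose_joint_angles(target, a, b_candidates, c_candidates, fkine):
    """Pick the (shoulder, elbow) pair whose forward kinematics lands
    closest to the target point, mirroring the Error1..Error4 comparison."""
    best, best_err = None, math.inf
    # Candidate pairing follows the flowchart: (b1,c1), (b2,c1), (b3,c2), (b4,c2).
    pairs = [(b_candidates[0], c_candidates[0]),
             (b_candidates[1], c_candidates[0]),
             (b_candidates[2], c_candidates[1]),
             (b_candidates[3], c_candidates[1])]
    for b, c in pairs:
        err = math.dist(fkine(a, b, c), target)
        if err < best_err:
            best_err, best = err, (a, b, c)
    return best
```

With a toy forward-kinematics function, the candidate pair that reproduces the target exactly is the one returned.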
using Pololu.Usc;
using Pololu.UsbWrapper;
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Text;
using System.Windows.Forms;
using System.Numerics;
namespace Pololu.Usc.MaestroEasyExample
{
public partial class MainWindow : Form
{
double a;
double [] b = new double[4];
double [] c = new double[2];
double ao;
double bo;
double co;
public MainWindow()
{
InitializeComponent();
}
xinput = xInput_textBox.Text;
x = Convert.ToDouble(xinput);
yinput = yInput_textBox.Text;
y = Convert.ToDouble(yinput);
zinput = zInput_textBox.Text;
z = Convert.ToDouble(zinput) - L0 - L1;
double eb = -4 * k * L2 * L4;
int ii = 0;
//else if (ii=2)
//Calculate the value of c
yinput = yInput_textBox.Text;
y = Convert.ToDouble(yinput);
zinput = zInput_textBox.Text;
z = Convert.ToDouble(zinput) - L0 - L1;
int kk = 0;
int ii = 0;
}
}
MessageBox.Show(msg);
ao = a;
bo = b[3];
co = c[1];
double d = 0.0;
double b0 = b[0]; double b1 = b[1];double b2=b[2];double b3 = b[3];
double c0 = c[0]; double c1 = c[1];
double[] coord = { x, y, z + L0 + L1 };
if ((error1 <= error2) && (error1 <= error3) && (error1 <= error4))
{
ao = a; bo = b[0]; co = c[0];
}
else if ((error2 <= error1) && (error2 <= error3) && (error2 <=
error4))
{
ao = a; bo = b[1]; co = c[0];
}
else if ((error3 <= error1) && (error3 <= error2) && (error3 <=
error4))
{
ao = a; bo = b[2]; co = c[1];
}
else if ((error4 <= error1) && (error4 <= error2) && (error4 <=
error3))
{
ao = a; bo = b[3]; co = c[1];
}
angleConfig[0]=ao;
angleConfig[1]=bo;
angleConfig[2]=co;
BaseOutput_textBox.Text = String.Format("{0:N4}",ao);
ShoulderOutput_textBox.Text = String.Format("{0:N4}", bo);
ElbowOutput_textBox.Text = String.Format("{0:N4}", co);
/// <summary>
/// This functions runs when the user clicks the Target=1000us button.
/// </summary>
void positionGripper_Click(object sender, EventArgs e)
{
displayMeassage("ao is: "+ ao +" ,bo is: "+bo + ",co is: " + co);
TrySetTarget(0, basePWM);
TrySetTarget(1, shoulderPWM);
TrySetTarget(2, elbowPWM);
TrySetTarget(3, wristPWM);
{
// Set target of channel 0 to 0. This tells the Maestro to stop
// transmitting pulses on that channel. Any servo connected to it
// should stop trying to maintain its position.
TrySetTarget(0, 0);
TrySetTarget(1, 0);
TrySetTarget(2, 0);
TrySetTarget(3, 0);
TrySetTarget(4, 0);
}
/// <summary>
/// Attempts to set the target (width of pulses sent) of a channel.
/// </summary>
/// <param name="channel">Channel number from 0 to 23.</param>
/// <param name="target">
/// Target, in units of quarter microseconds. For typical servos,
/// 6000 is neutral and the acceptable range is 4000-8000.
/// </param>
void TrySetTarget(Byte channel, UInt16 target)
{
try
{
using (Usc device = connectToDevice()) // Find a device and temporarily connect.
{
device.setTarget(channel, target);
/// <summary>
/// Connects to a Maestro using native USB and returns the Usc object
/// representing that connection. When you are done with the
/// connection, you should close it using the Dispose() method so that
/// other processes or functions can connect to the device later. The
/// "using" statement can do this automatically for you.
/// </summary>
Usc connectToDevice()
{
// Get a list of all connected devices of this type.
List<DeviceListItem> connectedDevices = Usc.getConnectedDevices();
/// <summary>
/// Displays an exception to the user by popping up a message box.
/// </summary>
void displayException(Exception exception)
{
StringBuilder stringBuilder = new StringBuilder();
do
{
stringBuilder.Append(exception.Message + " ");
if (exception is Win32Exception)
{
stringBuilder.Append("Error code 0x" +
((Win32Exception)exception).NativeErrorCode.ToString("x") +
". ");
}
exception = exception.InnerException;
}
while (exception != null);
MessageBox.Show(stringBuilder.ToString(), this.Text,
MessageBoxButtons.OK, MessageBoxIcon.Error);
}
TrySetTarget(4, trackBarPos);
}
}
}
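The TrySetTarget documentation above specifies servo targets in quarter-microseconds, with 6000 as neutral and a usable range of 4000 to 8000. The flowchart's "convert angles to PWM signals" step therefore needs a mapping from joint angle to target; one plausible linear mapping is sketched below in Python for brevity. This is an illustrative assumption, not the thesis's exact servo calibration:

```python
def angle_to_target(angle_deg, neutral=6000, span=2000):
    """Map an angle in [-90, 90] degrees to a Maestro target in
    quarter-microseconds: -90 -> 4000, 0 -> 6000, +90 -> 8000."""
    if not -90 <= angle_deg <= 90:
        raise ValueError("angle out of range")
    return int(round(neutral + span * angle_deg / 90.0))
```

In practice each servo would need its own neutral offset and span, determined experimentally.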
APPENDIX 2
/* Robotic Arm Control Interface with Inverse Kinematics Engine for repeated
* pick-and-place task:
* written in Visual C#.
* by Oridate Ademola Ayodeji
* (MEE/2008/071)
*
 * Acknowledgment:
 * The inverse kinematics algorithm written by Akinwale (2009) of the
 * Department of Electronic and Electrical Engineering, Obafemi Awolowo
 * University, for the RA-01 robotic arm (in MATLAB) for his M.Sc. thesis
 * was used as a guide in writing this program.
 */
using Pololu.Usc;
using Pololu.UsbWrapper;
using System;
using System.Linq;
using System.Threading;
using System.Collections.Generic;
using System.ComponentModel;
using System.Text;
using System.Windows.Forms;
using System.Numerics;
namespace Pololu.Usc.MaestroEasyExample
{
public partial class MainWindow : Form
{
double a;
double [] b = new double[4];
double [] c = new double[2];
double ao;
double bo;
double co;
UInt16 trackBarPos = 0;
UInt16 speed = 5;
public MainWindow()
{
InitializeComponent();
}
xinput = xInput_textBox.Text;
x = Convert.ToDouble(xinput);
yinput = yInput_textBox.Text;
y = Convert.ToDouble(yinput);
zinput = zInput_textBox.Text;
z = Convert.ToDouble(zinput) - L0 - L1;
int ii = 0;
//else if (ii=2)
//Calculate the value of c
yinput = yInput_textBox.Text;
y = Convert.ToDouble(yinput);
zinput = zInput_textBox.Text;
z = Convert.ToDouble(zinput) - L0 - L1;
int kk = 0;
int ii = 0;
}
}
MessageBox.Show(msg);
computeElbowAngle(xInput_textBox1.Text, yInput_textBox1.Text,
zInput_textBox1.Text);
computeShoulderAngle(xInput_textBox1.Text, yInput_textBox1.Text,
zInput_textBox1.Text);
ao = a;
bo = b[3];
co = c[1];
double d = 0.0;
double b0 = b[0]; double b1 = b[1]; double b2 = b[2]; double b3 =
b[3];
double c0 = c[0]; double c1 = c[1];
double[] coord = { x, y, z + L0 + L1 };
if ((error1 <= error2) && (error1 <= error3) && (error1 <= error4))
{
ao = a; bo = b[0]; co = c[0];
}
else if ((error2 <= error1) && (error2 <= error3) && (error2 <=
error4))
{
ao = a; bo = b[1]; co = c[0];
}
else if ((error3 <= error1) && (error3 <= error2) && (error3 <=
error4))
{
ao = a; bo = b[2]; co = c[1];
}
else if ((error4 <= error1) && (error4 <= error2) && (error4 <=
error3))
{
ao = a; bo = b[3]; co = c[1];
}
double[] check = Fkine(ref ao, ref bo, ref co, ref d);
double[] error = new double[] { (coord[0] - check[0]), (coord[1] -
check[1]), (coord[2] - check[2]) };
BaseOutput_textBox.Text = String.Format("{0:N4}",
angleConfigPick[0]);
ShoulderOutput_textBox.Text = String.Format("{0:N4}",
angleConfigPick[1]);
ElbowOutput_textBox.Text = String.Format("{0:N4}",
angleConfigPick[2]);
}
computeShoulderAngle(xInput_textBox2.Text, yInput_textBox2.Text,
zInput_textBox2.Text);
ao = a;
bo = b[3];
co = c[1];
double d = 0.0;
double b0 = b[0]; double b1 = b[1]; double b2 = b[2]; double b3 =
b[3];
double c0 = c[0]; double c1 = c[1];
double[] coord = { x, y, z + L0 + L1 };
if ((error1 <= error2) && (error1 <= error3) && (error1 <= error4))
{
ao = a; bo = b[0]; co = c[0];
}
else if ((error2 <= error1) && (error2 <= error3) && (error2 <=
error4))
{
ao = a; bo = b[1]; co = c[0];
}
else if ((error3 <= error1) && (error3 <= error2) && (error3 <=
error4))
{
ao = a; bo = b[2]; co = c[1];
}
else if ((error4 <= error1) && (error4 <= error2) && (error4 <=
error3))
{
ao = a; bo = b[3]; co = c[1];
}
double[] check = Fkine(ref ao, ref bo, ref co, ref d);
double[] error = new double[] { (coord[0] - check[0]), (coord[1] -
check[1]), (coord[2] - check[2]) };
BaseOutput_textBox2.Text = String.Format("{0:N4}",
angleConfigPlace[0]);
ShoulderOutput_textBox2.Text = String.Format("{0:N4}",
angleConfigPlace[1]);
ElbowOutput_textBox2.Text = String.Format("{0:N4}",
angleConfigPlace[2]);
}
/// <summary>
/// This functions runs when the user clicks the Target=1000us button.
/// </summary>
/// <summary>
/// Attempts to set the target (width of pulses sent) of a channel.
/// </summary>
/// <param name="channel">Channel number from 0 to 23.</param>
/// <param name="target">
/// Target, in units of quarter microseconds. For typical servos,
/// 6000 is neutral and the acceptable range is 4000-8000.
/// </param>
void TrySetTarget(Byte channel, UInt16 target)
{
try
{
using (Usc device = connectToDevice()) // Find a device and temporarily connect.
{
device.setSpeed(channel,speed);
device.setTarget(channel, target);
/// <summary>
/// Connects to a Maestro using native USB and returns the Usc object
/// representing that connection. When you are done with the
/// connection, you should close it using the Dispose() method so that
/// other processes or functions can connect to the device later. The
/// "using" statement can do this automatically for you.
/// </summary>
Usc connectToDevice()
{
// Get a list of all connected devices of this type.
List<DeviceListItem> connectedDevices = Usc.getConnectedDevices();
/// <summary>
/// Displays an exception to the user by popping up a message box.
/// </summary>
void displayException(Exception exception)
{
StringBuilder stringBuilder = new StringBuilder();
do
{
stringBuilder.Append(exception.Message + " ");
if (exception is Win32Exception)
{
stringBuilder.Append("Error code 0x" +
((Win32Exception)exception).NativeErrorCode.ToString("x") + ". ");
}
exception = exception.InnerException;
}
while (exception != null);
MessageBox.Show(stringBuilder.ToString(), this.Text,
MessageBoxButtons.OK, MessageBoxIcon.Error);
}
{
e.Handled = true;
}
}
TrySetTarget(4, trackBarPos);
Int32 sleep2 = Math.Abs((1900*4) - trackBarPos);
Thread.Sleep((((sleep2 / (25 * speed)) / 4) * 1000) + 2000);
TrySetTarget(0, basePWMplace);
Thread.Sleep((((Math.Abs(basePWMplace - basePWMpick) / (25 *
speed)) / 4) * 1000) + 2000);
TrySetTarget(1, shoulderPWMplace);
TrySetTarget(2, elbowPWMplace);
//TrySetTarget(3, wristPWMplace);
Int32[] sleep4 = { Math.Abs(basePWMplace - basePWMpick),
Math.Abs(shoulderPWMplace - 1560 * 4), Math.Abs(elbowPWMpick - 1500 * 4) };
Thread.Sleep((((sleep4.Max() / (25 * speed)) / 4) * 1000) +
2000);
}
}
}
}
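The Thread.Sleep calls near the end of the listing above estimate how long a servo move will take from the PWM distance (in quarter-microseconds) and the Maestro speed setting, whose unit on a Pololu Maestro is 0.25 µs per 10 ms, then add a 2000 ms safety margin. The wait-time formula, mirrored in Python with the same integer arithmetic as the C# code, is:

```python
def wait_ms(distance_qus, speed):
    """Estimate the wait (ms) for a servo move, mirroring the integer
    arithmetic in the C# listing: distance_qus is the PWM distance in
    quarter-microseconds, speed is the Maestro speed setting, and
    2000 ms of margin is added at the end."""
    return ((distance_qus // (25 * speed)) // 4) * 1000 + 2000
```

For example, a move of 7600 quarter-microseconds at speed 5 yields a wait of 17 seconds, which is why the listing takes the maximum distance over all joints before sleeping.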
APPENDIX 3
using Pololu.Usc;
using Pololu.UsbWrapper;
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Text;
using System.Windows.Forms;
using System.Numerics;
namespace Pololu.Usc.MaestroEasyExample
{
public partial class MainWindow : Form
{
public MainWindow()
{
InitializeComponent();
}
/// <summary>
/// Attempts to set the target (width of pulses sent) of a channel.
/// </summary>
/// <param name="channel">Channel number from 0 to 23.</param>
/// <param name="target">
/// Target, in units of quarter microseconds. For typical servos,
/// 6000 is neutral and the acceptable range is 4000-8000.
/// </param>
void TrySetTarget(Byte channel, UInt16 target)
{
try
{
using (Usc device = connectToDevice()) // Find a device and temporarily connect.
{
device.setTarget(channel, target);
                }
            }
            catch (Exception exception)
            {
                displayException(exception);
            }
        }
/// <summary>
/// Connects to a Maestro using native USB and returns the Usc object
/// representing that connection. When you are done with the
/// connection, you should close it using the Dispose() method so that
/// other processes or functions can connect to the device later. The
/// "using" statement can do this automatically for you.
/// </summary>
Usc connectToDevice()
{
// Get a list of all connected devices of this type.
List<DeviceListItem> connectedDevices = Usc.getConnectedDevices();
/// <summary>
/// Displays an exception to the user by popping up a message box.
/// </summary>
void displayException(Exception exception)
{
StringBuilder stringBuilder = new StringBuilder();
do
{
stringBuilder.Append(exception.Message + " ");
if (exception is Win32Exception)
{
stringBuilder.Append("Error code 0x" +
((Win32Exception)exception).NativeErrorCode.ToString("x") + ". ");
}
exception = exception.InnerException;
}
while (exception != null);
MessageBox.Show(stringBuilder.ToString(), this.Text,
MessageBoxButtons.OK, MessageBoxIcon.Error);
}
TrySetTarget(2, 6000);
TrySetTarget(3, 1530 * 4);
}
TrySetTarget(4, trackBarPos);
}
TrySetTarget(3, trackBarPos);
}
TrySetTarget(2, trackBarPos);
}
TrySetTarget(1, trackBarPos);
}
TrySetTarget(0, trackBarPos);
}
}
}