
Mansoura University

Faculty of Engineering
Mechatronics Department

Autonomous Robot
Supervisor

Dr. Abd El-Fattah El-Adl

Team
1. Eslam Mohamed Mohamed Roshdi
2. Mohamed Eid El-Sayed Ali El-Yamani
3. Mohamed Garea Zaher Ragab
4. Mohamed Salah El-Din El-Saeed
5. Ahmed Read Ahmed Ateia Sanad
6. Muhamed Gamal Abd-El Latif Gibrel

2018
Contents:                                           Page:

1. Introduction
      1. Robotic Industry                           4
      2. Robot Abstract                             9
      3. Robot Idea                                 10
2. Acknowledgment                                   12
3. Problem Statement                                14
4. Motivations                                      17
5. Usages and Related Works                         19
6. Proposed Work & Graphical Abstracts              25
7. Software
      1. Kinematics                                 29
      2. Control                                    43
      3. Vision                                     50
8. Hardware
      1. Embedded                                   52
      2. Sensors & Actuators                        53
      3. Power                                      53
9. Mechanical
      1. Overview of Robot Design                   59
      2. Arm Manipulator                            62
      3. Gripper & Wheels                           64
10. Prototyping & Results                           66
11. Conclusion & Future Work                        71
12. Figures Index                                   72
13. References                                      74

Page : 2
Chapter 1
 Introduction 
Robotics Industry,
Project Abstract,
Project Idea

Page : 3
Introduction to the Robotics Industry:
Recently there has been a lot of discussion about futuristic wars between humans and robots, and about robots taking over the world and enslaving humans. Movies like The Terminator, Star Wars, etc., have propagated these ideas faster than anything else. These movies are beautiful works of fiction and present us with an interesting point of view to speculate on. However, the truth is much different, but equally as interesting as the fiction. If you look around yourself you will see several machines and gizmos within your surroundings. When you use a simple pair of spectacles, do you become nonliving? When an elderly person uses a hearing aid, or a physically challenged person uses an artificial leg or arm, do they become half machine? Yes, they do.

Figure 1 Robot change the world
Now we are rapidly moving toward an era where we will have chips
embedded inside our bodies. Chips will communicate with our biological
sensors and will help us in performing several activities more efficiently. An
artificial retina is almost at the final stages of its development. Now we are
thinking in terms of nanobots helping us to strengthen our immune systems.

Now we are already on the verge of becoming half machine. Chips will be implanted inside our bodies, imparting telescopic and microscopic abilities to our eyes. Cell phones will be permanently placed inside the ear.
We will communicate with different devices not through a control
panel or keyboard; rather these devices will receive commands from the
brain directly. The next level of development will be the part of the brain
being replaced by chips, which will impart more capability to the brain.

Page : 4
CURRENT RESEARCH IN ROBOTICS AROUND THE WORLD
According to MSN Learning & Research, there were 700,000 robots in the industrial world in 1995: over 500,000 were used in Japan, about 120,000 in Western Europe, and 60,000 in the United States, and many were doing tasks that are dangerous or unpleasant for humans. Some of the hazardous jobs include handling materials such as blood or urine samples, searching buildings for fugitives, and deep-water searches; robots can also take over repetitive jobs, since they can run 24 hours a day without getting tired. General Motors Corporation uses these robots for spot welding, painting, machine loading, parts transfer, and assembly.

Basically a robot consists of:


■ A mechanical device, such as a wheeled platform, arm, or other
construction, capable of interacting with its environment.
■ Sensors on or around the device that are able to sense the
environment and give useful feedback to the device.
■ Systems that process sensory input in the context of the device’s
current situation and instruct the device to perform actions in response
to the situation.

The fields of robotics can be broadly categorized as:


■ Robotic Manipulator
■ Wheeled Mobile Robots
■ Legged Robots
■ Underwater Robots
■ Flying Robots
■ Robot Vision
■ Artificial Intelligence
■ Industrial Automation

Robotic Arms
Robotic arms have become useful and economical tools in manufacturing, medicine, and other industries.

Figure 2 Robot Arms

Page : 5
Wheeled Mobile Robots
Wheeled mobile robots perform many tasks in industry and in the military

Figure 3 Wheel Mobile Robots


Legged Robots
Locomotion on the ground can be realized with three different basic
mechanisms:
(i) Slider,
(ii) Lever, and
(iii) Wheel or track.
The benefits that can be obtained with a legged robot are:
■ Better mobility
■ Better stability on the platform
■ Better energy efficiency
■ Smaller impact on the ground

Figure 4 Legged Robots

Page : 6
Underwater Robots
Camera-equipped underwater robots serve many purposes including
tracking of fish and searching for sunken ships.

Figure 5 Underwater Robots

Flying Robots
Flying robots have been used effectively in military maneuvers, and
often mimic the movements of insects.

Figure 6 Flying Robot

Page : 7
Robot Vision
Vision is our most powerful sense. It provides us with an enormous
amount of information about the environment and enables rich,
intelligent interaction in dynamic environments. It is therefore not
surprising that a great deal of effort has been devoted to providing
machines with sensors that mimic the capabilities of the human vision
system. The first step in this process is the creation of the sensing devices that capture the same raw information, light, that the human vision system uses. The two current technologies for creating vision sensors are CCD and CMOS. These sensors have specific limitations in performance when compared to the human eye.

Figure 7 Robot Vision

Artificial Intelligence
(AI) is a branch of computer science and engineering that deals with
intelligent behavior, learning, and adaptation in machines. Research in
AI is concerned with producing machines to automate tasks requiring
intelligent behavior. Examples include control, planning and
scheduling, the ability to answer diagnostic and consumer questions,
handwriting, speech, and facial recognition. As such, it has become
an engineering discipline, focused on providing solutions to real-life
problems, software applications, traditional strategy games like
computer chess, and other video games.

Page : 8
Industrial Automation
Automation, which in Greek means self-dictated, is the use of control
systems, such as computers, to control industrial machinery and
processes, replacing human operators. In the scope of industrialization,
it is a step beyond mechanization. Whereas mechanization provided
human operators with machinery to assist them with the physical
requirements of work, automation greatly reduces the need for human
sensory and mental requirements as well.

Figure 8 Industrial Automation

Abstract to our Robot


Recently, many robots combine one or more of these types; for example, a flying robot (drone) may come with a vision system to detect landmarks. This ensures the integration of the whole robotics industry. Our robot is a wheeled robot provided with a manipulator arm and a vision system. This combination gives the robot:

1- More mobility to work around complex environments.
2- More sensitivity to see the world and extract rich information.
3- More ability to handle objects.
4- More interactivity with people.

Page : 9
Robot Idea
An autonomous robot is simply a robot with more independence. Firstly, we can define a robot, according to wikipedia.com, as "a machine, especially one programmable by a computer, capable of carrying out a complex series of actions automatically. Robots may be constructed to take on human form, but most robots are machines designed to perform a task with no regard to how they look." So, to summarize, robots are programmable machines that perform tasks. Robots can be classified into two types: autonomous and semi-autonomous. Autonomous robots are harder and more complex to design and program. They have more independence to accomplish tasks than semi-autonomous ones, which need more manual human control.
Our robot is designed to recognize objects and faces with a real-time vision system, localize them, and then fetch them using a mobile car robot fitted with an arm manipulator, so it can grasp objects and carry them to other places inside or outside the room. It can also hand objects to people, since it can recognize their faces. The project involves complex algorithms to solve different problems.
We use computer vision algorithms to detect objects and faces and then localize their positions in the world. My role is then to take these positions in three-dimensional space and drive the robot to them using the mobile wheeled base. Once the robot gets within a region about 10 cm from the object position, its arm with 5 degrees of freedom starts to move toward the object, and finally the gripper at the end-effector opens to catch the object, so the robot can start the next round and carry the object to another place.
To do these tasks we use algorithms such as forward kinematics and inverse kinematics to make the arm reach positions accurately. We also have algorithms to detect obstacles around the robot so it can avoid them.
The idea came to us after two or three brainstorming meetings, when we asked how we could help disabled and blind people to enjoy their lives. There are other solutions, such as artificial limbs and surgery, but we found this an easy and simple way to compensate for their disability. Besides that, the robot can be a personal assistant, as it can help you finish tasks, fetch objects, or even close the doors for you.

Page : 10
Chapter 2
 Acknowledgment :
Thanks to God,
family, professors,
college

Page : 11
The satisfaction and euphoria of the successful completion of any task would be incomplete without mentioning the people who made it possible, whose constant guidance and encouragement crowned our effort with success: those people who supported us all along the way.

First, we must thank God for helping us and giving us the patience, hope, and motivation to complete work like this, to help people and to advance humanity.
We would like to express our heartfelt gratitude to our esteemed supervisor, Dr. Abd El-Fattah El-Adl, for his technical guidance, valuable suggestions, and encouragement throughout the experimental and theoretical study in this project. It has been our honor to work under his guidance; his expertise and discernment were key to the completion of this project.

We are grateful to the Department of Mechatronics Engineering for giving us the opportunity to execute a project like this, which is an integration project between several specializations such as mechanical systems, electrical systems, control systems, and the state of the art of computer science.

Many thanks to our team, who worked on this project, for their generous contribution towards enriching the quality of this work.

This acknowledgement would not be complete without expressing our


sincere gratitude to our parents for their love, patience, encouragement,
and understanding which are the source of our motivation and inspiration
throughout our work

To our friends, technicians, and colleagues, with love we say: thank you all.

Page : 12
Chapter 3
 Problem Statement :
Routines, disabilities, blindness, hazardous world,
time & effort

Page : 13
Almost everything in this day and age is most probably mass produced, with machines and technology playing a big part in its production, manufacturing, and distribution to increase efficiency and reliability while reducing cost. All this became a reality because of the developments made in the Industrial Revolution, probably the biggest revolution of all time, and the availability of power sources in different forms, such as electrical, solar, and hydro-electric, made it possible.

As there was a revolution in the industrial world, the next revolution is coming in home autonomy and home service robots. This is our point of view: humans should be freer of routines and repetitive tasks, in order to reduce the time and effort spent on them, so that people can dedicate themselves to mental tasks such as thinking and innovation.

From this point of view, robots will not replace people but will help them get many things done with less effort. In a domestic environment there are many tasks robots could do, such as collecting things from different places.

Figure 10 Roomba Robot
Figure 9 Robot Industrial Revolution


Arranging rooms, sweeping the floor, home security, helping in the kitchen by handling tools and ingredients for the chef, bringing a cup of water, turning on the TV, and even closing the door: these are tasks that need to be done when a person is busy or outside the house. A robot like the SCRUBMATE robot can do menial tasks that humans don't want to do.

Page : 14
Figure 11 Robots in Kitchen
Figure 12 SCRUBMATE Robot
A physical disability is a limitation on a person's physical functioning,
mobility, dexterity or stamina. Other physical disabilities include impairments
which limit other facets of daily living, such as respiratory disorders,
blindness, epilepsy and sleep disorders.

Wheelchairs are one of the most commonly used assistive devices for
mobility, and they provide people with mobility within their homes and
communities. While wheelchairs were once a
symbol of inability and stigmatizing, they have
evolved to be highly mobile forms of self-expression
that are often fitted to each individual user.

One may wonder what science and engineering can do to improve the wheelchair, and be surprised by the answer that much has been done and much remains to be done. One of the areas in which science and engineering are making the breakthroughs of tomorrow is in applying computer modeling, rapid prototyping, and robotics to create electric-powered mobility and manipulation devices. Such devices provide people with very severe disabilities (those that affect the use of both their arms and legs) the ability to perform tasks with minimal assistance or even independently, and also help blind people who need to interact with the environment and daily life.

Figure 13 Disabled People

Page : 15
Chapter 4
 Motivations :
Robotic market,
robots must see, think, and act

Page : 16
There are some reasons to solve these problems of home services, people's disabilities, and also blindness in this way.
First, there is the growth of the robotics market, especially "home robotics", since it started to grow in 2009.

Figure 14 Robotic Growth Market


This chart describes the growth of the robotics market from 1999 to 2018, at an increasing rate of about 21% per year. By contrast, the traditional industrial robot market grows at a rate of only 1-2% of the whole market. This makes home robotics the best direction for our purpose, in addition to the revenue we may gain in the future. In 2020, the robotics market is estimated to be worth 130 billion U.S. dollars globally. The industrial robotics market, which has traditionally represented the robotics industry and has been led by Japanese and European robotics manufacturers, is giving way to non-industrial robots, such as personal assistant robots, customer service robots, autonomous vehicles, and unmanned aerial vehicles (UAVs).

There are many solutions for helping disabled people engage with life, such as artificial limbs and wheelchairs. A second purpose of making such an autonomous robot is that it can operate on battery during power-outage periods and can then be used to do simple home routine tasks without needing any human guidance, which reduces human effort and saves time.

Page : 17
Chapter 5
 Usages :
Examples and Related Works
Usages of the robot in industrial environments, stores,
assembly areas, home service, self-assistance,
blind and disabled people

Page : 18
We developed a robotic system that can be used in many fields, whether industrial or domestic. There are many usages of this autonomous robot; here are some:

1- In warehouses
This robot can be designed to work in warehouses. It can help with repetitive tasks such as picking and placing boxes. A manipulator is attached to the robot base so it can perform the tasks of holding objects. With some interest we came up with two examples of two types of robots used in stores.

First is Amazon, which is beginning to add Massachusetts-made robots. In


October, the company divulged for the first time that it has deployed more
than 1,300 robotic warehouse workers from Kiva Systems in North Reading, a
company Amazon bought last year for $775 million. And Kiva’s squat
orange robots, which move racks of merchandise to workers who fill boxes,
could prove essential to helping Amazon return to profitability. Amazon
began using these robots in July of this year, and there are now more than
15,000 of them in 10 of the company’s warehouses. They whir around like
gears on a Swiss watch.

Amazon Senior Vice President


of Operations Dave Clark says
“improvements such as the Kiva robots
have significantly increased operations
efficiency while making employees’ lives
easier”. Amazon has sometimes
been criticized for the conditions in
its fulfillment centers, with workers
often logging over 10 hours a day
and walking up to 15 miles in a shift
to pick items off the shelves.
Figure 15 Amazon Robots

Page : 19
Second is inVia Robotics. Founded in 2015, it has taken in an undisclosed amount of funding to develop a "simple, intuitive, and affordable robotic picking automation solution". Its robot, called the "Picker", uses a suction-cup-enabled device to grab items and place them where you like, such as in the Runner robot we just talked about. Seen below, the Picker is capable of picking up items as small as a pack of gum or as large as a case of soda:
Figure 16 inVia Robot
Here's a quote from its site: "Where the nerd in us starts to get really excited. Essentially, you can start configuring your shop floor to implement one or many robots, depending on how many humans you want to free up to do more value-added activities." The robot has various configurations, including:
■ Goods to Person – Retrieve products from totes, bins, or boxes and then deliver them to the humans manning the packing stations with 99.99% accuracy – 5-20X more efficient
■ Goods to Box – Retrieve products from the shelf to pack stations or directly to shipping – 100% accuracy / 100% labor reduction
■ Person to Goods Robots – Robots go collect goods from the pickers and do all the walking for them – 3-5X productivity increase
■ Person to Goods Software – Optimizes and schedules routes for human pickers – 3-5X productivity increase

There is also software that ensures integration between human laborers and robots working to achieve a specific task. The inVia software integrates with existing warehouse management systems, so that the picking instructions can be passed either to a robot or to a human via tablet or phone:

Figure 17 inVia Robots
Figure 18 inVia Software

Page : 20
2- Home service:
The robot can be used to help people at home, such as by collecting objects and moving them from place to place, and also to assist human beings, typically by performing a job that is dirty, dull, distant, dangerous, or repetitive. A service robot is a robot that operates semi- or fully autonomously to perform services useful to the well-being of humans and equipment, and it is a solution for supporting older people so they stay largely independent in everyday life.

An example of this type of robot is the Care-O-Bot service robot, from the European research project SRS (Shadow Robotic System for Independent Living). The robot can pick up and bring objects such as drinks, or open the door for paramedics. The SRS robot can move autonomously, but in new situations it can also be supported by a remote operator, which allows the robot to learn and handle a similar situation by itself the next time.

Figure 20 Care-O-Bot Services
Figure 19 Care-O-Bot

Page : 21
3- Blind and disabled people
Another important usage example is a robot from ORFOMED, a German company, which comes with the 6-DOF JACO arm robot that can be used by disabled people to pick up objects and eat meals with an effective artificial arm. Unlike a traditional wheelchair, it comes with controllable wheel motors.

So, the robotic arm provides technical assistance for coping with everyday life for people with limited or missing arm/hand function. Many quadriplegics and people with musculoskeletal problems use it daily to open doors, drink, eat, and much more. Folded sideways, the robotic arm, with a reach of 90 cm, moves together with the wheelchair through each door. The robot arm is controlled by joystick, touchpad, chin, or head control.

Figure 21 ORFOMED

The robot arm is characterized by intuitive user-friendliness in an appealing design. Recurring movements, such as drinking coffee in the morning, can easily be saved. With its 3-finger gripping system and an opening span of 12 cm, the robot arm has high sensitivity, suitable for both a fragile breakfast egg and a beverage glass.

Figure 23 JACO Arm


Figure 22 Robot Help Disabled
The robotic arm's carbon arm can be moved quietly and gently in six degrees of freedom, with unlimited rotation on each hinge axis; it is robust, lightweight, and weatherproof, and has a payload of 1.5 kg.

Page : 22

4- Industrial Services
Industrial service robots can be used to carry out simple tasks, such as examining welds, as well as more complex, harsh-environment tasks, such as aiding in the dismantling of nuclear power stations. A robot that is an automatically controlled, reprogrammable, multipurpose manipulator, programmable in three or more axes, and either fixed in place or mobile for use in industrial automation applications, is called an "industrial robot".
A last example is the KUKA mobile industrial robot, used to paint and weld air turbine blades. It provides full mobility and the freedom to work around big parts autonomously.

Figure 24 KUKA Robot

Page : 23
Chapter 6
Proposed Work Summary
[Graphical Abstract]
Graphic tree of work done,
tree of team members,
flow of operation

Page : 24
In this chapter we introduce a summary of the whole project without going into details. This chapter has three parts:
1- Tree of work sections and the associated jobs done.
2- Fast flow of how the robot system works (System Process Flow).
3- Tree of team members and their majors and jobs.

[1] Tree of work sections and the associated jobs done:

Work in this project proceeds in three sections in parallel: software, hardware, and mechanical.
The jobs in the software section are designing algorithms and writing control commands. This includes some mathematics and statistics.

Figure 25 Software Associated Work
Figure 25 Software Associated Work

Page : 25
The jobs associated with the hardware section were delivered to members who connect the embedded components and code the write/read commands for the several types of components, which included studying each of them.
The mechanical section deals with the design and fabrication of the system parts.

Figure 26 Hardware Associated Work

Figure 27 Mechanical Associated Work

Page : 26
[2] Fast flow of how the robot system works (System Process Flow)
A summary of how the robot works and what is done inside.

>> Photoshop designs

[3] Tree of team members and their majors and jobs

The team is the vital point in any project, the one that can take it high or bring it down. We began with four members, and now we are six. We gathered because we all have an interest in robotics and we want to make something new.
All team members are Mechatronics Engineering majors, Mansoura University, Level 400. Here is a summary schematic of the team members.

Figure 28 Team Structure

Page : 27
Chapter 7
 Software :
Algorithms, flow charts, programming languages
7-1- Kinematics:
[Arm: DOF, dimensions, limitations, workspace, DH table, forward
kinematics, inverse kinematics; Mobile: differential car
model]
7-2- Control:
[Arm: pick and place, path planning;
Mobile: navigation, go to goal, obstacle avoidance,
follow wall, path planning]
7-3- Machine Learning (Vision):
[Label detection, object detection and recognition,
face detection and recognition]

Page : 28
7-1 : Kinematics :
Kinematics is the way to describe robot movement. Kinematics deals with the velocity and position of the robot. Our robot consists of an arm and a car. In this chapter we will describe the kinematics of each, along with the corresponding robot model.

First, Arm Kinematics:

Figure 29 ABB Robot

All modeling and simulation here was done in MATLAB.

The arm manipulator is an articulated arm with 5 degrees of freedom (DOF). The articulated manipulator is also called a revolute, or anthropomorphic, manipulator.
The ABB IRB1400 articulated arm is shown in Figure 29. In a common revolute joint design of this arrangement, joint axis z2 is parallel to z1, and both z1 and z2 are perpendicular to z0. The structure and terminology associated with the elbow manipulator are shown in Figure 30.

Figure 30 Arm Kinematic

The number of joints determines the degrees of freedom (DOF) of the manipulator. Our manipulator has 5 independent DOF: three for positioning and two for orientation.
The workspace of a manipulator is the total volume swept out by the end-effector as the manipulator executes all possible motions. The workspace is constrained by the geometry of the manipulator as well as mechanical constraints on the joints.
The workspace of our robot is shown in the figure. We have 5 joints with a 180-degree limitation, which is what makes the workspace not a full sphere.

Figure 31 Robot Workspace

Page : 29
The kinematics work was divided into two important problems: first, forward kinematics, and second, inverse kinematics.

But before we go further, we must know our arm model and dimensions. We use a well-known articulated manipulator design, the 6-servo robot arm, with the specific dimensions seen in the figure.

Figure 32 Robot Dimensions

Forward kinematics
Refers to the use of the kinematic equations of a robot to compute
the position of the end-effector from specified values for the joint
parameters.
The forward kinematics problem is concerned with the relationship
between the individual joints of the robot manipulator and the position
and orientation of the tool or end-effector. Stated more formally, the
forward kinematics problem is to determine the position and orientation
of the end-effector, given the values for the joint variables of the robot.
The joint variables are the angles between the links in the case of
revolute or rotational joints, and the link extension in the case of
prismatic or sliding joints.

Page : 30
The forward kinematics problem is to be contrasted with the inverse kinematics problem. There are many methods to describe the forward kinematics solution. We will deal with the most common one, which is good enough to describe the robot kinematics: the Denavit-Hartenberg matrix, or D-H parameters.

DH parameters of the robot: we decouple the robot arm into two sub-manipulators, one for the first three joints (position) and one for the wrist (orientation).

We write R(x+1 -> x) for the rotation matrix between coordinate systems x and x+1, and θi for the i-th joint angle. The first three matrices describe the position of the manipulator in 3D space, and the other two describe the orientation configuration of the wrist.

Page : 31
To find x2, take the first column of the rotation matrix from frame 2 to frame 0, R(2 -> 0), which you find by multiplying R(1 -> 0) by R(2 -> 1):

In a similar way you can find z5:

Now we have the forward kinematics equations, which describe the robot configuration when specific angle values are given:

c1[c2(c4c5c6 − s4s6) − s2s5c6] − s1(s4c5c6 + c4s6) = r11


s1[c2(c4c5c6 − s4s6) − s2s5c6] + c1(s4c5c6 + c4s6) = r21
−s2(c4c5c6 − s4s6) − c2s5s6 = r31
c1[−c2(c4c5s6 + s4c6) + s2s5s6] − s1(−s4c5s6 + c4c6) = r12
s1[−c2(c4c5s6 + s4c6) + s2s5s6] + c1(−s4c5s6 + c4c6) = r22
s2(c4c5s6 + s4c6) + c2s5s6 = r32
c1(c2c4s5 + s2c5) − s1s4s5 = r13
s1(c2c4s5 + s2c5) + c1s4s5 = r23
−s2c4s5 + c2c5 = r33

Page : 32
And here are the equations of the end-effector position with respect to the origin:
c1s2d3 − s1d2 + d6(c1c2c4s5 + c1c5s2 − s1s4s5) = Ox
s1s2d3 + c1d2 + d6(c1s4s5 + c2c4s1s5 + c5s1s2) = Oy
c2d3 + d6(c2c5 − c4s2s5) = Oz

Where "sM" is shorthand for sin(θM) and "cM" for cos(θM), M being the joint index, and Ox, Oy, Oz are the 3D position of the end-effector frame.
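
To make the forward-kinematics computation concrete, here is a minimal Python sketch (an illustration, not the project's actual MATLAB code) that chains standard Denavit-Hartenberg transforms and reads the end-effector position Ox, Oy, Oz from the last column of the resulting homogeneous matrix. The DH constants in the example table are placeholder values, not the real dimensions of our arm.

import numpy as np

def dh_transform(theta, d, a, alpha):
    # Standard Denavit-Hartenberg homogeneous transform for one joint.
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(joint_angles, dh_table):
    # Chain the joint transforms; returns the 4x4 base-to-end-effector matrix.
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_table):
        T = T @ dh_transform(theta, d, a, alpha)
    return T

# Placeholder (d, a, alpha) constants per joint -- NOT the real arm dimensions.
DH_TABLE = [
    (0.10, 0.00, np.pi / 2),
    (0.00, 0.12, 0.0),
    (0.00, 0.12, 0.0),
    (0.00, 0.00, np.pi / 2),
    (0.08, 0.00, 0.0),
]

q = [0.0, np.pi / 4, -np.pi / 4, 0.0, 0.0]   # example joint angles in radians
T = forward_kinematics(q, DH_TABLE)
print("End-effector position Ox, Oy, Oz:", T[:3, 3])
print("End-effector rotation (r11..r33):")
print(T[:3, :3])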

Inverse Kinematics :
In the previous we showed how to determine the end-effector
position and orientation in terms of the joint variables. This chapter is
concerned with the inverse problem of finding the joint variables in
terms of the end-effector position and orientation. This is the problem of
inverse kinematics, and it is, in general, more difficult than the forward
kinematics problem.

The practical question of the existence of solutions to the inverse


kinematics problem depends on engineering as well as mathematical
considerations. For example, the motion of the revolute joints may be
restricted to less than a full 360 degrees of rotation so that not all
mathematical solutions of the kinematic equations will correspond to
physically realizable configurations of the manipulator. We will assume
that the given position and orientation are such that at least one solution exists.

Kinematic Decoupling
Although the general problem of inverse kinematics is quite
difficult, it turns out that for manipulators having six joints, with the last
three joints intersecting at a point (such as the Stanford Manipulator
above), it is possible to decouple the inverse kinematics problem into
two simpler problems, known respectively, as inverse position
kinematics, and inverse orientation kinematics.

To put it another way, for a five-DOF manipulator with a spherical


wrist, the inverse kinematics problem may be separated into two
simpler problems, namely first finding the position of the intersection of
the wrist axes, hereafter called the wrist center, and then finding the
orientation of the wrist.

Page : 33
The idea of kinematic decoupling is illustrated in Figure 33 . For this
class of manipulators the determination of the inverse kinematics can be
summarized by the following algorithm.

Figure 33 Kinematic Decoupling


Step 1: Find q1, q2, q3 such that the wrist center Oc has the coordinates given by the wrist-center equation (a minimal sketch of this step is given below).

Step 2: Using the joint variables determined in Step 1, evaluate the rotation matrix from frame 3 to the base frame, R(3 -> 0).

Step 3: Find a set of Euler angles corresponding to the rotation matrix that remains for the wrist.
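
As an illustration of Step 1, the short Python sketch below computes the wrist-center coordinates from a desired end-effector position o and orientation matrix R, using the standard relation oc = o - d6 * R * [0, 0, 1]^T. Here d6 stands for the distance from the wrist center to the end-effector along the tool's approach axis; the numbers used are assumed example values, not the project's real dimensions.

import numpy as np

def wrist_center(o, R, d6):
    # Step 1 of kinematic decoupling: move back from the end-effector position
    # by a distance d6 along the approach (z) axis of the tool frame.
    o = np.asarray(o, dtype=float)
    R = np.asarray(R, dtype=float)
    return o - d6 * (R @ np.array([0.0, 0.0, 1.0]))

# Example: assumed tool offset d6 = 0.05 m, with the tool pointing straight down.
R_tool_down = np.array([[1.0,  0.0,  0.0],
                        [0.0, -1.0,  0.0],
                        [0.0,  0.0, -1.0]])
oc = wrist_center([0.20, 0.10, 0.15], R_tool_down, d6=0.05)
print("Wrist center xc, yc, zc:", oc)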

Page : 34
Inverse Position: A Geometric Approach
For the common kinematic arrangements that we consider, we can use a geometric approach to find the variables q1, q2, q3 corresponding to the wrist center position Oc.

Consider the elbow manipulator shown in Figure 33, with the components of Oc denoted by xc, yc, zc. We project Oc onto the x0-y0 plane as shown in Figure 34.

Figure 34 Inverse Geometric Approach

We see from this projection that θ1 = Atan2(xc, yc), in which Atan2(x, y) denotes the two-argument arctangent function. Atan2(x, y) is defined for all (x, y) ≠ (0, 0) and equals the unique angle θ such that cos θ = x / √(x² + y²) and sin θ = y / √(x² + y²).

Figure 35 Projection of the wrist
Page : 35
For example, Atan2(1, −1) = −π/4, while Atan2(−1, 1) = +3π/4.
Note that a second valid solution for θ1 is θ1 = π + Atan2(xc, yc).
Of course this will, in turn, lead to different solutions for θ2 and θ3, as we will see below.
These solutions for θ1 are valid unless xc = yc = 0. In this case θ1 is undefined and the manipulator is in a singular configuration, shown in Figure 36.

Figure 36 Singularity
Figure 37 Right Arm Configuration


In this position the wrist center Oc intersects z0; hence any value of θ1 leaves Oc fixed. There are thus infinitely many solutions for θ1 when Oc intersects z0.
From Figure 37, we see geometrically that

where

To find the angles θ2, θ3 for the elbow manipulator, given θ1, we consider the plane formed by the second and third links, as shown in the figure.

Page : 36
Since the motion of links two and three is planar, the solution is
analogous to that of the two-link manipulator of Chapter 1. As in our
previous derivation (cf. (1.8) and (1.9)) we can apply the law of cosines
to obtain

since r² = xc² + yc² − d² and s = zc. Hence, θ3 is given by

Similarly θ2 is given as

The two solutions for θ3 correspond to the elbow-up position and


elbow-down position, respectively.
An example of an elbow manipulator with offsets is the PUMA shown in
Figure

Figure 38 Four Kinematic Solutions
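
For reference, here is a hedged Python sketch of the geometric solution described above, for a simple elbow arm with no shoulder offset (d = 0): θ1 comes from the two-argument arctangent, θ3 from the law of cosines, and θ2 from the triangle formed by links a2 and a3. The link lengths are illustrative values only, and joint limits are ignored.

import math

def elbow_ik(xc, yc, zc, a2, a3, elbow_down=True):
    # Geometric inverse position kinematics for an elbow arm with no offset.
    # Returns (theta1, theta2, theta3) in radians, or None if unreachable.
    theta1 = math.atan2(yc, xc)                  # base rotation
    r = math.hypot(xc, yc)                       # horizontal distance to the wrist center
    s = zc                                       # vertical distance (relative to the shoulder)
    D = (r**2 + s**2 - a2**2 - a3**2) / (2.0 * a2 * a3)   # law of cosines
    if abs(D) > 1.0:
        return None                              # target lies outside the workspace
    sign = 1.0 if elbow_down else -1.0
    theta3 = math.atan2(sign * math.sqrt(1.0 - D**2), D)  # two solutions: elbow up / down
    theta2 = math.atan2(s, r) - math.atan2(a3 * math.sin(theta3),
                                           a2 + a3 * math.cos(theta3))
    return theta1, theta2, theta3

# Example with illustrative link lengths in meters.
print(elbow_ik(0.15, 0.05, 0.10, a2=0.12, a3=0.12))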

Page : 37
Simulations were done in MATLAB to test different robot configurations and to test the inverse kinematics solution.

Figure 39 Matlab Simulation

In the next figure the forward kinematics is tested. Here we see theta-1 to theta-4, which describe all the positioning joints, while the last orientation joint, theta-5, is perpendicular to the last frame, shown as frame O5 (X5, Y5, Z5) in Figure 40.

Figure 40 Forward Kinematic

Page : 38
Here we show the frames of all the joints.

Figure 41 Manipulator Frames

Here the workspace is generated using inverse kinematics, to test which points the arm can reach within the XZ-plane at constant Y = 0, and also to test a constant horizontal configuration for the gripper (as a test for pick-and-place control from a horizontal surface).

Figure 42 Workspace Plane View


This figure shows that there are some points the end-effector cannot reach.

This is because small numerical errors in the cosine and sine calculations leave no inverse kinematics solution for some points.

Page : 39
We can solve this and make our workspace richer. A richer workspace for the same specific configuration is shown below.

Figure 43 Developed Workspace

In the final figure of this section we show the two different solutions of the inverse kinematics problem, called the upper-arm and lower-arm (elbow-up and elbow-down) configurations.

Figure 44 Two Solutions Kinematic

Page : 40
Second , Differential Mobile Car Robot Kinematics:
A mobile robot, or vehicle, has 6 degrees of freedom (DOF)
expressed by the pose: (x, y, z, Roll, Pitch, Yaw). It is composed of
two parts: the position= (x,y,z) and the attitude = ( Roll, Pitch,
Yaw). Informally, Roll can be said to be to the sidewise rotation and
Pitch the rotation forward or backwards. Yaw , commonly also
denoted Heading or Orientation , refers to the direction in which
the robot moves in the x-y plane.

For a robot on a two dimensional surface, the 2D pose (x,y, θ),


where θ denotes the heading, is sufficient to describe its motion. It
Is normally defined in a global coordinate system as illustrated
below. Note that θ is NOT an angular polar coordinate for the
position, instead it points in the forward direction of the robot.

Figure 45 Differential Car Model

Forward kinematics equations


A central concept for the derivation of the kinematics equations is the angular velocity ω of the robot. It is defined as follows: each wheel rotates around the ICC along a circle with radius R.

Figure 46 Car Angular velocity

Page : 41
where v1 is the left wheel's velocity along the ground, v2 is the right wheel's velocity along the ground, D is the robot base width, and R is the signed distance from the ICC to the midpoint between the two wheels. Note that v1, v2, and R are all functions of time. At any moment in time:

The velocity of the CR point, which is the midpoint between the two
wheels, can be calculated as the average of the velocities v1 and v2:

If v1 = v2, then the radius R is infinite and the robot moves in a straight line (see Figure 47a). For different values of v1 and v2, the mobile robot does not move in a straight line but rather follows a curved trajectory around a point located at a distance R from CR (Figure 47b and c, turning around one of the wheels), changing both the robot's position and orientation. If v1 = −v2, then the radius R is zero and the robot rotates in place around CR (Figure 47).

Figure 47 Differential Car Kinematic
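
A minimal numeric sketch of these relations, assuming the usual differential-drive formulas ω = (v2 − v1) / D for the angular velocity and v = (v1 + v2) / 2 for the speed of the midpoint CR, integrated with a simple Euler step over the 2D pose (x, y, θ). Variable names follow the text: v1 is the left wheel speed, v2 the right wheel speed, and D the base width (the value 0.2 m below is only an example).

import math

def diff_drive_step(x, y, theta, v1, v2, D, dt):
    # One Euler integration step of the differential-drive forward kinematics.
    v = (v1 + v2) / 2.0            # speed of the midpoint CR between the wheels
    omega = (v2 - v1) / D          # angular velocity around the ICC
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta

# Equal wheel speeds: straight line. Opposite wheel speeds: rotation in place.
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = diff_drive_step(*pose, v1=0.10, v2=0.10, D=0.2, dt=0.01)
print("after 1 s, v1 = v2:", pose)      # moved about 0.1 m straight ahead

pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = diff_drive_step(*pose, v1=-0.10, v2=0.10, D=0.2, dt=0.01)
print("after 1 s, v1 = -v2:", pose)     # rotated in place by about 1 rad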

Page : 42
7-2 : Control :
First, Pick-and-Place Arm Control.

We can consider the pick-and-place problem as two separate inverse kinematics problems: first, we get a position we want to reach, then we grasp the object, and then we get another position to reach next. This may be repeated as many times as we need.

This could be a closed-loop control with the help of encoders attached to each joint to check the joint angle positions, but we use open-loop control and compensate for any error by using accurate servo motors. So we send angle commands to the servos, and their internal closed-loop control reaches these angles precisely.

We use P-control, assuming the commanded angle is reached, to make a smooth path when going to a point. Here is the algorithm we use.

Figure 48 Pick-and-Place Control Algorithm
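
A hedged Python sketch of the proportional rule in the algorithm above: each joint is stepped toward its inverse-kinematics target by a fraction Kp of the remaining error on every control cycle, which produces the smooth, decelerating approach seen in the simulations. The gain, tolerance, loop period, and the send_servo_command stub are assumed placeholders, not the project's actual values.

import time

KP = 0.2          # fraction of the remaining error applied per cycle (assumed gain)
TOLERANCE = 0.5   # stop when every joint is within 0.5 degree of its target (assumed)

def send_servo_command(joint_index, angle_deg):
    # Placeholder for the real servo interface (e.g. PWM commands via the Arduino).
    print("servo %d -> %.1f deg" % (joint_index, angle_deg))

def p_control_move(current, target, kp=KP, tol=TOLERANCE, dt=0.05):
    # Step all joint angles toward their targets with a simple proportional law.
    current = list(current)
    while max(abs(t - c) for c, t in zip(current, target)) > tol:
        for i, (c, t) in enumerate(zip(current, target)):
            current[i] = c + kp * (t - c)        # P-control update for joint i
            send_servo_command(i, current[i])
        time.sleep(dt)                           # control loop period
    return current

# Example: move a 5-joint arm from its home pose to an IK solution (angles in degrees).
p_control_move(current=[90, 90, 90, 90, 90], target=[45, 120, 60, 90, 10])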

Page : 43
Here are some MATLAB simulations of reaching points with the P-controller. As we can see, the path to the point is smooth and the velocities decrease as the arm gets closer to the goal point, which makes for a well-behaved control design.

Figure 49 Arm Go to Point
Figure 50 Arm Go to Point

Page : 44
Control of Mobile robot

For a robot with differential drive, the direction of motion is


controlled by separately controlling speeds vL and vR of the left and
right wheels respectively. Many such robots have two wheels connected
directly to motors, and in addition some kind of support wheel to keep
the robot upright. Common examples of robot with differential drive are
the Khepera robot , the Roomba vacuum cleaning robot and also our
robot as shown in Figure .

Figure 51 Differential Robot
Figure 52 Robot Design

The algorithm works as follows. First the robot starts with the go-to-goal behavior, while the sensors send data to the controller. When the robot gets close to an obstacle (within the obstacle's influence zone), it uses a combined go-to-goal and avoid-obstacle behavior. Avoid-obstacle is based on the "weighted formula" concept, which means the robot's behavior blends between AO and GTG until it reaches obstacle-free space. The last behavior applies when the robot is very close to obstacles: it uses the same concept but with a heavier weighting penalty. Finally, when the robot is near the goal pose, it stops.

Page : 45
1- Go-To-Goal Control Design:

The robot stabilisation problem can be divided into two different


control problems: robot positioning control and robot orientating control.
The robot positioning control must assure the achievement of a desired
position (xref; yref), regardless of the robot orientation. The robot
orientating control must assure the
achievement of the desired position
and orientation (xd; yd; θd).

Robot positioning control

Figure 53 illustrates the positioning problem, where dL is the distance between the robot and the desired reference (xref; yref) in Cartesian space. The robot positioning control problem will be solved if we ensure dL -> 0.

Figure 53 Position Problem

To overcome this problem, we can define two new variables, dLambda and Phi. dLambda is the distance to R, the nearest point to the desired reference that lies on the robot orientation line; Phi is the angle of the vector that binds the robot position to the desired reference. We can also define dPhi as the difference between the Phi angle and the robot orientation: dPhi = Phi − θ.

We can now easily conclude that:

So, if dLambda -> 0 and dPhi -> 0, then dL -> 0. That is, if we design a control system that assures the convergence of dLambda and dPhi to zero, then the desired reference, xref and yref, is achieved. Thus, the robot positioning control problem can be solved by applying any control strategy that assures such convergence.

To make eθ = dPhi, we just need to define θref = Phi, so that eθ = θref − θ = Phi − θ = dPhi. For this, we make:

Page : 46
Calculating es is generally not very simple, because the s output signal cannot be measured and we cannot easily calculate a suitable value for sref. But if we define the R point in the last figure as the reference point for the s controller, then in this case it is true that es = sref − s = dLambda. So:

The complete robot positioning controller is based on the diagram in the next figure and the equations above. It can be used as a stand-alone robot control system if the problem is just to drive the robot to a given position (xref; yref), regardless of the final robot orientation.

Figure 54 Position Control System

Here we use PID control and simulate the go-to-goal behavior of the robot in MATLAB, starting with a constant linear velocity and a controlled angular velocity (omega). As we can see, the linear velocity decreases when the robot turns with a large omega; this is to prevent the robot from flipping over.

Figure 55 PID GTG Control Simulation
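
A minimal sketch of this go-to-goal controller: the heading error dPhi = Phi − θ is regulated with a PID law on the angular velocity ω, and the linear velocity is scaled down when ω gets large, which is the anti-flipping behavior described above. All gains and limits here are illustrative assumptions.

import math

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def go_to_goal(x, y, theta, x_ref, y_ref, pid, dt, v_max=0.3, omega_max=2.0):
    # One control step: returns (v, omega) commands for the differential base.
    phi = math.atan2(y_ref - y, x_ref - x)                 # bearing toward the goal
    d_phi = math.atan2(math.sin(phi - theta),
                       math.cos(phi - theta))              # heading error wrapped to [-pi, pi]
    omega = max(-omega_max, min(omega_max, pid.step(d_phi, dt)))
    v = v_max * (1.0 - abs(omega) / omega_max)             # slow down while turning hard
    return v, omega

# One example step with assumed gains.
pid = PID(kp=1.5, ki=0.0, kd=0.0)
print(go_to_goal(0.0, 0.0, 0.0, x_ref=1.0, y_ref=1.0, pid=pid, dt=0.05))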

Page : 47
2- Obstacle Avoidance Control Design:
We use 5 ultrasonic sensors distributed along the chassis; the readings coming from them are combined in a weighted formula. This gives a description of the obstacles around the robot and of the direction of the nearest one, which must be avoided the most.

The purpose of this control design is to avoid any obstacles near the robot (a minimal sketch of these steps is given after the figures below):
1. Transform the sensor distance to a point in the robot's coordinate frame.
2. Transform this point from the robot's coordinate frame to the world coordinate frame.
3. Compute a vector that points away from any obstacles.

The sensors' coordinates are given in the robot's frame (centered at the robot, with the robot's orientation). A sensor's distance reading is defined in that sensor's frame (centered at the sensor, with the sensor's orientation).

Figure 56 Robot Frame


Figure 57 Sensor’s Frame
Rotation and Translation in 2D
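
The three steps listed above amount to a 2D rotation and translation followed by a weighted sum. The sketch below shows the idea for three of the five ultrasonic sensors; the mounting positions, angles, and weights are assumed example values, not the robot's real layout.

import math

# Assumed sensor mounting poses on the chassis: (x, y, angle) in the robot frame.
SENSOR_POSES = [(0.10,  0.08,  math.pi / 4),
                (0.12,  0.00,  0.0),
                (0.10, -0.08, -math.pi / 4)]

def sensor_point_in_world(distance, sensor_pose, robot_pose):
    # Steps 1-2: sensor reading -> point in the robot frame -> point in the world frame.
    sx, sy, s_ang = sensor_pose
    rx, ry, r_theta = robot_pose
    # Step 1: detected point in the robot frame, along the sensor's axis.
    px = sx + distance * math.cos(s_ang)
    py = sy + distance * math.sin(s_ang)
    # Step 2: rotate by the robot heading, then translate by the robot position.
    wx = rx + px * math.cos(r_theta) - py * math.sin(r_theta)
    wy = ry + px * math.sin(r_theta) + py * math.cos(r_theta)
    return wx, wy

def avoid_obstacle_vector(distances, robot_pose, weights=(1.0, 1.0, 1.0)):
    # Step 3: weighted sum of vectors pointing from the obstacle points back to the robot.
    rx, ry, _ = robot_pose
    ax, ay = 0.0, 0.0
    for dist, pose, w in zip(distances, SENSOR_POSES, weights):
        wx, wy = sensor_point_in_world(dist, pose, robot_pose)
        ax += w * (rx - wx)
        ay += w * (ry - wy)
    return ax, ay

print(avoid_obstacle_vector([0.5, 0.3, 1.0], robot_pose=(0.0, 0.0, 0.0)))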

Page : 48
AO and GTG Control:
We will use two arbitration techniques, blending and hard
switching, to drive to a goal while avoiding obstacles.

1. Blend go-to-goal and avoid-obstacle vectors in one controller.


2. Switch between go-to-goal and avoid-obstacle controllers
separately.
3. Use the blended controller as an intermediary

Figure 58 AO + GTG Control

Blending: two controllers in one.

Hard switching: one controller at a time; switch from go-to-goal to avoid-obstacles near any obstacle.

Chattering is avoided by using the blended controller as an intermediate state between go-to-goal and avoid-obstacles.
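
In code, the blending itself can be as simple as a convex combination of the two control vectors, with the blending weight alpha driven by the distance to the nearest obstacle. The thresholds below are assumptions for illustration, not tuned project values.

def blend_controllers(u_gtg, u_ao, d_obstacle, d_safe=0.60, d_close=0.20):
    # alpha = 1 -> pure go-to-goal (no obstacle nearby),
    # alpha = 0 -> pure avoid-obstacle (obstacle very close),
    # in between -> smooth blend, which avoids chattering between the two behaviors.
    if d_obstacle >= d_safe:
        alpha = 1.0
    elif d_obstacle <= d_close:
        alpha = 0.0
    else:
        alpha = (d_obstacle - d_close) / (d_safe - d_close)
    return (alpha * u_gtg[0] + (1.0 - alpha) * u_ao[0],
            alpha * u_gtg[1] + (1.0 - alpha) * u_ao[1])

# Far from obstacles the command follows GTG; near them it follows AO.
print(blend_controllers(u_gtg=(1.0, 0.0), u_ao=(-0.5, 0.5), d_obstacle=0.70))
print(blend_controllers(u_gtg=(1.0, 0.0), u_ao=(-0.5, 0.5), d_obstacle=0.40))
print(blend_controllers(u_gtg=(1.0, 0.0), u_ao=(-0.5, 0.5), d_obstacle=0.10))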

Page : 49
7-3 : Vision :

Page : 50
Chapter 8
Hardware :
Usage, characteristics (properties), images, connections
8-1- Sensors: [Cameras, Ultrasonic, Encoders, IMU, IR]
8-2- Actuators: [Servo Motors: 180°, continuous rotation 360°]
8-3- Controllers: [Raspberry Pi, Arduino]
8-4- Power: [Power circuit, PCB, power, voltage, current]

Page : 51
In this chapter we will discuss the components used to construct this robot, showing their usage and characteristics, how to connect them, and the signals or commands they use. Then, at the end of the chapter, we will show the power circuit used to energize the robot.
This robot may be considered an embedded system, and each embedded system consists of three main components: actuators, sensors, and a controller.
Here is the schematic of the embedded system that runs this robot.

Figure 59 Embedded System Schematic

Page : 52
Actuators:
Actuators are the muscles of the manipulators. Common types of actuators are servo motors, stepper motors, pneumatic cylinders, etc.
Wheel Motor:
Motors are the essential part of the robot's locomotion; the rotation of the motor drives the wheel. DC servo motors are very easy to use.
We use a 6 V, 13 kg·cm digital continuous-rotation servo with the specifications shown below.
The two important features of this motor, compared with a plain DC motor, are: first, we can control the velocity simply by sending a command and letting the servo's internal closed-loop control achieve the required velocity; second, it does not need an external gearbox.

Figure 60 Servo Motor Specifications
Figure 61 FS5113R Servo

Arm Motors:
We use servo motors with limited rotation (0-180°). With a servo motor we can easily control the angle of rotation by sending a PWM signal with the corresponding value. Here are the main components of a servo system.

Figure 62 Servo Motor

Figure 63 Servo Motor Components

Page : 53
Here is the PWM signal and the corresponding angle:

Figure 64 Servo Motor PWM
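
To show how an angle command becomes a PWM value: hobby servos typically expect a pulse of roughly 1 ms to 2 ms inside a 20 ms (50 Hz) frame. The exact pulse range of the servos we use may differ, so the constants in this Python sketch are assumptions.

def angle_to_pulse_us(angle_deg, min_us=1000.0, max_us=2000.0):
    # Map a 0-180 degree command to a pulse width in microseconds (assumed 1-2 ms range).
    angle_deg = max(0.0, min(180.0, angle_deg))
    return min_us + (max_us - min_us) * angle_deg / 180.0

def pulse_to_duty_cycle(pulse_us, frame_us=20000.0):
    # Duty cycle (percent) of the pulse inside the standard 50 Hz servo frame.
    return 100.0 * pulse_us / frame_us

for angle in (0, 90, 180):
    pulse = angle_to_pulse_us(angle)
    print("%3d deg -> %4.0f us pulse -> %.2f%% duty" % (angle, pulse, pulse_to_duty_cycle(pulse)))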

Sensors:
Sensors are used to collect information about the internal state of the robot or to communicate with the outside environment. Robots are often equipped with external sensory devices, such as a vision system and touch and tactile sensors, which help them interact with the environment.

Camera:
We use a high-definition camera connected via a USB port.

Figure 65 HD Camera

Page : 54
Ultrasonic
This sensor is a high-performance ultrasonic range finder. It is compact and measures an amazingly wide range, from 2 cm to 4 m. This ranger is perfect for any robotic application, or any other project requiring accurate ranging information. This sensor can be connected directly to the digital I/O lines of your microcontroller, and the distance can be computed from the time required for the sound signal to travel, using the simple formula below.

Distance = (Echo pulse width high time * Sound velocity (340 m/s)) / 2
or Distance in cm = Echo pulse width high time (in us) * 0.017
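
The formula translates directly into code: the speed of sound is about 340 m/s, and the echo time covers the round trip, so it is halved. A small Python sketch (the example pulse width is arbitrary):

SOUND_SPEED_M_PER_S = 340.0

def echo_to_distance_cm(echo_high_time_us):
    # Convert the echo pulse width (microseconds) to a distance in centimeters.
    # The sound travels to the obstacle and back, so the time is divided by two.
    echo_high_time_s = echo_high_time_us * 1e-6
    distance_m = (echo_high_time_s * SOUND_SPEED_M_PER_S) / 2.0
    return distance_m * 100.0          # equivalent to echo_us * 0.017

print(echo_to_distance_cm(588))        # roughly 10 cm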

The module works on a 5 VDC input and gives an output signal directly for detection of any obstacle up to 4 m. Power up the sensor with 5 VDC using the "VCC" and "GND" pins. First of all, a 10 µs trigger input has to be given to the pin named "Trig" on the sensor. This starts one cycle of range conversion and sends 8 bursts of sound waves from the transmitter.

Figure 66 Ultrasonic Work


As soon as the signals are transmitted, the "Echo" pin goes to a high level and remains high until the same sound waves are received by the receiver. If the received sound waves match what the sensor transmitted, the Echo pin goes to a low level. If no object is detected within 5 m, the Echo signal automatically goes to a low level after 30 ms. The figure shows how to connect the HC-04 ultrasonic sensor to the microcontroller.

Figure 67 Ultrasonic Signal

Page : 55
Controller :
The controller receives data from the computer, controls the motions of
the actuator and coordinates these motions with the sensory feedback
information.

We use one master computer that runs the machine learning and vision system: the Raspberry Pi 3. The Pi 3 runs at 1.2 GHz, compared to the Pi 2's 900 MHz, and also has an upgraded power system and the same four USB ports and extendable 'naked board' design as the Pi 2. "Four years ago today, we launched the first Raspberry Pi with our friends at Premier Farnell," Pi founder Eben Upton said.

The full specs for the Raspberry Pi 3 include:


■ CPU: Quad-core 64-bit ARM Cortex A53 clocked at 1.2 GHz
■ GPU: 400MHz VideoCore IV multimedia
■ Memory: 1GB LPDDR2-900 SDRAM (i.e. 900MHz)
■ USB ports: 4
■ Video outputs: HDMI, composite video (PAL and NTSC) via 3.5 mm jack
■ Network: 10/100Mbps Ethernet and 802.11n Wireless LAN
■ Peripherals: 17 GPIO plus specific functions, and HAT ID bus
■ Bluetooth: 4.1
■ Power source: 5 V via MicroUSB or GPIO header
■ Size: 85.60mm × 56.5mm
■ Weight: 45g (1.6 oz)

Figure 68 Raspberry Pi board

Page : 56
The second controller is an Arduino Uno. We use it to reduce the processing load on the Raspberry Pi, and also as an easy way to prototype the embedded system.

Figure 69 Arduino Board

Here’s the connection between the two .

Figure 70 Arduino – Raspberry Pi connected
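
A common way to implement this split in software is a simple serial (USB/UART) link: the Raspberry Pi sends high-level commands and the Arduino replies with sensor readings. The sketch below uses pyserial on the Pi side; the port name, baud rate, and message format are assumptions for illustration, not the project's actual protocol.

import serial  # pyserial

PORT = "/dev/ttyACM0"   # assumed: Arduino Uno enumerated over USB on the Pi
BAUD = 115200           # assumed baud rate, must match the Arduino sketch

def send_wheel_speeds(link, v_left, v_right):
    # Send a simple text command for the Arduino side to parse, e.g. "V 0.10 0.10".
    link.write(("V %.2f %.2f\n" % (v_left, v_right)).encode("ascii"))

def read_ultrasonic_line(link):
    # Read one line of comma-separated distances (in cm) reported by the Arduino.
    line = link.readline().decode("ascii", errors="ignore").strip()
    return [float(v) for v in line.split(",") if v]

with serial.Serial(PORT, BAUD, timeout=1.0) as link:
    send_wheel_speeds(link, 0.10, 0.10)
    print("distances:", read_ultrasonic_line(link))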

Page : 57
Chapter 9
Mechanical :
Design, dimensions, software used (SolidWorks)
9-1- Overview of Robot Design: the whole robot
9-2- Chassis Body: the cuboid shape, wooden
9-3- Arm Manipulator: parts of the arm, end-effector
9-4- Gripper and Wheels Design

Page : 58
9-1- Overview of Robot Design: the whole robot
In this chapter we will discuss our design and the materials used for fabrication. Our arm design is based on the common articulated arm we mentioned before. We used SolidWorks for the design and analysis and to extract the CAD files. We fabricated this design from light wood material cut by a CNC laser machine.

Figure 71 Robot Design

Figure 72 Robot Design

Page : 59
Figure 73 Robot Top View
9-2- Chassis Body: cuboid shape

Figure 74 Chassis Body

Page : 60
Figure 76 Chassis Design
Figure 75 Chassis Design

Figure 77 Chassis Assembly

Page : 61
9-3- Arm Manipulator: parts of the arm

Figure 78 Arm Design

Base Joint

Figure 79 Arm Base Design

Page : 62
Camera Holder

Figure 80 Camera holder design

Gripper Joint

Figure 81 Gripper holder


Shoulder

Figure 82 Shoulder Design

Page : 63
Elbow

Figure 83 Elbow Design

9-4- Gripper:

Figure 84 Gripper Design


9-4- Wheels:

Figure 85 Wheel Design

Page : 64
Chapter 10
 Prototyping & Results :
Simulations, fabrication, assembly, team
photos, work in progress

Page : 65
In this chapter we will review some prototyping photos taken during the installation, assembly, and fabrication of the robot.

Figure 86 Arm Assembly

Page : 66
Figure 87 Wheels Assembly

Page : 67
Pick-and-Place

Figure 88 Pick-and-Place Process

Page : 68
Machine Vision processing

Figure 89 Label Localization

Figure 90 Facial Emotion Detection

Page : 69
Chapter 11
Conclusion :
Final words of dream,
hope, and work

Page : 70
Over six months or more we faced some challenges. Some of them were technical, such as the team members' lack of professional experience and components that could not be found in the local market. But, unexpectedly, we found that most of them were non-technical ones, such as how to communicate and how to put together a timeline for our tasks, besides the need for money and how to fund our ideas to translate them into reality.

We started solving them as best we could, taking time, as usual, to discuss and solve each one. By now we can handle most of the challenges, but the biggest one remains: will it work fine? As we finish the project in these days, I must say that it exhausted us: sleepless nights, stress over most decisions, days of trying again and again, and bad news about broken parts and burned components.

What I found is that nothing is easy and nothing is hard. Everything you work on regularly will strengthen you; it may dig into your thoughts at night until you see it in your dreams. One of the things I had expected was that the idea is everything. What we found is that the team is the first concern you have to take care of, then the idea, and in third place the money. I had also expected a graduation project to be something huge in size, but as we got here we found that huge new ideas need only a small prototype to prove them.

We have to finish here, but before we go we would like to say that we have learned more and more through this experience. We cannot describe the feeling of seeing the smiles on our team's faces when we finished it. This is the ideal end for a team that has succeeded; this is the passion we are seeking, and then: "yeah, we did it".

To recap, we talked about our autonomous robot as our graduation project and how we arrived at it to help disabled people enjoy life. Then we went deeper with some definitions of a robot and an autonomous one, and what our idea is about. Some technical details came next, such as the kinematics, the control of this robot, and the vision system we developed, and finally the mechanical design.

Finally, there is no doubt that the graduation project is the coronation of everyone's university years. It is the final assignment you are asked to deliver on a deadline. It is the milestone of everything you learned in college, not only the technical skills but also the non-technical ones. There is no project without stress, as there is no life without hope. But the graduation project is different from any other project because it carries dreams beside knowledge.

Page : 71
Figures Index , Caption , Page
Figure 1 Robot change the world...................................................................................................................4
Figure 2 Robot Arms......................................................................................................................................5
Figure 3 Wheel Mobile Robots......................................................................................................................6
Figure 4 Legged Robots.................................................................................................................................6
Figure 5 Underwater Robots..........................................................................................................................7
Figure 6 Flying Robot....................................................................................................................................7
Figure 7 Robot Vision....................................................................................................................................8
Figure 8 Industrial Automation......................................................................................................................9
Figure 9 Robot Industrial Revolution..........................................................................................................14
Figure 10 Roomba Robot.............................................................................................................................14
Figure 11 Robots in Kitchen........................................................................................................................15
Figure 12 SCRUBMATE Robot...................................................................................................................15
Figure 13 Disabled People...........................................................................................................................15
Figure 14 Robotic Growth Market...............................................................................................................17
Figure 15 Amazon Robots............................................................................................................................19
Figure 16 inVia Robot..................................................................................................................................20
Figure 17 inVia Robots................................................................................................................................20
Figure 18 inVia Software.............................................................................................................................20
Figure 19 Care-O-Bot..................................................................................................................................21
Figure 20 Care-O-Bot Services....................................................................................................................21
Figure 21 ORFOMED..................................................................................................................................22
Figure 22 Robot Help Disabled....................................................................................................................22
Figure 23 JACO Arm...................................................................................................................................22
Figure 24 KUKA Robot...............................................................................................................................23
Figure 25 Software Associated Work...........................................................................................................25
Figure 26 Hardware Associated Work..........................................................................................................26
Figure 27 Mechanical Associated Work.......................................................................................................26
Figure 28 Team Structure.............................................................................................................................27
Figure 29 ABB Robot...................................................................................................................................29
Figure 30 Arm Kinematic.............................................................................................................................29
Figure 31 Robot Workspace.........................................................................................................................29
Figure 32 Robot Dimensions.......................................................................................................................30
Figure 33 Kinematic Decoupling.................................................................................................................34
Figure 34 Inverse Geometric Approach.......................................................................................................35
Figure 35 Projection of the wrist..................................................................................................................35
Figure 36 Singularity....................................................................................................................................36
Figure 37 Right Arm Configuration.............................................................................................................36
Figure 38 Four Kinematic Solutions............................................................................................................37
Figure 39 Matlab Simulation.......................................................................................................................38
Figure 40 Forward Kinematic......................................................................................................................38
Figure 41 Manipulator Frames.....................................................................................................................39
Figure 42 Workspace Plane View.................................................................................................................39
Figure 43 Developed Workspace..................................................................................................................40
Figure 44 Two Solutions Kinematic.............................................................................................................40
Figure 45 Differential Car Model.................................................................................................................41
Figure 46 Car Angular velocity....................................................................................................................41
Figure 47 Differential Car Kinematic..........................................................................................................42
Figure 48 Pick-and-Place Control Algorithm..............................................................................................43
Figure 49 Arm Go to Point...........................................................................................................................44
Figure 50 Arm Go to Point...........................................................................................................................44
Figure 51 Differential Robot........................................................................................................................45
Figure 52 Robot Design...............................................................................................................................45
Figure 53 Position Problem..........................................................................................................................46
Figure 54 Position Control System..............................................................................................................47
Figure 55 PID GTG Control Simulation......................................................................................................47
Figure 56 Robot Frame................................................................................................................................48
Figure 57 Sensor’s Frame............................................................................................................................48
Figure 58 AO + GTG Control......................................................................................................................49
Figure 59 Embedded System Schematic.....................................................................................................52
Figure 60 Servo Motor Specifications.........................................................................................................53
Figure 61 FS5113R Servo............................................................................................................................53
Figure 62 Servo Motor.................................................................................................................................53
Figure 63 Servo Motor Components...........................................................................................................53
Figure 64 Servo Motor PWM......................................................................................................................54
Figure 65 HD Camera..................................................................................................................................54
Figure 66 Ultrasonic Work...........................................................................................................................55
Figure 67 Ultrasonic Signal.........................................................................................................................55
Figure 68 Raspberry Pi board......................................................................................................................56
Figure 69 Arduino Board.............................................................................................................................57
Figure 70 Arduino – Raspberry Pi Connected............................................................................................57
Figure 71 Robot Design...............................................................................................................................59
Figure 72 Robot Design...............................................................................................................................59
Figure 73 Robot Top View...........................................................................................................................60
Figure 74 Chassis Body...............................................................................................................................60
Figure 75 Chassis Design.............................................................................................................................61
Figure 76 Chassis Design.............................................................................................................................61
Figure 77 Chassis Assembly........................................................................................................................61
Figure 78 Arm Design..................................................................................................................................62
Figure 79 Arm Base Design.........................................................................................................................62
Figure 80 Camera Holder Design................................................................................................................63
Figure 81 Gripper Holder.............................................................................................................................63
Figure 82 Shoulder Design...........................................................................................................................63
Figure 83 Elbow Design...............................................................................................................................64
Figure 84 Gripper Design.............................................................................................................................64
Figure 85 Wheel Design...............................................................................................................................64
Figure 86 Arm Assembly.............................................................................................................................66
Figure 87 Wheels Assembly.........................................................................................................................67
Figure 88 Pick-and-Place Process................................................................................................................68
Figure 89 Label Localization.......................................................................................................................69
Figure 90 Facial Emotion Detection............................................................................................................69

References:
Introduction: Appin Knowledge Solutions, Robotics, pp. 13-23, 2007.
Mark W. Spong, Robot Dynamics and Control, 2nd Edition, pp. 83-95, 2014.
Kinematics of a Car-Like Mobile Robot, Julius-Maximilians-Universität Würzburg, pp. 1-5, 2003.
Frederico C. Vieira, Position and Orientation Control of a Two-Wheeled Robot, pp. 1-4.

Websites
Disability            https://en.wikipedia.org/wiki/Physical_disability
Wheelchairs           https://www.livescience.com/5622-robotic-systems-people-disabilities.html
Robot market          https://www.roboticsbusinessreview.com/manufacturing/
Amazon Robot          https://www.bostonglobe.com/business/2013/12/01/
Amazon Robot          http://time.com/3605924/amazon-robots/
inVia Robot           https://www.nanalyze.com/2017/03/invia-robotics-warehouse-robotics/
Care-O-Bot            https://www.indiatimes.com/boyz-toyz/machines/careobot-service-robot-73511.html
JACO Arm              https://www.online-wohn-beratung.de/produktschau/
KUKA                  https://quality-engineering.industrie.de/allgemein/keine-angst-vor-grossen-teilen/
6 servo robot arm     http://www.arexx.com.cn/en/ProductShow.asp?ID=58
Forward Kinematics    https://robotics.stackexchange.com/questions/8549/d
