
Proceedings of the 18th World Congress

The International Federation of Automatic Control


Milano (Italy) August 28 - September 2, 2011

Design and Control of Automated Guided Vehicle Systems: A Case Study
Q. Li, A. C. Adriaansen, J. T. Udding, and A. Y. Pogromsky ∗

Department of Mechanical Engineering, Eindhoven University of
Technology, 5600 MB, The Netherlands (Tel: +31-40-2473578; e-mail:
q.li@tue.nl).

Abstract: In this paper, we study the design and control of automated guided vehicle (AGV)
systems, with the focus on the quayside container transport in an automated container terminal.
We first set up an event-driven model for an AGV system in the zone control framework.
Then a number of layouts of the road network (guide-path) are carefully designed for the
workspace of the AGVs in a container terminal. Based on our zone control model, a traffic
control strategy is proposed, which decouples the motion conflict resolution among the AGVs
from the routing problem. A routing algorithm is also constructed with the goal of minimizing
the vehicle travel distances and the transportation time. The efficiency of the integrated design
and control is demonstrated by computer simulations in terms of a set of defined measures of
system performance. Lastly, we point out several possibilities towards improving our current
results.

1. INTRODUCTION

Automated guided vehicles normally mean mobile robots (or unmanned vehicles) used in transporting objects. They were traditionally employed in manufacturing systems, but have recently extended their popularity to many other industrial applications, such as goods transportation in warehouses and container transshipment at container terminals. See Vis [2006] for a comprehensive survey of the research on the design of AGV systems.

The booming of international trade spurs the development of automated container terminals (ACTs), equipped with automated container transshipment systems (consisting of automated cranes, automated guided vehicles, etc.), that can meet rapidly increasing demands for higher operational efficiency, lower costs, and smaller variability than what traditional terminals can achieve. There are four main issues in building an AGV system in an ACT. The first is the design of the guide-path (we call it the road network in this paper), which specifies the possible paths on which the vehicles can travel [Steenken et al., 2004]. The second is the dispatching problem, which concerns where and when vehicles should go for container loading or discharging tasks [Briskorn et al., 2006, Kim and Bae, 1999, Bish et al., 2005, Nguyen and Kim, 2009]. The third is vehicle routing, aimed at finding good paths for vehicles dispatched for certain tasks [Steenke et al., 1993, Duinkerken et al., 2006, Stahlbock and Voß, 2008a]. The last is conflict resolution among the fleet of vehicles and between the vehicles and other container handling equipment [Evers and Koppers, 1996, Lehmann et al., 2006, Kim et al., 2006]. Although all these matters are interconnected as far as system performance is concerned, it is very difficult to treat them in a single comprehensive framework. As a meaningful step towards a complete solution, in this paper we present a package of designs and controls that address the first, third and fourth issues simultaneously for quayside container transportation.

Our contributions can be summarized as follows. We first establish a novel zone control model underlying our design and control of an AGV system. This model elaborates the structure of the road network (RN) and defines the vehicles' behavior when traveling on the road network. Then, based on this model, a number of RN layouts are designed according to the specifications of a real ACT. With these settings, a traffic control strategy is proposed that guarantees the absence of inter-vehicle collisions and deadlocks. The traffic control enables the vehicles to freely choose or change their routes and has small time complexity. With the system performance in mind, a heuristic vehicle routing algorithm is developed, oriented at minimizing the travel distance of the vehicles as well as the timespan of the container transportation. Incorporating a task dispatching scheme of a random nature, the performance of the AGV system with the set of proposed designs and controls is investigated by computer simulations. Finally, some discussions and open problems for future work are given.

⋆ The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under grant agreement no. INFSO-ICT-223844.

2. ZONE CONTROL MODEL

The workspace of the AGVs is divided into non-overlapping zones, and the entrance of a zone is strictly controlled. This strategy eases the avoidance of inter-vehicle collisions by demanding that each zone be occupied by at most one vehicle. Another reason that makes it the favorite of most applications is that it is simple to install and easy to expand. Our zone control model described below contains two components: the structure of the road network and the discrete-event based behavior of a vehicle when it follows a route consisting of a sequence of zones.
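The at-most-one-vehicle-per-zone rule described above can be illustrated with a minimal occupancy check. This is our own sketch, not the paper's implementation: the class and method names (ZoneController, request_move) are hypothetical, and the "leave" and "arrive at" events are collapsed into one atomic update for brevity.

```python
# Minimal sketch of zone control: each zone holds at most one vehicle.
# All names here are illustrative, not from the paper.

class ZoneController:
    def __init__(self, zones):
        self.occupant = {z: None for z in zones}   # zone -> vehicle id or None

    def admit(self, vehicle, zone):
        """Place a vehicle in a zone (e.g. when it enters from a depot)."""
        assert self.occupant[zone] is None, "zone already occupied"
        self.occupant[zone] = vehicle

    def request_move(self, vehicle, src, dst):
        """'Intend to leave' event: grant the move only if dst is free."""
        if self.occupant[src] != vehicle or self.occupant[dst] is not None:
            return False            # vehicle must stop and retry later
        # 'leave' + 'arrive at' collapsed into one atomic update here
        self.occupant[src] = None
        self.occupant[dst] = vehicle
        return True

ctrl = ZoneController(["z1", "z2", "z3"])
ctrl.admit("v1", "z1")
ctrl.admit("v2", "z2")
assert ctrl.request_move("v1", "z1", "z2") is False  # z2 occupied: stop
assert ctrl.request_move("v2", "z2", "z3") is True   # z3 free: go ahead
assert ctrl.request_move("v1", "z1", "z2") is True   # now z2 is free
```

In the paper's full model the decision additionally consults the traffic rules of Section 4, which is what rules out deadlocks rather than only collisions.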

978-3-902661-93-7/11/$20.00 © 2011 IFAC. DOI: 10.3182/20110828-6-IT-1002.01232



2.1 Building blocks of the road network

The road network is composed of lanes, crossings and


depots.

Lane and depot A lane is a finite sequence of zones.


Physically, a lane is a road segment on which a vehicle
can move in the direction specified by the order of zones
of the lane. Particularly, we call the first zone and the last
zone of a lane the starting zone (SZ) and ending zone (EZ)
of the lane respectively.
Every depot is modeled as a zone that can accommodate any number of vehicles. Each depot is affiliated with at least one in-lane and one out-lane. An in-lane (resp. out-lane) of a depot is a lane whose direction allows a vehicle to move towards (resp. away from) the depot. We emphasize that a depot is not a zone of any lane.

We use C to denote the set of all zones in the system.

Crossing   A crossing is physically a junction area connecting multiple lanes. Specifically, each crossing is attached with a set of in-lanes and a set of out-lanes. An in-lane (resp. out-lane) of a crossing allows a vehicle to move towards (resp. away from) the crossing. The EZ of any in-lane of a crossing is named an at-crossing zone of the crossing. The zones that are not at-crossing zones are called off-crossing zones. Note that any depot is an off-crossing zone as it is not on any lane. Let R and A be the set of all crossings and the set of all at-crossing zones, respectively.

Neighboring zone   The concept of neighboring zone characterizes immediate zone-to-zone connections. The set of neighboring zones of each depot includes the SZs of all its out-lanes. The neighboring zone of a non-EZ zone c on a lane is the adjacent zone of c on the lane with respect to the direction of the lane. The neighboring zone of the EZ of any in-lane of a depot is the depot. The neighboring zones of the EZ of an in-lane of a crossing (i.e., an at-crossing zone) are the SZs of all the out-lanes of the crossing. We use Υ_c to denote the set of neighboring zones of zone c. (One can see from Assumption 1 that neighboring zones are defined for any zone in C.)

We impose the following assumption on the layout of the road network.

Assumption 1. (a) Each lane is an in-lane (out-lane) of either a unique crossing or a unique depot; (b) each lane has at least two zones; (c) any in-lane of a crossing has at least one neighboring lane.

2.2 Vehicle routes, states and events

We consider an AGV system with N vehicles, denoted by the set V = {v_1, v_2, ..., v_N}. A finite sequence of zones a_i^1, a_i^2, ..., a_i^{e_i}, e_i ∈ N, is called a route of v_i ∈ V if a_i^{p+1} ∈ Υ_{a_i^p} for any p ∈ Z_1^{e_i−1}. (For any m, n ∈ N, Z_m^n is defined to be {m, m+1, ..., n} if n ≥ m and ∅ if n < m.) There are only two types of vehicle states: being in a_i^p, for any p ∈ Z_1^{e_i}, or moving from a_i^p to a_i^{p+1}, for any p ∈ Z_1^{e_i−1}.

Fig. 1. Quayside container transport at an automated container terminal

The event-driven behavior of vehicle v_i can be sketched as follows. At a certain time when v_i is in a_i^p, p ∈ Z_1^{e_i−1}, it triggers an "intend to leave" event. For the sake of collision and deadlock avoidance, the vehicle is required to check some traffic rules (see Section 4) to make a "go ahead" or "stop" decision. If the vehicle is permitted to go ahead, it triggers a "leave" event and changes its state to moving from a_i^p to a_i^{p+1}; otherwise it has to stop but will continually trigger the "intend to leave" event for leaving the zone a_i^p. In the former case, after the vehicle has completely entered a_i^{p+1}, it triggers an "arrive at" event and changes its state to being in a_i^{p+1}. The process above keeps iterating and terminates when v_i is in a_i^{e_i}.

We say that a vehicle arrives at, intends to leave, or leaves some zone at time t if the vehicle triggers the corresponding event at t. Note that a vehicle switches its state only when it arrives at or leaves a zone, and must be in the zone when it intends to leave a zone.

3. LAYOUT DESIGN OF THE ROAD NETWORK

3.1 Quayside Container Transport at an ACT

In this work, we consider a container handling scenario in which the quay cranes (QCs) and yard stackers (YSs) are in charge of the container collection operations in the quay area (QA) and yard area (YA), and a team of AGVs is used for shuttling the containers between the two areas. The operation is illustrated in Figure 1. Specifically, in discharging a vessel, the QCs take the containers off the vessel and put them in the associated container buffers. These containers will later be picked up by some of the AGVs and transported across the transportation area (TA) to the container buffers of the specified YSs in the yard area. There the containers will be put in container stacks by the YSs. The other way around, in loading a vessel, the containers are collected from certain yard stacks by the YSs and transported to designated QCs by the AGVs. In this case study, the workspace of the AGVs is considered as the combined area of the QA, TA and YA.

Now suppose that a road network defined as in Section 2 is built over the workspace, with the buffer of each QC or YS modeled as one or more zones (see also Subsection 3.2). For simplicity, these zones are said to be the zones of the QC or YS. Then, based on the working scenario described above, each AGV in operation can be seen as


being assigned a sequence of tasks, where a task of a vehicle means an ordered pair of zones, say (c_1, c_2). We call the former zone c_1 and the latter c_2 the source zone and the destination zone of the task. In addition, a route of a vehicle is said to be a route for the task (c_1, c_2) if it starts with c_1 and ends up with c_2.

3.2 Layout Design of the Road Network

In this case study, we consider a typical large container terminal with a TA of dimension 2000m × 40m, and 20 QCs and 66 YSs distributed on the sea side and yard side respectively (see Figure 5). In addition, without any loss of generality, we assume that there is only one depot, which is located outside the workspace (but connected to the TA by its in-lanes and out-lanes), and hence is not a subject of our layout design. For this reason, in the remainder of this subsection, by zones we mean non-depot zones.

Geometrical properties of the zones   First we fix the shape of the zones to be rectangular because of the rectangular shape of the vehicles. We call the dimension of the edge of a zone along which a vehicle moves the length of the zone, and the dimension of the other edge the width. The geometric and kinematic specifications of the AGVs considered in this paper are listed in the table below.

Geometric and kinematic characteristics of AGVs
Length   Width   Max speed   Max acc.    Max dec.
10m      4m      7m/s        0.5m/s²     2.5m/s²

Based on the above specifications of the AGVs, we put the length of a zone at 22m (slightly larger than the sum of the length of a vehicle and the distance a vehicle takes to make a full stop from the maximum speed) and the width at 7m. The basic rationale behind this choice is that, for safety reasons, the body of a vehicle must be completely inside a zone when the state of the vehicle is in the zone. See Li et al. [2011] for more details.

Layout design for the quay area and yard area   For simplicity, the road network has the same layout at each QC or YS, which is illustrated in Figure 2. The buffer of a QC or YS is modeled by the SZs on three different lanes in between two crossings. The 2 in-lanes of one of the two crossings (the left one in Figure 2) allow vehicles to drive off the TA and towards the QC or YS, while the 2 out-lanes of the other crossing lead the vehicles back to the TA.

Fig. 2. Road network at a QC or YS

Layout design for the transportation area   In view of the geometric characteristics of the TA, we focus on layouts with straight horizontal (parallel to the long edge of the TA) adjacent roads. A road is indeed composed of multiple lanes, joined by crossings, on the same vertical level. See Figure 3 for an example of such layouts.

For simplicity, each crossing is chosen to be rectangular. In order to avoid collision between a vehicle passing some crossing and another vehicle waiting at the same crossing, each crossing should be able to accommodate the whole body of a vehicle. Considering the turning radius of the vehicles, the horizontal dimension of the crossings is set at 15m. The vertical dimension of a crossing depends on the layout design and its location in the TA.

As the vertical dimension of the TA is considered here to be 40m, it allows at most 6 parallel roads. Based on this restriction, four types of layouts for the TA will be discussed:

(T1) 4 parallel roads and small crossings.
(T2) 4 parallel roads and big crossings.
(T3) 6 parallel roads and small crossings.
(T4) 6 parallel roads and big crossings.

For want of space, only examples of the layouts of Types T1 and T2 are shown here in Figures 3 and 4. (We refer the reader to Li et al. [2011] for illustrations of the other types of layouts.)

A small crossing is defined as a crossing which connects two adjacent parallel roads (see Figure 3). A big crossing is a crossing which connects more than two parallel roads (see Figure 4). One advantage of small crossings is that only a small number of AGVs need to wait at the crossing while some other vehicle is passing it. The main disadvantage of using small crossings is that an AGV has to pass a relatively large number of crossings for a QC-YS or YS-QC task. On the other hand, the use of big crossings might increase the waiting time of the AGVs at the crossings due to possibly more vehicles competing for the crossings; but in general, compared with using small crossings, it significantly reduces the distances of the transportation tasks by providing direct channels linking the QA and the YA. The latter point is illustrated in Figures 3 and 4 by two routes (in dashed lines) that serve the same purpose for a vehicle crossing the TA.

In Figures 3 and 4, each crossing with an upward (resp. downward) arrow, called a TA-YS (resp. TA-QC) crossing, is linked with the in-lanes and out-lanes of two adjacent YSs (resp. one QC); i.e., it allows a vehicle to move to two YSs (one QC) from the TA and vice versa by passing it.

Fig. 3. An example of layout type T1
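The 22m zone length chosen in this subsection can be sanity-checked from the vehicle specifications: the braking distance from top speed under uniform deceleration is v²/(2a) = 7²/(2 × 2.5) = 9.8m, so a vehicle plus its full stopping distance spans 19.8m, comfortably below 22m. A quick computation (our own check, not from the paper):

```python
# Check that the 22 m zone length covers a vehicle plus its full braking
# distance from top speed (uniform deceleration assumed).
vehicle_length = 10.0   # m
v_max = 7.0             # m/s
decel = 2.5             # m/s^2

braking_distance = v_max**2 / (2 * decel)        # 9.8 m
required = vehicle_length + braking_distance     # 19.8 m
assert abs(braking_distance - 9.8) < 1e-9
assert required < 22.0   # the chosen zone length leaves a safety margin
```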


Fig. 4. An example of layout type T2

In the layouts consisting of 4 parallel roads, all lanes in the 2 upper roads have the same direction, which is opposite to the common direction of the lanes in the 2 lower roads. Analogously, in the case of 6-road layouts, the direction of all lanes in the 3 upper roads is opposite to that of the lanes in the 3 lower roads. These choices aim at making the AGVs brake as little as possible when traveling across the TA. See Li et al. [2011] for more details.

4. TRAFFIC CONTROL AND ROUTING ALGORITHM

4.1 Traffic Control

The goal of the traffic control is to prevent inter-vehicle collisions and deadlocks. This is an important issue that has attracted intensive research effort [Evers and Koppers, 1996, Rajeeva et al., 2003, Steenken et al., 2004, Vis, 2006, Kim et al., 2006]. See also Fanti et al. [1997], Yeh and Yeh [1998], Reveliotis [2000], Wu and Zhou [2005] for related studies.

In Li et al. [2010], we have proposed a set of traffic rules, required to be checked whenever any vehicle intends to leave any zone, to realize the avoidance of deadlocks and inter-vehicle collisions. Applying the traffic control to this case study, it is ensured that (1) no inter-vehicle collision can happen; and (2) each vehicle can complete any finite number of tasks, with any route for each task, if the route of the last task ends up at a depot (meaning that the vehicle terminates its operation by parking in the depot). We refer the reader to Li et al. [2011] for the details of the traffic control.

An appealing advantage of the proposed traffic control is that it offers a large extent of freedom in routing the AGVs. In fact, the route for each task of a vehicle can even be established online rather than necessarily fixed beforehand. Specifically, before a vehicle arrives at some zone, it can freely choose its next zone for the route as long as it will reach its current destination in finitely many zones. Another merit of the traffic control is that it is time-efficient: the time complexity is linearly dependent on the number of vehicles in the system.

4.2 Routing algorithm

As mentioned in the previous subsection, before a vehicle arrives at some zone c it has the freedom to choose the next zone for its route out of the neighboring zones of c. But from Assumption 1 we know that if c is an off-crossing zone, then c has only one neighboring zone. Therefore, the only case of interest is when c is an at-crossing zone. Roughly speaking, the routing algorithm introduced here ensures that the vehicle always chooses a zone such that it will be closer to the destination of its current task at the resulting next crossing. As a direct consequence, this guarantees that the vehicle reaches the destination in finitely many zones. Beyond this prime property, the algorithm also intends to "smooth" the motion of the vehicles as much as possible.

Now suppose that c_d is the destination zone of the current task of some vehicle, which may be some zone of a QC, a YS or the depot. Then before arriving at some at-crossing zone c of crossing r ∈ R, the vehicle needs to select a zone out of the zones in Υ_c = S_r := {c_s^i : l_i ∈ O_r}, where O_r is the set of out-lanes of crossing r, and c_s^i denotes the SZ of l_i. For any zone c*, we use d_m(c*, c_d) and x_m(c*, c_d) to represent the length of the shortest route(s) between zone c* and c_d and the smallest number of crossings a vehicle has to pass when running from c* to c_d, respectively.

Before we state our routing algorithm, define two zone sets:

Γ_1 = {c_s^i ∈ S_r : d_m(c_e^i, c_d) < d_m(c, c_d)},
Γ_2 = {c* ∈ Γ_1 : c* is available};

where c_e^i is the EZ of the lane l_i (by Assumption 1(a), c_e^i is either an at-crossing zone or the EZ of the in-lane of the depot); and c* is "available" means that there is no vehicle in c*, moving from some zone to c*, or moving from c* to some zone. In addition, let

Γ_3 = {c* ∈ Γ_2 : c* = argmin_{c' ∈ Γ_2} d_m(c', c_d)},  if Γ_2 ≠ ∅;
Γ_3 = {c* ∈ Γ_1 : c* = argmin_{c' ∈ Γ_1} d_m(c', c_d)},  otherwise.

Online Vehicle Routing Algorithm

Step 1: If c_d is on some lane l_k ∈ O_r, then choose the zone c_s^k; otherwise go to Step 2.
Step 2: If #Γ_3 = 1, then choose the single zone in Γ_3; otherwise go to Step 3.
Step 3: Let Γ_4 = {c* ∈ Γ_3 : c* = argmin_{c' ∈ Γ_3} x_m(c', c_d)}. If #Γ_4 = 1, then choose the single zone in Γ_4; otherwise go to Step 4.
Step 4: If there is a zone c* in Γ_4 to reach which the vehicle does not need a turn, then c* is selected; otherwise choose randomly a zone from Γ_4.

Remark 2. In each layout of the road network designed in Subsection 3.2, the set Γ_1 cannot be empty; and such a c* in Step 4, if it exists, must be unique.

Note that to implement the routing algorithm above, one needs to maintain, for each crossing r and each possible destination zone c_d, the data sets {d_m(c, c_d) : c is any at-crossing zone of r} and {d_m(c_s^i, c_d), x_m(c_s^i, c_d), d_m(c_e^i, c_d) : for any l_i ∈ O_r}. But for each given layout, these data can be generated and stored offline before the system starts running. Therefore, the time complexity of the proposed routing algorithm is very small.
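The routing steps above can be sketched in a few lines of Python. This is our own reading of the algorithm, not the authors' code: the function name, the data layout of the precomputed d_m and x_m tables, and the availability and turn predicates are all assumptions made for illustration. As in Remark 2, the sketch assumes Γ_1 is non-empty for the layouts considered.

```python
import random

# Sketch of the online routing choice at an at-crossing zone (Steps 1-4).
# d_m / x_m are assumed precomputed offline per layout; all names and the
# data layout are ours, not the paper's.

def choose_next_zone(c, c_d, out_lanes, d_m, x_m, available, needs_turn):
    """
    c          : current at-crossing zone of crossing r
    c_d        : destination zone of the current task
    out_lanes  : list of (sz, ez, lane_has_cd) tuples for r's out-lanes
    d_m, x_m   : dicts (zone, c_d) -> shortest distance / fewest crossings
    available  : predicate sz -> bool (no vehicle in, entering, or leaving sz)
    needs_turn : predicate sz -> bool (does reaching sz require a turn?)
    """
    # Step 1: the destination lies on an out-lane of this crossing.
    for sz, ez, lane_has_cd in out_lanes:
        if lane_has_cd:
            return sz
    # Gamma_1: SZs of out-lanes whose EZ is closer to c_d than c is
    # (assumed non-empty for these layouts, cf. Remark 2).
    g1 = [sz for sz, ez, _ in out_lanes if d_m[(ez, c_d)] < d_m[(c, c_d)]]
    g2 = [sz for sz in g1 if available(sz)]
    pool = g2 if g2 else g1                       # Gamma_2 if non-empty
    best = min(d_m[(sz, c_d)] for sz in pool)
    g3 = [sz for sz in pool if d_m[(sz, c_d)] == best]
    if len(g3) == 1:                              # Step 2
        return g3[0]
    fewest = min(x_m[(sz, c_d)] for sz in g3)
    g4 = [sz for sz in g3 if x_m[(sz, c_d)] == fewest]
    if len(g4) == 1:                              # Step 3
        return g4[0]
    straight = [sz for sz in g4 if not needs_turn(sz)]   # Step 4
    return straight[0] if straight else random.choice(g4)
```

Because the distance tables are generated offline, each call touches only the handful of out-lanes of one crossing, which matches the paper's claim that the online routing cost is very small.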


5. SIMULATION STUDIES

In this section, via computer simulations, we investigate the performance of the AGV systems with the designs and controls presented in the preceding content.

5.1 Task dispatching scheme

Here we introduce a simple stochastic task dispatching scheme that is used to drive the simulations. An in-depth discussion of this topic is beyond the scope of the paper. See Steenken et al. [2004] and Stahlbock and Voß [2008b] for recent progress in the research on this subject.

There are three types of tasks for each AGV:

(1) traveling from the depot to a zone of a QC or YS (first task only);
(2) traveling across the TA, i.e., from a zone of a QC to a zone of a YS or the other way around;
(3) traveling from a zone of a QC or YS to the depot (last task only).

For the tasks of the second type, the source and destination stations are not arbitrarily chosen. Instead, according to a typical container transshipment scenario, the 20 QCs and 66 YSs are divided into 5 groups respectively. The groups of the QCs are disjoint, each consisting of 4 QCs, while the groups of the YSs are overlapping, with each group containing 24 YSs. For each vehicle, the source and destination zones for any task of the second type are randomly selected from the zones of the related QC and YS groups, respectively (see Figure 5).

In addition, for the tasks of types (1) and (2), after a vehicle arrives at the destination zone of each task, it will stay stationary in the zone for 30 seconds for a container loading or discharging process before being assigned a new task.

Fig. 5. Task dispatching scheme

5.2 Performance measures

Several performance measures are defined here for different aspects of concern. The first measure is related to the throughput of the terminal, called the "average tasks performed per hour" (ATP), which is defined as

ATP = 3600 / ATT,    (1)

where ATT denotes the "average task time" calculated by

ATT = (Σ_{i=1}^{N} Σ_{j=1}^{M_i} T_i^j) / (Σ_{i=1}^{N} M_i),    (2)

with T_i^j the time (in seconds) for vehicle v_i to complete its jth task. (Recall that N is the number of vehicles, and M_i is the number of tasks for vehicle v_i.)

From an energy cost point of view, another performance measure of interest is the "average travel distance" (ATD), whose definition mimics that of ATT with T_i^j replaced by D_i^j, the travel distance for the jth task of v_i.

The last two performance measures, namely the "average tardiness" (ATR) and the "standard deviation of the tardiness" (STDTR), indicate how much the motions of the AGVs are delayed by the traffic. The formula for computing the ATR has the same form as (2) but with T_i^j substituted by

D̄_i^j = T_i^j − (minimum time required to complete task j of v_i).

Here the second term on the right-hand side is the time for v_i to complete the task via the shortest route, assuming v_i is the only vehicle running in the workspace. The STDTR is computed by

STDTR = sqrt( (Σ_{i=1}^{N} Σ_{j=1}^{M_i} (D̄_i^j − ATR)²) / (Σ_{i=1}^{N} M_i) ).    (3)

5.3 Simulation results

We run simulations with N = 20, 30, ..., 120 and M_i = 50 for any i ∈ Z_1^N. In order to obtain adequate precision of analysis, we repeat the simulations for each layout 20 times and take the average value of each performance measure. Because of the page limit, we refer the reader to Li et al. [2011] for the numerical results.

We see from the simulation results that the layout with four roads and big crossings remains the best choice over this wide range of vehicle team sizes. This can be roughly explained as follows. As mentioned in Subsection 3.2, the use of small crossings may result in fewer vehicles waiting at the crossings in the TA (though not many fewer, since simultaneous passings of a big crossing by two vehicles do not necessarily conflict with each other). However, on average, the travel distance of the vehicles in this case is considerably larger than in the case of big crossings. The use of six roads, compared with four roads, provides more space (lanes, zones) for the vehicles' movement. But this benefit is offset by increased travel distances and more vehicles competing for the crossings.

In addition, we note that using the layout with four roads and big crossings gives moderate ATR and STDTR even for a large team of operational vehicles, which implies that in this case (1) the traffic congestion is not too bad; and (2) one can estimate for each task the arrival time of the vehicle performing the task with an acceptable error.

6. CONCLUSIONS AND DISCUSSIONS

In this paper, we give a solution to the design and control of the AGV system for an automated container terminal. The popular zone control approach is used here to ease the traffic management. Various road network layouts are designed based on the practical dimensions of


a container terminal. A traffic control scheme as well as a routing algorithm are developed. Computer simulations demonstrate the efficiency of our results in terms of some defined performance measures.

For an interesting comparison, we also ran computer simulations with free-ranging AGVs, where the collision avoidance is realized by so-called potential field based approaches [Whitcomb et al., 1992]. We refer the reader to the conclusion part of Li et al. [2011] for our findings.

There are several possible directions for further research. First, in our traffic control strategy, it is required that each lane of the road network contain at least two zones. Weakening this restriction may be important for some applications where the workspace of the AGVs is very limited. In addition, one could think of a relaxation of the token-holding requirement in the traffic control scheme so that multiple vehicles can leave different at-crossing zones simultaneously, and hence the performance of the AGV system can be improved. As far as we can tell, using local crossing tokens, instead of the global one required in this work, and adding inter-crossing communications may lead to a successful trial.

The designs of the layout of the road network and the routing algorithm are vital to the performance of the AGV system. The designs presented in this paper are partially heuristic, aided by intuition and practical experience. Better results could be obtained by setting up and solving formal optimization problems. Along this line of thinking, the balance between the optimality and the real-time applicability of the solutions becomes critical for practical applications.

In this work, the tasks of the AGVs are assigned in a purely random manner. We predict room for performance improvement in the future if a sophisticated task dispatching strategy could be incorporated, although we have seen only a moderate tardiness for a typical number of operational AGVs with our current strategy. Such a dispatching strategy is desired to work collaboratively with the routing algorithm and utilize the feedback of real-time traffic flow information.

REFERENCES

E. K. Bish, F. Y. Chen, Y. T. Leong, B. L. Nelson, J. W. Cheong Ng, and D. Simchi-Levi. Dispatching vehicles in a mega container terminal. OR Spectrum, 27:491–506, 2005.
D. Briskorn, A. Drexl, and S. Hartmann. Inventory-based dispatching of automated guided vehicles on container terminals. OR Spectrum, 28:611–630, 2006.
M. B. Duinkerken, J. A. Ottjes, and G. Lodewijks. Comparison of routing strategies for AGV systems using simulation. In Proc. of the 2006 Winter Simulation Conference, Monterey, CA, USA, 2006.
J. J. M. Evers and S. A. J. Koppers. Automated guided vehicle traffic control at a container terminal. Transportation Research Part A: Policy and Practice, 30:21–34, 1996.
M. P. Fanti, B. Maione, S. Mascolo, and B. Turchiano. Event-based feedback control for deadlock avoidance in flexible production systems. IEEE Trans. on Robotics and Automation, 13:347–363, 1997.
K. H. Kim and J. W. Bae. A dispatching method for automated guided vehicles to minimize delays of containership operations. International Journal of Management Science, 5:1–25, 1999.
K. H. Kim, S. M. Jeon, and K. R. Ryu. Deadlock prevention for automated guided vehicles in automated container terminals. OR Spectrum, 28:659–679, 2006.
M. Lehmann, M. Grunow, and H.-O. Günther. Deadlock handling for real-time control of AGVs at automated container terminals. OR Spectrum, 28:631–657, 2006.
Q. Li, J. T. Udding, and A. Y. Pogromsky. Modeling and control of the AGV system in an automated container terminal. In Proc. of the AsiaMIC 2010, Phuket, Thailand, 2010. (Available at http://yp.wtb.tue.nl/pdfs/12192.pdf.)
Q. Li, A. C. Adriaansen, J. T. Udding, and A. Y. Pogromsky. Design and control of automated guided vehicle systems: A case study. Technical report, Mar. 2011. (Available at http://yp.wtb.tue.nl/pdfs/12777.pdf.)
V. D. Nguyen and K. H. Kim. A dispatching method for automated lifting vehicles in automated port container terminals. Computers & Industrial Engineering, 56:1002–1020, 2009.
L. M. Rajeeva, H. G. Wee, W. C. Ng, C. P. Teo, and N. S. Yang. Cyclic deadlock prediction and avoidance for zone-controlled AGV system. International Journal of Production Economics, 83:309–324, 2003.
S. A. Reveliotis. Conflict resolution in AGV systems. IIE Transactions, 32:647–659, 2000.
R. Stahlbock and S. Voß. Vehicle routing problems and container terminal operations: an update of research. In B. Golden, S. Raghavan, and E. Wasil, editors, The Vehicle Routing Problem: Latest Advances and New Challenges, pages 551–589. Springer, Berlin, 2008a.
R. Stahlbock and S. Voß. Operations research at container terminals: a literature update. OR Spectrum, 30:1–52, 2008b.
D. Steenke, A. Henning, S. Freigang, and S. Voß. Routing of straddle carriers at a container terminal with the special aspect of internal moves. OR Spectrum, 15:167–172, 1993.
D. Steenken, S. Voß, and R. Stahlbock. Container terminal operation and operations research: a classification and literature review. OR Spectrum, 26:3–49, 2004.
I. F. A. Vis. Survey of research in the design and control of automated guided vehicle systems. European Journal of Operational Research, 170:677–709, 2006.
L. L. Whitcomb, D. E. Koditschek, and J. B. D. Cabrera. Toward the automatic control of robot assembly tasks via potential functions: the case of 2-D sphere assemblies. In Proc. of the IEEE Int. Conf. on Robotics and Automation, Nice, France, 1992.
N. Q. Wu and M. C. Zhou. Modeling and deadlock avoidance of automated manufacturing systems with multiple automated guided vehicles. IEEE Trans. on Systems, Man, and Cybernetics, Part B: Cybernetics, 35:1193–1202, 2005.
M. S. Yeh and W. C. Yeh. Deadlock prediction and avoidance for zone-control AGVs. International Journal of Production Research, 36:2879–2889, 1998.

Procedia Engineering 97 (2014) 2011 – 2021

12th GLOBAL CONGRESS ON MANUFACTURING AND MANAGEMENT, GCMM 2014

Automated Guided Vehicle with Robotic Logistics System


V. Jaiganesh*, J. Dhileep Kumar, J. Girijadevi
* Professor, Department of Mechanical Engineering, S.A. Engineering College,
Veeraraghavapuram, Thiruverkadu Post, Chennai-600 077.

Abstract

The development of automated guided vehicles has played a major role in improving material handling techniques in engineering industries in recent years. In this paper, an automated guided vehicle (AGV) comprises a material transfer system mounted on top and a driving device at the bottom to move the vehicle as desired. The vehicle is a customised AGV intended for a special material handling task and can also be adapted to custom applications. The vehicle operates on its own once a program is fed into the control device. The control device is common to both the driving device and the transfer device, which are connected together; it operates the vehicle and manages the overall operation of the AGV.
Proximity sensors are set up along the AGV's pathway to detect the vehicle's movement and directly control the starting and stopping of the AGV. Photo sensors are incorporated to detect the material or object at a station. The material transfer system loads and unloads material through a set of dedicated devices whose electrical connections are interconnected.
The control device receives a signal from the transfer device once a transfer is completed and transmits a signal to the driving system to move the vehicle to the next destination point. Considering the flow path, the magnetic tape method is best suited to this vehicle, and the flow path is designed taking all aspects into account. The vehicle is battery powered and charges automatically; an inductive power transfer method is implemented in the vehicle to enhance performance.

© 2014 The Authors. Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license
(http://creativecommons.org/licenses/by-nc-nd/3.0/).
Selection and peer-review under responsibility of the Organizing Committee of GCMM 2014.

Keywords: Control device; Proximity sensors; Transfer device; Photo sensors; Magnetic tape

* Corresponding author. Tel.: +91 9443278181. E-mail address: drjai@saec.ac.in

doi:10.1016/j.proeng.2014.12.444

1. Introduction

The world is moving toward new designs, and the material movement system should be no exception. To improve efficiency and reduce the cost of human operators in manufacturing as well as in logistics, corporations and organizations use robots as efficient tools to accomplish demanding tasks. An automated guided vehicle (AGV) is a type of mobile vehicle designed primarily to move material from one place to another. AGVs are commonly used in manufacturing plants, warehouses, distribution centers and terminals. For navigation, AGVs mostly use lane paths, signal paths or signal beacons. Various sensors are also used in AGVs, for example optical sensors, laser sensors, magnetic sensors and cameras. Modern AGVs are free-ranging: their tracks are defined in software and can be customized when new stations and flows are added. The AGV also detects obstacles and impending dangers in its path.

A prevalent automation tool in industry is the Programmable Logic Controller (PLC), used to accomplish the desired control functions. A PLC is user friendly, does not adversely affect its environment, and allows the processing of analog and digital signals from the sensors, so the present investigation uses a PLC [2]. In this proposal, an articulated robotic arm is mounted on the vehicle to transfer the load from a station to the AGV and from the AGV to a station [3], since robots are efficient equipment for moving material without a human operator.

2. Depiction of Control Device

The control device controls the movement and speed of the vehicle with high accuracy through a set of purpose-selected equipment, and governs the entire vehicle within certain limits. The auxiliary components of the control device are a PLC-operated brushless DC motor, a proximity sensor, a photo sensor, etc.

2.1 PLC operated Brushless DC motor

Brushed DC motors have long been used in industry because of their linear characteristics and the ease of adjusting their speed by means of a simple power electronic circuit. The main drawback of such a system is the mechanical commutation used in brushed DC motors; these motors also tend to operate at low efficiency, especially at low power ratings. The brushless DC motor (BLDCM) overcomes the limitations of brushed DC motors, with superior performance. The present investigation uses a PLC-operated BLDCM for better performance [1] [7] [8].

Brushless DC motors can be divided into two types, those with sinusoidal back-EMF (BEMF) and those with trapezoidal BEMF. The present AGV uses a BLDC motor with a trapezoidal BEMF [1]. Near-constant torque in a BLDCM is achieved by the trapezoidal BEMF waveform together with a stator winding fed by rectangular current. The three phase windings are mounted on the stator while the rotor (with magnets) is free to move. Two phases conduct at any one time, providing continuous torque.

Fig. 1 shows the drive system for a BLDCM

The complete drive scheme for a BLDCM is shown in Fig. 1, which illustrates the sequence followed to drive such systems. The controller waits until the position sensor sends its signal and then commutates the windings based on the position of the rotor. As long as this control loop is maintained, the motor continues running [1].
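The six-step commutation described above can be sketched in software. The following Python sketch is illustrative only: the paper's controller is a PLC, and the exact Hall-state ordering depends on the motor wiring. It maps a three-bit position-sensor state to the pair of phases to energize.

```python
# Six-step (trapezoidal) commutation sketch for a BLDC motor.
# A 3-bit Hall-sensor state selects which phases to energize.
# This table is illustrative; real ordering depends on motor wiring.
COMMUTATION_TABLE = {
    # hall state: (phase A, phase B, phase C); +1 = high, -1 = low, 0 = floating
    0b001: (+1, -1, 0),
    0b011: (+1, 0, -1),
    0b010: (0, +1, -1),
    0b110: (-1, +1, 0),
    0b100: (-1, 0, +1),
    0b101: (0, -1, +1),
}

def commutate(hall_state):
    """Return the phase drive pattern for the current rotor position.

    At any instant exactly two phases conduct (one high, one low) and one
    floats, which is what produces the near-constant torque the paper
    attributes to the trapezoidal back-EMF."""
    try:
        return COMMUTATION_TABLE[hall_state]
    except KeyError:
        # States 0b000 and 0b111 are invalid for 120-degree Hall spacing.
        raise ValueError(f"invalid hall state {hall_state:#05b}")
```

In each valid state the three drive levels sum to zero, reflecting that current enters one winding and leaves another while the third floats.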

2.2 Proximity Sensor


The proximity sensors are used in this vehicle for two purposes: one for detecting objects near the vehicle and another for detecting the AGV at the stations. Inductive proximity switches are used to detect metallic objects without physical contact. Their high-speed switching and compact size make them indispensable in automation applications. An inductive proximity switch consists of a coil driven by an oscillator. The oscillator develops an electromagnetic field that appears at the active face of the switch; when a metal target enters this region, the electromagnetic field is reduced and the switch turns on or off. Such switches may also be used to count metallic objects, monitor the position of elements in a machine, and sense the presence of metallic parts. This helps to avoid major accidents.

One sensor is mounted low at the front of the vehicle. It works by applying a voltage to a pair of LEDs, which emit infrared light through the air toward the object; the light is reflected back. When an object is close to the vehicle the reflected light is much stronger, which is used to sense the object and stop the vehicle. The sensor supports operations such as detection, positioning and inspection; it avoids collisions and is also termed a distance sensor. In addition, a pair of proximity sensors is installed at each station to sense the presence of the AGV, which in turn signals the vehicle to stop [10].
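The stop logic driven by the reflective sensor can be sketched as a threshold with hysteresis, so the vehicle does not chatter between stopped and moving near the detection boundary. All names and threshold values here are illustrative, not from the paper.

```python
class ObstacleDetector:
    """Threshold-with-hysteresis detector for a reflective proximity sensor.

    stop_level / resume_level are illustrative normalized intensities:
    reflected light grows as the object gets closer, per Section 2.2."""

    def __init__(self, stop_level=0.8, resume_level=0.6):
        assert resume_level < stop_level  # hysteresis band must be non-empty
        self.stop_level = stop_level
        self.resume_level = resume_level
        self.stopped = False

    def update(self, intensity):
        """Feed one reflected-intensity sample; return True while stopped."""
        if not self.stopped and intensity >= self.stop_level:
            self.stopped = True       # object close: halt the vehicle
        elif self.stopped and intensity <= self.resume_level:
            self.stopped = False      # object cleared: resume driving
        return self.stopped
```

The gap between `stop_level` and `resume_level` is the design choice: a sample falling between the two thresholds keeps the previous state.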

2.3 Programmable Logic Controller

A Programmable Logic Controller (PLC) is an industrial computer control system that monitors input devices and makes decisions to control output devices. A PLC consists of discrete circuits such as a data transfer circuit, fundamental programmable logic circuits, programmable wiring circuits, a decoder circuit, and input/output circuits. The PLC is essentially an industrial computer with vast application in developing industries. It performs automated tasks and resolves problems by itself, without human intervention, according to loaded programs that implement relay-logic functionality. To meet the demands of harsh industrial environments, PLCs are designed to be extremely robust, capable of withstanding extreme temperatures, vibration, and noise. Because of these advantages the present proposal uses a PLC rather than a microcontroller, as an effective tool for better performance [2].

2.4 Navigation

The AGV uses tape as its guide path. The guidance sensor decides the path of the AGV and steers the vehicle in a defined direction. The magnetic tape method is used here: the vehicle follows magnetic tape affixed to the floor. This is accomplished by a magnetic guide sensor and a motor controller. The magnetic sensor detects the position and extent of the magnetic tape and sends a signal to the motor controller, which steers the vehicle so as to keep it centred over the tape and avoid collisions. A major advantage of tape guidance is that the tape can easily be removed and relocated if the routing needs to change. The tape can also command the AGV to speed up, slow down or stop. The dual polarity of magnetic tape is its trump card in AGV carts; it helps in conflict-free routing and scheduling of vehicles in a flexible manufacturing system. In addition, photo sensors are used for better accuracy and performance. For collision avoidance, the most capable sensor is a vision-based sensor [9].
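Keeping the vehicle centred over the tape amounts to a proportional correction on the lateral offset reported by the magnetic guide sensor. A minimal sketch, with an assumed gain, offset unit, and command range (none of these values are from the paper):

```python
def steering_correction(tape_offset_mm, gain=0.02, max_cmd=1.0):
    """Proportional steering command that keeps the AGV centred on the tape.

    tape_offset_mm: lateral offset of the tape centre reported by the
    magnetic guide sensor (negative = tape is to the left of the vehicle).
    Returns a normalized steering command clamped to [-max_cmd, +max_cmd];
    gain and max_cmd are illustrative tuning values."""
    cmd = gain * tape_offset_mm
    # Saturate so a large offset cannot command an impossible steering angle.
    return max(-max_cmd, min(max_cmd, cmd))
```

Zero offset yields zero correction; large offsets saturate at the steering limit, which is the usual guard in a simple guide-path follower.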

3. Material Transfer Device

Fig. 2 shows the circuit diagram for the proximity sensor

The material transfer control system design consists of the installation of the electronic components (the PLC, a voltage regulator, a control pendant, a motor controller and a breadboard circuit) [3]. The intricate programming required for the automatic mode of operation is possible in the PLC, and it was found that a Robotic Arm with a simple mechanism can be manipulated in many complicated ways by using a PLC. A programmable logic controller (PLC) is a type of digital computer with input and output interfaces, controlled by a program designed on a computer, and is used to automate electromechanical processes, typically for industrial use. In industry, PLCs are made to control the machinery of production lines. A PLC is designed for multiple input and output arrangements, and these inputs and outputs are logically programmed in different forms, such as a ladder diagram, structured text or a function block diagram, and stored in the PLC's memory. PLCs are reprogrammable and can be monitored online to know the status of the operation. A PLC is a good example of a hard real-time system, since output results must be produced in response to input conditions within a limited time; otherwise an unintended operation will result.
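The hard real-time scan behaviour just described (read inputs, evaluate the user program, write outputs, all within a fixed cycle time) can be sketched as follows. Function names and the cycle time are illustrative; a real PLC runs this cycle in firmware, not Python.

```python
import time

def plc_scan_loop(read_inputs, logic, write_outputs, cycle_s=0.010, cycles=100):
    """Minimal PLC-style scan loop with a hard cycle-time deadline.

    read_inputs/logic/write_outputs stand in for the PLC's input image,
    user program (ladder-diagram equivalent) and output image."""
    for _ in range(cycles):
        start = time.monotonic()
        inputs = read_inputs()           # 1) sample the input image
        outputs = logic(inputs)          # 2) evaluate the user program
        write_outputs(outputs)           # 3) update the output image
        elapsed = time.monotonic() - start
        if elapsed > cycle_s:
            # A hard real-time system treats an overrun as a fault.
            raise RuntimeError("scan overran its cycle time")
        time.sleep(cycle_s - elapsed)    # idle out the rest of the cycle
```

Usage: `plc_scan_loop(sensors.read, ladder_program, actuators.write)` with the three callbacks supplied by the application.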

3.1Robotic Arm

Fig. 3 shows the robotic arm

In the industrial world, automation is one of the most important elements of development. It helps to reduce the need for human labour and increases efficiency and productivity [5]. Automation occupies large areas, mostly in industrial manufacturing, and is also applied to build much sophisticated equipment [6]. Among these outcomes, the Robotic Arm is widely used for industrial purposes. A Robotic Arm can be compared to a human hand: it has free rotating joints (rotation) and a translational joint (displacement) for the movement of the arm, and this movement is driven by electric motors. The forthcoming sections cover the various parts of the Robotic Arm and its movement [3].

3.1.1 Functional Parts

A motor controller is a device that governs, in a predetermined manner, the performance of an electric motor. A motor controller may also include automatic or manual means for starting and stopping the motor, selecting forward or reverse rotation, selecting and regulating the speed, regulating or limiting the torque, and protecting against overloads and faults. In this application the two base motor axes carry a large load because of the heavy arm, and a motor controller is needed to drive the motors smoothly. During starting and stopping, the motor controller accelerates and decelerates the motor at a predefined rate to avoid damage and inaccuracy. A further reason is that the PLC analog output interface current is not sufficient to drive the motors directly. Depending on the task, the motors are driven at defined speeds and directions, which can easily be controlled using the motor controller. A DC motor is an electric motor that runs on direct current (DC) and is used for driving the axes of the robot. The axes of the arm need more torque than the DC motor supplies at its nominal speed, so the motor torque is amplified by a gear system embedded in the DC motor. A potentiometer is a three-terminal resistor with a sliding contact that forms an adjustable voltage divider for measuring electric potential (voltage). It is commonly used in many electrical devices, such as volume controls in audio equipment, position transducers, signal adjustment, etc. [3].

3.1.2 Functional Description

The movement of a Robotic Arm can be monitored and controlled with a simple method using potentiometers. This system is built to control every joint movement manually. The shaft of a potentiometer is attached to the shoulder or elbow joint or motor. As the joint rotates, it turns the shaft of the potentiometer, which changes the resistance; this change in resistance indicates the precise position of the joint.

Fig. 4 shows the arrangement of components to move the Robotic Arm

Because the Robotic Arm used here has only rotating joints, a rotary potentiometer is used for every joint. Rotary potentiometers have limited angular freedom, typically about a 0°-280° rotation angle, which is sufficient to move the Robotic Arm in all directions.

A constant voltage is connected across each of the potentiometers, and the position of the arm is given in analog form, i.e. as a voltage. The analog voltage is converted to digital values by the PLC and the calculations are done on the basis of these digital values.
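The conversion from a potentiometer reading to a joint angle is a linear mapping. A sketch, assuming an illustrative 10-bit ADC and 5 V reference (the paper quotes only the roughly 0°-280° pot range; the other values are assumptions):

```python
def adc_to_joint_angle(adc_value, adc_bits=10, v_ref=5.0, pot_range_deg=280.0):
    """Convert a potentiometer ADC reading to a joint angle in degrees.

    Assumes a linear rotary pot spanning pot_range_deg (the paper quotes
    roughly 0-280 degrees) fed from v_ref; adc_bits and v_ref are
    illustrative values, not from the paper."""
    full_scale = (1 << adc_bits) - 1          # e.g. 1023 for a 10-bit ADC
    voltage = adc_value / full_scale * v_ref  # reading back to a voltage
    return voltage / v_ref * pot_range_deg    # voltage fraction to angle
```

With these assumed parameters, a reading of 0 maps to 0° and a full-scale reading maps to 280°, matching the pot's travel.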

The axes of the Robotic Arm are driven by DC motors, and each DC motor has a potentiometer attached to its gear axis so that the potentiometer revolves as the motor rotates. The potentiometer gives a certain voltage as feedback, which is used to detect the position of that particular motor. The analog input voltage is converted to a digital value and saved as the position of that point; this is done for each potentiometer. The set of analog values from all the potentiometers is saved in the PLC as a set of digital numbers, which together give the position and orientation of the arm.

The saved digital number is then converted back to an analog output voltage, which is connected to a comparator. The comparator compares the input voltage from the potentiometer with the output voltage from the PLC to control the motor, and the motor is stopped only when both voltages are the same. With the help of the comparator, the motor is driven to the position where it should be. Depending on the task, the Robotic Arm is then programmed to move to every position in sequence and perform the gripping and releasing tasks [3].
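The comparator behaviour (drive the motor until the axis potentiometer voltage matches the commanded voltage) can be sketched as a simple loop. The callback names, tolerance, and step limit are illustrative, not the paper's API.

```python
def drive_joint_to(target_v, read_feedback_v, set_motor, tol=0.02,
                   max_steps=10_000):
    """Comparator-style positioning: run the motor until the axis
    potentiometer voltage matches the commanded voltage.

    read_feedback_v() returns the current pot voltage; set_motor(d)
    takes -1, 0 or +1 for reverse, stop, forward. Returns True when the
    joint settles within tol volts of the target."""
    for _ in range(max_steps):
        error = target_v - read_feedback_v()
        if abs(error) <= tol:
            set_motor(0)             # voltages match: stop the motor
            return True
        # Direction follows the sign of the error, mirroring the
        # same-phase / inverse-phase behaviour described in Section 3.3.
        set_motor(+1 if error > 0 else -1)
    set_motor(0)
    return False                     # never converged: treat as a fault
```

A quick simulation, where each motor step nudges the feedback voltage by 0.01 V, converges to within the tolerance:

```python
state = {"v": 0.0}
drive_joint_to(1.0, lambda: state["v"],
               lambda d: state.__setitem__("v", state["v"] + 0.01 * d))
```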

3.2 Design and Construction of the Mechanical Gripper

This section covers the design and construction of the gripper. As in a real-life design process, this gripper design includes all the essential aspects of product development. The following figure shows the mechanical gripper for the robotic arm. The mechanical gripper uses a linear actuator, which can easily be controlled by a DC motor. The main aim of using a DC motor in the gripper is to make all functions electrically controlled: adding a separate pneumatic or hydraulic system for the gripper would introduce extra complication, expense and unwanted requirements, and would make the vehicle more complex.

3.2.1 3D Diagram of the Mechanical Gripper

Fig.5 3D Diagram of the Mechanical Gripper


Features and Comments

✓ It uses the linear actuator as a driver.


✓ Large opening and closing.
✓ The parallel hand movements help effective gripping.
✓ Designed with high quality steel.
✓ Expensive design and manufacturing cost
✓ Smooth movement.
✓ Moderate weight.

3.2.2 Operating Conditions of the Mechanical Gripper

Fig.6 shows the closed position of the Mechanical Gripper

Fig.7 shows the open position of the Mechanical Gripper

Fig. 6 shows the closed position of the gripper. The gripping action is flexible for objects of any size due to the parallel movement of the hands. The working mechanism of the gripper is based on a parallelogram formed by four joints; the gripping action takes place as a threaded bolt slides in the linear actuator, changing the orientation of the parallelogram. Fig. 7 shows the opening action of the gripper. Another option is a magnetic gripper, which can be bought off the shelf and simply assembled onto the arm, saving time and cost; however, a lack of time and resources meant that the construction of that gripper was abandoned for this arm [3].

3.3 Working Mechanism of Material Transfer Device

Fig. 8 shows the components of the PLC and the functional parts for the movement of the Robotic Arm. The output voltage that drives each motor depends upon two potentiometer voltages: the driving potentiometer and the axis potentiometer. The two potentiometers are integrated into a feedback amplifier circuit. The driving potentiometer voltage acts as the input voltage and can be set by the user, whereas the axis potentiometer acts as the feedback voltage, altering the output. The axis potentiometer voltage depends upon the position of the axis and changes as the axis rotates. When the input voltage and feedback voltage are in the same phase, the output becomes positive and drives the motor in the positive direction until the input voltage (driving potentiometer) and the feedback voltage (axis potentiometer) have the same value. When the input voltage and feedback voltage are in inverse phase, the output becomes negative and drives the motor in the negative direction until these voltages are the same. In every case, the direction of the motor is set so that it rotates to bring the axis potentiometer voltage to the same value as the driving potentiometer voltage, at which point the motor stops.

Fig. 8 shows the components of the PLC to control the Robotic Arm

4. Construction of AGV

Fig. 9 shows the construction of the proposed AGV

The proposed AGV has the following functional parts:

1. PLC operated brushless DC motor


2. PLC Components
3. Battery
4. Front wheel
5. Rear wheel
6. Proximity Sensor
7. Photo Sensor
8. Robotic Arm
9. Material to be shifted
10. Provision for accommodation of material

In Fig. 9, the PLC-operated BLDCM is positioned at the rear end of the vehicle to move the vehicle as desired. A PLC panel board is sited behind the front axle of the vehicle, and the battery is situated behind the PLC panel board for easy connections. A robotic arm is mounted on top of the vehicle, and the PLC components for the robotic arm are placed in the panel board. The BLDCM and the Robotic Arm are interconnected through the PLC panel. The photo sensor is placed on the upper front of the vehicle and a proximity sensor on the lower front, both interconnected to the PLC panel board. A provision is made on top of the vehicle for the accommodation of materials to be transferred.

5. Routing of AGV – Sample Layout

A - Location of a pair of proximity sensors at station 1
B - Station 1
C - Location of a pair of proximity sensors at station 2
D - Station 2
E - Location of a pair of proximity sensors at station 3
F - Station 3
"a" - AGV 1
"b" - AGV 2

Fig. 10 shows the sample routing of AGV for Material movement

Working of the proposed AGV on the sample routing

The AGV receives a signal from the common network, starts from its chamber, and heads for the destination point. The AGV is stopped by the pair of proximity sensors placed before each station. Once the AGV has reached the station, the sensor reads whether material is available; if material is detected, the robotic arm turns through a right angle, moves slightly forward to pick the material, and returns to its original position after placing the material in the provision made for it on the vehicle. The output signal of the robotic arm is sent to the PLC, which generates a pulse to move the vehicle via the brushless DC motor to the next destination point, and the process continues until it receives a signal from the network.
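The station-to-station behaviour just described can be sketched as an event-driven loop. The four callbacks stand in for the proximity sensors, the photo sensor, the robotic-arm pick cycle, and the BLDC drive; all names are illustrative, not the paper's API.

```python
def run_route(stations, at_station, material_present, pick_material, drive):
    """Event-driven sketch of the sample routing in Section 5.

    stations: ordered station ids to visit. Returns a log of what
    happened at each station, for inspection."""
    log = []
    for station in stations:
        drive(station)                   # BLDCM moves toward the station
        while not at_station(station):   # paired proximity sensors fire
            pass                         # keep driving until detected
        if material_present(station):    # photo sensor at the station
            pick_material(station)       # arm picks and stows the load
            log.append((station, "picked"))
        else:
            log.append((station, "empty"))
    return log
```

For example, with material present only at station 2, the log records one pick and two empty visits.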

6. Inductive Power Transfer (IPT)

Fig. 11 shows the circuit of Inductive Power transfer system



The charging process relies on constant interaction between the track supply and the on-board battery management system; the track supply provides only the required power. The inductive power transfer system consists of a primary part coupled to the track supply and a secondary part coupled to rectifiers in the vehicle. The two parts are loosely magnetically coupled across an air gap. When the guided vehicle is out of the charging position, the primary coil is de-energized by the track supply. The minimum on-board stored energy is adequate to move the vehicle to the adjacent station. There is no need for cables and plugs, giving safer and more reliable operation [4].
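The track-supply decision described here (energize the primary only when a vehicle is in the charging position and actually needs power) reduces to a simple predicate. The state-of-charge threshold is an illustrative addition, not from the paper.

```python
def track_primary_on(vehicle_at_pad, battery_soc, start_below=0.9):
    """Decide whether the track supply should energize the primary coil.

    vehicle_at_pad: True when a guided vehicle is in the charging position
    (the paper's primary is de-energized otherwise). battery_soc is the
    state of charge in [0, 1]; start_below is an assumed threshold that
    models the track supply providing only the required power."""
    return vehicle_at_pad and battery_soc < start_below
```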

7. Conclusion

The design of this proposal is modern and advanced, achieved with the help of the PLC. A salient feature of the proposal is that the vehicle carries a Robotic Arm on top for easy transfer of material. It is expected that this investigation will yield higher efficiency than existing vehicles. Cost saving is another advantage, since existing AGVs need a separate material transfer system for the transfer of material. Each component of the vehicle is operated with the help of the PLC, so the PLC plays a paramount role in this proposal. The BLDCM and the Robotic Arm are the crucial parts of the vehicle; their accurate movements under PLC control further increase the efficiency of the entire vehicle. The inductive power transfer system implemented in the vehicle gives an additional benefit, effectively making it a battery-free AGV. The entire setup is thus realized around the PLC.

References

[1] M. Tawadros, J. Rizk, and M. Nagrial, "Brushless DC motor control using PLC", University of Western Sydney, Australia.
[2] Avvaru Ravi Kiran, B. Venkat Sundeep, Ch. Sree Vardhan, and Neel Mathews, "The Principle of Programmable Logic Controller and its role in Automation", International Journal of Engineering Trends and Technology, Volume 4, Issue 3, 2013.
[3] Bhim Bahadur Lama, "Controlling a Robotic Arm manipulator with a PLC", HAMK University of Applied Sciences.
[4] Grant A. Covic and John T. Boys, "Inductive Power Transfer", Proceedings of the IEEE, Vol. 101, No. 6, June 2013.
[5] V. Srinivasan and M. Micheal Balraj, "Design and analysis of an articulated robot arm", Proceedings of the National Conference on Emerging Trends in Mechanical Engineering 2k13.
[6] S. Pachaiyappan, M. Micheal Balraj, and T. Sridhar, "Design and analysis of an articulated robot arm for various industrial applications", IOSR Journal of Mechanical and Civil Engineering (IOSR-JMCE), e-ISSN: 2278-1684, p-ISSN: 2320-334X, pp. 42-53.
[7] Parviz Amiri and Mahsa Bagheri, "Speed Control of DC Motor by Programmable Logic Control with High Accuracy", Universal Journal of Control and Automation, 1(4): 91-97, 2013, DOI: 10.13189/ujca.2013.010401.
[8] Mohd Shakir and Abraham T. Mathew, "Programmable Logic Control Based Simultaneous Speed Control for Brushless DC Motor & Linear Induction Motor", National Institute of Technology Calicut, Kozhikode, India.
[9] Zahari Taha and Jessnor Arif Mat Jizat, "A Comparison of Two Approaches for Collision Avoidance of an Automated Guided Vehicle Using Monocular Vision", Applied Mechanics and Materials, Vol. 145 (2012), pp. 547-551, doi:10.4028/www.scientific.net/AMM.145.547.
[10] D. Goeger, M. Blankertz, and H. Woern, "A tactile proximity sensor", Institute for Process Control, Automation and Robotics, Karlsruhe Institute of Technology, Karlsruhe, Germany.
Procedia Manufacturing, Volume 5, 2016, Pages 1092–1106

44th Proceedings of the North American Manufacturing Research Institution of SME
http://www.sme.org/namrc

A Prototype Smart Materials Warehouse Application


Implemented using Custom Mobile Robots and Open
Source Vision Technology Developed using EmguCV
David Culler1* and James Long2†
1 Oregon Institute of Technology, MMET, Klamath Falls, OR
2 Oregon Institute of Technology, CSET, Klamath Falls, OR
david.culler@oit.edu, james.long@oit.edu

Abstract
Customized mobile robots are commonplace in manufacturing and material handling applications. One
class of mobile robots, known as Automatic Guided Vehicles (AGV), follow a fixed path along the floor
using tracks, RFID tags, or magnetic tape. These robots typically travel along predetermined routes and
offer limited flexibility for changes to be made or for their use in environments like hospitals or military
installations. Moving away from traditional fixed AGV systems to wireless and dynamic control and
monitoring presents some distinct advantages and new opportunities. This type of robot is known as an
Autonomous AGV. A prototype smart materials warehouse is presented in this paper as a platform to
explore some particular aspects of this technology. First, four multi-purpose mobile robots were built
using off-the-shelf BattleBot kits, wireless Arduino controls, and fabricated components. Secondly, a
Microsoft Kinect camera was installed on one robot for visual feedback, obstacle avoidance, and shape
recognition. Lastly, a ceiling mounted IP camera was used with software developed using Visual Studio
.NET and the C# wrapper for OpenCV (EmguCV) to facilitate robot path development, video processing
and real-time tracking. Testing of the completed system was done in a 2000 sq. ft. mock warehouse set
up with stations for shipping/receiving, storage, staging areas, and processes including cutting, milling,
and turning for preparing raw stock to be used in production. As cyber-physical systems research
continues to grow, the integration of computational algorithms, physical systems, wireless controls, and
custom user interfaces will undoubtedly lead to their increased use throughout society. This work was
completed as part of the Northwest Manufacturing Initiative at the Oregon Institute of Technology.
Possibilities for applying the results of this work in the military, retail and service sectors are also
identified. All hardware and software for the project was developed to facilitate future work.

Keywords: cyber-physical systems, vision, smart automation, open source, Automated Guided Vehicle (AGV)

* Professor, Manufacturing and Mechanical Engineering and Technology (MMET), PI on ARL 2010-2012 project, Author.
† Professor, Computer Software Engineering Technology (CSET), Software architecture/programming advisor.

Selection and peer-review under responsibility of the Scientific Programme Committee of NAMRI/SME
© The Authors. Published by Elsevier B.V.
doi:10.1016/j.promfg.2016.08.080

1 Introduction
Product and material handling and transportation systems have been used in manufacturing
companies for many years. The level of automation utilized in industry has continued to grow over time
as robotic, electronic, and computer technologies have improved in terms of functionality, ease of use,
integration with other devices, and affordability. With this progress, the level of complexity and interactivity between all types of systems has increased dramatically. International interest in innovating and
sharing advances with these systems is being driven by the need to collaborate on projects and increase
global competitiveness. Since mechanical, physical, electrical, and software components cannot be
isolated and do not function independently, the term Cyber-Physical Systems (CPS) has been coined to
guide present and future work. Some definitions for CPS were taken from a recent international
conference and are presented here to provide a foundation and perspective for the work in this paper
(Isaksson & Drath, 2016).
• A system of collaborating computational elements controlling physical entities
• Interconnection between a physical and a virtual world (models)
• Ability for autonomous behavior, self-control, self-optimization
• Different system borders: CPS including or excluding of a physical world
Examples of CPS in manufacturing are being used to modernize and add flexibility to production
systems and integrate them in control networks that span departments and multi-site engineering or
distribution facilities. As mass customization and agility in production line changeovers become a higher
priority, the ability to deliver materials, tooling and supplies must also be considered. The use of multi-
function mobile robots for this purpose is a viable alternative. An Automated Guided Vehicle System
(AGVS) enables flexible material routing and dispatching, and is especially suited for flexible
manufacturing environments in which product mix and priorities may continuously vary (Reveliotis,
2000). Their movement is directed by a combination of software and sensor-based guidance systems.
As this technology continues to expand, many areas outside of manufacturing will benefit from
innovations in the following areas: 1) mobile and autonomous robots, 2) open source software, 3) vision
systems, and 4) wireless communications. Some of the applications to think about are medicine
distribution in hospitals, replenishing products in retail settings, military field operations, and
supplies/tool deployment in large construction projects. Mechanical, manufacturing, industrial,
electrical, and computer engineers must collaborate to develop versatile systems for the future.

1.1 Classification and Description of AGVS


For material handling purposes, it is convenient to use mobile robots that can move freely throughout
a given space. Many manufacturing facilities such as automotive and textile plants as well as storage
and distribution centers use automated material handling equipment. Although full automation may be
the ideal scenario, the reality is that many tasks require humans, and the workplace is a mix of
automated and human-operated stations with conditions that change minute by minute. In these
environments, work is often done using AGVS that rely on markers embedded in the floor to determine
location and path.
Throughout this paper, the terms Mobile Robot and AGV are used interchangeably: once the wirelessly
controlled mobile robots were developed and functioning, a materials warehouse scenario was selected
to demonstrate their use as Autonomous AGVs (AAGVs) by incorporating path development algorithms
and an integrated ceiling-mounted camera for tracking and dynamic control purposes. There has been
significant work in classifying AGVS based
on the guide-path, capacity and vehicle addressing method. Previous work has served to formalize an
AGVS control classification scheme. The classification clarifies the impact of design alternatives on the
necessary controller requirements (Peters, Smith, & Venkatesh, 1996).

A Prototype Smart Materials Warehouse Application Implemented using Custom Mobile Robots and
Open Source Vision Technology Developed using EmguCV David Culler and James Long

Another classification suggests a simple characterization that mentions important characteristics of
multi-robot systems and fits the description of the warehouse system used in this project (Cao,
Fukunaga, & Kahng, 1997):
1) Architecture: centralized or decentralized.
2) Differentiation of agents: homogeneous (similar control systems) or heterogeneous (different).
3) Communication structure: via the environment, via sensing, or via communication (intentional).
4) Models of other agents: intentions, capabilities, states, beliefs.
Some previous work was used as a foundation for the project described in this paper, such as the
utilization of multi-use, multi-robot AGVs that are controlled under a centralized, homogeneous
architecture with direct (intentional) signaling and that possess specific capabilities to perform jobs
within a warehouse setting (forklift, platform, light/heavy workload). The addition of the ceiling-mounted
cameras, along with the multi-capability software to build paths, track the robots, and make adjustments
based on feedback through the video processing, makes the work presented here different and innovative.
The approach to developing the smart materials warehouse also demonstrates the multi-disciplinary
contributions to the project. Manufacturing, mechanical, and computer engineers initially worked
independently on robot construction and control systems development for mobile robots, computer
software applications, user interfaces, and vision technologies of various types. The subsystem approach
was borrowed from previous work in AGVS evaluation methods. The authors stated that “each sub-module
was thoroughly evaluated in a stand-alone mode. Due to the inherent properties of each sub-module,
system management was evaluated only in simulation; navigation was evaluated both in simulation and
in hardware. The AGVs are equipped with wireless Ethernet communication devices and communicate
with the workstations via the factory intranet” (Berman, Schechtman, & Edan, 2008).
Engineering faculty and students from the Oregon Institute of Technology combined off-the-shelf
items, existing technology, and developed hardware/software to design and implement a prototype smart
materials warehouse where robots were utilized to deliver materials to a series of stations. Real time
tracking was implemented using a ceiling mounted IP camera and a software application built in C-
Sharp (C#) using the .NET wrapper for OpenCV called EmguCV. Initial path development and live
processing of video for locating and monitoring multiple robots working on the warehouse floor was
implemented. Special capabilities like RFID readers, sonar, and forklift attachments were added to
individual robots.

1.2 Description of Project and Breakdown of Paper Content


This paper will describe the main parts of the project including: 1) Robot construction and control,
2) The 3D image processing / obstacle avoidance system using the Kinect vision camera technology
from X-Box video games mounted on a mobile robot, 3) The “eye in the sky” ceiling mounted camera,
and 4) The implementation of the prototype smart materials warehouse used to demonstrate the
capability of the individual and combined components used for the project.
Four robots were designed and built for use in the warehouse application. They are all similar but
have different capabilities and functions to perform. Each is based loosely on the IRobot platform that
has been used in many consumer and military applications, including the Roomba vacuum cleaner and
floor cleaning systems, models for cleaning swimming pools and gutters, and multiple models for
infantry and special operations missions like building clearing, raids, dangerous conditions, bomb
disposal, and larger models for heavier payloads. See examples of the robots built for the project
(Kinect camera and two-wheel with caster models) in Figure 1.

Figure 1: Two Examples of Custom Mobile Robots

Once the robots were tested, they were ready to be used in a warehouse setting. Computer Software
Engineering Technology (CSET) students and faculty developed a vision system that utilized an “Eye
in the Sky” 3S IP camera and software developed using Visual Studio and EmguCV (the C# version of
OPENCV) to detect the presence of robots in a manufacturing materials warehouse, track their
movement, and identify pathways and obstacles on the floor. As a simulated working scenario for this
research project, the small robots must transport materials to be stored, moved, cut to size, processed,
and delivered to students that are completing course projects in the machine shop using lathes, mills,
drill presses, and grinders. Figure 2 shows the layout and distribution of equipment for the mock
warehouse used for the project.

Figure 2: Layout for Smart Materials Warehouse

2 Context and Related Work


The smart materials warehouse was completed as part of a three year Army Research Laboratories
(ARL) grant awarded to develop laboratory, academic and research capabilities in the Mechanical and
Manufacturing Engineering and Technology (MMET) department at OIT. Robots are being used
increasingly in the manufacturing and material handling industry as well as in the service sector.
Automation is vital for performing hazardous, boring, or dirty jobs in a cost-effective manner, and
reprogrammable automation allows for fast changeover depending on the specific processes or work
required. New areas and technologies related to Vision, Wireless and Open Source software are bringing
innovation and rapid progress to engineering applications and industry.
There are many possible uses for robots and robot vision including inspection (Jang, Lee, & Moo,
2009), security, and navigation (Okuno, 2008) (Cesetti, Frontoni, Mancini, Zingaretti, & Longhi, 2010).
Using robot vision for navigation purposes is much more involved than following markers in the floor.
It requires a significant amount of processing power (Jang, Lee, & Moo, 2009) and in most cases
complex algorithms for detecting lines, masses, and color. “Non-vision-based techniques of person
localization, such as those based on RFID tags or radio waves, require the person to carry certain
technical devices which are unsuitable in everyday situations. Motion sensors can detect a person
entering or leaving a room, but cannot provide the precise location information. Infrared cameras are
costly and suffer from high degrees of noise in indoor settings. Compared to these approaches, a vision
system promises to provide good performance and a wide use scope at a reasonable cost.” (Wenjie,
Weber, & Wermter, 2011). The reality is that hybrid systems will provide solutions for the future
because each type has benefits that are not produced by the other systems.
Specific examples of work similar to this project are discussed here to provide a background for this
project. Early work done on mobile robot positioning identified six types of possible solutions: relative
position measurements, including odometry (encoders) and inertial navigation; and absolute positioning
with beacons, artificial landmarks, natural landmarks, and model matching by onboard sensors
(Borenstein, Everett, & Feng, 1996). This early work is still being used today, only applying modern
technology and wireless devices. Active and passive tracking systems have been used indoors to locate
objects or devices. “In an active mobile architecture, an active transmitter on each mobile device
periodically broadcasts a message on a wireless channel. In a passive mobile architecture, fixed nodes
at known positions periodically transmit their location (or identity) on a wireless channel, and passive
receivers on mobile devices listen to each beacon” (Smith, Balakrishnan, Goraczko, & Priyantha, 2004).
In this work, the wireless signals sent by the robots, combined with visual tracking of colors placed on
each robot, provide additional feedback to the system that is helpful for tracking and the path
re-generation studied in the project presented here. Some successful
systems for tracking robots and humans in well-structured environments using cameras have been
presented. For example, a mobile robotic system for tracking and following moving people was
developed using a combination of cameras and lasers to overcome the shortcomings of camera-only
systems that are limited to slow moving objects and environments without many obstacles. Such a
system provides important capabilities for human robot interaction and assistance of humans in various
settings. Tracking with mobile robots has been an active research area with many systems developed,
such as for museum, hospital assistance, or pedestrian tracking. (Kobilarov & Sukhatme, 2006)
Projects have been done which resemble the one presented in this paper. In one research project,
a prototype was designed to track assets inside a warehouse using two cameras on a rooftop, products
with RFID tags, and a forklift with Wi-Fi to transfer data in real
time. An Omron F-150 camera was used in a very small workspace (0.5M by 2.5M), with the camera
only 3M from the floor (Aryal, 2012). The project did produce some good images but was not able to
track positions or produce routing/re-routing capabilities. Another project used mobile robots with
cameras mounted on them to recognize colors on the floor so as to stay in specific work zones. For this,
EmguCV (the .NET wrapper for OpenCV) was used. Each robot periodically takes a snapshot of
the floor in front of it and then performs color processing to check whether it is capable of continuing on
the floor/terrain. Each robot has its own role and will be aware of which terrain it cannot cross; the floor
will be divided into different colored regions each representing a difficulty level. This provided some
background for the work completed in this project (Saunders, Sterritt, & Wilkie, 2016). Another
approach has been to mount cameras on the ceiling so that a wide area of coverage
is achieved and the systems can be connected to computer systems (Sentry, 2016).

3 Project Concept Overview


The objective of this project was to develop a prototype smart materials warehouse where delivery
of raw materials within a machine shop scenario could be simulated using mobile robots customized to
perform specific tasks and test ideas. The warehouse included a ceiling-mounted video camera that
monitors the floor space for all activity and interactions between humans and machines. The camera is
connected to a computer that acts as the central processing station for the system. Software on this
computer analyzes images to identify the robot, permanent obstacles, and temporary obstacles. The
software also has path-planning algorithms and the generated paths are wirelessly sent to the robots.
The robots are simple as far as processing power and almost entirely controlled wirelessly by the
computer using a custom designed user interface. They have onboard sensors that can stop the robot
motion if an unexpected obstacle is detected, but all other guidance functions are handled by the control
computer. The primary purpose of the robots is material handling and material identification.
The second part of the project was to mount a Kinect camera from X-Box gaming applications on one
of the mobile robots to detect obstacles and perform simple shape recognition of objects normally found
in a shop (i.e., boxes, pallets, humans, forklifts). Next, an overhead camera/vision system allows the user
to develop paths for different robots to deliver materials, adjust to changes in real time and track the
progress along a pre-defined path determined for raw material preparation. The following 3 sections
describe how the individual pieces of technology were made and section 4 describes the simulation,
testing, and results that were achieved.


3.1 Mobile Robot Development


Mobile robots that work as Automated Guided Vehicles (AGV) in this application were made from
kits purchased as part of the grant and then modified to perform typical material handling tasks including
deliveries from receiving to staging stations, transfers between stations, lifting of pallets, transport of
raw stock, and deployment to locations where they are needed in the plant. Special controllers and
devices were added to the basic robot platforms so that they adapt to different situations by receiving
commands from a host computer and have sensing capabilities to monitor and interact with what is
around them.
The first steps in the project included testing with the basic platform (shown in Figure 3) that was built
for the project and some code that had been developed using an Arduino board. There had also been
some research into using the commercial IRobot system as the onboard processor. The IRobot system
is shown in Figure 4.

Figure 3: Basic Platform    Figure 4: IRobot Platform
Battle bot chassis were purchased for this project. The scope of the project was to design, build, and
program four robots. Two of the robots were built for the CSET teams, one for each. One team developed
an onboard vision system using a laptop computer and a Microsoft Kinect camera system; the robot built
for that team was called BattleBot. The other team developed software to control and monitor the robots
based on a ceiling-mounted camera system; the robot built for this team was called SmallBot. The other
two mobile robots were built to demonstrate material handling and pallet identification: the robot
equipped with the RFID reader was called ProtoBot, and the material transport robot was called LiftBot.
The robots had several capabilities built into them. Each was intended for different purposes and
therefore needed specific equipment and functionality. All of the robots are capable of wireless serial
communication, forward and backward linear motion, and turning around the center point between the
two primary drive wheels. They use Zigbee XBees for wireless serial communication. The serial
communication allows for commands to be sent and feedback to be received using text. Forward and
backward motions are achieved by sending the text command “P+” or “P-” (respectively) followed by
a distance in mm over the wireless serial connection. Code for the Arduino receives the command and
compares it to a list of known commands. Turning is achieved in much the same way as linear motion
by controlling the left and right motors separately using the command interface.
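The text-command protocol described above (commands like “P+” followed by a distance in mm) can be illustrated with a short sketch. The onboard code runs on an Arduino in C; this Python version shows the same parse-and-compare logic, and the error handling for unknown commands is an assumption:

```python
def parse_command(cmd: str):
    """Parse a wireless serial text command into an (action, value) pair.

    "P+<mm>" moves forward and "P-<mm>" moves backward, mirroring the
    known-command list the Arduino compares against (illustrative sketch).
    """
    cmd = cmd.strip()
    if cmd.startswith("P+"):
        return ("forward_mm", int(cmd[2:]))
    if cmd.startswith("P-"):
        return ("backward_mm", int(cmd[2:]))
    # Unknown commands are rejected (assumed behavior, not from the text).
    raise ValueError("unknown command: " + cmd)
```

For example, sending “P+1200” over the wireless serial link would be parsed as a 1200 mm forward move; turn commands would be handled the same way against the known-command list.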
SmallBot and BattleBot were built with only these basic capabilities. More functionality has been
added to them by their respective CSET teams. This has been done using additional equipment such as
cameras and software run on either a laptop or a desktop computer. ProtoBot and LiftBot have both been
equipped with sonar. Parallax sonar units have been mounted facing in all four directions on both robots.
The sonar units use sound to measure the distance to the nearest object that is in front of them. The sonar
is operated by a separate Arduino microcontroller. ProtoBot has also been fitted with an RFID reader
and an LCD screen. The RFID reader is powered by and communicates with the control Arduino. When
an RFID tag is read, the Arduino receives the tag number over wired serial communication. The tag
number is then displayed on the LCD screen and sent wirelessly to the command computer (fig. 5).
LiftBot has a linear actuator mounted on the front of the robot chassis. This actuator has a forklift
mounted to it. This robot is meant for material handling. The forks are manipulated using four power
relays to control the polarity across the motor leads. The relays are controlled by the primary Arduino
using the commands “LU” and “LD” to move the forklift up and down (for a number of seconds).

Figure 5: Schematic for RFID Reader Interface and Digital Readout
The frame and drive train were built slightly differently for each robot. All of the robots use 4
inch (in.) diameter drive wheels. SmallBot and ProtoBot are both two-wheel drive with a caster for
support in the back. ProtoBot uses hard plastic wheels that are directly connected to the output shafts
on the motor; the wheels are attached using a set screw. SmallBot uses rubber wheels that are connected
to the motors through a gear and chain transmission. This transmission is identical to what is used for
the battle bot chassis, except that the battle bot chassis has four motors and the front and back motors
on each side are connected by an additional chain. Figure 6 shows the completed platform and Figure 7
the graphical user interface developed for the LiftBot configuration.

Figure 6: Mounting of Components on BattleBot Chassis


Many of the principles and concepts used for the mobile robots are very different from what would
be used for a robot with a fixed base. Because the robot can only move on the floor, the robot's location
only needs to be determined in terms of X and Y position. The existing robots could also be interfaced
with other robots such as articulated arms; in this way they could be used as part of a fully automated
manufacturing demonstration. Some additional capabilities could be built onto these robots to allow
them greater mobility. More functionality could be added to the Arduino to allow for multiple commands
to be received in one transmission; the robot would then be able to execute the commands one at a time.
Communication routines could also be added to allow for control of multiple robots simultaneously.

Figure 7: User Interface Created for Robot Operation

3.2 Kinect Driven Autonomous Robot for Obstacle Detection


The Kinect Driven Autonomous Robot (K-DAR) is intended for use in a manufacturing
environment. Some modern factory robots are "autonomous" within the strict confines of their direct
environment. When in such an environment, removing direct human interaction becomes beneficial.
Without people on the manufacturing floors the robots will be able to work together seamlessly to
deliver parts between stations. Costs could also be reduced because there would be a reliable and
adaptable tool on the floor to take the place of traditional moving equipment. Another important benefit
of this type of robot is that it promotes safety.
The following section describes the functionality built into the system: Path Finding, Object
Avoidance, and Computer-User Interaction.
1. Path finding was incorporated to record distance travelled, command the robot to travel
from one point to another, create a path of Cartesian points that are converted to polar
coordinates to drive the robot without further user interaction, perform complex navigation
(maze), and recalculate a path when an obstacle is encountered.
2. Object avoidance was implemented using the Kinect camera system and software routines
for stereographic vision object detection and static and moving object avoidance, utilizing
user-input avoidance strategies (wait, move around).
3. Computer-user interactions included controlled speed/acceleration, direction/turning,
input of points using coordinates, and user-stored paths.
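The Cartesian-to-polar conversion in the path-finding item can be sketched as follows. The K-DAR software is written in C#; this Python sketch is illustrative only, and the assumption that the robot initially faces along the +x axis is ours:

```python
import math

def path_to_polar(points):
    """Convert Cartesian waypoints into (turn_degrees, distance) drive
    segments, so the robot can be driven without further user input.
    Assumes the robot initially faces along the +x axis (illustrative)."""
    heading = 0.0
    segments = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        target = math.degrees(math.atan2(y1 - y0, x1 - x0))
        turn = (target - heading + 180.0) % 360.0 - 180.0  # shortest turn
        dist = math.hypot(x1 - x0, y1 - y0)
        segments.append((round(turn, 1), round(dist, 1)))
        heading = target
    return segments
```

For instance, the L-shaped path (0, 0) → (1, 0) → (1, 1) becomes one straight segment followed by a 90 degree left turn and a second segment.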
In addition, many types of editing and path management functionality were implemented, along with a
standard Windows-style user interface so that new users could easily adapt to robot operation. A database
was also added for data and path management. The interface was developed using C# standard libraries
and the .NET framework, an open source communication manager, and the Microsoft Kinect Software
Development Kit. The robot and user interface for the Kinect system are shown below in Figure 8.


• Uses the Microsoft Kinect as its eyes to navigate a manufacturing floor.
• Uses a database to save robot paths and system user information.
• Promotes safety on a manufacturing floor.
• Users can create, edit, and delete paths as desired.
• Designed to be interfaced with the Eye in the Sky robotics team.
Figure 8: Summary of Kinect Mobile Robot System & Capabilities

3.3 “Eye in The Sky” Camera, Path Development & Tracking


Automated Guided Vehicle Systems (AGVS) are robots that follow markers or wires on the floor, or
use vision or lasers for guidance around the manufacturing course/floor. AGVS are mostly used in
industrial applications that include transporting materials around a manufacturing facility or
warehouse. Most AGVS have the following functionality incorporated: Navigation and
Guidance, Routing, Traffic Management, Load Transfer, and System Management (Savant Automation,
2012). Navigation and Guidance refers to the AGV following a predetermined route that the system
creates. Routing is defined as the AGV's ability to decide its next set of path instructions. Traffic
Management is implemented in order to avoid collisions with other vehicles that are present around the
facility. Load transfer is a delivery-system that defines the AGVs main purpose: to transport materials
from one place to another. System management is the methods that dictate how the AGVs operate. At
this point, it can be determined that AGVS utilize two main guidance systems: Line Following and
Vision. For this project, it was decided to focus on utilizing Computer Vision to guide the robots on a
manufacturing floor.
Smart Warehouse refers to a warehouse that is mainly run autonomously without the aid of human
interaction (Ackerman, 2012). With the exception of a few staff members to monitor the warehouse, the
AGVs on the floor transport materials from one part of the warehouse to the other. A smart warehouse
utilizes AGVs in order to increase warehouse effectiveness, improve the visibility of warehouse
activities and inventory, and create a more “agile and flexible” warehouse to keep up with increasing
demands (Pal, et al., 2011). Similar to an application of a smart warehouse, a control system was
developed to control the robots across the manufacturing floor with the “Eye in the Sky” system.
This system monitors operations and activities in the warehouse with ceiling mounted cameras,
where robots are used to transport materials to specific destinations in the warehouse. Included in the
software application, is an option for the user to select paths created by the system or create their own
paths. After the path has been selected, the system commands the robot to move based on the path
created. All path data is synchronized to the included repository, where the user is free to save or retrieve
path data. The user also is able to directly control the robot, such as when a robot encounters an obstacle.


Although not all of the functionality listed in the following section was implemented, most of the
capability was realized during the testing and demonstration phases of the project. The system was
implemented using the robots described in section 3.1 along with the following software tools: programs
in C#; OpenCV libraries to process live footage from the cameras (EMGU is a C# wrapper used with
OpenCV); the Windows 7 operating system; and the Arduino board to relay signals between the system
and robot. The physical system, physical component, and functional architecture diagrams of the
prototype warehouse are presented in Figures 8, 9 & 10, respectively.

Figure 8: Physical System Diagram Figure 9: Physical Component Diagram

Figure 10: Functional Architecture Diagram


The goal of the project was to combine the work from the standard mobile robot design (3 robots) with
a single mobile robot that could detect obstacles using a front mounted MS Kinect camera, and then use
the ceiling mounted camera with the centralized computer. These components were used to run a sample
scenario where an order for parts comes in and a robot is deployed to the warehouse floor using a
specified path to take the parts from the receiving area to each one of the processing stations and then
to the finished parts area. A 3S 5MP Box camera was used for the application with a wide angle lens to
cover the 50’ X 40’ area used for the materials warehouse. The camera was mounted on the truss nearly
20’ above the floor. See the picture and data sheet information presented in Figure 11 below.

Image Sensor: 1/3” CMOS
Scanning Mode: Progressive Scan
Resolution: Up to 2560 x 1920
Effective Pixels: 5.0 Megapixel
Ethernet: RJ-45 10/100 BASE-T
Max Frame Rate: 15 fps @ 2560 x 1920
Video Compression Format: H.264 (MPEG)

Figure 11: Ceiling Mounted 3S 5MP IP Box Camera and Specifications


The following section will discuss the system's capabilities and go into detail as to how the system
performs operations defined by the developers. Each operation for the system is intended to describe a
single behavior in the system. After converting operations into behaviors, they are grouped together
based on similarity. Each group is called a functional area, where each area focuses on a specific
operation of the system. The areas are treated as individual modules that are developed, tested, and
then deployed.
The Image Processing area contains functions involved with utilizing a camera to view the manufacturing
floor, determine obstacles, determine the robot's path, and relay the live view to the system. For the
image processing program, the system utilizes OpenCV libraries for edge detection, image filtering, and
object tracking. The system processes an image from the video feed to identify the robot on the floor
and also marks any objects on the floor as obstacles. It can determine whether an object is static
(stationary) or dynamic (moving). The system checks the robot's position against detected obstacles and
renders relevant information to the abstract view. The system pulls a new image for processing every 5
milliseconds and highlights objects over the live video feed in real time.
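The color-based identification of a robot in a frame can be illustrated with a minimal, pure-Python stand-in for the EmguCV/OpenCV filtering and tracking (the real system processes camera frames; here a tiny grid of labeled pixels is used, and the centroid computation is an illustrative simplification):

```python
def find_marker(image, color):
    """Return the centroid (row, col) of pixels matching `color`,
    or None if the marker is absent. A toy stand-in for the color
    filtering and blob tracking done with EmguCV (illustrative)."""
    hits = [(r, c) for r, row in enumerate(image)
                   for c, px in enumerate(row) if px == color]
    if not hits:
        return None
    rows, cols = zip(*hits)
    return (sum(rows) / len(hits), sum(cols) / len(hits))
```

Comparing the marker centroid between successive frames gives the robot's motion, which is how the live feed can distinguish a static obstacle from a dynamic one.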
The Robot Mobility area contains capabilities that pertain to the movement of the robot. The robot should
move in any direction, turn at specific times, halt, move along a set path, and follow instructions that the
system relays to it. It enables the robot to move at a set speed, in a direction (at 90 degree angles), for a
given distance, or to stop completely. The robot uses set routines that the system calls when paths have
been created by the system or the user. Once paths are created, they are processed by the path-determining
components of the system, which translate them into the distances and number of turns for each path. In
some cases, where a robot encounters an obstacle, the robot must be able to stop moving along its current
course. After receiving a halt instruction, it must wait for further instructions from the user or the system.
An important ability of the system is its path determination package, which contains the functionality
to calculate paths that are created by the user through the path creator. If the paths are determined by
the system, the system uses path finding algorithms (such as Dijkstra's) to find the shortest path. If the
user draws a path, the system verifies that all the paths the user has created are valid for the robot to
follow. Once the paths are verified, the system instructs the robot to follow them. If the robot encounters
an obstacle, the system has to either instruct the robot to halt or create a different route that avoids the
obstacle.
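Dijkstra's algorithm, named above as the system's shortest-path method, can be sketched on a simple model. The grid representation with unit-cost, 4-connected moves is our assumption for illustration; the production system is written in C#:

```python
import heapq

def shortest_path(grid, start, goal):
    """Dijkstra's algorithm on a 4-connected grid (0 = free, 1 = obstacle).
    Returns the list of cells from start to goal, or None if unreachable.
    The grid model and unit step costs are illustrative assumptions."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0}
    prev = {}
    pq = [(0, start)]  # priority queue of (distance, cell)
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nd = d + 1
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(pq, (nd, (nr, nc)))
    if goal not in dist:
        return None
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]
```

Re-running the search after marking newly detected obstacle cells as blocked is one way the re-routing step described above could be realized.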
The User Interaction area contains requirements centered on user interaction with the system. From the
live camera feed, the user is able to determine the starting and ending points, select a robot, and define
its destination. The system also allows the user to select an optional load point, warns the user when a
dynamic obstacle is encountered, and lets the user set the time that the robot remains at the destination,
stop the robot at any time, queue up paths from the selected layout, or load a stored path. User interaction can
be depicted in many ways for most software applications. In this case, the user interactions are limited
to how the user can manipulate the system. The user first has to log into the system so they can
have access to all the functionality of the “Eye in the Sky” system. The user can define both loading and
destination points for the robot to follow. The user can define paths manually via the included path
creator if necessary. The user can control the robot via robot control system if they need to halt a robot,
or need to move it. Through the Path Creator, users create paths based on a picture of the live footage
from the ceiling mounted camera. With this path creator, users can draw paths for the robot to travel on.
The user can draw connecting 90 degree lines based on a frame of the image from the camera. When
the user draws the first line, it will have nodes that can be named and the start/end nodes can be set once
each line is created. Nodes where the robot will stop on the way to the destination can be selected. The
results are then stored to the database, where they can then be edited or reloaded at a later time.
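As a hedged illustration of the Path Creator's data model described above, the sketch below stores named nodes joined by axis-aligned segments and flattens them for storage. The class and method names are hypothetical, not taken from the actual application.

```python
class PathCreator:
    """Illustrative data model: named nodes placed on the camera frame,
    joined by axis-aligned (90-degree) segments."""

    def __init__(self):
        self.nodes = {}      # name -> (x, y) pixel position on the camera frame
        self.segments = []   # (from_name, to_name) pairs

    def add_node(self, name, x, y):
        self.nodes[name] = (x, y)

    def add_segment(self, a, b):
        (xa, ya), (xb, yb) = self.nodes[a], self.nodes[b]
        if xa != xb and ya != yb:           # only horizontal or vertical lines
            raise ValueError("segments must be 90-degree (axis-aligned) lines")
        self.segments.append((a, b))

    def to_rows(self):
        """Flatten the drawing into rows suitable for storing in a database."""
        return [(a, *self.nodes[a], b, *self.nodes[b]) for a, b in self.segments]

p = PathCreator()
p.add_node("start", 10, 10)
p.add_node("load", 10, 80)
p.add_node("dest", 120, 80)
p.add_segment("start", "load")   # vertical line
p.add_segment("load", "dest")    # horizontal line
rows = p.to_rows()
```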
Figure 12 below is a picture of the mock warehouse that was used for the use case stories below. The picture to the right depicts the image from the overhead camera as it appears in the user interface (Figure 13), including the robot docking stations, where the robots are parked when they are not being actively used by the system. The receiving dock station is where materials are received and placed for the robots.

Figure 12: Mock Warehouse    Figure 13: User Interface with Video Feed

The storage area is where all materials are inventoried until they are to be used. The processing area is a general section to which the robots transport the raw materials to be pre-cut. The staging stations are areas where the robots take processed materials to be inspected by engineering staff.
This use case survey describes a normal session on a manufacturing floor, where materials from the loading docks are stored, processed, prepared, or moved to specific areas. The following story describes a manufacturing engineer, the shop supervisor, receiving some packaged materials that need to be processed and then transported to the staging areas. The shop supervisor receives a message from his superiors that the company has received material. They want him to process the incoming materials at the saw machine, where they are handled by on-site manufacturing engineers. After being
processed by the engineers, the shop supervisor will then need to move the processed materials to the staging area, where they are inspected by the inspection team before further use. With that in mind, the shop supervisor opens the Eye in the Sky application, logs into the application, and checks which of the docked robots are inactive. A robot is commanded to move from its docking station to the materials docking station. The robot then proceeds to move the materials from the docking station to the processing station, where the engineers can take over. After the materials have been processed, the shop supervisor commands the robot to move the materials from the processing area to the staging stations, where they are loaded onto the stage and inspected. When completed, the robot returns to the docking station and waits for further commands.


The Path Determination use case survey describes a scenario where the system generates paths for the robot to follow. If the user finds the generated paths satisfactory, the user can select one of them; otherwise, the user can draw paths on the image for the robot to follow instead. The shop supervisor receives an order from his superiors that requires him to move materials across the manufacturing floor. After the shop supervisor has received materials from the docking station, he sets the station coordinate points and the tasks that the robot needs to complete. The system then automatically generates paths for the robot to follow and, once the paths have been calculated, displays them for the shop supervisor to decide. The shop supervisor does not like any of the paths that the system has created, so he decides to draw the paths himself in the program. After drawing the paths for the robot to follow, he starts the task, and the robot now follows the new path that the shop supervisor has drawn.
The Obstacle Handling user story describes a scenario where the robot encounters an obstacle in its path on its way to deliver raw materials to the destination. The system utilizes the cameras mounted on the ceiling to check whether any obstacles have moved since the start of the process. If the system detects a new obstacle, it either re-routes the robot to a new path, or halts and contacts the user for an advisory action. In the latter case, the system notifies the user that it is unable to re-route the robot and requires an advisory action to proceed. The shop supervisor notices from the cameras that one of the engineers is blocking the robot's path. The robot stops, and the system displays a warning in the program. The shop supervisor notices this warning and recognizes the engineer who is blocking the robot's path. The shop supervisor then calls the engineer and tells him to move out of the robot's path. Once the engineer has moved, the system detects that there are no more obstacles in the path and commands the robot to proceed to drop off the raw materials at its destination. The shop supervisor can resume his own work as the robot navigates the floor unhindered. Sample code developed in EmguCV is shown in Figure 14.

Figure 14: Open Source Eye in the Sky Code
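The re-route-or-halt decision in this story can be summarized in a few lines. The function below is a hypothetical sketch of that control flow, with the planner and the user-notification callback passed in as stubs; none of the names come from the actual application.

```python
def handle_obstacle(grid, position, goal, plan_route, notify_user):
    """On a newly detected obstacle: re-route if a path exists, otherwise
    halt the robot and request an advisory action from the user."""
    route = plan_route(grid, position, goal)
    if route is not None:
        return "reroute", route
    notify_user("unable to re-route the robot; advisory action required")
    return "halt", None

# Demo with stub planners: one finds a detour, the other cannot.
warnings = []
decision = handle_obstacle({}, "A", "B", lambda g, p, q: ["A", "C", "B"], warnings.append)
fallback = handle_obstacle({}, "A", "B", lambda g, p, q: None, warnings.append)
```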

4 Conclusions and Future Work


The smart warehouse project achieved many of the expected results. The application allows the user to draw paths in the Path Creator, which can then be stored in the database for later access. The Eye in the Sky application utilizes a ceiling-mounted camera to track the robot on the floor, where the robot is tracked as a rectangle in the abstract view of the application. Once the paths have been drawn, the user can load a path and have it drawn into the path abstract view. From there, the user can select a start node, a load node, and a destination node for the robot to follow. This creates a path, which the user can add to a path queue or run directly by pressing the Start button in the application. The robot then follows the path, reaches the destination, and returns to the start point to wait for instructions (or proceeds to the next part of the path from the path queue).
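Tracking the robot as a rectangle from the overhead camera amounts to finding the bounding box of the marker pixels that a vision library such as EmguCV would isolate by thresholding. The pure-Python sketch below illustrates the idea on a synthetic frame; it is not the project's actual EmguCV code.

```python
def bounding_rectangle(frame, threshold=200):
    """Return (top, left, bottom, right) of all pixels at or above the
    threshold -- the marker pixels a vision library would isolate."""
    hits = [(r, c) for r, row in enumerate(frame)
                   for c, v in enumerate(row) if v >= threshold]
    if not hits:
        return None   # robot not visible in this frame
    rows = [r for r, _ in hits]
    cols = [c for _, c in hits]
    return (min(rows), min(cols), max(rows), max(cols))

# Synthetic 5x6 "camera frame" with a bright 2x3 marker at rows 1-2, cols 2-4.
frame = [[0] * 6 for _ in range(5)]
for r in (1, 2):
    for c in (2, 3, 4):
        frame[r][c] = 255
rect = bounding_rectangle(frame)   # (1, 2, 2, 4)
```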


The equipment and software developed for this project could be utilized in many future projects. The vision system could be refined to the point that the robots could be guided to specific storage locations to retrieve and store materials. The information from the sonar units could be used for more accurate robot positioning when the robot is in close proximity to objects of known location. Other possible next steps in this project include improved image detection, more sophisticated robot tracking, detection of more types of obstacles, support for higher resolutions, multiple robots running simultaneously, and multiple cameras.
The existing robots could also be interfaced with other robots, such as articulated arms; in this way they could be used as part of a completely automated manufacturing demonstration. Additional capabilities could be added to these robots to allow them greater mobility. More functionality could be added to the Arduino to allow multiple commands to be received in one transmission; the robot would then execute the commands one at a time. Additional communication routines could be used to allow control of multiple robots simultaneously.
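Receiving multiple commands in one transmission could be handled by parsing the message into an ordered queue. The sketch below assumes a simple ';'-separated text format, which is an illustrative choice rather than the project's actual Arduino protocol.

```python
def parse_transmission(message):
    """Split one serial transmission into an ordered queue of (command, args)
    pairs that the robot can execute one at a time."""
    commands = []
    for part in message.split(";"):
        part = part.strip()
        if not part:
            continue
        name, *args = part.split()
        commands.append((name.upper(), [int(a) for a in args]))
    return commands

queue = parse_transmission("move 120; turn 90; move 40; stop")
# queue == [('MOVE', [120]), ('TURN', [90]), ('MOVE', [40]), ('STOP', [])]
```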
These types of systems could eventually be used for military applications such as search and rescue, munitions management, and arming and disarming of military vehicles, as well as for warehouse management and deliveries in medical facilities. Many of these tasks are simple enough that the robot could handle the entire task, and other tasks could be robot-assisted for improved safety and reduced cost.

5 Acknowledgements
The students of Manufacturing and Mechanical Engineering and Technology and the students of Computer Software Engineering Technology at the Oregon Institute of Technology were key contributors to the material presented in this paper. Thanks also to John Anderson, who began this work in 2008-2009 before retiring in 2012. Graduate students Noah Anderson, Matthew Floyd and HoeJin Kim were instrumental in keeping this project moving forward.

Procedia Engineering 192 (2017) 665 – 670

TRANSCOM 2017: International scientific conference on sustainable, modern and safe transport

Implementation of Automated Guided Vehicle system in healthcare facility
Marko Pedan a,*, Milan Gregor a, Dariusz Plinta b
a University of Zilina, Faculty of Mechanical Engineering, Department of Industrial Engineering, Univerzitná 1, Žilina 010 26, Slovak Republic
b The University of Bielsko-Biała, Faculty of Mechanical Engineering and Computer Science, Department of Production Engineering, Willowa 2, Bielsko-Biała 43 309, Poland

Abstract

The article deals with the use of an automated guided vehicle (AGV) system in a hospital. This paper provides the requirements and technical specifications of an AGV cart designed for a healthcare facility. The second part describes the application and benefits of AGV implementation in a selected healthcare facility, gained from a computer simulation that is used as a verification tool. This part also contains the economic evaluation of the implementation and a summary of further investments related to this technology.
© 2017 The Authors. Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
Peer-review under responsibility of the scientific committee of TRANSCOM 2017: International scientific conference on sustainable, modern and safe transport.
Keywords: AGV; healthcare; improvement; simulation; efficiency

1. AGV for healthcare facilities

Automatic guided vehicles (AGVs), or self-guided vehicles (SGVs), have been widely used in material handling for decades [1]. Nowadays, the demand for mobile robots and their use in hospitals has increased due to changes in demographic trends and medical cost control. For healthcare facilities, these automated systems are designed specifically for handling bulk material, pharmacy medicines, laboratory samples, central supply, and the transportation of food, dirty dishes, bed laundry, waste (biological, recyclable), biomedical instruments, etc. Operating efficiency is gained by automating these supply flows, which allows the transfer of human resources to other departments or activities.

* Corresponding author. Tel.: +421-41-513-2713 ; fax: +421-41-513-1501.


E-mail address: marko.pedan@fstroj.uniza.sk

doi:10.1016/j.proeng.2017.06.115
Automated systems work 24 hours a day, 7 days a week. An automated solution can streamline the flow of material in the hospital, control costs, and reduce workload. A hospital installation has to fulfill some important requirements, see Fig. 1. These requirements and the use of modern logistics systems significantly affect the operation of the entire facility and its economy, improve the quality of patient care, and increase patient safety [2].

Fig. 1. The requirements for AGV integration in healthcare facility (Source: author).

2. The 3D design of AGV model for healthcare facilities

The use of an automated transport system (AGV) relieves hospital staff and allows them to spend most of their time on direct patient care. It also increases safety in the hospital by minimizing potential injury to the staff from pushing heavy carts. The system monitors all major movements in the hospital and may prioritize the most important jobs so that they are completed first (e.g. surgical instruments transported first, then food for patients, bedding, and eventually garbage) [3]. The AGV is equipped with sensors that allow it to stop safely before hitting obstacles that might be in the way. The system and its vehicles are reliable, safe, efficient, and cost-effective [4]. Applications and commands are mediated through a user-friendly touch screen. The system is fully integrated for automatic control of doors, elevators, trolley washers, garbage dump trucks, etc.
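The prioritization just described (surgical instruments before food, bedding, and garbage) maps naturally onto a priority queue. The sketch below is illustrative only; the priority values and class names are assumptions, not the vendor's actual dispatching logic.

```python
import heapq

class HospitalDispatcher:
    """Dispatch AGV tasks by urgency; a lower number means more urgent.
    The priority table is an illustrative assumption."""
    PRIORITY = {"surgical instruments": 0, "food": 1, "bedding": 2, "garbage": 3}

    def __init__(self):
        self._queue = []
        self._counter = 0   # preserves FIFO order among equal priorities

    def submit(self, kind, destination):
        heapq.heappush(self._queue,
                       (self.PRIORITY[kind], self._counter, kind, destination))
        self._counter += 1

    def next_task(self):
        _, _, kind, destination = heapq.heappop(self._queue)
        return kind, destination

d = HospitalDispatcher()
d.submit("garbage", "basement")
d.submit("food", "ward 3")
d.submit("surgical instruments", "OR 1")
first = d.next_task()   # ('surgical instruments', 'OR 1')
```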

Fig. 2. Low profile Automated Guided Vehicle for healthcare facilities (Source: our research).

This designed 3D vehicle model, see Fig. 2, has the technical specifications given in Table 1; we will use it in the simulation model, which verifies its potential implementation in the hospital [5].
Table 1. Technical specifications of designed AGV.


Technical parameters Values
Dimensions 127 x 63 x 32 cm
Speed (max.) 2 m/s
Load carrying capacity (max.) 500 kg
Battery capacity 100 Ah

Transport will be carried out in such a way that medical supplies are carried in special transport boxes that the AGV cart drives under and then lifts up. Fig. 3 describes the principle of object transportation using the designed AGV cart; it also shows the key dimensions of the vehicle and the transported objects, which must be observed when transporting boxes.

Fig. 3. The minimum dimensions of the transport box and the distance between the vehicle and transport box (Source: our research).

3. AGV implementation in a ward of a healthcare facility

In the selected healthcare facility, we designed AGV integration in the following areas [6]:

 Food transportation to the patient rooms. This process covers food transportation from the food arrival, which is provided by an external company, through transport by the AGVs to the kitchen, followed by the sorting of meals for patients by healthcare staff and the distribution of meals by the AGVs to the patient rooms. The rooms have designated areas for precise stopping and unloading of food from the AGVs.
 Collection and transportation of used and clean laundry. The healthcare facility uses an external company, which carries away and washes the dirty laundry and delivers the clean laundry back. The facility can use AGVs for the internal transport service. Transportation would consist of loading the laundry box and transporting it to the desired location (central storage). Transportation between the floors will be carried out by freight elevator.
 Waste transportation. Waste will be transported from well-marked spaces and areas across the whole ward. The waste will be collected at these places in special boxes. The AGVs will then take the waste to the temporary waste storage on the base floor.
3.1. The visualization of AGV integration in the simulation software SIMIO

We used simulation software to verify our suggested implementations of AGVs in the inpatient ward. For this purpose, we used the software Simio, into which we imported the real objects and the physical layout of the healthcare facility. Fig. 4 shows the transportation process of food in the ward in the digital environment [5,6]. The AGVs in this simulation software follow the inpatient ward flows that we have mapped and analyzed (i.e. the movement of medical staff and medical material). Transport between the floors will be carried out by freight elevator, see Fig. 4a. The AGVs will then transport the food to designated locations in the patient rooms [7], see Fig. 4b.

Fig. 4. (a) Food transportation to the inpatient ward by freight elevator; (b) food unloading in the patient room (Source: our research).
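As a rough illustration of the kind of quantity such a simulation estimates, the sketch below computes a deterministic trip time for one food-delivery run, using the 2 m/s maximum speed from Table 1. The corridor length, elevator times, and 30-second unloading stop are assumed values for illustration, not measurements from the facility.

```python
AGV_SPEED = 2.0   # m/s, the maximum speed from Table 1

def delivery_time(corridor_m, elevator_wait_s, elevator_ride_s, stops):
    """Trip time for one meal run: corridor travel at AGV speed, one elevator
    transfer, and an assumed 30-second unloading stop per patient room."""
    travel_s = corridor_m / AGV_SPEED
    unloading_s = 30.0 * stops
    return travel_s + elevator_wait_s + elevator_ride_s + unloading_s

# 180 m of corridor, 40 s elevator wait, 25 s ride, 4 patient rooms:
t = delivery_time(180, 40, 25, 4)   # 90 + 40 + 25 + 120 = 275 seconds
```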

3.2. Simulation as a decision-making tool

Simulation in healthcare can be considered an effective tool, technique, or method [8]. Healthcare personnel, especially decision makers such as directors and managers, need reliable operational tools that support them in the decision-making process. Such techniques and tools help them reduce costs and patient waiting times, predict future patient arrivals, and provide visualization that enables them to prepare the staff and all resources necessary for the provision of high-quality healthcare service to patients at the right time [9]. These tools should also support an evidence-based and well-informed decision-making environment. Simulation models, especially those whose core variables are transparent and can be easily understood and trusted by people with decision-making competence, are a useful tool to support decision-making, communication, discussion, ideas, policies, and scenario analysis, from which decision makers can gain knowledge and learn [10]. That was also our case, since we needed to create a simulation model of a real healthcare facility that would help the management decide whether to implement the AGV system or not. After many interviews with the hospital leaders, we created several model variants. Since the creation of a simulation study is very extensive, in the next chapter we summarize the most important outcomes and results, which were the key ones for the healthcare managers [7].

3.3. The benefits from AGV integration gained from simulation runs

With the AGV integration, we were able to save 345 minutes of the medical assistant's (MA) 1440-minute day (representing 23.96 %), see Table 2. This allows medical assistants to be moved to activities and tasks that legislation permits them to perform [5,6], so they can spend more time with patients. Furthermore, we were able to relieve the cleaning and transport services of heavy and dangerous waste through the AGV system integration. Among other things, the AGV can also be used for the transportation of medicines and medical supplies with low safety or hygiene requirements. This integration will bring benefits associated with the reduction of damage, unjustified and incorrect shipments, and physically heavy transport. Another advantage is that the vehicle can operate 24 hours a day while meeting its charging requirements.
Table 2. Healthcare staff time reduction.

Staff  Activity                  Current state (min.)  After AGV integration (min.)  Time reduction (min.)
MA     Morning toilet            150                   150                           0
MA     Blood test subscription   300                   300                           0
MA     Clean laundry delivery    75                    0                             75
MA     Breakfast delivery        60                    0                             60
MA     Lunch delivery            90                    0                             90
MA     Afternoon toilet          300                   300                           0
MA     Dinner preparation        30                    30                            0
MA     Dinner delivery           60                    0                             60
MA     Evening toilet            300                   300                           0
MA     Dirty laundry transport   60                    0                             60
       Total                     1440                  1080                          345
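The headline saving can be cross-checked directly from Table 2: summing the per-activity reductions gives 345 minutes, which against the stated 1440-minute day yields the quoted 23.96 %.

```python
# Per-activity time reductions for the medical assistant, from Table 2 (minutes).
reductions = {
    "Clean laundry delivery": 75,
    "Breakfast delivery": 60,
    "Lunch delivery": 90,
    "Dinner delivery": 60,
    "Dirty laundry transport": 60,
}
total_saved = sum(reductions.values())   # 345 minutes per day
share = total_saved / 1440 * 100         # against the stated 1440-minute day
```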

3.4. Economic evaluation

This final chapter presents the economic evaluation of the AGV integration in the selected healthcare facility. Table 3 shows the investment intensity of the project. It must be said that the costs are, based on the initial analysis, calculated only approximately, so additional analyses would be needed to determine the exact costs of such a project [6]. These analyses have not been carried out, due to the short duration of the project and other requirements from the healthcare facility.

Table 3. Investment intensity of AGV integration.


Costs Price (€) Comments
Purchasing and setting up the AGV cart 50,000
Installing and marking AGV navigation 700 Floor marking
Customization of hospital environment 65,000 Door, elevator and AGV interface
Total 115,700

To calculate the hourly cost of running the AGVs, we used the input costs, which are around € 115,700. From these costs we derived monthly operating costs of 4 % (specified by the AGV manufacturer) [6], representing a value of € 4,628/month. From this value we derived the operating costs for one day of operation (€ 154.27/day). In the last stage of the calculation, we arrived at an hourly cost of running the AGV of around € 6.43/hour. These values were then compared to the hourly cost of a medical assistant in the inpatient ward (€ 3.5), from which we can see that the operating costs of the AGV are almost two times higher, see Table 4. Although this is a rough calculation of the operating costs, it gives the managers of healthcare facilities an approximate idea of whether it is worth considering the implementation of this technology.

Table 4. Hourly operating costs.


Operator Operating costs per hour (€)
Medical assistant 3.5
AGV 6.43
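The cost chain above can be checked with a few lines of arithmetic, assuming a 30-day month and 24-hour operation, as the paper's figures imply:

```python
investment = 115_700                  # EUR, total investment from Table 3
monthly_rate = 0.04                   # monthly operating costs, per the AGV manufacturer
monthly = investment * monthly_rate   # ~4,628 EUR/month
daily = monthly / 30                  # ~154.27 EUR/day (assuming a 30-day month)
hourly = daily / 24                   # ~6.43 EUR/hour for 24/7 operation
ratio = hourly / 3.5                  # vs. the medical assistant at 3.50 EUR/hour
```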
4. Conclusion

The case study introduced in this article was carried out at the request of the director of a healthcare facility in order to assess the potential implementation of an AGV system in the hospital. In the case study, we designed an AGV cart and transport methods for an inpatient ward of the healthcare facility. We subsequently recreated this transport concept in a 3D environment, in which we simulated and verified the movement of the AGV with respect to the physical layout of the building and the material flows in the ward. The final economic assessment pointed out that AGV technology is currently not cheap and is affordable only for bigger facilities operating at a profit. Proper and effective implementation for a given type of healthcare facility depends on many factors and requires a detailed assessment and analysis [11]. The world's top hospitals have already adopted this technology and are therefore reducing operating costs and increasing the quality of their healthcare services, which leads to rapid cost recovery. However, in our view, Slovakia and its healthcare facilities are not ready to integrate this technology now. The healthcare system is in a position in which it cannot benefit from the advantages of AGV systems the way the rest of the world does [12]. AGV is also a technology whose potential is high, but whose specific application must be analyzed through several methods of industrial engineering (e.g. simulation). Since many healthcare facilities are deterred particularly by the high acquisition costs of this technology [13], healthcare managers need to realize that the purpose of new, modern technology is mainly to help healthcare professionals work more efficiently and improve the quality of healthcare services. If our healthcare facilities want to respond to the technology-driven environment of care and be prepared for future developments, the designers must not design healthcare facilities merely as buildings; they need to meet the requirements of patients and staff, and must anticipate the future.

Acknowledgements

This paper is the part of research supported by project KEGA 032ŽU-4/2015.

References

[1] D. Plinta, M. Krajčovič, Production system designing with the use of digital factory and augmented reality technologies, in: Advances in
Intelligent Systems and Computing, vol. 350 (2016), ISSN 2194-5357, pp. 187-196.
[2] B. Mičieta, M. Gašo, M. Krajčovič, Innovation performance of organization, in: Communications – Scientific letters of the University of Žilina,
vol. 16, no. 3A (2014), ISSN 1335-4205, pp. 112-118.
[3] E. Weremeychik, Best Of 2014: How To Design A Smart Hospital. (2014). Available on internet:
<http://www.healthcaredesignmagazine.com/article/how-design-smart-hospital?page=2>.
[4] M. Krajčovič, et al., Intelligent manufacturing systems in concept of digital factory, in: Communications – Scientific letters of the University
of Žilina, vol. 15, no. 2 (2013), ISSN 1335-4205, pp. 77-87.
[5] S. Chandel, Automatic Guided Vehicles serve food to patients at the Southmead Hospital. (2014). Available on internet:
<http://www.damngeeky.com/2014/09/03/24689/automatic-guided-vehicles-serve-food-patients-southmead-hospital.html>.
[6] M. Gregor, M. Pedan, L. Mizeráková, "SMART" zdravotnícke zariadenia - využitie moderných technológií v zdravotníctve, in: ProIN :
dvojmesačník CEIT, ISSN 1339-2271, vol. 16, no. 5-6 (2015), pp. 21-24.
[7] C. Pennington, Building a Smart Hospital that Stays Smart Well into the Future. (2012). Available on internet:
<http://today.uconn.edu/2012/04/building-a-smart-hospital-that-stays-smart-well-into-the-future/>.
[8] Štefánik, P. Grznár, B. Mičieta, Tools for Continual Process Improvement – Simulation and Benchmarking, in: Annals of DAAAM for 2003
& Proc. of the 14th Intern. DAAAM Symposium: Intelligent manufacturing & automation: Focus on reconstruction and development, (2003),
ISBN 978-3-901509-34-6, pp. 443-444.
[9] M. Krajčovič, A. Štefánik, Ľ. Dulina, Logistics processes and systems design using computer simulation, in: Communications – Scientific
letters of the University of Žilina, vol. 18, no. 1A (2016), ISSN 1335-4205, pp. 87-94.
[10] R. Webner, Hospital of the future: How the typical hospital will change with technology and shift to patient-centered care. (2014). Available
on internet: <http://www.post-gazette.com/healthypgh/2014/05/22/Hospital-of-the-future-How-typical-hospital-will-change-with-technology-and-shift-to-patient-centered-care/stories/201405250011>.
[11] L. Krkoška, M. Gregor, M. Haluška, Tvorba a transformácia dát pre použitie v optimalizačných projektoch zdravotníckych zariadení, in:
Metody i techniki kształtowania procesów producyjnych, ISBN 978-83-65182-37-1, (2015), pp. 169-182.
[12] S. Palajová, Š. Figa, M. Gregor, Simulation on manufacturing and logistics systems for the 21th century, in: Applied computer science :
management of production processes, ISSN 1895-3735, vol. 7, no. 2 (2011), pp. 57-70.
[13] J. Barjis, Healthcare simulation and its potential areas and future trends, in: SCS M&S Magazine, (2011), vol. 2, no.5, pp. 1-6.