
2014 IEEE 17th International Conference on Intelligent Transportation Systems (ITSC)
October 8-11, 2014. Qingdao, China

Approaching Index Based Collision Avoidance for V2V Cooperative Systems*

Boyuan Xie, Keqiang Li, Xiaohui Qin, Hang Yang, and Jianqiang Wang

*Research supported by the National Natural Science Foundation of China (Grant No. 51175290) and the National High Technology Research and Development Program of China (Grant No. 2012AA111901).
Boyuan Xie was with the State Key Laboratory of Automotive Safety and Energy, Tsinghua University, Beijing 100084, P. R. China; he is currently with Dingyuan Automotive Proving Ground, Nanjing 210028, China (e-mail: wsxby02@aliyun.com).
Keqiang Li and Xiaohui Qin are with the State Key Laboratory of Automotive Safety and Energy, Tsinghua University, Beijing 100084, P. R. China (e-mail: likq@tsinghua.edu.cn, qxh880507@163.com).
Hang Yang is with the School of Mechanical Engineering, Chongqing University, Chongqing 400030, P. R. China (e-mail: 20112775@cqu.edu.cn).
Jianqiang Wang is with the State Key Laboratory of Automotive Safety and Energy, Tsinghua University, Beijing 100084, P. R. China (corresponding author; phone: (+86) 010-627-95774; e-mail: wjqlws@tsinghua.edu.cn).

Abstract—Vehicle risk evaluation is a crucial step in a collision avoidance algorithm. By applying wireless communication technologies, environment vehicles can be sensed and considered as potential objects. This paper explores the definition, algorithm and applications of the approaching index (AI). First, by taking advantage of communication, the trajectory cross point (TCP) can be derived by calculating vehicle future trajectories, which helps treat neighbor vehicles as potential objects under complicated traffic scenes. Secondly, based on the TCP, the definition and algorithm of AI are built. AI is important for selecting the collision object from the potential collision objects. Then, hardware-in-the-loop (HIL) tests with different traffic conditions were conducted. The test data confirm the effectiveness of AI in different scenes even when subjected to interference: the driver can gain more time to avoid an oncoming collision by taking advantage of AI. Finally, the limitations of AI and future work are discussed.

I. INTRODUCTION

Collision warning systems (CWS) usually use sensors such as radar/lidar [1, 2], cameras [3], and acoustic sensors to acquire target information. Due to the limitations of vehicular sensors, only limited relative-motion information of the front vehicle, including relative speed and relative location, can be detected. Based on vehicular sensor detection, commercial CWS generally utilize time to collision (TTC) [4] and time headway (THW) [5] to determine the likelihood of collision. Traditional CWS only concern direct front vehicles, which cannot satisfy the requirements of complicated traffic scenes [6]. Vehicle-to-vehicle communication (V2V) [7, 8] has thus been developed to exchange information, including speed, dynamic location, acceleration, yaw angle, heading angle, and so on [9]. The self-vehicle (SV) can get more detailed information about neighbor vehicles (NV) to implement cooperative collision warning [10], collision avoidance [11], and platoon control [12].

Risk evaluation is a crucial step in a vehicle collision avoidance algorithm, and vehicle information exchange supports cooperative collision avoidance. However, neither TTC nor THW describes the relative motion between the self-vehicle and environment vehicles on neighbor lanes. To reflect the collision risk under complex traffic conditions, such as intersection scenarios, several kinds of risk index have been developed. Hillenbrand used time-to-react (TTR) for a collision mitigation system in intersection-like scenarios [13]. Mussone et al. determined the collision possibility by the arrival time to conflict point (ATCP) [14]. Kwon calculated the overlapping interval (OI) of different vehicles to judge the threat risk [15]. Chan defined a criticality index (CI) to account for the severity and likelihood of a probable conflict [16]; CI is proportional to the squared speed of the involved vehicle and inversely proportional to the time to conflict (TTC). Ueki developed a collision risk indicator (CRI) for five mobility models of two vehicles [17]; CRI is related to distance, heading angle, and velocity, which can be exchanged by wireless communication. ATCP, OI, CI and CRI are all based on a simple motion relationship, and the collision/conflict point is used as the calculating object. A collision point means that two vehicles overlap in time and space. However, it does not always exist at the beginning of most potential accidents.

Inter-vehicle communication helps the self-vehicle perceive environment vehicles by getting their motion state parameters [18]. The vehicle trajectory parameters are important for collision risk evaluation. If two trajectories cross at a future point, the approaching relationship of those two vehicles can be determined. In this paper, we define the trajectory cross point (TCP) to depict the point at which the neighbor vehicle overlaps the self-vehicle trajectory in the future, and then develop the approaching index (AI) to describe the approaching level of two adjacent vehicles.

The aim of this paper is to define and deduce the approaching index, through which we can establish the relationship between object sensing methods and decision making strategies. This paper is organized as follows. The trajectory cross point is defined and discussed in Section II. The approaching index is defined and its fuzzy algorithm is deduced in Section III. Hardware-in-the-loop (HIL) tests designed to examine the effectiveness of the approaching index for collision avoidance are presented in Section IV. The limitations of AI and future work are discussed in Section V. Finally, the conclusions are given in Section VI.

II. DEFINITION OF TRAJECTORY CROSS POINT AND APPROACHING INDEX



In a local traffic environment, a driver detects neighbor vehicles and compares the self-trajectory with the neighbor one to determine whether a collision will occur. Here, we do not judge whether the neighbor vehicle intends to make a lane change; we just predict its future trajectory based on the exchanged data. In most cases, the trajectory cross point (TCP) helps the driver take effective measures to avoid a collision. TCP is an important factor reflecting the relationship between two vehicles. Since AI depends on the TCP, we first make the following definitions.
Definition 2.1: Trajectory cross point: a vehicle trajectory overlaps with another one within a certain time range,

$$D(p_{ij}) = \min\left( \left\| P_{sv}(x_{t_i}, y_{t_i}) - P_{nv}(x_{t_j}, y_{t_j}) \right\|_2 \right) \leq \varepsilon, \quad t_i, t_j \in [0, \tau_h]$$

where $P_{sv}(x_{t_i}, y_{t_i})$ is the self-vehicle position at $t_i$ and $P_{nv}(x_{t_j}, y_{t_j})$ is the neighbor vehicle position at $t_j$. The physical distance $D(p_{ij})$ is the shortest distance between the two future trajectories, while $t_i$ is not necessarily equal to $t_j$. So the TCP is different from the collision point (CP): time overlap is not necessary to judge the existence of a TCP. However, $t_i$ and $t_j$ should both lie within the prediction horizon $\tau_h$.
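For illustration, the following Python sketch checks Definition 2.1 numerically on two sampled predicted trajectories; the trajectories, the 3 s horizon and the threshold ε = 0.5 m are assumed values for the example, not parameters taken from the paper.

```python
import numpy as np

def has_tcp(traj_sv, traj_nv, eps=0.5):
    """Check Definition 2.1: a TCP exists if the minimum distance between
    the two predicted trajectories (sampled over [0, tau_h]) is <= eps.
    traj_sv, traj_nv: arrays of shape (N, 2) with (x, y) positions; the
    sample times t_i and t_j need not coincide."""
    # Pairwise Euclidean distances between all self-vehicle and
    # neighbor-vehicle trajectory samples.
    diff = traj_sv[:, None, :] - traj_nv[None, :, :]
    dist = np.linalg.norm(diff, axis=2)
    d_min = dist.min()
    return d_min <= eps, d_min

# Illustrative trajectories sampled at 0.1 s over a 3 s horizon (assumed values):
t = np.arange(0.0, 3.0, 0.1)
traj_sv = np.column_stack([30.0 * t, np.zeros_like(t)])         # self-vehicle, straight at 30 m/s
traj_nv = np.column_stack([25.0 * t + 20.0, 3.5 - 1.5 * t])     # neighbor cutting in from adjacent lane
exists, d_min = has_tcp(traj_sv, traj_nv)
print(exists, round(d_min, 2))
```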
Definition 2.2: The TCP is a mapping of the neighbor vehicle onto the future trajectory of the self-vehicle.

Figure 1. TCP of self-vehicle and neighbor vehicle (the arc lines denote the communication range)

If the neighbor vehicle is in front of the self-vehicle, the TCP is a real point. If the neighbor vehicle is beside the self-vehicle, the TCP is a virtual point. Since the TCP is a mapping of the neighbor vehicle, it should be given the same motion state as its origin, such as velocity and acceleration. The TCP then represents a physical entity with its own motion properties.

Definition 2.3: Time to TCP (TTCP) is the time for the neighbor vehicle to drive to the TCP. TTCP plays the same role as THW and TTC.

In the following analysis, the TCP is used to substitute for the potential object, even if the object is not actually in front of the self-vehicle. We can therefore borrow the general definitions of THW and TTC, with several modifications that adapt them to complicated traffic scenes based on the TCP. THW is the time for the self-vehicle to reach the TCP, and the definition of TTC is shown below:

$$TTCP = t_j, \qquad THW = t_i, \qquad TTC = \frac{D_{TCP}}{v_{sv} - v_{nv}}$$

where the TCP represents the actual neighbor vehicle, $D_{TCP}$ is the distance from the self-vehicle to the TCP, $v_{sv}$ is the velocity of the self-vehicle, and $v_{nv}$ is the velocity of the neighbor vehicle. Usually TTCi, the reciprocal of TTC, is used to avoid the infinity of TTC. Though the sizes of the self-vehicle and the neighbor vehicle are not reflected explicitly in the above formula, $D_{TCP}$ can be corrected accordingly to satisfy realistic applications.
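As a minimal sketch of how the three indicators can be evaluated once a TCP is found, the helper below uses a constant-speed approximation of $t_i$ and $t_j$; the function name, the mapping of the scene-1 values to TCP distances, and the numbers themselves are illustrative assumptions, not quantities given in the paper.

```python
def approach_indicators(d_nv_to_tcp, v_nv, d_sv_to_tcp, v_sv):
    """Compute the three inputs of the AI model from the TCP geometry:
    TTCP = time for the neighbor vehicle to reach the TCP (approx. t_j),
    THW  = time for the self-vehicle to reach the TCP (approx. t_i),
    TTCi = reciprocal of TTC = (v_sv - v_nv) / D_TCP, which avoids an
           infinite TTC when the speed difference approaches zero.
    Distances in m, speeds in m/s; constant-speed approximation."""
    ttcp = d_nv_to_tcp / v_nv if v_nv > 0 else float("inf")
    thw = d_sv_to_tcp / v_sv if v_sv > 0 else float("inf")
    ttci = (v_sv - v_nv) / d_sv_to_tcp   # D_TCP is measured from the self-vehicle
    return ttcp, thw, ttci

# Example with scene-1-like speeds (120 km/h vs 108 km/h) and an assumed geometry:
print(approach_indicators(d_nv_to_tcp=5.0, v_nv=30.0, d_sv_to_tcp=30.0, v_sv=33.3))
```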

Definition 2.4: Approaching index: AI describes the closeness level of two adjacent vehicles whose future trajectories cross within a predictable time range. TTCP, THW and TTCi work together to give the approaching index through a fuzzy algorithm model:

$$AI = f(TTCP, THW, TTCi)$$

The detail of the AI algorithm is introduced in the next section.

III. APPROACHING INDEX ALGORITHM BASED ON FUZZY METHOD

On the basis of the above definitions, AI depends on three main factors: TTCP, THW and TTCi. If the self-vehicle finds that a neighbor vehicle has a trajectory cross point (TCP) with itself, the self-vehicle can judge the risk level with AI and decide the next countermeasures. Similar to a driver's decision process, the judgment is not an accurate quantitative description but a fuzzy decision making process. For example, when a neighbor vehicle is cutting in, the self-vehicle driver controls speed dynamically to reserve enough space and an appropriate speed according to the relative motion between the self-vehicle and the TCP. A fuzzy algorithm is therefore an appropriate method for calculating AI.

Taking TTCP, THW and TTCi as the inputs and AI as the output of the fuzzy system, we obtain a fuzzy inference system with three inputs and one output. Fig. 2 shows its structure.

Figure 2. Approaching index algorithm based on the fuzzy method (TTCP, THW and TTCi are fuzzified with membership functions, combined by Mamdani fuzzy inference, and defuzzified with the centroid method to give AI)

A. Inputs Fuzzification

Three variables need to be fuzzified. Firstly, TTCP is the time for the neighbor vehicle to drive to the TCP. A smaller TTCP means the neighbor vehicle is closer to the TCP. The universe of TTCP is [0, 1.5] s here. If TTCP > 1.5 s, the neighbor vehicle is far away from the TCP and will not be considered. If TTCP = 0, the neighbor vehicle is in front of the self-vehicle. TTCP is fuzzified into 3 grades: Neighbor (N, [0.9, 1.5]), Transitional (T, [0.3, 1.2]) and Front (F, [0, 0.6]). Fig. 3 shows the fuzzy set of TTCP.

Figure 3. Fuzzy set of TTCP
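A possible implementation of this input fuzzification is sketched below; the paper gives only the supports of the three grades, so the triangular/trapezoidal shapes and their breakpoints are assumptions read off Fig. 3.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def trap(x, a, b, c, d):
    """Trapezoidal membership function rising on [a, b], flat on [b, c], falling on [c, d]."""
    if x <= a or x >= d:
        return 0.0
    if x < b:
        return (x - a) / (b - a)
    if x <= c:
        return 1.0
    return (d - x) / (d - c)

def fuzzify_ttcp(ttcp):
    """Membership grades of TTCP in the three fuzzy sets of Fig. 3.
    Shapes are assumed from the stated supports:
    Front [0, 0.6], Transitional [0.3, 1.2], Neighbor [0.9, 1.5]."""
    ttcp = min(max(ttcp, 0.0), 1.5)             # clip to the universe [0, 1.5]
    return {
        "F": trap(ttcp, -1.0, 0.0, 0.3, 0.6),   # full membership near 0
        "T": tri(ttcp, 0.3, 0.75, 1.2),
        "N": trap(ttcp, 0.9, 1.2, 1.5, 2.5),    # full membership near 1.5
    }

print(fuzzify_ttcp(0.45))   # partial membership in Front and Transitional
```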

Secondly, THW is the time for the self-vehicle to reach the TCP. A smaller THW means the self-vehicle is closer to the TCP. The universe of THW is [0, 1.5] s here. If THW > 1.5 s, the TCP is far away and will not be considered. THW is fuzzified into 3 grades: Far (F, [0.9, 1.5]), Near (N, [0.3, 1.2]) and Close (C, [0, 0.6]). Fig. 4 shows the fuzzy set of THW.

Figure 4. Fuzzy set of THW

Finally, TTCi is the reciprocal of TTC. A smaller TTCi means a bigger TTC. The universe of TTCi is [-0.5, 2] here. If TTCi < -0.5, the self-vehicle has no possibility of overtaking the TCP and it will not be considered. TTCi is fuzzified into 3 grades: Long (L, [-0.5, 0.5]), Medium (M, [0, 1.5]) and Short (S, [1, 2]). Fig. 5 shows the fuzzy set of TTCi.

Figure 5. Fuzzy set of TTCi

B. Output Fuzzification

The universe of AI is [0, 1]. According to the above fuzzification of TTCP, THW and TTCi, and to obtain a better resolution [19], AI is divided into 7 grades: Very Small (VS, [0, 0.3]), Small (S), Relative Small (RS, [0.25, 0.45]), Medium (M, [0.3, 0.7]), Relative Large (RL, [0.55, 0.75]), Large (L, [0.6, 0.9]), and Very Large (VL, [0.7, 1]). Fig. 6 shows the fuzzy set of AI.

Figure 6. Fuzzy set of AI

C. Fuzzy Rules

Since the three inputs are each fuzzified into 3 grades, 27 fuzzy rules can describe the relationship between the three inputs and the one output. Table I lists the fuzzy rules.

TABLE I. FUZZY RULES OF AI

No.  TTCP  THW  TTCi  AI
1    N     F    L     VS
2    N     F    M     S
3    N     F    S     RS
4    N     N    L     S
5    N     N    M     RS
6    N     N    S     M
7    N     C    L     RS
8    N     C    M     M
9    N     C    S     RL
10   T     F    L     S
11   T     F    M     RS
12   T     F    S     M
13   T     N    L     RS
14   T     N    M     M
15   T     N    S     RL
16   T     C    L     M
17   T     C    M     RL
18   T     C    S     L
19   F     F    L     RS
20   F     F    M     M
21   F     F    S     RL
22   F     N    L     M
23   F     N    M     RL
24   F     N    S     L
25   F     C    L     RL
26   F     C    M     L
27   F     C    S     VL

The total set of inference rules can be represented by the following formula:

$$R = \bigcup_{i=1}^{n} R_i, \quad n = 27$$
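The rule base of Table I can be encoded directly as a lookup from grade triples to the AI grade, as in the sketch below; the grades and consequents are taken verbatim from Table I, while the dictionary layout is simply one convenient encoding.

```python
# Table I encoded as a dictionary: (TTCP grade, THW grade, TTCi grade) -> AI grade.
# Grades: TTCP in {N, T, F}; THW in {F, N, C}; TTCi in {L, M, S}; AI in {VS, S, RS, M, RL, L, VL}.
AI_RULES = {
    ("N", "F", "L"): "VS", ("N", "F", "M"): "S",  ("N", "F", "S"): "RS",
    ("N", "N", "L"): "S",  ("N", "N", "M"): "RS", ("N", "N", "S"): "M",
    ("N", "C", "L"): "RS", ("N", "C", "M"): "M",  ("N", "C", "S"): "RL",
    ("T", "F", "L"): "S",  ("T", "F", "M"): "RS", ("T", "F", "S"): "M",
    ("T", "N", "L"): "RS", ("T", "N", "M"): "M",  ("T", "N", "S"): "RL",
    ("T", "C", "L"): "M",  ("T", "C", "M"): "RL", ("T", "C", "S"): "L",
    ("F", "F", "L"): "RS", ("F", "F", "M"): "M",  ("F", "F", "S"): "RL",
    ("F", "N", "L"): "M",  ("F", "N", "M"): "RL", ("F", "N", "S"): "L",
    ("F", "C", "L"): "RL", ("F", "C", "M"): "L",  ("F", "C", "S"): "VL",
}

# Rule 27 of Table I: TTCP Front, THW Close, TTCi Short -> AI Very Large.
print(AI_RULES[("F", "C", "S")])   # VL
```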

D. Defuzzification

The centroid method is used to defuzzify the fuzzy inference results. In order to improve computational efficiency, without having to calculate the value of AI over a continuous space, the discrete centroid method is utilized here:

$$AI = \frac{\sum_{i=1}^{7} \mu_i(x_i)\, x_i}{\sum_{i=1}^{7} \mu_i(x_i)}$$

where $\mu_i(x_i)$ is the membership grade of the i-th fuzzy subset of AI calculated by the Mamdani method, and $x_i$ is the corresponding point of the AI universe.
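The following sketch ties the pieces together: Mamdani inference (min for rule firing, max for aggregation per output subset) followed by the discrete centroid formula above. The rule subset, the representative points x_i of the AI subsets, and the input membership grades used in the example are illustrative assumptions rather than values stated in the paper.

```python
def mamdani_ai(mu_ttcp, mu_thw, mu_ttci, rules, centers):
    """Mamdani inference with min for rule firing and max for aggregation per
    output subset, then discrete centroid defuzzification:
        AI = sum_i mu_i * x_i / sum_i mu_i
    mu_*   : dicts of membership grades per input grade, e.g. {"N": 0.2, ...}.
    rules  : (TTCP grade, THW grade, TTCi grade) -> AI grade, as in Table I.
    centers: assumed representative point x_i of each AI fuzzy subset."""
    agg = {grade: 0.0 for grade in centers}
    for (g_ttcp, g_thw, g_ttci), g_ai in rules.items():
        strength = min(mu_ttcp.get(g_ttcp, 0.0),
                       mu_thw.get(g_thw, 0.0),
                       mu_ttci.get(g_ttci, 0.0))
        agg[g_ai] = max(agg[g_ai], strength)
    num = sum(agg[g] * centers[g] for g in centers)
    den = sum(agg.values())
    return num / den if den > 0 else 0.0

# A few representative rules from Table I (subset kept small for brevity):
rules = {("F", "C", "S"): "VL", ("F", "C", "M"): "L",
         ("F", "N", "S"): "L",  ("F", "N", "M"): "RL"}
# Assumed centers of the AI output subsets (not given explicitly in the paper):
centers = {"VS": 0.15, "S": 0.3, "RS": 0.35, "M": 0.5, "RL": 0.65, "L": 0.75, "VL": 0.85}
# Illustrative fuzzified inputs:
mu_ttcp = {"F": 0.8, "T": 0.2, "N": 0.0}
mu_thw = {"C": 0.6, "N": 0.4, "F": 0.0}
mu_ttci = {"S": 0.7, "M": 0.3, "L": 0.0}
print(round(mamdani_ai(mu_ttcp, mu_thw, mu_ttci, rules, centers), 3))
```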
IV. HIL TESTS UNDER COMPLICATED TRAFFIC SCENARIOS

To verify the effectiveness of AI under complicated traffic conditions, 4 different tests were designed on the dynamic simulation test bed (Fig. 7).

The dynamic simulation test bed consists of a driver, a dynamic driving compartment, a central control platform, vehicle dynamics simulation software, and a scene simulation system. Users can edit traffic scenes and control algorithms, and monitor the driver's dynamic operation in real time.

Figure 7. Dynamic simulation test bed

To confirm the effectiveness of AI, several comparison tests were launched on the test bed. During the tests, 4 different scenes representing typical traffic phenomena were developed by the scene simulation system, and drivers operated the test bed in the virtual scenes. The dynamic parameters of the test bed (trajectory, speed, acceleration, brake pressure, throttle status, etc.) were sampled and transferred from Carsim to Matlab/Simulink. Based on these parameters, the AI algorithm was coded in Simulink. Besides, the radar object identification algorithm was also developed to compare with the AI method. The time of identifying the potential object is the examining indicator of the two methods. Fig. 8 shows the hierarchy of the HIL system.

Figure 8. Hierarchy of the HIL system (driver, dynamic driving compartment and scene simulation system connected through the central control platform to the vehicle dynamics simulation software, Carsim and Matlab/Simulink)

Table II details the 4 different scenes. Note that many experiments with different drivers were carried out to verify the efficiency of AI, but only a few of them are picked out as examples in the following figures.

TABLE II. TEST SCENES

No.  Description                                                                      Object Identification
1    Neighbor vehicle cuts in while the self-vehicle cruises freely                   (1) Communication; (2) Radar
2    Neighbor vehicle cuts in while the self-vehicle follows the front vehicle        (1) Communication; (2) Radar
3    Hidden slow front vehicle appears after the front vehicle suddenly cuts out      (1) Communication; (2) Radar
4    Neighbor vehicle cuts in while the self-vehicle cruises freely on a curved road  (1) Communication; (2) Radar

A. Scene 1 test result

The self-vehicle is cruising freely on a straight road, and a neighbor vehicle cuts in from the adjacent lane.

TABLE III. SIMULATION PARAMETERS OF SCENE 1

Self-vehicle speed            120 km/h, cruising freely
Neighbor vehicle speed        108 km/h
Distance to neighbor vehicle  30 m
Event                         Neighbor vehicle cutting in
Road type                     Straight road

There are two methods for the self-vehicle to perceive the neighbor vehicle: one is through communication, and the other is by radar. We can compare the difference in the object identification point with the following AI-time curve.

Several repeated tests have proved the effectiveness of AI, which helps the self-vehicle identify the neighbor vehicle earlier than the radar does. Fig. 9 shows one of the test results. The AI threshold helps find the potential object at 1.28 s, while the radar-based identification method does so at 1.79 s. In scene 1, 0.51 s is gained for the self-vehicle to avoid the collision.

Figure 9. AI of the neighbor vehicle in scene 1 (the NV is identified by communication earlier than by radar)
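The identification instants quoted here come from the moment the AI of the neighbor vehicle first exceeds a threshold. A minimal sketch of that check is given below; the AI trace, the 10 Hz sampling and the threshold value of 0.3 are assumed for illustration, since the paper does not state the threshold it uses.

```python
def first_crossing_time(ai_series, threshold, dt):
    """Return the first time (in s) at which the AI series exceeds the
    threshold, or None if it never does. ai_series is sampled every dt seconds."""
    for k, ai in enumerate(ai_series):
        if ai > threshold:
            return k * dt
    return None

# Illustrative AI trace sampled at 10 Hz (values assumed, not measured data):
ai_trace = [0.12, 0.15, 0.18, 0.22, 0.27, 0.31, 0.36, 0.42, 0.47, 0.52]
print(first_crossing_time(ai_trace, threshold=0.3, dt=0.1))   # 0.5
```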

B. Scene 2 test result

The self-vehicle is following the front vehicle on a straight road, and a neighbor vehicle cuts in from the adjacent lane.

TABLE IV. SIMULATION PARAMETERS OF SCENE 2

Self-vehicle speed            120 km/h, following the front vehicle
Neighbor vehicle speed        72 km/h
Distance to neighbor vehicle  30 m
Event                         Neighbor vehicle cutting in
Road type                     Straight road

We can compare the difference in the object identification point with the AI-time curves of Fig. 10.

Figure 10. AI of the neighbor vehicle and the front vehicle in scene 2 (the NV is identified by communication earlier than by radar)

The speed of the NV can be changed freely. However, the influence of the front vehicle cannot be ignored. In Fig. 10, the self-vehicle judges the object by the value of AI; because of the disturbance from the front vehicle, the NV is identified at 1.77 s, a little late. Only 0.1 s is gained for the self-vehicle to avoid the collision. But if the self-vehicle is at high speed, 0.1 s is very important for taking measures.

C. Scene 3 test result

The self-vehicle is following the front vehicle on a straight road, and a slow vehicle is sheltered by the front vehicle. After the front vehicle cuts out, the slow vehicle appears in front of the self-vehicle suddenly.

TABLE V. SIMULATION PARAMETERS OF SCENE 3

Self-vehicle speed          120 km/h, following the front vehicle
Slow hidden vehicle speed   72 km/h
Distance to hidden vehicle  50 m
Event                       Front vehicle cutting out
Road type                   Straight road

We can compare the difference in the object identification point with the AI-time curves of Fig. 11.

The time at which the direct front vehicle cuts out can be changed according to test needs. Drivers cannot find the potential danger in time if the cut-out happens too late, and they collide with the slow hidden vehicle. During the HIL simulation, 50 m was selected as the distance to the hidden vehicle when the direct front vehicle began to cut out. It is an appropriate value for the driver to perceive the potential danger and take urgent measures.

Figure 11. AI of the front vehicle and the hidden slow vehicle in scene 3

AI makes a significant improvement in this typical scene. The self-vehicle judges the object by the value of AI. Since the slow vehicle holds a bigger AI, it was identified at 0.05 s, almost immediately. 2.64 s is gained for the self-vehicle to avoid the collision.

D. Scene 4 test result

The self-vehicle is cruising freely on a curved road, and a neighbor vehicle cuts in from the adjacent lane.

TABLE VI. SIMULATION PARAMETERS OF SCENE 4

Self-vehicle speed            100 km/h, cruising freely
Neighbor vehicle speed        80 km/h
Distance to neighbor vehicle  30 m
Event                         Neighbor vehicle cutting in
Road type                     Curved road

We can compare the difference in the object identification point with the AI-time curve of Fig. 12.

Figure 12. AI of the neighbor vehicle in scene 4

The curved road disturbs the judgment of the TCP, so the self-vehicle cannot lock onto the object stably during the test. However, the AI threshold still helps the self-vehicle identify the neighbor vehicle at 0.37 s, while the radar-based identification method does so at 1.35 s. In scene 4, 0.98 s is gained for the self-vehicle to avoid the collision. But, because of the disturbance of the curved road, AI has a period of decline from 0.37 s to 1.35 s.

V. DISCUSSIONS AND FUTURE WORK

Generally, communication can help the self-vehicle perceive objects earlier than radar, and AI plays an important role in identifying the potential target under complicated conditions. In scenes 1 and 3, it achieves a significant time advance without disturbance. In scene 2, the comparison of the two vehicles' AI results in late identification. In scene 4, the disturbance from the road affects the value of AI. Despite this, AI can still provide an effective reference for collision avoidance control.

In this paper, the definition of the TCP is given without a more detailed explanation of its calculation algorithm, because it involves another complicated field, vehicle dynamic trajectory prediction, which is discussed in another part of our work. For the same reason, collision avoidance decision strategies and vehicle dynamic control methods are complicated fields beyond the research scope of this paper. Further research work is to develop decision making strategies based on the approaching index.

VI. CONCLUSIONS

This paper explores the definition, algorithm and application of the approaching index when vehicles are connected by wireless communication. The goal of this paper is to provide a novel method of object identification by communication under complicated traffic conditions.

By taking advantage of communication, the TCP can be derived by calculating vehicle trajectories. The TCP helps treat neighbor vehicles as potential objects, which is useful under complicated traffic scenes. Based on the TCP, the definition and algorithm of AI are built. AI is important for selecting the collision object from the potential objects.

HIL tests with various traffic conditions were conducted. The test data reflect the effectiveness of AI in different scenes even when subjected to interference. The driver can gain more time to avoid an oncoming collision by applying AI.

Since disturbances from undetermined sources affect the value of AI, future work should focus on AI threshold selection and the development of corresponding collision avoidance algorithms in different traffic scenes.

REFERENCES
[1] S. Miyahara, J. Sielagoski, F. Ibrahim, "Radar-based target tracking method: Application to real road," SAE Transactions, vol. 114, no. 7, pp. 421-430, 2005.
[2] L. Choon-Young, L. Ju-Jang, "Object recognition algorithm for adaptive cruise control of vehicles using laser scanning sensor," in 2000 IEEE Intelligent Transportation Systems Conference Proceedings, 2000, pp. 305-310.
[3] S. Noriko, F. Kazumi, O. Takahiko, et al., "An algorithm for distinguishing the types of objects on the road using laser radar and vision," IEEE Transactions on Intelligent Transportation Systems, vol. 3, no. 3, pp. 189-195, 2002.
[4] J. Leonard, J. How, S. Teller, et al., "A perception-driven autonomous urban vehicle," Journal of Field Robotics, vol. 25, no. 10, pp. 727-774, 2008.
[5] A. Polychronopoulos, M. Tsogas, A. Amditis, et al., "Dynamic situation and threat assessment for collision warning systems: the EUCLIDE approach," in Proceedings of the IEEE Intelligent Vehicles Symposium, 2004, pp. 636-641.
[6] J. Ploeg, B. Scheepers, E. van Nunen, et al., "Design and experimental evaluation of cooperative adaptive cruise control," in 2011 14th International IEEE Conference on Intelligent Transportation Systems (ITSC), 2011, pp. 260-265.
[7] C. Motsinger, T. Hubing, "A review of vehicle-to-vehicle and vehicle-to-infrastructure initiatives," Technical report, The Clemson University Vehicular Electronics Laboratory, 2007.
[8] J. Ploeg, S. Shladover, H. Nijmeijer, et al., "Introduction to the Special Issue on the 2011 Grand Cooperative Driving Challenge," IEEE Transactions on Intelligent Transportation Systems, vol. 13, no. 3, pp. 989-993, 2012.
[9] C. Hedges, F. Perry, "Overview and use of SAE J2735 message sets for commercial vehicles," SAE Technical Paper 2008-01-2650, 2008.
[10] R. G. Anouck, B. Lozio, L. Sousa, et al., "A control architecture for integrated cooperative cruise control and collision warning systems," in Proceedings of the 40th IEEE Conference on Decision and Control, Florida, USA, 2001, pp. 1491-1496.
[11] G. Toulminet, J. Boussuge, C. Laurgeau, "Comparative synthesis of the 3 main European projects dealing with Cooperative Systems (CVIS, SAFESPOT and COOPERS) and description of COOPERS Demonstration Site 4," in 11th International IEEE Conference on Intelligent Transportation Systems (ITSC 2008), 2008, pp. 809-814.
[12] C. Bergenhem, E. Hedin, D. Skarin, "Vehicle-to-vehicle communication for a platooning system," Procedia - Social and Behavioral Sciences, vol. 48, pp. 1222-1233, 2012.
[13] J. Hillenbrand, K. Kroschel, "A study on the performance of uncooperative collision mitigation systems at intersection-like traffic situations," in Proceedings of the IEEE Conference on Cybernetics and Intelligent Systems, 2006, pp. 1-6.
[14] M. Lorenzo, S. Gianguido, "An analysis of lateral support systems to increase safety at crossroad," in Proceedings of the IEEE Intelligent Vehicles Symposium, Torino, Italy, June 9-11, 2003, pp. 383-388.
[15] K. Oje, S. H. Lee, J.-S. Kim, et al., "Collision prediction at intersection in sensor network environment," in Proceedings of ITSC 2006, Toronto, Canada, September 17-20, 2006, pp. 982-987.
[16] C. Y. Chan, "Defining safety performance measures of driver-assistance systems for intersection left-turn conflicts," in Proceedings of the IEEE Intelligent Vehicles Symposium, 2006, pp. 25-30.
[17] U. Junpei, M. Junichiro, N. Yuusuke, et al., "Development of vehicular-collision avoidance support system by inter-vehicle communications," in Proceedings of the IEEE Vehicular Technology Conference, 2004, pp. 2940-2945.
[18] C. Hedges, F. Perry, "Overview and use of SAE J2735 message sets for commercial vehicles," Training, 2006, 06-30.
[19] N. Michael, Artificial Intelligence: A Guide to Intelligent Systems, 2nd ed. New Jersey: Addison-Wesley, 2005.

