
International Journal of Mechanical Engineering and Technology (IJMET)

Volume 8, Issue 1, January 2017, pp. 208–212, Article ID: IJMET_08_01_023


Available online at http://www.iaeme.com/IJMET/issues.asp?JType=IJMET&VType=8&IType=1
ISSN Print: 0976-6340 and ISSN Online: 0976-6359
IAEME Publication

MODERN VEHICLE FOR THE PHYSICALLY CHALLENGED PEOPLE USING BLUE EYE TECHNOLOGY
M. Selvaganapathy
Assistant Professor, Department of Electronics and Communication Engineering,
CK College of Engineering and Technology, Cuddalore, Tamil Nadu, India

N. Nishavithri
Assistant Professor, Department of Electronics and Communication Engineering,
Mailam Engineering College, Mailam, Tamil Nadu, India

T. Manochandar
Assistant Professor, Department of Electronics and Communication Engineering,
CK College of Engineering and Technology, Cuddalore, Tamil Nadu, India

G. Manikannan
Assistant Professor, Department of Electronics and Communication Engineering,
CK College of Engineering and Technology, Cuddalore, Tamil Nadu, India

ABSTRACT
The main aim of this project is to design a modern vehicle (wheelchair) for physically challenged people using blue eye technology. Earlier, voice-oriented wheelchairs were used by physically disabled persons to move from one place to another; later advancements introduced the voice-operated wheelchair and the brain-wave-controlled wheelchair. In the proposed system, the modern technology called blue eye technology is used to control the wheelchair. The main concept behind the proposed system is that the wheelchair senses the movement of the user's eyes and directs its motion accordingly. The proposed system consists of an eye tracking module, an eye tracking computer interface, a processing unit and a wheelchair unit built around a PMDC motor module. The outcome of this project is that the user can steer the wheelchair simply by looking at a target object: the distance to that object is first calculated, passed to the eye tracking computer interface, and the movement of the wheelchair is then carried out.
Key words: Blue eye technology, eye tracking computer module, data processing module, PMDC control unit.
Cite this Article: M. Selvaganapathy, N. Nishavithri, T. Manochandar and G. Manikannan. Modern Vehicle for the Physically Challenged People Using Blue Eye Technology. International Journal of Mechanical Engineering and Technology, 8(1), 2017, pp. 208–212.
http://www.iaeme.com/IJMET/issues.asp?JType=IJMET&VType=8&IType=1


1. INTRODUCTION
Blue eye technology aims to create computational machines that have perceptual and sensory abilities. It uses an eye tracking sensor to locate a target at any point within the field of view. The term "blue eye" can be split into "blue" and "eye": "blue" refers to Bluetooth, which provides the wireless link, and "eye" denotes the acquisition of information/data through the eyes. Blue eye technology is used here to build a prototype that can understand the emotions of the user.
The prototype considered here is a modern wheelchair for physically challenged people. The eye tracking computer interface is an interface that tracks the movement of the eye and passes the resulting input information to the computer. The various technologies used for sensing eye movements are the emotion mouse, Manual and Gaze Input Cascaded (MAGIC), Artificial Intelligence Speech Recognition, the Simple User Interface Tracker (SUITOR) and the eye movement sensor. The technology adopted in this paper is the eye movement sensor.

2. METHODOLOGY
The proposed methodology comprises several modules: an eyeball tracking module, an eye tracking computer interface, a data processing unit, a robotic unit and a PMDC motor unit. Chart 1 summarizes the flow between these modules.

Eye Movement Tracking → Eye Tracking Computer Interface → Data Processing Unit → Robotic Unit → PMDC Motor Unit

Chart 1. Methodology chart for the movement of the wheelchair
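As a concrete illustration of this flow, Listing 1 sketches the Chart 1 pipeline as a chain of Python stubs. All names in it (Command, the stub functions, the placeholder return values) are illustrative assumptions and not the authors' implementation.

Listing 1 (illustrative sketch)

# Minimal, self-contained sketch of the Chart 1 flow.
# Every name and placeholder value here is an illustrative assumption.
from enum import Enum

class Command(Enum):
    START = "start"
    STOP = "stop"
    LEFT = "left"
    RIGHT = "right"
    FRONT = "front"
    BACK = "back"

def track_eye_movement(frame):
    """Eye movement tracking: return the detected gaze direction."""
    return "left"  # placeholder result

def eci_interface(gaze):
    """Eye tracking computer interface: convert the gaze to a raw command string."""
    return gaze

def data_processing_unit(raw):
    """Data processing unit: validate the raw string and map it to a Command."""
    return Command(raw)

def robotic_unit(command):
    """Robotic unit: forward the command to the PMDC motor unit."""
    print("PMDC motor unit executes:", command.value)

if __name__ == "__main__":
    gaze = track_eye_movement(frame=None)
    robotic_unit(data_processing_unit(eci_interface(gaze)))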

3. EYE MOVEMENT TRACKING

Figure 1. Positioning of eye movement tracking (center, left and right)


Eye movement tracking is used to determine the motion of the eyeballs. Based on the motion of the eyes, commands are given to the vehicle. The eye tracking module consists of a web camera mounted on the ECI module, which is worn on the user's head. Once the module is worn, it is activated and immediately starts sensing the eye motion of the user.
The camera attached to the eye movement tracking module has additional features such as a built-in light and built-in memory. The eye tracking module is interfaced with the computer through the eye tracking computer interface (ECI) module, which contains the driver for the eye motion tracking sensor. The ECI interface performs various functions such as motion capturing, feature extraction and color detection of pixels [1].
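To make the tracking step concrete, Listing 2 gives a rough Python/OpenCV sketch that classifies a web-camera frame as a left, center or right eye position by locating the darkest region (the pupil) inside a detected eye and comparing its horizontal offset to the eye center. The Haar cascade, threshold values and function names are assumptions made for illustration only; the paper does not specify the detection algorithm used by the module.

Listing 2 (illustrative sketch)

# Hedged sketch: classify gaze as LEFT / CENTER / RIGHT from a webcam frame.
# The Haar cascade and threshold values are illustrative assumptions.
import cv2

eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def classify_gaze(frame, offset_ratio=0.15):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    eyes = eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(eyes) == 0:
        return "NONE"
    x, y, w, h = eyes[0]                       # take the first detected eye
    roi = gray[y:y + h, x:x + w]
    # The pupil is the darkest blob in the eye region.
    _, thresh = cv2.threshold(roi, 40, 255, cv2.THRESH_BINARY_INV)
    moments = cv2.moments(thresh)
    if moments["m00"] == 0:
        return "NONE"
    pupil_x = moments["m10"] / moments["m00"]  # pupil centroid (x) in the ROI
    offset = (pupil_x - w / 2) / w             # normalised offset from the center
    if offset < -offset_ratio:
        return "LEFT"
    if offset > offset_ratio:
        return "RIGHT"
    return "CENTER"

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    if ok:
        print(classify_gaze(frame))
    cap.release()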

4. EYE TRACKING COMPUTER INTERFACE


The ECI module consists of several sub-modules: a motion capturing module, a motion detection module, a raw data transmission module and a wireless (Bluetooth) module. The motion capture module first captures the motion of the user's eyes and passes it to the eye motion detection module. The motion detection module applies specific detection schemes based on the position of the eyes, i.e. left, right or a side (angled) view of the eye.
After detecting the motion of the eye, the module sends the raw data to the wireless module attached at the end of the ECI module. In parallel, it checks the distance of the target object from the current location. Once the distance is measured, the start command is forwarded; and once the wheelchair nears the targeted location, a stop command is automatically sent to the PMDC unit. The wireless module used here is a Bluetooth module, since short-distance communication is sufficient for the ECI module [2].

Figure 2. Blocks in the ECI module: human eye motion capturing using a web camera, eye motion detection, raw data transmission and the Bluetooth module
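Listing 3 is a hedged sketch of the raw data transmission step: the detected eye position is mapped to a one-character command and written to a Bluetooth serial link using pyserial. The port name, baud rate and command encoding are assumptions, since the paper does not describe the wire protocol.

Listing 3 (illustrative sketch)

# Hedged sketch of the ECI raw data transmission over a Bluetooth serial link.
# The port name, baud rate and one-character command encoding are assumptions.
import serial  # pyserial

COMMANDS = {"LEFT": b"L", "RIGHT": b"R", "CENTER": b"F", "STOP": b"S"}

def send_command(port, gaze):
    """Map the detected gaze position to a command byte and transmit it."""
    code = COMMANDS.get(gaze)
    if code is not None:
        port.write(code)

if __name__ == "__main__":
    # "/dev/rfcomm0" is a typical Linux Bluetooth serial device; adjust as needed.
    with serial.Serial("/dev/rfcomm0", baudrate=9600, timeout=1) as bt:
        send_command(bt, "LEFT")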

5. DATA PROCESSING UNIT


The data processing unit has a Bluetooth receiver which receives the command sent by the Bluetooth transmitter of the ECI module. A level analysis platform checks the direction command received from the ECI module. Once the user starts targeting an object, the data processing unit receives the start command, and once the wheelchair nears the object, it automatically stops the wheelchair. The primary directions handled by this module are right or left and front or back. Once the direction is identified, the serial raw data is extracted and passed on to the final unit, the robotic unit.


Figure 3. Blocks in the data processing unit
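Listing 4 sketches one possible level-analysis loop for the data processing unit: it reads the one-character commands assumed in Listing 3, keeps a crude estimate of the distance travelled towards the target, and issues a stop once the target distance is reached. The port name, step size and helper names are illustrative assumptions.

Listing 4 (illustrative sketch)

# Hedged sketch of the data processing unit's receive-and-decide loop.
# The serial port, step size and distance handling are illustrative assumptions.
import serial  # pyserial

DIRECTIONS = {b"L": "LEFT", b"R": "RIGHT", b"F": "FRONT", b"B": "BACK", b"S": "STOP"}

def send_to_robotic_unit(direction):
    """Placeholder for the link to the robotic unit."""
    print("Robotic unit command:", direction)

def process(port, target_distance_m, step_m=0.05):
    """Forward direction commands until the target distance is covered."""
    travelled = 0.0
    while travelled < target_distance_m:
        raw = port.read(1)                  # one command byte from the ECI module
        direction = DIRECTIONS.get(raw)
        if direction is None:               # timeout or unrecognised byte
            continue
        if direction == "STOP":
            break
        if direction == "FRONT":
            travelled += step_m             # crude estimate of distance covered
        send_to_robotic_unit(direction)
    send_to_robotic_unit("STOP")            # target reached: stop the PMDC motors

if __name__ == "__main__":
    with serial.Serial("/dev/rfcomm0", baudrate=9600, timeout=1) as bt:
        process(bt, target_distance_m=2.0)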

6. ROBOTIC UNIT
The robotic module is connected to the wheels of the wheelchair. This module has a data reception unit, an ARM controller, a PMDC motor unit and a display. The data reception unit receives the processed data from the previous module and passes it to the controller unit. The controller unit drives the PMDC motors according to the instructions received from the data processing module. If the instruction received is left, the ARM controller drives PMDC motor 1 to turn the wheelchair to the left (and PMDC motor 2 in the case of a right turn).

Figure 4. Blocks in the robotic module


Once the user starts focusing on the target object, the robotic module receives the command to move forward, driving both PMDC motors. Once the targeted distance is reached, the stop command is issued to the PMDC unit.
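Listing 5 sketches the controller logic described in this section, written in Python for readability since the paper does not give the ARM firmware or motor-driver details. The set_motor helper and speed values are assumptions; the left/right motor assignment follows the description above (motor 1 for a left turn, motor 2 for a right turn).

Listing 5 (illustrative sketch)

# Hedged sketch of the robotic unit's command handling.
# The set_motor interface and speed values are assumptions; only the
# motor assignment (motor 1 = left turn, motor 2 = right turn) follows the text.

def set_motor(motor_id, speed):
    """Placeholder for the PMDC motor driver (e.g. a PWM output on the ARM)."""
    print("PMDC motor", motor_id, "speed ->", speed)

def handle_command(direction):
    """Map a direction command onto the two PMDC motors."""
    if direction == "LEFT":
        set_motor(1, 1.0)          # motor 1 turns the chair to the left
        set_motor(2, 0.0)
    elif direction == "RIGHT":
        set_motor(1, 0.0)
        set_motor(2, 1.0)          # motor 2 turns the chair to the right
    elif direction == "FRONT":
        set_motor(1, 1.0)          # both motors drive forward
        set_motor(2, 1.0)
    elif direction == "BACK":
        set_motor(1, -1.0)         # both motors reverse
        set_motor(2, -1.0)
    else:                          # STOP or unknown command
        set_motor(1, 0.0)
        set_motor(2, 0.0)

if __name__ == "__main__":
    for cmd in ("FRONT", "LEFT", "STOP"):
        handle_command(cmd)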

7. CONCLUSION
This paper proposes a modern wheelchair for physically challenged people, controlled by blue eye technology. The user wears the ECI module on the head; it captures the eye motion and passes it to the motion detection unit. Once the user targets an object in the environment, the wheelchair automatically moves towards that object without any further instructions. Six operations are performed: start/stop, left/right and front/back. Once the motion is acquired by the ECI module, the data processing unit passes the raw data to the robotic module, which in turn moves the wheelchair according to the request received from the data processing unit.

REFERENCES
[1] Danjie Zhu, Mikail Khunin and Theodore Raphan, Robust Hi-Speed Binocular 3D Eye Movement Tracking System Using a Two-Radii Eye Model, International Conference of the IEEE Engineering in Medicine and Biology Society, 2006.
[2] M. Selvaganapathy and N. Nishavithri, Smart Wheel Chair using Neuro Sky Sensor, International Journal of Advanced Research in Computer and Communication Engineering, 4(11), November 2015, pp. 361–366.
[3] Congcong Ma, Wenfeng Li, Raffaele Gravina and Giancarlo Fortino, Activity Recognition and Monitoring for Smart Wheelchair Users, IEEE 20th International Conference on Computer Supported Cooperative Work in Design (CSCWD), 2016, pp. 664–669.
[4] Kazuto Miyawaki and Daiki Takahashi, Investigation of Whole-Body Vibration of Passenger Sitting on Wheelchair and of Passenger Sitting on Wheelchair Loaded on Lifter, International Symposium on Micro-NanoMechatronics and Human Science (MHS), 2016, pp. 1–6.
[5] Nobuaki Kobayashi and Masahiro Nakagawa, BCI-Based Control of Electric Wheelchair, IEEE 4th Global Conference on Consumer Electronics (GCCE), 2015, pp. 429–430.
[6] M. Selvaganapathy and N. Nishavithri, Smart Wheel Chair using Neuro Sky Sensor, International Journal of Advanced Research in Computer and Communication Engineering, 4(11), November 2015, pp. 361–366.
[7] Oral Motion Controlled Intelligent Wheelchair, SICE Annual Conference, Japan, 2007.
[8] Non-Invasive, Wireless and Universal Interface for the Control of Peripheral Devices by Means of Head Movements, Proceedings of the 2007 IEEE 10th International Conference on Rehabilitation Robotics, June 12–15, 2007.
[9] Rafael Barea et al., Guidance of a Wheelchair Using Electrooculography, Electronics Department, University of Alcala, Spain.
[10] R. C. Simpson, Smart Wheelchairs: A Literature Review, Journal of Rehabilitation Research and Development, 2005, pp. 423–436.
[11] L. Fehr, W. Edwin Langbein and S. B. Skaar, Adequacy of Power Wheelchair Control Interfaces for Persons with Severe Disabilities: A Clinical Survey, Journal of Rehabilitation Research and Development, 37(3), 2000, pp. 353–360.
[12] S. Stoddard et al., Chartbook on Work and Disability in the United States, 1998, An InfoUse Report, U.S. National Institute on Disability and Rehabilitation Research, 1998.
[13] Parth Ranjan Singh, Prathishastry, Jagadish D. Kini, Farzadtaheri and T. G. Giri Kumar, Design and Development of a Data Glove for the Assistance of the Physically Challenged, International Journal of Electronics and Communication Engineering & Technology (IJECET), 4(4), 2013, pp. 36–41.
