N. Nishavithri
Assistant Professor, Department of Electronics and Communication Engineering,
Mailam Engineering College, Mailam, Tamil Nadu, India
T. Manochandar
Assistant Professor, Department of Electronics and Communication Engineering,
CK College of Engineering and Technology, Cuddalore, Tamil Nadu, India
G. Manikannan
Assistant Professor, Department of Electronics and Communication Engineering,
CK College of Engineering and Technology, Cuddalore, Tamil Nadu, India
ABSTRACT
The main aim of this project is to design a modern vehicle (wheelchair) for physically
challenged people using blue eye technology. In earlier days, voice-operated wheelchairs were
used by physically disabled persons to move from one place to another; later, advancements such
as the brain-wave-controlled wheelchair took place. In the proposed system, the modern
technology called blue eye technology is used to control the wheelchair. The main concept behind
the proposed system is that the wheelchair senses the movement of the eye and directs the
wheelchair's movement accordingly. The proposed system consists of an eye tracking module, an
eye tracking computer interface, a processing unit and a wheelchair unit consisting of a PMDC
motor module. The outcome of this project is that the user can steer the wheelchair simply by
looking at the target object: the distance to that object is first calculated, processed by the
eye tracking computer interface, and the movement of the wheelchair is then carried out.
Key words: Blue eye technology, eye tracking computer module, data processing module,
PMDC control unit.
Cite this Article: M. Selvaganapathy, N. Nishavithri, T. Manochandar and G. Manikannan.
Modern Vehicle for the Physically Challenged People Using Blue Eye Technology. International
Journal of Mechanical Engineering and Technology, 8(1), 2017, pp. 208-212.
http://www.iaeme.com/IJMET/issues.asp?JType=IJMET&VType=8&IType=1
1. INTRODUCTION
Blue eye technology aims to create computational machines that have perceptual and sensory
abilities. This technology uses an eye tracking sensor to locate the target at any point. The term
"blue eye technology" can be split into "blue" and "eye": "blue" stands for Bluetooth, which
provides a powerful wireless technology, and "eye" denotes information/data acquisition through
the eyes. Blue eye technology is used here to build a prototype that can understand the
emotions of the user.
Here the prototype is a modern wheelchair for physically challenged people. The
eye tracking computer interface is an interface that tracks the movement of the eye and passes the
input information to the computer. The various technologies used for locating eye movements are the
emotion mouse, Manual and Gaze Input Cascaded (MAGIC) pointing, artificial intelligence speech
recognition, the Simple User Interest Tracker (SUITOR) and the eye movement sensor. The technology
adopted in this paper is the eye movement sensor.
2. METHODOLOGY
The proposed methodology has various modules: the eyeball tracking module, the eye tracking
computer interface, the data processing unit, the robotic unit and the PMDC motor unit.
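The data flow through these modules can be pictured as the small pipeline below. The stage names mirror the modules listed above, while the function names, data representation and return values are illustrative assumptions, not part of the paper's implementation.

```python
# Illustrative pipeline sketch for the proposed modules:
# eyeball tracking -> ECI -> data processing -> robotic (PMDC) unit.
# Stage contents are assumptions for illustration only.

def track_eye(frame: str) -> str:
    """Eyeball tracking module: extract raw eye motion from a camera frame."""
    return f"motion({frame})"

def eci_interface(motion: str) -> str:
    """Eye tracking computer interface: package the motion data for the computer."""
    return f"eci({motion})"

def process(data: str) -> str:
    """Data processing unit: turn ECI data into a movement command."""
    return "left" if "left" in data else "forward"

def robotic_unit(command: str) -> str:
    """Robotic unit: forward the command to the PMDC motor driver."""
    return f"drive:{command}"

# End-to-end: a frame showing a leftward glance becomes a drive command.
print(robotic_unit(process(eci_interface(track_eye("left-glance")))))  # drive:left
```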
Figure 1 Positioning of eye movement tracking (center, left and right)
Eye movement tracking is used to determine the motion of the eyeballs. Based on the motion of
the eye, commands are given to the vehicle. The eye tracking module consists of a web camera
mounted on the ECI module, which is worn on the user's head. Once the module is worn, it is
activated and immediately starts sensing the user's eye motion.
The camera attached to the eye movement tracking module has additional features such as inbuilt
lighting, inbuilt memory, etc. The eye tracking module is interfaced with the computer by means of
the eye tracking computer interface (ECI) module, which contains the driver for the eye motion
tracking sensor. The ECI interface performs various functions such as motion capturing, feature
extraction, color detection of a pixel, etc. [1]
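As a rough illustration of the feature-extraction step, the sketch below classifies gaze direction from the pupil's horizontal position inside a detected eye region. The `EyeRegion` structure, the threshold value and the pixel coordinates are assumed for illustration; the paper does not specify them.

```python
# Illustrative sketch: classify gaze direction (left / center / right)
# from the pupil centre's position inside an eye bounding box.
# The margin threshold is an assumed value, not taken from the paper.

from dataclasses import dataclass

@dataclass
class EyeRegion:
    x: int        # left edge of the eye bounding box (pixels)
    width: int    # width of the eye bounding box (pixels)
    pupil_x: int  # detected pupil centre (pixels, image coordinates)

def gaze_direction(eye: EyeRegion, margin: float = 0.35) -> str:
    """Return 'left', 'center' or 'right' from the relative pupil position."""
    rel = (eye.pupil_x - eye.x) / eye.width  # 0.0 = far left, 1.0 = far right
    if rel < margin:
        return "left"
    if rel > 1.0 - margin:
        return "right"
    return "center"

# Example: pupil near the left edge of a 60-pixel-wide eye region
print(gaze_direction(EyeRegion(x=100, width=60, pupil_x=110)))  # left
```

In a real system the bounding box and pupil centre would come from the web camera's frames (for example via a standard face/eye detector), but the classification rule itself stays this simple.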
[Block diagram: human motion capturing using web camera → eye motion detection]
6. ROBOTIC UNIT
The robotic module is connected to the wheels of the wheelchair. This module has a data reception
unit, an ARM controller, a PMDC motor unit and a display. The data reception unit receives the
processed data from the previous module and passes it to the controller unit. The controller unit
controls the PMDC motors as per the instructions received from the data processing module. If the
instruction received from the data processing module is "left", the ARM controller drives PMDC
motor 1 so that the chair turns to the left (and PMDC motor 2 in the case of a right turn).
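The steering behaviour described above can be sketched as a simple command-to-motor mapping for a two-motor differential drive. The speed values, the sign convention and the function name are hypothetical; the paper does not give the ARM firmware's actual speeds or pin assignments.

```python
# Illustrative sketch: map movement commands to (motor1, motor2) speeds
# (percent duty cycle) for a two-PMDC-motor differential drive.
# All numeric values are assumptions, not from the paper.

def motor_speeds(command: str) -> tuple[int, int]:
    """Return (motor1, motor2) speed in percent for a movement command."""
    table = {
        "stop":     (0, 0),
        "forward":  (70, 70),
        "backward": (-70, -70),   # negative = reverse rotation
        "left":     (0, 70),      # hold motor 1, drive motor 2 -> turn left
        "right":    (70, 0),      # drive motor 1, hold motor 2 -> turn right
    }
    if command not in table:
        raise ValueError(f"unknown command: {command}")
    return table[command]

print(motor_speeds("left"))  # (0, 70)
```

An embedded implementation on the ARM controller would replace the returned tuple with PWM register writes, but the lookup-table structure carries over directly.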
7. CONCLUSION
This paper proposes a modern wheelchair for physically challenged people controlled by blue eye
technology. The user wears the ECI module on the head; it captures the eye motion and passes it
to the motion detection unit. Once the user targets an object present in the environment, the
wheelchair automatically moves towards that object without any further instructions. Six
operations are performed: start/stop, left/right and forward/backward. Once the motion is
acquired from the ECI module, the data processing unit passes the processed data to the robotic
module, which in turn moves the wheelchair according to the request received from the data
processing unit.
REFERENCES
[1] Danjie Zhu, Mikail Khunin and Theodore Raphan, Robust Hi-Speed Binocular 3D Eye Movement
Tracking System Using Two Radii Eye Model, International Conference of the IEEE Engineering in
Medicine and Biology Society, 2006.
[2] M. Selvaganapathy and N. Nishavithri, Smart Wheel Chair using Neuro Sky Sensor,
International Journal of Advanced Research in Computer and Communication Engineering, Vol. 4,
Issue 11, November 2015, pp. 361-366.
[3] Congcong Ma, Wenfeng Li, Raffaele Gravina and Giancarlo Fortino, Activity Recognition and
Monitoring for Smart Wheelchair Users, IEEE 20th International Conference on Computer
Supported Cooperative Work in Design (CSCWD), 2016, pp. 664-669.
[4] Kazuto Miyawaki and Daiki Takahashi, Investigation of Whole-Body Vibration of Passenger
Sitting on Wheelchair and of Passenger Sitting on Wheelchair Loaded on Lifter, International
Symposium on Micro-NanoMechatronics and Human Science (MHS), 2016, pp. 1-6.
[5] Nobuaki Kobayashi and Masahiro Nakagawa, BCI-Based Control of Electric Wheelchair, IEEE
4th Global Conference on Consumer Electronics (GCCE), 2015, pp. 429-430.
[6] M. Selvaganapathy and N. Nishavithri, Smart Wheel Chair using Neuro Sky Sensor,
International Journal of Advanced Research in Computer and Communication Engineering,
Vol. 4, Issue 11, November 2015, pp. 361-366.
[7] Oral Motion Controlled Intelligent Wheelchair, SICE Annual Conference 2007, Japan.
[8] Non-Invasive, Wireless and Universal Interface for the Control of Peripheral Devices by
Means of Head Movements, Proceedings of the 2007 IEEE 10th International Conference on
Rehabilitation Robotics, June 12-15.
[9] Rafael Barea et al., Guidance of a Wheelchair using Electrooculography, Electronics
Department, University of Alcala, Spain.
[10] R. C. Simpson, Smart Wheelchairs: A Literature Review, Journal of Rehabilitation Research
and Development, 2005, pp. 423-436.
[11] L. Fehr, W. Edwin Langbein and S. B. Skaar, Adequacy of Power Wheelchair Control
Interfaces for Persons with Severe Disabilities: A Clinical Survey, Journal of Rehabilitation
Research and Development, Vol. 37, No. 3, 2000, pp. 353-360.
[12] S. Stoddard et al., Chartbook on Work and Disability in the United States, 1998, An InfoUse
Report, U.S. National Institute on Disability and Rehabilitation Research, 1998.
[13] Parth Ranjan Singh, Prathishastry, Jagadish D. Kini, Farzadtaheri and T. G. Giri Kumar, Design
and Development of a Data Glove for the Assistance of the Physically Challenged,
International Journal of Electronics and Communication Engineering & Technology (IJECET),
4(4), 2013, pp. 36-41.