
MagiTact: Interaction with Mobile Devices Based on Compass (Magnetic) Sensor


Hamed Ketabdar
Quality and Usability Lab, Deutsche Telekom Laboratories, TU Berlin
Ernst-Reuter-Platz 7, 10587 Berlin, Germany
hamed.ketabdar@telekom.de

Kamer Ali Yüksel
Technical University of Berlin
Ernst-Reuter-Platz 7, 10587 Berlin, Germany
kamer.yuksel@telekom.de

Mehran Roshandel
Deutsche Telekom Laboratories
Ernst-Reuter-Platz 7, 10587 Berlin, Germany
mehran.roshandel@telekom.de

ABSTRACT
In this work, we present a new technique for efficient use of the 3D space around a mobile device for interaction with the device. Around Device Interaction (ADI) enables extending the interaction space of small mobile and tangible devices beyond their physical boundary. Our proposed method is based on the compass (magnetic field) sensor integrated in mobile devices (e.g. iPhone 3GS, G1 Android). In this method, a properly shaped permanent magnet (e.g. in the shape of a rod, pen or ring) is used for interaction. The user makes coarse gestures in the 3D space around the device using the magnet. Movement of the magnet affects the magnetic field sensed by the compass sensor integrated in the device. The temporal pattern of the gesture is then used as a basis for sending different interaction commands to the mobile device. Zooming, turning pages, accepting/rejecting calls, clicking items, controlling a music player, and game interaction are some example use cases. The proposed method does not require changes to the hardware specification of the mobile device and, unlike optical methods, is not limited by occlusion problems.

Author Keywords
Around Device Interaction, Mobile Devices, Compass (Magnetic) Sensor, Magnet, Movement-Based Gestures.

ACM Classification Keywords
I.5.4 [Computing Methodologies]: Pattern Recognition, Applications – Signal processing.

General Terms
Algorithms

INTRODUCTION: AROUND DEVICE INTERACTION
Around Device Interaction (ADI) is being increasingly investigated as an efficient interaction technique for mobile and tangible devices. ADI provides the possibility of extending the interaction space of small mobile devices beyond their physical boundary, allowing effective use of the 3D space around the device for interaction. This can be especially useful for small tangible/wearable mobile or controller devices such as cell phones, wrist watches, headsets, etc. On these devices, it is extremely difficult to operate small buttons and touch screens. However, the space beyond the device can easily be used, no matter how small the device is. ADI can also be very useful for interaction when the device screen is not in the user's line of sight.

The ADI concept allows coarse movement-based gestures made in the 3D space around the device to be used for sending different interaction commands, such as turning pages (in an e-book or calendar), changing the sound volume, zooming, rotation, etc. ADI techniques are based on different sensory inputs such as a camera [1], infrared distance sensors [2], a touch screen at the back of the device [3], and electric field sensing [4].

In this work, we propose ADI based on the compass (magnetic) sensor integrated in some mobile devices, using a properly shaped magnetic material. The user takes the magnet, which can be in the shape of a rod, pen or ring, in hand and draws coarse gestures in the 3D space around the device (Fig. 1). These gestures can be interpreted as different interaction commands by the device. In the next section, we describe our approach in more detail.

Figure 1

Copyright is held by the author/owner(s).
IUI'10, February 7–10, 2010, Hong Kong, China.
ACM 978-1-60558-515-4/10/02.
OUR APPROACH: INTERACTION BASED ON MAGNETIC FIELD SENSOR
In this work, we demonstrate our initial investigations into using the compass (magnetic field) sensor integrated in mobile devices (e.g. iPhone 3GS, G1 Android) for ADI. In our method, we use a regular magnetic material in a proper shape to be taken in hand (e.g. a rod, pen or ring) to influence the compass (magnetic) sensor through different movement-based gestures, and hence interact with the device. We demonstrate the use of this interaction in different applications such as turning pages, zooming, reacting to a call alert, and music playback.

In the Introduction, we mentioned a few methods for ADI. Getting useful information from the magnetic sensor is algorithmically much simpler than implementing computer vision techniques. Our simple yet effective approach also does not suffer from illumination variation and occlusion problems. Our technique does not require changes to the mobile device or the addition of a large number of extra sensors. For mobile devices such as the iPhone and G1 Android, it is only necessary to have a properly shaped magnet as an extra accessory. In addition, considering that the back of a mobile device is usually covered by the hand, optical ADI techniques (e.g. camera- and infrared-based) can have difficulty using the space at the back of the device. However, since the interaction in our method is based on a magnetic field (which can pass through the hand), the space at the back of the device can be used efficiently for interaction.

GESTURE RECOGNITION BASED ON MAGNETIC FIELD
The gestures are created by moving the magnet (a rod or ring) by hand in the space around the device along different 3D trajectories. The gestures studied in this work are mainly based on movement of the magnet in front of or at the back of the device, in different directions and with different periodicities.
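To give a rough feel for how a hand-held magnet influences the sensor, the following sketch (added here for illustration; the magnet strength, trajectory, and ambient field values are assumed, not taken from our experiments) evaluates the standard point-dipole model of the magnet's field at the sensor position along a simple trajectory.

import math

MU0_OVER_4PI = 1e-7  # T*m/A, prefactor of the point-dipole field formula

def dipole_field(moment, magnet_pos, sensor_pos=(0.0, 0.0, 0.0)):
    # B = (mu0 / 4 pi) * (3 r_hat (m . r_hat) - m) / |r|^3, r from magnet to sensor
    r = [s - p for s, p in zip(sensor_pos, magnet_pos)]
    dist = math.sqrt(sum(c * c for c in r))
    r_hat = [c / dist for c in r]
    m_dot_r = sum(mi * ci for mi, ci in zip(moment, r_hat))
    return [MU0_OVER_4PI * (3.0 * r_hat[i] * m_dot_r - moment[i]) / dist ** 3
            for i in range(3)]

earth = [20e-6, 0.0, 45e-6]  # assumed ambient (earth) field in tesla
trajectory = [(-0.05 + 0.01 * t, 0.03, 0.02) for t in range(10)]  # magnet path in metres
for pos in trajectory:
    b = dipole_field((0.0, 0.0, 0.5), pos)  # assumed 0.5 A*m^2 magnet
    total = [e + bi for e, bi in zip(earth, b)]
    print([round(v * 1e6, 1) for v in total])  # field at the sensor in microtesla

As the magnet moves along the trajectory, its contribution quickly dominates or recedes relative to the ambient field, which is exactly the variation the compass sensor picks up.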
The compass (magnetic) sensor provides a measure of magnetic field strength along the x, y, and z directions. The values change over a range of -128 to 128. The compass sensor can be affected by the magnetic fields around the device, most importantly the earth's magnetic field. In order to decrease the influence of these factors on gesture recognition, we calculate the derivative of the signals over time. This is achieved by subtracting consecutive values, and it acts as a high-pass filter which lets only the evidence related to gestures appear at the output. The gestures are generated by rather fast movements of the magnet (resulting in fast changes in the magnetic field), so their evidence passes through the high-pass filter.
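As a concrete, illustrative sketch of this filtering step, the function below takes a buffer of raw three-axis readings and returns their per-sample differences; the example values are made up and the buffering details are our own assumption, not part of the original implementation.

def high_pass(samples):
    # Per-axis difference of consecutive (x, y, z) magnetometer samples.
    # Slowly varying components such as the earth's field are largely
    # cancelled, while fast changes caused by the moving magnet remain.
    return [tuple(c - p for c, p in zip(curr, prev))
            for prev, curr in zip(samples, samples[1:])]

# Hypothetical burst of raw readings in the -128..128 range.
raw = [(12, -3, 55), (12, -2, 54), (40, 10, 80), (90, 35, 120), (60, 20, 100)]
filtered = high_pass(raw)
print(filtered)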
The next processing step is feature extraction. The features we use are mainly based on the average strength of the magnetic field in different directions, the average piecewise correlation between the field strengths in different directions, and the zero-crossing rate (for different directions). All the features are extracted from the high-pass filtered signals. The beginning and end of each gesture are identified based on the magnitude of the signal crossing a threshold. The mentioned features are then extracted over the window marked by the beginning and end of the gesture.
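The sketch below illustrates this step on the high-pass filtered samples from the previous sketch. The threshold value is arbitrary, and the correlation is computed once over the whole window rather than piecewise, so it is a simplification of the feature set described above.

import math

def magnitude(sample):
    return math.sqrt(sum(c * c for c in sample))

def segment(filtered, threshold=10.0):
    # Gesture boundaries: first and last sample whose magnitude exceeds
    # the (illustrative) threshold.
    active = [i for i, s in enumerate(filtered) if magnitude(s) > threshold]
    return (active[0], active[-1]) if active else None

def mean(values):
    return sum(values) / len(values)

def correlation(a, b):
    ma, mb = mean(a), mean(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def zero_crossing_rate(values):
    crossings = sum(1 for p, c in zip(values, values[1:]) if p * c < 0)
    return crossings / max(1, len(values) - 1)

def extract_features(filtered):
    span = segment(filtered)
    if span is None:
        return None                                # no gesture in this buffer
    start, end = span
    axes = list(zip(*filtered[start:end + 1]))     # per-axis sequences x, y, z
    return {
        "mean_abs": [mean([abs(v) for v in axis]) for axis in axes],
        "corr_xy": correlation(axes[0], axes[1]),
        "corr_xz": correlation(axes[0], axes[2]),
        "corr_yz": correlation(axes[1], axes[2]),
        "zcr": [zero_crossing_rate(axis) for axis in axes],
    }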
Final classification of gestures is performed on the extracted features using a heuristically designed binary decision tree. The decision tree can run faster on the mobile device than some statistical machine learning algorithms. Correlation between different directions, magnitude in different directions, and zero-crossing rate are used as the basis for decision making and classification.
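The actual tree used in our demo is not reproduced here; the fragment below only sketches the idea of a hand-designed binary decision tree over the feature dictionary from the previous sketch, with invented thresholds and gesture labels.

def classify(f):
    # Hand-built decision tree; thresholds and labels are illustrative only.
    if f["zcr"][0] > 0.3 or f["zcr"][1] > 0.3:
        return "shake"            # many sign changes: back-and-forth movement
    if f["corr_xy"] > 0.5:
        return "diagonal_swipe"   # x and y change together
    if f["mean_abs"][0] > f["mean_abs"][1]:
        return "horizontal_swipe"
    return "vertical_swipe"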
Initial evaluation of the algorithm has shown convincing gesture recognition results for 6 gestures and 5 users, reaching an accuracy of over 90%. These are still initial results at the demo level; although actual interaction with our demo seems quite satisfactory, our ongoing studies show potential for obtaining even better classification results.

IMPLEMENTATION
We have implemented a demo application based on the presented method for the iPhone 3GS. The interaction is used to turn pages left-right or up-down in a photo view and document view application, as well as to zoom a map in and out. Zooming can also be performed using the space at the back of the device, so that the screen does not get occluded. The application also demonstrates rejecting/accepting a call using our interaction method. This functionality can be used even when the phone is in a bag or pocket, and it facilitates dealing with unexpected calls in inconvenient situations (e.g. in the office or in a meeting). In addition, the application demonstrates interacting with a music player to change the sound volume or the music track.
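The demo itself is an iPhone 3GS application; as a platform-neutral illustration of how recognized gestures could be routed to the commands above, a simple dispatch table might look like the following (the gesture labels and handler functions are hypothetical, not the demo's actual code).

def turn_page(direction): print("turn page:", direction)
def zoom(factor):         print("zoom by:", factor)
def answer_call(accept):  print("accept call" if accept else "reject call")

# Hypothetical mapping from recognized gesture labels to commands.
COMMANDS = {
    "horizontal_swipe": lambda: turn_page("next"),
    "vertical_swipe":   lambda: turn_page("previous"),
    "diagonal_swipe":   lambda: zoom(1.5),
    "shake":            lambda: answer_call(False),
}

def handle_gesture(label):
    action = COMMANDS.get(label)
    if action is not None:
        action()

handle_gesture("diagonal_swipe")  # -> "zoom by: 1.5"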
REFERENCES
[1] Starner, T., Auxier, J., Ashbrook, D., and Gandy, M. The Gesture Pendant: a self-illuminating, wearable, infrared computer vision system for home automation control and medical monitoring. In Proc. of the International Symposium on Wearable Computers, 2000, pp. 87-94.
[2] Hinckley, K., Pierce, J., Sinclair, M., and Horvitz, E. Sensing techniques for mobile interaction. In Proc. of UIST '00, ACM, 2000, pp. 91-100.
[3] Baudisch, P. and Chu, G. Back-of-device interaction allows creating very small touch devices. In Proc. of CHI '09, ACM, 2009.
[4] Theremin, L. S. The design of a musical instrument based on cathode relays. Reprinted in Leonardo Music J., No. 6, 1996, pp. 49-50.
