
of the implementation of the system, rather than on weaknesses in the algorithms. For example, side-channel attacks may use timing, power consumption, or electromagnetic measurements on the security device. Side-channel attacks are primarily of concern for biometric encryption systems and match-on-card devices, where an attack could potentially be mounted by iteratively improving the presented biometric. Very little research has been done to explore the feasibility of side-channel attacks, but the success of attacks on biometric template security and biometric encryption suggests that such attacks are certainly feasible.
Security and Liveness, Overview
Signal to Noise Ratio
Information is transmitted or recorded by variations in a physical quantity. For any information storage or transmission system, there will be intended variations in the physical quantity (signal) and unintended variations (noise). In an analog telephone system, the signal (voice) is represented by variation in a voltage level. As the signal is transmitted along a phone line, it can pick up other unintended variations (e.g., leakage of other signals, or static from electrical storms) that are noise. The ratio of the signal level to the noise level is the signal to noise ratio (SNR). Since it is a ratio, SNR is dimensionless. However, SNR can be expressed as either an amplitude ratio (a voltage ratio for the phone example) or a power ratio (a ratio of milliwatts for the phone example), which can lead to confusion. SNR is frequently expressed as the log (base 10) of the ratio. When expressed as a log, the dimensionless unit of SNR is the decibel (dB).
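As a worked illustration (not part of the original entry), the decibel forms of the power and amplitude ratios are related through the square law between power and amplitude:

$$\mathrm{SNR}_{\mathrm{dB}} = 10 \log_{10}\!\left(\frac{P_{\text{signal}}}{P_{\text{noise}}}\right) = 20 \log_{10}\!\left(\frac{A_{\text{signal}}}{A_{\text{noise}}}\right)$$

so, for the phone example, a voltage ratio of 10 and the corresponding power ratio of 100 both express the same 20 dB.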
What is signal and what is noise can depend on the circumstances. Radio waves from lightning are noise to an AM radio broadcast, but can be signal in a meteorological experiment.
Iris Device
Signature Benchmark
Signature Databases and Evaluation
Signature Characteristics
Signature Features
Signature Corpora
Signature Databases and Evaluation
Signature Databases and Evaluation
MARCOS MARTINEZ-DIAZ, JULIAN FIERREZ
Biometric Recognition Group - ATVS,
Escuela Politecnica Superior, Universidad
Autonoma de Madrid, Campus de Cantoblanco,
Madrid, Spain
Synonyms
Signature benchmark; Signature corpora; Signature
data set
Definition
Signature databases are structured sets of collected
signatures from a group of individuals that are used
either for evaluation of recognition algorithms or as
part of an operational system.
Signature databases for evaluation purposes are, in
general, collections of signatures acquired using a
digitizing device such as a pen tablet or a touch-
screen. Publicly available databases allow a fair
performance comparison of signature recognition
algorithms proposed by independent entities. More-
over, signature databases play a central role in public
performance evaluations, which compare different
recognition algorithms by using a common experi-
mental framework. This type of database is covered in this entry.
On the other hand, signature databases can also be a module of a verification or identification system.
They store signature data and other personal informa-
tion of the enrolled users. This signature database is
used during the operation of the recognition system
to retrieve the enrolled data needed to perform the
biometric matching. This kind of database is not addressed here.
Dynamic Signature Databases
Until the beginning of this century, research on automatic signature verification had been carried out using privately collected databases, since no public ones were available. This limited the possibility of comparing the performance of different systems presented in the literature, which may have been tuned to specific capture conditions. Additionally, the usage of small data sets reduces the statistical relevance of experiments. The lack of publicly available databases has also been motivated by privacy and legal issues, although the data in these databases are detached from any personal information. The impact of the structural differences in signatures among cultures must also be taken into account when considering experimental results on a specific database. As an example, in Europe, signatures are usually formed by fast writing followed by a flourish, while in North America, they usually correspond to the signer's name with no flourish. On the other hand, signatures in Asia are commonly formed by Asian characters, which are composed of a larger number of short strokes compared with European or North American signatures.
While some authors have made public the databases used for their experimental results [1], most current dynamic signature databases are collected by the joint effort of different research institutions. These databases are, in general, freely available or can be obtained at a reduced cost. Many signature databases are part of larger multimodal biometric databases, which include other traits such as fingerprint or face data. This is done for two main reasons: the research interest in multimodal algorithms and the low effort required to incorporate the collection of other biometric traits once a database acquisition campaign has been organized.
Two main modalities in signature recognition exist. Off-line systems use signature images that have been previously captured with a scanner or camera. On the other hand, on-line systems employ digitized signals from the signature dynamics, such as the pen position or pressure. These signals must be captured with specific devices such as pen tablets or touch-screens. The most popular databases in the biometric research community are oriented to on-line verification, although in some of them the scanned signature images are also available [2, 3]. Some efforts have been carried out in the handwriting community to collect large off-line signature databases, such as the GPDS-960 Corpus [4].
Unlike other biometric traits, signatures can be forged with relative ease. Signature verification systems must not only discriminate traits from different subjects (as with fingerprints) but also discriminate between genuine signatures and forgeries. In general, signature databases provide a number of forgeries for the signatures of each user. The accuracy of the forgeries depends on the acquisition protocol, the skill of the forgers, and how much time the forgers are given to practice. Nevertheless, forgeries in signature databases are usually performed by subjects with no prior experience in forging signatures, which limits the quality of the forgeries.
Most on-line signature databases have been captured with digitizing tablets. These tablets are, in general, based on an electromagnetic principle, allowing the detection of the pen position (x, y), inclination angles (θ, γ) (azimuth, altitude), and pressure p. They allow recording the pen dynamics even when the pen is not in contact with the signing surface (i.e., during pen-ups). On the other hand, databases captured with other devices such as touch-screens (e.g., PDAs) provide only pen position information, which is recorded exclusively when the pen is in contact with the device surface.
In the following, a brief description of the most
relevant available on-line signature databases is given
in chronological order.
PHILIPS Database
Signatures from 51 users were captured using a Philips
Advanced Interactive Display (PAID) digitizing tablet
at a sampling rate of 200 Hz [5]. The following signals
were captured: position coordinates, pressure, azi-
muth, and altitude.
Each user contributed 30 genuine signatures, leading to 1,530 genuine signatures. Three types of forgeries are present in the database: 1,470 over-the-shoulder forgeries, 1,530 home-improved forgeries, and 240 professional forgeries. There is not a fixed number of forgeries available for each user. Over-the-shoulder forgeries were produced by letting the forger observe the signing process. Home-improved forgeries were produced by giving the forgers samples of the signature static image and letting them practice at home. Professional forgeries were performed by forensic document examiners.
MCYT Bimodal Database
The MCYT bimodal database comprises signatures and fingerprints from 330 individuals [2]. Signatures were acquired using a Wacom Intuos A6 tablet with a sampling frequency of 100 Hz. The users signed repeatedly on a paper with a printed grid placed over the pen tablet. The following time sequences were captured: position coordinates, pressure, azimuth, and altitude.
There are 25 genuine signatures and 25 forgeries
per user, leading to 16,500 signatures in the database.
For each user, signatures were captured in groups of 5: first 5 genuine signatures, then 5 forgeries of another user, repeating this sequence until 25 signatures of each type had been produced. Each user provided 5 forgeries for the 5 previous users in the database. As the user is forced to concentrate on different tasks between each group of genuine signatures, the variability between groups is expected to be higher than the variability within the same group.
Genuine signatures and forgeries corresponding
to 75 users from the MCYT database were scanned
and are also available as an off-line signature database.
This signature corpus is one of the most popular for the evaluation of signature verification algorithms and is being used by more than 50 research groups worldwide.
BIOMET Multimodal Database
The BIOMET multimodal database [6] comprises five modalities: audio (voice), face, hand, fingerprint, and signature. The signatures were captured using a Wacom Intuos2 A6 pen tablet and an ink pen with a sampling rate of 100 Hz. The pen coordinates, pen-pressure, azimuth, and altitude signals were captured. The database contains data from 84 users, with 15 genuine signatures and up to 12 forgeries per user. Signatures were captured in two sessions separated by 3–5 months. In the first session, 5 genuine signatures and 6 forgeries were acquired. The remaining 10 genuine signatures and 6 forgeries were captured in the second session. Forgeries were performed by 4 different users (3 forgeries each). This database contains 2,201 signatures, since not all users have complete data: 8 genuine signatures and 54 forgeries are missing.
SVC2004 Database
Two signature databases were released prior to the Signature Verification Competition (SVC) 2004 [7] for algorithm development and tuning. They were captured using a Wacom Intuos digitizing tablet and a Grip Pen. Due to privacy issues, users were advised to use invented signatures as genuine ones. Nevertheless, users were asked to thoroughly practice their invented signatures to reach a reasonable level of spatial and temporal consistency.
The two databases differ in the available data and correspond to the two tasks defined in the competition. One contains only pen position information, while the other also provides pressure and pen orientation (azimuth and altitude) signals. Each database contains 40 users, with 20 genuine signatures and 20 forgeries per user acquired in two sessions, leading to 1,600 signatures per database. Forgeries for each user were produced by at least four other users, aided by a visual tool that represented the signature dynamics on a display. Both Western and Asian signatures are present in the databases.
SUSIG Database
The SUSIG database consists of two sets: one captured using a pen tablet without visual feedback (Blind subcorpus) and the other using a pen tablet with an LCD display (Visual subcorpus) [8]. There are 100 users per subcorpus, but these do not coincide, as the Visual subcorpus was captured 4 years after the Blind one. For the Blind subcorpus, a WACOM Graphire2 pen tablet was used. The Visual subcorpus was acquired using an Interlink Electronics ePad-ink tablet, with a pressure-sensitive LCD. In both subcorpora, the pen coordinates and the pen pressure signals were captured using a sampling frequency of 100 Hz. While performing forgeries, users had prior visual input of the signing process on a separate screen or on the LCD display for the Blind and Visual subcorpora, respectively.
For the Blind subcorpus, 8 or 10 genuine signatures were captured in a single session. The users also provided 10 forgeries of another randomly selected user's signature. Two sessions were performed for the Visual subcorpus. During each one, users provided 10 genuine signatures and 5 forgeries.
MyIDea Multimodal Database
This signature set is a subset of the MyIDea Multimodal Biometric Database [9]. A Wacom Intuos2 A4 graphic tablet was used at a sampling rate of 100 Hz. Pen position, pressure, azimuth, and altitude signals were captured. This data set has the particularity that users must read aloud what they are writing, allowing what the authors call CHASM (Combined Handwriting and Speech Modalities). This corpus consists of ca. 70 users. Signatures were captured in 3 sessions. During each session, each user performed 6 genuine signatures and 6 forgeries, with visual access to the images of the target signatures.
BiosecurID Multimodal Database
This database was collected by 6 different Spanish research institutions [3]. It includes the following biometric traits: speech, iris, face, signature, handwriting, fingerprints, hand, and keystroke. The data were captured in 4 sessions distributed over a 4-month time span. The user population was specifically selected to contain a uniform distribution of users across different ages and genders. Nonbiometric data were also stored, such as age, gender, handedness, vision aids, and manual worker status (whether the user has eroded fingerprints). This allows studying specific demographic groups.
The signature pen-position, pressure, azimuth, and altitude signals were acquired using a Wacom Intuos3 A4 digitizer at 100 Hz. During each session, two signatures were captured at the beginning and two at the end, leading to 16 genuine signatures per user. Each user performed one forgery per session of signatures from three other users in the database. The skill level of the forgeries is increased by incrementally showing the forger more information about the target signature. In the first session, forgers have only visual access to one genuine signature; more data (i.e., signature dynamics) are shown in further sessions and forgers are given more time to practice. Off-line signature data are also available, since signatures were captured using an inking pen.
BioSecure Multimodal Database
The BioSecure Multimodal Database was collected by 11 European institutions under the BioSecure Network of Excellence [10]. It has three data sets captured in different scenarios: DS1 was captured remotely over the Internet, DS2 was acquired in a desktop environment, and DS3 under mobile conditions. The database covers face, fingerprint, hand, iris, signature, and speech modalities and includes two signature subcorpora corresponding to the DS2 and DS3 data sets. These two data sets were produced by the same group of 667 users. The DS2 data set was captured using a Wacom Intuos3 A6 digitizer at 100 Hz and an ink pen while the user was sitting. On the other hand, the DS3 data set was captured with a PDA. Users were asked to sign while standing and holding the PDA in one hand, emulating realistic operating conditions. An HP iPAQ hx2790 with a sampling frequency of 100 Hz was used as the capture device. The pen position, pressure, azimuth, and altitude signals are available in DS2, while only the pen position is available in DS3 due to the nature of the PDA touch-screen.
Signatures were captured in two sessions and in blocks of 5. An average of two months was left between sessions. During each session, users were asked to perform 3 sets of 5 genuine signatures, with 5 forgeries between each set. Following this protocol, each user performed 5 forgeries for the previous 4 users in the database. Thus, 30 genuine signatures and 20 forgeries are available for each user. Forgeries are collected in a worst-case scenario. For DS2, the users had visual access on a computer screen to the dynamics of the signing process of the signatures they had to forge. In DS3, each forger had access to the dynamics of the genuine signature on the PDA screen and a tracker tool allowing them to see the original strokes. Some users were even allowed to sign following the strokes produced by the tracker tool, thus reproducing the geometry and dynamics of the forged signature with high accuracy.
The DS3 data set is the first multisession database captured on a PDA and represents a very challenging database [11]. Apart from the high accuracy of the forgeries, signatures from DS3 present sampling errors and irregular sampling rates. Moreover, pen position signals during pen-ups are not available, since the acquisition device captures the pen dynamics only when the PDA stylus is in contact with the touch-screen surface.
The capture process for both DS2 and DS3 is shown
in Fig. 1. Examples of signatures from the BioSecure
Signature subcorpora corresponding to DS2 and DS3
are presented in Fig. 2. Unconnected samples indicate that at least one sample between them is missing due to sampling errors.
In Table 1, the main features of the described sig-
nature databases are presented.
Signature Verification Evaluation
Campaigns
Despite the usage of a common database, one of the main difficulties when comparing the performance of different biometric systems is the different experimental conditions under which each system is evaluated by its designers. To overcome these difficulties, evaluations and competitions provide a common reference for system comparison on the same database and protocol. Public evaluations in the field of automatic signature verification are less common than for other biometric modalities such as fingerprint or speech. In particular, only evaluations for the on-line signature verification modality have been proposed. These include the Signature Verification Competition (SVC), which took place in 2004 [7], the signature modality of the BioSecure Multimodal Evaluation Campaign held in 2007 [12], and the BioSecure Signature Evaluation Campaign in 2009 [13].
Signature Verification Competition
(SVC 2004)
The Signature Verification Competition (SVC 2004) represents the first public evaluation campaign in the field of signature verification [7]. The competition was divided into two tasks, depending on the available signature signals. In Task 1, only the pen position signals (x, y) and the sample timestamps were available. In Task 2, the pen pressure p and the azimuth and altitude angles (θ, γ) were also available. Participants had prior access to a signature data set for each task. These data sets were later released for public access and are referred to as the SVC2004 database. Signatures from 40 users are present in each data set. This evaluation has the particularity that users were advised to use invented signatures because of privacy issues. Moreover, they did not have visual feedback of the signing process, since signatures were captured with a digitizing tablet and a special pen.
The evaluation results were first released to participants, who then had the choice to remain anonymous. The best Equal Error Rate (EER) in Task 1 was 2.84% against skilled forgeries and 1.85% against random forgeries. In Task 2 (which included pressure and pen-inclination signals), the lowest EERs were 2.89% against skilled forgeries and 1.70% against random forgeries.
Signature Databases and Evaluation. Figure 1 PDA signature capture process in the BioSecure DS3 - Mobile Scenario data set (left) and pen-tablet capture process in the BioSecure DS2 - Access Control Scenario data set (right). The acquisition setup and paper template used in DS2 are similar to those used in MCYT, BIOMET, MyIDea, and BiosecurID.
BioSecure Multimodal Evaluation
Campaign (BMEC 2007)
The BioSecure Multimodal Evaluation Campaign (BMEC) was held in 2007 with the aim of comparing the performance of verification systems from different research groups on individual biometric modalities and fusion scenarios [14]. Two scenarios were considered: access control and mobile conditions. In particular, the Mobile Scenario consisted of four modalities and fusion, using a subset of the BioSecure Multimodal Database DS3 captured under mobile conditions (i.e., using portable devices such as a PDA).
Signature Databases and Evaluation. Figure 2 Examples of signatures and associated signals from the BioSecure
Multimodal Database DS2 and DS3 signature subcorpora captured using a pen tablet (top) and a PDA (bottom),
respectively. As can be seen, there are missing samples for the signature captured with PDA, and no signals are available
during pen-ups, contrary to the pen-tablet case.
In this evaluation, a signature subset from the BioSecure Multimodal DS3 database was used. A set of signatures from 50 users was previously released to participants for algorithm development and tuning. For each user, 20 genuine signatures (15 from the first session and 5 from the second) as well as 20 forgeries were available.
Eleven signature verification systems from seven independent European research institutions took part in the evaluation. The results of the evaluation and a description of each participating system can be found in [12]. Another evaluation study under similar conditions, including a comparative analysis with respect to the BMEC participants, can be found in [11]. The best Equal Error Rate (EER) in the evaluation was 4.03% for random forgeries and 13.43% for skilled forgeries. The relatively high EER for skilled forgeries reveals the high quality of the forgeries acquired in this database.
BioSecure Signature Evaluation Campaign
(BSEC 2009)
The BioSecure Signature Evaluation Campaign is aimed at measuring the impact of mobile acquisition conditions, time variability, and the information content of signatures on the performance of verification algorithms [13]. Signature subsets from the BioSecure Multimodal Database DS2 (pen tablet) and DS3 (PDA touch-screen) corresponding to 50 users have been released to participants prior to the evaluation. At the time of publication, the results of the evaluation campaign were not yet available.
Related Entries
Biometric Sample Acquisition
Off-line Signature Verification
Performance Evaluation, Overview
Signature Recognition
References
1. Munich, M.E., Perona, P.: Visual identification by signature tracking. IEEE Trans. Pattern Anal. Mach. Intell. 25(2), 200–217 (2003)
2. Ortega-Garcia, J., Fierrez-Aguilar, J., et al.: MCYT baseline corpus: a bimodal biometric database. IEE Proc. Vis. Image Signal Process. 150(6), 391–401 (2003)
3. Fierrez, J., Galbally, J., Ortega-Garcia, J., Freire, M.R., Alonso-
Fernandez, F., Ramos, D., Toledano, D.T., Gonzalez-Rodriguez,
J., Siguenza, J.A., Garrido-Salas, J., Anguiano-Rey, E., de Rivera,
G.G., Ribalda, R., Faundez-Zanuy, M., Ortega, J.A., Cardenoso-
Payo, V., Viloria, A., Vivaracho, C.E., Moro, Q.I., Igarza, J.J.,
Sanchez, J., Hernaez, I., Orrite-Urunuela, C., Martinez-Contreras,
F., Gracia-Roche, J.J.: Biosecurid: A multimodal biometric data-
base. Pattern Analysis & Applications (to appear) (2009)
Signature Databases and Evaluation. Table 1 Summary of the most popular on-line signature databases. The symbols x, y, p, θ, γ denote the pen position horizontal coordinate, vertical coordinate, pen pressure, azimuth, and altitude, respectively.

Name | Device | Users | Sessions | Genuine signatures per user | Forgeries per user | Signals | Interval between sessions
PHILIPS | Pen tablet | 51 | 3–5 | 30 | up to 70 | x, y, p, θ, γ | 1 week approx.
BIOMET | Pen tablet | 84 | 3 | 15 | up to 12 | x, y, p, θ, γ | 3–5 months
MCYT | Pen tablet | 330 | 1 | 25 | 25 | x, y, p, θ, γ | -
SVC2004 Task 1 | Pen tablet | 40 | 2 | 20 | 20 | x, y | min. 1 week
SVC2004 Task 2 | Pen tablet | 40 | 2 | 20 | 20 | x, y, p, θ, γ | min. 1 week
SUSIG Blind Subcorpus | Pen tablet | 100 | 1 | 8 or 10 | 10 | x, y, p | -
SUSIG Visual Subcorpus | Pen tablet | 100 | 2 | 20 | 10 | x, y, p | 1 week approx.
MyIDea | Pen tablet | ca. 100 | 3 | 18 | 18 | x, y, p, θ, γ | days to months
BioSecurID | Pen tablet | 400 | 4 | 16 | 16 | x, y, p, θ, γ | 1 month approx.
BioSecure DS2 | Pen tablet | ca. 650 | 2 | 30 | 20 | x, y, p, θ, γ | 1 month approx.
BioSecure DS3 | PDA | ca. 650 | 2 | 30 | 20 | x, y | 1 month approx.
4. Vargas, J., Ferrer, M., Travieso, C., Alonso, J.: Off-line handwritten signature GPDS-960 corpus. In: Proceedings of Ninth International Conference on Document Analysis and Recognition, ICDAR, vol. 2, pp. 764–768. Curitiba, Brazil (2007)
5. Dolfing, J.G.A., Aarts, E.H.L., van Oosterhout, J.J.G.M.: On-line signature verification with Hidden Markov Models. In: Proceedings of the International Conference on Pattern Recognition, ICPR, pp. 1309–1312. IEEE CS Press. Brisbane, Australia (1998)
6. Garcia-Salicetti, S., Beumier, C., Chollet, G., Dorizzi, B., Jardins, J.L.L., Lanter, J., Ni, Y., Petrovska-Delacretaz, D.: BIOMET: A multimodal person authentication database including face, voice, fingerprint, hand and signature modalities. In: Proceedings of IAPR International Conference on Audio- and Video-based Person Authentication, AVBPA, pp. 845–853. Springer LNCS-2688. Brisbane, Australia (2003)
7. Yeung, D.Y., Chang, H., Xiong, Y., George, S., Kashi, R., Matsumoto, T., Rigoll, G.: SVC2004: First international signature verification competition. In: Proceedings of International Conference on Biometric Authentication, ICBA, pp. 16–22. Springer LNCS-3072 (2004)
8. Kholmatov, A., Yanikoglu, B.: SUSIG: an on-line signature database, associated protocols and benchmark results. Pattern Analysis & Applications (2008)
9. Dumas, B., Pugin, C., Hennebert, J., Petrovska-Delacretaz, D., Humm, A., Evequoz, F., Ingold, R., Rotz, D.V.: MyIDea - multimodal biometrics database, description of acquisition protocols. In: Proceedings of Third COST 275 Workshop (COST 275), pp. 59–62. Hatfield, UK (2005)
10. Association BioSecure: BioSecure multimodal database. (http://www.biosecure.info) (2007). Last Accessed 03 March, 2009
11. Martinez-Diaz, M., Fierrez, J., Galbally, J., Ortega-Garcia, J.: Towards mobile authentication using dynamic signature verification: useful features and performance evaluation. In: Proc. Intl. Conf. on Pattern Recognition, ICPR, pp. 1–6 (2008)
12. TELECOM & Management SudParis: BioSecure Multimodal Evaluation Campaign 2007 Mobile Scenario - experimental results. Tech. rep. (2007). (http://biometrics.it-sudparis.eu/BMEC2007/files/Results_mobile.pdf). Last Accessed 03 March, 2009
13. TELECOM & Management SudParis: BioSecure Signature Evaluation Campaign, BSEC 2009. http://biometrics.it-sudparis.eu/BSEC2009
14. Alonso-Fernandez, F., Fierrez, J., Ramos, D., Ortega-Garcia, J.: Dealing with sensor interoperability in multi-biometrics: the UPM experience at the BioSecure Multimodal Evaluation 2007. In: Defense and Security Symposium, Biometric Technologies for Human Identification, BTHI, Proc. SPIE, vol. 6944. Orlando, USA (2008)
Signature Dataset
Signature Databases and Evaluation
Signature Features
MARCOS MARTINEZ-DIAZ¹, JULIAN FIERREZ¹, SEIICHIRO HANGAI²
¹Biometric Recognition Group - ATVS, Escuela Politecnica Superior, Universidad Autonoma de Madrid, Campus de Cantoblanco, Madrid, Spain
²Department of Electrical Engineering, Tokyo University of Science, Japan
Synonyms
Signature characteristics
Definition
Signature features represent magnitudes that are extracted from digitized signature samples, with the aim of describing each signature as a vector of values. The extraction and selection of optimum signature features is a crucial step when designing a verification system. Features must allow each signature to be described in a way that maximizes the discriminative power between signatures produced by different users while allowing for the variability among signatures from the same user.
On-line signature features can be divided into two main types. Global features model the signature as a holistic multidimensional vector and represent magnitudes such as average speed, total duration, and aspect ratio. On the other hand, local features are time functions derived from the signals captured with digitizer tablets or touch-screens, such as the pen-position coordinate sequence or pressure signals.
In off-line signature verification systems, features are extracted from a static signature image. They can also be classified as global, if they consider the image as a whole (e.g., image histogram, signature aspect ratio), or local, if they are obtained from smaller image regions (e.g., local orientation histograms).
This entry is focused on on-line signature features, although a brief outline of off-line signature features is also given.
Introduction
Several approaches to extract discriminative information from on-line signature data have been proposed in the literature [1]. The existing systems can be broadly divided into two main types: global systems, in which a holistic vector representation consisting of a set of global features (e.g., signature duration, direction after first pen-up) is derived from the signature trajectories [2, 3], and function-based systems, in which time sequences describing local properties of the signature (e.g., position, acceleration) are used for recognition [4, 5]. Although recent works show that global approaches are competitive with respect to local methods in some circumstances [6], the latter approach has traditionally yielded better results. Despite this advantage, systems based on local features usually employ matching algorithms that are computationally more expensive than those based on global features.
Due to the usually low amount of training data in signature verification, feature selection techniques must be applied in order to reduce the feature vector dimensionality. These techniques allow finding the optimal feature set for each system or scenario [7].
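As a purely illustrative sketch (not part of the original entry), the following Python fragment shows how a wrapper-style forward feature selection of the kind referred to above could be run over a matrix of global feature vectors; the synthetic data, array names, and the choice of a nearest-neighbour classifier are assumptions made for the example.

import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.neighbors import KNeighborsClassifier

# X: one row per signature, one column per global feature (synthetic stand-in data)
# y: user identity labels for the training signatures
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 40))      # 200 signatures, 40 candidate global features
y = rng.integers(0, 10, size=200)   # 10 users

# Greedy forward selection: keep the 10 features that best support
# discriminating between users with a simple 1-NN classifier.
selector = SequentialFeatureSelector(
    KNeighborsClassifier(n_neighbors=1),
    n_features_to_select=10,
    direction="forward",
    cv=3,
)
selector.fit(X, y)
selected_columns = np.flatnonzero(selector.get_support())
print("Selected feature indices:", selected_columns)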
Feature extraction and preprocessing
Signature features are, in general, extracted from the time functions captured from the pen dynamics while an individual signs. In most cases, the capture of time functions from the handwritten signature is carried out with acquisition devices such as digitizing tablets or touch-screens. These devices provide pen position information (i.e., horizontal x and vertical y coordinates) and, in some cases, pen pressure z and pen inclination (azimuth and altitude). A diagram showing the nature of the captured signals and an example of the signals from a real signature are shown in Fig. 1. Other, less common examples of on-line signature acquisition devices are special pens with dedicated hardware inside that captures signature data such as coordinate, force, or velocity information.
The sampling rate of these devices is, in general, between 100 and 200 Hz. Since the maximum frequencies of the pen movements during handwriting are 20-30 Hz [1], these sampling rates satisfy the Nyquist criterion.
Preprocessing steps such as position, size, and rotation normalization, noise filtering, or resampling may be performed before feature extraction. In some works, resampling is avoided as it degrades the velocity-related features [4].
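A minimal sketch of the kind of preprocessing described above (an illustration, not a prescribed procedure; the function name and the choice of centering and height normalization are assumptions):

import numpy as np

def normalize_position_and_size(x, y):
    """Center the pen-position trajectory and scale it to unit height.

    x, y: 1-D arrays of pen coordinates sampled at the tablet rate.
    Returns centered, height-normalized copies; velocity-related
    information is preserved because no resampling is applied.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    x_c = x - x.mean()                      # position normalization
    y_c = y - y.mean()
    height = y_c.max() - y_c.min()
    scale = height if height > 0 else 1.0   # guard against degenerate input
    return x_c / scale, y_c / scale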
Global features
Global feature-based systems describe each signature as a multidimensional vector where each element consists of a feature extracted from the whole pen trajectory. Many feature sets have been proposed in the literature [2, 3, 8, 9], with variable sizes of up to 100 features [6]. Due to the scarcity of training data and the adverse effects of the curse of dimensionality, feature selection techniques must be applied to reduce the feature vector size. In Table 1, the 100 features described in [6] are presented. This global feature set includes most of the features described in previous works by other authors. Features are arranged in order of descending individual discriminative power. In Fig. 2, examples of the distribution of global features presented in Table 1 are shown.
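For illustration only (the feature choices below are a small assumed subset in the spirit of Table 1, not the exact definitions used in [6]), a few global features can be computed directly from the captured time functions:

import numpy as np

def global_features(x, y, p, t):
    """Compute a handful of example global features from one signature.

    x, y: pen coordinates; p: pen pressure (0 while the pen is up);
    t: sample timestamps in seconds. Returns a fixed-length vector.
    """
    x, y, p, t = map(np.asarray, (x, y, p, t))
    duration = t[-1] - t[0]                          # total signature duration
    pen_down = p > 0
    n_pen_ups = np.count_nonzero(np.diff(pen_down.astype(int)) == -1)
    vx = np.gradient(x, t)                           # horizontal velocity
    vy = np.gradient(y, t)                           # vertical velocity
    avg_speed = np.mean(np.hypot(vx, vy))
    width = x.max() - x.min()
    height = y.max() - y.min()
    aspect_ratio = width / height if height > 0 else 0.0
    return np.array([duration, n_pen_ups, avg_speed, aspect_ratio])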
Local features
Local features represent time sequences extracted from the raw captured signature data. A set of local features leads to a multidimensional discrete sequence that describes a signature. Depending on the matching algorithm, feature sets of varying sizes have been proposed in the literature. As a rule of thumb, Dynamic Time Warping-based algorithms employ few local features, while systems based on Hidden Markov Models or Gaussian Mixture Models employ larger feature sets. In Table 2, the most popular local features found in the literature are presented [2, 3, 4, 5, 10, 11, 12].
As in the case of global features, feature selection algorithms must be applied to determine the best-performing feature set. Usually, small feature sets are selected for Dynamic Time Warping-based matching algorithms. In these systems, speed-related features extracted from the first derivative of the pen-coordinate time sequences (features 10-11 in Table 2) have been shown to be very effective [4]. On the other hand, larger feature sets are used when Hidden Markov or Gaussian Mixture Models are employed [5, 11] for signature matching. Features related to second-order derivatives (features 19-27 in Table 2) have not been proved to significantly improve verification performance [3]. Examples of the local features presented in Table 2 are depicted in Fig. 3.
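A minimal sketch (an illustration under assumed array names, not the exact definitions of Table 2) of how some of the most common local features, such as the path velocity magnitude and the path-tangent angle, could be derived from the captured coordinates:

import numpy as np

def local_features(x, y, t):
    """Derive example local feature sequences from pen coordinates.

    Returns per-sample velocity components, path velocity magnitude,
    and path-tangent angle, i.e. a small function-based feature set.
    """
    x, y, t = map(np.asarray, (x, y, t))
    vx = np.gradient(x, t)               # first derivative of x (speed-related)
    vy = np.gradient(y, t)               # first derivative of y
    speed = np.hypot(vx, vy)             # path velocity magnitude
    tangent_angle = np.arctan2(vy, vx)   # path-tangent angle
    return np.column_stack([vx, vy, speed, tangent_angle])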
The usage of features related to pen orientation (azimuth and altitude) is a subject of controversy. Although some authors report that these features increase the verification performance [12], others have reported a low discriminative power for these features [2]. Moreover, these features are not always available, since many touch-screen acquisition devices, such as Tablet-PCs or PDAs, are unable to capture pen orientation information.
The fusion of global and local feature-based systems has been reported to provide better performance than the individual systems [6].
Signature Features. Figure 1 (a) Representation of the position, azimuth and altitude of the pen with respect to the
capture device. (b) Example of raw captured data from a signature.
Signature Features. Table 1 Set of 100 global features sorted by individual discriminative power (T denotes a time interval, t a time instant, N a number of events, and θ an angle; some symbols are defined in other features of the table). The top-ranked features include the total signature duration T_s, the number of pen-ups, the number of sign changes of dx/dt and dy/dt, the average jerk, the standard deviations of the velocity and acceleration components in x and y, and the numbers of local maxima in x and y. The remaining features comprise, among others, normalized pen-down and pen-up durations and timings, ratios of velocity, acceleration, and jerk statistics to their maxima, geometric ratios derived from the signature bounding box and the pen-down and pen-up positions, and direction, spatial, and direction-change histogram features.
Signature Features. Figure 2 Examples of genuine signatures and forgeries (left) and scatter plots of 4 different
global features from the 100-feature set presented in Table 1 (right). The signatures belong to the BioSecure database
and the Figure has been adapted from [13].
Off-line signature features
Off-line signature verification systems usually rely on image processing and shape recognition techniques to extract features. As a consequence, additional preprocessing steps such as image segmentation and binarization must be carried out. Features are extracted from gray-scale images, binarized images, or skeletonized images, among other possibilities.
The feature sets proposed in the literature are notably heterogeneous, especially when compared with the case of on-line verification systems. These include, among others, the usage of image transforms (e.g., Hadamard), morphological operators, structural representations, graphometric features [14], directional histograms, and geometric features. Readers are referred to [15] for an exhaustive listing of off-line signature features.
Related Entries
Feature Extraction
Off-line Signature Verification
On-line Signature Verification
Signature Matching
Signature Recognition
References
1. Plamondon, R., Lorette, G.: Automatic signature verification and writer identification: the state of the art. Pattern Recogn. 22(2), 107–131 (1989)
2. Lei, H., Govindaraju, V.: A comparative study on the consistency of features in on-line signature verification. Pattern Recogn. Lett. 26(15), 2483–2489 (2005)
Signature Features. Table 2 Extended set of local features (the upper-dot notation, e.g., ẋ_n, indicates a time derivative):
1. x-coordinate: x_n
2. y-coordinate: y_n
3. Pen pressure: z_n
4. Path-tangent angle: θ_n = arctan(ẏ_n / ẋ_n)
5. Path velocity magnitude: v_n = sqrt(ẋ_n² + ẏ_n²)
6. Log curvature radius: ρ_n = log(1/κ_n), where κ_n is the curvature of the position trajectory
7. Total acceleration magnitude: a_n = sqrt(t_n² + c_n²), where t_n and c_n are the tangential and centripetal acceleration components of the pen motion, respectively
8. Pen azimuth: γ_n
9. Pen altitude: φ_n
10-18. First-order time derivatives of features 1-9
19-27. Second-order time derivatives of features 1-9
28. Ratio of the minimum over the maximum speed over a window of 5 samples: v_n^r = min{v_(n-4), ..., v_n} / max{v_(n-4), ..., v_n}
29-30. Angle of consecutive samples, α_n = arctan((y_n - y_(n-1)) / (x_n - x_(n-1))), and its first-order derivative
31. Sine: s_n = sin(α_n)
32. Cosine: c_n = cos(α_n)
33. Stroke length-to-width ratio over a window of 5 samples: r_n^5 = (sum of sample-to-sample path lengths over the last 5 samples) / (max{x_(n-4), ..., x_n} - min{x_(n-4), ..., x_n})
34. Stroke length-to-width ratio over a window of 7 samples: r_n^7, defined as r_n^5 but over a window of 7 samples
3. Richiardi, J., Ketabdar, H., Drygajlo, A.: Local and global feature selection for on-line signature verification. In: Proceedings of IAPR Eighth International Conference on Document Analysis and Recognition, ICDAR, Seoul, Korea (2005)
4. Kholmatov, A., Yanikoglu, B.: Identity authentication using improved online signature verification method. Pattern Recogn. Lett. 26(15), 2400–2408 (2005)
5. Fierrez, J., Ramos-Castro, D., Ortega-Garcia, J., Gonzalez-Rodriguez, J.: HMM-based on-line signature verification: feature extraction and signature modeling. Pattern Recogn. Lett. 28(16), 2325–2334 (2007)
6. Fierrez-Aguilar, J., Nanni, L., Lopez-Penalba, J., Ortega-Garcia, J., Maltoni, D.: An on-line signature verification system based on fusion of local and global information. In: Proceedings of IAPR International Conference on Audio- and Video-Based Biometric Person Authentication, AVBPA, Springer LNCS-3546, pp. 523–532 (2005)

Signature Features. Figure 3 Examples of functions from the 27-feature set presented in Table 2 for a genuine signature (left) and a forgery (right) of a particular subject.

7. Jain, A.K., Zongker, D.: Feature selection: evaluation, application, and small sample performance. IEEE Trans. Pattern Anal. Mach. Intell. 19(2), 153–158 (1997)
8. Nelson, W., Turin, W., Hastie, T.: Statistical methods for on-line signature verification. Int. J. Pattern Recogn. Artif. Intell. 8(3), 749–770 (1994)
9. Lee, L.L., Berger, T., Aviczer, E.: Reliable on-line human signature verification systems. IEEE Trans. Pattern Anal. Mach. Intell. 18(6), 643–647 (1996)
10. Dolfing, J.G.A., Aarts, E.H.L., van Oosterhout, J.J.G.M.: On-line signature verification with Hidden Markov Models. In: Proceedings of the International Conference on Pattern Recognition, IEEE Press, pp. 1309–1312 (1998)
11. Van, B.L., Garcia-Salicetti, S., Dorizzi, B.: On using the Viterbi path along with HMM likelihood information for online signature verification. IEEE Trans. Syst. Man Cybern. B 37(5), 1237–1247 (2007)
12. Muramatsu, D., Matsumoto, T.: Effectiveness of pen pressure, azimuth, and altitude features for online signature verification. In: Proceedings of IAPR International Conference on Biometrics, ICB, Springer LNCS 4642 (2007)
13. Martinez-Diaz, M., Fierrez, J., Galbally, J., Ortega-Garcia, J.: Towards mobile authentication using dynamic signature verification: useful features and performance evaluation. In: Proceedings International Conference on Pattern Recognition, ICPR, pp. 1–6 (2008)
14. Sabourin, R.: Off-line signature verification: recent advances and perspectives. Lect. Notes Comput. Sci. 1339, 84–98 (1997)
15. Impedovo, D., Pirlo, G.: Automatic signature verification: The state of the art. IEEE Trans. Syst. Man Cybern. C Appl. Rev. 38(5), 609–635 (2008)
Signature Matching
MARCOS MARTINEZ-DIAZ¹, JULIAN FIERREZ¹, SEIICHIRO HANGAI²
¹Biometric Recognition Group - ATVS, Escuela Politecnica Superior, Universidad Autonoma de Madrid, Madrid, Spain
²Department of Electrical Engineering, Tokyo University of Science, Japan
Synonyms
Signature similarity computation
Definition
The objective of signature matching techniques is to compute the similarity between a given signature and a signature model or reference signature set. Several pattern recognition techniques have been proposed as matching algorithms for signature recognition. In on-line signature verification systems, signature matching algorithms have followed two main approaches. Feature-based algorithms usually compute the similarity among multidimensional feature vectors extracted from the signature data with statistical classification techniques. On the other hand, function-based algorithms perform matching by computing the distance among time sequences extracted from the signature data with techniques such as Hidden Markov Models and Dynamic Time Warping. Off-line signature matching has followed many different approaches, most of which are related to image processing and shape recognition.
This entry focuses on on-line signature matching, although off-line signature matching algorithms are briefly outlined.
Introduction
As in other biometric modalities, signature matching
techniques vary depending on the nature of the features
that are extracted from the signature data. In feature-
based systems (also known as global), each signature
is represented as a multidimensional feature vector,
while in function-based systems (also known as local),
signatures are represented by multidimensional time
sequences. Signature matching algorithms also depend
on the enrollment phase. Model-based systems estimate
a statistical model for each user from the training
signature set. On the other hand, in reference-based
systems the features extracted from the set of training
signatures are stored as a set of template signatures.
Consequently, given an input signature, in model-
based systems the matching is performed against a
statistical model, while in reference-based systems the
input signature is compared with all the signatures
available in the reference set.
Feature-Based Systems
Feature-based systems usually employ classical pattern classification techniques. In reference-based systems, the matching score is commonly obtained by using a distance measure between the feature vectors of the input and template signatures [1, 2], or a trained classifier. Distance measures used for signature matching include the Euclidean, weighted Euclidean, and Mahalanobis distances. In model-based systems, trained classifiers are employed, including approaches such as Neural Networks, Gaussian Mixture Models [3], or Parzen windows [4].
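As an illustrative sketch only (the scoring rule and variable names are assumptions, not a method prescribed by this entry), a reference-based matcher of the kind just described could score an input feature vector against a user's enrolled templates with a weighted Euclidean distance:

import numpy as np

def weighted_euclidean_score(query, templates):
    """Score a query global-feature vector against enrolled templates.

    query: 1-D feature vector of the input signature.
    templates: 2-D array, one enrolled feature vector per row.
    Returns a similarity score (higher means more similar); each feature
    is weighted by the inverse of its variance over the templates.
    """
    templates = np.asarray(templates, dtype=float)
    query = np.asarray(query, dtype=float)
    weights = 1.0 / (templates.var(axis=0) + 1e-6)   # per-feature weights
    dists = np.sqrt(((templates - query) ** 2 * weights).sum(axis=1))
    return -dists.min()   # distance to the closest template, negated as a score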
Function-Based Systems
In these systems, multidimensional time sequences extracted from the signature dynamics are used as features. Given the similarity of this task to others related to speaker recognition, the most popular approaches in local signature verification are related to algorithms proposed in the speech recognition community. Among these, signature verification systems using Dynamic Time Warping (DTW) [5, 6, 7] or Hidden Markov Models (HMM) [8, 9, 10, 11] are the most popular. In such systems, the captured time functions (e.g., pen coordinates, pressure, etc.) are used to model each user's signature. In the following, Dynamic Time Warping and Hidden Markov Models are outlined. A brief overview of other techniques is also given.
Dynamic Time Warping
Dynamic Time Warping (DTW) is an application of dynamic programming principles to the problem of matching discrete time sequences. DTW was originally proposed for speech recognition applications [12]. The goal of DTW is to find an elastic match among samples of a pair of sequences X and Y that minimizes a predefined distance measure. The algorithm is described as follows. Let us define two sequences

$$X = x_1, x_2, \ldots, x_i, \ldots, x_I \qquad Y = y_1, y_2, \ldots, y_j, \ldots, y_J \qquad (1)$$

and a distance measure

$$d(i, j) = (x_i - y_j)^2 \qquad (2)$$

between sequence samples. A warping path can be defined as

$$C = c_1, c_2, \ldots, c_k, \ldots, c_K \qquad (3)$$

where each c_k represents a correspondence (i, j) between samples of X and Y. The initial condition of the algorithm is set to

$$g_1 = g(1, 1) = d(1, 1)\, w(1) \qquad (4)$$

where g_k represents the accumulated distance after k steps and w(k) is a weighting factor that must be defined. For each iteration, g_k is computed as

$$g_k = g(i, j) = \min_{c_{k-1}} \left[\, g_{k-1} + d(c_k)\, w(k) \,\right] \qquad (5)$$

until the Ith and Jth samples of both sequences, respectively, are reached. The resulting normalized distance is

$$D(X, Y) = \frac{g_K}{\sum_{k=1}^{K} w(k)} \qquad (6)$$

where w(k) compensates for the effect of the length of the sequences.

The weighting factors w(k) are defined in order to restrict which correspondences among samples of both sequences are allowed. In Fig. 1a, a common definition of w(k) is depicted, and an example of a warping path between two sequences is given. In this case, only three transitions are allowed in the computation of g_k. Consequently, Eq. (5) becomes

$$g_k = g(i, j) = \min \left[ \begin{array}{l} g(i, j-1) + d(i, j) \\ g(i-1, j-1) + 2\,d(i, j) \\ g(i-1, j) + d(i, j) \end{array} \right] \qquad (7)$$

which is one of the most common implementations found in the literature. In Fig. 1b, an example of point-to-point correspondences between two signatures is depicted to visually show the result of the elastic alignment.
The algorithm has been further refined for signature verification by many authors [5, 7], reaching a notable verification performance. For example, the distance measure d(i, j) can be defined differently, or other normalization techniques may be applied to the accumulated distance g_K among sequences. DTW can also be applied independently to each stroke, which may be especially well suited for oriental signatures, since they are generally composed of several strokes. Although the DTW algorithm has been replaced in speech-related applications by more powerful approaches such as HMMs, it remains a highly effective tool for signature verification, as it is well suited for the small amounts of training data that are common in signature verification.
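A compact sketch of the recursion of Eq. (7) is given below; it is an illustration under assumed inputs (two 1-D feature sequences, e.g., speed signals), not the exact implementation evaluated by any of the cited systems.

import numpy as np

def dtw_distance(x_seq, y_seq):
    """Normalized DTW distance between two 1-D sequences (Eq. (7) recursion).

    Uses the squared sample-to-sample distance of Eq. (2), the symmetric
    step pattern of Eq. (7), and a normalization by I + J, the usual
    factor for this step pattern (cf. Eq. (6)).
    """
    x_seq = np.asarray(x_seq, dtype=float)
    y_seq = np.asarray(y_seq, dtype=float)
    I, J = len(x_seq), len(y_seq)
    d = (x_seq[:, None] - y_seq[None, :]) ** 2           # d(i, j) matrix
    g = np.full((I, J), np.inf)
    g[0, 0] = d[0, 0]                                    # initial condition, Eq. (4)
    for i in range(I):
        for j in range(J):
            if i == 0 and j == 0:
                continue
            candidates = []
            if j > 0:
                candidates.append(g[i, j - 1] + d[i, j])          # horizontal step
            if i > 0 and j > 0:
                candidates.append(g[i - 1, j - 1] + 2 * d[i, j])  # diagonal step
            if i > 0:
                candidates.append(g[i - 1, j] + d[i, j])          # vertical step
            g[i, j] = min(candidates)
    return g[I - 1, J - 1] / (I + J)

A complete verifier would typically apply this distance between the input signature and each enrolled reference and use, for example, the minimum or average distance as the matching score.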
Hidden Markov Models
Hidden Markov Models (HMM) have been widely
used for speech recognition applications [13] as well
as in many handwriting recognition applications. Several approaches using HMMs for dynamic signature verification have been proposed in recent years [8, 9, 10, 11]. An HMM represents a doubly stochastic process, governed by an underlying Markov chain with a finite number of states and a set of random functions that generate symbols or observations, each of which is associated with one state [11]. Observations in each state are modeled with GMMs in most speech and handwriting recognition applications. In fact, a GMM can be considered a single-state HMM, and GMMs have also been successfully used for signature verification [14]. Given a sequence of multidimensional observation vectors O = o_1, o_2, ..., o_N corresponding to a given signature, the goal of HMM-based signature matching is to find the probability P(O | M) that this sequence has been produced by a Hidden Markov Model M, where M is the signature model computed during enrollment.
The basic structure of an HMM using GMMs to model observations is defined by the following elements:

- The number of hidden states N.
- The number of Gaussian mixtures per state M.
- The probability transition matrix A = {a_ij}, which contains the probabilities of jumping from one state to another or staying in the same state.

In Fig. 2, an example of a possible HMM configuration is shown. Hidden Markov Models are usually trained in two steps using the enrollment signatures. First, state transition probabilities and observation statistical models are estimated using a Maximum Likelihood algorithm. After this, a re-estimation step is carried out using the Baum-Welch algorithm. The likelihood between a trained HMM and an input sequence (i.e., the matching score) is computed using the Viterbi algorithm. In [10], the Viterbi path (that is, the most probable state transition sequence) is also used as a similarity measure. A detailed description of Hidden Markov Models is given in [13].
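As an illustrative sketch only (the single-state GMM case mentioned above; the library choice, parameter values, and variable names are assumptions), a user model could be trained on the local feature sequences of the enrollment signatures and used to score an input signature by its average log-likelihood:

import numpy as np
from sklearn.mixture import GaussianMixture

def train_user_gmm(enroll_feature_seqs, n_mixtures=16):
    """Fit a GMM (a single-state HMM) on stacked local feature vectors.

    enroll_feature_seqs: list of 2-D arrays (samples x features), one per
    enrollment signature of the user.
    """
    training_data = np.vstack(enroll_feature_seqs)
    model = GaussianMixture(n_components=n_mixtures, covariance_type="diag",
                            random_state=0)
    return model.fit(training_data)

def match_score(model, test_feature_seq):
    """Average per-sample log-likelihood of the test signature under the model."""
    return model.score(np.asarray(test_feature_seq))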
Signature Matching. Figure 1 (a) Optimal warping path between two sequences obtained with DTW. Point-to-point distances are represented with different shades of gray, lighter shades representing shorter distances and darker shades representing longer distances. (b) Example of point-to-point correspondences between two genuine signatures obtained using DTW.

Within HMM-based dynamic signature verification, the existing approaches can be divided into regional and local. In regional approaches, the extracted time sequences are further segmented and converted into
a sequence of feature vectors or observations, each
one representing regional properties of the signature
signal [9, 11]. Some examples of segmentation bound-
aries are null vertical velocity points [9] or changes in
the quantized trajectory direction [11]. On the other
hand, local approaches directly use the time functions
as observation sequences for the signature modeling
[8, 10, 14].
Finding a reliable and robust model structure for dynamic signature verification is not a trivial task. While HMMs that are too simple may not properly model the user's signatures, models that are too complex may not generalize to future realizations due to overfitting. On the other hand, as simple models have fewer parameters to estimate, their estimation may be more robust than for complex models. Two main parameters are commonly considered when selecting an optimal model structure: the number of states and the number of Gaussian mixtures per state [8]. Some approaches consider a user-specific number of states [10], proportional to the average signature duration, or a user-specific number of mixtures [14]. Most of the proposed systems consider a left-to-right configuration without skips between states, also known as the Bakis topology (see Fig. 2).
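For illustration (a sketch under the assumption that the transition matrix is built by hand; the state count and the stay probability are arbitrary), the left-to-right, no-skip structure just described corresponds to a transition matrix with probability mass only on the diagonal and the first upper diagonal:

import numpy as np

def left_to_right_transmat(n_states, p_stay=0.8):
    """Build a Bakis (left-to-right, no skips) transition matrix.

    Each state either stays where it is (probability p_stay) or moves to
    the next state; the last state can only stay.
    """
    A = np.zeros((n_states, n_states))
    for i in range(n_states - 1):
        A[i, i] = p_stay
        A[i, i + 1] = 1.0 - p_stay
    A[-1, -1] = 1.0
    return A

print(left_to_right_transmat(4))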
Other Techniques
Further signature matching techniques include Neural Networks; in particular, Bayesian, multilayer, and time-delay Neural Networks, as well as radial basis function networks, among others, have been applied for signature matching. Other examples include structural approaches, which model signatures as a sequence, tree, or graph of symbols. Support Vector Machines have also been applied for signature matching. The reader is referred to [15] for an exhaustive list of references related to these approaches.
Fusion of the feature- and function-based approaches has been reported to provide better performance than the individual systems [4].
Off-line Signature Matching
The approaches proposed for off-line signature matching are notably heterogeneous compared to on-line signature verification. They are mostly related to image and shape recognition techniques and classical statistical pattern recognition algorithms. They include Neural Networks, Hidden Markov Models, Support Vector Machines, and distance-based classifiers, among others. A summary of off-line signature matching techniques can be found in [15].
Related Entries
Off-line Signature Verification
On-line Signature Verification
Signature Features
Signature Recognition
Signature Matching. Figure 2 Graphical representation of a left-to-right N-state HMM, with M-component GMMs
representing observations and no skips between states.
References
1. Nelson, W., Turin, W., Hastie, T.: Statistical methods for on-line signature verification. Int. J. Pattern Recogn. Artif. Intell. 8(3), 749–770 (1994)
2. Lee, L.L., Berger, T., Aviczer, E.: Reliable on-line human signature verification systems. IEEE Trans. Pattern Anal. Mach. Intell. 18(6), 643–647 (1996)
3. Martinez-Diaz, M., Fierrez, J., Ortega-Garcia, J.: Universal Background Models for dynamic signature verification. In: Proceedings IEEE Conference on Biometrics: Theory, Applications and Systems, BTAS, pp. 1–6 (2007)
4. Fierrez-Aguilar, J., Nanni, L., Lopez-Penalba, J., Ortega-Garcia, J., Maltoni, D.: An on-line signature verification system based on fusion of local and global information. In: Proceedings of IAPR International Conference on Audio- and Video-Based Biometric Person Authentication, AVBPA, Springer LNCS-3546, pp. 523–532 (2005)
5. Sato, Y., Kogure, K.: Online signature verification based on shape, motion and writing pressure. In: Proceedings of Sixth International Conference on Pattern Recognition, pp. 823–826 (1982)
6. Martens, R., Claesen, L.: Dynamic programming optimisation for on-line signature verification. In: Proceedings Fourth International Conference on Document Analysis and Recognition, ICDAR, vol. 2, pp. 653–656 (1997)
7. Kholmatov, A., Yanikoglu, B.: Identity authentication using improved online signature verification method. Pattern Recogn. Lett. 26(15), 2400–2408 (2005)
8. Fierrez, J., Ramos-Castro, D., Ortega-Garcia, J., Gonzalez-Rodriguez, J.: HMM-based on-line signature verification: feature extraction and signature modeling. Pattern Recogn. Lett. 28(16), 2325–2334 (2007)
9. Dolfing, J.G.A., Aarts, E.H.L., van Oosterhout, J.J.G.M.: On-line signature verification with Hidden Markov Models. In: Proceedings of the International Conference on Pattern Recognition, ICPR, pp. 1309–1312. IEEE CS Press (1998)
10. Van, B.L., Garcia-Salicetti, S., Dorizzi, B.: On using the Viterbi path along with HMM likelihood information for online signature verification. IEEE Trans. Syst. Man Cybern. B 37(5), 1237–1247 (2007)
11. Yang, L., Widjaja, B.K., Prasad, R.: Application of Hidden Markov Models for signature verification. Pattern Recogn. 28(2), 161–170 (1995)
12. Sakoe, H., Chiba, S.: Dynamic programming algorithm optimization for spoken word recognition. IEEE Trans. Acoust. 26, 43–49 (1978)
13. Rabiner, L.R.: A tutorial on Hidden Markov Models and selected applications in speech recognition. Proceedings of the IEEE 77(2), 257–286 (1989)
14. Richiardi, J., Drygajlo, A.: Gaussian Mixture Models for on-line signature verification. In: Proceedings of ACM SIGMM Workshop on Biometric Methods and Applications, WBMA, pp. 115–122 (2003)
15. Impedovo, D., Pirlo, G.: Automatic signature verification: the state of the art. IEEE Trans. Syst. Man Cybern. C Appl. Rev. 38(5), 609–635 (2008)
Signature Recognition
OLAF HENNIGER¹, DAIGO MURAMATSU², TAKASHI MATSUMOTO³, ISAO YOSHIMURA⁴, MITSU YOSHIMURA⁵
¹Fraunhofer Institute for Secure Information Technology, Darmstadt, Germany
²Seikei University, Musashino-shi, Tokyo, Japan
³Waseda University, Shinjuku-ku, Tokyo, Japan
⁴Tokyo University of Science, Shinjuku-ku, Tokyo, Japan
⁵Ritsumeikan University, Sakyo-ku, Kyoto, Japan
Synonyms
Handwritten signature recognition; signature/sign
recognition
Definition
A signature is a handwritten representation of the name of a person. Writing a signature is the established method for authentication and for expressing deliberate decisions of the signer in many areas of life, such as banking or the conclusion of legal contracts. A related concept is a handwritten personal sign depicting something other than a person's name. Compared to text-independent writer recognition methods, signature/sign recognition works with shorter handwriting probes, but requires writing the same name or personal sign every time. Handwritten signatures and personal signs belong to the behavioral biometric characteristics, as the person must become active in order to sign.
Regarding the automated recognition by means of
handwritten signatures, there is a distinction between
on-line and off-line signature recognition. On-line sig-
nature data are captured using digitizing pen tablets,
pen displays, touch screens, or special pens and include
information about the pen movement over time (at
least the coordinates of the pen tip and possibly also the
pen-tip pressure or pen orientation angles over time).
In this way, on-line signature data represent the way a
signature is written, which is also referred to as signa-
ture dynamics. By contrast, off-line (or static) signa-
tures are captured as grey-scale images using devices
such as image scanners and lack temporal information.
1196 S Signature Recognition