
A New Financial Risk Management Model

Duan Yuezhong
Economics and Management School of
Beijing University of Posts and
Telecommunications
Duanyuezhong@ta139.com
Xie Xiguo
Economics and Management School of
Beijing University of Posts and
Telecommunications
Xiexiguo@ta139.com

Jin Yongsheng
Economics and Management School of
Beijing University of Posts and
Telecommunications
jys1900@yahoo.com.cn
Cheng Che
China University of Petroleum (HD)
College of Economics and Management
Dongying, China
bybsh@hotmail.com


Abstract
For the problem of financial risk management, this paper proposes a new model based on the principal component analysis method and multi-class support vector machines. On the one hand, the principal component analysis method is used to reduce the number of indexes and to improve efficiency. On the other hand, the multi-class support vector machines are used to classify the warning level accurately. Because the new model improves both efficiency and precision, it is more feasible than the previous methods.
Keywords: financial risk management, principal component analysis method, multi-class SVM


I. INTRODUCTION

Since 1938, many models have been used to study financial risk management, for example RAROC, VaR, RiskMetrics, CreditMetrics, enterprise-wide risk management, and so on. However, these models all have some limitations, such as computational complexity and low precision [1].
This paper proposes a new model based on the principal component analysis method and multi-class support vector machines.
Because many indexes affect financial risk management, the principal component analysis method is used to reduce the number of indexes and to lower the complexity of the analysis.
The multi-class support vector machine is an important branch of statistical learning theory. Because it is based on structural risk minimization and can deal with small samples, it has good generalization ability [4], and it is an up-to-date and highly accurate method.
Therefore, the new model based on the principal component analysis method and multi-class support vector machines improves both efficiency and precision, which makes it more feasible than the previous methods.
The paper includes four sections. Section 2 introduces the theories related to the principal component analysis method and the multi-class support vector machine. Section 3 elaborates on the model proposed in this study. Section 4 concludes the study.

II. Theories related to principal component analysis and multi-class SVM

This section introduces the theories related to the principal component analysis method and the multi-class support vector machine.
A. Principal component analysis method
The principal component analysis method is a multivariate statistical analysis method that transforms many indexes into a few comprehensive ones. It reduces the number of indexes by finding a small set of comprehensive variables to represent the original ones. These uncorrelated comprehensive variables retain as much as possible of
the original information. The basic steps of multivariable comprehensive evaluation using the principal component analysis method are as follows [6]:
1) Index standardization
Suppose the number of samples is $n$ and the number of indexes is $m$. Let $X_{ij}$ ($i = 1, 2, \ldots, n$; $j = 1, 2, \ldots, m$) denote the value of index $j$ for sample $i$. The standardized value is

$Y_{ij} = (X_{ij} - \bar{X}_j) / S_j$,

where $\bar{X}_j = \frac{1}{n}\sum_{i=1}^{n} X_{ij}$ and $S_j = \left[\frac{1}{n-1}\sum_{i=1}^{n}(X_{ij} - \bar{X}_j)^2\right]^{1/2}$.
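As an illustration, here is a minimal NumPy sketch of this standardization step; the function name standardize and the toy numbers are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def standardize(X):
    """Standardize each index (column) of the n-by-m sample matrix X.

    Returns Y with Y[i, j] = (X[i, j] - mean of column j) / std of column j,
    using the sample standard deviation (divisor n - 1).
    """
    X = np.asarray(X, dtype=float)
    col_mean = X.mean(axis=0)            # X-bar_j for every index j
    col_std = X.std(axis=0, ddof=1)      # S_j with the 1/(n-1) factor
    return (X - col_mean) / col_std

# Example: 4 samples described by 3 financial indexes (hypothetical numbers).
X = np.array([[1.2, 0.8, 3.1],
              [0.9, 1.1, 2.7],
              [1.5, 0.7, 3.6],
              [1.1, 0.9, 2.9]])
Y = standardize(X)
print(Y.mean(axis=0))           # approximately zero for every column
print(Y.std(axis=0, ddof=1))    # exactly one for every column
```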
2) Calculation of the covariance matrix R
$R = [r_{ij}]_{m \times m}$, where $r_{ij}$ is the correlation coefficient of $X_i$ and $X_j$; since the indexes have been standardized, this covariance matrix is also their correlation matrix.
3) Calculation of the eigenvalues and eigenvectors of the covariance matrix R
Solving the eigenequation $|R - \lambda I| = 0$ gives the eigenvalues in descending order, $\lambda_1 \geq \lambda_2 \geq \lambda_3 \geq \ldots \geq \lambda_m$, and the corresponding eigenvectors $U_1, U_2, U_3, \ldots, U_m$, where $U_i = (u_{i1}, u_{i2}, u_{i3}, \ldots, u_{im})$.
The principal components are then

$Z_i = u_{i1} Y_1 + u_{i2} Y_2 + \ldots + u_{im} Y_m$, $i = 1, 2, 3, \ldots, m$,

where $Y_1, Y_2, \ldots, Y_m$ are the standardized indexes.
4) Selection of principal components and calculation of the comprehensive score
The weight of each principal component is $Q_i = \lambda_i / \sum_{i=1}^{m} \lambda_i$.
According to the cumulative contribution criterion

$\sum_{i=1}^{w} \lambda_i / \sum_{i=1}^{m} \lambda_i > 85\%$,

we select the first $w$ principal components for the calculation. The comprehensive score of each sample is

$F = \sum_{f=1}^{w} F_f = \sum_{f=1}^{w} Q_f Z_f$, $f = 1, 2, 3, \ldots, w$.

Finally, we compare the $F$ value of each sample and place the samples in numerical order.
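Building on the previous sketch, the following Python fragment illustrates steps 2 to 4 (correlation matrix, eigendecomposition, the 85% cumulative-contribution rule, and the weighted comprehensive score). The function name pca_comprehensive_score and the small data matrix are invented for illustration.

```python
import numpy as np

def pca_comprehensive_score(Y, threshold=0.85):
    """Principal-component comprehensive evaluation of standardized data Y (n x m).

    Returns the per-sample score F and the number of retained components w,
    following the 85% cumulative-contribution rule.
    """
    R = np.corrcoef(Y, rowvar=False)       # correlation matrix of the indexes
    eigvals, eigvecs = np.linalg.eigh(R)   # eigh: R is symmetric
    order = np.argsort(eigvals)[::-1]      # sort eigenvalues in descending order
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]

    contrib = eigvals / eigvals.sum()      # weight Q_i of each component
    w = int(np.searchsorted(np.cumsum(contrib), threshold) + 1)

    Z = Y @ eigvecs[:, :w]                 # component scores Z_1..Z_w per sample
    F = Z @ contrib[:w]                    # weighted comprehensive score
    return F, w

# Example with a small standardized data matrix (hypothetical numbers).
Y = np.array([[ 0.5, -1.2,  0.3],
              [-0.9,  0.4,  1.1],
              [ 1.3,  0.8, -1.4],
              [-0.9,  0.0,  0.0]])
F, w = pca_comprehensive_score(Y)
print(w, F)   # number of retained components and per-sample scores
```

In practice the signs of the eigenvectors returned by the solver may need to be fixed so that the comprehensive score points in the economically meaningful direction.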
These are the theories related to the principal component analysis method.
B. Multi-class SVM
The support vector machine (SVM) is a promising classification and regression technique proposed by Vapnik and his group at AT&T Bell Laboratories. The SVM learns a separating hyperplane that maximizes the margin and thus produces good generalization capability. Recent theoretical research has solved existing difficulties in using the SVM in practical applications. It has now been successfully applied in many areas, such as face detection, hand-written digit recognition, and data mining [8].
In theory, SVM classification can be traced back to the classical structural risk minimization (SRM) approach, which determines the classification decision function by minimizing the empirical risk

$R = \frac{1}{k}\sum_{i=1}^{k} |f(x_i) - y_i|$,

where $k$ and $f$ represent the number of examples and the classification decision function, respectively. For SVM, the primary concern is determining an optimal separating hyperplane that gives a low generalization error. Usually, the classification decision function in the linearly separable problem is represented by [12]

$f(x) = \operatorname{sign}(w \cdot x + b)$.
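For concreteness, here is a small sketch that evaluates this empirical risk and the sign decision function on invented toy data; the hyperplane parameters w and b are assumed, not estimated.

```python
import numpy as np

def empirical_risk(w, b, X, y):
    """Empirical risk (1/k) * sum_i |f(x_i) - y_i| for the linear decision
    function f(x) = sign(w . x + b), with labels y_i in {-1, +1}.
    Each misclassified sample contributes 2 to the sum, so the value is
    twice the misclassification rate."""
    f = np.sign(X @ w + b)
    return np.mean(np.abs(f - y))

# Hypothetical toy data: two samples per class in two dimensions.
X = np.array([[2.0, 2.0], [3.0, 1.0], [-2.0, -1.0], [-1.0, -3.0]])
y = np.array([1, 1, -1, -1])
w, b = np.array([1.0, 1.0]), 0.0      # an assumed separating hyperplane
print(empirical_risk(w, b, X, y))     # 0.0 on this toy data
```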
In SVM, the optimal separating hyperplane is the one that gives the largest margin of separation between different classes. This optimal hyperplane bisects the shortest line connecting the convex hulls of the two classes. It is required to satisfy the following constrained minimization:
$\min \; \frac{1}{2}\|w\|^2$
s.t. $y_i (w \cdot x_i + b) \geq 1$.
For the linearly non-separable case, the minimization problem can be modified by introducing slack variables that allow misclassified data points.
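To make the optimization concrete, here is a minimal sketch of the soft-margin primal problem using the cvxpy modelling package; the choice of solver package and the toy data are assumptions of this sketch, not something the paper prescribes.

```python
import cvxpy as cp
import numpy as np

# Hypothetical toy data: labels must be -1 or +1 for this formulation.
X = np.array([[2.0, 2.0], [3.0, 1.0], [0.5, 0.8], [-2.0, -1.0], [-1.0, -3.0]])
y = np.array([1, 1, -1, -1, -1])
n, d = X.shape
C = 1.0                             # penalty on the slack variables

w = cp.Variable(d)
b = cp.Variable()
xi = cp.Variable(n, nonneg=True)    # slack variables for misclassified points

# Minimize 0.5*||w||^2 + C*sum(xi) subject to y_i (w . x_i + b) >= 1 - xi_i.
objective = cp.Minimize(0.5 * cp.sum_squares(w) + C * cp.sum(xi))
constraints = [cp.multiply(y, X @ w + b) >= 1 - xi]
cp.Problem(objective, constraints).solve()

print(w.value, b.value)             # parameters of the separating hyperplane
```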
SVM can be applied to multi-class classification by combining SVMs [12].
$\min \; \Phi(w, \xi) = \frac{1}{2}\|w\|^2 + C \sum_{i=1}^{l} \sum_{m \neq y_i} \xi_i^m$

s.t. $(w_{y_i} \cdot x_i) + b_{y_i} \geq (w_m \cdot x_i) + b_m + 2 - \xi_i^m$, $\xi_i^m \geq 0$,
$i = 1, 2, \ldots, l$, $m, y_i \in \{1, 2, \ldots, k\}$, $m \neq y_i$.
The decision function is as follows:

$f(x) = \arg\max_i \, [(w_i \cdot x) + b_i]$, $i = 1, 2, \ldots, k$.
These are the theories related to the multi-class support vector machine.
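As a sketch of how binary SVMs are combined in practice, the scikit-learn SVC classifier can be used. Note that SVC combines binary SVMs in a one-vs-one fashion rather than solving the single joint problem written above, so this is an illustrative substitute, and the data are invented.

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical toy data: 3-dimensional feature vectors, five risk grades 1..5.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = rng.integers(1, 6, size=100)

# SVC handles multi-class problems by combining binary SVMs (one-vs-one),
# which is one way of realizing the "combining SVMs" idea described above.
clf = SVC(kernel="linear", C=1.0)
clf.fit(X, y)
print(clf.predict(X[:5]))   # predicted classes for the first five samples
```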

III. The introduction of the new model

The paper divides the level of financial risk into five grades: low, lower, middle, higher, and high. Twelve indexes are used to evaluate financial risk: diversification, liquidity, asset/liability ratio, valuation of assets, trading control, transparency, economic intelligence, capital adequacy, asset quality, financial information, intellectual property of financial products, and financial intelligence system.
First, the principal component analysis method is used to remove the indexes that contribute little to the financial risk evaluation.
Second, the data set is divided into five parts, and each part is input into its corresponding SVM. The scores of the different classes are added up, and the class with the highest score is accepted as the class given by the warning system.
Third, the data set is divided into two parts: training data and testing data. The training data are used to train the multi-class SVM, and the testing data are then input into the trained model to test it.
At last, new data can be input into the model to obtain the corresponding grade of the financial risk level.
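A minimal end-to-end sketch of this pipeline, assuming the twelve indexes are stored column-wise in a matrix and the five risk grades are coded 1 to 5; the scikit-learn components and all data below are illustrative assumptions rather than the paper's prescribed implementation.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical data: 200 firms, 12 financial-risk indexes, grades 1 (low) .. 5 (high).
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 12))
y = rng.integers(1, 6, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Step 1: standardize and reduce the indexes with PCA (retain 85% of the variance).
# Step 2: classify the reduced data with a multi-class SVM.
model = make_pipeline(
    StandardScaler(),
    PCA(n_components=0.85),
    SVC(kernel="rbf", C=1.0),
)
model.fit(X_train, y_train)

print("test accuracy:", model.score(X_test, y_test))
print("predicted risk grades:", model.predict(X_test[:5]))
```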
The chart of the model based on the principal component analysis method and the multi-class support vector machine is as follows.

Figure 1. The chart of the new model

(In Figure 1, the original training data are processed by the principal component analysis method to produce the second training data; the second training data and the testing data are then input into SVM 1 through SVM 5 of the multi-class support vector machine.)

IV. Conclusions

This paper has proposed a new model based on the principal component analysis method and multi-class support vector machines.
Because many indexes affect financial risk management, the principal component analysis method is used to reduce the number of indexes and to lower the complexity of the analysis. The multi-class support vector machine is an important branch of statistical learning theory; because it is based on structural risk minimization and can deal with small samples, it has good generalization ability [4].
Therefore, the new model based on the principal component analysis method and multi-class support vector machines improves both efficiency and precision, which makes it more feasible than the previous methods.

References

[1] HUANG Hai-feng, MA Hong-yi, The main methods of Chinese financial risk management, Economic Theories and Economic Management, 2005.
[2] ZHANG Xi-yu, The VaR method of financial risk management, Enterprise Economy, January 2003.
[3] GU Xiu-juan, The financial risk management: the development of theories and technologies, Economic Survey, No. 1, 2007.
[4] XIAO Wenbin, FEI Qi, A Study of Personal Credit Scoring Models on Support Vector Machine with Optimal Choice of Kernel Function Parameters, Systems Engineering Theory & Practice, No. 10, 2006.
[5] TANG Jia-fu, LI Run-sheng, SHI Yong-gui, FAN Chun-guang, Application of Principal Component Analysis to Performance Evaluation of Telecom Enterprises, Journal of Northeastern University (Natural Science), No. 4, 2008.
[6] JIANG Huiyuan, WANG Wanxiang, Application of Principal Component Analysis in Synthetic Appraisal for Multi-objects Decision-making, Journal of Wuhan University of Technology, No. 4, 2003.
[7] XIA Guoen, JING Weidong, ZHANG Gexiang, Synthetic Evaluation Method Based on Support Vector Classifier and Regression Machine, Journal of Southwest Jiaotong University, Vol. 41, 2006.
[8] XIAO Wenbin, FEI Qi, A Study of Personal Credit Scoring Models on Support Vector Machine with Optimal Choice of Kernel Function Parameters, Systems Engineering Theory & Practice, No. 10, 2006.
[9] Steve Gunn, Support Vector Machines for Classification and Regression, Image Speech and Intelligent Systems Group, 10 May 1998.
[10] Harris Drucker, Chris J. C. Burges, Linda Kaufman, Alex Smola, Vladimir Vapnik, Support Vector Regression Machines, Advances in Neural Information Processing Systems 9, 1997, pp. 155-161.
[11] Dacheng Tao, Xiaoou Tang, Xuelong Li, Xindong Wu, Asymmetric Bagging and Random Subspace for Support Vector Machines-Based Relevance Feedback in Image Retrieval, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 28, No. 7, 2006, pp. 1088-1099.
[12] GAO Shang, YANG Jingyu, Assessing the effectiveness based on principal component analysis and support vector machine, Systems Engineering and Electronics, Vol. 28, No. 6, 2006.
