Email: shuai_lian@qq.com
Abstract
HLLE is an effective nonlinear dimension reduction algorithm that is widely applied in machine learning, pattern recognition, data mining, and related fields. However, HLLE is very sensitive to the neighborhood selection and to non-uniform data sampling. In this paper, an improved HLLE based on weighted distance, named WHLLE, is proposed; it avoids unreasonable neighborhood selection by using a weighted Euclidean distance. WHLLE not only produces a better dimension reduction result but also preserves the intrinsic geometric structure of the original manifold. We validate the performance of WHLLE on two classical artificial manifolds. The experiments on these artificial manifolds confirm that WHLLE preserves the neighborhood relationships of the data points, their global distribution, and the intrinsic structure of the data better than the other related algorithms.
Keywords: Machine Learning; Dimension Reduction; Hessian Locally Linear Embedding (HLLE) Algorithm; Weighted Distance
1 Introduction
Dimension reduction algorithms can be divided into linear and nonlinear methods. Classical linear methods include principal component analysis (PCA)[1] and multidimensional scaling (MDS)[2]; representative nonlinear manifold learning methods include isometric feature mapping (ISOMAP)[3], locally linear embedding (LLE)[4], Laplacian eigenmaps (LE)[5], and Hessian locally linear embedding (HLLE)[6]. HLLE has been studied and applied widely[7][8], but it remains very sensitive to how the neighborhood of each sample point is selected.
* Supported by grant (G61174163).
- 17 http://www.ivypub.org/cst
Various improvements of manifold learning algorithms have been proposed in the literature[9][10][11][12]. Nevertheless, HLLE still fails to produce a faithful embedding when the neighborhood size is too small (K < 6) or when the data are sampled non-uniformly, because neighborhoods selected with the ordinary Euclidean distance may be unreasonable, and the local PCA step of HLLE then yields poor tangent-space estimates. To overcome these problems, this paper proposes an improved HLLE based on weighted distance (WHLLE). By selecting neighborhoods with a weighted Euclidean distance, WHLLE avoids unreasonable neighborhood selection, improves the quality of the low-dimensional embedding, and better preserves the intrinsic geometric structure of the original manifold.
Throughout this paper, the observed data X = [x1, x2, ..., xN] ⊂ R^D are assumed to be generated from low-dimensional parameters Y = [y1, y2, ..., yN] ⊂ R^d (d < D) by a smooth map f, i.e. X = f(Y). The task of dimension reduction is to recover Y from X, subject to the usual normalization constraints that the embedding is centered, Σi yi = 0, and has unit covariance, (1/N) Y Y^T = I.

2 Weighted Distance
The cam weighted distance[13] measures how far a point x ∈ R^d lies from a reference point x0 ∈ R^d using a pair of cam parameters (a, b) estimated locally at x0:

D(x0, x) = ||x − x0|| / (a + b cos θ),     (1)

where θ is the angle between x − x0 and the cam orientation vector estimated at x0, and a > b ≥ 0 so that the denominator stays positive. Because the parameters and the orientation are estimated separately at each reference point, the measure is direction-dependent and in general asymmetric, i.e. D(x0, x) ≠ D(x, x0).
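As a concrete illustration, the directional weighting can be sketched as follows. This is a minimal sketch, not code from the paper: the cam parameters a, b and the unit orientation vector v are assumed to have already been estimated at the reference point, and the function name is illustrative.

```python
import numpy as np

def cam_weighted_distance(x0, x, a, b, v):
    """Cam weighted distance from reference point x0 to x (Section 2).

    a, b are the cam parameters estimated at x0 and v is the unit cam
    orientation vector at x0; the Euclidean distance is shrunk or
    stretched according to the direction of x - x0.
    """
    diff = x - x0
    r = np.linalg.norm(diff)
    if r == 0.0:
        return 0.0
    cos_theta = np.dot(diff, v) / r     # cosine of angle between x - x0 and v
    return r / (a + b * cos_theta)      # assumes a > b >= 0, so denominator > 0
```

With the same parameters (a, b) = (2, 1) and v = (1, 0) at both ends, the distance from the origin to (1, 0) is 1/3 while the reverse distance is 1, illustrating the asymmetry D(x0, x) ≠ D(x, x0).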
3 The HLLE Algorithm and Its Improvement
3.1 Principle of HLLE
(A) Suppose the data lie on a smooth d-dimensional manifold M ⊂ R^D that is the image of an open, connected parameter set Θ ⊂ R^d under a locally isometric embedding ψ: Θ → R^D, i.e. M = ψ(Θ). HLLE aims to recover the isometric coordinates ψ^(-1)(x) of every point x ∈ M. Writing the components of the inverse map as functions fi: M → R, fi(x) = (ψ^(-1)(x))_i, i = 1, ..., d, we have

ψ^(-1)(x) = ((ψ^(-1)(x))_1, ..., (ψ^(-1)(x))_d) = (f1(x), ..., fd(x)),

so HLLE recovers the embedding by estimating the d coordinate functions f1, ..., fd.
(B) For every x ∈ M, let Tx(M) denote the tangent space of M at x, with dim Tx(M) = d. Choosing an orthonormal basis of Tx(M) defines, in a neighborhood Ux ⊂ M of x, local tangent coordinates that identify points of Ux with points of R^d; HLLE works in these tangent coordinates around every sample point.
(C) Hessian. Let f: M → R be smooth enough (of Sobolev class W^{2,2}). For x ∈ M, the tangent Hessian H^tan f(x) is the ordinary Hessian matrix of f expressed in the tangent coordinates at x, and the isometric Hessian is the Hessian in the isometric parameter coordinates,

H^iso f(x) = H^euc (f ∘ ψ)(ψ^(-1)(x)).     (2)

Since ψ is a local isometry, the two coincide:

∀x ∈ M: H^iso f(x) = H^tan f(x).     (3)

On this basis HLLE defines two quadratic functionals that average the Hessians over the manifold,

H^iso(f) = ∫_M ||H^iso f(m)||_F^2 dm,   H^tan(f) = ∫_M ||H^tan f(m)||_F^2 dm,

where ||A||_F denotes the Frobenius norm of a matrix A. By construction: 1. H^iso(f) = 0 if and only if ∀x ∈ M, H^iso f(x) = 0; 2. H^tan(f) = 0 if and only if ∀x ∈ M, H^tan f(x) = 0.
3.2 The WHLLE Algorithm
Consider the functional H(f) = ∫_M ||H^tan f(m)||_F^2 dm defined on twice-differentiable functions f: M → R. The key result of Donoho and Grimes[6] is that H(f) has a (d+1)-dimensional null space, spanned by the constant function and the d original isometric coordinate functions f1, ..., fd. The embedding can therefore be recovered, up to a rigid motion, from the null space of a discrete estimate of H. Based on the analysis in Section 3.1, WHLLE follows the same steps as HLLE but selects neighborhoods with the weighted distance of Section 2. The algorithm is as follows.
Input: the data set X = [x1, x2, ..., xN], xi ∈ R^D, i = 1, 2, ..., N, where N is the number of sample points.
Output: the embedding Y = [y1, y2, ..., yN], yi ∈ R^d, i = 1, 2, ..., N, with d < min(k, D).
1) Neighborhood selection. For each sample point xi, i = 1, ..., N, find its k nearest neighbors x_{i1}, ..., x_{ik} under the weighted distance of Section 2 (HLLE uses the ordinary Euclidean distance here), and let Ni denote the index set of these neighbors. Center the neighborhood at its mean x̄i = (1/k) Σ_{p=1}^{k} x_{ip} and stack the centered neighbors into the matrix Mi = [x_{i1} − x̄i, ..., x_{ik} − x̄i]^T ∈ R^{k×D}.
2) Local tangent coordinates. Compute the singular value decomposition Mi = U Σ V^T with U ∈ R^{k×k}, Σ ∈ R^{k×D}, V ∈ R^{D×D}. The first d columns of U, denoted U_1^i, U_2^i, ..., U_d^i, give the tangent coordinates of the k neighbors.
3) Local Hessian estimator. Build the matrix Xi = [1, U_1^i, ..., U_d^i, Qi] ∈ R^{k×(1+d+d(d+1)/2)}, where 1 is the all-ones column and Qi contains the d(d+1)/2 element-wise products of U_r^i and U_s^i, 1 ≤ r ≤ s ≤ d. These columns span the constant, linear, and quadratic functions of the tangent coordinates.
4) Orthonormalization. Apply Gram–Schmidt orthonormalization to the columns of Xi, and let H^i ∈ R^{(d(d+1)/2)×k} be the transpose of the last d(d+1)/2 orthonormalized columns; H^i estimates the tangent Hessian over the neighborhood of xi.
5) Alignment matrix. Embed each H^i into an N-column matrix Si ∈ R^{(d(d+1)/2)×N} whose j-th column equals the corresponding column of H^i if j ∈ Ni and is 0 otherwise. Stacking S = [S_1; ...; S_N], the discrete Hessian alignment matrix is the symmetric N×N matrix

H = S^T S = Σ_{l=1}^{N} S_l^T S_l.

6) Embedding. Compute the eigenvectors of H associated with its d+1 smallest eigenvalues. Discard the constant eigenvector belonging to the (near-)zero smallest eigenvalue and collect the remaining d eigenvectors into V ∈ R^{N×d}. Form the d×d matrix R with R_{rs} = Σ_{j∈N_1} V_{jr} V_{js}, 1 ≤ r, s ≤ d, and output the normalized embedding W = V R^{−1/2} ∈ R^{N×d}.
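The six steps above can be sketched in code. This is a minimal sketch, not the authors' implementation: it uses a plain Euclidean neighbor search (WHLLE would substitute the weighted distance of Section 2 in step 1), and it adds a small column-sum normalization of the local Hessian estimator that standard HLLE implementations apply for numerical stability.

```python
import numpy as np

def hlle(X, k, d):
    """Sketch of HLLE steps 1)-6). X: (N, D) data; returns (N, d) embedding."""
    N = X.shape[0]
    dp = d * (d + 1) // 2
    assert k >= 1 + d + dp, "need k >= 1 + d + d(d+1)/2 neighbors"
    # step 1: k nearest neighbors of every point (self excluded)
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    nbrs = np.argsort(dist, axis=1)[:, 1:k + 1]
    H = np.zeros((N, N))
    for i in range(N):
        Ni = nbrs[i]
        Mi = X[Ni] - X[Ni].mean(axis=0)            # centered neighborhood
        # step 2: first d left singular vectors = tangent coordinates
        U = np.linalg.svd(Mi)[0][:, :d]
        # step 3: [1, tangent coords, pairwise products]
        Xi = np.ones((k, 1 + d + dp))
        Xi[:, 1:1 + d] = U
        col = 1 + d
        for r in range(d):
            for s in range(r, d):
                Xi[:, col] = U[:, r] * U[:, s]
                col += 1
        # step 4: Gram-Schmidt via QR; last dp columns, transposed
        Q = np.linalg.qr(Xi)[0]
        Hi = Q[:, 1 + d:].T                        # (dp, k)
        w = Hi.sum(axis=1, keepdims=True)          # numerical safeguard
        w[np.abs(w) < 1e-5] = 1.0
        Hi = Hi / w
        # step 5: accumulate into the N x N alignment matrix
        H[np.ix_(Ni, Ni)] += Hi.T @ Hi
    # step 6: null space of H; drop the constant eigenvector, then normalize
    vals, vecs = np.linalg.eigh(H)
    V = vecs[:, 1:d + 1]
    R = V[nbrs[0]].T @ V[nbrs[0]]                  # R_rs over the first neighborhood
    evals, evecs = np.linalg.eigh(R)
    return V @ (evecs @ np.diag(evals ** -0.5) @ evecs.T)
```

The O(N^2) pairwise-distance matrix keeps the sketch short; a production implementation would use a spatial index for step 1 and sparse storage for H.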
4 Experiments and Analysis
4.1 WHLLE on the Swiss roll
To validate WHLLE, we compare it with HLLE, LLE, PCA, and LE on two classical artificial manifolds, the Swiss roll and the S-curve.
[Fig. 2 (a)-(f): the 3-D Swiss roll data set and the 2-D embeddings produced by the five algorithms; figure omitted]
4.2 WHLLE on the S-curve
The S-curve is an S-shaped two-dimensional surface embedded in three-dimensional space.
[Fig. 3 (a)-(f): panel (a) shows the 3-D S-curve data set; the remaining panels show the 2-D embeddings produced by the five algorithms; figure omitted]
As Fig. 3 shows, WHLLE unfolds the S-curve into two dimensions while preserving its S shape, whereas the embeddings produced by HLLE, LLE, PCA, and LE distort the S shape to varying degrees.
4.3 Analysis of the experimental results
Sections 4.1 and 4.2 compare WHLLE with HLLE, LLE, PCA, and LE on the Swiss roll and the S-curve. As Fig. 2 shows, HLLE, LLE, PCA, and LE fail to unfold the Swiss roll faithfully, while the 2-D embedding produced by WHLLE preserves the neighborhood relationships and the global distribution of the data. Likewise, as Fig. 3 shows, HLLE, LLE, PCA, and LE distort the S shape of the S-curve, whereas WHLLE preserves it well. In both experiments, WHLLE keeps the neighborhood relationships of the data points, the global distribution, and the intrinsic structure of the data better than the other algorithms.
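The visual comparison can be complemented with a quantitative neighborhood-preservation score. The sketch below uses scikit-learn's stock implementations of PCA, LLE, Hessian LLE, and Laplacian eigenmaps (SpectralEmbedding) as stand-ins for the methods compared above; WHLLE itself is not available in scikit-learn, and the neighborhood size of 12 is an assumption.

```python
import numpy as np
from sklearn.datasets import make_s_curve
from sklearn.decomposition import PCA
from sklearn.manifold import LocallyLinearEmbedding, SpectralEmbedding, trustworthiness

# 800 points on the S-curve; embed to 2-D with each method and score how
# well the embedding preserves local neighborhoods (1.0 = perfect).
X, _ = make_s_curve(n_samples=800, random_state=0)
methods = {
    "PCA": PCA(n_components=2),
    "LLE": LocallyLinearEmbedding(n_neighbors=12, n_components=2),
    "HLLE": LocallyLinearEmbedding(n_neighbors=12, n_components=2, method="hessian"),
    "LE": SpectralEmbedding(n_components=2, n_neighbors=12),
}
for name, est in methods.items():
    Y = est.fit_transform(X)
    print(f"{name}: trustworthiness = {trustworthiness(X, Y, n_neighbors=12):.3f}")
```

Trustworthiness penalizes points that become close neighbors in the embedding without being neighbors in the original space, so it directly measures the neighborhood preservation discussed in this section.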
5 Conclusion
HLLE is an effective nonlinear dimension reduction algorithm, but it is very sensitive to the neighborhood selection and to non-uniform data sampling. This paper analyzed the principle of HLLE and the causes of this sensitivity, and proposed an improved algorithm, WHLLE, which selects neighborhoods with a weighted Euclidean distance. Experiments on two classical artificial manifolds show that WHLLE produces better low-dimensional embeddings than HLLE, LLE, PCA, and LE, and preserves the intrinsic geometric structure of the original manifolds.
REFERENCES
[1] I.T. Jolliffe, Principal Component Analysis, second ed., Springer, New York, 2002.
[2] T.F. Cox, M.A.A. Cox, Multidimensional Scaling, Chapman & Hall, London, 1994.
[3] J.B. Tenenbaum, V. de Silva, J.C. Langford, A global geometric framework for nonlinear dimensionality reduction, Science, 2000, 290(5500), pp. 2319-2323.
[4] S.T. Roweis, L.K. Saul, Nonlinear dimensionality reduction by locally linear embedding, Science, 2000, 290(5500), pp. 2323-2326.
[5] M. Belkin, P. Niyogi, Laplacian eigenmaps and spectral techniques for embedding and clustering, NIPS, Vancouver, Canada, 2001.
[6] D.L. Donoho, C. Grimes, Hessian eigenmaps: locally linear embedding techniques for high-dimensional data, Proceedings of the National Academy of Sciences, 2003, 100(10), pp. 5591-5596.
Proceedings of the IEEE International Conference on Automation and Logistics, Zhengzhou, China, August 2012, pp. 190-195.
[13] C.Y. Zhou, Y.Q. Chen, Improving nearest neighbor classification with cam weighted distance, Pattern Recognition, 2006, 39(4), pp. 635-645.