
Transactions on Computer Science and Technology

June 2013, Volume 2, Issue 2, PP.17-23

HLLE Algorithm Based on the Weighted Distance


Shuaibin Lian1#, Qiuli Kong2, Xianhua Dai1
1. College of Information Science and Technology, Sun Yat-sen University, Guangzhou 510006, China
2. College of Mathematical Sciences, Guangxi Normal University, Guangxi 541004, China
# Email: shuai_lian@qq.com

Abstract
HLLE is an effective nonlinear dimension reduction algorithm that is widely applied in machine learning, pattern recognition, data mining, etc. However, HLLE is very sensitive to the neighborhood selection and to non-uniform data sampling. In this paper, an improved HLLE based on a weighted distance, named WHLLE, is proposed, which avoids unreasonable neighborhood selection by using a weighted Euclidean distance. WHLLE not only achieves a better dimension-reduction result but also preserves the intrinsic geometric structure of the original manifold. We validate the performance of WHLLE on two classical artificial manifolds. The experiments on artificial manifolds confirm that WHLLE preserves the neighborhood relationships, the global distribution and the intrinsic structure of the data better than the related algorithms.
Keywords: Machine Learning; Dimension Reduction; Hessian Locally Linear Embedding (HLLE) Algorithm; Weighted Distance


1 Introduction
Classical linear dimension-reduction methods include principal component analysis (PCA) [1] and multidimensional scaling (MDS) [2]; more recent nonlinear manifold-learning methods include ISOMAP [3], locally linear embedding (LLE) [4], Laplacian eigenmaps (LE) [5] and Hessian locally linear embedding (HLLE) [6]. HLLE in particular has been applied successfully, for example to plant-leaf recognition and to cancer microarray data classification [7][8].
* Supported by grant G61174163.
- 17 http://www.ivypub.org/cst

Several improvements of these locally linear methods have since been proposed [9][10][11][12]. However, HLLE remains sensitive to how neighborhoods are selected: it relies on local PCA of each neighborhood to estimate tangent coordinates, and with small neighborhood sizes or non-uniform sampling the Euclidean nearest neighbors may not reflect the local geometry of the manifold, which distorts the embedding. To address this, we propose a weighted-distance-based HLLE (WHLLE), which replaces the Euclidean neighbor search of HLLE with a weighted distance. The rest of the paper reviews the weighted distance, derives the WHLLE algorithm, and validates it experimentally.

2 Weighted Distance
This section reviews the weighted distance of [13], on which WHLLE is built.

Definition 1 (deformed distribution). Let $Y = (Y_1, Y_2, \dots, Y_d)^T$, where the components $Y_i$ are i.i.d. $N(0,1)$, so that $Y$ has the density

$f(y) = (2\pi)^{-d/2} \exp\left(-\tfrac{1}{2} y^T y\right)$.

For constants $a > 0$ and $b \ge 0$, the random vector

$X = aY + b \, \dfrac{Y}{\sqrt{Y^T Y}}$

is said to follow the deformed distribution $D_d(a, b)$.

Definition 2 (weighted distance). Let $x_0 \in R^d$ be a point whose surrounding samples follow a deformed distribution $D_d(a, b)$, with the parameters $(a, b)$ estimated from the neighborhood of $x_0$. Since $\|X\| = a\|Y\| + b$ under Definition 1, the isotropic radius of the underlying Gaussian can be restored, and the weighted distance from $x_0$ to a point $x \in R^d$ is

$D(x_0, x) = \dfrac{\|x - x_0\| - b}{a}$.

Because $(a, b)$ are estimated locally around the first argument, the weighted distance is in general asymmetric: $D(x_0, x) \ne D(x, x_0)$.
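As a minimal sketch, assuming the deformed-distribution form $X = aY + b\,Y/\|Y\|$ of [13] and assuming that the deformation parameters $(a, b)$ around the query point are already estimated (the parameter-estimation procedure of [13] is omitted here), the weighted distance can be computed as:

```python
import numpy as np

def weighted_distance(x0, x, a, b):
    """Weighted distance from x0 to x, where (a, b) are the deformation
    parameters of the deformed distribution estimated around x0.

    Since ||X|| = a*||Y|| + b under X = a*Y + b*Y/||Y||, the isotropic
    radius of the underlying Gaussian is restored as (||x - x0|| - b) / a.
    """
    r = np.linalg.norm(np.asarray(x, float) - np.asarray(x0, float))
    # Clamp at zero so points closer than the offset b get distance 0.
    return max(r - b, 0.0) / a
```

Because $(a, b)$ are taken from the first argument's neighborhood, `weighted_distance(x0, x, a0, b0)` generally differs from `weighted_distance(x, x0, a1, b1)`, reproducing the asymmetry $D(x_0, x) \ne D(x, x_0)$.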

3 The HLLE Algorithm
3.1 Theoretical basis of HLLE
HLLE [6] rests on three assumptions.

(A) Parametrization. Let $M \subset R^D$ be a smooth $d$-dimensional manifold that is the image of an open, connected parameter set $\Theta \subset R^d$ under a locally isometric map $\psi : \Theta \to M$. For $x \in M$, write $\psi^{-1}(x) = (f_1(x), \dots, f_d(x))$; that is, $f_i : M \to R$ with $f_i(x) = [\psi^{-1}(x)]_i$, $i = 1, \dots, d$, are the isometric coordinate functions. The goal of HLLE is to recover $f_1, \dots, f_d$, up to a rigid motion, from samples of $M$.

(B) Local tangent coordinates. For each $x \in M$, let $T_x M$ be the tangent space of $M$ at $x$, with $\dim T_x M = d$. An orthonormal basis of $T_x M$ identifies a neighborhood $U_x \subset M$ of $x$ with an open subset of $R^d$; the resulting chart $\varphi_x : U_x \to R^d$ supplies the local tangent coordinates in which HLLE works.

(C) Hessians. For a function $f$ in the Sobolev space $W^{2,2}(M)$, two Hessians can be attached to each point $x \in M$: the isometric Hessian $H^{iso} f(x)$, the ordinary Euclidean Hessian of $f \circ \psi$ taken in the parameter coordinates, and the tangent Hessian $H^{tan} f(x) = H^{euc}(f \circ \varphi_x^{-1})(\varphi_x(x))$, the Euclidean Hessian of $f$ expressed in the local tangent coordinates. When $\psi$ is a local isometry the two coincide: for all $x \in M$, $H^{iso} f(x) = H^{tan} f(x)$.

HLLE is built on the following Hessian functionals:

$\mathcal{H}^{iso}(f) = \int_M \|H^{iso} f(m)\|_F^2 \, dm, \qquad \mathcal{H}^{tan}(f) = \int_M \|H^{tan} f(m)\|_F^2 \, dm,$

where $\|\cdot\|_F$ denotes the Frobenius norm. The central result of [6] is that, under assumptions (A)-(C), $\mathcal{H}^{tan}$ has a $(d+1)$-dimensional null space spanned by the constant function and the $d$ isometric coordinate functions $f_1, \dots, f_d$. Two facts used in the proof are:
1. $\mathcal{H}^{iso}(f) = 0$ if and only if $H^{iso} f(x) = 0$ for all $x \in M$;
2. $\mathcal{H}^{tan}(f) = 0$ if and only if $H^{tan} f(x) = 0$ for all $x \in M$.
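As a worked one-line check of the easy direction of the null-space property (an illustration, with $\theta = \psi^{-1}(x)$ denoting the isometric parameter coordinates), take $f$ affine in the isometric coordinates, $f = c_0 + \sum_{i=1}^{d} c_i f_i$. Then

```latex
H^{\mathrm{iso}} f(x)
  = H^{\mathrm{euc}}\!\bigl(c_0 + c^{T}\theta\bigr)\Big|_{\theta=\psi^{-1}(x)}
  = 0
\quad\Longrightarrow\quad
\mathcal{H}^{\mathrm{iso}}(f)
  = \int_M \bigl\|H^{\mathrm{iso}} f(m)\bigr\|_F^2 \, dm
  = 0,
```

so every function in $\mathrm{span}\{1, f_1, \dots, f_d\}$ lies in the null space of the functional; the converse, that nothing else does, is the substance of the theorem in [6].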

3.2 The WHLLE algorithm
In the discrete setting, the functional $\mathcal{H}(f) = \int_M \|H^{tan} f(m)\|_F^2 \, dm$ is estimated from the sample points, and the embedding is read off from the $(d+1)$-dimensional null space of the resulting quadratic form: the $d$ coordinate functions are the null vectors orthogonal to the constant vector. WHLLE follows the same outline as HLLE of Section 3.1; the difference lies in how the neighborhoods are chosen (step 1 below).

Input: $X = [x_1, x_2, \dots, x_N]$, $x_i \in R^D$, $i = 1, 2, \dots, N$, a set of $N$ sample points; neighborhood size $k$.
Output: $Y = [y_1, y_2, \dots, y_N]$, $y_i \in R^d$, $i = 1, 2, \dots, N$, the $d$-dimensional embedding, where $d \le \min(k, D)$.

1) Neighborhood selection. For each $x_i$, $i = 1, \dots, N$, find its $k$ nearest neighbors $x_{i1}, \dots, x_{ik}$ under the weighted distance of Section 2 (HLLE uses the Euclidean distance here; this is the step in which WHLLE differs). Let $N_i$ be the index set of the neighbors, let $\bar{x}_i = \frac{1}{k}\sum_{p=1}^{k} x_{ip}$, and form the centered neighborhood matrix $M_i \in R^{k \times D}$ whose $p$-th row is $(x_{ip} - \bar{x}_i)^T$.

2) Local tangent coordinates. Compute the singular value decomposition $M_i = U \Sigma V^T$. The first $d$ columns $U_1^i, \dots, U_d^i$ of $U$ give the tangent coordinates of the $k$ neighbors of $x_i$.

3) Hessian design matrix. Form $X_i \in R^{k \times (1 + d + d(d+1)/2)}$ whose columns are the all-ones vector $\mathbf{1}_k$, the tangent coordinates $U_1^i, \dots, U_d^i$, and the $d(d+1)/2$ entrywise products $U_p^i \odot U_q^i$ for $1 \le p \le q \le d$ (for $d = 2$ these are $U_1^i \odot U_1^i$, $U_1^i \odot U_2^i$, $U_2^i \odot U_2^i$).

4) Local Hessian estimator. Apply Gram-Schmidt orthonormalization to the columns of $X_i$; the transpose of the last $d(d+1)/2$ columns of the result gives the local Hessian estimator $H_i \in R^{(d(d+1)/2) \times k}$, with entries $(H_i)_{rq}$, $r = 1, \dots, d(d+1)/2$, $q = 1, \dots, k$.

5) Global quadratic form. Let $S_i \in R^{k \times N}$ be the selection matrix with $(S_i)_{p,j} = 1$ if $j$ is the $p$-th neighbor of $x_i$ and $0$ otherwise. Accumulate the symmetric $N \times N$ matrix

$\mathcal{H} = \sum_{i=1}^{N} S_i^T H_i^T H_i S_i$.

6) Embedding. Compute the eigenvectors of $\mathcal{H}$ belonging to its $d+1$ smallest eigenvalues, discard the one associated with the constant vector, and stack the remaining $d$ eigenvectors as $V \in R^{N \times d}$. To fix the scale, form $R \in R^{d \times d}$ with $R_{rs} = \sum_{j \in N_1} V_{jr} V_{js}$, $1 \le r, s \le d$, summing over the first neighborhood, and output the embedding $Y = V R^{-1/2}$.
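The six steps above can be sketched in code. The following is a minimal NumPy implementation of the pipeline (steps 2-6); for brevity, step 1 uses plain Euclidean k-nearest neighbors, with a comment marking where WHLLE would substitute the weighted distance, and the final $R^{-1/2}$ rescaling is omitted in favor of returning the raw null-space eigenvectors. Function and variable names are illustrative, not taken from the paper.

```python
import numpy as np

def whlle_embed(X, k=12, d=2):
    """Sketch of the HLLE/WHLLE pipeline on data X of shape (N, D)."""
    N = X.shape[0]
    dp = d * (d + 1) // 2
    # Step 1: k nearest neighbors. WHLLE would rank the candidates by the
    # weighted distance of Section 2 instead of the Euclidean distance.
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    knn = np.argsort(D2, axis=1)[:, 1:k + 1]
    H = np.zeros((N, N))
    for i in range(N):
        Ni = knn[i]
        Mi = X[Ni] - X[Ni].mean(axis=0)          # centered neighborhood matrix
        # Step 2: local tangent coordinates from the SVD.
        U = np.linalg.svd(Mi, full_matrices=False)[0][:, :d]
        # Step 3: design matrix [1, U, pairwise products of U's columns].
        Xi = np.ones((k, 1 + d + dp))
        Xi[:, 1:1 + d] = U
        col = 1 + d
        for p in range(d):
            for q in range(p, d):
                Xi[:, col] = U[:, p] * U[:, q]
                col += 1
        # Step 4: orthonormalize; the trailing dp columns estimate the
        # local Hessian H_i (kept here as k x dp, i.e. H_i transposed).
        Q = np.linalg.qr(Xi)[0]
        Hi = Q[:, 1 + d:]
        # Step 5: accumulate the quadratic form sum_i S_i^T H_i^T H_i S_i.
        H[np.ix_(Ni, Ni)] += Hi @ Hi.T
    # Step 6: eigenvectors of the d+1 smallest eigenvalues; drop the
    # (near-)constant one and keep the next d as the embedding.
    _, vecs = np.linalg.eigh(H)
    return vecs[:, 1:d + 1]
```

Because the ones column of `Xi` is orthogonalized out first, each `Hi` is exactly orthogonal to the constant vector, so the constant is a numerically exact null vector of `H` and is reliably separated from the $d$ coordinate eigenvectors.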

4 Experiments
We evaluate WHLLE on the Swiss roll and S-curve data sets and compare it with HLLE, LLE, PCA and LE.

4.1 WHLLE on the Swiss roll
The Swiss roll is a two-dimensional sheet rolled up in three-dimensional space; a good embedding should unroll it into a planar region. Fig. 2 shows the original data and the two-dimensional embeddings produced by each method.

[Fig. 2] (a) The Swiss roll with 1200 sample points, neighborhood size k = 10; (b) WHLLE; (c) HLLE; (d) LLE; (e) PCA; (f) LE.

As Fig. 2 shows, WHLLE unrolls the three-dimensional Swiss roll of Fig. 2(a) into a two-dimensional plane while preserving the neighborhood relationships and the global distribution of the data. HLLE, LLE and LE also produce two-dimensional embeddings, but with visible distortion, and PCA, being a linear method, fails to unroll the manifold. Compared with HLLE, LLE, PCA and LE, the embedding produced by WHLLE is the most faithful to the intrinsic structure of the Swiss roll.

4.2 WHLLE on the S-curve
The S-curve is a two-dimensional sheet bent into an S shape in three-dimensional space; a good embedding should flatten the S into a planar region. Fig. 3 shows the original data and the two-dimensional embeddings produced by each method.

[Fig. 3] (a) The S-curve with 1200 sample points, neighborhood size k = 10; (b) WHLLE; (c) HLLE; (d) LLE; (e) PCA; (f) LE.

As Fig. 3 shows, WHLLE flattens the three-dimensional S-curve of Fig. 3(a) into a two-dimensional plane while keeping neighboring points adjacent. HLLE, LLE, PCA and LE either distort the S-shaped sheet or fail to flatten it, whereas WHLLE preserves the intrinsic structure of the S-curve best.

4.3 Discussion
The experiments in Sections 4.1 and 4.2 indicate that the weighted-distance neighborhood selection addresses the main weakness of HLLE. On both the Swiss roll and the S-curve (Figs. 2(a) and 3(a)), WHLLE was compared with HLLE, LLE, PCA and LE under the same sample size and neighborhood size. In both cases HLLE, LLE, PCA and LE produce embeddings that distort the neighborhood relationships or the global shape of the manifold, while WHLLE recovers the underlying two-dimensional structure most faithfully.

5 Conclusion
HLLE is an effective nonlinear dimension-reduction algorithm, but it is sensitive to neighborhood selection and to non-uniform sampling. In this paper we proposed WHLLE, an improved HLLE that selects neighborhoods with a weighted distance instead of the Euclidean distance, leaving the remaining steps of HLLE unchanged. Experiments on two classical artificial manifolds show that WHLLE preserves the neighborhood relationships, the global distribution and the intrinsic structure of the data better than HLLE, LLE, PCA and LE.

REFERENCES
[1] I.T. Jolliffe, Principal Component Analysis, second ed., Springer, New York, 2002
[2] T.F. Cox, M.A.A. Cox, Multidimensional Scaling, Chapman & Hall, London, 1994.
[3] Joshua B Tenenbaum et al. A Global Geometric Framework for Nonlinear Dimensionality Reduction, Science, 2000
[4] S. T. Roweis and L. K. Saul, Nonlinear dimensionality reduction by locally linear embedding, Science, 2000
[5] M. Belkin, et al.. Laplacian eigenmaps and spectral techniques for embedding and clustering, NIPS, Vancouver, Canada, 2001.
[6] D.L. Donoho, et al. Hessian eigenmaps: locally linear embedding techniques for high-dimensional data, Proceedings of the National Academy of Sciences, 2003


[7] Shanwen Zhang, Ying-Ke Lei. Modified locally linear discriminant embedding for plant leaf recognition. Neurocomputing, 74 (2011): 2284-2290
[8] Carlotta Orsenigo, Carlo Vercellis. A comparative study of nonlinear manifold learning methods for cancer microarray data classification. Expert Systems with Applications 40 (2013): 2189-2197
[9] Zhang Zhenyue,Zha Hongyuan. Principal manifolds and nonlinear dimensionality reduction via tangent space alignment. SIAM
Journal on Scientific Computing, 2004,26(1): 313-338
[10] O. Abdel-Mannan, A. Ben Hamza, A. Youssef. Incremental Hessian Locally Linear Embedding algorithm, International Symposium on Signal Processing and Its Applications, 2007, pp. 1-4
[11] E. Pekalska, A. Harol, R. Duin, B. Spillmann, H. Bunke, Non-euclidean or non- metric measures can be informative, in: Structural,
Syntactic, and Statistical Pattern Recognition, 2006, pp.871-880
[12] Sumin Zhang, Qiuli Kong. An Improved HLLE Algorithm Based on the Midpoint-Nearest Neighborhood Selection. Proceedings of the IEEE International Conference on Automation and Logistics, Zhengzhou, China, August 2012, pp. 190-195
[13] C.Y. Zhou, Y.Q. Chen, Improving nearest neighbor classification with cam weighted distance, Pattern Recognition, 2006, 39, pp. 1-11
