
International Journal of Rock Mechanics & Mining Sciences 60 (2013) 75–81


Technical Note

A new methodology to predict backbreak in blasting operation


M. Mohammadnejad a,*, R. Gholami a, F. Sereshki a, A. Jamshidi b

a Department of Petroleum, Mining and Geophysics, Shahrood University of Technology, Shahrood, Iran
b University of Tehran, Iran

Article info

Article history:
Received 12 December 2011
Received in revised form 19 August 2012
Accepted 20 December 2012
Available online 30 January 2013

1. Introduction
Breakage beyond the excavation limits commonly occurs in many types of blasting [1]. Backbreak is by far one of the most important types of breakage in opencast mining, as it has a profound effect on the final contours and wall stability conditions. Bauer [2] pointed out that if backbreak is not controlled, the overall pit-slope angle decreases, which in turn increases the stripping ratio. Backbreak also produces a significant amount of loose face rock and makes planned safety berms less effective. In addition, the destructive consequences of backbreak can lead to a considerable increase in total production costs [3]. Many studies have been carried out to identify the parameters influencing the intensity of backbreak [1,4,5]. These parameters can be broadly divided into two categories: controllable and uncontrollable parameters (see Table 1). Controllable parameters can be changed by the blaster in charge, while uncontrollable parameters are natural and cannot be controlled.
For instance, the severity of backbreak increases if burdens are too large. If stemming distances are excessive, poor top breakage is obtained and backbreak increases. On the other hand, a long delay time decreases the amount of backbreak. Gate et al. [6] believed that the main reasons for backbreak are insufficient delay timing and/or an increasing number of blasting rows.
In the past, empirical models have been developed for blasting design to predict the parameters required for proper fragmentation, reduced backbreak, a suitable muck pile profile, fewer boulders, etc. However, there is no straightforward way to predict backbreak using empirical models. Multivariate regression analysis is a suitable mathematical method that can be helpful for predicting the backbreak phenomenon, as it is able to establish a relationship between independent and dependent variables. This type of regression is more general than logistic regression, since the dependent variables are not restricted to two categories.
Support vector machine is a robust machine learning methodology based on the structural risk minimization (SRM) principle [7], introduced in the early 1990s as a non-linear solution for classification and regression tasks [8,9]. It stems from the framework of statistical learning theory, or Vapnik–Chervonenkis (VC) theory [10], and was originally developed for pattern recognition problems [11]. VC theory is by now the most successful tool to accurately describe the capacity of a learned model, and it can further tell us how to ensure generalization performance for future samples [12] by controlling the capacity of the model. The theory is mainly concerned with the consistency of a learning process and with how to control the generalization performance of a learning process in order to construct learning algorithms [9]. There are at least three reasons for the success of SVM: its ability to learn well with only a very small number of parameters, its robustness against data errors, and its computational efficiency compared with other intelligent computational methods [13]. By minimizing the structural risk, SVMs work well not only in classification [14,15] but also in regression [16,17]. SVM was soon introduced into many other research fields, e.g., image analysis [18,19], signal processing [20], remote sensing [21], and time series analysis [22], and usually outperformed traditional statistical learning methods [23]. Thus, SVM has been receiving increasing attention and has quickly become an active research field.
The aim of the current study is to present a more accurate and practical approach, based on regression analysis and the SVM method, for prediction of the backbreak phenomenon in mining sites. In this regard, the Sungun copper mine was selected as a case study to verify the strength of the suggested algorithm.

2. Sungun copper mine


* Corresponding author. E-mail address: mmnmojtaba@gmail.com (M. Mohammadnejad).

1365-1609 © 2012 Elsevier Ltd. All rights reserved.
http://dx.doi.org/10.1016/j.ijrmms.2012.12.019

The Sungun copper mine is located in Eastern Azerbaijan, at a distance of 75 km north-west of Ahar, Iran. The exact location of the


Table 1
Controllable and uncontrollable parameters affecting the intensity of backbreak.

Controllable variables
  Geometrical parameters: blast hole diameter, burden, spacing, stemming, blast hole inclination
  Explosive-dependent parameters: explosive type, max. charge/delay
  Operational parameters: delay sequence, delay intervals, powder factor

Uncontrollable variables
  Rock mass properties, structural terrain

Table 2
Blasting design parameters of the Sungun mine.

Parameter                   Symbol   Min.   Max.
Burden (m)                  B        2      5
Spacing (m)                 S        2      6.5
Hole depth (m)              L        10     14
Specific drilling (m/m3)    SD       0.04   0.28
Stemming                    T        1.8    4.5
Powder factor (kg/ton)      Pf       0.2    0.93

Fig. 1. Sungun copper open pit mine.

mine is 38° 38′ 20″ north latitude and 46° 45′ 35″ east longitude. It is the most important geological and industrial feature in the area and the largest open-cast copper mine in Iran. Fig. 1 shows a general perspective of this open pit mine.
The reserves of this mine are estimated to be as much as 995 million tons of copper ore. The ore is processed directly at a concentration plant at the mine site. The capacity of the concentration plant is now 170,000 tons of copper concentrate, which is going to be expanded to 300,000 tons.
Based on the exploration work carried out through 1979–1993, the estimated reserve of this mine is about 740 million tons with a copper grade of 0.661% and a molybdenum grade of 240 ppm. In 1993, complete technical studies were started by Iranian and foreign consulting companies. The results of these studies revealed that, as an open-pit mine, this mine has a 384 million ton ore reserve with a grade of 0.665% copper, a total overburden of 680 million tons, and an annual production of 7 million tons for the first 5 years. Concurrent pre-stripping operations were started in 1999, and constructions such as buildings, the refinery factory, etc. were built very quickly.
The latest blasting design parameters of the mine are listed in Table 1, whereas the minimum and maximum values of the parameters are given in Table 2. In this blasting operation, drill cuttings were used as the stemming material, and the delay time between the first and second rows was 80 ms, while it was 50 ms between the other rows. The total number of available data in this study was 193, consisting of the parameters listed in Table 2 and their respective backbreak.

3. Mechanism of backbreak
When an explosive charge confined within a blasthole is initiated, reactions take place resulting in the production of a large amount of gas at very high temperature and pressure in a very short time. An important characteristic of high explosives is the


Fig. 2. Backbreak result from excessive burden.

production of a very large amount of energy per unit of time. The gas pressure acts on the walls of the hole and thus subjects the medium beyond the hole to intense stresses and strains. Under these conditions, when the free surface is close enough to the blasthole, rock breakage occurs. If the energy released by blasting is not controlled, undesirable effects such as backbreak appear. This can result in significant problems in blasting design performance. Konya and Walter [1] described the causes of backbreak as: (1) excessive burden, (2) excessively stiff benches, (3) long stemming depths on stiff benches, and (4) improper timing delay.
Burden is the shortest distance of rock between the shot line and the nearest free face or open face at the time of detonation. Backbreak can result from excessive burden on the shot holes, which causes the explosive to break and crack the rock radially further behind the last row of shot holes (Fig. 2).
The stiffness ratio (H/B) is the ratio of bench height (H) to burden (B). Excessively stiff benches (H/B < 2) cause more uplift and backbreak near the collar of the shot hole (see Fig. 3). According to Konya and Walter [1], to achieve better blasting results the stiffness ratio should be kept in the range of 3–4.
Stemming (T) refers to inert material, such as drill cuttings, packed into the collar of the blast hole to confine the gases during detonation. Long stemming depths on stiff benches promote backbreak. Improper timing delay from row to row may cause


In the present study, multivariate regression analysis was applied to build a predictive model based on the parameters in the available database. The equation obtained from this analysis is as follows:

backbreak = 0.857B + 0.763L + 0.72S + 0.634T − 0.501SD − 0.216Pf − 17.7   (1)

Fig. 3. Backbreak resulting from an excessive bench stiffness ratio (L/B < 2).

backbreak. If the timing is too short, excessive confinement of the gases occurs in the last row of the shot.

4. Proposing a new empirical equation to predict backbreak


In rock engineering practice, statistically based empirical equations have been used extensively to predict a variable from other operational or geological parameters. Empirical equations are of great importance during the early stages of rock excavation and design work, since they are more practical than extensive theoretical analysis. In this study, regression analysis is used to establish a model between the available parameters (independent variables) and backbreak (dependent variable), as well as to identify the parameters that have the greatest effect on backbreak. To do this, simple and multivariate regression analyses were used to assess the effect of each parameter separately and to develop a new empirical equation to predict backbreak.
4.1. Simple regression analysis
Simple regression analysis is a mathematical method often used to evaluate the possible relationship of each independent parameter with the dependent variable. In this study, this method was used to assess the effect of each blasting pattern parameter on the severity of backbreak. Hence, simple regression analysis was applied to the data collected from the Sungun copper mine. Fig. 4 shows the results obtained from the simple regression analysis.
As shown in Fig. 4, burden is the most relevant parameter, presenting a correlation coefficient of 0.769 with backbreak. It can easily be inferred from Fig. 4 that the effect of the powder factor on the backbreak phenomenon is negligible. The remaining parameters, however, play important roles in the severity of backbreak. According to the results of the linear regression, the severity of backbreak increases with increasing hole length, spacing and stemming, whereas increasing the specific drilling decreases the potential for backbreak.
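The per-parameter screening above can be sketched with a correlation coefficient and a one-variable linear fit. The data below are synthetic with a made-up coefficient, not the Sungun database:

```python
import numpy as np

# Per-parameter screening, as in the simple regression step.
# Synthetic data with an illustrative coefficient; not the Sungun records.
rng = np.random.default_rng(0)
n = 193                                 # number of blast records in the study
burden = rng.uniform(2.0, 5.0, n)       # B (m), range as in Table 2
backbreak = 0.857 * burden + rng.normal(0.0, 0.5, n)  # synthetic dependence

# Pearson correlation coefficient of one parameter with backbreak
r = np.corrcoef(burden, backbreak)[0, 1]

# Slope and intercept of the simple linear fit
slope, intercept = np.polyfit(burden, backbreak, 1)
print(f"R = {r:.3f}, fit: backbreak = {slope:.3f}*B {intercept:+.3f}")
```

Repeating this for each column of the database reproduces the kind of screening shown in Fig. 4.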
4.2. Multivariate regression analysis
Multivariate regression is an extension of regression analysis that incorporates additional independent variables in the predictive equation. Using this method, one can easily determine the relationship between the independent variables and the predicted (dependent) variable. This method has been used successfully to develop predictive models in different rock engineering practices [24].
where B, S, L, T, SD and Pf are, respectively, burden, spacing, hole depth, stemming, specific drilling and powder factor. Fig. 5 shows the correlation coefficient obtained between the measured and predicted backbreak using the multivariate regression analysis method.
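The multivariate fit itself can be sketched as an ordinary least-squares solve with an intercept column. The data and "true" coefficients below are synthetic stand-ins chosen only to mimic the parameter ranges of Table 2, not the published Sungun values:

```python
import numpy as np

# Sketch of the multivariate regression step on synthetic data.
rng = np.random.default_rng(1)
n = 193
# columns: B, L, S, T, SD, Pf, ranges roughly as in Table 2
X = np.column_stack([
    rng.uniform(2, 5, n),        # burden (m)
    rng.uniform(10, 14, n),      # hole depth (m)
    rng.uniform(2, 6.5, n),      # spacing (m)
    rng.uniform(1.8, 4.5, n),    # stemming
    rng.uniform(0.04, 0.28, n),  # specific drilling (m/m3)
    rng.uniform(0.2, 0.93, n),   # powder factor (kg/ton)
])
true_w = np.array([0.857, 0.763, 0.72, 0.634, -0.501, -0.216])
y = X @ true_w + rng.normal(0, 0.3, n)        # synthetic backbreak

# least-squares fit of y = Xw + b (intercept appended as a ones column)
A = np.column_stack([X, np.ones(n)])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("fitted coefficients:", np.round(coef, 3))

# correlation between measured and predicted values (cf. Fig. 5)
pred = A @ coef
R = np.corrcoef(y, pred)[0, 1]
print(f"R = {R:.3f}")
```

With enough records, the recovered coefficients approach the generating ones, which is the sense in which Eq. (1) summarizes the database.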

5. Support vector machine


Support vector machine (SVM) applied to regression analysis is called support vector regression (SVR). The aim of SVR is to find a function f(x) that approximates the value y(x) according to the available data, i.e.

(x1, y1), ..., (xm, ym) ∈ X × Y, where X ⊆ R^n and Y ⊆ R   (2)
2
To estimate the function, only a small fraction of the training samples, called support vectors (SVs), are taken into account. In addition, a specific loss function that is ε-insensitive is used to create a sparseness property for the SVR algorithm. This function is described as

|y − f(x)|ε = 0 if |y − f(x)| ≤ ε; otherwise |y − f(x)| − ε   (3)

In the above equation, f(x) is the estimated value of y, and errors smaller than the ε-boundary are not penalized.
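The ε-insensitive loss of Eq. (3) can be sketched directly; the inputs below are illustrative:

```python
import numpy as np

# Epsilon-insensitive loss of Eq. (3): errors inside the epsilon-tube
# are not penalized, larger errors grow linearly past the tube edge.
def eps_insensitive_loss(y, f_x, eps):
    err = np.abs(np.asarray(y) - np.asarray(f_x))
    return np.where(err <= eps, 0.0, err - eps)

# errors 0.2, 0.5, 0.9 with eps = 0.3 give losses 0.0, 0.2, 0.6
print(eps_insensitive_loss([1.0, 1.0, 1.0], [1.2, 1.5, 0.1], eps=0.3))
```

Only samples falling outside the ε-tube contribute to the loss, which is what makes the resulting solution sparse in support vectors.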
To develop a regression algorithm, linear function estimation is considered first. Thus, the function of the input vector x has the following form [25,26]:

f(x) = ⟨w, x⟩ + b, where w, x ∈ X ⊆ R^n and b ∈ R   (4)

It should be noted that the brackets here indicate the inner product of two vectors in a Hilbert space (i.e., a space in which the inner product of two vectors has a real value). To find f(x), one can minimize the regularized risk functional (Rreg) defined as follows:
Rreg[f] = (1/2)||w||^2 + C·Remp[f],  where  Remp[f] = (1/m) Σi=1..m |yi − f(xi)|ε   (5)

The term Remp in the above equation is the empirical error of the training data, defined in the ε-insensitive loss function framework. The coefficient C in Eq. (5) controls the trade-off between the complexity of f(x) and the empirical error. In fact, minimization of Rreg illustrates the idea of structural risk minimization, which states that to achieve the minimum risk, simultaneous control of the complexity and the error of the model is necessary. This is the basic idea used to improve the generalization ability of the SVR.
It can be easily inferred that minimizing Eq. (5) is equivalent to the following convex constrained quadratic optimization problem:

L(w, ξ, ξ′) = (1/2)||w||^2 + C Σi=1..N (ξi + ξ′i)   (6)

subject to
  yi − (w^T xi + b) ≤ ξi + ε
  (w^T xi + b) − yi ≤ ξ′i + ε
  ξi, ξ′i ≥ 0

In Eq. (6), C is used to ensure that the margin ε is maximized and the error ξ is minimized. According to Eq. (6), any error smaller than ε does not require a nonzero ξi or ξ′i and does not enter the objective function [27–29].


Fig. 4. Correlation of blasting pattern parameters (burden, spacing, hole depth, powder factor, stemming, specific drilling) with the backbreak; each panel plots the data points together with a linear fit and its correlation coefficient.

Introducing Lagrange multipliers (α and α′) and requiring the parameters C and ε to be greater than zero, the equation of an optimum hyper-plane can be obtained by maximizing the following expression:

L(α, α′) = −(1/2) Σi=1..N Σj=1..N (αi − α′i)(αj − α′j)⟨xi, xj⟩ + Σi=1..N yi(αi − α′i) − ε Σi=1..N (αi + α′i)   (7)

subject to 0 ≤ αi, α′i ≤ C.

To get a better generalization in the non-linear case, data points are mapped into a space called the feature space (i.e., a Hilbert or inner-product space) through a replacement such as

⟨xi, xj⟩ → ⟨φ(xi), φ(xj)⟩   (9)

The value of the function φ(xi) is not required to be known, as it can be defined by the choice of an appropriate kernel such that k(xi, xj) = φ(xi)·φ(xj). Selecting a suitable kernel makes it possible to separate data in the feature space even though the original input space is non-linear. Thus, whereas data for the n-parity problem cannot be separated by a


hyper-plane in the input space, it can be separated in the feature space by a proper kernel [30–35]. Table 3 presents several kernels commonly used for regression analysis.
According to the definition of the kernel, the nonlinear regression estimation problem of SVR takes the following form:

yj = Σi=1..N (αi − α′i) φ(xi)^T φ(xj) + b = Σi=1..N (αi − α′i) K(xi, xj) + b   (10)

where b can be determined using the fact that the constraints of Eq. (6) give ξi = 0 if 0 < αi < C, and ξ′i = 0 if 0 < α′i < C [36].
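The kernel expansion of Eq. (10) can be sketched as below. The support vectors and dual coefficients here are invented for illustration; in practice they come from solving the quadratic program of Eqs. (6) and (7):

```python
import numpy as np

# Dual-form prediction of Eq. (10): the regression estimate is a
# kernel expansion over the support vectors.
def rbf_kernel(xi, xj, sigma):
    return np.exp(-np.sum((xi - xj) ** 2) / (2.0 * sigma ** 2))

def svr_predict(x, support_vectors, dual_coef, b, sigma):
    # dual_coef[i] corresponds to (alpha_i - alpha'_i) in Eq. (10)
    return sum(c * rbf_kernel(sv, x, sigma)
               for sv, c in zip(support_vectors, dual_coef)) + b

# made-up support vectors and coefficients, purely illustrative
svs = np.array([[0.0], [1.0], [2.0]])
dual = np.array([0.5, -0.2, 0.7])
print(svr_predict(np.array([1.0]), svs, dual, b=0.1, sigma=0.49))
```

Note that only support vectors (points with nonzero dual coefficients) contribute to the sum, which is where the sparseness of SVR shows up at prediction time.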

6. Prediction of backbreak using support vector machine


According to the results of the multivariate regression analysis, five parameters are mainly effective in predicting the backbreak phenomenon. Thus burden, spacing, hole depth, specific drilling, and stemming were considered as the inputs of the SVM for prediction of backbreak. To train the network, 70% of the available data (i.e., 135 data points) were used, whereas the remaining data (i.e., 58 data points) were used to test the SVM. It is widely known that the efficiency of an SVM depends mostly on a set of parameters: the capacity parameter C, the ε of the ε-insensitive loss function, and the kernel K with its corresponding parameters such as σ. Parameter C is used as a regularization parameter which can control the

Fig. 5. Results of multivariate regression analysis showing a correlation coefficient of 0.91 between measured and predicted backbreak.

trade-off between maximum generalization and minimum training error. Needless to say, if parameter C is chosen too small, fitting the training data becomes very hard to achieve. On the other hand, if C is selected too large, the SVR algorithm simply overfits the training data and the testing efficiency will be disappointing. To overcome this, Wang et al. [38] pointed out that a large value such as 100 should be assigned to parameter C.
The loss function parameter (ε) is very sensitive to the various types of noise that may be present in the database. Even with sufficient knowledge about the noise, the optimal value of ε should be selected with caution. This parameter plays a critical role, as it can prevent the entire training set from meeting the boundary conditions and so allows for the possibility of sparsity in the dual formulation's solution.
Another choice that needs to be made properly during the training of an SVM is the kernel function. It has repeatedly been mentioned in the literature that the Gaussian radial basis function (RBF) has superior efficiency compared with other kernel functions [39,40]. The Gaussian kernel function has the following form (see also Table 3):

K(xi, xj) = exp(−||xi − xj||^2 / 2σ^2)   (11)

In the Gaussian kernel function, the parameter σ controls the amplitude of the Gaussian function and the generalization ability of the SVR. This parameter needs to be determined wisely to achieve the best network.
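The role of σ in Eq. (11) can be sketched by evaluating the kernel at a fixed pair of points for a few widths; the sample points are arbitrary:

```python
import numpy as np

# Effect of sigma in the Gaussian kernel of Eq. (11): a small sigma
# makes the kernel decay quickly with distance, a large sigma makes
# distant points look similar.
def gaussian_kernel(xi, xj, sigma):
    return np.exp(-np.linalg.norm(xi - xj) ** 2 / (2.0 * sigma ** 2))

x1, x2 = np.array([0.0]), np.array([1.0])
for sigma in (0.1, 0.49, 1.0):   # 0.49 is the optimum found in this study
    print(f"sigma={sigma}: K = {gaussian_kernel(x1, x2, sigma):.4f}")
```

The kernel value at unit distance rises monotonically with σ, which is why σ directly governs how far a support vector's influence reaches.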
To find the optimum values of the parameters σ and ε, the data set was separated into training and testing sets of 135 and 58 data points, respectively. Leave-one-out (LOO) cross-validation was performed for selection of the optimum parameters on the training set. The LOO method consists of removing one example from the training set, constructing the decision function on the basis of the remaining training data, and then testing on the removed example [41]. The root mean square error (RMSE) was used as an error criterion to evaluate the quality of the model during the iterations.
To obtain the optimal value of the parameter σ of the Gaussian kernel, the SVR was trained with different values of σ varying from 0.01 to 1. The RMSE corresponding to each σ was calculated based on LOO cross-validation to find the best value. Fig. 6 shows the curve of RMSE versus σ during training. As shown in Fig. 6, the optimal value of σ was found to be 0.49. In addition, to find an optimal value for the parameter ε, the RMSE corresponding to various ε was also calculated. The result is shown in Fig. 7. As depicted in Fig. 7, the optimal value of the parameter ε is 0.63.
As mentioned, the parameters σ, ε and C were found to be 0.49, 0.63 and 100, respectively. Moreover, the number of support vectors of the SVR used in the current study was 30. Two conventional criteria were considered to assess the efficiency of the SVR: the root mean square error (RMSE) and the correlation coefficient (R). The RMSE is calculated using the following equation:

RMSE = √((1/n) Σi=1..n (yi − ŷi)^2)   (12)

Table 3
Polynomial, normalized polynomial, radial basis function (Gaussian) and Pearson Universal (PUK) kernels.

Kernel function                                                      Type of classifier
K(xi, xj) = (xi^T xj + 1)^r                                          Complete polynomial of degree r
K(xi, xj) = (xi^T xj + 1)^r / √(xi^T xj · yi^T yj)                   Normalized polynomial kernel of degree r
K(xi, xj) = exp(−||xi − xj||^2 / 2σ^2)                               Gaussian (RBF); parameter σ controls the half-width of the curve fitting peak
K(xi, xj) = 1 / [1 + (2√(||xi − xj||^2) √(2^(1/ω) − 1) / σ)^2]^ω     Pearson VII Universal Kernel (PUK); the two parameters σ and ω control the Pearson width and the tailing factor of the curve fitting peak


Fig. 8. Results of SVM during the testing step showing a correlation coefficient of 0.91 between measured and predicted backbreak.

in which yi and ŷi are, respectively, the measured and predicted values, whereas n stands for the number of samples used for training or testing the network. The RMSE is routinely used as a criterion to show the discrepancy between the measured and predicted values of the network: the lower the RMSE, the more accurate the prediction. The correlation coefficient, R, is calculated by

R = √(1 − Σi=1..n (yi − ŷi)^2 / Σi=1..n (yi − ȳ)^2)   (13)

The R criterion is widely used as a representation of the initial uncertainty of the model. The best network model, which is unlikely to be achieved, would have RMSE = 0 and R = 1. Finally, the SVR model was run with the optimum parameters obtained during the training phase. Fig. 8 shows the measured versus predicted values of backbreak obtained in the testing step of the SVR.
As seen in Fig. 8, the SVR is able to give an accurate prediction based on the available data. The relative RMSE of the SVR model during training and testing was 0.25 and 0.34, respectively. Fig. 9 shows the discrepancy between measured and predicted values of backbreak during the testing of the SVR.
From Fig. 9, it can be concluded that SVR is a suitable approach for prediction of backbreak, since there is only a small discrepancy between the measured and predicted values. Given the RMSE and correlation coefficient of the SVR, as well as its reasonable running time during the learning process, this approach is strongly recommended for prediction of the backbreak phenomenon.
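The two criteria of Eqs. (12) and (13) can be sketched as below, reading Eq. (13) as R = √(1 − SSE/SST); the measured/predicted vectors are illustrative, not the study's data:

```python
import numpy as np

# RMSE (Eq. 12) and correlation coefficient R (Eq. 13) used to
# judge the SVR model.
def rmse(y, y_hat):
    y, y_hat = np.asarray(y), np.asarray(y_hat)
    return float(np.sqrt(np.mean((y - y_hat) ** 2)))

def corr_coef(y, y_hat):
    y, y_hat = np.asarray(y), np.asarray(y_hat)
    ss_res = np.sum((y - y_hat) ** 2)          # residual sum of squares
    ss_tot = np.sum((y - y.mean()) ** 2)       # total sum of squares
    return float(np.sqrt(1.0 - ss_res / ss_tot))

measured  = [10.2, 11.0, 12.5, 13.1, 11.8]
predicted = [10.0, 11.3, 12.2, 13.4, 11.6]
print(f"RMSE = {rmse(measured, predicted):.3f}")
print(f"R    = {corr_coef(measured, predicted):.3f}")
```

Lower RMSE and R closer to 1 both indicate a closer match between the measured and predicted backbreak series.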


Fig. 9. Comparison of measured and predicted values of backbreak obtained by SVM.

7. Conclusion
In this paper, an attempt has been made to present an application of the support vector machine (SVM) to prediction of the severity of the backbreak phenomenon during blasting operations. To recognize the parameters most relevant to backbreak, simple and multivariate regression analyses were used. The results obtained from these two regression methods revealed that the powder factor has only a minor relationship with backbreak, and hence it was not used as an input for training the SVR. To train the SVR, burden, spacing, hole depth, specific drilling, and stemming were taken into account as the inputs, whereas backbreak was considered as the output. The results showed that the SVR is a reliable and accurate method for prediction of backbreak, as it can predict backbreak with a correlation coefficient of 0.94.
References
[1] Konya CJ, Walter EJ. Rock blasting and overbreak control. 1st ed. USA: National Highway Institute; 1991. p. 1903.
[2] Bauer A. Wall control blasting in open pits. CIM Special Vol. 30, Canadian Institute of Mining and Metallurgy, 14th Canadian rock mechanics symposium; 1982. p. 3–10.
[3] Scoble MJ, Lizotte YC, Paventi M, Mohanty BB. Measurement of blast damage. In: Proceedings of the SME annual meeting; 1996. p. 96–103.
[4] Jenkins SS. Adjusting blast design for best results. In: Pit and quarry. Rotterdam: Balkema; 1981.
[5] Monjezi M, Dehghani H. Evaluation of effect of blasting pattern parameters on back break using neural networks. Int J Rock Mech Min Sci 2008;45:1446–53.
[6] Gate WC, Ortiz BLT, Florez RM. Analysis of rockfall and blasting backbreak problems. In: Proceedings of the US rock mechanics symposium; 2005. p. 671–80.
[7] Stitson M, Gammerman A, Vapnik V, Vovk V, Watkins C, Weston J. Advances in kernel methods: support vector learning. Cambridge, MA: MIT Press; 1999. p. 285–92.
[8] Vapnik V. The nature of statistical learning theory. New York: Springer; 1995.
[9] Behzad M, Asghari K, Morteza E, Palhang M. Generalization performance of support vector machines and neural networks in runoff modeling. Expert Syst Appl 2009;36:7624–9.
[10] Vapnik V. Statistical learning theory. New York: Wiley; 1998.
[11] Cristianini N, Shawe-Taylor J. An introduction to support vector machines. Cambridge: Cambridge University Press; 2000.
[12] Cortes C. Prediction of generalization ability in learning machines. PhD thesis. Department of Computer Science, University of Rochester, USA; 1995.
[13] Martinez-Ramon M, Cristodoulou Ch. Support vector machines for antenna array processing and electromagnetics. Universidad Carlos III de Madrid, Spain: Morgan and Claypool; 2006.
[14] Zhou D, Xiao B, Zhou H. Global geometry of SVM classifiers. Technical report. Institute of Automation, Chinese Academy of Sciences, AI Lab; 2002.
[15] Bennett KP, Bredensteiner EJ. Geometry in learning, geometry at work. Washington, DC: Mathematical Association of America; 1998.
[16] Mukherjee S, Osuna E, Girosi F. Nonlinear prediction of chaotic time series using a support vector machine. In: Proceedings of the 1997 IEEE workshop, Amelia Island, Florida; 1997. p. 511–20.
[17] Jeng JT, Chuang CC, Su SF. Support vector interval regression networks for interval regression analysis. Fuzzy Sets Syst 2003;138(2):283–300.
[18] Seo KK. An application of one-class support vector machines in content-based image retrieval. Expert Syst Appl 2007;33(2):491–8.
[19] Trontl K, Smuc T, Pevec D. Support vector regression model for the estimation of γ-ray buildup factors for multi-layer shields. Ann Nucl Energy 2007;34(12):939–52.
[20] Widodo A, Yang BS. Wavelet support vector machine for induction machine fault diagnosis based on transient current signal. Expert Syst Appl 2008;35(1-2):307–16.
[21] Sanchez-Hernandez C, Boyd DS, Foody GM. Mapping specific habitats from remotely sensed imagery: support vector machine and support vector data description based classification of coastal saltmarsh habitats. Ecol Inf 2002;2:83–8.
[22] Francis EH, Tay LJ. Modified support vector machines in financial time series forecasting. Neurocomputing 2002;48:847–61.
[23] Steinwart I. Support vector machines. Los Alamos National Laboratory, Information Sciences Group (CCS-3); 2008.
[24] Khademi Hamidi J, Shahriar K, Rezai B, Rostami J. Performance prediction of hard rock TBM using Rock Mass Rating (RMR) system. Tunnell Undergr Space Technol 2010;25:333–45.
[25] Sanchez DV. Advanced support vector machines and kernel methods. Neurocomputing 2003;55:5–20.
[26] Tran QA, Liu X, Duan H. Efficient performance estimate for one-class support vector machine. Pattern Recognition Lett 2005;26:1174–82.
[27] Li Q, Jiao L, Hao Y. Adaptive simplification of solution for support vector machine. Pattern Recognition 2007;40:972–80.
[28] Eryarsoy E, Koehler GJ, Aytug H. Using domain-specific knowledge in generalization error bounds for support vector machine learning. Decision Support Syst 2009;46:481–91.
[29] Lin HJ, Yeh JP. Optimal reduction of solutions for support vector machines. Appl Math Comput 2009;214:329–35.
[30] Scholkopf B, Smola AJ, Muller KR. Nonlinear component analysis as a kernel eigenvalue problem. Neural Comput 1998;10:1299–319.
[31] Walczack B, Massart DL. The radial basis functions: partial least squares approach as a flexible non-linear regression technique. Anal Chim Acta 1996;331:177–85.
[32] Rosipal R, Trejo LJ. Kernel partial least squares regression in reproducing kernel Hilbert space. J Mach Learning Res 2004;2:97–123.
[33] Mika S, Ratsch G, Weston J, Scholkopf B, Muller KR. Fisher discriminant analysis with kernels. In: Proceedings of the NNSP'99; 1999. p. 41–8.
[34] Scholkopf B, Smola AJ. Learning with kernels. Cambridge: MIT Press; 2002.
[35] Gunn SR. Support vector machines for classification and regression. Technical report. Image Speech and Intelligent Systems Research Group, University of Southampton, Southampton, UK; 1997.
[36] Wang L. Support vector machines: theory and applications. Nanyang Technological University, School of Electrical and Electronic Engineering; 2005.
[38] Wang WJ, Xu ZB, Lu WZ, Zhang XY. Determination of the spread parameter in the Gaussian kernel for classification and regression. Neurocomputing 2003;55:643–63.
[39] Dibike YB, Velickov S, Solomatine D, Abbott MB. Model induction with support vector machines: introduction and application. J Comput Civ Eng 2001;15(3):208–16.
[40] Han D, Cluckie I. Support vector machines identification for runoff modeling. In: Liong SY, Phoon KK, Babovic V, editors. Proceedings of the sixth international conference on hydroinformatics, Singapore; 2004. p. 214.
[41] Liu H, Yao X, Zhang R, Liu M, Hu Z, Fan B. The accurate QSPR models to predict the bioconcentration factors of nonionic organic compounds based on the heuristic method and support vector machine. Chemosphere 2006;63:722–33.
