classical bits, a quantum register of n quantum bits can denote 2^n different classical states, so QPSO can search a broader space than basic PSO.

The pseudo code of the procedure is as follows:

For each particle
    Initialize quantum particle x = (x11, x12, ..., xmn)
    % xij is the j-th quantum bit in the encoding of the i-th individual,
    % denoted by (αij, βij), both initialized to 1/√2
End
Do
    For each particle
        Update the velocity and position of each quantum bit in the
        particle by equations (1) and (2)
        The position and velocity are adjusted as follows:
            xi = Xmax,   if xi > Xmax
            xi = -Xmax,  if xi < -Xmax                          (6)
            vi = Vmax,   if vi > Vmax
            vi = -Vmax,  if vi < -Vmax                          (7)
        Evaluate the fitness value
        If f(xi) < f(pbesti)
            then the best position in history pbesti = xi;
        End
        If f(xi) < f(gbesti)
            then the global optimal solution gbesti = xi;
        End
    End
While the maximum iteration or minimum error criterion is not attained

2.2. Improved LS-SVM algorithm

Suppose the training set is S = {(xk, yk) | k = 1, 2, ..., N}, where xk ∈ R^n and yk ∈ R are the input and output data respectively. Unlike the classical SVM, the objective function and constraint condition of LS-SVM are constructed by the SRM rule, as follows:

    min_{w,b,e} J(w, e) = (1/2) w^T w + (γ/2) Σ_{k=1}^{N} ek^2

We rewrite equation (9) as follows:

    Ax = z    (A ∈ R^{m×n}, z ∈ R^m)                            (10)

In the LS-SVM algorithm, this system is generally solved by the least squares method [9]. However, when the dimension of A^T A is large, it is difficult to solve directly, so an iterative method is applied instead.

The PSO algorithm is based on the idea of searching for the optimal solution by iteration, so a large matrix equation can be solved by it. The matrix equation (10) is regarded as the problem of finding the optimal particle X = (x1, x2, ..., xn)^T by the QPSO algorithm iteratively. In solving the matrix equation, the fitness function is defined as the mean square error of the residual (z - Ax):

    f(x) = (1/mn) Σ_{i=1}^{m} ( zi - Σ_{j=1}^{n} aij xj )^2     (11)

Example: To verify the validity of solving the matrix equation by the QPSO algorithm above, we use the improved PSO and QPSO algorithms respectively to solve a typical ill-conditioned Hilbert linear system Ax = z, where

    A = | 1     1/2       ...   1/n      |
        | 1/2   1/3       ...   1/(n+1)  |
        | ...   ...       ...   ...      |
        | 1/n   1/(n+1)   ...   1/(2n-1) |

is an n-th order Hilbert matrix. We choose n = 5 and the solution x = [1 1 1 1 1]^T, so that z = [2.2833 1.4500 1.0929 0.8845 0.7456]^T. The maximum and minimum singular values of this system are σmax = 1.5671 and σmin = 3.2879×10^-6 respectively, and its condition number is k(A) = σmax/σmin = 4.7661×10^5. Obviously, it is a severely ill-conditioned equation system.

The parameters of the algorithms are set as follows: the number of particles is 20, the dimension of the solution space is m = 5, the number of quantum bits is 5, the maximum number of iterations is Tmax = 100, and r1 = r2
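The Hilbert example can be sketched in Python. This is a minimal illustration using a plain PSO update rather than the paper's QPSO; the inertia weight w, accelerations c1 and c2, the clamp values, and the iteration count are assumed settings. Velocity and position are clamped as in equations (6) and (7), and the fitness is the residual mean square error of equation (11).

```python
# Solve the ill-conditioned 5x5 Hilbert system Ax = z with a basic PSO.
# Illustrative stand-in for the paper's QPSO; w, c1, c2, clamps, and the
# iteration count are assumed values, not the authors' exact settings.
import random

n = 5
A = [[1.0 / (i + j + 1) for j in range(n)] for i in range(n)]  # Hilbert matrix
z = [sum(row) for row in A]                                    # z for x = [1,1,1,1,1]

def fitness(x):
    """Mean square error of the residual (z - Ax), as in eq. (11)."""
    m = len(z)
    return sum((z[i] - sum(A[i][j] * x[j] for j in range(n))) ** 2
               for i in range(m)) / (m * n)

random.seed(0)
X_MAX, V_MAX = 2.0, 0.5        # position/velocity clamps, as in eqs. (6)-(7)
w, c1, c2 = 0.7, 1.5, 1.5      # inertia and acceleration weights (assumed)
P = 20                         # number of particles (as in the paper)
xs = [[random.uniform(-X_MAX, X_MAX) for _ in range(n)] for _ in range(P)]
vs = [[0.0] * n for _ in range(P)]
pbest = [x[:] for x in xs]
gbest = min(pbest, key=fitness)
f0 = fitness(gbest)

for _ in range(300):           # iterations (the paper uses Tmax = 100)
    for k in range(P):
        for j in range(n):
            vs[k][j] = (w * vs[k][j]
                        + c1 * random.random() * (pbest[k][j] - xs[k][j])
                        + c2 * random.random() * (gbest[j] - xs[k][j]))
            vs[k][j] = max(-V_MAX, min(V_MAX, vs[k][j]))             # eq. (7)
            xs[k][j] = max(-X_MAX, min(X_MAX, xs[k][j] + vs[k][j]))  # eq. (6)
        if fitness(xs[k]) < fitness(pbest[k]):
            pbest[k] = xs[k][:]
            if fitness(pbest[k]) < fitness(gbest):
                gbest = pbest[k][:]

print([round(v, 4) for v in z])  # matches the paper's z vector
```

Note that because the condition number is large, many x far from [1 1 1 1 1] also give a small residual, so a small fitness value does not by itself certify an accurate solution; this is exactly why the Hilbert system is a stringent test case.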
3. Application example
Changqing oil field is a hidden lithologic oil reservoir characterized by low yield, low oil abundance, and large-area distribution. It is difficult to evaluate the well logging of an oil layer quantitatively by conventional recognition methods. Therefore, a key well segment (1200m~1290m) is evaluated by the improved LS-SVM algorithm based on the QPSO algorithm. The improved LS-SVM recognition model for the oil layer is designed as shown in Figure 2.
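The LS-SVM training stage of the recognition model reduces to solving a single linear KKT system, [0 1^T; 1 K + I/γ][b; α] = [0; y]. The sketch below assumes the standard LS-SVM regression formulation with a linear kernel and toy one-dimensional data; the paper's actual model trains on well-log attributes with QPSO-tuned parameters.

```python
# Minimal LS-SVM training sketch: assemble and solve the KKT system
#   [ 0    1^T         ] [ b     ]   [ 0 ]
#   [ 1    K + I/gamma ] [ alpha ] = [ y ]
# Toy data and the linear kernel are assumptions for illustration only.

def gauss_solve(M, rhs):
    """Solve M x = rhs by Gaussian elimination with partial pivoting."""
    n = len(M)
    aug = [row[:] + [rhs[i]] for i, row in enumerate(M)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(aug[r][c]))  # pivot row
        aug[c], aug[p] = aug[p], aug[c]
        for r in range(c + 1, n):
            f = aug[r][c] / aug[c][c]
            for k in range(c, n + 1):
                aug[r][k] -= f * aug[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):  # back substitution
        x[r] = (aug[r][n] - sum(aug[r][k] * x[k] for k in range(r + 1, n))) / aug[r][r]
    return x

def K(a, b):
    return a * b                       # linear kernel (assumed)

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.0 * v + 1.0 for v in xs]       # toy targets: y = 2x + 1
gamma = 1e6                            # regularization weight from the LS-SVM objective
N = len(xs)

# Assemble the (N+1) x (N+1) KKT matrix and right-hand side
M = [[0.0] * (N + 1) for _ in range(N + 1)]
rhs = [0.0] * (N + 1)
for i in range(N):
    M[0][i + 1] = M[i + 1][0] = 1.0
    rhs[i + 1] = ys[i]
    for j in range(N):
        M[i + 1][j + 1] = K(xs[i], xs[j]) + (1.0 / gamma if i == j else 0.0)

sol = gauss_solve(M, rhs)
b, alpha = sol[0], sol[1:]

def predict(v):
    return sum(alpha[k] * K(v, xs[k]) for k in range(N)) + b

print(round(predict(3.0), 3))          # close to 7.0
```

This direct solve is exactly the step that becomes impractical at large N, which motivates the paper's iterative QPSO-based solution of the system.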
[Figure: (a) The improved PSO algorithm]

[Figure 2. The improved LS-SVM recognition model for oil layer: sample information selecting & preprocessing → attribute discretization & generalization → attribute reduction → LS-SVM training by QPSO; unknown information → redundant attribute elimination & preprocessing → LS-SVM recognition → output]