Code · August 2016
DOI: 10.13140/RG.2.1.2579.3524
Available at: https://www.researchgate.net/publication/305325563

This article is prepared and made available on ResearchGate by M. N. Alam


https://www.researchgate.net/profile/Mahamad_Alam

Codes in MATLAB for Training Artificial Neural Network using Particle Swarm Optimization
Mahamad Nabab Alam
Research Scholar, Department of Electrical Engineering, Indian Institute of Technology Roorkee, Roorkee-247667, India
E-mail: itsmnalam@gmail.com

Abstract
This paper presents codes in MATLAB for training an artificial neural network (ANN) using particle swarm optimization (PSO). The codes are generalized to train an ANN with any number of input features and a single target feature. The proposed training approach has been tested on the chemical_dataset available in MATLAB.
Keywords: artificial neural network, particle swarm optimization, optimum training.

1. Introduction
An artificial neural network (ANN) provides a model that can relate very complex input and output datasets. An ANN model works extremely well for complex datasets that are normally very difficult to predict using mathematical modelling (explicit equations).
2. Artificial Neural Network (ANN)
An ANN is a network of neurons connected among themselves through weights and biases. A typical ANN model is shown in Figure 1. Once the structure of the ANN is formed, the next task is to train the network.
[Figure: inputs #1-#4 feed a layer of hidden neurons through input weights w_ij and hidden biases b_j; the hidden layer feeds a single output-layer neuron through layer weights w_j1 and output bias b_1.]
Figure 1: A typical artificial neural network (ANN)
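For readers outside MATLAB, the structure in Figure 1 can be sketched as a forward pass in Python/NumPy. This is an illustrative sketch, not the paper's code; it assumes a tansig (i.e. tanh) hidden layer and a linear output neuron, which is the default of MATLAB's feedforwardnet used later in the paper.

```python
import numpy as np

def ann_forward(x, W1, b1, W2, b2):
    """Single-hidden-layer ANN: x (m,), W1 (n, m), b1 (n, 1), W2 (1, n), b2 (1, 1)."""
    h = np.tanh(W1 @ x.reshape(-1, 1) + b1)  # hidden-layer activations
    return (W2 @ h + b2).item()              # linear output neuron

# Tiny example: 4 inputs, 2 hidden neurons, random weights
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((2, 4)), rng.standard_normal((2, 1))
W2, b2 = rng.standard_normal((1, 2)), rng.standard_normal((1, 1))
y = ann_forward(np.array([1.0, 0.5, -0.3, 2.0]), W1, b1, W2, b2)
```

Training then amounts to choosing W1, b1, W2, b2 so that such outputs match the targets.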

15 July, 2016


Training the network means finding the optimum values of the various weights and biases of the network. Normally, various techniques are used to find suitable values of the weights and biases of an ANN. In this work, optimum training of the network has been obtained through particle swarm optimization (PSO). Details about PSO can be found in [1], [2], [3], [4], [5]. The PSO algorithm used in this work is well explained in [6]. Further, codes in the MATLAB environment are available in [7], [8].
3. Proposed Artificial Neural Networks Training approach using Particle Swarm Optimization
The following seven steps have been used to train ANN using PSO.
Step 1) Collect data
Step 2) Create the network
Step 3) Configure the network
Step 4) Initialize the weights and biases
Step 5) Train the network using PSO
Step 6) Validate the network
Step 7) Use the network
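A quick sanity check on the sizes involved in steps 4 and 5: with m input features, n hidden neurons, and o outputs, PSO optimizes one flat vector per particle that holds every weight and bias. Its length matches the kk = m*n + n + n + o used in the main program of Section 5 (the second n term is the layer-weight count, which equals n because this paper assumes a single output).

```python
def num_parameters(m, n, o):
    """Parameter count of an m-input, n-hidden, o-output network, as laid
    out in this paper: input weights (n*m) + layer weights (n, single-output
    case) + hidden biases (n) + output biases (o)."""
    return m * n + n + n + o

# chemical_dataset example: 8 inputs, 5 hidden neurons, 1 output
print(num_parameters(8, 5, 1))  # -> 51
```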
4. Objective Function of Training Artificial Neural Networks
The objective function for the optimum training of the ANN using PSO can be defined as given in the box below. This code must be saved as 'myfunc.m' and placed in the same MATLAB directory as the main PSO program, which is given in the next section.
%These codes are part of research work done by Mahamad Nabab Alam
%Research Scholar, Indian Institute of Technology, Roorkee, India
function [f] = myfunc(x,n,m,o,net,inputs,targets)
% Decode particle x into the network weights/biases and return the MSE.
% Layout of x: input weights (n*m), layer weights (n), hidden biases (n),
% output biases (o).
k=0;
for i=1:n
    for j=1:m
        k=k+1;
        xi(i,j)=x(k);            % input weights (n x m)
    end
end
for i=1:n
    k=k+1;
    xl(i)=x(k);                  % layer weights (1 x n)
    xb1(i,1)=x(k+n);             % hidden-layer biases (n x 1)
end
k=k+n;                           % skip past the hidden biases
for i=1:o
    k=k+1;
    xb2(i,1)=x(k);               % output-layer bias (o x 1)
end
net.iw{1,1}=xi;
net.lw{2,1}=xl;
net.b{1,1}=xb1;
net.b{2,1}=xb2;
f=sum((net(inputs)-targets).^2)/length(inputs);   % mean squared error
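For clarity, the same decode-and-score step can be sketched outside MATLAB. This Python/NumPy version (an illustration, not part of the paper's code) splits the particle vector into input weights, layer weights, hidden biases, and the output bias in that order, runs a tanh-hidden-layer forward pass, and returns the mean squared error over the samples:

```python
import numpy as np

def decode_and_mse(x, n, m, o, inputs, targets):
    """x: flat vector of length m*n + n + n + o (o = 1 assumed, as in the paper).
    inputs: (m, N) samples in columns; targets: (o, N)."""
    x = np.asarray(x, dtype=float)
    k = n * m
    W1 = x[:k].reshape(n, m)                   # input weights
    W2 = x[k:k + n].reshape(o, n)              # layer weights
    b1 = x[k + n:k + 2 * n].reshape(n, 1)      # hidden biases
    b2 = x[k + 2 * n:].reshape(o, 1)           # output bias
    out = W2 @ np.tanh(W1 @ inputs + b1) + b2  # forward pass
    return float(np.mean((out - targets) ** 2))
```

With all-zero parameters the network output is zero, so the returned value equals the mean squared target.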


5. Main Program of Particle Swarm Optimization for Training Artificial Neural Networks
The main program file for training the ANN using PSO is given in the box below. Save this code as 'nn_pso.m' (any suitable file name can be used). Before running this program, the dataset (input data) must be prepared in Microsoft Excel (.xlsx); it will be used to train the network. Details about preparing the dataset are discussed in the next section.
%These codes are part of research work done by Mahamad Nabab Alam
%Research Scholar, Indian Institute of Technology, Roorkee, India
clc
tic
close all
clear all
rng default
filename = 'datafile.xlsx';
sheetname1 = 'Sheet1';
sheetname2 = 'Sheet2';
input = xlsread(filename,sheetname1,'A1:Z10000');
target = xlsread(filename,sheetname2,'A1:Z10000');
inputs=input';
targets=target';
m=length(inputs(:,1));
o=length(targets(:,1));
n=5;
net=feedforwardnet(n);
net=configure(net,inputs,targets);
kk=m*n+n+n+o;
for j=1:kk
LB(1,j)=-1.5;
UB(1,j)=1.5;
end
pop=10;
for i=1:pop
for j=1:kk
xx(i,j)=LB(1,j)+rand*(UB(1,j)-LB(1,j));
end
end
maxrun=1;
for run=1:maxrun
fun=@(x) myfunc(x,n,m,o,net,inputs,targets);
x0=xx;
% pso initialization----------------------------------------------start
x=x0;
% initial population
v=0.1*x0;
% initial velocity
for i=1:pop
f0(i,1)=fun(x0(i,:));
end
[fmin0,index0]=min(f0);


pbest=x0;
% initial pbest
gbest=x0(index0,:);
% initial gbest
% pso initialization------------------------------------------------end
% pso algorithm---------------------------------------------------start
c1=1.5; c2=2.5;
ite=1; maxite=1000; tolerance=1;
while ite<=maxite && tolerance>10^-8
w=0.1+rand*0.4;
% pso velocity updates
for i=1:pop
for j=1:kk
v(i,j)=w*v(i,j)+c1*rand*(pbest(i,j)-x(i,j))...
+c2*rand*(gbest(1,j)-x(i,j));
end
end
% pso position update
for i=1:pop
for j=1:kk
x(i,j)=x(i,j)+v(i,j);
end
end
% handling boundary violations
for i=1:pop
for j=1:kk
if x(i,j)<LB(j)
x(i,j)=LB(j);
elseif x(i,j)>UB(j)
x(i,j)=UB(j);
end
end
end
% evaluating fitness
for i=1:pop
f(i,1)=fun(x(i,:));
end
% updating pbest and fitness
for i=1:pop
if f(i,1)<f0(i,1)
pbest(i,:)=x(i,:);
f0(i,1)=f(i,1);
end
end
[fmin,index]=min(f0);          % finding out the best particle
ffmin(ite,run)=fmin;           % storing best fitness
ffite(run)=ite;                % storing iteration count
% updating gbest and best fitness
if fmin<fmin0
gbest=pbest(index,:);
fmin0=fmin;
end


% calculating tolerance
if ite>100
tolerance=abs(ffmin(ite-100,run)-fmin0);
end
% displaying iterative results
if ite==1
disp(sprintf('Iteration    Best particle    Objective fun'));
end
disp(sprintf('%8g %8g %8.4f',ite,index,fmin0));
ite=ite+1;
end
% pso algorithm-----------------------------------------------------end
xo=gbest;
fval=fun(xo);
xbest(run,:)=xo;
ybest(run,1)=fun(xo);
disp(sprintf('****************************************'));
disp(sprintf('   RUN     fval    ObFuVa'));
disp(sprintf('%6g %8.4f %8.4f',run,fval,ybest(run,1)));
end
toc
% Final neural network model
disp('Final nn model is net_f')
net_f = feedforwardnet(n);
net_f=configure(net_f,inputs,targets);
[a b]=min(ybest);
xo=xbest(b,:);
k=0;
for i=1:n
    for j=1:m
        k=k+1;
        xi(i,j)=xo(k);           % input weights
    end
end
for i=1:n
    k=k+1;
    xl(i)=xo(k);                 % layer weights
    xb1(i,1)=xo(k+n);            % hidden-layer biases
end
k=k+n;                           % skip past the hidden biases
for i=1:o
    k=k+1;
    xb2(i,1)=xo(k);              % output-layer bias
end
net_f.iw{1,1}=xi;
net_f.lw{2,1}=xl;
net_f.b{1,1}=xb1;
net_f.b{2,1}=xb2;
%Calculation of MSE
err=sum((net_f(inputs)-targets).^2)/length(net_f(inputs))
%Regression plot
plotregression(targets,net_f(inputs))
disp('Trained ANN net_f is ready for the use');
%Trained ANN net_f is ready for the use
%Kindly, write your feedback and cite the paper in your work using its DOI
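The while-loop above implements the standard PSO updates: a (here randomized) inertia weight w, a cognitive pull c1 toward each particle's personal best, a social pull c2 toward the global best, and clamping to the box bounds. A minimal self-contained Python sketch of the same update scheme follows, minimizing a simple sphere function rather than the ANN error; the constants mirror the paper, but this is an illustration, not the MATLAB program:

```python
import random

def pso(f, dim, lb, ub, pop=20, iters=200, c1=1.5, c2=2.5, seed=1):
    rnd = random.Random(seed)
    x = [[rnd.uniform(lb, ub) for _ in range(dim)] for _ in range(pop)]
    v = [[0.1 * xij for xij in p] for p in x]            # initial velocity
    pbest = [p[:] for p in x]                            # personal bests
    pf = [f(p) for p in x]                               # personal-best fitness
    g = pbest[min(range(pop), key=pf.__getitem__)][:]    # global best
    for _ in range(iters):
        w = 0.1 + rnd.random() * 0.4                     # randomized inertia
        for i in range(pop):
            for j in range(dim):
                v[i][j] = (w * v[i][j]
                           + c1 * rnd.random() * (pbest[i][j] - x[i][j])
                           + c2 * rnd.random() * (g[j] - x[i][j]))
                x[i][j] = min(max(x[i][j] + v[i][j], lb), ub)  # clamp to bounds
            fi = f(x[i])
            if fi < pf[i]:                               # update personal best
                pbest[i], pf[i] = x[i][:], fi
        g = pbest[min(range(pop), key=pf.__getitem__)][:]
    return g, f(g)

best, val = pso(lambda p: sum(t * t for t in p), dim=3, lb=-1.5, ub=1.5)
```

The same loop, with f replaced by the myfunc objective and dim set to kk, is what 'nn_pso.m' performs.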


6. Making of Datasets for Training ANN using PSO


To run the proposed program, the raw data must be given in a Microsoft Excel file (.xlsx). The first sheet (Sheet1) must hold the input data, with all features of one sample placed in one row, while the corresponding output (target) value of that sample must be placed in the same row of the first column of the second sheet (Sheet2). The complete file must be saved as 'datafile.xlsx' and placed in the same directory. See Figure 2 and Figure 3 for details of the raw data entries. Here, eight input features produce a single output value.

Figure 2: Input data entries

Figure 3: Target (output) data entries


The complete chemical_dataset available in MATLAB is given in Table I. Save this data in the Excel file as mentioned above: copy and paste it into Sheet1 and Sheet2 of the file saved as datafile.xlsx.
TABLE I: Raw Datasets for chemical_dataset available in MATLAB
Input data (Sheet1): eight feature columns. Output data (Sheet2): one target column. In the original document the several hundred samples of chemical_dataset are printed in full, column by column across many pages; only the first rows are reproduced here, since the complete data can be loaded directly in MATLAB (see the note below).

Sample 1:  157, 9596, 4714, 376, 2.58, 407, 564, 510354  ->  514
Sample 2:  155, 9487, 5049, 381, 2.28, 411, 567, 504718  ->  516
Sample 3:  154, 9551, 5070, 374, 2.98, 406, 563, 456972  ->  512
...
Sample 10: 155, 9516, 5205, 388, 2.19, 412, 574, 522021  ->  518
...


Note: These datasets can easily be loaded into the MATLAB workspace by writing the following commands in the command window:
[x,t] = chemical_dataset;
inputs = x';
outputs = t';
7. Training ANN using PSO and Application of the Trained ANN (Result and discussion)
All the three files ('myfunc.m', 'nn_pso.m', and 'datafile.xlsx') must be placed in the same directory of MATLAB
to train the ANN using PSO. Now, run 'nn_pso.m' file to train ANN using PSO. The trained ANN is 'net_f'.
Once the training is completed the regression plot will be displayed. Figure 4 shows the regression plot of the
trained ANN (net_f). From this figure, it is observed that regression coefficient R is 0.96394.
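The regression coefficient R reported by MATLAB's plotregression is the Pearson correlation between the network outputs and the targets. A small self-contained check of that definition (illustrative, not part of the paper's code):

```python
import math

def pearson_r(y, t):
    """Pearson correlation between two equal-length sequences."""
    n = len(y)
    my, mt = sum(y) / n, sum(t) / n
    cov = sum((a - my) * (b - mt) for a, b in zip(y, t))
    sy = math.sqrt(sum((a - my) ** 2 for a in y))
    st = math.sqrt(sum((b - mt) ** 2 for b in t))
    return cov / (sy * st)

# A perfect linear fit gives R close to 1; R = 0.96394 indicates a close fit.
print(pearson_r([1.0, 2.0, 3.0], [2.0, 4.0, 6.0]))
```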

: R=0.96394
525

Data
Fit
Y=T

Output ~= 0.93*Target + 36

520
515
510
505
500
495
490
485
485

490

495

500

505

510

515

520

525

Target
Figure 4: Regression plot of the trained ANN (net_f)

Now, the trained network can be applied to predict the output of unknown input features. Let the input feature vector for the trained ANN of the mentioned dataset be test_input, given below.

test_input = [155, 9516, 5205, 388, 2.19, 412, 574, 522021];

15
15 July, 2016

This article is prepared and made available on ResearchGate by M. N. Alam


https://www.researchgate.net/profile/Mahamad_Alam

Now, to obtain the output corresponding to this input, the following needs to be written in the MATLAB command window:

test_output = net_f(test_input')

This gives the output 515.1436.

Actually, test_input is the 10th sample taken from Table I, whose target is 518. Thus, the trained network has produced the output (test_output) with a 0.55% error.
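The quoted 0.55% is the absolute percentage deviation of the prediction from the known target:

```python
predicted, target = 515.1436, 518.0
error_pct = abs(predicted - target) / target * 100
print(round(error_pct, 2))  # -> 0.55
```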
8. Conclusion
This paper presents codes in MATLAB for training an artificial neural network (ANN) using particle swarm optimization (PSO). The presented codes have been tested by training an ANN using the chemical_dataset available in MATLAB.
References
[1] J. Kennedy and R. Eberhart, "Particle swarm optimization," in IEEE International Conference on Neural
Networks, Vol. 4, 1995, pp. 1942-1948.
[2] R. Eberhart and J. Kennedy, "A new optimizer using particle swarm theory," in IEEE Proceedings of the
Sixth International Symposium on Micro Machine and Human Science, 1995, pp. 39-43.
[3] R. Eberhart and Y. Shi, "Comparing inertia weights and constriction factors in particle swarm optimization," in Proceedings of the 2000 Congress on Evolutionary Computation, Vol. 1, 2000, pp. 84-88.
[4] M. Clerc and J. Kennedy, "The particle swarm - explosion, stability, and convergence in a multidimensional
complex space," IEEE Transactions on Evolutionary Computation, 6 (1) (2002) 58-73.
[5] Y. del Valle, G. Venayagamoorthy, S. Mohagheghi, J.-C. Hernandez, and R. Harley, "Particle swarm
optimization: Basic concepts, variants and applications in power systems," IEEE Transactions on
Evolutionary Computation, 12 (2) (2008) 171-195.
[6] M. N. Alam, B. Das, and V. Pant, "A comparative study of metaheuristic optimization approaches for
directional overcurrent relays coordination," Electric Power Systems Research 128 (2015) 39-52.
doi: http://dx.doi.org/10.1016/j.epsr.2015.06.018
[7] M. N. Alam, "Particle Swarm Optimization: Algorithm and its Codes in MATLAB," ResearchGate (2016)
1-10. doi: http://dx.doi.org/10.13140/RG.2.1.4985.3206
[8] M. N. Alam, "Codes in matlab for particle swarm optimization," ResearchGate (2016) 1-3. doi:
http://dx.doi.org/10.13140/RG.2.1.1078.7608

Enjoy ANN using PSO!

Kindly put your feedback and suggestions about the article. Cite the article using its DOI.
******************************************************************************************
