
Instructions on Using the Tool (Clustering using Self Organizing Maps)

Step 1: Enter Your Data


(A) Enter your data in the Data worksheet, starting from cell C13.
(B) The observations should be in rows and the variables should be in columns.
(C) Above each column, choose the appropriate Type (Use, Omit).
To drop a column from the clustering process, set the Type = Omit.
To use a column for clustering, set the Type = Use.
You can use at most 50 variables for clustering. The application will automatically treat them all as continuous variables.
Make sure that the number of variables entered in the Input sheet exactly matches the total number of columns in your data with Type = Use.
Make sure that the number of observations entered in the Input sheet is <= the number of rows in your Data sheet.
(D) Please make sure that your data does not have blank rows or blank columns.
(E) All the variables to be used in the clustering must be numeric.
Any non-numeric entry in a clustering variable will be treated as a missing value,
and the application will replace it with the respective column mean (see the sketch below).
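
The column-mean replacement described in (E) can be pictured with a short sketch. This is not the workbook's own code; it is a minimal Python illustration, and the function name impute_column_means and the sample input are assumptions made for this example.

```python
import numpy as np

def impute_column_means(raw):
    """Replace non-numeric entries by the mean of the numeric values in the same column."""
    def to_float(value):
        try:
            return float(value)
        except (TypeError, ValueError):
            return np.nan   # anything that is not a number is treated as missing

    data = np.array([[to_float(v) for v in row] for row in raw], dtype=float)
    col_means = np.nanmean(data, axis=0)                       # mean of the valid entries per column
    missing = np.isnan(data)
    data[missing] = np.take(col_means, np.where(missing)[1])   # fill each gap with its column mean
    return data

# Example: the text "n/a" in the first column becomes the mean of 1.0 and 3.0, i.e. 2.0
print(impute_column_means([[1.0, 10.0], ["n/a", 12.0], [3.0, 11.0]]))
```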

Step 2: Fill in the inputs on the Inputs page


(A) Note that the SOM is a square grid of n-square neurons, arranged in n rows and n columns.
You need to specify n; n should be between 2 and 10.
(B) One cycle consists of presenting ALL the observations to the map once.
You need to specify the number of cycles.
(C) In each cycle all the observations are presented to the map once.
The order in which they are presented can be random, and hence vary from cycle to cycle,
or it can be the original order in which you entered the data in the Data sheet. Select the option you want.
(D) Make sure that the end value of the learning parameter is less than the start value and that both values are strictly between 0 and 1.
(E) Make sure that the end value of Sigma is less than the start value and that both values are strictly between 0% and 100%.
(F) As the training of the map progresses, both the Learning Parameter and Sigma are decreased from the start value to the end value.
You may choose the rate of this decay as linear or exponential (see the sketch below).
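
The tool does not spell out its decay formulas, but linear and exponential decay between a start and an end value are standard choices. The sketch below is an illustrative Python version using the values from the Inputs page (learning parameter 0.9 -> 0.1, Sigma 50% -> 1%, 100 cycles); the function name decay_schedule and the exact exponential form are assumptions, not the workbook's actual implementation.

```python
import numpy as np

def decay_schedule(start, end, n_cycles, kind="exponential"):
    """Return one value per training cycle, falling from `start` to `end`."""
    t = np.linspace(0.0, 1.0, n_cycles)
    if kind == "linear":
        return start + (end - start) * t          # straight line from start to end
    return start * (end / start) ** t             # constant multiplicative shrink per cycle

learning_rate = decay_schedule(0.9, 0.1, n_cycles=100, kind="exponential")
sigma = decay_schedule(0.50, 0.01, n_cycles=100, kind="exponential")  # fraction of map width
print(learning_rate[0], learning_rate[-1])   # 0.9 ... 0.1
print(sigma[0], sigma[-1])                   # 0.5 ... 0.01
```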

Step 3: Click on the Build Clusters button


(A) While training the map, the actual variables in the data are scaled so that for each variable the values lie between -1 and 1.
This is called normalization of the data. Normalization takes a long time, especially for large data sets.
If you are training the map on the same data set in two successive runs, you may skip the normalization for the second run.
The application will ask you whether to skip normalization or not. Skipping normalization saves a lot of time.
However, the application only checks the number of rows and columns of your data to determine whether it has changed since the last run;
it does not check the individual values in the data set.
So if you are not sure whether you have changed the data since the last run, it is better to normalize the data again.

(B) If you are training the map on the same variables and the map size is the same as in the immediately previous run,
the application will ask you whether to start with the weights obtained in the previous run.
Starting with the previous weights provides incremental learning:
using this option you can build upon what the map has already learned so far.
However, if you have changed your data set, the variables, or the ordering of the variables, you should start with fresh weights (see the sketch below).
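
A compressed view of what the Build Clusters step conceptually does: normalize each variable to [-1, 1], then repeatedly find the winning (best-matching) neuron for each observation and pull nearby neurons toward it with a Gaussian neighborhood whose learning rate and width decay over the cycles. The sketch below is a generic SOM implementation in Python, not the workbook's actual macro; the weight initialization, the handling of prev_weights for incremental learning, and the exact decay forms are assumptions.

```python
import numpy as np

def normalize(data):
    """Min-max scale every column to the range [-1, 1]."""
    lo, hi = data.min(axis=0), data.max(axis=0)
    return 2.0 * (data - lo) / np.where(hi > lo, hi - lo, 1.0) - 1.0

def train_som(data, n=3, cycles=100, lr=(0.9, 0.1), sigma=(0.5, 0.01),
              randomize=True, prev_weights=None, seed=0):
    """Train an n x n map; returns neuron weights of shape (n, n, n_vars)."""
    rng = np.random.default_rng(seed)
    x = normalize(np.asarray(data, dtype=float))
    # Reusing the previous run's weights gives incremental learning; otherwise start fresh.
    w = prev_weights.copy() if prev_weights is not None else rng.uniform(-1, 1, (n, n, x.shape[1]))
    rows, cols = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    grid = np.stack([rows, cols], axis=-1).astype(float)      # grid position of every neuron
    for c in range(cycles):
        t = c / max(cycles - 1, 1)
        eta = lr[0] * (lr[1] / lr[0]) ** t                    # learning parameter this cycle
        sig = sigma[0] * (sigma[1] / sigma[0]) ** t * n       # Sigma given as fraction of map width
        order = rng.permutation(len(x)) if randomize else np.arange(len(x))
        for i in order:
            d = np.linalg.norm(w - x[i], axis=-1)             # distance of x[i] to every neuron
            bmu = np.unravel_index(np.argmin(d), d.shape)     # winner (best-matching unit)
            g = np.exp(-np.sum((grid - grid[bmu]) ** 2, axis=-1) / (2 * sig ** 2))
            w += eta * g[..., None] * (x[i] - w)              # pull neurons toward the observation
    return w

def winner_neurons(data, w):
    """Index of the winning neuron (0 .. n*n-1) for every observation."""
    x = normalize(np.asarray(data, dtype=float))
    d = np.linalg.norm(w[None, ...] - x[:, None, None, :], axis=-1)
    return np.array([np.argmin(d[i]) for i in range(len(x))])
```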

Step 4: Output of Clustering


(A) Outputs will be generated in the Output sheet. You can view it but cannot modify it, since the sheet is protected.
(B) The application gives you an option of saving the results in a separate workbook where you can edit the output.
(C) In that new workbook, the data along with the cluster labels, cluster means and cluster variances will be saved.
A radar plot will also be created to let you visually compare the means of the variables across the different clusters (see the illustrative sketch below).
(D) In the Weights sheet a plot will be generated to visually depict which portion of the map captured how many data points.
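
The radar plot itself is drawn by the workbook in Excel; the sketch below is only an illustrative matplotlib equivalent, using the cluster means that appear later in this example's Output sheet (variables Y1, Y2, Y6).

```python
import numpy as np
import matplotlib.pyplot as plt

# Cluster means copied from the Output sheet of this example.
labels = ["Y1", "Y2", "Y6"]
cluster_means = {
    "Cluster 1": [30.2, 28.4, 28.8],
    "Cluster 2": [29.8, 30.0, 30.0],
    "Cluster 3": [16.3, 16.1, 16.1],
}

# One spoke per variable; repeat the first point so each polygon closes.
angles = np.linspace(0, 2 * np.pi, len(labels), endpoint=False).tolist()
angles += angles[:1]

ax = plt.subplot(111, polar=True)
for name, means in cluster_means.items():
    values = means + means[:1]
    ax.plot(angles, values, label=name)
    ax.fill(angles, values, alpha=0.1)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(labels)
ax.legend(loc="upper right")
plt.title("Cluster means by variable (radar plot)")
plt.show()
```
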
Neural Network based Clustering
( Using Kohonen's Self Organizing Maps (SOM) )

Number of observations: 80
  ( Needs to be between 5 and 5,000 )

Number of Variables: 3
  ( Needs to be between 3 and 50 )

Enter n, where n-square = # neurons in the map: 3
  ( n needs to be between 2 and 10 )
  [ Note: By entering n you are specifying that the maximum number of clusters will be at most n-square;
    e.g. if you enter n = 4, you will get at most 16 clusters. ]

Number of training cycles: 100
  ( Needs to be between 1 and 1000 )

Randomize the order in which inputs are presented to the map? No

Learning parameter ( should be > 0 and < 1 )
  Start value: 0.9
  End value:   0.1
  Decay:       Exponential

Sigma for the Gaussian neighborhood, as % of map width ( should be > 0% and < 100% )
  Start value: 50.0%
  End value:   1.0%
  Decay:       Exponential
Enter your Data in this sheet
Instructions:
Start entering your data from cell C13. Specify variable names in row 11.
Specify the variable type in row 10: Use - to use the variable for clustering; Omit - not to use the variable for clustering.
Please make sure that there are no blank rows or columns in your data.
The variables that you use for clustering need to be numeric.
Any non-number in your data will be treated as a missing value and will be replaced by the respective column mean.

Type:  Use      Use      Omit     Omit     Omit
Name:  Y1       Y2       Hdr3     Y4       Y5
28.203 28.542 30.452 30.524 29.351
28.084 28.702 27.433 30.149 32.230
29.280 28.234 30.202 30.724 29.730
29.513 27.750 29.677 29.065 31.415
30.615 29.782 29.735 29.191 32.045
30.736 30.584 29.979 29.098 30.409
31.725 31.487 33.844 30.304 28.862
30.104 28.101 30.887 29.655 30.504
30.641 31.762 30.972 31.222 29.539
30.387 29.305 28.570 29.959 27.959
31.616 31.177 29.419 29.044 31.618
30.697 29.285 30.872 28.065 30.406
29.381 30.180 30.070 29.609 30.254
28.688 30.889 30.879 30.195 28.677
27.591 31.072 30.973 29.239 32.211
30.300 28.210 31.349 30.695 31.449
29.933 30.248 29.918 29.843 29.218
29.551 31.646 28.435 30.445 29.057
29.094 30.318 31.378 30.048 28.663
29.097 29.671 29.307 29.934 32.906
27.753 29.177 31.078 29.939 31.963
29.535 30.538 30.234 28.614 32.205
31.407 29.609 30.769 31.025 30.910
31.413 29.771 30.459 31.389 29.404
30.552 29.989 31.379 31.389 29.496
30.143 30.063 30.303 29.572 29.374
8.957 10.766 10.513 10.992 9.695
9.076 10.260 11.067 9.896 11.015
9.105 7.981 10.433 11.825 10.334
11.026 11.482 9.603 9.915 9.326
11.396 9.945 12.167 10.760 8.131
10.595 10.776 9.531 11.029 9.958
10.089 10.718 11.645 9.524 11.452
10.856 10.714 9.003 12.566 10.472
8.350 8.777 9.820 9.286 8.176
9.491 11.114 8.136 9.430 11.035
10.337 9.963 9.847 9.324 9.530
9.828 9.775 10.739 10.704 10.476
10.942 9.814 10.659 11.383 8.325
9.058 9.083 9.558 9.666 10.253
10.461 10.181 9.154 9.397 8.856
7.436 8.188 12.276 8.610 9.707
10.039 11.137 9.017 8.425 8.938
10.238 8.966 10.988 10.774 9.044
11.238 10.417 8.781 9.912 10.411
9.218 9.000 10.552 8.869 10.751
19.721 19.566 19.827 19.981 20.808
19.752 20.123 19.051 19.564 19.280
19.124 19.130 19.684 20.122 20.829
21.334 19.939 19.434 19.867 20.172
21.674 19.406 20.419 20.781 19.331
18.149 18.181 20.690 20.270 20.373
19.422 18.450 19.314 20.414 19.257
19.092 17.854 20.069 21.802 20.474
19.680 19.290 20.208 19.041 20.327
20.757 19.308 19.793 21.498 20.841
18.489 21.147 22.120 18.875 19.366
19.809 21.537 19.692 19.729 20.320
20.248 19.507 21.243 19.277 19.963
19.084 19.200 21.642 20.732 18.351
19.836 20.006 18.721 20.396 21.404
20.379 20.231 19.547 19.738 21.313
21.154 18.340 20.894 18.355 20.080
18.934 19.340 17.930 22.027 21.425
23.257 20.456 20.924 19.092 18.181
20.839 19.732 18.546 19.306 21.121
22.406 21.861 19.082 17.930 18.848
20.989 18.535 19.501 19.387 17.877
19.972 17.487 18.136 19.883 18.757
20.862 20.740 20.088 20.448 19.463
19.695 19.495 18.932 19.779 19.350
19.736 21.057 20.323 20.024 21.848
20.534 21.212 19.348 20.427 21.062
19.984 19.310 19.124 20.021 21.420
19.816 20.943 21.043 20.265 19.707
20.285 20.872 20.679 19.848 19.740
21.398 20.219 19.945 17.803 20.664
19.637 21.532 19.968 22.254 19.865
19.468 18.779 19.900 21.875 21.327
19.497 19.909 21.239 19.974 19.544

(Data sheet, continued: columns 6 to 11. Only Y6-Y9 contain values below; X10 and X11 are empty.)
Type:  Use      Omit     Omit     Omit     Omit     Omit
Name:  Y6       Y7       Y8       Y9       X10      X11
30.426 32.272 31.330 30.139
30.511 30.231 29.138 29.272
29.792 28.848 30.866 29.201
28.676 29.891 28.627 28.505
30.614 29.110 29.434 30.580
30.919 29.224 30.522 30.145
30.142 31.320 31.243 28.730
30.835 28.760 32.061 29.395
31.044 29.496 30.497 29.765
31.012 29.406 29.960 30.528
28.577 28.740 29.409 29.666
28.251 29.263 30.961 29.804
28.933 32.135 29.841 30.253
28.411 28.165 31.197 29.848
29.660 32.894 30.067 29.403
29.328 29.486 31.355 29.538
29.680 30.525 31.185 30.244
30.560 29.103 29.948 30.454
29.521 30.347 29.505 29.866
29.340 28.779 28.560 30.156
30.122 30.298 30.924 30.725
30.056 29.476 30.269 30.518
30.505 28.281 28.098 29.268
29.944 29.154 31.902 30.576
29.919 30.762 30.247 29.807
29.675 29.374 29.457 31.720
9.079 9.043 9.110 10.373
10.188 9.080 10.261 9.005
8.980 10.759 10.310 12.330
10.960 9.422 9.794 9.849
10.892 10.365 9.760 9.309
10.517 10.692 11.422 10.361
9.716 10.898 10.436 8.850
10.135 10.594 12.792 9.982
10.650 10.014 9.004 9.282
10.437 8.336 10.255 9.162
8.772 11.201 9.762 8.838
9.202 11.896 10.925 9.919
8.801 11.093 9.405 10.096
9.606 12.320 9.873 9.984
10.174 11.496 9.984 8.670
8.404 10.124 10.154 9.060
9.960 9.209 9.431 10.564
9.523 9.803 8.161 11.163
9.644 10.347 9.661 8.834
7.949 10.806 9.483 11.200
18.937 19.795 19.745 18.784
18.619 18.578 18.780 19.907
17.683 21.017 21.410 20.462
19.781 21.782 19.046 19.557
21.464 20.100 18.378 19.471
18.991 20.723 21.075 18.551
20.951 19.452 18.707 19.850
18.931 20.218 19.044 19.952
19.242 19.003 19.611 20.624
19.470 18.587 19.506 20.446
20.230 19.691 18.396 21.847
17.959 21.073 19.900 19.733
19.492 19.573 19.648 21.736
18.437 20.737 20.757 20.012
19.326 20.668 20.670 20.681
19.650 19.581 21.029 19.226
19.392 19.999 19.403 18.921
21.228 20.328 21.046 18.809
19.951 21.440 20.535 20.172
21.341 20.044 19.633 22.708
20.445 20.115 19.896 19.479
21.496 19.105 20.702 20.150
19.656 18.584 20.618 19.622
19.113 20.007 20.602 21.797
20.894 19.060 21.567 20.044
20.951 18.898 20.360 21.014
18.600 19.140 20.303 20.903
18.369 21.947 19.156 18.170
20.274 19.739 20.847 20.986
20.882 21.537 19.746 19.171
19.297 20.212 19.852 21.053
21.220 19.957 18.726 19.476
20.066 21.377 21.776 18.078
21.485 18.537 20.043 22.531
(Data sheet, continued: remaining columns, all set to Omit and containing no data.)
Type:  Omit (for every remaining column)
Names: X12 X13 X14 X15 X16 X17 X18 X19 X20 X21 X22 X23
       X24 X25 X26 X27 X28 X29 X30 X31 X32 X33 X34 X35
       X36 X37 X38 X39 X40 X41 X42 X43 X44 X45 X46 X47
       X48 X49 X50
Table of Winner Neuron

Old Settings:
  Grid Size: 3
  # Obs: 80
  # Vars: 3
  Original Col Numbers: 1, 2, 6, 7, 7, 7, 8, 8, 9
Scale for graph - Major Unit: 0.33

Obs X Y Nrn ID
1 0.18179335 0.56495897 2
2 0.26902046 0.43991096 2
3 0.14676457 0.46137833 2
4 0.09743002 0.26001267 1
5 0.1288016 0.41324088 2
6 0.13958814 0.59863434 2
7 0.2426485 0.44089777 2
8 0.17402468 0.43476971 2
9 0.27424073 0.38445929 2
10 0.05332548 0.5585779 2
11 0.20935101 0.4178573 2
12 0.26194492 0.13127919 1
13 0.12056843 0.45457449 2
14 0.1990183 0.57005091 2
15 0.11261103 0.54504013 2
16 0.25284113 0.09125658 1
17 0.09263822 0.39902593 2
18 0.20034821 0.47867509 2
19 0.06556333 0.5997086 2
20 0.23159286 0.41006321 2
21 0.25436611 0.60939891 2
22 0.16589271 0.54205138 2
23 0.11207185 0.5669376 2
24 0.12630885 0.52003958 2
25 0.19206395 0.57675786 2
26 0.10579175 0.61309317 2
27 0.11006302 0.8562812 3
28 0.05832699 0.84050539 3
29 0.06443942 0.74164646 3
30 0.28174783 0.82335975 3
31 0.08152576 0.85177253 3
32 0.22554767 0.92775939 3
33 0.25259604 0.81075407 3
34 0.1570259 0.83305612 3
35 0.09584484 0.79525733 3
36 0.08576709 0.75841295 3
37 0.07669982 0.94024936 3
38 0.14645345 0.78103323 3
39 0.15313421 0.80306047 3
40 0.10667734 0.91272322 3
41 0.245979 0.752447 3
42 0.099754 0.768121 3
43 0.06757 0.849319 3
44 0.2492 0.924593 3
45 0.175392 0.71914 3
46 0.118213 0.840463 3
47 0.050725 0.88265 3
48 0.084151 0.798238 3
49 0.202292 0.878268 3
50 0.204679 0.872846 3
51 0.128234 0.942816 3
52 0.075578 0.89152 3
53 0.069761 0.71725 3
54 0.146895 0.918705 3
55 0.256461 0.810725 3
56 0.064088 0.865946 3
57 0.209945 0.922984 3
58 0.100837 0.86012 3
59 0.205831 0.825097 3
60 0.083521 0.736668 3
61 0.227385 0.931847 3
62 0.240705 0.757818 3
63 0.144168 0.832465 3
64 0.244276 0.755819 3
65 0.182351 0.913027 3
66 0.175598 0.734502 3
67 0.053356 0.77051 3
68 0.149983 0.829176 3
69 0.240842 0.718527 3
70 0.232844 0.858191 3
71 0.079343 0.784722 3
72 0.180834 0.866165 3
73 0.091253 0.933162 3
74 0.174665 0.792215 3
75 0.182137 0.878405 3
76 0.21025 0.89652 3
77 0.227709 0.891225 3
78 0.248182 0.845217 3
79 0.252479 0.796009 3
80 0.188215 0.720292 3
Table of Neuron Weights

Total # Obs: 80
Nrn ID Row Col X Y #Obs % Obs
1 1 1 0.167 0.17 3 0.87 0.72 0.80
2 1 2 0.166667 0.5 23 0.84 0.76 0.86
3 1 3 0.166667 0.83333 54 0.66 0.63 0.68
4 2 1 0.5 0.16667 0 0.87 0.72 0.80
5 2 2 0.5 0.5 0 0.84 0.76 0.86
6 2 3 0.5 0.83333 0 0.66 0.63 0.68
7 3 1 0.833333 0.16667 0 0.87 0.72 0.80
8 3 2 0.833333 0.5 0 0.84 0.76 0.86
9 3 3 0.833333 0.83333 0 0.66 0.63 0.68
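
The #Obs column above and the clusters reported on the Output sheet are linked in a simple way: each observation is counted against its winner neuron, and only neurons that win at least one observation become clusters. Below is a small, hedged Python illustration; the list winner_ids is abbreviated here, and with all 80 observations it yields neurons 1, 2 and 3 with sizes 3, 23 and 54, while the other six neurons of the 3 x 3 map stay empty.

```python
from collections import Counter

# Winner-neuron IDs for the observations (first 13 shown, taken from the table above).
winner_ids = [2, 2, 2, 1, 2, 2, 2, 2, 2, 2, 2, 1, 2]  # ... continues up to observation 80

counts = Counter(winner_ids)                            # observations captured by each neuron
clusters = sorted(counts)                               # only non-empty neurons appear here
print("Non-empty neurons (the clusters):", clusters)
print("Cluster sizes:", {nrn: counts[nrn] for nrn in clusters})
```
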
Clustering using Self Organizing Maps

Number of variables used for clustering 3


Number of observations used for clustering 80

Number of Clusters 3

Cluster Assignment

Observation ID Cluster ID
1 2
2 2
3 2
4 1
5 2
6 2
7 2
8 2
9 2
10 2
11 2
12 1
13 2
14 2
15 2
16 1
17 2
18 2
19 2
20 2
21 2
22 2
23 2
24 2
25 2
26 2
27 3
28 3
29 3
30 3
31 3
32 3
33 3
34 3
35 3
36 3
37 3
38 3
39 3
40 3
41 3
42 3
43 3
44 3
45 3
46 3
47 3
48 3
49 3
50 3
51 3
52 3
53 3
54 3
55 3
56 3
57 3
58 3
59 3
60 3
61 3
62 3
63 3
64 3
65 3
66 3
67 3
68 3
69 3
70 3
71 3
72 3
73 3
74 3
75 3
76 3
77 3
78 3
79 3
80 3
Cluster Sizes
          Cluster 1   Cluster 2   Cluster 3
# Obs     3           23          54

Cluster Position on the grid
          Cluster 1   Cluster 2   Cluster 3
Row       1           1           1
Column    1           2           3

Cluster Means
          Overall   Cluster 1   Cluster 2   Cluster 3
Y1        20.7      30.2        29.8        16.3
Y2        20.6      28.4        30.0        16.1
Y6        20.5      28.8        30.0        16.1

Cluster Variances
          Overall   Cluster 1   Cluster 2   Cluster 3
Y1        58.4      0.4         1.5         26.1
Y2        58.3      0.6         1.1         24.1
Y6        59.6      0.3         0.5         25.4

Note: Cells marked with blue (if any) in the Cluster Means and Cluster Variances tables
denote that there are some missing values for that variable in that cluster.
All missing values have been replaced by the overall means for the computation
of the cluster means and variances.
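
The note above says that missing values are replaced by the overall (whole-data) means before the cluster means and variances are computed. The sketch below shows that computation in Python; it is an illustration only, the function name cluster_summaries is an assumption, and since the tool does not state whether it uses the population or the sample form of the variance, the population form is used here.

```python
import numpy as np

def cluster_summaries(data, cluster_ids):
    """Per-cluster means and variances, with missing values (NaN) first
    replaced by the overall column means, as the note above describes."""
    data = np.asarray(data, dtype=float)
    cluster_ids = np.asarray(cluster_ids)
    overall_means = np.nanmean(data, axis=0)
    filled = np.where(np.isnan(data), overall_means, data)   # replace missing by overall mean

    summaries = {}
    for c in np.unique(cluster_ids):
        rows = filled[cluster_ids == c]
        summaries[c] = {
            "mean": rows.mean(axis=0),
            "variance": rows.var(axis=0),   # population variance; the tool may use the sample form
        }
    return overall_means, summaries
```
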
X1 X2 X3 X4 X5 X6 X7 X8 X9
1 9.909 10.738 12.707 10.757 9.430 9.325 9.136 9.673 9.770
2 9.275 9.578 9.658 11.398 11.933 9.637 9.502 9.789 12.057
3 9.359 11.039 7.752 9.318 8.131 11.522 9.763 10.228 10.519
4 9.298 10.003 9.760 10.162 10.447 11.076 10.151 11.200 9.158
5 10.150 11.010 10.397 8.854 10.048 8.694 9.199 9.845 11.070
6 10.277 9.985 10.432 9.281 10.185 10.603 9.549 9.801 9.575
7 9.428 10.330 10.457 8.840 9.830 8.864 9.637 10.682 10.613
8 9.674 7.890 8.520 11.376 11.669 10.223 10.275 9.948 10.483
9 8.740 12.671 11.389 10.723 10.343 9.401 9.706 10.328 10.064
10 8.892 10.721 9.548 10.065 7.310 8.909 9.131 10.846 10.858
11 9.323 9.183 11.829 10.931 10.065 8.757 10.837 10.861 11.293
12 9.937 9.869 11.142 9.643 11.090 8.986 8.840 9.160 10.580
13 10.077 9.141 10.909 10.006 9.937 10.611 10.929 8.864 8.624
14 9.087 8.824 11.706 9.358 9.756 10.680 9.861 8.882 11.052
15 9.360 10.872 9.438 9.083 7.881 8.669 11.906 8.771 10.320
16 9.992 9.450 11.654 10.801 10.227 8.448 9.418 9.521 10.800
17 8.894 10.574 9.749 11.251 8.964 9.893 9.839 8.764 10.232
18 9.545 8.613 11.205 9.186 9.840 9.519 9.330 10.713 9.921
19 9.509 10.646 9.624 11.577 10.069 9.319 9.362 7.527 11.198
20 10.030 12.308 10.221 10.192 10.180 11.733 11.377 9.354 9.601
21 19.583 17.847 18.535 22.055 18.700 20.774 19.130 21.923 20.100
22 18.940 18.873 18.941 20.154 20.783 19.164 19.566 20.719 19.807
23 21.185 20.225 18.877 19.554 20.110 19.894 20.867 19.973 20.092
24 18.857 19.297 21.113 19.926 20.272 21.387 19.603 21.213 21.068
25 19.608 20.365 18.794 19.586 20.071 19.221 18.063 20.158 19.523
26 19.512 19.726 20.425 20.224 17.834 19.827 19.066 19.311 19.604
27 19.473 20.250 19.903 21.585 18.881 20.059 18.283 19.174 18.889
28 20.834 19.062 19.929 19.561 20.967 20.120 20.395 19.199 17.947
29 20.054 19.903 20.075 19.337 19.703 20.786 20.738 20.183 19.331
30 19.616 20.719 20.437 19.188 20.669 20.885 21.547 20.040 20.212
31 22.271 21.474 19.697 18.613 19.067 20.885 18.472 20.394 19.441
32 20.894 22.558 19.335 20.191 21.231 19.328 18.866 20.675 21.374
33 21.457 21.814 18.718 19.487 18.180 19.375 18.825 20.244 19.865
34 20.369 20.316 22.051 20.094 20.591 20.362 20.057 19.569 18.082
35 19.663 19.414 18.487 19.705 19.672 19.494 21.031 21.246 20.433
36 21.393 19.520 20.498 20.383 18.610 20.420 20.651 19.733 19.999
37 20.186 21.046 21.838 19.609 19.937 20.605 19.285 19.501 18.289
38 19.855 21.530 19.107 21.936 22.720 18.139 21.401 19.308 21.111
39 20.984 20.416 20.063 19.954 22.659 20.190 18.625 18.511 20.518
40 19.892 18.582 22.227 18.845 19.624 20.704 19.942 20.901 18.319
41 22.634 20.158 20.946 19.640 18.732 19.820 20.240 20.287 20.044
42 20.981 21.355 20.636 21.473 19.072 18.183 19.311 20.279 19.626
43 19.899 21.183 20.156 20.469 20.926 20.446 20.018 21.811 20.611
44 20.459 20.214 20.895 18.469 18.969 19.131 20.266 21.106 21.381
45 19.942 19.937 20.465 19.333 20.324 20.455 20.221 19.623 19.788
46 21.803 19.904 20.094 21.739 19.486 20.097 21.567 20.927 20.276
47 19.608 20.209 19.790 19.829 20.592 19.750 19.314 19.093 20.011
48 20.312 21.284 19.865 19.224 20.714 18.475 21.212 21.613 19.795
49 18.224 19.991 19.617 18.932 19.881 21.531 20.624 20.023 21.461
50 19.886 20.830 18.150 19.856 18.333 20.221 20.799 17.678 19.515
51 19.532 19.881 21.863 18.933 20.657 19.805 19.968 20.096 20.574
52 21.908 18.048 18.650 18.102 21.303 18.002 22.478 18.754 19.519
53 19.468 20.048 19.830 18.809 20.613 18.126 20.630 20.769 23.672
54 20.353 18.890 18.800 18.483 20.424 18.368 20.710 19.551 20.239
55 30.841 30.549 29.307 28.104 30.166 30.364 31.103 29.761 28.198
56 29.556 28.835 30.442 29.755 29.687 31.667 30.412 28.738 28.610
57 29.514 31.555 30.639 29.366 30.313 28.674 31.122 28.191 28.854
58 32.037 29.719 28.243 28.356 30.590 29.255 30.013 29.424 30.008
59 28.087 29.481 29.823 30.257 29.201 28.590 28.301 30.441 30.797
60 28.431 29.857 29.283 29.230 28.804 30.001 31.450 29.373 30.769
61 30.788 29.570 27.952 28.592 29.348 29.928 28.783 29.407 29.775
62 30.306 28.732 30.800 30.818 30.215 30.718 29.305 31.357 28.637
63 29.281 28.575 30.090 30.230 30.220 30.892 29.901 29.950 31.478
64 30.961 27.726 28.810 30.039 30.309 30.423 28.791 28.984 30.288
65 30.141 30.824 31.059 28.916 29.824 30.294 29.377 31.800 30.147
66 30.709 30.771 29.390 30.571 29.647 29.635 29.859 30.621 29.697
67 30.888 29.639 28.853 29.469 28.873 30.304 30.514 31.838 29.368
68 29.627 29.603 29.604 29.566 30.999 30.073 28.969 31.382 30.533
69 31.052 28.564 29.602 30.720 32.716 30.065 29.907 31.877 28.445
70 30.142 31.088 31.532 31.118 31.293 30.638 29.706 29.487 30.182
71 30.679 30.333 28.447 29.124 29.284 29.830 30.607 29.794 30.359
72 29.830 30.925 30.503 28.541 28.580 29.063 28.477 30.444 30.660
73 29.827 29.491 31.335 27.047 30.287 30.195 29.517 29.478 30.134
74 29.506 32.226 29.432 28.356 30.164 29.522 31.177 29.628 27.827
75 31.329 30.857 30.183 31.070 30.735 29.628 28.689 27.957 30.245
76 30.346 30.818 29.885 28.077 30.846 30.782 29.495 29.273 32.622
77 29.496 30.083 28.883 28.524 30.570 31.139 30.690 30.694 29.543
78 30.805 30.109 30.222 29.383 30.726 31.460 29.448 29.695 28.525
79 31.537 28.681 31.304 30.831 31.888 32.372 30.107 30.218 31.767
80 30.244 31.474 30.123 30.398 30.784 29.595 31.003 28.381 30.154
81 38.898 40.703 38.055 42.341 37.780 39.339 39.723 40.295 40.521
82 41.317 41.062 41.900 41.661 38.310 40.002 41.138 39.536 40.186
83 40.264 39.990 39.606 39.237 40.524 40.992 42.085 40.407 41.472
84 39.061 39.248 42.177 41.353 40.458 37.353 38.720 41.594 40.191
85 40.239 39.306 40.366 40.133 41.183 41.420 41.150 41.721 40.121
86 38.916 40.678 39.710 40.337 39.325 38.928 41.031 38.985 39.978
87 41.174 40.033 40.562 41.256 38.441 42.331 38.694 40.881 39.475
88 38.486 39.869 40.006 41.680 41.401 39.476 39.676 39.187 40.870
89 38.316 41.104 39.840 40.217 40.441 40.746 39.258 40.492 41.001
90 41.360 39.840 39.393 40.330 40.227 38.536 39.765 41.844 39.562
91 41.362 39.770 39.025 38.085 40.906 40.211 38.372 40.469 38.946
92 39.975 38.792 39.550 41.107 40.633 39.466 39.940 41.112 42.272
93 40.040 40.161 39.939 41.269 39.747 39.861 41.632 39.486 41.186
94 37.332 39.749 42.028 40.505 40.595 39.751 38.352 41.196 40.990
95 38.932 40.201 38.717 41.902 39.480 40.709 40.376 40.362 38.776
96 40.727 38.702 37.623 39.193 39.952 39.379 40.450 39.045 40.985
97 39.621 41.281 40.866 39.924 38.046 38.202 40.574 40.824 38.398
98 40.768 39.545 37.537 38.538 41.078 40.133 40.151 38.764 40.204
99 39.272 39.290 39.047 41.397 40.435 40.460 38.105 39.056 41.044
100 40.588 37.983 39.348 38.613 39.658 40.992 38.160 39.842 38.254
101 40.590 39.637 40.199 41.705 40.918 39.260 40.152 40.864 40.614
102 39.897 40.728 41.049 41.744 39.085 39.325 39.687 39.104 38.324
103 50.880 48.993 51.414 50.890 50.151 51.282 50.120 49.386 51.192
104 48.702 50.267 49.455 49.862 49.419 51.032 50.059 50.053 49.971
105 50.143 49.272 49.619 51.051 50.329 49.455 50.200 49.977 51.342
106 50.920 51.132 51.227 47.913 49.303 50.687 49.833 50.093 50.882
107 49.689 50.305 50.972 47.950 49.740 51.671 50.029 50.048 51.210
108 49.132 50.368 49.383 51.554 51.213 51.224 51.324 51.482 49.525
109 49.830 49.615 48.711 51.556 49.015 50.939 50.869 49.929 50.463
110 50.882 49.658 49.795 49.944 51.093 50.757 50.832 49.467 50.089
111 51.243 49.515 50.105 50.297 50.847 50.570 51.755 50.387 50.177
112 49.660 51.040 50.349 49.899 49.343 50.825 47.969 49.596 50.388
113 49.631 50.436 50.742 50.222 49.470 49.963 49.701 50.989 51.123
114 51.649 50.961 48.496 49.439 48.921 49.521 51.309 48.131 49.472
115 49.941 48.719 47.865 50.521 51.450 50.895 51.267 50.078 49.715
116 49.799 49.937 49.569 51.936 52.286 50.743 51.028 48.548 49.018
117 49.841 50.706 51.286 51.028 50.270 49.256 48.554 49.212 50.742
118 47.604 51.353 49.974 50.265 51.308 47.357 51.358 48.622 51.067
119 49.898 49.852 51.597 50.937 49.569 49.551 50.652 49.656 47.797
120 50.405 50.027 49.750 49.191 50.173 50.416 50.205 49.385 50.203
121 50.519 48.106 49.981 49.540 47.746 51.088 50.539 49.780 51.261
122 47.985 49.287 48.851 50.363 49.123 48.759 50.606 49.871 49.365
123 49.404 49.140 48.405 50.340 50.765 48.303 50.875 49.390 49.703
124 49.958 49.691 51.126 50.866 49.515 50.746 51.497 50.278 49.546
125 50.708 50.299 48.642 50.723 50.798 50.648 50.598 48.561 49.465
126 49.078 48.683 50.814 50.138 49.955 49.983 49.932 50.461 51.059
127 50.043 49.227 50.900 50.193 47.844 49.514 50.723 50.373 50.454
128 51.450 49.334 47.438 49.022 49.861 50.468 50.439 50.781 49.141
129 48.217 50.585 48.989 49.957 51.658 49.897 52.001 50.555 49.552
130 49.731 49.338 50.850 50.372 48.840 50.338 50.398 47.924 50.538
131 50.237 50.468 47.624 50.073 47.901 48.844 48.645 50.649 50.197
132 50.409 50.770 48.030 51.103 49.532 48.856 49.160 50.453 50.401
133 49.078 50.253 48.415 48.530 48.593 51.643 50.203 49.581 50.660
134 50.348 49.006 49.767 48.884 48.950 50.146 49.222 48.626 49.174
135 48.972 50.872 51.242 49.134 49.045 47.762 50.296 50.889 48.901
136 52.781 50.314 50.434 50.633 51.265 50.063 48.973 48.766 50.336
137 49.382 51.229 50.050 50.847 49.999 52.216 48.502 49.320 47.958
138 48.505 48.556 51.520 48.723 50.898 51.969 50.422 49.349 51.457
139 50.689 47.405 49.772 49.906 50.635 49.876 49.569 49.477 50.745
140 48.938 50.948 49.648 50.999 49.731 50.925 50.317 50.649 51.651
141 50.219 50.552 49.840 50.257 48.471 51.692 50.535 50.242 50.787
142 48.197 49.850 47.893 50.099 50.390 50.138 48.963 49.680 49.824
143 50.601 48.512 50.630 50.129 49.190 51.032 48.281 50.168 48.704
144 48.811 49.060 49.002 50.637 49.440 49.358 49.280 50.254 51.583
145 48.634 46.017 47.577 48.613 49.784 49.174 52.063 50.983 51.208
146 48.492 50.695 50.628 49.089 48.845 48.551 50.724 50.928 49.855
147 50.246 50.098 52.779 51.747 50.659 50.316 49.795 49.155 48.797
148 51.854 49.170 50.110 51.761 50.309 50.402 50.926 50.153 48.800
149 50.732 52.334 48.984 48.052 50.513 51.275 50.955 50.030 50.055
150 51.459 49.190 48.163 48.263 50.138 52.579 51.346 50.302 49.604
