
14/10/2017 — How to build a simple neural network in 9 lines of Python code

hongthaiphi
Oct 18, 2016 · 8 min read

How to build a simple neural network in 9 lines of Python code

English original: "How to build a simple neural network in 9 lines of Python code"

As part of my mission to learn about AI, I set myself the goal of building a simple neural network in Python. To make sure I really understood it, I had to build it from scratch, without using a neural network library. Thanks to an excellent blog post by Andrew Trask, I achieved my goal. Here it is, in just 9 lines of code:

```python
from numpy import exp, array, random, dot
training_set_inputs = array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]])
training_set_outputs = array([[0, 1, 1, 0]]).T
random.seed(1)
synaptic_weights = 2 * random.random((3, 1)) - 1
for iteration in range(10000):
    output = 1 / (1 + exp(-(dot(training_set_inputs, synaptic_weights))))
    synaptic_weights += dot(training_set_inputs.T, (training_set_outputs - output) * output * (1 - output))
print(1 / (1 + exp(-(dot(array([1, 0, 0]), synaptic_weights)))))
```

In this article I'll explain how I created that neural network, so you can follow along and build one of your own. I'll also provide a longer, but clearer, version of the code at the end of the article.

But first, what is a neural network? The human brain consists of about 100 billion cells called neurons, connected to one another by synapses. If sufficient synaptic inputs to a neuron fire, that neuron will also fire and pass the signal along. We call this process "thinking".


Figure 1

We can model this process by creating a neural network on a computer. But we don't need to model the biological complexity of the human brain at the molecular level, only its higher-level rules. We'll use a mathematical technique called matrices, which are grids of numbers. To keep things really simple, we'll model just a single neuron, with three inputs and one output.

We're going to train the neuron to solve the problem below. The first four examples are called the training set. Can you work out the pattern? Should the "?" be 0 or 1?


Figure 2 — the training set:

                Input 1   Input 2   Input 3   Output
Example 1          0         0         1         0
Example 2          1         1         1         1
Example 3          1         0         1         1
Example 4          0         1         1         0
New situation      1         0         0         ?

As you can see, the output is always equal to the value of the leftmost input column. So the answer to the "?" should be 1.

The training process

But how do we teach our neuron to answer the question correctly? We'll give each input a weight, which can be a positive or negative number. An input with a large positive weight or a large negative weight will have a strong effect on the neuron's output. Before we begin, we set each weight to a random number. Then we start the training process:


1. Take the inputs from a training set example, adjust them by the weights, and pass them through a special formula to calculate the neuron's output.

2. Calculate the error, which is the difference between the neuron's output and the desired output given in the training set.

3. Depending on the direction of the error, adjust the weights slightly.

4. Repeat this process 10,000 times.
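The four steps above can be sketched in a few lines of NumPy (a minimal sketch that anticipates the article's final code; the variable names here are just placeholders):

```python
import numpy as np

# Training set from the article: the output equals the leftmost input column.
inputs = np.array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]])
outputs = np.array([[0, 1, 1, 0]]).T

np.random.seed(1)
weights = 2 * np.random.random((3, 1)) - 1  # random start, values in [-1, 1)

for _ in range(10000):
    # Step 1: weighted sum of the inputs, squashed through the sigmoid.
    prediction = 1 / (1 + np.exp(-inputs.dot(weights)))
    # Step 2: the error is the desired output minus the prediction.
    error = outputs - prediction
    # Step 3: nudge the weights in proportion to the error.
    weights += inputs.T.dot(error * prediction * (1 - prediction))

print(weights)
```

After 10,000 iterations (step 4), the weights have settled on values that reproduce the training set.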


Figure 3

Eventually the neuron's weights will reach an optimum for the training set. If we then use the neural network on a new situation, it will apply those same weights and make a prediction.

This process is called back propagation.

The formula for calculating the neuron's output

You may be wondering: what on earth is this special formula for calculating the output? First we take the weighted sum of the neuron's inputs:

∑ weightᵢ × inputᵢ

Next we normalise the result, so it lies between 0 and 1. For this we use a mathematical function called the sigmoid function:

1 / (1 + e⁻ˣ)

Plotted on a graph, the sigmoid function is an S-shaped curve.


Figure 4

So by substituting the first equation into the second, the formula for the neuron's output is:

output = 1 / (1 + e^−(∑ weightᵢ × inputᵢ))

You might have noticed that we don't use a minimum firing threshold (an "activation threshold"), to keep things as simple as possible.
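In code, the combined formula is a one-liner (a sketch; `neuron_output` is an illustrative name, not part of the article's final code):

```python
from numpy import array, dot, exp

def neuron_output(inputs, weights):
    # Weighted sum of the inputs, squashed through the sigmoid function.
    return 1 / (1 + exp(-dot(inputs, weights)))

# With weights [1, -1, 0] and input [1, 0, 1], the weighted sum is 1,
# so the output is sigmoid(1) ≈ 0.731.
print(neuron_output(array([1, 0, 1]), array([1.0, -1.0, 0.0])))
```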

The formula for adjusting the weights

During the training cycle (Figure 3), we adjust the weights. But by how much? We can use the "Error Weighted Derivative" formula:

adjustment = error × input × SigmoidCurveGradient(output)

Why this formula? First, we want the adjustment to be proportional to the size of the error (the bigger the mistake, the stronger the correction). Second, we multiply by the input, which is either 0 or 1. If the input is 0, the weight isn't adjusted at all. Finally, we multiply by the gradient (slope) of the sigmoid curve (Figure 4). To understand this last part, which is a little tricky, consider the following:


1. We used the sigmoid curve to calculate the output of the neuron.

2. If the output is a large positive or negative number, it means the neuron was quite confident one way or the other.

3. From Figure 4, we can see that as the x value approaches negative or positive infinity, the sigmoid curve has a shallow gradient (it is almost flat).

4. If the neuron is confident that the existing weights are correct, it doesn't want to adjust them very much. Multiplying by the gradient of the sigmoid curve achieves exactly this.

The gradient of the sigmoid curve can be found by taking the derivative:

SigmoidCurveGradient(output) = output × (1 − output)
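We can sanity-check this derivative numerically (a quick sketch, not part of the article's code): if `s = sigmoid(x)`, the slope at `x` should equal `s * (1 - s)`.

```python
from math import exp

def sigmoid(x):
    return 1 / (1 + exp(-x))

x = 0.5
s = sigmoid(x)
analytic = s * (1 - s)  # the gradient formula above

# Central-difference approximation of the slope at x.
h = 1e-6
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)

print(analytic, numeric)  # the two values agree to about 6 decimal places
```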

So by substituting the second equation into the first, the final formula for adjusting the weights is:

adjustment = error × input × output × (1 − output)

There are alternative formulas that would let the neuron learn faster, but this one has the advantage of being fairly simple.

Constructing the Python code

Although we won't use a neural network library, we will use four methods from Python's numpy library. They are:

exp — the natural exponential

array — creates a matrix

dot — multiplies matrices

random — gives us random numbers

For example, we can use the array() method to represent the training set shown earlier:


```python
training_set_inputs = array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]])
training_set_outputs = array([[0, 1, 1, 0]]).T
```

The .T function transposes the matrix from horizontal to vertical. So the computer stores the numbers like this:
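For example (a quick check, not from the article's code), transposing the output row turns its shape from one row of four values into four rows of one value:

```python
from numpy import array

outputs = array([[0, 1, 1, 0]])
print(outputs.shape)    # (1, 4): one row of four values
print(outputs.T.shape)  # (4, 1): four rows of one value
print(outputs.T)
# [[0]
#  [1]
#  [1]
#  [0]]
```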

OK. I think we're ready for the prettier version of the source code. After showing the code, I'll finish the article with some thoughts of my own.

I've added comments to the source code to explain everything, line by line. Note that in each iteration we process the entire training set simultaneously, which is why our variables are matrices. Below is a complete working example, written in Python:


```python
from numpy import exp, array, random, dot


class NeuralNetwork():
    def __init__(self):
        # Seed the random number generator, so it generates the same numbers
        # every time the program runs.
        random.seed(1)

        # We model a single neuron, with 3 input connections and 1 output connection.
        # We assign random weights to a 3 x 1 matrix, with values in the range -1 to 1
        # and mean 0.
        self.synaptic_weights = 2 * random.random((3, 1)) - 1

    # The Sigmoid function, which describes an S shaped curve.
    # We pass the weighted sum of the inputs through this function to
    # normalise them between 0 and 1.
    def __sigmoid(self, x):
        return 1 / (1 + exp(-x))

    # The derivative of the Sigmoid function.
    # This is the gradient of the Sigmoid curve.
    # It indicates how confident we are about the existing weight.
    def __sigmoid_derivative(self, x):
        return x * (1 - x)

    # We train the neural network through a process of trial and error,
    # adjusting the synaptic weights each time.
    def train(self, training_set_inputs, training_set_outputs, number_of_training_iterations):
        for iteration in range(number_of_training_iterations):
            # Pass the training set through our neural network (a single neuron).
            output = self.think(training_set_inputs)

            # Calculate the error (the difference between the desired output
            # and the predicted output).
            error = training_set_outputs - output

            # Multiply the error by the input and again by the gradient of the Sigmoid curve.
            # This means less confident weights are adjusted more.
            # This means inputs, which are zero, do not cause changes to the weights.
            adjustment = dot(training_set_inputs.T, error * self.__sigmoid_derivative(output))

            # Adjust the weights.
            self.synaptic_weights += adjustment

    # The neural network thinks.
    def think(self, inputs):
        # Pass inputs through our neural network (our single neuron).
        return self.__sigmoid(dot(inputs, self.synaptic_weights))
```

You can download the full code from: https://github.com/miloharper/simple-neural-network

Final thoughts

To try out the neural network, run it from the command line:

python main.py

You should see a result that looks like this:

```
Random starting synaptic weights:
[[-0.16595599]
 [ 0.44064899]
 [-0.99977125]]

New synaptic weights after training:
[[ 9.67299303]
 [-0.2078435 ]
 [ 4.62963669]]
```

We did it! We've built a simple neural network using Python!

First the neural network assigned itself random weights, then it trained itself on the training set to find the best ones. When we then gave it a new situation to consider, [1, 0, 0], it predicted 0.99993704. The correct answer is 1. Very close!
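We can reproduce that prediction directly from the trained weights printed above (a sketch using those printed values, outside the article's code):

```python
from numpy import array, dot, exp

# The "New synaptic weights after training" from the output above.
trained_weights = array([[9.67299303], [-0.2078435], [4.62963669]])

new_situation = array([1, 0, 0])
prediction = 1 / (1 + exp(-dot(new_situation, trained_weights)))
print(prediction)  # ≈ [0.99993704]
```

Only the first weight matters here, because the second and third inputs are 0, and sigmoid(9.67299303) is very close to 1.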

Traditional computer programs can't learn like this. It's impressive what neural networks can do: learn, adapt, and respond to new situations.

Of course, this was just a single neuron performing a very simple task. But what if we connected millions of these neurons together? Could we one day create something conscious?
