
Neural Networks and Fuzzy Logic

Neural networks and fuzzy logic are two complementary technologies.
Neural networks can learn from data and feedback.
However, it is difficult to develop insight into the meaning associated with each neuron and each weight.
Neural networks are therefore viewed as a black-box approach (we know what the box does, but not how it does it, conceptually).
*Fuzzy Logic: Intelligence, Control, and Information - J. Yen and R. Langari, Prentice Hall, 1999

Online (Pattern) Mode vs. Batch Mode of BP Learning

There are two ways to adjust the weights using backpropagation:
Online/pattern mode: adjusts the weights based on the error signal of one input-output pair in the training data.
Example: for a training set containing 500 input-output pairs, pattern-mode BP adjusts the weights 500 times each time the algorithm sweeps through the training set. If the algorithm converges after 1000 sweeps, each weight is adjusted a total of 500,000 times.


Online (Pattern) Mode vs. Batch Mode of BP Learning (cont.)

Batch mode (off-line): adjusts the weights based on the error signal of the entire training set.
Weights are adjusted only once after all the training data have been processed by the neural network.
In the previous example, each weight in the neural network is adjusted 1000 times.

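To make the update counting concrete, here is a minimal Python sketch of the two schedules. The linear model and its grad helper are hypothetical stand-ins for a real backpropagation network; only the update-scheduling structure is the point.

```python
import numpy as np

def grad(w, pairs):
    """Squared-error gradient of a toy linear model y = w @ x,
    averaged over the given (x, y) pairs (a stand-in for full BP)."""
    g = np.zeros_like(w)
    for x, y in pairs:
        g += (w @ x - y) * x
    return g / len(pairs)

def train_online(w, data, lr=0.01, sweeps=1000):
    """Pattern mode: one update per pair -> 500 * 1000 updates for 500 pairs."""
    for _ in range(sweeps):
        for pair in data:
            w = w - lr * grad(w, [pair])
    return w

def train_batch(w, data, lr=0.01, sweeps=1000):
    """Batch mode: one update per full sweep -> 1000 updates in total."""
    for _ in range(sweeps):
        w = w - lr * grad(w, data)
    return w
```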

Neural Networks and Fuzzy Logic (cont.)

Fuzzy rule-based models are easy to comprehend (they use linguistic terms and the structure of if-then rules).
Unlike neural networks, fuzzy logic does not come with a learning algorithm.
Learning and identification of fuzzy models therefore need to adopt techniques from other areas.
Since neural networks can learn, it is natural to marry the two technologies.

Neuro-Fuzzy Systems

Neuro-fuzzy systems can be classified into three categories:
1. A fuzzy rule-based model constructed using a supervised NN learning technique
2. A fuzzy rule-based model constructed using reinforcement-based learning
3. A fuzzy rule-based model that uses an NN to construct its fuzzy partition of the input space

ANFIS: Adaptive Neuro-Fuzzy Inference Systems

ANFIS is a class of adaptive networks that are functionally equivalent to fuzzy inference systems.
ANFIS architectures can represent both the Sugeno and Tsukamoto fuzzy models.

*Neuro-Fuzzy and Soft Computing - J.-S. R. Jang, C.-T. Sun, and E. Mizutani, Prentice Hall, 1997

[Figure: A two-input first-order Sugeno fuzzy model with two rules]

[Figure: Equivalent ANFIS architecture for the two-rule Sugeno model]

ANFIS Architecture

Assume two inputs $x$ and $y$ and one output $z$.
Rule 1: If $x$ is $A_1$ and $y$ is $B_1$, then $f_1 = p_1 x + q_1 y + r_1$
Rule 2: If $x$ is $A_2$ and $y$ is $B_2$, then $f_2 = p_2 x + q_2 y + r_2$


ANFIS Architecture: Layer 1

Every node $i$ in this layer is an adaptive node with a node function

$O_{1,i} = \mu_{A_i}(x)$, for $i = 1, 2$, or
$O_{1,i} = \mu_{B_{i-2}}(y)$, for $i = 3, 4$

where $x$ (or $y$) is the input to node $i$ and $A_i$ (or $B_{i-2}$) is a linguistic label. $O_{1,i}$ is the membership grade of a fuzzy set; it specifies the degree to which the given input $x$ (or $y$) satisfies the quantifier $A_i$ (or $B_{i-2}$).

ANFIS Architecture: Layer 1 (cont.)

Typically, the membership function for a fuzzy set can be any parameterized membership function, such as the triangular, trapezoidal, Gaussian, or generalized bell function.
Parameters in this layer are referred to as antecedent (premise) parameters.

[Figure: Triangular MF]
[Figure: Trapezoidal MF]
[Figure: Gaussian MF]
[Figure: Generalized bell MF]
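For reference, a minimal NumPy sketch of the four parameterized MFs shown above. Parameter conventions vary between texts; these follow the common forms, with the Gaussian written as exp(-((x - m)/sigma)^2) to match the notation used in the slides that follow.

```python
import numpy as np

def tri_mf(x, a, b, c):
    """Triangular MF with feet a, c and peak b (assumes a < b < c)."""
    x = np.asarray(x, dtype=float)
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def trap_mf(x, a, b, c, d):
    """Trapezoidal MF with feet a, d and shoulders b, c (a < b <= c < d)."""
    x = np.asarray(x, dtype=float)
    core = np.minimum((x - a) / (b - a), (d - x) / (d - c))
    return np.clip(core, 0.0, 1.0)

def gauss_mf(x, m, sigma):
    """Gaussian MF with center m and width sigma."""
    x = np.asarray(x, dtype=float)
    return np.exp(-((x - m) / sigma) ** 2)

def gbell_mf(x, a, b, c):
    """Generalized bell MF: 1 / (1 + |(x - c) / a|^(2b))."""
    x = np.asarray(x, dtype=float)
    return 1.0 / (1.0 + np.abs((x - c) / a) ** (2 * b))
```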

ANFIS Architecture: Layer 2

Every node $i$ in this layer is a fixed node labeled $\Pi$, whose output is the product of all the incoming signals:

$O_{2,i} = w_i = \mu_{A_i}(x) \cdot \mu_{B_i}(y)$, for $i = 1, 2$

Each node output represents the firing strength of a rule. (More generally, any T-norm operator that performs fuzzy AND, such as min, can be used as the node function here.)

ANFIS Architecture: Layer 3

Every node in this layer is a fixed node labeled $N$. The $i$th node calculates the ratio of the $i$th rule's firing strength to the sum of all rules' firing strengths:

$O_{3,i} = \bar{w}_i = \dfrac{w_i}{w_1 + w_2}$, for $i = 1, 2$

Outputs of this layer are called normalized firing strengths.

ANFIS Architecture: Layer 4

Every node $i$ in this layer is an adaptive node with a node function

$O_{4,i} = \bar{w}_i f_i = \bar{w}_i (p_i x + q_i y + r_i)$

where $\bar{w}_i$ is the normalized firing strength from layer 3 and $\{p_i, q_i, r_i\}$ is this node's parameter set. Parameters in this layer are referred to as consequent parameters.


ANFIS Architecture: Layer 5

The single node in this layer is a fixed node labeled $\Sigma$, which computes the overall output as the summation of all incoming signals:

$O_{5,1} = \sum_i \bar{w}_i f_i = \dfrac{\sum_i w_i f_i}{\sum_i w_i}$
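Putting layers 1-5 together, a minimal sketch of one forward pass through the two-input, two-rule Sugeno ANFIS described above. Gaussian premise MFs are assumed here for concreteness; any of the parameterized MFs from earlier would do, and all parameter values in the example call are hypothetical.

```python
import numpy as np

def anfis_forward(x, y, prem_A, prem_B, cons):
    """prem_A, prem_B: [(m, sigma), (m, sigma)] Gaussian premise parameters
    for A1, A2 and B1, B2; cons: [(p, q, r), (p, q, r)] consequent parameters."""
    gauss = lambda v, m, s: np.exp(-((v - m) / s) ** 2)

    # Layer 1: membership grades mu_Ai(x) and mu_Bi(y)
    muA = [gauss(x, m, s) for m, s in prem_A]
    muB = [gauss(y, m, s) for m, s in prem_B]
    # Layer 2: firing strengths w_i (product T-norm)
    w = [muA[i] * muB[i] for i in range(2)]
    # Layer 3: normalized firing strengths wbar_i
    wbar = [wi / sum(w) for wi in w]
    # Layer 4: rule outputs f_i = p_i x + q_i y + r_i
    f = [p * x + q * y + r for p, q, r in cons]
    # Layer 5: overall output sum_i wbar_i * f_i
    return sum(wb * fi for wb, fi in zip(wbar, f))

# Example call with hypothetical parameter values
z = anfis_forward(3.0, -1.0,
                  prem_A=[(0.0, 2.0), (5.0, 2.0)],
                  prem_B=[(-2.0, 1.5), (2.0, 1.5)],
                  cons=[(1.0, 1.0, 0.0), (-1.0, 2.0, 1.0)])
```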

ANFIS Architecture: Alternative

[Figure: An ANFIS architecture for the Sugeno fuzzy model in which weight normalization is performed at the very last layer]

ANFIS Architecture: Tsukamoto Model

[Figure: Equivalent ANFIS architecture using the Tsukamoto fuzzy model]

ANFIS Architecture: Two-Input Sugeno Model with Nine Rules

[Figure: ANFIS architecture for a two-input Sugeno model with nine rules]

Backpropagation Learning Applied to Zero-Order TSK Fuzzy Modeling

The algorithm becomes somewhat simpler when it is applied to this fuzzy model.
Assume that we apply BP learning to the parameter identification of a zero-order TSK fuzzy model in which:
The antecedent MFs are of Gaussian type
The antecedent parameters are the centers $m_{ij}$ and widths $\sigma_{ij}$
The consequents are the constants $C_i$


Backpropagation Learning Applied to Zero-Order TSK Fuzzy Modeling (cont.)

In the forward pass, for a given input pattern $x(k)$, the actual response of the model is computed directly from

$\hat{y}(k) = \dfrac{\sum_i v_i(k)\, C_i}{\sum_i v_i(k)}, \qquad v_i(k) = \exp\!\left(-\sum_j \frac{\big(x_j(k) - m_{ij}\big)^2}{\sigma_{ij}^2}\right)$

where $v_i(k)$ is the firing strength of rule $i$.

Backpropagation Learning Applied to Zero-Order TSK Fuzzy Modeling (cont.)

In the backward pass, the error signal $e(k)$, resulting from the difference between the actual output and the desired output of the model, is propagated backward, and the parameters $m_{ij}$, $\sigma_{ij}$, and $C_i$ are adjusted using the error-correction rule.


On-line/Pattern Mode Updating

The gradient-descent updates for the model above are

$C_i(k) = C_i(k-1) + \eta_1\, e(k)\, \bar{v}_i(k)$

$m_{ij}(k) = m_{ij}(k-1) + 2\eta_2\, e(k)\, \big(C_i(k-1) - \hat{y}(k)\big)\, \bar{v}_i(k)\, \dfrac{x_j(k) - m_{ij}(k-1)}{\sigma_{ij}^2(k-1)}$

$\sigma_{ij}(k) = \sigma_{ij}(k-1) + 2\eta_3\, e(k)\, \big(C_i(k-1) - \hat{y}(k)\big)\, \bar{v}_i(k)\, \dfrac{\big(x_j(k) - m_{ij}(k-1)\big)^2}{\sigma_{ij}^3(k-1)}$

where $\bar{v}_i = v_i / \sum_l v_l$ is the normalized firing strength and $\eta_1$, $\eta_2$, $\eta_3$ are the learning-rate parameters.

Summary - Pattern Mode BP Applied to Zero-Order TSK Models with Gaussian MFs

1. Set $m_{ij}(0)$, $\sigma_{ij}(0)$, and $C_i(0)$ to small random numbers.
2. For $k = 1, 2, \ldots, N$, do the following steps:
3. (Forward Pass) Compute $v_i(k)$, where $m_{ij}$ and $\sigma_{ij}$ are replaced by $m_{ij}(k-1)$ and $\sigma_{ij}(k-1)$ respectively; compute $\hat{y}(k)$, where $C_i$ is replaced by $C_i(k-1)$.
4. (Backward Pass) From the error signal $e(k)$, update $C_i$, $m_{ij}$, and $\sigma_{ij}$.
5. If some stopping criterion is satisfied, stop; otherwise set $k = k + 1$ and go to step 2. (A runnable sketch of this procedure appears below.)

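A minimal NumPy sketch of this procedure, using the update equations given above. The stopping criterion is simplified to a fixed number of sweeps, and the learning rates are illustrative placeholders.

```python
import numpy as np

def tsk_forward(x, m, s, C):
    """Firing strengths v_i, normalized strengths, and output y_hat for a
    zero-order TSK model. x: (d,); m, s: (R, d); C: (R,)."""
    v = np.exp(-np.sum(((x - m) / s) ** 2, axis=1))  # rule firing strengths
    vbar = v / v.sum()                               # normalized strengths
    return v, vbar, vbar @ C

def pattern_mode_bp(X, Y, m, s, C, etas=(0.1, 0.05, 0.05), sweeps=100):
    """Pattern-mode BP: one parameter update per training pair (x(k), y(k))."""
    eta1, eta2, eta3 = etas
    for _ in range(sweeps):                          # stand-in stopping criterion
        for x, y in zip(X, Y):
            _, vbar, y_hat = tsk_forward(x, m, s, C)     # forward pass
            e = y - y_hat                                # error signal e(k)
            common = e * (C - y_hat) * vbar              # shared factor, shape (R,)
            # backward pass: error-correction (gradient-descent) updates
            C = C + eta1 * e * vbar
            m = m + 2 * eta2 * common[:, None] * (x - m) / s ** 2
            s = s + 2 * eta3 * common[:, None] * (x - m) ** 2 / s ** 3
    return m, s, C
```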

Off-line/Batch Mode Updating

$C_i(n) = C_i(n-1) + \eta_1 \sum_{k=1}^{N} e(k)\, \bar{v}_i(k)$

$m_{ij}(n) = m_{ij}(n-1) + 2\eta_2 \sum_{k=1}^{N} e(k)\, \big(C_i(n-1) - \hat{y}(k)\big)\, \bar{v}_i(k)\, \dfrac{x_j(k) - m_{ij}(n-1)}{\sigma_{ij}^2(n-1)}$

$\sigma_{ij}(n) = \sigma_{ij}(n-1) + 2\eta_3 \sum_{k=1}^{N} e(k)\, \big(C_i(n-1) - \hat{y}(k)\big)\, \bar{v}_i(k)\, \dfrac{\big(x_j(k) - m_{ij}(n-1)\big)^2}{\sigma_{ij}^3(n-1)}$

where $\eta_1$, $\eta_2$, $\eta_3$ are the learning-rate parameters and $n$ indexes sweeps through the training set.

Summary - Batch Mode BP Applied to Zero-Order TSK Models with Gaussian MFs

1. Set $m_{ij}(0)$, $\sigma_{ij}(0)$, and $C_i(0)$ to small random numbers, and set $n = 1$.
2. (Forward Pass) Compute $v_i(k)$ for $k = 1, 2, \ldots, N$, where $m_{ij}$ and $\sigma_{ij}$ are replaced by $m_{ij}(n-1)$ and $\sigma_{ij}(n-1)$ respectively; compute $\hat{y}(k)$ for $k = 1, 2, \ldots, N$, where $C_i$ is replaced by $C_i(n-1)$.
3. (Backward Pass) From the error signals $e(k)$ for $k = 1, 2, \ldots, N$, update $C_i$, $m_{ij}$, and $\sigma_{ij}$.
4. If some stopping criterion is satisfied, stop; otherwise set $n = n + 1$ and go to step 2. (A batch-mode variant of the earlier sketch appears below.)
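For comparison, a batch-mode variant of the same trainer, reusing tsk_forward from the previous sketch: the updates are accumulated over all N pairs and applied once per sweep, so each parameter changes once per sweep rather than once per pair.

```python
import numpy as np

def batch_mode_bp(X, Y, m, s, C, etas=(0.1, 0.05, 0.05), sweeps=100):
    """Batch-mode BP: parameters change once per sweep n, not once per pair."""
    eta1, eta2, eta3 = etas
    for _ in range(sweeps):
        dC, dm, ds = np.zeros_like(C), np.zeros_like(m), np.zeros_like(s)
        for x, y in zip(X, Y):
            _, vbar, y_hat = tsk_forward(x, m, s, C)  # uses sweep-start params
            e = y - y_hat
            common = e * (C - y_hat) * vbar
            dC += eta1 * e * vbar
            dm += 2 * eta2 * common[:, None] * (x - m) / s ** 2
            ds += 2 * eta3 * common[:, None] * (x - m) ** 2 / s ** 3
        C, m, s = C + dC, m + dm, s + ds              # one update per sweep
    return m, s, C
```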

Summary - Batch Mode vs. Pattern Mode

Pattern-mode learning is preferred over batch mode when it is important to minimize the local storage required for updating each parameter.
Batch-mode learning, however, provides a more accurate estimate of the gradient.
Pattern-mode learning approaches batch-mode learning provided that the learning-rate parameters are small.

Example 1: Modeling the Nonlinear Function y = sin(x)/x

100 training data pairs (evenly distributed grid points over the input range [-10, 10]).
A zero-order TSK ANFIS containing 30 rules, whose antecedent and consequent parameters were trained using the BP algorithm with initial parameter values $C_i(0) \in [-1, 1]$, $m_{ij}(0) \in [-10, 10]$, and $\sigma_{ij}(0) \in [1, 2]$.

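A sketch of this experiment's setup, reusing pattern_mode_bp from the earlier sketch. The random seed and the use of np.sinc (NumPy's normalized sinc, hence the division by pi) are implementation choices, not part of the original experiment.

```python
import numpy as np

rng = np.random.default_rng(0)

# 100 evenly spaced training pairs for y = sin(x)/x on [-10, 10]
X = np.linspace(-10.0, 10.0, 100).reshape(-1, 1)
Y = np.sinc(X[:, 0] / np.pi)          # np.sinc(t) = sin(pi t)/(pi t) = sin(x)/x

# 30 rules with the initial ranges quoted above
R, d = 30, 1
C = rng.uniform(-1.0, 1.0, R)         # C_i(0)     in [-1, 1]
m = rng.uniform(-10.0, 10.0, (R, d))  # m_ij(0)    in [-10, 10]
s = rng.uniform(1.0, 2.0, (R, d))     # sigma_ij(0) in [1, 2]

m, s, C = pattern_mode_bp(X, Y, m, s, C)
```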

[Figures: Modeling the sin(x)/x function]

Example 2: Modeling a Two-Input Sinc Function

$z = \operatorname{sinc}(x, y) = \dfrac{\sin(x)}{x} \cdot \dfrac{\sin(y)}{y}$

121 training data pairs (evenly distributed grid points over the input range [-10, 10] × [-10, 10]).
A first-order TSK ANFIS containing 16 rules, with 24 antecedent parameters (using generalized bell MFs) and 48 consequent parameters, trained using the hybrid learning algorithm.


Example 2: Modeling a Two-Input Sinc Function (cont.)

Two passes in the hybrid learning procedure for ANFIS:

                           Forward Pass               Backward Pass
Antecedent parameters      Fixed                      Gradient descent
Consequent parameters      Least-squares estimator    Fixed
Signals                    Node outputs               Error signals
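The key observation behind the forward-pass step is that, with the antecedent parameters (and hence the normalized firing strengths) held fixed, the ANFIS output is linear in the consequent parameters (p_i, q_i, r_i), so they can be identified in one shot by least squares. A minimal sketch, with assumed array shapes:

```python
import numpy as np

def lse_consequents(XY, targets, wbar):
    """XY: (N, 2) input pairs; targets: (N,); wbar: (N, R) normalized
    firing strengths from layer 3. Returns (R, 3) rows (p_i, q_i, r_i)."""
    x = XY[:, 0:1]                     # (N, 1)
    y = XY[:, 1:2]                     # (N, 1)
    # y_hat = sum_i wbar_i * (p_i x + q_i y + r_i) is linear in the consequents,
    # so stack the regressors [wbar*x | wbar*y | wbar] and solve by least squares.
    A = np.hstack([wbar * x, wbar * y, wbar])           # (N, 3R)
    theta, *_ = np.linalg.lstsq(A, targets, rcond=None)
    R = wbar.shape[1]
    return theta.reshape(3, R).T
```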

[Figures: Modeling the two-input sinc function]

Neuro-Fuzzy Versus Neural Network

A neuro-fuzzy system differs from an NN in four major ways:
1. Nodes and links in a neuro-fuzzy system correspond to specific components of a fuzzy system.
Example: the first layer implements the antecedent MFs.
2. A node is usually not fully connected to the nodes in an adjacent layer.

Neuro-Fuzzy Versus NN (cont.)

3. Nodes in different layers of a neuro-fuzzy system typically perform different operations.
4. A neuro-fuzzy system typically has more layers than a neural network.

