October, 2007
Outline
Protocol testing
Concepts, fault models, related definitions, general approach
Methodologies based on FSM:
T-Method (Transition Tour method)
D-Method (Distinguishing sequences)
W-Method (Characterizing sequences)
U-Method (Unique input/output sequences)
Why do we test?
for detecting errors in the implementation / debugging
for demonstrating conformance to a specification or users' needs, e.g. protocol conformance testing
The answer will help the test team establish a clear relationship between the system under test, the specification, and the objective to satisfy.
program proving (using a theorem prover)
exhaustive testing
The choice among these alternatives is based on:
cost (a function of parameters such as time, resources, human expertise, ...)
feasibility of proof or exhaustive testing
the target quality
(Figure: the conformance-testing setting, relating an abstract model of the implementation I, the test hypotheses/assumptions, a precise specification S, and a conformance relation that divides all possible implementations into conforming and non-conforming ones.)
Question: How to choose a small (finite) test suite TS and obtain the maximum power of error detection?
Fault Models
A fault model is a hypothetical model of what types of faults may occur in an implementation
Most fault models are "structural", i.e. the model is a refinement of the specification formalism (or of an implementation model)
E.g. mutations of the specification or of a correct implementation
It may be used to construct the fault domain used for defining what "complete test coverage" means
E.g. single fault hypothesis (or multiple faults)
Petri Nets
Input or output arc fault: one of the input or output arcs is connected to the wrong place, is missing, or exists in addition to those specified
Missing or additional transition: the number of transitions is not the same as in the specification
Three basic steps for checking a transition (si, sj; L), L = ai/oi:
Step 1: The FSM implementation is put into state si (e.g. reset + transfer sequence)
The difficulty in realizing this is due to the limited controllability of the implementation
Step 2: Input ai is applied and the output is checked to verify that it is oi, as expected
Step 3: The new state of the FSM implementation is checked to verify that it is sj, as expected
The difficulty in verifying this is due to the limited observability of the implementation
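The three steps can be sketched in code. This is a minimal illustration on a hypothetical Mealy machine encoded as a dict mapping (state, input) to (next_state, output); the states, inputs and sequences below are made up, not taken from the slides.

```python
# Hypothetical specification: (state, input) -> (next_state, output).
SPEC = {
    ("s1", "a"): ("s2", "o1"),
    ("s2", "a"): ("s1", "o2"),
    ("s1", "b"): ("s1", "o2"),
    ("s2", "b"): ("s2", "o1"),
}

def run(fsm, state, inputs):
    """Apply an input sequence; return the output sequence and the final state."""
    outputs = []
    for x in inputs:
        state, out = fsm[(state, x)]
        outputs.append(out)
    return outputs, state

def check_transition(impl, initial, transfer, inp, exp_out, check_seq, exp_check_out):
    """Check one transition (si, sj; inp/exp_out) of an implementation.
    Step 1: start from the initial state (after reset) and apply a transfer
            sequence to reach si (limited controllability).
    Step 2: apply inp and compare the produced output with exp_out.
    Step 3: apply a state-checking sequence and compare its outputs to
            verify the reached state sj (limited observability)."""
    _, state = run(impl, initial, transfer)       # step 1
    out, state = run(impl, state, [inp])          # step 2
    if out != [exp_out]:
        return "fail: output fault"
    out, _ = run(impl, state, check_seq)          # step 3
    return "pass" if out == exp_check_out else "fail: transfer fault"
```

For the transition s1 -a/o1-> s2 of SPEC, `check_transition(SPEC, "s1", [], "a", "o1", ["a"], ["o2"])` returns "pass", while an implementation with a wrong output on that transition yields a failure verdict.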
FSM
(Figure: a four-state FSM with states S1-S4 and transitions t1: 1/1, t2: 2/2, t3: 1/1, t4: 2/2, t5: 1/2, t6: 2/2, t7: 1/2, t8: 2/2.)
S1 is the initial state
t1: 1/1 is a transition: it has a starting state S1 and an ending state S2
Its label is t1; the input is 1 and the output is 1; "/" separates the input from the output
An FSM Example
(Figure: the same machine, a Mealy machine.)
A Mealy machine is defined by a state set S, an input set X, an output set Y, a specification domain Ds ⊆ S x X, a transfer function δ: Ds --> S, and an output function λ: Ds --> Y
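The definition can be encoded directly. The toy machine below is illustrative (not the machine in the figure); `delta` and `lam` stand for the transfer function δ and output function λ, both defined only on the specification domain Ds.

```python
# States S, inputs X, outputs Y of a small, made-up partial Mealy machine.
S = {"S1", "S2"}
X = {1, 2}
Y = {1, 2}

# Transfer function delta: Ds -> S and output function lam: Ds -> Y,
# both defined only on the specification domain Ds, a subset of S x X.
delta = {("S1", 1): "S2", ("S1", 2): "S1", ("S2", 2): "S1"}
lam   = {("S1", 1): 1,    ("S1", 2): 2,    ("S2", 2): 2}
Ds = set(delta)   # the machine is partial: ("S2", 1) is unspecified

def step(state, x):
    """One transition of the Mealy machine; raises KeyError outside Ds."""
    return delta[(state, x)], lam[(state, x)]
```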
1) Output fault: point a in the FSM fault model
2) Transfer fault: point b in the FSM fault model
3) Transfer fault with additional states: point c in the FSM fault model
4) Additional or missing transitions: point d in the FSM fault model
5) Additional or missing states
(Figures: the specification next to two mutants; one has an output fault on t1 (t1: 1/2 instead of 1/1), the other a transfer fault on t2: its ending state is now S3.)
(Figure: the specification and the IUT.)
(Figure: two candidate implementations, Impl. 1 and Impl. 2, of a machine over inputs a, b, c and outputs e, f.)
A test suite is a set of input sequences starting from the initial state of the machine
TS = {r.1.1.2.1, r.2.2.1.2.2}
Expected outputs from the specification MS: 1.1.2.2 and 2.2.1.2.2
A conforming implementation MI produces 1.1.2.2 and 2.2.1.2.2: it passes TS
A non-conforming MI produces 1.1.2.2 and 2.2.2.2.2: it fails to pass TS
With no limitation on the number of such changes, the set of possible implementations is infinite!
(Figure: a protocol FSM with states s1-s4 and transitions labelled CR/ICONind, ICONresp/CC, DT0/AK0, DT1/IDATind,AK1, and DT1/AK1.)
For the given transition:
change the output (output fault)
change the next state (transfer fault)
If a new state can be added, then assume an upper bound on the number of states in implementations.
Mutations
For the example above, there are (SxO)^(SxI) = (4x7)^(4x5) = 28^20 mutants with up to 4 states. Among them, 36 mutants represent single (output or transfer) faults, as only 9 transitions are specified. An example of a very specific fault domain: only the transitions related to data transfer may be faulty. These are 4 transitions. This results in only 28^4 mutants (faulty implementations in the fault domain).
(Figure: a mutant of the protocol machine with DT0/IDATind,AK0 in place of DT0/AK0.)
(Figure: two mutants of the example FSM. M1 conforms under quasi-equivalence (≤qe); M2, in which t6 produces output 1 instead of 2 (t6: 2/1), does not (not ≤qe): the run 2/2.1/1.2/1 exposes the fault.)
The problem of generating a minimum-cost test sequence using the transition tour method is equivalent to the so-called "Chinese Postman" problem in graph theory, first studied by the Chinese mathematician Kuan Mei-Ko in 1962.
T-Method Example -1
The implementation I1 contains an output error. Our transition tour will detect it
The implementation I2 contains a transition error. Our transition tour will not detect it.
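A transition tour can also be built without solving the Chinese Postman problem optimally. The greedy sketch below is my own illustration on a made-up machine: it repeatedly walks, via breadth-first search, to the nearest not-yet-covered transition, and it assumes the machine is strongly connected, as the T-method requires.

```python
from collections import deque

# Made-up strongly connected machine: (state, input) -> (next_state, output).
FSM = {
    ("S1", "1"): ("S2", "1"),
    ("S1", "2"): ("S1", "2"),
    ("S2", "1"): ("S2", "1"),
    ("S2", "2"): ("S1", "2"),
}

def transition_tour(fsm, start):
    """Greedy (non-optimal) transition tour: while some transition is
    uncovered, BFS for the shortest input sequence from the current
    state that ends by traversing an uncovered transition."""
    uncovered = set(fsm)
    tour, state = [], start
    while uncovered:
        frontier, seen, found = deque([(state, [])]), {state}, None
        while frontier and found is None:
            s, path = frontier.popleft()
            for (src, x), (dst, _) in fsm.items():
                if src != s:
                    continue
                if (src, x) in uncovered:
                    found = path + [x]
                    break
                if dst not in seen:
                    seen.add(dst)
                    frontier.append((dst, path + [x]))
        for x in found:                 # replay the path, marking coverage
            uncovered.discard((state, x))
            state = fsm[(state, x)][0]
            tour.append(x)
    return tour
```

Replaying the resulting input sequence against the specification gives the expected output sequence, which is then compared with the IUT's observed outputs.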
Detects all output errors.
Detects all transfer errors.
A DS may not be found for a given FSM.
DS method Example
(Figure: the specification S and an implementation I2, three-state machines over inputs a, b and outputs x, y; I2 contains a transfer error.)
The specification S: a distinguishing sequence is b.b. If we apply it from:
state 1 we obtain y.y
state 2 we obtain y.x
state 3 we obtain x.y
A test case which allows the detection of the transfer error is a.b.b.b. If we apply it from the initial state of:
the specification, we obtain x.x.y.y
the implementation, we obtain x.x.x.x
DS method
(Figure: the specification S.)
Phase 1: identification of all states / state cover. From state 1, we can reach state 2 with b/y and state 3 with a/x. We assume that the reset exists. Q = {ε, a, b}, DS = b.b. Test suite = {r.b.b, r.a.b.b, r.b.b.b}
Phase 2: to cover all transitions for output faults and transfer faults. P = {ε, a, b, a.b, a.a, b.b, b.a}. Test suite: {r.b.b, r.a.b.b, r.b.b.b, r.a.b.b.b, r.a.a.b.b, r.b.b.b.b, r.b.a.b.b}
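Both phases are plain concatenations, so the suites above can be reproduced mechanically. The sketch below rebuilds them for this example (r denotes the reset, the empty string stands for ε):

```python
DS = "b.b"
Q = ["", "a", "b"]                              # state cover
P = ["", "a", "b", "a.b", "a.a", "b.b", "b.a"]  # transition cover

def concat(prefixes, suffix):
    """Prefix each sequence with the reset r and append the DS."""
    suites = []
    for p in prefixes:
        seq = ".".join(part for part in ("r", p, suffix) if part)
        if seq not in suites:   # drop duplicates
            suites.append(seq)
    return suites

phase1 = concat(Q, DS)   # state identification
phase2 = concat(P, DS)   # transition checking
```

`phase1` is {r.b.b, r.a.b.b, r.b.b.b} and `phase2` reproduces the seven-sequence test suite listed above.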
B) Fault detection
B-1) Apply the generated test suites to the specification to obtain the expected outputs.
B-2) Apply the generated test suites to the implementation to obtain the observed outputs.
Compare the expected and observed outputs (test results). If they differ, the verdict is fail; otherwise it is a pass for the applied test suites.
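The comparison step is mechanical. A minimal sketch, using the test case and output strings from the DS example above (the a.b.b.b sequence):

```python
def verdicts(expected, observed):
    """Per-test-case pass/fail verdicts from expected vs. observed outputs."""
    return {tc: "pass" if expected[tc] == observed[tc] else "fail"
            for tc in expected}

expected = {"r.a.b.b.b": "x.x.y.y"}   # outputs of the specification
observed = {"r.a.b.b.b": "x.x.x.x"}   # outputs of a faulty implementation
```

`verdicts(expected, observed)` returns {"r.a.b.b.b": "fail"}, detecting the transfer error.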
UIO Example
(Figure: the specification S, a three-state machine over inputs a, b and outputs x, y.)
The UIO sequences are:
state 1: a.b
state 2: a.a
state 3: a
A transition cover set is P = {e, a, a.b, a.a, b, b.a, b.b}. The test sequences generated by the UIO method are: r.a.b, r.a.a, r.a.b.a.b, r.a.a.a.a, r.b.a.a, r.b.a.a.b, r.b.b.a
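The UIO property is easy to check by brute force: u is a UIO sequence for state s iff the output it produces from s differs from the output produced from every other state. The machine below is a stand-in (not the one in the figure), so its UIO sequences differ from those listed above.

```python
# Stand-in three-state machine: (state, input) -> (next_state, output).
FSM = {
    (1, "a"): (2, "x"), (1, "b"): (3, "y"),
    (2, "a"): (3, "y"), (2, "b"): (1, "y"),
    (3, "a"): (1, "x"), (3, "b"): (2, "x"),
}
STATES = (1, 2, 3)

def out_seq(state, inputs):
    """Output string produced by an input sequence from a given state."""
    outs = []
    for i in inputs:
        state, o = FSM[(state, i)]
        outs.append(o)
    return "".join(outs)

def is_uio(state, inputs):
    """True iff `inputs` uniquely identifies `state` by its outputs."""
    mine = out_seq(state, inputs)
    return all(out_seq(t, inputs) != mine for t in STATES if t != state)
```

In this machine, "a" is a UIO for state 2 (output y, versus x from states 1 and 3), but not for state 1.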
The specification S
We assume the existence of a reset transition with no output (r/-) leading to the initial state for every state of S
W method Example
(Figure: the specification S, a three-state machine over inputs a, b, c and outputs e, f.)
A characterization set is W = {a, b}:
W1 (state 1): a/e
W2 (state 2): a/f, b/f
W3 (state 3): b/e
W = union of all Wi
A transition cover set for the specification S is : P={e, a, b, c, b.a, b.b, b.c, c.a, c.b, c.c}
The P set is not unique: you may select b as a preamble instead of a.
The W-method generates the following test sequences: (P.W) = r.a, r.b, r.a.a, r.a.b, r.b.a, r.b.b, r.c.a, r.c.b, r.b.a.a, r.b.a.b, r.b.b.a, r.b.b.b, r.b.c.a, r.b.c.b, r.c.a.a, r.c.a.b, r.c.b.a, r.c.b.b, r.c.c.a, r.c.c.b
This method is a generalization of the UIOv method; it is always applicable. It is at the same time an optimization of the W-method. The main advantage of the Wp-method over the W-method is to reduce the length of the test suite. Instead of using the set W to check each reached state si, only a subset of W is used in certain cases. This subset Wi depends on the reached state si and is called an identification set for the state si.
Derivation of W
Output table of S (rows: states; columns: inputs a, b, c):
1: e f e
2: f f f
3: f e e
(Figure: the specification S.)
The specification S. We assume the existence of a reset transition with no output (r/-) leading to the initial state for every state of S.
The outputs that distinguish the states:
for state 1: a/e
for state 2: c/f
for state 3: b/e
The identification sets are: W1 = {a} distinguishes state 1 from all other states; W2 = {c} distinguishes state 2 from all other states; W3 = {b} distinguishes state 3 from all other states.
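With these identification sets, phase 2 of the Wp-method appends only the Wi of the state reached by each transition-cover prefix, instead of all of W. The sketch below hardcodes the reached-state map read off this example (an assumption about the machine's transitions) and reproduces a seven-sequence phase-2 suite, versus the twenty sequences of P.W in the W-method.

```python
W_sets = {1: ["a"], 2: ["c"], 3: ["b"]}   # identification sets W1, W2, W3

# State reached by each transition-cover prefix (read off the example S;
# this map is an assumption, since the figure did not survive extraction).
reach = {"a": 2, "b.a": 1, "b.b": 3, "b.c": 2,
         "c.a": 3, "c.b": 1, "c.c": 2}

# Phase 2 of the Wp-method: reset r, then prefix, then Wi of the reached state.
wp_phase2 = ["r." + p + "." + w
             for p, s in reach.items()
             for w in W_sets[s]]
```

This yields the seven test sequences applied in the fault-detection example that follows.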
(Figure: a faulty implementation I of S.)
The application of the test sequences obtained in Phase 2 leads to the following sequences of outputs :
r.a.c2, r.b.c.c2, r.b.a.a1, r.b.b.b3, r.c.a.b3, r.c.c.c2, r.c.b.a1 (the subscript indicates which identification set Wi is applied)
S: -.e.f  -.f.f.f  -.f.f.e  -.f.f.e  -.e.f.e  -.e.e.f  -.e.e.e
I: -.e.f  -.f.f.f  -.f.f.e  -.f.f.e  -.e.f.f  -.e.e.f  -.e.e.e
A faulty implementation I: I contains a transfer error 2 -a/f-> 1 instead of 2 -a/f-> 2 as defined in the specification S.
The fifth output sequence observed from I (-.e.f.f) differs from the one expected according to the specification (-.e.f.e). Therefore, the transfer error in the implementation is detected by this test sequence.
(Figure: the specification S with its output table.)
A characterization set for the W-method is W = {a, b}: for state 1, a/e; for state 2, {a/f, b/f}, which would increase the size of the test suite. That is why c/f should be selected as the identification set for state 2.
The specification S. We assume the existence of a reset transition with no output (r/-) leading to the initial state for every state of S.
The identification sets are: W1 = {a} distinguishes state 1 from all other states; W2 = {a, b} distinguishes state 2 from all other states but is not optimized; W3 = {b} distinguishes state 3 from all other states.
Examples
Test hypotheses for the Distinguishing Sequence, UIO, and W methods:
H1) strongly connected machine
H2) contains no equivalent states
H3) deterministic
H4) completely specified machine
H5) failures that increase the number of states do not occur
The method is applied in two phases from the initial state:
Phase 1) α-sequences to check that each state defined by the specification also exists in the implementation.
Phase 2) β-sequences to check all the individual transitions in the specification for correct output and transfer in the implementation.
DS Method
Assume that a reset transition r/- exists.
Q1) Verify whether a.a is a DS for S and explain why.
Q2) Find a DS of length 2 for S, different from a.a.
(Figure: the machine S, with states S0-S4, over inputs a, b.)
W method
Assume that the reset exists and brings the machine from any state to the initial state.
a) Find a characterization set W and generate the set of test cases for the specification S using the W-method.
b) Does S have a DS? If not, explain why.
W method
(Figure: a three-state machine S, with states S0, S1, S2, over inputs a, b.)
S0: b/1; S1: a/1; S2: a/0, b/0
W = union of the Wi, W = {a, b}
State cover Q = {ε, a, b}
Transition cover P = {ε, a, b, a.b, a.a, b.a, b.b}
P - Q = {a.b, a.a, b.a, b.b}
P - Q is used for phase 2, together with the α- and β-sequences, to avoid redundancy.
Phase 1 , Q.W = {r.a, r.b, r.a.a, r.a.b, r.b.a, r.b.b} Phase 2, (P-Q).W = {r.a.b.a, r.a.b.b, r.a.a.a, r.a.a.b, r.b.a.a, r.b.a.b, r.b.b.a, r.b.b.b}
Examples Suite
Transition tour:
(Figure: the specification S, with states S0, S1, S2, over inputs a, b.)
Input: a.b.a.b.a.b
Output: 0.1.0.0.1.0
Comment: a as input at each state will loop on the state, so a sequence a.a... cannot be a DS; the output will be 0.0... or 1.1...
The Q set permits reaching each state from the initial state: Q = {ε, b, b.b}. The first b reaches the state S2; b.b reaches the state S1.
The P set is a transition cover: it permits executing each transition at least once, starting from the initial state.
(Figure: the specification S.)
How to derive the P set: enumerate all paths from the initial state, of length 1 and up, so that each transition is traversed at least once.
(Figure: the derivation tree rooted at S0.)
The goal of Phase 1 is identification of the states in the implementation.
DS = a.b, Q = {ε, b, b.b}, P = {ε, a, b, b.a, b.b, b.b.a, b.b.b}
Phase 1: Q.DS = {r.a.b, r.b.a.b, r.b.b.a.b}
Expected outputs of phase 1: {-.0.1, -.1.0.0, -.1.0.1.0}
Phase 2: P.DS = {r.a.b, r.a.a.b, r.b.a.a.b, r.b.b.a.b, r.b.b.a.a.b, r.b.b.b.a.b} (r.b.a.b is already covered by phase 1)
Expected outputs: {-.0.1, -.0.0.1, -.1.0.0.0, -.1.0.1.0, -.1.0.1.1.0, -.1.0.0.0.1}
(Figure: the specification S.)
Note that the test suites for phases 1 and 2 should be derived from the specification and applied to the implementation to check it for output and transfer faults.
(Figure: the specification S and the implementation I, which contains a transfer fault, side by side.)
The implementation I has a transfer fault; the fault is not detected by the transition tour. The transition tour detects all output faults but doesn't guarantee the detection of transfer faults.
(Figure: the implementation I.)
The goal of Phase 1 is identification of the states in the implementation.
DS = a.b, Q = {ε, b, b.b}, P = {ε, a, b, b.a, b.b, b.b.a, b.b.b}
Phase 1: Q.DS = {r.a.b, r.b.a.b, r.b.b.a.b}
Expected outputs of phase 1: {-.0.1, -.1.0.0, -.1.0.1.0}
Observed outputs from I: {-.0.1, -.1.0.0, -.1.0.1.0}
Phase 2: P.DS = {r.a.b, r.a.a.b, r.b.a.a.b, r.b.b.a.b, r.b.b.a.a.b, r.b.b.b.a.b}
Expected outputs: {-.0.1, -.0.0.1, -.1.0.0.0, -.1.0.1.0, -.1.0.1.1.0, -.1.0.0.0.1}
Observed outputs from I: {-.0.1, -.0.0.1, -.1.0.0.0, -.1.0.1.0, -.1.0.1.1.0, -.1.0.0.0.0}; the transfer fault is detected
Specification S
(Figure: a specification S over inputs a, b, c with outputs 0 and 1.)
Transition tour input: a.b.a.b.c.a.c.b.c
Expected output: 0.0.0.0.0.1.1.0.0
Analysis
Fault Testing Coverage
The fault coverage of the D-, W-, and U-methods is better than that of the T-method.
The fault coverage of the D-, W-, and U-methods is the same.
Summary
All four methods assume a minimal, strongly connected, and fully specified Mealy FSM model of protocol entities.
On average, the T-method produces the shortest test sequence and the W-method the longest; the D- and U-methods generate test sequences of comparable lengths.
T-method test sequences are able to detect output faults but not transfer faults.
The D-, W-, and U-methods are capable of detecting all kinds of faults and give the same performance.
The U-method attracts more and more attention, and there are several approaches based on its basic idea with some improvements.
DS method Example
The test cases are:
state 1: a.b.b, b.b.b
state 2: b.a.b.b, b.b.b.b
state 3: a.a.b.b, a.b.b.b
(Figure: an FSM with transitions t1-t9 over inputs a, b.)
Transition tour TT: t1, t4, t3, t9, t2, t3, t6, t7, t8
TT (input/expected output): a/1.b/2.a/1.a/2.b/2.a/1.b/2.a/2.b/2
(Figure: the example FSM augmented with a reset transition r/- from every state to the initial state S1.)
Independence (continued)
The independence relation is a reasonable assumption in certain cases.
Example:
(Figure: the equipment under test, an N+1-entity, accessing several N-entities through their SAPs.)
Test Architectures
How do we stimulate protocol entities for testing purposes ?
OSI Terminology
The PCO has two FIFO queues:
Send (from tester to IUT)
Receive (by tester from IUT)
The lower tester (LT) controls and observes the IUT's lower service boundary, indirectly, via the underlying service provider.
In single-party testing, the LT behaves as the peer entity to the IUT; in multi-party testing, several LTs act as peer entities working in parallel.
Test coordination procedures (TCPs) are used to ensure cooperation between the UTs and LTs.
TCPs cover: how the tester shall respond, passing (preliminary) results, and synchronisation. TCP here is NOT the Transmission Control Protocol, as in TCP/IP.
ATM Classification
ATMs for multi-party testing
Several parallel upper and lower testers. In complex situations an upper tester control function (UTCF) is needed. Special cases include only one upper tester, or even no upper tester at all.
Upper tester located in the test system: requires an upper interface on the IUT; the IUT is built into the tester; no ATSs for this method; good for testing a hardware component (example: an Ethernet driver).
UT in the SUT, LT remote: requires synchronization; suitable for upper-layer protocols and protocols offering an API (example: socket communication).
UT in the SUT but without access, LT remote: no assumption of an upper interface to the IUT; uses only one PCO, below the LT; uses a Test Management Protocol (TMP) embedded in ASPs; suitable for mid-layer protocols.
No upper tester: the upper tester can be a native application or a user-accessible interface; manual coordination; limited, but easy to use.