DOCUMENTATION, TRAINING, INSTALLATION AND TESTING
IMPLEMENTATION
STAGES OF IMPLEMENTATION
Step 1: Select location / site

Program testing
Involves processing test data through all programs
Fully documented – to be used if modifications are required
Cover the following areas:
▪ Input validity checks
▪ Program logic functioning
▪ Interfaces with related modules / systems
▪ Output format and validity
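The four areas above can be illustrated with a small unit-test sketch in Python. The function `validate_age` and its validity rules are hypothetical examples, not taken from the slides:

```python
# Hypothetical example of program testing: an input-validity check
# plus assertions covering input validity, logic, and output validity.

def validate_age(value):
    """Accept only integer ages in a plausible range (illustrative rule)."""
    if not isinstance(value, int) or isinstance(value, bool):
        return False
    return 0 <= value <= 120

# Input validity checks: valid and invalid cases
assert validate_age(35) is True
assert validate_age(-1) is False
assert validate_age("35") is False      # wrong type rejected

# Output format and validity: the result is always a boolean
assert all(isinstance(validate_age(v), bool) for v in (5, 200, None))
```

In practice each program in the system would get a suite like this, and the documented cases are re-run whenever the program is modified.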
TYPES OF TESTING Contd
System testing
Testing conducted on a complete, integrated system to
evaluate the system's compliance with its specified
requirements. Wider focus than program testing. System
testing falls within the scope of black box testing, and as
such, should require no knowledge of the inner design of
the code or logic.
▪ Input documentation and practicalities of input
▪ Flexibility to allow amendments
▪ Ability to produce timely information
▪ Ability to cope with peak system requirements
▪ Viability of operating procedures
Occurs both before and after implementation
Performance testing
Performance testing can be applied to understand an application or
web site's scalability, or to benchmark the performance of third-party
products such as servers and middleware in a given environment before
purchase. This sort of testing is particularly useful for identifying
performance bottlenecks in high-use applications.
Performance testing generally involves an automated test suite, as this
allows easy simulation of a variety of normal, peak, and exceptional
load conditions.
The goal is to evaluate compliance of a system or component with
specified performance requirements.
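A minimal sketch of the idea, assuming a CPU-bound operation as the system under test (the workload and load levels below are made up for illustration):

```python
import time

def handle_request(payload):
    # Stand-in for the operation under test (assumed CPU-bound work)
    return sum(i * i for i in range(payload))

def measure(load, payload=10_000):
    """Run `load` simulated requests and return throughput in requests/second."""
    start = time.perf_counter()
    for _ in range(load):
        handle_request(payload)
    elapsed = time.perf_counter() - start
    return load / elapsed

# Normal, peak, and exceptional load conditions, as described above
for label, load in [("normal", 50), ("peak", 200), ("exceptional", 500)]:
    rps = measure(load)
    # A real suite would compare rps against the specified performance requirement
    print(f"{label:12s} load={load:4d} throughput={rps:8.1f} req/s")
```

A real automated suite would also run requests concurrently and record response-time percentiles, not just throughput.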
Usability testing
A technique used to evaluate a product by testing it on users. This can be seen as an
irreplaceable usability practice, since it gives direct input on how real users use the
system, and it establishes user satisfaction.
The aim is to observe people using the product to discover errors and areas for
improvement. Usability testing generally involves measuring how well test subjects
respond in four areas: performance, accuracy, recall, and emotional response. The results of
the first test can be treated as a baseline or control measurement; all subsequent tests
can then be compared to the baseline to indicate improvement.
▪ Performance -- How much time, and how many steps, are required for people to complete basic
tasks? (For example, find something to buy, create a new account, and order the item.)
▪ Accuracy -- How many mistakes did people make? (And were they fatal or recoverable with the
right information?)
▪ Recall -- How much does the person remember afterwards or after periods of non-use?
▪ Emotional response -- How does the person feel about the tasks completed? Is the person
confident, stressed? Would the user recommend this system to a friend?
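The baseline comparison described above can be sketched as a simple metric delta; the metric names and numbers here are invented for illustration:

```python
# Baseline vs. follow-up comparison of usability metrics (illustrative numbers).
# For time, steps, and errors a negative change is an improvement.
baseline = {"time_s": 120, "steps": 14, "errors": 5, "recall_pct": 60}
retest   = {"time_s": 95,  "steps": 11, "errors": 2, "recall_pct": 75}

def improvement(before, after):
    """Percent change per metric relative to the baseline measurement."""
    return {k: round(100 * (after[k] - before[k]) / before[k], 1) for k in before}

delta = improvement(baseline, retest)
print(delta)   # e.g. errors: -60.0 means 60% fewer mistakes than the baseline
```

Each subsequent usability test reuses the same tasks and metrics so its results stay comparable to the baseline.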
TEST PLAN
● What
● When
● Under which environment
TEST DESIGN
● The logic and reasoning
● Detailed procedures
PERFORMING TESTS
● Consistent testing at different time periods
● Documentation of the results and how it is to be done
DOCUMENTATION
● Record of errors
● Correction of errors procedure
RE-TESTING
● Re-testing procedure
● Re-testing of all modules and all aspects of the software
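The documentation and re-testing stages can be sketched as a tiny harness that records errors and re-runs the cases after a correction; all of the functions below are hypothetical placeholders:

```python
# Minimal sketch: run the cases, record each error (documentation stage),
# then re-test the same cases after a simulated correction.

def run_cases(func, cases):
    errors = []                      # record of errors for the documentation stage
    for args, expected in cases:
        got = func(*args)
        if got != expected:
            errors.append((args, expected, got))
    return errors

def buggy_add(a, b):
    return a - b                     # deliberate defect

def fixed_add(a, b):
    return a + b                     # corrected version after error analysis

cases = [((1, 2), 3), ((0, 0), 0), ((5, 5), 10)]
first_run = run_cases(buggy_add, cases)   # errors recorded against the test plan
retest = run_cases(fixed_add, cases)      # re-testing after correction
print(len(first_run), len(retest))        # → 2 0
```

Note that the whole case list is re-run, not just the failing cases, matching the point above about re-testing all modules and aspects.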
LIMITATIONS OF SOFTWARE TESTING
Poor testing process (e.g. a bad test plan, or testers who are not well trained)
TRAINING METHODS
Individual
Classroom
Computer based
Case studies
Software reference material
DOCUMENTATION
● Standing records for a starting point
● Establish controls (control totals)
● Transcribe files onto input forms (input forms = data entry screen)
● Check input forms
● Check system controls / totals
● Print reports
Changeover
After satisfactory testing, changeover shall take place
Four approaches
Direct changeover
▪ Old system completely replaced
Parallel running
▪ Old and new systems run in parallel for a defined period
Pilot operation
▪ One department runs the two systems in parallel on a pilot basis; if
satisfactory, the whole organisation changes over
Phased or staged changeover
▪ Sections of the system undergo direct changeover in phases
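Parallel running works by feeding the same transactions to both systems and reconciling the outputs before trusting the new one. A minimal sketch, where `old_payroll` and `new_payroll` are hypothetical stand-ins for the two systems:

```python
# Parallel running sketch: same inputs to old and new system, then reconcile.

def old_payroll(hours, rate):
    return hours * rate              # stand-in for the old system's calculation

def new_payroll(hours, rate):
    return round(hours * rate, 2)    # stand-in for the new system's calculation

transactions = [(40, 12.5), (37.5, 15.0), (20, 9.75)]

mismatches = [
    t for t in transactions
    if old_payroll(*t) != new_payroll(*t)
]
# Changeover proceeds only when a parallel period ends with no mismatches
print("safe to change over" if not mismatches else f"{len(mismatches)} mismatches")
```

This is exactly why parallel running is the safest approach and also the most costly: every transaction is processed twice until the reconciliation is clean.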
Advantages and limitations

Direct changeover
▪ Advantages: quick, at minimal cost; minimises workload
▪ Disadvantages: risky; possible disruption of operations; failure is costly

Pilot operations
▪ Advantages: less risky than direct changeover; less costly than complete parallel running
▪ Disadvantages: long time to achieve total change; less safe than parallel running

Phased changeover
▪ Advantages: less risky than a single direct changeover; problems in one section do not affect others
▪ Disadvantages: long time to change; can be impractical because of interfaces between parts of the system
SYSTEM MAINTENANCE
Types of Maintenance
Features of maintenance
Flexibility and adaptability
Types of maintenance
Corrective
▪ Reaction to system failure
Perfective
▪ Enhancing the system beyond its original requirements ("making the software perfect")
Adaptive
▪ Take account of changing environment
Causes of system maintenance
Errors
Changes in requirements
Poor documentation
SYSTEM PERFORMANCE AND EVALUATION
PERFORMANCE MEASUREMENT
INDIRECT MEASURES
Significant task relevance (observe the results of
system use, whether immediate or delayed)
Willingness to pay
▪ Pay as you satisfy
System logs
User information satisfaction
▪ A survey of users on several criteria
Adequacy of documentation
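System logs are an indirect measure because usage frequency can be counted without asking users anything. A sketch, using a made-up log format of "date time user action":

```python
from collections import Counter

# Illustrative log lines; a real system log would be read from a file.
log_lines = [
    "2024-01-05 09:01 alice login",
    "2024-01-05 09:03 alice report_print",
    "2024-01-05 09:10 bob login",
    "2024-01-05 17:40 alice logout",
]

# Count how often each feature is used (the last field of each line)
actions = Counter(line.split()[-1] for line in log_lines)
print(actions["login"], actions["report_print"])   # → 2 1
```

High login counts and frequent feature use suggest the system is actually relied on; features that never appear in the log are candidates for review.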
PERFORMANCE REVIEWS
Conflicting demands
Time, cost, quality, resources
Appointment of project managers
A good manager, not merely a good specialist
Other factors
Unrealistic deadlines
Non-existent planning (failing to plan is planning to fail)
Poor timetabling and resourcing
Non-existent controls
Changing requirements
Establishment of steering committee
Definite objectives
Approve projects
Recommend projects
Establishing priority projects
Establish company guidelines
Coordination and control
Evaluation
System review after implementation