Software Quality and Testing study structure, lists and important terms
Why test: provide confidence, provide overall understanding, provide sufficient information for
objective decisions, establish extent to which requirements are met, establish degree of quality
Error/mistake (by people), Fault/Defect/Bug (result of error), Failure (when fault executes), Incident
(behavior of fault)
Quality (total characteristics of ability to satisfy stated or implied needs, testing does not build but
determines) Testability, Maintainability, Modularity, Reliability (mean time between failures (MTBF)),
Efficiency, Usability, Reusability, Legal requirements/standards, etc. External (10%) vs internal quality
(90%) like an iceberg.
Software testing: certifies quality, finds faults/failures, a process to verify software satisfies
requirements, an examination of the behavior of a program
Testing not debugging, static and dynamic, process, set of techniques, generally not possible to prove
no faults, helps locate errors, should be repeatable
Test data one element of input domain, test data set finite set of test data, exhaustive testing when
all test data in the input domain are tested, power of a test data set is context dependent not just
based on number of elements, naive random testing (on a uniform distribution) is generally hopeless
Cost of testing single failure may incur little cost or millions, extreme cases error can cost lives, safety
critical systems are tested rigorously
Product lifecycle phases requirements, design, coding, testing, maintenance bugs more expensive at
later phase
Testing principles: Exhaustive testing impossible, Early testing (start as soon as possible), Testing is
context dependent, Pesticide paradox (same set of tests will not find new defects), Pareto
principle/defect clustering (80% of faults occur in 20% of code), Absence of errors fallacy (just because
testing results are good does not mean software is good), Presence of defects (testing only shows
presence of bugs not absence Dijkstra), Testing process to add value to product, All tests should be
traceable, Tests must be ranked (do best testing in time available)
How to prioritize test where failures most visible, most likely, ask customer to prioritize, critical to
customers business, areas changed most often, areas with most past problems, most complex or
technically critical areas
Typical life-cycle phase: requirement specification, conceptual plan, architectural design, detailed
design, component development, integration, system qualification, release, system operation &
maintenance, retirement/disposal
Agile manifesto Individuals and interactions > Processes and tools, Working Software > Comprehensive
Documentation, Customer Collaboration > Contract Negotiation, Responding to change > Following a
plan.
XP Principles Planning game, Small releases, Metaphor, Test before coding (test driven development
(TDD)), Refactoring, Pair programming, Common ownership of code, Continuous integration, 40 hours
work per week, Internal client, Coding regulations
Test driven development cycle Red (write test, watch fail), Green (implement, watch pass), Refactor
(refactor, watch pass)
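The Red-Green-Refactor cycle can be sketched in Python with the standard unittest module; the leap-year function is an invented illustrative example, not from the notes:

```python
import unittest

# Red: the test class below is written first, before leap_year exists,
# and is watched to fail.
# Green: leap_year is then implemented just enough to make it pass.
# Refactor: the code is cleaned up while the tests keep passing.

def leap_year(year):
    # Simplest implementation that satisfies all three tests
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

class TestLeapYear(unittest.TestCase):
    def test_divisible_by_4(self):
        self.assertTrue(leap_year(2024))

    def test_century_not_leap(self):
        self.assertFalse(leap_year(1900))

    def test_divisible_by_400(self):
        self.assertTrue(leap_year(2000))
```

Run with a unittest runner (e.g. `python -m unittest`); the point of the cycle is the ordering of the three steps, not the example itself.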
SCRUM principles Split organization into small, cross-functional, self-organizing teams, Split work into
small concrete deliverables, Split time into short, fixed-length iterations (usually 1-4 weeks), Optimize
the release plan and update priorities in collaboration with customer, Optimize the process with a
retrospective
Test scenario high level classification of test requirements grouped by functionality, identifier,
conditions satisfied, scenario description, output
Test case Identifier, test case owner/creator, version & date, name of test case, requirement
identifier/link, purpose, priority, dependencies, testing environment/configuration, initialization
(precondition), finalization (postcondition), executed by, expected average execution duration, actions,
input data (precondition), expected results (postcondition), actual results (after running), status
pass/fail, note
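As a sketch, the test case fields above can be captured as a simple record; the field names and values here are illustrative only, not a standard template:

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    # A subset of the test case fields; names are illustrative
    identifier: str
    name: str
    requirement_id: str      # traceability link back to a requirement
    priority: int            # lower number = higher priority
    preconditions: list = field(default_factory=list)
    actions: list = field(default_factory=list)
    expected_results: list = field(default_factory=list)
    actual_results: list = field(default_factory=list)
    status: str = "not run"  # pass / fail / not run

tc = TestCase(
    identifier="TC-001",
    name="Valid login",
    requirement_id="REQ-12",
    priority=1,
    preconditions=["user account exists"],
    actions=["enter credentials", "press login"],
    expected_results=["dashboard shown"],
)
```

Keeping the requirement identifier on the record is what makes the test case traceable back to its test basis.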
Testing-by-contract (based on design-by-contract, create test conditions only where preconditions are
met), defensive test case design (tests both normal and abnormal preconditions)
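The contrast between the two design styles can be sketched as follows; the square-root function and its precondition are an invented example:

```python
import math

def sqrt_contract(x):
    # Design-by-contract: the caller guarantees x >= 0, so
    # testing-by-contract creates test conditions only where the
    # precondition holds; behaviour for x < 0 is undefined.
    assert x >= 0, "precondition violated: x must be non-negative"
    return math.sqrt(x)

def sqrt_defensive(x):
    # Defensive version: abnormal input is detected and reported
    # explicitly, so defensive test case design also includes a
    # test with x < 0 expecting this error.
    if x < 0:
        raise ValueError("x must be non-negative")
    return math.sqrt(x)
```

Contract-based tests would only ever pass non-negative values to `sqrt_contract`, while a defensive test suite would include a negative input and assert that `ValueError` is raised.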
Good test case properties accurate, economic, effective, exemplary, evolvable, executable, repeatable,
reusable, traceable
Test condition item or event that can be verified by one or more test cases, Test basis source of
information or document to write tests, Test script/procedure sequences of actions of a test, testware
artifacts produced during test process. Identify test condition (what), specify test scenario and test
cases (with which, one or more test cases for a test scenario), specify test procedure (how)
Levels of testing unit (smallest testable part, constructed in isolation, functionality and non-functional
characteristics, structural testing, robustness (degree to which it operates correctly under abnormal
conditions), memory leak (poor memory management, not freeing up used memory), supported by tools), integration (expose defects in
interfaces and interactions between components and systems, component integration, system
integration), system, acceptance
Integration testing types stubs (inputs to code)/mocks (output from code)/fakes (working objects that
take shortcuts) can be used with different integration models: big-bang, bottom-up, top-down, inside-
out, outside-in, branch-wise strategy
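A minimal sketch of the three test doubles in Python, using the standard library's unittest.mock for the mock; the payment-gateway interface is hypothetical:

```python
from unittest.mock import Mock

def charge(gateway, amount):
    # Code under test: depends on an external payment gateway
    return gateway.pay(amount)

# Stub: supplies canned input to the code under test
stub_gateway = Mock()
stub_gateway.pay.return_value = "ok"
assert charge(stub_gateway, 10) == "ok"

# Mock: additionally verifies the interaction coming out of the
# code under test (that pay was called, once, with amount 10)
stub_gateway.pay.assert_called_once_with(10)

# Fake: a working object that takes a shortcut
# (in-memory record instead of a real network call)
class FakeGateway:
    def __init__(self):
        self.charged = []
    def pay(self, amount):
        self.charged.append(amount)
        return "ok"

fake = FakeGateway()
charge(fake, 25)
assert fake.charged == [25]
```

The same doubles are what make bottom-up (drivers) and top-down (stubs) integration models possible before all real components exist.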
System testing types focuses on whole system or product in live environment, non-functional tests
(installability, interoperability, maintainability, portability, recovery, reliability, usability, load
(spike/stress/stability), soak/endurance, volume, configuration, compatibility, environment)
Acceptance testing types provide end users confidence that system is according to expectations, client
involvement, alpha (internal acceptance testing), beta (external acceptance testing)
Maintenance testing (implies a live environment) Additional features, New faults being found, Retired
system, modification of live environment, adapt product to modified environment, impact analysis
Traceability traceability matrix (link requirements back to stakeholders' rationale and forward to design,
artifacts, code, test cases - all 4 of one/many-to-one/many) horizontal (components across
workproducts), vertical (relationship of parts of single workproduct), impact analysis (assess impact on
rest of system)
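A forward traceability matrix can be sketched as a plain mapping (requirement and test case identifiers below are invented); backward traceability and impact analysis then become reverse lookups:

```python
# Forward traceability: requirement -> covering test cases
matrix = {
    "REQ-1": ["TC-1", "TC-2"],
    "REQ-2": ["TC-2"],
    "REQ-3": [],          # gap: requirement with no covering test case
}

def uncovered_requirements(matrix):
    # Coverage gap analysis: requirements no test case traces to
    return [req for req, cases in matrix.items() if not cases]

def requirements_for_test(matrix, test_case):
    # Backward traceability, used for impact analysis: which
    # requirements are affected when this test case (or the code
    # it exercises) changes
    return [req for req, cases in matrix.items() if test_case in cases]
```

Here `uncovered_requirements(matrix)` reveals REQ-3 as untested, and `requirements_for_test(matrix, "TC-2")` shows a change touching TC-2 impacts both REQ-1 and REQ-2.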
Fundamental test processes - Test planning (work breakdown structure (WBS)), Test Control (Test
analysis and design, Test implementation and execution, Evaluating exit criteria and reporting), Test
closure activities
Test planning levels - Quality Policy/Test Policy (company level) and Test Strategy (company level,
possible at all levels)), High-level test plan and Test approach (operational) (project level), Detailed test-
plan (test stage level)
Master test plan/project test plan SPACEDIRT acronym covers all minimum 16, and 19 from IEEE
(scope (test items, glossary), people (features to be tested, features not to be tested),
approach/approvals, criteria (pass/fail, suspension/resumption), environmental needs/estimates
(schedule), deliverables, identifier/introduction, risks (software, planning)/responsibilities/references,
tasks (remaining)/training)
Quality software Customer (cost-effective), Developer (easy to develop/maintain), User (easy to use),
Development manager (profitable), Product quality vs Process quality
Risk types Project (planning), Product (quality) expressed usually as likelihood and impact, level of
risk = probability of risk occurring x impact if it did happen (qualitative e.g. low, medium, high or
quantitative e.g. 25%)
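The level-of-risk formula can be sketched quantitatively; the probability scale (a fraction) and impact scale (1-5) are assumed conventions, not mandated by the notes:

```python
def risk_level(probability, impact):
    # probability: likelihood of the risk occurring, 0.0 - 1.0
    # impact: cost/severity if it does occur, e.g. on a 1-5 scale
    return probability * impact

# Hypothetical risks for illustration
risks = {
    "supplier delay": risk_level(0.25, 4),
    "test env down":  risk_level(0.10, 2),
}

# Rank risks so mitigation effort goes to the highest exposure first
ranked = sorted(risks, key=risks.get, reverse=True)
```

A qualitative scheme replaces the numbers with low/medium/high bands, but the ranking idea is the same.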
Risk management Risk Identification (raw risk data), Risk Analysis (risk priority table), Risk
Response/contingency plans (strategies of avoidance, transference, mitigation, acceptance, individuals
to take responsibility, added to the project action list, continually monitoring and correcting)
Product risks (error prone software delivered, poor requirements, defect in software, poor software
quality, software not meeting requirements, system forces user to spend inappropriate amount of time,
system crashes, corrupt data, slow performance, incorrect documentation), Project risk (project
manager responsibility, supplier issues, organizational factors, specialist issues, documented in test plan,
risk register maintained by test leader)
Risks for testing: insufficient/not available/poor test basis, aggressive delivery, lack of test expertise,
poor test management processes, bad estimations/effort overrun/schedule delay, problems with test
environment, poor test coverage, etc
Psychology of Testing show what system should and should not do easiest test cases (faults
remain)/hardest test cases (few faults remain), reduce the perceived risk, confidence, testing paradox
(best way to build confidence is to destroy it (by testing/finding fault)), developer (prove code works,
driven by delivery), independent tester (prove code does not work, driven by quality)
Successful tester: focus on delivering quality product, results presented in non-personal way, attempt to
understand how others feel, confirm understanding after discussions, be constructive, must be able to
communicate, Verbal and Written Communication/Passion/Technical Skills/Analytical
Skills/Attitude/Productivity
Technical skills needed in testing for - Test managers, Test analyst, Test automation experts, Test
performance experts, Database administrator or experts, User interface experts, Test environment
managers, Test methodology experts, Test tool experts, Domain experts
Test job (employed to do, one or more roles), Testing roles (activity(-ies), one or more role in project)
test leader, tester
Test leader: responsible for test strategy, collaborate with project management, takes notice of
organization testing policies, coordinates testing activities with other project activities, coordinates
design, specification, implementation, execution of tests, monitors test results and exit criteria, decides
on test environment implementation, maintains plans of test progress/results, determines what should
be automated and how/how much, responsible for test support tools, decides proper metrics, writes
test summary report
Typical tester: reviewing and contributing to test plans, analyzing, reviewing and assessing user
requirements, creating test specifications from test basis, setting up test environment, preparing and
acquiring/copying/creating test data, implementing tests on all test levels, executing/logging/evaluating
tests, using test administration/monitoring tools, automating tests, run tests, review tests from others
Code of Ethics (testers may have access to confidential privileged information) Public, Client and
Employer, Product, Judgement, Management, Profession, Colleagues, Self
Test Analysis (review test basis, evaluate testability, identify/prioritize test conditions), Test Design
(predict how Software Under Test (SUT) behaves, design prioritize test scenarios/cases, design test sets,
identify necessary test data, design test environment set-up)
Test oracle expected result, states precisely what outcome from program execution will be for a test
case if possible
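One common way to realize an oracle is a trusted but slower reference implementation; this sketch (both functions invented for illustration) judges a closed-form sum against it:

```python
def sum_fast(n):
    # Implementation under test: closed-form sum of 1..n
    return n * (n + 1) // 2

def sum_oracle(n):
    # Trusted but slow reference implementation acting as the oracle:
    # it states precisely what the expected result must be
    return sum(range(1, n + 1))

def check(n):
    # Compare actual vs expected and report pass/fail
    actual, expected = sum_fast(n), sum_oracle(n)
    return ("pass" if actual == expected else "fail", actual, expected)
```

Other oracle forms include the specification itself, tables of hand-computed expected results, or a previous version of the system.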
Writing test cases: too many scenarios (use risk based testing), business case focused (model with
activity graph), technology focused (OO programming, model with sequence diagram)
Use case format Use Case Name, Scope, Level, Primary Actors, Stakeholders, Preconditions,
Postconditions, Main Success Scenario, Extensions/Exceptions, Special Requirements, Technology &
Data Variations List, Frequency of Occurrence, Miscellaneous
Structure-based techniques (white/glass box, based on structure of program/generate test cases from
code itself/pseudo-code) procedure, component, integration, system levels and statement (only
executable statements), decision (all branches are decisions but some decisions like question mark colon
?: operator or short circuit evaluation (&&,||) implicit without branch), branch (if/then/else, switch),
path coverages (exponential in number of conditional branches, presence of cycles must have limit to
prevent infinite, linearly independent paths or basis paths identified, McCabe Cyclomatic metric upper
bounds on number of independent paths V(G)=E-N+2 where G control flow graph, N number of nodes, E
number of edges) (test coverage is a quantitative measure, estimated as number of items
covered/total number of items * 100%)
Control flow graph (nodes are basic blocks/segments (each block with sequence of statements, no
jumps from or to middle of block, once executes guaranteed to execute to end))
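V(G) = E - N + 2 can be computed directly from a control flow graph stored as an adjacency list; the graph below models a single if/then/else, giving two linearly independent paths:

```python
def cyclomatic_complexity(cfg):
    # cfg: dict mapping each node to its list of successor nodes
    n = len(cfg)                                  # N: number of nodes
    e = sum(len(succ) for succ in cfg.values())   # E: number of edges
    return e - n + 2                              # V(G) = E - N + 2

# Control flow graph of a single if/then/else: 4 nodes, 4 edges
cfg = {
    "entry": ["then", "else"],   # decision node: two outgoing branches
    "then":  ["exit"],
    "else":  ["exit"],
    "exit":  [],
}
```

Here V(G) = 4 - 4 + 2 = 2, matching the intuition that one decision yields two basis paths to cover.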
Experience-based techniques error guessing, exploratory testing, checklist-based testing, attack testing
Defect-based techniques taxonomy lists of root causes, defects and failures e.g. Beizer (defect), Kaner
(general), Binder (OO), Vijayaraghavan (e-Commerce)
Choosing test techniques Internal Factors (models used, tester knowledge/experience, likely defects,
test objective, documentation, life-cycle model, previous experience of defects), External Factors (level
and type of risk, customer/contractual requirements, type of system, regulatory requirements, time and
budget)
Test Implementation (create test cases, test data, expected results, test procedures, test harnesses,
automated test scripts, test suites, verify test environment), Test Execution (execute test procedures
manually or using tools, record identities and versions of the Software Under Test, test tools and testware, logging outcome,
reporting discrepancies, repeating test activities with re-testing, regression testing)
Test harness test execution engine and test script repository providing test environment with stubs
and drivers to execute tests.
Re-testing/confirmation testing previously failed tests run again to test for pass.
Regression testing previously passed tests run again to test not failing, misnomer: more like anti-
regression or progression testing.
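The distinction can be sketched over a recorded test history (the history structure is invented for illustration): re-testing selects previously failed tests, regression testing selects previously passed ones:

```python
# Last recorded outcome per test case (illustrative identifiers)
history = {
    "TC-1": "pass",
    "TC-2": "fail",   # failed last run; its fix triggers re-testing
    "TC-3": "pass",
}

def retest_set(history):
    # Re-testing / confirmation testing: run previously FAILED
    # tests again to confirm the fix
    return [tc for tc, status in history.items() if status == "fail"]

def regression_set(history):
    # Regression testing: run previously PASSED tests again to
    # confirm the change has not broken anything else
    return [tc for tc, status in history.items() if status == "pass"]
```

In practice the regression set is pruned by risk and change impact, since re-running everything after every change rarely fits the schedule.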
Automating tests is not trivial, takes 2 to 10 times longer than manual, cannot automate everything,
must plan what to automate.
Static testing (review, static analysis) code not executed. Dynamic testing code executed.
Test review stages Planning (select personnel, allocate roles, define entry/exit criteria, select parts of
document for review), Kick-Off (distributing documents, explain objectives, process and documents,
check entry criteria), Individual Preparation (reviewers perform review, defects by severity
critical/major/minor and type error/conflict/missing/extra/unclear, questions, external issues, praise),
Review Meeting (formal and informal process of providing defect list, desk(top) check/informal review
avoids formal meeting just list of defects given, checking fulfillment of exit criteria, Process: moderator
distributes copies of agenda, describes purpose of meeting, polls reviewers for time spent and general
comments, identified findings are presented, recorder notes each finding, reads the list of findings, team
determines outcome, author collects and marks packages with defects), Rework (correcting and
rewriting deliverable, author marks where changes made), Follow-up (process of checking if bugs
corrected, documents distributed and corrected again, for formal review moderator checks exit criteria
compliance), Final Report (by moderator, record review process, all activities including lessons learned
stored, statistical analysis addresses issue)
Review Objectives: find defects, gain understanding, generate discussion, make decision by consensus,
Steps: Study document, identify issues/problems and inform author, author updates document, Roles
and Responsibilities: Manager (decides what to be reviewed, ensures time allocation, determines
objectives met), Moderator (review leader plans review, runs meeting, does follow-up), Author (writer
or person with chief responsibility, must fix defects), Reviewer (individual with specific
technical/business knowledge), Scribe (recorder document all issues/defects/problems/open points in
meeting)
Review process types Informal review (desk check, main purpose to find defects, simple, low
overhead, only review meeting/rework), Walkthrough (step-by-step presentation by author to find
anomalies, consider alternatives, evaluate conformance to standards, gather information and establish
common understanding, detailed study not always required, usually to check scenarios and program
code, must have planning though individual preparation/follow up become optional), Technical review
(peer group discussion activity, achieving consensus on technical approach, conforms to specifications,
adheres to regulations/standards/guidelines/plans, changes are properly implemented, changes affect
only those system areas identified by change specification, only kick-off and follow-up optional, rest of
steps mandatory), Inspection (most formal peer review, examination of documents to detect defects in
specifications, specified quality attributes, regulations/standards/guidelines/plans deviations, based on
rules and checklists, users entry and exit criteria, findings with metrics are essential, significant
investment, not simple process since all steps mandatory) from low formality to high formality
Software reviews Management, technical, inspection (cost of quality, person hours highest
measurable expensive, inspect percentage of code, inspect critical portions, when done correctly is
valuable), walkthroughs, audit
Automatic static analysis syntactic, data use, control flow, interface, program slicing, path analyses
Test reporting and closure evaluating exit criteria (check test logs), reporting (test summary report for
stakeholders, communicate findings, analysis, assessment of defects, economic benefit of testing,
outstanding risks, level of confidence), test closure activities (check which planned deliverables have been delivered,
close incident reports, finalize and archive testware, handover testware to maintenance, analyze lessons
learned)
Test documentation Test Plan, Test Design Specification, Test Case Specification, Test Procedure
Specification, Test Item Transmittal Report, Test Log, Test Incident Report, Test Summary Report
Test Plan document Plan Identifier, Test Items, Risk Issues, Features to be Tested, Features not to be
Tested, Test Approach, Pass/Fail Criteria, Suspension Criteria, Test Deliverables, Environmental
Requirements, Staffing/Training Needs, Schedule of Test, Planning for Risks, Approvals
Test Design Specification document Test design specification identifier, Features to be tested,
Approach refinements, Test identification, Features pass/fail criteria
Test Case Specification template Test Case specification identifier, Test items (features and
conditions), Input specifications, Output specifications, Environmental needs, Special procedural
requirements, Inter-case dependencies
Test Summary Report Specification template Test summary report identifier, Summary, Variances,
Comprehensive assessment, Summary of results, Evaluation, Summary of activities, Approvals
Test monitoring checking test status metrics including: test execution (number of cases
pass/fail/blocked/on hold), defect, requirement traceability, test coverage, miscellaneous (tester
confidence, dates, milestones, cost, schedule, turnaround time)
Configuration management change (for military and government change control board (CCB)), version
(control, traceability, repository/check-in/check-out/workspaces/branches centralized e.g. SVN and
CVS or distributed e.g. GIT and Mercurial version control system (VCS)), build (system) (development
system, build server, target environment), release management (deployment 4-tier: Development,
Testing, Staging, Production)
Configuration management terms Software configuration item (SCI), Configuration control, Version,
Baseline, Codeline, Mainline, Release, Workspace, Branching, Merging, System building
Test management related tools test management tool, requirement management tool, incident
management tool, configuration management tool
Static testing tools Review process support tool, Static analysis tool, Modelling tool
Test specification tools Test design tool (computer aided software engineering (CASE)), Test data
preparation tool, simulators(behavior)/emulators (inner workings)
Test execution related tools Test execution tool, Test harness/Unit test frameworks, Test comparator,
Coverage measurement tool, Security testing tool
Performance and test monitoring tools Dynamic analysis, Performance testing, Load testing, Stress
testing, monitoring tools
Other testing tools spreadsheet, word processor, E-mail, Back-up and restore utilities, SQL, Project
planning tool, Debugging tool, DevOps tools