
The Future of Testing

Where Do Testers
Spend Their Time?
Brought to You by TechWell, IBM, uTest, and Keynote

Table of Contents

Overview
Organizational Information
Tools & Automation
Handling Defects
Where Do Testers Spend Their Time?
Moving into Mobile
Conclusion

Overview

TechWell, IBM, uTest, and Keynote partnered on a survey to explore where today's testers are spending their time, what obstacles they most often encounter, and where they think their attention should be focused.

There are references available on the Internet that discuss where testers' attention (test related or not) is required, but many of these resources are based on anecdotes or derived from discussions the author has had with a small number of testers.

Other topics covered in our survey and resulting report include:

How release cycles have changed in the past few years
How frequently and what types of tests are being automated
How testers handle defects and delays
What non-testing activities are vying for testers' time
What investments are being made in mobile testing

Timeframe:
The survey was conducted in April 2014 by TechWell.

Purpose:
Our goal was to create a data-driven report analyzing where testers are really spending their time and discovering where they think their time is best utilized.

Participants:
TechWell surveyed 250 software testing professionals from six continents, with 63% of respondents reporting North America as the headquarters of their organization. Forty percent of respondents are test managers or test leads, and 89% of respondents have six or more years of testing experience.

Organizational
Information

We asked what types of applications you test, and the vast majority reported web-based (82%) followed by client-server (54%), mobile (50%), and service-based (41%). (Figure 1)

Twenty-three percent of respondents work for organizations that deliver testing services and, hence, are not a direct part of a development team. Forty-seven percent work on teams with a tester-to-programmer ratio ranging from 1:1 to 1:5.

With iterative development methodologies seemingly the norm these days, we looked for some data to support that perception. And we found it. While 43% of respondents cite Scrum as their main development methodology (24% are sticking with waterfall), we discovered that 56% of organizations are using some variation of iterative development. (Figure 2)

This increase in agility may be affecting release frequency. During the past three years, 15% of respondents claimed semiannual releases; only 8% say that is their current release frequency. Conversely, 4.5% of the testers who answered our survey say they used to release semimonthly; that number has grown to 10% for the current cycles. We also noted that 38% of respondents are currently doing monthly or quarterly releases while 15% report they release continuously. (Figure 3)

Figure 1: Types of Applications Tested (percent of respondents: web-based, client-server, mobile, service-based, APIs, packaged applications, mainframe, embedded, other)

Figure 2: What Methodology Do You Use to Develop Software? (more than one answer per respondent possible; options: Scrum, Waterfall, V-model, Plan-based, No methodology, Kanban, Iterative, Hybrid (combination of methodologies), Client's methodology, Water-scrum-fall, Extreme programming)

Figure 3: How Have Release Cycles Changed? (increased frequency of releases, no change in release frequency, decreased frequency of releases)

Tools &
Automation

Our survey found that 81% of respondents use commercial tools from HP, Microsoft, and IBM for QA management or test execution, and 61% keep it low-tech with text documents and spreadsheets (e.g., Word, Excel, Google Docs). Forty-seven percent of respondents use open source tools like Bugzilla, Selenium, Cucumber, and RSpec, and 9% build their own. (Figure 4)

When it comes to managing test data, 64% create test data manually, and only 7% use a commercial test data solution. (Figure 5)

Sixty-nine percent of respondents automate functional verification tests at least some of the time. Sixty-nine percent automate load and performance testing with some degree of regularity, with 22% automating load and performance testing more than three quarters of the time.

Security and user acceptance testing have the lowest frequency of automation, with 43% and 35% of respondents, respectively, reporting some level of automation; 57% and 65% of respondents claim they never automate security or user acceptance testing. (Figure 6)

Survey results indicate that increasing the number of automated tests is something a lot of testers want. Let's look at which tests you currently automate (at least occasionally) and which tend to be automated infrequently or not at all.
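To make the idea concrete, an automated functional verification test can be as small as a handful of scripted checks that run on every build. This is a minimal sketch, not taken from the survey; the discount function and its rules are hypothetical stand-ins for whatever business logic a team actually ships:

```python
def apply_discount(price, percent):
    # Hypothetical function under test: reduce a price by a percentage.
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

# Automated checks a functional verification suite might run on every build:
assert apply_discount(200.0, 25) == 150.0   # typical discount
assert apply_discount(99.99, 0) == 99.99    # boundary: no discount
try:
    apply_discount(100.0, 150)              # invalid input must be rejected
except ValueError:
    pass
else:
    raise AssertionError("expected ValueError for percent > 100")

print("all functional checks passed")
```

Once checks like these exist in a script, any test runner (or a CI job) can repeat them on demand, which is exactly the repeatability that manual verification lacks.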

Figure 4: QA Management/Test Execution Tools (more than one answer per respondent possible; percent of respondents using text docs and spreadsheets, open source tools, HP products, Microsoft products, in-house tools, IBM products, Borland products, other)

Figure 5: How Do You Create and Manage Test Data? (more than one answer per respondent possible; options: manually create test data from scratch, extract production subset & add seed data, clone production database & use as is, clone production database & manually edit out secure info, clone production database & automate scrubbing of secure info, use commercial test data management solution, other)
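One of the approaches respondents reported, automating the scrubbing of secure info in a cloned production database, can be sketched in a few lines. This example is illustrative only; the field names ("email", "ssn") and masking rules are hypothetical and would follow a real schema and privacy policy in practice:

```python
import hashlib

def scrub_record(record):
    # Mask personally identifiable fields in a cloned production record.
    # Field names here are hypothetical; real schemas will differ.
    scrubbed = dict(record)
    if "email" in scrubbed:
        # Replace the address with a stable pseudonym so joins still work.
        digest = hashlib.sha256(scrubbed["email"].encode()).hexdigest()[:8]
        scrubbed["email"] = f"user_{digest}@example.com"
    if "ssn" in scrubbed:
        scrubbed["ssn"] = "***-**-" + scrubbed["ssn"][-4:]
    return scrubbed

row = {"name": "A. Tester", "email": "a.tester@corp.example", "ssn": "123-45-6789"}
clean = scrub_record(row)
print(clean["ssn"])  # ***-**-6789
```

Hashing rather than randomizing the email keeps the pseudonym stable across runs, so scrubbed records that referenced the same user still line up.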

Figure 6: Percent of Automated Test Scripts by test type (unit, integration, build verification, functional verification, regression, load/performance, security, system, system integration/end-to-end, user acceptance)

Handling
Defects

When it comes to feeling good about what you do, 57% report being very confident to completely confident in the quality of the products they release. (Figure 7) But as we know, defects happen. The question is: Where do they happen the most? Well, the good news is 59% say they seldom find defects in production. Forty-one percent always find defects during the testing stage, 41% frequently find defects during user acceptance testing, and 33% frequently find defects during development. (Figure 8)

Figure 7: How Confident Are You in Your Product's Quality? (completely confident, very confident, somewhat confident, not confident)

Figure 8: Where and How Often Do You Find Defects? (stages: early in the project, during development, during the test stages, during user acceptance, in production; scale: always, very often, frequently, seldom, none)

Where Do Testers
Spend Their Time?
Based on our survey results, testers spend their time in three ways: navigating development delays and missing functionality; dealing with unplanned, non-test activities; and, of course, testing.

When development delays threaten to bring testing to a halt, three mitigation techniques seem to be the most often utilized, at least to some degree. Ninety-one percent say they have been known to slip the release date on occasion, 86% say they've brought in additional resources to keep the release date, and 71% have gone ahead and released only what has been tested. (Figure 9)

When missing functionality is holding up end-to-end testing, 55% write mocks to emulate the missing functionality, 54% wait to test until the missing code is available, and 6% use a commercial service virtualization tool. (Figure 10)
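Writing a mock to emulate missing functionality, the most common of those tactics, can be as simple as stubbing the unfinished dependency. A minimal sketch using Python's unittest.mock follows; the pricing service and its interface are hypothetical, invented for illustration:

```python
from unittest.mock import Mock

def checkout(cart_items, pricing):
    # End-to-end logic under test; it depends on the unfinished pricing service.
    quote = pricing.quote(cart_items)
    return f"{quote['total']:.2f} {quote['currency']}"

# The pricing service is still in development, so a mock stands in for it.
pricing_service = Mock()
pricing_service.quote.return_value = {"total": 42.0, "currency": "USD"}

result = checkout(["widget"], pricing_service)
print(result)  # 42.00 USD
pricing_service.quote.assert_called_once_with(["widget"])
```

The mock lets end-to-end testing of checkout proceed now; when the real service lands, only the wiring changes, and the same test can run against it.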

Figure 9: How Do You Mitigate Delays? (bring in more testing resources, slip the release date, release without testing to desired level, release only what has been tested; scale: never, rarely/sometimes, half the time, frequently, always)

Figure 10: How Do You Deal with Missing Functionality or Code? (more than one answer per respondent possible; options: manually write simulation mocks, defer testing until dependent code is available, descope testing of that functionality, use open source mocking solutions, use commercial service virtualization solution, other)

Everyone knows the frustration of not getting work done because of unplanned activities and distractions like email, impromptu meetings, management requests, fires that need to be put out, and bugs that need to be fixed right now. But how much time are testers really spending on this non-test work, and which activities do they find most distracting?

Almost half of respondents (49%) spend about eight hours a week on non-testing work, while a scary 36% spend 50% to 100% of their workweek not testing. Fifty-eight percent cite ad hoc requests as their biggest disrupter, 66% say audit and compliance tasks don't eat up much of their time, and 26% find unplanned meetings only moderately disruptive. (Figure 11)

Once you work around the delays and finish your myriad ad hoc assignments, it's finally time to test. So, what are testers testing and how often? And the more interesting question: What testing tasks do testers want to spend little to no time on, and on which would they like to spend a lot of time?

Seventy-four percent spend a moderate amount of time on regression testing, 65% spend a lot of time on functional verification testing, and 15% spend no time on security testing (tsk, tsk). (Figure 12)

Figure 11: How Much of Your 40-Hour Workweek Is Spent on Unplanned, Non-Testing Activities? (none, 4 hours, 8 hours, 20 hours, 30 hours, 40 hours) and Most Disruptive Non-Testing Activities (ad hoc requests, general meetings, email, defect triage, audit/compliance)

Figure 12: How Much Time Do You Spend on These Testing Activities? (none, minimal, moderate, large; by test type: unit, integration, build verification, functional verification, regression, load/performance, security, system, system integration/end-to-end, user acceptance)

When you look at the larger project landscape, testers report the activities that take up more time than they'd like are: waiting for test assets such as requirements (59%), non-test activities such as email (53%), and rescoping test coverage due to changes (47%). (Figure 13)

When asked what one change they would like to see in their organization's culture, we received a broad range of responses. However, we did see five changes repeated frequently. (Figure 14) These are obvious pain points:

1. Increase automation
2. Involve testers early in the lifecycle
3. Increase corporate-wide awareness of testers' value
4. Improve requirements
5. Hire more testers

Figure 13: On Which Activities Do You Spend More Time Than You'd Like? (more than one answer per respondent possible; waiting for test assets, non-test activities, rescoping test coverage due to change, reporting activities, absorbing changed test environments, waiting for the review process to end, other, none)

Figure 14: Top Five Desired Organizational Culture Changes (increase automated testing, involve testers earlier, promote value of testers, improve requirements, hire more testers)


We asked testers to compare their current efforts on specific test activities to the amount of time they would like to spend on those activities. (Figure 15) On the little-to-no-time-spent side, there wasn't much difference between the current and desired efforts. On the sizable-to-large-amount-of-time end we found:

46% want to spend significant time creating automated tests, but 64% currently spend little to no time on it

41% want to spend significant time performing exploratory testing, but 55% currently spend little to no time on it

30% want to spend a significant amount of time executing automated tests, but 68% currently spend little to no time on it

The top five test activities that respondents say they want to spend a sizeable amount to a lot of time on are:

1. Creating automated tests
2. Performing exploratory tests
3. Executing automated tests
4. Designing tests
5. Planning tests

The top five test activities that respondents say they want to spend little to no time on are:

1. Setting up, configuring, and refreshing test environments
2. Investigating & submitting defects
3. Rerunning tests
4. Creating/refreshing test data
5. Installing and configuring test tools

Figure 15: Where Do Testers Want to Spend Their Time? (activities ranked from "I want to spend MORE time" to "I want to spend LESS time": creating automated tests, performing exploratory tests, executing automated tests, designing tests, planning tests, reviewing test results, running pre-defined tests, maintaining automated tests, creating reports, maintaining automated test tools, configuring test tools, creating test data, rerunning tests, investigating & submitting defects, setting up labs)


Moving into
Mobile

We included a few questions to gauge how testers and their organizations are faring with the new challenges presented by mobile devices and their range of platforms. When asked about the main challenges to mobile testing, 48% cite having enough time to test as the biggest challenge, but 49% say the least challenging aspect is having access to mobile devices. (Figure 16)

When it comes to investing in mobile development, organizations are putting the most resources into build automation and continuous integration. (Figure 17)

We also determined three main methods companies are using for functional testing of mobile apps and websites: (Figure 18)

Emulators
Devices in hand
Remote device cloud solutions

Figure 16: Ranking of Mobile Testing Challenges (availability of proper testing tools, access to mobile devices, implementing the right test process, having enough time to test, availability of mobile testing experts; rated most, moderately, or least challenging)

Figure 17: Where Organizations Are Investing Their Mobile Development Dollars (more than one answer per respondent possible; build automation, continuous integration, build distribution, gated deployments or A/B testing, not developing mobile apps, other)

Figure 18: How Do You Test across Devices and Platforms? (percent of respondents: devices in hand, emulators, remote device cloud solution, outsourcing, crowdsourcing, not sure, other)


Conclusion

Based on the survey results, testers are spending a significant amount of time on non-test-related tasks as well as on testing activities they don't feel are the best use of their time.

Organizational improvements many testers cited that would likely free up time for them to focus on testing and improving product quality include increasing automation, involving testers earlier in the lifecycle, improving requirements, and hiring more testers to do the work.

Do you find yourself bogged down by time-consuming busy work, bottlenecks and delays, or non-vital testing activities? How do you mitigate these challenges to ensure you release a high-quality product on time? Email us and let us know: editors@techwell.com.

Thank You
Thanks to all of the testers who took the time to complete the survey. Your input is invaluable in helping understand where testing is heading and the challenges you are facing along the way.

Special thanks to our survey review panel: Michael Bolton, Dorothy Graham, Janet Gregory, Linda Hayes, and Karen Johnson. Your expertise and dedication to the testing craft are admirable and always appreciated.
