Testing Secure Applications for Embedded Devices

Mocana Corporation
350 Sansome Street, Suite 1010
San Francisco, CA 94104
415-617-0055 Phone
866-213-1273 Toll Free

Included in this White Paper
• Executive Summary
• Introduction
• Embedded Test Planning
• Appendix: NanoDefender™

Contents
Introduction . . . 2
Secure Applications . . . 4
Vulnerabilities . . . 4
Conclusion . . . 32
Appendix: NanoDefender™ . . . 36
About Mocana . . . 39
Downloads and Contacts . . . 39
Best Practices for Testing Secure Applications for Embedded Devices – Free evaluation code at www.mocana.com/evaluate.html II
Executive Summary
Despite the fact that embedded systems design teams spend a considerable
portion of their overall development budget on testing, embedded systems
continue to be buggy and vulnerable to security breaches. Why?
Embedded systems are generally more difficult and expensive to test than
standard desktop applications due to many factors, such as the separation of
development and target platforms, lack of test tools for their target operating
systems, and the need to focus on difficult-to-obtain performance metrics.
Further, because many embedded devices are connected to the outside world,
often through the inherently insecure Internet, testing beyond mere functionality
is required to validate the applications as secure, robust, and resistant to attacks.
It’s easy to see why testing embedded systems is such a challenge.
The good news is that the usual causes of buggy, vulnerable code can be
mitigated by employing best practices, such as designing security right into
the code, crafting a test plan that recognizes what makes embedded systems
unique, using a wide array of testing techniques, and giving as much importance
to designing the test environment as is given to designing the end product itself.
By following these best practices, you’ll uncover not only bugs, but also faults and vulnerabilities, before you release your code or device to the outside world...
far better than having those vulnerabilities discovered by your customers. Well-tested code leads to secure and robust systems, which in turn lead to lower lifetime development costs, reduced time to market for follow-on releases, dramatically reduced support costs, reduced vulnerability to attacks, and best of all, positive brand identity.
Introduction
Back in the days when software programs were simple, testing was also simple
and straightforward. But today’s complex applications connect with each other
using an enormous variety of communication protocols, and require significantly
more complex testing to ensure that they not only function as intended under
ideal conditions, but that they also contain no vulnerabilities that attackers can
exploit.
The System Under Test
Before delving into the details about types of tests, when to run them, how
they uncover bugs, and so on, it is important to first have a clear idea of which
devices we’re talking about when we refer to secure embedded applications.
Figure 1. The “Internet of Things” encompasses an ever-growing list of connected devices, which must not be allowed to harm the rest of your network assets.
Secure Applications
When we discuss secure embedded applications, the first things that come
to mind are security products and protocols such as firewalls and SSL
implementations. However, it might be more accurate to state that all embedded
applications need to be secured—that is, be able to defend against imposters,
eavesdropping, takeover, or subversion. This exactly fits the focus of this paper:
how to design and test embedded applications so as to ensure that they are
secure.

Vulnerabilities

“70% of business security vulnerabilities are at the application layer.” [S12]

RFC 2828 defines a vulnerability as “a flaw or weakness in a system’s design, implementation, or operation and management that could be exploited to violate the system’s security policy.”

Vulnerabilities can be divided into two classes:
Embedded Test Planning
Embedded systems have some unique characteristics that affect the testing process. This section explains how embedded software is different, where in the development cycle various tests should be performed, and where testing should be focused so as to find the greatest number of bugs.

In This Section
• Embedded Systems’ Unique Characteristics
• Embedded Testing Process
• Embedded Testing Focus
Embedded Systems’ Unique Characteristics
A basic question is, “Is testing embedded software any different from testing any application software?” The answer is yes, because embedded software:
• Usually runs for much longer between “reboots” than typical application software. Desktop users routinely shut down their computers when they go home for the night. Embedded devices are often “always on.”
• Typically resides on hardware boards that are very expensive. This adds extra cost to setting up a comprehensive test environment that provides for testing on all supported platforms.
The Embedded Testing Process
Although it can seem as if there are many flavors of testing lifecycles, especially
when you consider different development methodologies (such as waterfall and
Agile™, as well as the secure software development lifecycle that is discussed
later in this paper), the test process itself always consists of the activities shown
in Figure 2.
Although there is much to discuss about test methodologies and the testing
process (see “Security and Testing Books and Articles,” on page 33), in brief,
embedded software testing (like testing any application) consists of:
• Planning: In the planning stage, which can begin as soon as the system’s high-level requirements are complete, you create a high-level test plan.
• Design: At this stage, you can design your test environment so it will support executing the tests as outlined in the detailed test plans. This is also the time to perform risk assessments, define test data sets, and determine which tests can be automated.
• Construction: Now it’s finally time to code the actual test cases, as well as accompanying test harness code (software and test data required to exercise a system under test, and particularly simulation code for external systems that will be unavailable during testing).
• Post-deployment: A final test activity that is all too frequently overlooked is the post-deployment evaluation. By closely examining bug reports, you can find out which sorts of bugs occurred (and of course take measures to prevent them in the future); test equipment can be restored to a “clean” configuration state for future use; and wish-lists can be developed for future test planning.
Embedded Testing Focus
Embedded software testing differs from traditional application testing in several
important ways. Instead of concentrating solely on functional requirements,
embedded testing must focus on:
Test Bed Setup

Real-Time Behavior
At a minimum, you should create test cases for typical and worst case real-time
scenarios. For example, if a message handling system expects to receive six
different types of messages, all in a particular sequence, tests should include all
possible sequences, especially out-of-order and redundant messages.
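One way to cover every ordering systematically is to enumerate all permutations of the message types and feed each sequence to the handler under test. The following C sketch uses Heap’s algorithm; the `run_case` callback is a hypothetical stand-in for whatever drives your message handler, and is not part of the paper’s material:

```c
/* Sketch: enumerate every ordering of n message types with Heap's
 * algorithm, invoking a (hypothetical) test-case callback for each
 * permutation so out-of-order sequences are covered systematically. */
static void swap_ids(int *a, int *b) { int t = *a; *a = *b; *b = t; }

static void permute(int *ids, int k, int n,
                    void (*run_case)(const int *, int), int *count)
{
    if (k == 1) {
        if (run_case) run_case(ids, n);   /* one test case per ordering */
        (*count)++;
        return;
    }
    for (int i = 0; i < k; i++) {
        permute(ids, k - 1, n, run_case, count);
        if (k % 2 == 0) swap_ids(&ids[i], &ids[k - 1]);
        else            swap_ids(&ids[0], &ids[k - 1]);
    }
}

/* Returns the number of sequences generated: n! for n message types. */
int test_all_sequences(int n, void (*run_case)(const int *, int))
{
    int ids[8];
    int count = 0;
    if (n < 1 || n > 8) return -1;   /* 8! is already 40,320 cases */
    for (int i = 0; i < n; i++) ids[i] = i;
    permute(ids, n, n, run_case, &count);
    return count;
}
```

For the six message types in the example above, this generates 6! = 720 distinct orderings, making it practical to run the full set rather than a hand-picked sample.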
happens at the moment of an unrelated deadline? If resources are scarce, the
deadline may be missed. Your test suite must be sure to test for such corner
cases.
Performance and Capacity Testing

• Encryption/decryption times
• Throughput
• Latency
• Footprint size
Finally, an unexpected benefit of code coverage analysis is that it can uncover dead code—code that was needed only during development, or code for a feature that has since been removed. While dead code obviously does not contribute to performance issues, it certainly increases the code footprint, which is something to minimize in most embedded systems.
Reliability Testing
For example, a traditional functional requirement for logging into a system might
be stated as “User names must be six to eight characters, contain at least one
number, and contain at least one upper-case letter.” A security requirement for
that same user login would focus on the negative requirements, as follows:
“The user name processing should validate the user’s input and display an error
message if any illegal characters are included.”
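A minimal sketch of how both the positive and negative requirements might be enforced in one place in C; the function name and the alphanumeric-only “legal character” set are illustrative assumptions, not taken from the paper:

```c
#include <ctype.h>
#include <string.h>

/* Hypothetical validator: enforces the positive rule (6-8 characters,
 * at least one digit, at least one upper-case letter) AND the negative
 * rule (reject any character outside the legal set). Returns 1 if the
 * name is acceptable, 0 otherwise. */
int validate_username(const char *name)
{
    size_t len;
    int has_digit = 0, has_upper = 0;

    if (name == NULL)
        return 0;
    len = strlen(name);
    if (len < 6 || len > 8)
        return 0;                      /* positive requirement: length */
    for (size_t i = 0; i < len; i++) {
        unsigned char c = (unsigned char)name[i];
        if (!isalnum(c))
            return 0;                  /* negative requirement: no illegal chars */
        if (isdigit(c)) has_digit = 1;
        if (isupper(c)) has_upper = 1;
    }
    return has_digit && has_upper;
}
```

The point of the negative branch is that it rejects the infinite set of inputs the positive rules never mention, which is exactly where attackers probe.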
To minimize the severity and impact of security design flaws, software engineers
should employ the following techniques [S19]:
• Attack surface reduction—Limiting software interfaces to only those necessary for the program to complete its job.
Mitigating Platform Risks
• Symbolic linking—Symbolic links are files in a file system that point to other files; for example, symlinks in UNIX, and hard links in UNIX and Windows. A clever attacker can exploit symbolic linking to trick the application into operating on a file of his or her choosing.
For example, the attacker could create a symlink with a predictable filename that an application is likely to operate on (perhaps opening and subsequently deleting a file such as /tmp/temp), and point the symlink at a likely system file (such as /etc/passwd). The result is that the program would unwittingly delete the system’s password file.
To mitigate this risk, the programmer should be careful to check for symbolic
links every time the program creates, opens, or deletes a file, or changes the
file’s permissions.
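As a sketch of that mitigation on POSIX systems (illustrative, not from the paper): check the path with `lstat()` before acting on it, and prefer `O_NOFOLLOW` where the platform provides it, since a separate check-then-open sequence leaves a small time-of-check/time-of-use race window:

```c
#include <sys/stat.h>
#include <fcntl.h>
#include <unistd.h>
#include <errno.h>

/* Hypothetical helper: open a file for reading, refusing symbolic
 * links. O_NOFOLLOW (where available) makes the refusal atomic;
 * the lstat() check alone would still be racy. Returns a file
 * descriptor, or -1 on any failure. */
int open_no_symlink(const char *path)
{
    struct stat sb;

    if (lstat(path, &sb) != 0)
        return -1;                 /* cannot stat: treat as failure */
    if (S_ISLNK(sb.st_mode)) {
        errno = ELOOP;             /* refuse symlinks outright */
        return -1;
    }
#ifdef O_NOFOLLOW
    return open(path, O_RDONLY | O_NOFOLLOW);
#else
    return open(path, O_RDONLY);
#endif
}
```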
Avoiding Misplaced Trust
In the security world, trust refers to a reliance on things being as they appear:
users being who they say they are, and data being correct, valid, and for its
intended purpose. Trust in this context can be explicit (a source is verified,
and then anything coming from that specific source is trusted). Or trust can
be implicit (incoming information is trusted because, for example, it uses a
particular protocol and the correct port). Regardless, when evaluating and testing
applications for security, information should never be trusted without verification.

Figure: Trust is tricky.
As van der Linden [S16] explains, a key effort of security testing is finding and
documenting all the places in the target application and system where trust
is misplaced—granted without appropriate checks. Common examples of
misplaced trust include:
Any of the above scenarios makes embedded code vulnerable to attack. But
simple measures, such as validating every input no matter where it comes from,
will eliminate many typical weaknesses that attackers can exploit.
Designing to Prevent Common C/C++ Errors
Although the C language (which is the focus of this paper because C is the most
commonly used language for embedded applications) has historically exhibited
the most security-related issues of any programming language, careful coding
and strict adherence to a coding style guide can mitigate most of the risks
associated with C library functions. These risks include:
There is no safe native string type, nor any safe string handling functions.
To mitigate this risk, it is imperative that your code always manage string
buffer sizes and confirm that target buffers are large enough.
To mitigate this risk, application code should always check the length of
user-supplied input variables and avoid unbounded string operations. An even
better practice is to use a string buffer module that automatically manages
memory allocation and string lengths, and avoids using the native C string
functions altogether.
Generally you can prevent buffer overflows by checking the length of all
externally-supplied input variables—from users, external programs, and even
any shared memory and data storage that can be modified by an external
program or user. Additionally, measures taken to mitigate specific buffer
overrun risks should be employed.
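A minimal sketch of such a length check in C; the function name and failure convention are illustrative assumptions. The caller always passes the destination buffer’s size, and the function fails loudly rather than truncating silently or overflowing:

```c
#include <string.h>

/* Hypothetical bounded copy: rejects any source string that will not
 * fit (including its terminating NUL) in the destination buffer.
 * Returns 0 on success, -1 on any invalid argument or overflow risk. */
int safe_copy(char *dst, size_t dstsize, const char *src)
{
    size_t n;

    if (dst == NULL || src == NULL || dstsize == 0)
        return -1;
    n = strlen(src);
    if (n >= dstsize)
        return -1;            /* would overflow: reject the input */
    memcpy(dst, src, n + 1);  /* copies the terminating NUL too */
    return 0;
}
```

Rejecting over-long input outright, instead of truncating, keeps a downstream consumer from ever seeing a silently altered value.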
The only guaranteed techniques to prevent format string attacks are to use static strings instead of dynamically formatting strings with variable numbers of source arguments, and to never accept user input as input for a variable-formatted string. (For a detailed explanation of format string attacks, refer to the Windows 2000 Format String Vulnerabilities paper available on the Next Generation Security Software website, http://www.webcitation.org/query?url=http%3A%2F%2Fwww.nextgenss.com%2Fpapers%2F&date=2009-03-09.)
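The safe pattern can be sketched in C as follows (`log_user_input` is a hypothetical wrapper, not from the paper): the format string is always a static literal, so conversion specifiers smuggled into the input are copied literally rather than interpreted:

```c
#include <stdio.h>

/* Hypothetical logging wrapper: the format string is always the static
 * literal "%s", so input such as "%x%x%n" is copied verbatim instead
 * of being interpreted as conversion specifiers. Returns the number of
 * characters that would have been written, as snprintf does. */
int log_user_input(char *out, size_t outsize, const char *user_input)
{
    /* WRONG (format string vulnerability):
     *     snprintf(out, outsize, user_input);             */
    return snprintf(out, outsize, "%s", user_input);
}
```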
To avoid this problem, do not use implicit type conversions, particularly from
signed to unsigned integers.
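The pitfall can be sketched in C as follows (`length_is_valid` is an illustrative helper): a negative length implicitly converted to `size_t` becomes a huge positive value and sails past a naive bounds check, so validate in the signed domain first and convert explicitly:

```c
#include <stddef.h>

/* Hypothetical bounds check for a length field taken from the wire.
 * Checking (size_t)claimed_len <= buffer_size alone would be defeated
 * by claimed_len == -1, which converts to SIZE_MAX. Returns 1 if the
 * length is usable, 0 otherwise. */
int length_is_valid(long claimed_len, size_t buffer_size)
{
    if (claimed_len < 0)
        return 0;                        /* reject before any conversion */
    return (size_t)claimed_len <= buffer_size;
}
```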
Making Example Code Secure
It’s important, therefore, to not take shortcuts when writing example code
intended for your customer-developers, to include proper validation, and to make
use of all an API’s built-in security features.
such as a firewall, DMZ, and IDS (Intrusion Detection System). (Mocana’s
NanoWall™ and NanoDefender are custom built for embedded devices; see
“Appendix: NanoDefenderTM,” on page 36.)
Figure: The SSDL cycle: 1. Security Guidelines / Rules and Regulations; 2. Security Requirements; 3. Architectural Reviews / Threat Modeling; 4. Secure Coding Guidelines; 5. Testing; 6. Determine Exploitability; with patch management as an ongoing activity.
Phase 1 Deliverable
• System-wide security specification

Depending on the industry in which your product will operate, your product may be expected to conform to security policy standards such as Visa CISP (Cardholder Information Security Program), HIPAA (Health Insurance Portability and Accountability Act), or SOX (Sarbanes-Oxley Public Company Accounting Reform and Investor Protection Act of 2002). However, often there is no customer-mandated security policy to follow, in which case it is important to define your own.
SSDL Phase 2: Security Requirements
Although the answer depends on the specifics of your system, typical topics
included in a security policy are:
• Key management.
• Password requirements.
• Memory management.
As the above list implies, defining security rules is not only about defining
functional requirements for an embedded application, but also, constraining how
the system operates and responds to behavior that should not be allowed.
SSDL Phase 3: Architectural Reviews and Threat Modeling

Phase 3 Deliverables
• Test plan
• Risk analysis

No matter how well thought-out the test plan, and how complete a test case suite, it is impossible to test a program for every possible input, code-branching scenario, and so on. Therefore, you must use an intelligent method, such as threat modeling, to determine which tests to perform.
Threat modeling is a type of risk-based testing where potential attacks are ranked
according to the ease of attack and the seriousness of the attack’s impact.
After modeling, testing efforts can be focused on those areas that are easiest
to attack and/or where the impact is greatest. For example, high-priority tests
should focus on any security flaws that can be exploited by anonymous remote
attackers to execute arbitrary code.
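The ranking step can be sketched as a simple ease-times-impact score. The threat entries and the 1-to-10 scales below are invented for illustration; a real threat model would use whatever scoring scheme your team has agreed on:

```c
#include <stdlib.h>

/* Illustrative threat-model ranking: score each threat as
 * ease-of-attack x impact and sort descending, so testing effort
 * goes to the highest-risk items first. */
struct threat {
    const char *name;
    int ease;     /* 1 = very hard to exploit ... 10 = trivial       */
    int impact;   /* 1 = nuisance ... 10 = remote arbitrary code     */
};

static int by_risk_desc(const void *a, const void *b)
{
    const struct threat *ta = a, *tb = b;
    return (tb->ease * tb->impact) - (ta->ease * ta->impact);
}

/* Sorts the threat list in place, highest risk first; returns the
 * top threat's risk score, or -1 on invalid input. */
int rank_threats(struct threat *threats, size_t n)
{
    if (threats == NULL || n == 0)
        return -1;
    qsort(threats, n, sizeof *threats, by_risk_desc);
    return threats[0].ease * threats[0].impact;
}
```

With such a ranking in hand, an anonymous remote code-execution flaw naturally floats to the top of the test queue, exactly as the prioritization above recommends.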
The threat modeling process is composed of four main steps:
SSDL Phase 4: Secure Coding Guidelines

Phase 4 Deliverables
• Software development procedures
• Coding style guide

Secure coding guidelines make it possible to prevent both kinds of security vulnerabilities—design weaknesses and implementation flaws. Such guidelines ensure that designers use proper techniques to minimize the effects of implementation flaws, and enable coders to avoid such flaws in the first place.
For details about secure programming, see the previous discussion, “Eliminating
Security Design Flaws,” on page 10.
SSDL Phase 5: Testing

Phase 5 Deliverables
• Test cases
• Bug reports
• System analysis

Finally, we come to the focus of this paper: testing. It’s important to understand that “testing” is performed at many different points throughout the development lifecycle, encompasses many types of tests, and is performed on many different subsets of a system.
Although testing is such a broad subject that one paper cannot serve as a
single source of information, “Testing Secure Embedded Applications,” on
page 19, provides background and best practices information that enables you to
confidently test your secure embedded application.
SSDL Phase 6: Determine Exploitability

Phase 6 Deliverables
• Test cases
• Bug reports
• System analysis

Vulnerabilities can be categorized as low-level or high-level. Vulnerabilities that corrupt the state of the running application or runtime facilities, such as buffer overflows, are said to be low-level. They may not be 100% reproducible because they can corrupt either the application or the programming language runtime state. Conversely, high-level vulnerabilities, such as logic errors in the application, are typically very reliable and reproducible, and tend not to crash the application.
Testing Secure Embedded Applications
Given that there are development methodologies and a testing framework in place that are specifically designed for secure and embedded applications, it’s important to make sure that your embedded software test suite encompasses the full range of software test classes, described below.

In This Section
• Best Practices
• Types of Tests
• When to Stop Testing
• Building a Custom Test Environment
Best Practices
Although the unique security challenges of embedded systems require particular
attention, many problems can be avoided by following a few simple guidelines.
• Use a coding style guide: Just as journalists follow writing style guides, so should programmers follow a coding style guide that spells out the requirements for development environment directory and file organization, naming conventions, declarations and types, error handling, white space formatting, memory management, library use, and so on. (For C language-specific suggestions, see “Designing to Prevent Common C/C++ Errors,” on page 13. For references to model coding style guides, see “C Coding Style Guides.”)

Figure: Contents of a sample “Coding Guidelines and Style” document: Overview; ANSI Standard C; File Organization; Naming Conventions; Use of the Preprocessor; Declarations and Types; Functions; Expressions and Statements; Error Handling; Boolean Expressions; Function Definitions; Structure Definitions; Comments; The Standard Library; Memory Management; Debugging; References.

• Write a thorough test plan: Although you could theoretically take a simple approach to test planning and use spreadsheets to keep track of test cases (perhaps noting their status, such as “in design,” “coded,” “executed, no bugs found,” and “executed, bugs found”), writing an official test plan brings many benefits. A test plan is an agreement among product design, … and release criteria. Test plans also typically include the scope of work, risk analysis, and contingency plans.

• Back when programmers developed applications in their garage, they could list bugs on scraps of paper. But with teams of developers and dozens, if not hundreds or thousands, of interacting and dependent software modules, it’s essential that both target and test code be managed with a suite of tools:
Test machines, where test monkeys reside to automatically build and test
code as it’s checked into the code repository. (For more about setting up a
test environment, see “Building a Custom Test Environment,” on page 29.)
Within the security context, black-box testing is most often used during the pre-
deployment test phase (system test) or periodically after deployment to reassess
system vulnerability. Such tests complement the ongoing security activities of
the SSDL (see “Secure Software Development Lifecycle,” on page 14), helping
testers identify undiscovered implementation errors, discover potential security
issues resulting from boundary conditions, uncover security issues resulting
from build problems, and detect issues caused by interactions within the
underlying environment (for example, improper configuration files).
• Performance tests—Performance analysis.
(Note that in addition to functional tests, black-box testing includes fuzz testing;
see “Fuzz Testing,” on page 25.)
Because black-box testing does not rely on knowledge of the completed code,
test planning can often begin in the design phase, with testing performed
throughout the software development lifecycle.
Although it is tempting to assume that a given set of black-box tests will find both traditional and security bugs, it generally doesn’t work that way. Because the pass-fail criteria are quite different (functional tests traditionally are positive tests, while security tests are often negative tests), you should perform redundant tests that focus on different pass-fail results. (For more about negative requirements, see “Security Testing’s Negative Requirements,” on page 9.)
Coverage (White-Box) Testing
For embedded systems, coverage testing is vital because the greater the test
coverage, the less likely it is that bugs will become apparent later. White-box
tests include:
Decision (branch) coverage—Test cases that cause every branch (both true
and false results) to execute.
It’s recommended that you use commercial test tools, such as Insight from
Klocwork [T3], to perform static testing of your embedded code.
Gray-Box Testing
A typical technique for performing gray-box testing is to run the software under
test in a debugger. As soon as the software is running, the normal tools of black-
box testing, such as fuzzers and automated regression suites, can be used. By
setting break points on lines of code that are potentially dangerous, the tester can determine whether such code could be reached via external input to the program.
Attack Testing
Attackers will gain access (see Figure 6). And attackers make it their business to
understand typical program entry points, as well as coding patterns that are often
overlooked by developers who are focusing on the program’s intended functions
more than securing code.
The first step to designing attack tests is to fully analyze the inputs that an attacker could use to gain unauthorized access to your program, and through which he or she could manipulate it. These inputs, called the attack surface, can include network I/O (such as sockets), APIs, open files, pipes, shared memory, and OS calls.
Figure 6. Attackers can gain access to an application through its attack surface (shown in a dashed red line): network I/O from the Internet, memory reads, OS calls, and external processes on the device platform.
The following resources can help you to fully define the attack surface [S19]:
• System debugging tools that list the files, network ports, and other system resources a program is using.
• The source code itself, for its use of system input/output APIs.
• Developer interviews to learn more about the code’s architecture.
• Detective work using the same tools an attacker might (see “Attack Tools,” on page 29).
• Verify that user input will not allow an attacker to manipulate a back-end database through an attack known as SQL injection.
• Verify that cross-site scripting, an attack that can cause an attacker’s script to execute in a victim’s Web browser, is not possible.
• Verify that poor buffer handling while reading data from the network will not cause a server to crash when it is sent an invalid packet that it erroneously processes, resulting in a denial of service (DoS) attack or allowing a remote attacker to execute code of his choosing.
• Verify that errors are handled correctly so that the program safely recovers from an unexpected input—the bread and butter of a software attack.
• Verify that private data is protected when in transit over a network or when it is stored.
• Verify that information leakage, which can help an attacker stage attacks, does not occur.
Load Testing
Load testing is just what its name implies: testing the software and embedded
device at (and exceeding) its intended capacity, whether that capacity is
measured as the number of users, connections, calculations, or what have you.
It is not unusual for a system to work perfectly fine at 50% capacity, but to
experience performance degradation, run up against resource limitations, or even
fail altogether when the load is at its design capacity.
In addition to standard module tests, it’s often quite beneficial to use commercial
standalone test tools, such as Spirent Communications’ SmartBits® [T2].
Penetration Testing
So given that such test results are not particularly helpful to developers,
is penetration testing worth the time? Yes, especially when the testing is
performed earlier in the development lifecycle, and when the tests are designed
as white-box tests instead of black-box tests. This enables the development
team to use the test results to modify the code and even its design. In this
way, the test scenarios can exercise not only the program’s external interfaces,
but programmatic interfaces between modules, data flow, environmental
boundaries, and so on.
Fuzz Testing
Providing fuzz (random data) to the inputs of a program augments test monkey
suites (see “Test Monkeys,” on page 30) by finding bugs that occur in unusual
situations outside the normal program flow and testing.
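A minimal fuzzing harness can be sketched in a few lines of C. Here `parse_packet` is a toy stand-in for the code under test (its three-byte protocol is invented for illustration); a real harness would also watch for crashes, hangs, and memory errors, for example by running under a debugger or a memory sanitizer:

```c
#include <stdlib.h>

/* Toy parser standing in for the code under test: expects a 2-byte
 * magic number followed by a length byte that must match the payload
 * size. Returns 0 if the packet is accepted, -1 if rejected. */
int parse_packet(const unsigned char *buf, size_t len)
{
    if (len < 3 || buf[0] != 0xAB || buf[1] != 0xCD)
        return -1;
    if ((size_t)buf[2] != len - 3)
        return -1;
    return 0;
}

/* Minimal fuzzer: hammer the parser with random byte buffers and
 * count how many it accepts. Well-formed random input should be
 * vanishingly rare; the interesting failures are crashes and hangs,
 * which an external monitor would catch. */
int fuzz_parser(unsigned iterations, unsigned seed)
{
    unsigned char buf[64];
    int accepted = 0;

    srand(seed);
    for (unsigned i = 0; i < iterations; i++) {
        size_t len = (size_t)(rand() % (int)sizeof buf);
        for (size_t j = 0; j < len; j++)
            buf[j] = (unsigned char)(rand() & 0xFF);
        if (parse_packet(buf, len) == 0)
            accepted++;
    }
    return accepted;
}
```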
Fuzzers can be categorized in two ways [S15]:

Figure: Classifying fuzzers by their target attack vectors. For a wireless, IP-based embedded system, the vectors span the application layer (applications), the transport layer (IP), and the wireless datalink layer (WPA2, Bluetooth, etc.).

Figure: Classifying fuzzers by their targeted application logic layers.
Commercial tools such as Codenomicon Defensics [T1], Nessus™ from
Tenable Network Security [T4] (especially useful for random testing), and
Rapid7’s SSHredder™ [T6] can help ensure that your code is free from
malware vulnerabilities, network vulnerabilities, and communications protocol
vulnerabilities.
Interoperability Testing

System Testing
System testing is not a single, unique type of test, but the logical culmination
of functional, integration, and security testing that is performed on the system
as a whole instead of its separate pieces. Activities such as stress testing,
performance testing, load testing, and many forms of penetration testing are
meaningless until the entire system is available. It is also important to repeat
functional and integration tests that may have been performed so early in the
development cycle that some components were replaced by test stubs.
Third-Party Testing

No matter how fully you think you’ve tested your code, it’s doubtful that you’ve
been thinking like a hacker. Part of what makes hackers successful is that they
deliberately try to subvert a program’s normal and logical behavior. Simply testing
that the code does what it’s supposed to do under normal circumstances just
isn’t enough.
Therefore, it might be a good idea to contract with third parties who provide
independent testing and verification for protocol conformance, interoperability,
performance metrics, and more. Organizations such as Cryptography
Research (www.cryptography.com), VPNC (www.vpnc.org), and Offensive
Computing (www.offensivecomputing.net), which can certify performance and
interoperability, as well as serve as white hat hackers, help ensure that your code
really is safe and secure.
Building a Custom Test Environment
One of the biggest challenges when testing embedded systems is building your
4ESTå%NVIRONMENTå
test environment—both the infrastructure and testing tools. Unlike desktop
#OMPONENTS
systems, whose vendors themselves test the operating system stack, CPU,
s Testing infrastructure
drivers, and so on, embedded systems are highly customized. Therefore, you (servers, platforms,
must test the hardware, stack communications, drivers, and so on yourself. databases, traffic
generators)
s Attack tools
4ESTINGå)NFRASTRUCTURE s Monitoring tools
s Commercial tools
A full-function test infrastructure should include at least the following elements:
s Test monkeys
Database for storing results and generating reports.
Traffic generators.
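As a sketch of the results-database element, a lightweight store can be as simple as an SQLite table that test runs append to and dashboards query. The table layout and function names below are illustrative, not taken from any particular framework:

```python
import sqlite3

def open_results_db(path=":memory:"):
    """Create (or open) a minimal store for per-run test results."""
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS results (
                    run_id    TEXT,
                    target    TEXT,
                    test_name TEXT,
                    passed    INTEGER,
                    detail    TEXT,
                    ts        DATETIME DEFAULT CURRENT_TIMESTAMP)""")
    return db

def record(db, run_id, target, test_name, passed, detail=""):
    """Append one test outcome; every test run on every target gets a row."""
    db.execute("INSERT INTO results (run_id, target, test_name, passed, detail) "
               "VALUES (?, ?, ?, ?, ?)",
               (run_id, target, test_name, int(passed), detail))
    db.commit()

def pass_rate(db, run_id):
    """Dashboard-style summary: fraction of tests that passed in a run."""
    total, passed = db.execute(
        "SELECT COUNT(*), SUM(passed) FROM results WHERE run_id = ?",
        (run_id,)).fetchone()
    return passed / total if total else None
```

Keeping the schema this small makes it easy to populate from any test monkey or harness that can shell out to `sqlite3` or link the library.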
Attack Tools

Many of the most effective attack tools are available for free download, or are even open-source, making it easy for attackers to obtain them. Therefore, you should use such tools in your own security testing.
There are so many attack tools; which should you use? First, look for tools that
have been written to attack the same protocols and file formats that your target
program uses. Next, if no such tools exist, or you’re using a custom protocol or
file format, look for a fuzzer framework that allows you to integrate your own
protocol or format handler. (For a list of fuzzing software, refer to the following
website: www.fuzzing.org.) Finally, you can often use the same functionality and
unit test tools that you use for traditional testing by modifying them to send the
same sort of random or malformed data that fuzzers do.
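To illustrate that last point, here is a minimal, hypothetical sketch of turning a functional test harness into a crude fuzzer: a mutator corrupts a known-good input a few bytes at a time, and a driver feeds the mutated samples to the code under test and records any crashes. None of these names come from a real fuzzing framework:

```python
import random

def mutate(data: bytes, mutations: int = 4, seed=None) -> bytes:
    """Return a copy of a valid (non-empty) input with a few random
    byte-level corruptions, the sort of malformed data a fuzzer sends."""
    rng = random.Random(seed)
    buf = bytearray(data)
    for _ in range(mutations):
        i = rng.randrange(len(buf))
        choice = rng.randrange(3)
        if choice == 0:
            buf[i] ^= 1 << rng.randrange(8)    # flip one bit
        elif choice == 1:
            buf[i] = rng.randrange(256)        # replace the byte
        else:
            buf.insert(i, rng.randrange(256))  # insert a junk byte
    return bytes(buf)

def fuzz(parse, valid_input: bytes, iterations: int = 1000):
    """Feed mutated inputs to a parser; collect the inputs that crash it."""
    crashes = []
    for n in range(iterations):
        sample = mutate(valid_input, seed=n)   # seeded, so runs are repeatable
        try:
            parse(sample)
        except Exception as exc:
            crashes.append((sample, exc))
    return crashes
```

Seeding each sample makes failures reproducible, which matters when a crash must be replayed on a target board.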
Monitoring Tools
Because embedded system OSes are generally proprietary, you usually must
create your own test tools to monitor memory, network operations, and RAM
utilization.
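For example (a hypothetical approach, not a prescribed one), when no off-the-shelf profiler exists for the target OS, the firmware can print a periodic heap-usage line over its serial console, and a small host-side tool can parse the log to flag suspicious growth. The log format below is invented for illustration:

```python
import re

# Matches a hypothetical firmware log line such as: "HEAP used=10240 free=54272"
HEAP_LINE = re.compile(r"HEAP used=(\d+) free=(\d+)")

def watch_heap(lines, growth_limit):
    """Extract heap-usage samples from console output and flag a potential
    leak if usage grows by more than growth_limit bytes over the capture."""
    samples = []
    for line in lines:
        m = HEAP_LINE.search(line)
        if m:
            samples.append(int(m.group(1)))
    leaked = len(samples) >= 2 and samples[-1] - samples[0] > growth_limit
    return samples, leaked
```

The same pattern (instrumented print plus host-side parser) works for network counters and other statistics the target can report.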
Commercial Tools
In addition to the tests and tools you develop, your test environment should
include those commercial tools necessary to perform the full range of tests
discussed earlier (see “Types of Tests”). Such tools typically can
exercise your code in a more automated and complete fashion than would
otherwise be feasible. It is important to evaluate the tools early in the test
environment design phase so as to ensure operating system compatibility,
sufficient resources (disk space and memory availability, for example), and
communications/interface support.
For a representative list of recommended tools, see “Test Tools,” later in this paper.
Test Monkeys
Test monkeys are build commands and tests designed to run automatically,
without human intervention, on a scheduled basis (whether that schedule is
according to the clock or on-demand due to a code check-in). Because embedded
systems usually must be tested on many platforms, test monkeys are particularly
helpful because they ensure that every test is run every time on every target.
As described earlier, engineers should run test monkeys on the local development machine before checking in code, and an automated test monkey framework should perform builds and run test suites on all of a device’s supported OS-platform combinations automatically.

[Figure: Test monkeys are a critical component of your test environment.]
Test monkeys perform several important families of tasks:

• Builds—Before the monkeys can run tests, they need to build the required executables. For embedded devices, this involves not only the actual build commands (such as invoking make files), but building the required images and downloading them to the target devices. All this requires in-depth knowledge of every target device’s operating system and the steps required to download firmware onto the device.

• Regression testing.

• Power cycle devices and boards when necessary, such as when loading new images or recovering from crashes.

• Write test log results and dashboards (summaries) to the test database.
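The control loop behind these tasks can be sketched as follows, with the build, test, logging, and power-cycle steps injected as callables. All names here are illustrative; a real monkey would wire these hooks to your cross-compile makefiles, image-download tools, and power controllers:

```python
import subprocess

def build(target):
    """Example build hook: invoke the makefile for one OS-platform
    combination; returns True when the build succeeded."""
    proc = subprocess.run(["make", f"TARGET={target}"], capture_output=True)
    return proc.returncode == 0

def monkey_cycle(targets, build_fn, test_fn, log_fn, power_cycle_fn):
    """One scheduled pass: for every target, build, run the regression
    suite, log results, and power-cycle the board on a failed build."""
    for target in targets:
        if build_fn(target):
            results = test_fn(target)      # regression suite on live target
        else:
            results = {"build": "failed"}
            power_cycle_fn(target)         # recover a wedged board
        log_fn(target, results)            # dashboard/database entry
```

Because the hooks are parameters, the same loop can drive every supported OS-platform combination without per-target special cases.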
Conclusion
Although this paper offers only a surface-level survey of security design and testing (see “References and Further Reading”), it can certainly speed you on your way to improving your development and testing processes for secure embedded applications. From examining the characteristics of embedded systems, it’s clear that your testing focus must be broader than that required for traditional application testing, particularly in the areas of real-time behavior and performance and capacity testing.
Taking the two topics together (that is, embedded systems and secure
applications), this paper provides some best practices, describes in some detail
the variety of tests that are important to perform (particularly security-related
tests such as attack, penetration, and fuzz testing), and outlines what to include
when building your test environment.
Using these recommendations as a guide, you can ensure that your embedded
application is robust and secure, reduce time to market, and promote positive
brand identity.
References and Further Reading

References
• Security and Testing Books and Articles
• Coding Style Guidelines
• Test Tools

Security and Testing Books and Articles

[S1] R. M. Backus, Embedded systems security has moved to the forefront, Embedded.com, 10/07/07, URL: http://www.embedded.com/design/networking/202103432, accessed: 2009-03-07. (Archived by WebCite® at http://www.webcitation.org/5f6YJ5ENw.)
[S4] Walter Bright, Code Coverage Analysis, Dr. Dobb’s CodeTalk, June 7, 2008,
URL:http://dobbscodetalk.com/index.php?option=com_myblog&show=Coverage-
Analysis.html&Itemid=29, accessed: 2009-03-05. (Archived by WebCite® at http://
www.webcitation.org/5f3ifBGHH.)
[S5] Vincent Encontre, Testing embedded systems: Do you have the GuTs for it?,
2005, IBM, URL: http://www.ibm.com/developerworks/rational/library/459.
html, accessed: 2009-02-23. (Archived by WebCite® at http://www.webcitation.
org/5eoXrvU9M.)
[S6] Mark G. Graff & Kenneth R. van Wyk, Secure Coding Principles & Practices, 2003,
O’Reilly & Associates.
[S7] Michael Howard and David LeBlanc, Writing Secure Code, 2003, Microsoft
Corporation.
[S8] Michael Howard, David LeBlanc, and John Viega, 19 Deadly Sins of Software
Security: Programming Flaws and How to Fix Them, 2005, McGraw-Hill
Companies.
[S9] Nat Hillary, Measuring Performance for Real-Time Systems, 2005, Freescale
Semiconductor, URL: http://www.freescale.com/files/soft_dev_tools/doc/white_
paper/CWPERFORMWP.pdf, accessed 2009-02-23.
[S11] Gary McGraw, editor, Software Security Testing, IEEE Security & Privacy,
September/October 2004, URL: http://www.cigital.com/papers/download/bsi4-
testing.pdf, accessed 2009-02-23.
[S12] Gary McGraw, editor, Software Penetration Testing, IEEE Security & Privacy,
January/February 2005, URL: http://www.cigital.com/papers/download/bsi6-
pentest.pdf, accessed 2009-02-23.
[S13] C. C. Michael and Will Radosevich. Black Box Security Testing Tools, 2005,
Cigital, Inc., URL:https://buildsecurityin.us-cert.gov/daisy/bsi/articles/tools/black-
box/261-BSI.html, accessed: 2009-02-23. (Archived by WebCite® at http://www.
webcitation.org/5eoQuS66s.)
[S14] C. C. Michael and Will Radosevich, Risk-Based and Functional Security Testing,
2005, Cigital, Inc., URL:https://buildsecurityin.us-cert.gov/daisy/bsi/articles/best-
practices/testing/255-BSI.html, accessed: 2009-02-23. (Archived by WebCite® at
http://www.webcitation.org/5eoVUWdcx.)
[S15] Ari Takanen, Jared D. Demott, and Charles Miller, Fuzzing for Software Security
Testing and Quality Assurance, 2008, Artech House, Inc.
[S16] Maura A. van der Linden, Testing Code Security, 2007, Auerbach Publications.
[S17] Kenneth R. van Wyk, Adapting Penetration Testing for Software Development
Purposes, Carnegie Mellon University, 2007, URL:https://buildsecurityin.us-cert.
gov/daisy/bsi/articles/best-practices/penetration/655-BSI.html, accessed: 2009-02-
23. (Archived by WebCite® at http://www.webcitation.org/5eoUKoUtg.)
[S18] James A. Whittaker, How to Break Software: A Practical Guide to Testing, 2003,
Pearson Education, Inc.
[S19] Chris Wysopal, Lucas Nelson, Dino Dai Zovi, and Elfriede Dustin, The Art of
Software Security Testing, 2007, Symantec Corporation.
C Coding Style Guidelines

[C2] L.W. Cannon, et al., Recommended C Style and Coding Standards, URL: http://www.doc.ic.ac.uk/lab/cplus/cstyle.html, accessed: 2009-03-06. (Archived by WebCite® at http://www.webcitation.org/5f4tdERfB.)
[C3] Jim Larson, Standards and Style for Coding in ANSI C, URL:http://www.jetcafe.
org/~jim/c-style.html, accessed: 2009-03-06. (Archived by WebCite® at http://
www.webcitation.org/5f4tA3DQ7.)
Test Tools
Appendix: NanoDefender™
Mocana’s device intrusion detection system that
defeats malware while eliminating false positives
NanoDefender™ Features

NanoDefender is a comprehensive intrusion prevention system that secures all aspects of a device: communications, identity, access, privilege, control, and execution. It tracks the function flow within an application instead of relying on an “attack database” for defense. And, better yet, it delivers complete security without time-consuming false positives.

Common Code Protection

Like all of Mocana’s device security toolkits, NanoDefender is CPU-architecture and platform independent. Linux platforms are supported out-of-the-box, and ports to other common platforms such as BSD, OSE, Nucleus, Solaris, ThreadX, Windows, MacOS X, (ARC) MQX, pSOS, and Cygwin, as well as real-time operating systems such as VxWorks, are easily achieved.

NanoBoot consists of two components: a command line tool, which digitally signs the authorized firmware image, and a small signature verification application that executes during initialization from within a processor’s protected flash memory. The NanoBoot application may be as little as 8 KB and require less than 2 KB of RAM, enabling SoC design. When the device is powered up, NanoBoot verifies the device’s signature, thereby ensuring that the device’s firmware has not been altered.

Mocana’s NanoUpdate is an easy-to-use, high-performance Secure Firmware Update solution. NanoUpdate enables firmware images and other messages to be securely delivered to devices in the field automatically, eliminating the need for insecure manual methods, like email, TFTP, FTP, HTTP, or physical DVDs.

NanoUpdate’s command line tool can create a PKCS #7–digitally signed message. The signed message is placed at a well-known URL that the device is programmed to check for updates. The signed message is then downloaded, authenticated, verified, de-capsulated, saved, and/or acted upon.
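As a generic illustration of that sign-then-verify update flow, the sketch below substitutes an HMAC digest for the PKCS #7 message. This is not Mocana’s API, and a real deployment would use public-key signatures so the device never holds a signing secret:

```python
import hmac
import hashlib

def sign_image(image: bytes, key: bytes) -> bytes:
    """Vendor side: produce a detached signature over the firmware image.
    (Stand-in for the command line signing tool; HMAC replaces PKCS #7.)"""
    return hmac.new(key, image, hashlib.sha256).digest()

def verify_and_accept(image: bytes, signature: bytes, key: bytes) -> bool:
    """Device side: authenticate a downloaded image before acting on it.
    compare_digest avoids leaking timing information during the check."""
    expected = hmac.new(key, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)
```

The essential property carried over from the real flow is that any tampering with the image after signing causes verification, and therefore the update, to fail.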
NanoDefender™ Benefits

Comprehensive Attack Protection

Designed to prevent malicious code execution in the context of an existing application or process, NanoDefender can shut down any exploit that changes the function flow within running code before it has the chance to do any damage. NanoDefender even provides protection from remote and local stack-based overflows, format string attacks/string exploits, heap overflows, return-to-libc attacks, and integer overflows.

No False Positives

Because NanoDefender only acts if “disallowed” behavior is detected, false positives are impossible. Using a rules base of acceptable behavior for the applications running on the new device, NanoDefender only terminates an application if it begins behaving erratically due to malware or some other security threat.

Truly Painless Integration

NanoDefender was built for ease of use and ease of installation from the ground up. It’s a snap to integrate into applications—just rebuild an application using a Mocana-provided code analyzer and linker. Absolutely no changes to your code are required. Plus, Mocana’s developer support team is available 24x7 to answer your questions about crypto, our toolkits, or embedded development in general.

NanoBoot Module Benefits
• Prevents subversion (tampering) of firmware images. Blocks unlicensed firmware upgrades and protects intellectual property.
• Enables you to assign unique IDs, such as SKUs, to firmware images using cryptographic private keys.
• One simple API function to call at startup or periodically as desired. Endian neutral & RTOS not required.
• Code can run in ROM, not just RAM. Ultra-small footprint enables SoC (system on chip) design.

NanoUpdate Module Benefits
• Endian-neutral and RTOS not required; CPU-architecture and platform independent. Platforms supported out-of-the-box include Linux, MontaVista Linux, VxWorks, OSE, Nucleus, Solaris, ThreadX, Windows, MacOS X, (ARC) MQX, pSOS, and Cygwin.
• Powerful, simple, easy-to-use API. No crypto expertise required.
• Simple, secure, easy to use and install.
• Exclusive command line tool for signing firmware images and messages.
• Extends device lifetime out in the field. Creates new revenue opportunities for already-deployed hardware.
• Can be used for both wired and wireless mobile applications, over local or remote networks.
About Mocana

Mocana provides device management solutions and embedded security tools for consumer electronics manufacturers, datacom companies, telecom carriers, industrial automation applications, and the enterprise. Mocana’s industry-leading infrastructure software solutions ensure that wired and wireless devices, networks, and their services all scale securely. Mocana offers 18 integrated products, which are the security solution of choice for more than 90 major customers, including Cisco, Freescale, Philips, Dell, Nortel Networks, Harris, Honeywell, Symbol, Net.com, and Radvision.

Winner of the 2008 Red Herring Top 100 Tech Startups in the World and 2008 Frost & Sullivan Technology Innovation of the Year awards, Mocana was founded in 2004, is privately held, and is headquartered in San Francisco, California. For more information, visit www.mocana.com.

Mocana Solutions
• NanoBoot™: Secure preboot verification for firmware
• NanoUpdate™: Secure firmware updates
• NanoWall™: Embedded system firewall
• NanoSSH™: High-performance SSH client and server
• NanoSSL™: Super-small SSL client and server
• NanoSec™: Device-optimized IPsec, IKEv1/v2, MOBIKE
• NanoEAP™: EAP supplicant and 802.11 extensions
• NanoCert™: Certificate management for client devices
• NanoDTLS™: Embedded DTLS client
• NanoDefender™: Intrusion detection for devices
• NanoPhone™: Quick-development security toolkit for Google Android handsets

Downloads and Contacts

For details about the Mocana Device Security Framework, visit http://www.mocana.com/device-security-framework.html.

For your 90-day free trial, visit www.mocana.com/evaluate.html.

For pricing and purchase information, email sales@mocana.com or call 866-213-1273.

[Logos: VPNC Certified (Basic Interop, AES Interop, IKEv2 Basic Interop, IPv6 Interop); Tech Choice 2008.]