
A PROACTIVE APPROACH TO SERVER SECURITY

A Spire Research Report


Sponsored by Bit9, Inc.

© 2011 Spire Security, LLC. All rights reserved.


Executive Summary
We don't need to look very far to see the increasing risk to servers, or the inadequacy of existing solutions to stem the flow of successful attacks. Servers are easily accessible from around the world; they allow direct interaction with thousands or even millions of users, and they typically contain information assets of high value to attackers. Current protection schemes rely heavily on reactive approaches to security that are ineffective and inefficient.

The security profession has long held that a proactive approach to security that starts with default deny is superior to the default allow situation we are in today. But the traditional obstacle to any lockdown approach has been the false positive: the potential for disruption of legitimate applications.

Server environments are well-suited to a proactive approach. They are physically accessible to administrators, undergo fewer changes, and follow more regimented change control processes. One simple way to think about protection is to evaluate the overhead associated with managing a security solution. An application whitelisting solution is a positive approach that requires fewer touches and excels in a controlled environment.

About Spire Security


Spire Security, LLC conducts market research and analysis of information security issues. Spire provides clarity and practical security advice based on its Four Disciplines of Security Management, a security reference model that incorporates and relates the functions of identity management, trust management, threat management, and vulnerability management. Spire's objective is to help refine enterprise security strategies by determining the best way to deploy policies, people, process, and platforms in support of an enterprise security management solution. This white paper was commissioned by Bit9, Inc. All content and assertions are the independent work and opinions of Spire Security, reflecting its history of research in security audit, design, and risk management experience.


A PROACTIVE APPROACH TO SERVER SECURITY


Table of Contents

Introduction
The Server Risk Profile
Assessing the Current State
Rethinking Server Security
Protection Characteristics of Servers
Selecting a Protection Approach
The Benefits of Application Whitelisting
Spire Viewpoint



Introduction
Our servers are under attack. Everywhere you look there are stories about breaches that penetrate Web servers, drop malicious code, and burrow deeper into the data center. This isn't necessarily news, but the popularity of server-based attacks is on the rise. Of course, things are never quite the same, as attackers combine strategies, work up the stack to the application layer, and take advantage of new technologies and architectures. For the past few years, organizations have focused their efforts on hardening user endpoints like desktops and laptops, so attackers have been migrating their strategies to attack servers.

Nowadays, any threat, including those that are advanced and persistent, wants to find a way into an environment through a multi-staged attack. The first step comes in multiple ways: sometimes it involves a spear-phishing attack against an end user, but often it involves penetrating the initial defenses of the Web server. This compromise provides a staging location for the second phase, which drops control code for remote management and exploitation onto a server. Further steps may involve exploiting known vulnerabilities, running sniffers or other monitoring software, or compromising admin accounts deeper in the data center.

To develop a protection strategy for the server environment, it is worth reviewing the server risk profile, evaluating the current server security strategy, identifying key characteristics of a server environment, and establishing a protection strategy for servers.

The Server Risk Profile


It is clear from published reports that attacks against servers are of primary importance. The 2011 Verizon Data Breach Investigations Report (DBIR) points out that, "In the combined Verizon/USSS data set for 2010, servers edged out user devices to maintain the top spot [from 2009] as the class of assets most often involved in a data breach." It is worth diving deeper to evaluate the server risk profile and compare it to the other applicable platforms that organizations must manage.

Servers are available and accessible


Servers like to stay in one place and provide resources to hundreds, thousands, or more simultaneous users. What's more, they are often deployed on the Internet and made accessible from around the world. For server-based attacks, 15,000 miles might as well be the house next door; both locations can be equally effective for launching an attack. Conversely, a laptop either needs to be on the same local network as an attacker, such as a Wi-Fi hotspot, or tricked from afar into clicking on a link for an attack to be initiated against it.



Servers allow direct connections


A server's job is to provide services. Typically, this means running application services that remain available, listening for connection attempts. Since the service is always looking for a connection, vulnerabilities can be exploited at any time. This makes servers more immediate targets than laptops, since the attacker doesn't have to wait for some sort of user interaction.

Servers are interactive


Servers often establish and maintain connections, which enables interactive hacking attacks as well as payload-based attacks. These interactive attacks allow hackers to spend extended periods of time on the system and leverage many different attack techniques, including custom code, in order to compromise it. This added variety of options against servers, along with the ongoing challenge of polymorphic malware, makes identifying attacks difficult.

Servers are valuable to attackers


It appears that end users are finally getting wise to the problems of carrying large amounts of sensitive data around. Their solution? Drop the data on servers and access it when needed. This, along with the obvious benefit of a server sharing resources among many users, makes the target more attractive to attackers. Even servers without the data itself often have privileges and sit deeper in the data center, so persistent code there can significantly benefit the attacker.

Assessing the Current State


As risk to servers continues to increase and incidents mount, it is useful for organizations to evaluate the protection mechanisms they have in place to ensure that they are optimizing their resources for efficiency and effectiveness. Unfortunately, the report card isn't very good. In general, server security can be characterized in three ways:

Reactive: organizations are often forced to address threats rooted in incidents that happen to others. With weak access control, sometimes even this isn't helpful because there are no hacks or malware actually in play.

Ineffective: a simple review of the incidents that occur shows that existing protection mechanisms are unsuccessful against a substantial number of attacks. This can often be demonstrated through a capabilities review, where no protection mechanism is in place to address a specific problem. On the malware side, current approaches can't keep up with its variety and dynamic nature.

Inefficient: with incidents driving a "hurry-up" defense, it is difficult to see current protection mechanisms as efficient from a resource consumption perspective.




With attention being refocused on servers, it is time to review and possibly rethink server security.

Rethinking Server Security


All large enterprises already have server security strategies, but it is worth reviewing the underlying concepts periodically to reassess whether previous decisions still hold. The security professional's goal is to stop all attacks, and only attacks, all the time. Practically speaking, this is impossible, creating the need for effectiveness and efficiency measures that involve judgment calls for optimization. Still, the goal is to maximize the number of attacks thwarted and minimize server disruption (where legitimate users are incorrectly denied access to resources) at the lowest cost possible.

Security professionals often toss around terms that are closely related but not identical. A good way to think about the security strategy is to define an approach, an identification context, and a protection scheme while addressing the obstacles to each. In some ways this can become a philosophical debate, but it contributes to the overall effectiveness of the program.

Approach: Default Allow vs. Default Deny


In determining the best possible approach to optimize resources, there are a handful of related concepts that must be evaluated. The first is whether the initial philosophy will be one of default allow or default deny: that is, whether to work from a starting point that initially provides broad access to users and then restricts it (default allow), or from a starting point where resources are completely restricted and access is granted selectively (default deny). It is rare to find a security professional who doesn't believe default deny is the superior starting point for lowering risk; it is equally rare to find an organization that actually takes that approach.
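The difference between the two starting points can be sketched in a few lines of Python. This is a minimal illustration only; the port numbers and service lists are invented for the example, not drawn from any real policy:

```python
# Invented example lists; real policies would be far larger.
ALLOWED_SERVICES = {80, 443}        # explicitly approved (default deny posture)
BLOCKED_SERVICES = {23, 135, 445}   # explicitly banned (default allow posture)

def default_deny(port: int) -> bool:
    """Permit only what has been explicitly approved."""
    return port in ALLOWED_SERVICES

def default_allow(port: int) -> bool:
    """Permit everything except what has been explicitly banned."""
    return port not in BLOCKED_SERVICES

# A brand-new, unknown service exposes the risk gap between the postures:
print(default_deny(31337))   # False: unknown traffic is rejected
print(default_allow(31337))  # True: unknown traffic slips through
```

The unknown service is the crux: under default deny it is blocked until someone approves it, while under default allow it runs until someone notices and bans it.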

Identification: Known Good vs. Known Bad


A derivative concept from default allow and default deny involves our ability to identify either known good objects and activities or known bad ones. Identifying known good is often a deterministic activity that can take place prior to turning on a production environment. A known bad approach, the most popular one to date, involves monitoring activity and behavior and attempting to define all malicious use cases.

Protection: Whitelisting vs. Blacklisting


At the protection level, a security professional can attempt to identify all of the applications running in the environment and create a whitelist that allows only those apps to run, or she can try to identify all the variations and code snippets of attack software in the world and attempt to classify and characterize them into a blacklist that seeks out and destroys bad behavior. Historically, the blacklist approach has been the most popular for two reasons: 1) blacklists can more specifically identify the malicious code; and 2) identifying items for whitelists was often too static and incomplete. In the past, there was no easy, dynamic way to identify all software running on a server.
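One common way to build a whitelist is to identify each executable by a cryptographic hash. The sketch below is a simplified model of that idea, assumed for illustration; it does not describe any particular vendor's implementation:

```python
import hashlib

WHITELIST: set = set()   # hashes of approved server applications

def approve(binary: bytes) -> None:
    """Record a known-good binary, e.g. during change control."""
    WHITELIST.add(hashlib.sha256(binary).hexdigest())

def may_execute(binary: bytes) -> bool:
    """Default deny: run only binaries whose hash was pre-approved."""
    return hashlib.sha256(binary).hexdigest() in WHITELIST

approve(b"legitimate web server build 1.2")
print(may_execute(b"legitimate web server build 1.2"))    # True
print(may_execute(b"never-before-seen malware variant"))  # False
```

Note the asymmetry with polymorphic malware: every new variant produces a new hash, so a blacklist of known-bad hashes needs constant updates, while the whitelist above needs none.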

Obstacles: False Positives vs. False Negatives


More often than not, the real driving force behind security strategy decisions ends up being the extent of classification problems. For inline controls, problems occur when incorrectly classified software is either allowed to run when it shouldn't be (a false negative, which happens more frequently with blacklisting) or not allowed to run when it should be (a false positive, usually more of a problem with whitelisting). These obstacles lead either to incidents and compromise or to inappropriate disruption of services.
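The two error types can be captured in a tiny classification function. This is a toy illustration of the terminology above, not a model of any real control:

```python
def outcome(allowed: bool, malicious: bool) -> str:
    """Label one allow/block decision against ground truth."""
    if allowed and malicious:
        return "false negative"   # incident: bad code was allowed to run
    if not allowed and not malicious:
        return "false positive"   # disruption: good code was blocked
    return "correct"

# A stale blacklist misses brand-new malware...
print(outcome(allowed=True, malicious=True))    # false negative
# ...while an incomplete whitelist blocks a legitimate update.
print(outcome(allowed=False, malicious=False))  # false positive
```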

Roundup
Theoretically, security professionals strive for a default deny, known good, whitelist approach because it is typically more proactive than its counterparts. But the truth is that each of these pairs of concepts sits on a spectrum, and security developers look for ways to balance them all with the goal of minimizing false positives and false negatives.

Protection Characteristics of Servers


As mentioned, the theoretical approach to security often differs from the practical one. As part of the review process, however, it helps to evaluate the characteristics of the platform (in this case, servers) to see what has changed since previous decisions were made. The tension between known good and known bad is well understood on endpoints, where malware runs rampant. Servers deserve separate consideration and understanding, as the use cases and circumstances are different.

Servers stay in one place


Servers are not particularly mobile. That is the domain of laptops, which disappear like a teenager after curfew. Servers like to stay in one place, usually a data center, but at least some sort of special closet or protected room. This static physical profile makes servers easy to find and manage.

Servers have fewer application changes


Without end users clicking on every link they see that promises unimaginable riches (or at least discount drugs), systems can remain stable, running the same applications and utilities for extended periods. Because servers provide functionality to many users, a regimen of maintaining a static, controlled environment is often prescribed. New applications and utilities are added only after careful consideration.



Servers have more regimented management processes


Organizations recognize the importance of their servers and usually have a program already in place for maintenance and updates. This provides increased control and knowledge of server environments, particularly as compared with laptop management practices.

Selecting a Protection Approach


As mentioned in other parts of this paper, efficiency and effectiveness are key objectives of a successful security program (really, of any program). This breaks down into the following attributes:

Fewest number of touches: sports programs talk about stops and plays as efficiency measures; here, touches are simply the number of times a security solution must be updated or reconfigured. Every touch not only consumes resources but also increases the risk of simple mistakes and programming errors.

Highest number of attacks stopped: from an effectiveness perspective, there is no greater measure than the security program's ability to stop an attack before it happens, and to do so deliberately.

Least amount of user disruption: any security professional knows that blocking legitimate use of resources is the kiss of death for a security program. It is also inefficient to have to deal with inappropriately blocked activity.

These attributes open the door to whitelisting on the server side.

The Benefits of Application Whitelisting


What is changing more frequently: the threat landscape of attacks against your servers, or the applications you are running on them? This is the basic question that must be answered to assess the overhead, and potential for damage, that a security solution may bring to an organization. With their lower application churn rate and the increased variety of attacks against them, servers are a perfect candidate for whitelisting. Here's why.

Superior approach
With its pedigree in the default deny and known good camps, whitelisting is more likely to stop attacks than a blacklist approach. Not only can it catch new and unknown malware, but it can be managed proactively before any attacks are active.

Fewer touches
No organization wants to make changes to its fully functioning servers unless it's absolutely necessary. Security is one of those needs that forces touches of the server: to make configuration changes, update the software, and add new applications. Because of the slower rate of application introduction and the increasing rate at which new malware appears, a whitelist approach will likely result in fewer touches of the server over time.

Excels in a controlled environment


One of the challenges of whitelisting on laptops is the natural contention between the users of a device and its administrators. Users always want more apps than the admins would like. In a server environment, the administrators have much more control. Applications and processes on servers are supposed to be predefined, so applying stricter controls should not be a problem.

Spire Viewpoint
All security professionals agree that a default deny approach is more secure than default allow. The trick has always been manageability; even well-controlled environments report some discomfort with predefining applications and processes in an efficient manner. In today's threat environment, many organizations will find that the number of touches for signature updates associated with new malware is accelerating. In a server environment, these touches are likely to exceed the number of management changes required by whitelisting software.


Contact Spire Security


To comment on this white paper or to contact Spire Security, LLC about other security topics, please visit our website at www.spiresecurity.com. This white paper was commissioned by Bit9, Inc. All content and assertions are the independent work and opinions of Spire Security, reflecting its history of research in security audit, design, and consulting activities.

Spire Security, LLC | P.O. Box 152 | Malvern, PA 19355 | www.spiresecurity.com
