
MULTI-TENANCY

Multi-tenancy is an architecture in which a single instance of a software application serves
multiple customers. Each customer is called a tenant. Tenants may be given the ability to
customize some parts of the application, such as the color of the user interface (UI) or
business rules, but they cannot customize the application's code.
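As a minimal sketch of this idea, per-tenant customization can live entirely in configuration data, so the shared application code never changes. The tenant names and settings below are hypothetical examples, not part of any real product:

```python
# Per-tenant customization via configuration, not code changes.
# Tenant names and settings are illustrative.

DEFAULTS = {"ui_color": "#336699", "discount_rule": "none"}

TENANT_SETTINGS = {
    "acme": {"ui_color": "#aa2222", "discount_rule": "volume"},
    "globex": {"ui_color": "#22aa55"},
}

def setting(tenant: str, key: str) -> str:
    """Look up a tenant's setting, falling back to the shared default."""
    return TENANT_SETTINGS.get(tenant, {}).get(key, DEFAULTS[key])

print(setting("acme", "ui_color"))         # tenant override: #aa2222
print(setting("globex", "discount_rule"))  # shared default: none
```

Every tenant runs the same code path; only the lookup table differs, which is what keeps a single instance serviceable for all customers.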

Multi-tenancy can be economical because software development and maintenance costs are
shared. It can be contrasted with single-tenancy, an architecture in which each customer has
their own software instance and may be given access to code. With a multi-tenancy
architecture, the provider only has to make updates once. With a single-tenancy architecture,
the provider has to touch multiple instances of the software in order to make updates.

In cloud computing, the meaning of multi-tenancy architecture has broadened because of new
service models that take advantage of virtualization and remote access. A software-as-a-
service (SaaS) provider, for example, can run one instance of its application on one instance
of a database and provide web access to multiple customers. In such a scenario, each tenant's
data is isolated and remains invisible to other tenants.
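One common way to achieve this isolation on a single database instance is to scope every row and every query by a tenant identifier. The sketch below uses SQLite and invented tenant names purely for illustration:

```python
import sqlite3

# Minimal sketch: one database instance serves multiple tenants, and every
# query is filtered by a tenant_id column, so tenants never see each other's
# rows. Table and tenant names are illustrative.

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (tenant_id TEXT, item TEXT)")
db.executemany("INSERT INTO orders VALUES (?, ?)",
               [("acme", "widget"), ("acme", "gadget"), ("globex", "sprocket")])

def orders_for(tenant_id: str):
    """Return only the rows belonging to one tenant."""
    rows = db.execute(
        "SELECT item FROM orders WHERE tenant_id = ? ORDER BY rowid",
        (tenant_id,))
    return [r[0] for r in rows]

print(orders_for("acme"))    # ['widget', 'gadget']
print(orders_for("globex"))  # ['sprocket']
```

In a real SaaS system the filter would be enforced centrally (for example in the data-access layer), so no application query can accidentally omit it.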

Multi-tenancy defined

A tenant is any application -- either inside or outside the enterprise -- that needs its own
secure and exclusive virtual computing environment. This environment can encompass all or
some select layers of enterprise architecture, from storage to user interface. All interactive
applications (or tenants) have to be multi-user in nature.

A departmental application that processes sensitive financial data within the private cloud of
an enterprise is as much a "tenant" as a global marketing application that publishes product
catalogs on a public cloud. Both have the same tenancy requirements, regardless of the
fact that one has internal co-tenants and the other external ones.

Multi-tenancy is the key common attribute of both public and private clouds, and it applies to
all three layers of a cloud: Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS)
and Software-as-a-Service (SaaS).

Most people point to the IaaS layer alone when they talk about clouds. Even so,
architecturally, both public and private IaaSes go beyond tactical features such as
virtualization, and head towards implementing the concept of IT-as-a-Service (ITaaS)
through billing -- or chargeback in the case of private clouds -- based on metered usage. An
IaaS also features improved accountability using service-level-agreements (SLAs), identity
management for secured access, fault tolerance, disaster recovery, dynamic procurement and
other key properties.

By incorporating these shared services at the infrastructure layer, all clouds automatically
become multi-tenant, to a degree. But multi-tenancy in clouds has to go beyond the IaaS
layer, to include the PaaS layer (application servers, Java Virtual Machines, etc.) and
ultimately to the SaaS or application layer (database, business logic, work flow and user
interface). Only then can tenants enjoy the full spectrum of common services from a
cloud -- starting at the hardware layer and going all the way up to the user-interface layer,
depending on the degree of multi-tenancy offered by the cloud.

Degrees of multi-tenancy

The exact degree of multi-tenancy, as it's commonly defined, is based on how much of the
core application, or SaaS, layer is designed to be shared across tenants. The highest degree of
multi-tenancy allows the database schema to be shared and supports customization of the
business logic, workflow and user-interface layers. In other words, all the sub-layers of SaaS
offer multi-tenancy in this degree.

In the lowest degree, multi-tenancy is limited to the IaaS and PaaS layers, with dedicated
SaaS layers for each tenant.

And in the middle degree of multi-tenancy are clusters of homogeneous tenants that share
database schemas and other application layers. At this level, each cluster
of users has its own version of database schema and the application itself.

We can sum up the discussion on the degree of multi-tenancy as follows:

Highest degree: IaaS and PaaS are multi-tenant. SaaS is fully multi-tenant also.
Middle degree: IaaS and PaaS are multi-tenant. Small SaaS clusters are multi-tenant.
Lowest degree: IaaS and PaaS are multi-tenant. SaaS is single tenant.
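The three degrees summarized above can also be expressed as a small lookup table; the sketch below uses our own labels for the layers and sharing levels:

```python
# The three degrees of multi-tenancy from the summary above, as a lookup
# table. Layer and sharing labels are ours, chosen for illustration.

DEGREES = {
    "highest": {"IaaS": "shared", "PaaS": "shared", "SaaS": "shared"},
    "middle":  {"IaaS": "shared", "PaaS": "shared", "SaaS": "shared per cluster"},
    "lowest":  {"IaaS": "shared", "PaaS": "shared", "SaaS": "dedicated"},
}

def saas_sharing(degree: str) -> str:
    """The SaaS layer is what distinguishes the degrees."""
    return DEGREES[degree]["SaaS"]

print(saas_sharing("lowest"))  # dedicated
```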

For example, Salesforce.com, at the relatively high end of the multi-tenancy spectrum, has
72,500 customers who are supported by 8 to 12 multi-tenant instances (meaning IaaS/PaaS
instances) in a 1:5000 ratio. In other words, each multi-tenant instance supports 5,000 tenants
who share the same database schema. Intacct, a financial systems SaaS provider in the middle
of the spectrum, has more than 2,500 customers who share 10 instances in a 1:250 ratio.

Private clouds, and offerings such as SAP's Business By Design (due this summer), would be
at the lowest end of the spectrum of multi-tenancy, with application layers that are dedicated
and are more suited for specific large enterprise customers.

How to choose your multi-tenancy degree


One size doesn't fit all when choosing between different degrees of multi-tenancy. The
characteristics of the workload in question have to be carefully studied first, including the
workload's utilitarian versus strategic value, volatility, security, etc. Higher degrees of multi-
tenancy are best suited for cross-industry utilitarian workloads such as catalog management
and sales force management.

These applications can very easily share the same schema and also benefit from a rapidly
evolving set of features that are best developed centrally by a vendor or a corporate shared
services team. They also tend to have simpler security requirements such as encryption and
authorization. That's why public clouds are attractive multi-tenant platforms for "low-hanging
fruit" types of workloads such as e-mail, collaboration, situational applications (expense
reporting, travel authorization) and pre-production activities (development, user training and
functional/acceptance testing).

For each such workload, IT managers need to determine the degree of multi-tenancy needed,
and accordingly choose their providers from a growing list of vendors.

But for workloads that are meant for private and community (consortium of enterprises)
clouds, the responsibility of designing a multi-tenant architecture rests with the IT managers.
For these workloads there is a large list of fast-maturing technologies from both established
and start-up vendors. IT managers have to evaluate these vendors and build their own custom
IaaS, PaaS and SaaS layers, including support for building shared services and shared
database schema.

Multi-tenancy is the core tenet of cloud computing. While multi-tenancy takes forward some
of the concepts of mainframe computing to the x86 server ecosystems, its ongoing efforts to
scale up these mainframe concepts to support thousands of intra- and inter-enterprise tenants
(not users) are complex, commendable and quite revolutionary. It's only when the required
degree of multi-tenancy is incorporated into all the layers of public and private clouds that the
promises of improved scalability, agility and economies of scale can be fully delivered.

PRIVACY IN THE CLOUD:

What Is Privacy?
The concept of privacy varies widely among (and sometimes within) countries, cultures, and
jurisdictions. It is shaped by public expectations and legal interpretations; as such, a concise
definition is elusive if not impossible. Privacy rights or obligations are related to the
collection, use, disclosure, storage, and destruction of personal data (or personally
identifiable information, PII). At the end of the day, privacy is about the accountability of
organizations to data subjects, as well as the transparency of an organization's practices
around personal information.
Likewise, there is no universal consensus about what constitutes personal data. For the
purposes of this discussion, we will use the definition adopted by the Organization for
Economic Cooperation and Development (OECD): "any information relating to an identified
or identifiable individual (data subject)."
Another definition gaining popularity is the one provided by the American Institute of
Certified Public Accountants (AICPA) and the Canadian Institute of Chartered Accountants
(CICA) in the Generally Accepted Privacy Principles (GAPP) standard: "The rights and
obligations of individuals and organizations with respect to the collection, use, retention, and
disclosure of personal information."
IMPROVING PERFORMANCE:

1. Preserve the Life of Legacy Systems

Is your organization holding on to an old legacy system that won't run on today's hardware,
is no longer supported by the vendor, but still handles critical workloads for your
company? Virtualization frees you from that antiquated machine that you keep stuck in the
back just to run the dinosaur. You can now keep your pet system without the hideous, slow,
dysfunctional box that has housed it since the Bush administration.

2. Lower Operational Costs in the Data Center

Running a sprawling, ever-growing set of servers is costly. Not only is the hardware
expensive, but you've also got to pay to cool the stuff down and maintain it. Virtualization
eliminates much of the hot, expensive hardware so that you can reduce operational
costs, including the amount of space you need to house the data center, which is quite
expensive in some zip codes.

3. Deploy New Servers in Minutes

If a physical server goes down, it can take hours to get it repaired or replaced, and even longer
if you don't happen to have a replacement sitting around and have to order one or send
someone out to buy one. A virtual server can be up and
running in just minutes, freeing up your IT workers for other tasks and minimizing the
downtime experienced by users (who are often losing productive work time during a server
outage).

4. Simplify the Processes of Backups and Disaster Recovery

Virtual servers don't require lengthy backups that drag down network performance. Backups
are just a matter of taking Snapshots, which can be captured multiple times during the day. That
means that if you do have to go to a 'backup' Snapshot, you don't lose much at all in the way
of data. Virtualization greatly simplifies the process of disaster recovery if your
organization happens to be hit by something like a massive data breach or a natural disaster.

5. Provide the Ideal Testing Environment for Developers

When developers test a project and it goes horribly wrong in a physical server environment, it
can cause real problems that affect your systems and users. Virtualized test environments
allow developers to isolate their projects from users while keeping them online for testing
purposes. If everything goes awry, developers simply revert to a previous Snapshot and carry
on. No harm, no foul.
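The snapshot-and-revert workflow described in the last two sections can be sketched in miniature. The `Server` class below is a conceptual stand-in, not a real hypervisor API:

```python
import copy

# Conceptual sketch of snapshot-style backup and revert. State is captured
# at points in time, and recovery is a revert to a chosen snapshot rather
# than a lengthy restore. This models the idea only; real hypervisors use
# copy-on-write disk images instead of deep copies.

class Server:
    def __init__(self):
        self.state = {"files": []}
        self.snapshots = {}

    def snapshot(self, name):
        """Capture the current state under a name."""
        self.snapshots[name] = copy.deepcopy(self.state)

    def revert(self, name):
        """Roll the server back to a named snapshot."""
        self.state = copy.deepcopy(self.snapshots[name])

srv = Server()
srv.state["files"].append("report.doc")
srv.snapshot("noon")
srv.state["files"].append("corrupted.tmp")  # something goes wrong after noon
srv.revert("noon")                          # roll back; little data is lost
print(srv.state["files"])  # ['report.doc']
```

Taking snapshots several times a day bounds how much work can be lost, which is exactly the recovery-point argument made above.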

6. Claim Independence from Greedy Hardware & Software Vendors


Physical data centers are tied to specific proprietary hardware and software. Are you sick of
that? Many organizations are. Virtualized servers are hardware- and software-agnostic. A
virtualized server doesn't care whether it's on a system built by ABC Company or XYZ Inc. It
completely frees your data center from proprietors trying to box you into using only their
stuff.

7. Add "Going Green" to the List of Reasons Customers Love You

If "going green" is important to your business's image, a virtualized data center consumes
less power, produces less waste, and generally has a lower impact on the environment.
Whether green initiatives are a heartfelt part of your company's policies or just something
that sounds really good on a marketing brochure, virtualization is definitely a greener way to
go.

SECURITY CHALLENGES:

1: Data Security

Our recent Insider Threat Report, Cloud/Big Data edition, featured survey results indicating
where the largest volumes of sensitive data are stored:

Databases (49%)
File servers (39%)
Cloud service environments (36%)

The cloud trails closely behind databases and file servers as a top location for the storage of
sensitive data. Much of that data is sensitive, regulated, or legally controlled information.
Needless to say, a lot has changed in the past few years.

Ensuring that data is secure when deploying a cloud environment can be a daunting task.
Naturally, as the adoption of cloud resources continues to grow, the risk of data breaches
grows with it. The fear of a new data breach is so high that preventing breaches tops the list
as the number one spending priority, trumping compliance in our survey for the first time.

But there are common-sense strategies for protecting data. Designing and implementing a
proper cloud data security structure can help mitigate the risk. This includes protecting
data at the file or application level through transparent and application-level encryption.
Additional methods can involve tokenization and dynamic data masking. In his
latest blog, Vormetric's CEO Alan Kessler breaks down the when and why behind these
approaches.
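To make one of these approaches concrete, here is a minimal sketch of tokenization: the sensitive value is replaced by a random token, and the real value lives only in a separate, tightly controlled vault. This is illustrative only and not a depiction of any vendor's product; a production vault would itself be encrypted and access-controlled:

```python
import secrets

# Minimal tokenization sketch. The record that travels to cloud storage
# carries only an opaque token; the mapping back to the real value stays
# in a separate vault under the data owner's control.

_vault = {}  # token -> real value; stands in for a secured token vault

def tokenize(value: str) -> str:
    """Replace a sensitive value with a random, meaningless token."""
    token = secrets.token_hex(8)
    _vault[token] = value
    return token

def detokenize(token: str) -> str:
    """Recover the real value; only callers with vault access can do this."""
    return _vault[token]

record = {"name": "Alice", "card": tokenize("4111-1111-1111-1111")}
# The stored record itself now contains no usable card number.
print(detokenize(record["card"]))  # 4111-1111-1111-1111
```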
2: Navigating Global Trust Issues

We think it's fair to say the Edward Snowden/NSA revelations have seriously impacted
global trust levels. An increasing number of enterprises (and their governments) are unwilling
to put their data in the hands of U.S.-based cloud service providers (CSPs). This anxiety has
manifested itself on the policy level; for example, many data-and-privacy-focused countries,
such as Germany and Japan, have tightened up their data residency requirements even
further.

For CSPs to increase their footprint in the enterprise, they must address enterprise
requirements around security, data protection and data management. More specifically, CSPs
need to provide better protection and visibility to their customers.

One company taking a proactive approach to assuaging customer fears is Amazon. In October
of 2014, Amazon AWS announced it would open new data centers in Germany to ensure
compliance with both EU and German privacy laws. In theory, this will allow German AWS
customers to keep their data physically inside Germany and in compliance with German law.

3: Shadow IT

Businesses are evolving quickly and, via shadow IT, internal business units and operating
groups are often bypassing IT and IT security controls altogether in order to get things done.
While this might speed things up, it can open the door for security vulnerabilities that are
expensive to fix. Keeping stock of, and tamping down on, shadow IT endeavors is vital,
especially when it comes to the cloud.

One of the best ways to protect against leakage of sensitive data due to shadow IT is to a)
encrypt data and b) implement an intelligent key management model. Key management
basically allows for access control, which means limiting access to encrypted data to only
those whose work requires it.

When it comes to key management, there are basically two models to consider for encrypted
data. Either the enterprise owns and manages the key, or allows the CSP to own and manage
the key on its behalf. Each model has its own risks, so the final decision should depend on the
level of risk and cost the enterprise is prepared to take on. As a best practice, we recommend
that enterprises, as the owners of the data, own and manage their own keys.
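The difference between the two ownership models comes down to who holds the key, and therefore who can read the data. The sketch below models only that access-control consequence; the "encryption" is a placeholder, not a real cipher:

```python
# Conceptual sketch of the two key-ownership models. The tuple returned by
# encrypt() is a stand-in for ciphertext; the point is that decryption
# succeeds only for a party that actually holds the key.

class KeyHolder:
    def __init__(self, name):
        self.name = name
        self.keys = set()

def encrypt(data, key):
    return ("ciphertext", data, key)  # placeholder, not real cryptography

def decrypt(blob, holder):
    _, data, key = blob
    if key not in holder.keys:
        raise PermissionError(f"{holder.name} does not hold the key")
    return data

enterprise = KeyHolder("enterprise")
csp = KeyHolder("csp")

# Model 1: the enterprise owns and manages the key (the recommended practice).
enterprise.keys.add("k1")
blob = encrypt("payroll.csv", "k1")
print(decrypt(blob, enterprise))  # payroll.csv
# decrypt(blob, csp) would raise PermissionError: the CSP stores only
# ciphertext it cannot read. Under model 2, the key would be added to
# csp.keys instead, trading control for operational convenience.
```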

Vormetric Cloud Encryption, for example, includes encryption key management within the
solution and is completely transparent to applications and users. This allows for existing
processes and usage to continue with no changes. Thus, enterprises can protect any data file
within cloud environments simply, easily and efficiently.

4: Advanced Attacks & Cyber Conflicts

In 2014, I predicted there would be a major cloud or SaaS provider data breach in 2015.
Threat actors are gaining in sophistication, and attacks are becoming more complex. While
we can't predict the future, we can take steps to prevent cybersecurity attacks and create a
safer environment.

In our opinion, the best way to do this is to encrypt, encrypt, encrypt. As Alan noted back in
October, very smart people at very smart companies have come to the conclusion that encrypting a
vast majority of their data is one of the best things they can do to reduce risk and assuage
customer fears. While no company or CEO wants to discuss a data breach, having a broad-
based strategy to make data protection a priority plays well from both a security and
marketing perspective.

5: Service Provider Visibility & Translating Enterprise Requirements into the Cloud

Nurturing a safe, compliant environment is an ongoing concern, particularly as businesses
continue to expand their global networks. According to our 2015 Insider Threat Report,
Cloud/Big Data edition, enterprise clients say that adoption levels would be even higher and
would involve more key enterprise applications if service providers did more to assuage their
fears about security, data protection, and data management issues. Specifically, the top three
concerns
about data safety for cloud services include:

Lack of control of the location for data (69% globally)
Privileged user abuse at the cloud provider (67% globally)
Vulnerabilities from shared infrastructure (66% globally)
