
Accenture

Technology
Vision 2011
The technology waves
that are reshaping the
business landscape
Table of Contents

Foreword: Pierre Nanterme and Kevin Campbell
Introduction
Data Takes its Rightful Place as a Platform
Analytics Is Driving a Discontinuous Evolution from BI
Cloud Computing Will Create More Value Higher up the Stack
Architecture Will Shift from Server-centric to Service-centric
IT Security Will Respond Rapidly, Progressively—and in Proportion
Data Privacy Will Adopt a Risk-based Approach
Social Platforms Will Emerge as a New Source of Business Intelligence
User Experience is What Matters
Seeing Beyond the Walls: The Process of Change Begins Here
Notes
Research Methodology
Contacts
Foreword

What’s next? That’s a simple question to ask, but it’s not so simple
to answer. Our clients and our own company are constantly looking
around the corner to see what’s coming, and what the future will
hold for our business and our lives.

The Accenture Technology Vision for 2011 represents our look toward the future of technology. But as you will see in this report, technology trends are not isolated and are intimately intertwined with business and societal trends. Our Technology Vision is as important for business and government leaders as it is for IT.

Technology touches everyone in the modern world. It’s no longer on the sidelines in a support role, but instead is driving business performance and enriching people’s lives like never before.

We encourage you to take the time to read this important piece, reflect on what the future of technology means for you and your organization, and then take all the steps necessary to deliver on technology’s enormous potential to create value.

Pierre Nanterme
Chief Executive Officer
Accenture

Kevin Campbell
Group Chief Executive – Technology
Accenture

Introduction

The 1969 moon landing required radical, back-to-the-drawing-board ideas about everything from earth orbit to life in zero gravity. The entire program called for exceptional innovation, willingness to shed old dogma, unprecedented teamwork, and great boldness.
Just as the U.S. space program could not have put a man on the moon using conventional aviation technology, IT leaders and business executives cannot use yesterday’s approaches to realize tomorrow’s objectives. Their long-held assumptions are being turned upside down as three forces converge. First, price-performance ratios keep improving; we have access to a superabundance of computing power, network bandwidth, and storage capacity, all at lower and lower price points. Second: The expectations of consumers are changing dramatically because they are being exposed to technology choices that empower them as never before. And third: New technology trends put IT in position to drive innovation and growth rather than focusing on cost-cutting and efficiency improvements.

Many changes are under way in parallel. Some, like cloud computing, have been talked about and debated for years but only now are able to deliver their potential. Others, like the strategic recognition of the importance of data, are just now becoming apparent. Still others, like data privacy, are being propelled by a worldwide wave of concern about individual rights and the greatly expanded potential for abuse of those rights in an information age.

So this year’s Accenture Technology Vision report tells a story of discontinuous change. That story is apparent in the trends that comprise the core of this report—the trends that will have the greatest impact on performance in the future.

Three threads run through the report:

1. Things will be distributed
The obvious and immediate realization is that data today is spread far and wide. Data is also dispersed across many more locations, and under the control of far more owners. At the same time, services will be distributed more widely. Analytics will follow data and services, and will become distributed too. All of which accentuates the importance of factors such as master data management, secure communications, and federated identity.

2. Things will be decoupled
Technology today enables decoupling of entities and layers once deemed inseparable. Data representation is being decoupled from data access. Software layers can be addressed separately. Application interfaces no longer need to be tied to physical interfaces. Decoupling on such a scale promises unprecedented agility and flexibility. But it also calls for a very different mindset—and skills set—and for wise governance disciplines.

3. Things will be analyzed
Since everything from keystrokes to consumer behavior can now be tracked and studied, analytics will become the super-tool with which to drive more agile and effective decision-making. Business processes will have to keep pace if those super-tools are to be effective. There are a host of positive implications, in categories as diverse as customer intelligence and threat detection. But there is no shortage of negative implications—among them the risks to data privacy and the over-optimization of business processes.

IT and business leaders who see and understand the significance of the technology changes now under way will be those who are best placed to help their organizations outperform. They will root their observations and actions in nine core capabilities essential for the effective operation of all IT departments. And while they will continually sharpen the skills specific to their roles, they will never limit their perspectives to their areas of specialty.
Data Takes its Rightful Place
as a Platform
The age of viewing everything through an application lens is coming to an end.
Coming next: a world in which the quantity, processing speeds, and distribution of
data compel IT leaders to see the world through a data lens.

Generations of programmers and architects have grown up thinking in terms of applications—seeing the world through the lens of the functions that the business has needed and with data being the object, not the subject. That thinking will change. Although a focus on applications will continue to be important, it will give way to an emphasis on data. It is our belief that in the near future, platform architectures will be selected primarily to cope with soaring volumes of data and the complexity of data management—not for their ability to support this or that application.

Whereas traditional databases are designed to keep track of where the data is stored and how it can and should be accessed, data platforms will provide a layer of abstraction that hides the data’s location, and is not concerned with the form in which the data is stored or how its consistency is maintained. So in effect, the data representation architecture will be decoupled from the application. Data architecture, much like today’s application architecture, will refer to abstraction layers and separation of concerns, and not just to data models.
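To make the idea concrete, here is a minimal Python sketch of such an abstraction layer: applications read and write by logical key, and the platform decides where each record lives. Every name in it (DataPlatform, DataStore, register_store) is invented for illustration; this is not a product API.

```python
# Illustrative sketch: a data platform that hides where and how data is stored.
from abc import ABC, abstractmethod

class DataStore(ABC):
    """Backend-specific storage; applications never see this directly."""
    @abstractmethod
    def read(self, key): ...
    @abstractmethod
    def write(self, key, record): ...

class InMemoryStore(DataStore):
    def __init__(self):
        self._rows = {}
    def read(self, key):
        return self._rows[key]
    def write(self, key, record):
        self._rows[key] = record

class DataPlatform:
    """Abstraction layer: routes by logical key, so the application is
    unaware of location, storage format, or consistency mechanics."""
    def __init__(self):
        self._stores = []          # (predicate, store) pairs
    def register_store(self, predicate, store):
        self._stores.append((predicate, store))
    def _route(self, key):
        for predicate, store in self._stores:
            if predicate(key):
                return store
        raise KeyError(key)
    def read(self, key):
        return self._route(key).read(key)
    def write(self, key, record):
        self._route(key).write(key, record)

# An application asks for data by logical key only; the backend could be
# swapped for a cloud store without touching application code.
platform = DataPlatform()
platform.register_store(lambda k: k.startswith("customer:"), InMemoryStore())
platform.write("customer:42", {"name": "Acme"})
print(platform.read("customer:42"))
```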
Recognizing three catalysts

Three factors will drive this shift in perspective: a dramatic increase in the quantity of data; sustained processing requirements in the context of growing data volumes; and widening distribution of data. Let’s look at each factor in turn.

To start with, there are enormous increases not only in the volumes but in the types of data—more and more transactional data, a surge in meta-data, an explosion of sensor data, and a staggering rise in the volumes of unstructured data such as e-mails, tweets, blogs, video clips, and more.

Second, because organizations want to do more with the data in service of their business processes, there are very strict performance requirements for analyzing the vast new data volumes. As a result, new processing architectures emphasizing horizontal scaling will be increasingly deployed to help meet the performance requirements.

The third factor is that data is no longer “contained” in an enterprise data center—nor is it necessarily “owned” by the enterprise. The adoption of cloud computing will lead to corporate data being distributed across the enterprise boundary and potentially among several cloud providers. Further, publicly available information as well as data from third-party data service providers is rapidly becoming a crucial part of the mix.

Distributed data is the new normal

While it is true that data today is already distributed among different data centers and application silos, we are talking about distributed ownership and control of data, which requires a different approach to management of the data as well as its security and governance. If IT leaders are not already facing up to the fact that they must deal with data that resides outside their enterprises, they will very soon have to do so.

The distribution of ownership will have a host of consequences. For a start, distributed data will be more frequently shared across applications. For instance, one process, say, involving an asset tracking application, may produce data and require only a basic level of storage. But if another process—an inventory management or supply chain application, for example—now needs to use that data in a mission-critical way, the new demand completely changes the storage and access requirements for the data.

At the same time, master data management (MDM) will become considerably more complex than it is today. Specifically, MDM needs to keep track of the origin and location of data, access policies, backup frequencies, degrees of redundancy, location and ownership of meta-data, and so on. We expect that this will create big headaches because MDM will become crucial just as already scarce MDM skills become scarcer.

The shift toward a data platform mindset will turn the spotlight on alternative databases. The trusted relational database is not about to be retired, of course. But it will soon start to make way for other types of databases—streaming databases, for instance—that mark a significant departure from what IT departments and business users have relied upon for decades. Naturally, not all of the new database technologies will be right for every user in every circumstance; cost, flexibility, reliability, and speed will be key motivators. It is essential for IT leaders to start thinking beyond conventional constructs in terms of how data is organized, accessed, and managed.

Then there are the questions about backup and recovery, which become far more complex in a world where data may reside with several cloud vendors. Each vendor is likely to have its own backup frequency, compounding the difficulty of taking snapshots of the data and of managing recovery from disaster consistently. And there is the challenge of recombining data dispersed among different providers into a single view of the truth.

Related to this point is the question of how to ensure the destruction of sensitive but old data and how to handle data destruction audits. How does an organization now confirm that data has been destroyed according to its policies? And when auditing data destruction, it must be possible to audit the data paths too—tracking all of the places where the data may have left a trace along the way.

Our prediction is that these factors will lead to new value-added services from cloud providers or will become part of future service-level agreements (SLAs) that differentiate cloud providers.

Distribution will also affect data quality. How can we effectively detect duplication and inconsistent data when it is distributed across different silos, vendors, and providers? And how can we tell when data duplication is beneficial—and should be planned for—as opposed to accidental and potentially unsafe? It is therefore necessary for IT leaders to reframe the whole concept of data quality more broadly around the idea of data value and utility.

We believe that more and more organizations will come to see data as something that can bestow competitive advantage, and begin to view application services and algorithms as utilities that can be procured off the shelf. In other words, the roles of application and data will be reversed, with data becoming the platform that supports application services.

Think data utility, not just data quality

The concept of data quality will soon give way to the idea of data utility, which is a more sophisticated measure of fitness for purpose. This will get IT departments away from often-fruitless discussions about the cleanliness of data and toward productive talks about what can be done with the data on hand. Importantly, it will allow them to apply semantic and analytic tools to extract useful insights from inaccurate data and to integrate data silos more easily.

We characterize data utility in eight ways:
• Quality: Data quality, while important, will be one of the many dimensions of data utility.
• Structure: The mix of structured and unstructured data will have a big impact, and will vary from task to task.
• Externality: The balance between internal and external data will be important, with implications for factors such as competitive advantage (external data may be less trustworthy, yet fine for certain analysis or tasks).
• Stability: A key question is how frequently the data changes.
• Granularity: It is important to know whether the data is at the right level of detail.
• Freshness: Data utility can be compromised if a large portion of the data is out of date.
• Context dependency: It’s necessary to understand how much context (meta-information) is required to interpret the data.
• Provenance: It is valuable to know where the data has come from, where it has been, and where it is being used.
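As a rough illustration of how these eight dimensions could be made operational, the Python sketch below encodes them as a profile that each task tests against its own thresholds. The field names, scoring, and example values are our own invention, not a standard.

```python
# Illustrative only: one way to make data utility ("fitness for purpose")
# concrete enough for a task to evaluate.
from dataclasses import dataclass

@dataclass
class DataUtilityProfile:
    quality: float            # 0..1, cleanliness/accuracy
    structured_share: float   # fraction of records that are structured
    external_share: float     # fraction sourced outside the enterprise
    stability_days: float     # typical interval between changes
    granularity_ok: bool      # right level of detail for the task?
    fresh_share: float        # fraction of records that are up to date
    context_needed: int       # how much meta-information interpretation needs
    provenance_known: bool    # origin and data paths are traceable

def fit_for(profile, min_quality, min_fresh, needs_provenance):
    """Fitness for purpose: each task states its own thresholds."""
    if needs_provenance and not profile.provenance_known:
        return False
    return (profile.quality >= min_quality
            and profile.fresh_share >= min_fresh
            and profile.granularity_ok)

marketing_feed = DataUtilityProfile(0.7, 0.2, 0.9, 1.0, True, 0.8, 2, False)
# Trend analysis tolerates noisy external data; an audit task would not.
print(fit_for(marketing_feed, min_quality=0.6, min_fresh=0.7,
              needs_provenance=False))   # True
print(fit_for(marketing_feed, min_quality=0.95, min_fresh=0.9,
              needs_provenance=True))    # False
```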
Action step
Begin to reframe IT’s perspectives around the idea of data platforms—and start the conversations and workshops that enable those perspectives to take hold quickly.
Analytics Is Driving a
Discontinuous Evolution from BI
Analytics is emerging as a major differentiator and value creator for businesses. But
to reap the real benefits, companies must see analytics as a discontinuous change
that will involve several different architectures and deployment models.

Analytics drives insights; insights lead to greater understanding of customers and markets; that understanding yields innovative products, better customer targeting, improved pricing, and superior growth in both revenue and profits.

That’s why farsighted companies are viewing analytics as essential for creating value. In contrast, their peers who think about analytics only as a simple extension of business intelligence (BI) are severely underestimating the potential of analytics to move the needle on the business. For one thing, they overlook the fact that traditional BI does not address the wealth of unstructured data that is now available.

So what does the future of analytics look like for IT organizations? First of all, despite a steady drumbeat calling for the integration of data across an organization, there will be no such thing as an integrated analytics platform, technology, or deployment model. The emergence of technologies such as cloud computing is changing how data is generated, collected, and stored across an organization. In practice, this will require a distributed approach to analytics.

This distributed approach will require different ideas about who is best at doing what. In general, we expect to see some companies sourcing from third parties the deep analytics skills required for, say, customer segmentation, route planning or process optimization, and keeping in-house the even deeper skills for the interpretation of the results.

As analytics becomes integrated into the underlying technology platforms, one existing challenge will ease. ETL (extract, transform, and load), the process of retrieving data and preparing it for analysis, has traditionally been the most time-consuming element of any analytics project. But ETL will become easier as data quality tools improve, analytics applications become more tolerant of “noisy” data, and ad hoc capabilities are replaced with integrated platforms.

However, new challenges will appear. As distributed data becomes the new normal, we will see the emergence of distributed ETL—that is, the need to extract data from multiple on-premise and off-premise platforms in order to run centralized analysis. Call it the price of progress.
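A minimal Python sketch of that pattern, with invented source names and record layouts: extract from one on-premise and one off-premise source, transform to a common shape, and load into a single store for centralized analysis.

```python
# Illustrative "distributed ETL": pull from several platforms, normalize,
# and load centrally. Sources and fields are fabricated for the example.

def extract_onpremise_erp():
    return [{"sku": "A1", "units": 10, "region": "EU"}]

def extract_cloud_crm():
    # In practice this would be an API call to the off-premise provider.
    return [{"SKU": "A1", "Units": "5", "Region": "US"}]

def transform(record):
    """Normalize field names and types across heterogeneous sources."""
    lowered = {k.lower(): v for k, v in record.items()}
    lowered["units"] = int(lowered["units"])
    return lowered

def load(warehouse, records):
    warehouse.extend(records)

warehouse = []
for source in (extract_onpremise_erp, extract_cloud_crm):
    load(warehouse, [transform(r) for r in source()])

# Centralized analysis over data that originated in many places:
print(sum(r["units"] for r in warehouse))  # 15
```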
The quest for closed-loop nirvana continues

The ultimate goal for analytics-savvy organizations is complete integration of analytics with business process automation, leading to a true “closed loop” capability that integrates analytics with automated responses to the results of the analysis. While this analytics nirvana won’t be achievable anytime soon, we will begin to see less complex and more pragmatic levels of integration between analytics and business logic embedded in IT systems.

Leading IT organizations will go through a progression, moving from traditional BI (reporting) to business activity monitoring (BAM) to measure specific business activities and report business metrics. From there they’ll proceed to predictive analytics, in which business rules and processes are adjusted to address business changes – a peak sales period, for example, or a market disruption.

This move to predictive analytics will drive the use of analytics to acquire new data to fill knowledge gaps – which will further improve analysis and decision-making. Think of how the combination of additional data can provide deeper context to improve the quality of analysis. A sales team for a retailer could compare regional point-of-sale data with local weather, for example, to gain better insights into customer behavior.

To truly take advantage of analytics, businesses need to integrate their analytics capabilities into their business rules and processes to connect the relevant insights across all stages of decision-making. Business processes involve a series of discrete decision points: demand prediction, pricing, promotion, etc. Today, businesses tend to apply analytics to these decision points in isolation. While this approach can certainly improve decisions at each step, it can also lead to problems.

Consider, for example, the well-known bullwhip effect, which shows that even if each stage of a process is optimized based on the data that’s available at that stage, the overall process will still be suboptimal because different decision points are not coordinated and do not have visibility into the entire process. Because every decision in a process generally is predicated on the outcomes of prior decisions, analytics must integrate all decision points to provide an understanding of the larger decision process.

As decisions across the process become apparent, companies can eliminate undesirable side effects and optimize business processes enterprise-wide, leading to better results. The best way to head down this path is to start decoupling processes and rules from applications to define the primary decision points.

The need for analytical literacy

The growing sophistication of analytics capabilities and supporting technologies will open up the risk of “oversteering”—of making increasingly frequent, fine-grained decisions. That’s especially so because business users will be tempted to “get value” from these powerful tools. However, too-frequent optimization can be counterproductive when the decision-making time scale is not appropriate for the process to which it is applied. For example, just because a utility can gather real-time data on fuel (e.g., oil, gas, coal) pricing from the markets doesn’t mean it should be changing its generation mix every 30 seconds – especially if its business processes are tuned for long-term fuel contracts.

So how can companies avoid analytics-induced course corrections that do more harm than good? Effective use of analytics will require considerable analytical literacy. Remember that real-time data does not necessitate real-time decisions. Decision-making with analytics requires an understanding of the sampling rates of different events and their interdependencies, because decision-making must be consistent with the time scale of data. The business process should dictate the analytics; not vice versa.

Brute-force improvements

Don’t expect sophisticated, handcrafted analytical models to drive performance improvements in analytics. A more likely driver: brute-force computational power applied to larger data sets.

Traditionally, analytics solutions have been constrained by computing performance and limited availability of data. Improvements often stemmed from meticulous program optimization, better algorithms, simplified assumptions, and other methods for milking limited data for all it was worth.

Increasingly, however, analytical improvements are coming from the availability of greater computing power applied to more data. Utilizing machine learning techniques instead of handcrafted rules, analytics teams will gain the scale needed to match the increasing complexity of business problems.
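A hedged illustration of that shift, using scikit-learn and a fabricated six-row data set: the handcrafted rule stays frozen, while the learned model can keep improving as more data and compute are applied.

```python
# Illustrative only: replacing a handcrafted rule with a learned model.
# The data set and feature choices are invented for the example.
from sklearn.linear_model import LogisticRegression

# A rule a team might have written by hand:
def rule_based_churn(calls_to_support, months_active):
    return calls_to_support > 3 and months_active < 6

# The learned alternative; features are (calls_to_support, months_active).
X = [[5, 2], [4, 3], [0, 24], [1, 18], [6, 1], [2, 30]]
y = [1, 1, 0, 0, 1, 0]  # 1 = churned
model = LogisticRegression().fit(X, y)

# More data and compute improve the model; the rule stays frozen.
print(model.predict([[3, 4], [1, 36]]))  # e.g., [1 0]
```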
Action step
Determining the right approach to analytics involves many critical decisions; IT executives should work closely with business leaders to identify where analytics and insights can be leveraged most effectively as well as the proper mix of services required to optimize analytics capabilities across the enterprise.
Cloud Computing Will Create
More Value Higher up the Stack
The current focus on infrastructure cloud doesn’t help organizations differentiate
themselves. Together, SaaS and PaaS rather than IaaS will enable IT to create value
through a combination of cost reduction, speed to market, agility, and the ability to
gracefully integrate business processes with partners and suppliers.

There’s no denying the momentum of cloud computing. Accenture’s research shows that enterprises are already moving applications into the cloud. 1,2 The demand is anything but an IT fad; it is coming from a host of business functions. And it is truly a global phenomenon; companies everywhere from Brazil to China are moving ahead rapidly with adoption. It’s clear that IT and business executives should expect cloud computing to become ever more pervasive – to the point that the term “cloud computing” itself becomes superfluous.

But what’s needed now is a shift in thinking from obvious but non-differentiating benefits such as cost reduction through cloud infrastructure to where the cloud will have its real impact. When we look at the different facets of cloud computing – Infrastructure-as-a-Service (IaaS), Software-as-a-Service (SaaS), Platform-as-a-Service (PaaS), and so on – it is easier to see that most of the current emphasis on cloud is focused on the lower levels of the technology stack. For many large enterprises, the logical next step after virtualizing their data centers has been to leverage IaaS to augment those centers.

However, IaaS is becoming a commodity. We see much greater value in SaaS and PaaS, higher up the stack. That is where tomorrow’s leaders will find real differentiation. Hybrid clouds – i.e., SaaS and PaaS in combination with internal applications – will help organizations accomplish tasks that they cannot accomplish today and will cement IT’s role as a driver of business growth.

The reason? Unlike IaaS, which is driven mostly by cost-management objectives, the migration of applications and application development to the cloud will be based more on business need. SaaS-based applications reduce concerns about the infrastructure and help organizations get to market quickly. When SaaS also provides a platform, it makes it easier to customize and to tap into an expanding ecosystem of third-party applications. Service providers will begin to provide targeted vertical solutions as a way to entice more enterprises to move their applications to the cloud.

Hybrid solutions will emerge as the dominant model

Cloud computing deployments will take many forms. More organizations will deploy virtualized desktop infrastructure (VDI) for high-security or highly standardized desktop environments to manage sensitive data centrally and keep it out of the individual’s control. Private clouds will also be utilized for development and testing, along with “transient” applications, such as product demonstrations for customers, that can be set up and retired quickly. Public clouds will be leveraged more for non-differentiating applications or for “cloudbursting” – a computing-on-demand model for processing heavy, short-term workloads.
But we expect hybrid clouds – mixing public and one or more private services – to emerge as the dominant model in most enterprises. As data and services are spread across a variety of service providers, hybrid models will provide the best balance of flexibility while managing risk. In such an environment, IT will focus on orchestrating the business process while treating everything else as a service. This will force service providers to compete less on price and more on how well they can differentiate their offerings based on factors such as quality of service and robust application catalogs.

Economics of the cloud: a new game

As cloud services proliferate, conversations about cost will have to change. We expect the shift to move from the cost of discrete IT components to a discussion about the total cost of ownership (TCO) of cloud solutions. Although IT TCO constructs are well known and well accepted, measuring TCO in the cloud is a mystery. It’s fair to say that currently, there are no recognized models for cloud TCO. The current emphasis is on economies of scale from volume, automation, commoditization, and consolidation.

But cloud hosting is not merely a summation of compute hours, storage bytes, and network bandwidth. There are many implicit cost elements, involving quality of service, staff skill requirements, the granularity of licensing costs and charge backs, the unknown costs of skills loss, the implicit and explicit insurance cost to offset downtime, and the unpredictability of operating expenses compared with capital expenses.

In the future, companies will need to examine the full life-cycle cost when considering public, private, or hybrid cloud services. For example, a short application life span may not warrant an investment in supporting infrastructure, making a case to move the application to the cloud. Load elasticity is another consideration; the ability to reduce utilization when demand is low makes costs variable as well. Cost decisions will need to be built into architecture planning and provider selection – traditionally the realm of architects, not finance professionals. As a consequence, the technical sourcing of cloud providers will become a new skill that organizations have to acquire and master.
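The toy Python calculation below shows the shape such a full life-cycle comparison might take, folding in a few of the implicit cost elements named above. Every figure is a fabricated placeholder; the point is the structure of the comparison, not the numbers.

```python
# Illustrative only: a toy TCO comparison. All amounts are made-up
# placeholders, not benchmarks or recommendations.

def cloud_tco(years=3):
    compute = 12_000 * 12 * years        # metered compute/storage/network
    migration = 40_000                   # one-time move and integration
    retraining = 15_000 * years          # staff skills for the new model
    downtime_insurance = 8_000 * years   # implicit cost of provider outages
    return compute + migration + retraining + downtime_insurance

def onpremise_tco(years=3):
    hardware = 250_000                   # capital expense
    operations = 60_000 * years          # power, space, admins, licenses
    return hardware + operations

print(f"cloud: {cloud_tco():,}  on-premise: {onpremise_tco():,}")
```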
Many technical and business challenges remain

Before the full potential of cloud computing can be realized, companies and their service providers have plenty of technical and business hurdles to cross. Technologically, IT teams will have to develop strategies for implementing and managing federated identities, which are necessary for enabling consistent permissions, roles, and traceability across multiple service providers. Organizations’ IT leaders will have to work with cloud service providers to determine the right federation rules for their communities.

Federated identities are but one of many technical challenges. Concerns for providers include: platform-level version management activities (the ability to manage local patches, versioning, and upgrades without introducing risk to cloud consumers); and richer application programming interfaces (APIs), which are necessary for more sophisticated cloud applications. Concerns for users involve: consistent policy enforcement across cloud providers; access to event meta-data, which is critical for analytics; governance over applications and data that is stored in third-party data centers; better fault tolerance for application architectures; and network latency.

In addition to the technical problems, several business issues in deploying cloud solutions need to be solved as well. IT teams will have to ensure that any cloud services contain facilities for e-discovery and disaster recovery – processes made more complex by the distributed nature of cloud-based information. They will need new processes and procedures for tracking user activities and data paths across cloud-based and on-premise systems. While license management may become easier, SLA management will become far more complex when incorporating cloud providers into the mix.

Action step
Deploying infrastructure cloud today should not distract from planning for the transformational, differentiating opportunities that SaaS/PaaS offer tomorrow. The main focus should be on developing a cloud strategy that drives business transformation by delivering increased functionality and flexibility using a mix of public and private cloud-based application and platform services.

The lock-in problem

Cloud-based solutions have low barriers to entry. There is no procurement or development lead time; getting into the cloud needs nothing more than a credit card.

But getting out of the cloud will be a lot harder than getting in—especially with SaaS. SaaS vendors have their own styles for implementing data models, meta-data, user group administration, etc. The same issues apply to security domain models, proprietary scripting and markups, and proprietary meta-data schemas. So taking all of the data back will require a full IT migration project.

Cloud lock-in can occur along several dimensions: app stores, data migration schema, and skills. Further, if the cloud service provider finds itself in trouble, needing to raise prices to stay in business, its customers will have little bargaining power because they need the provider to remain viable for the sake of their own business continuity.

So the selection of a service provider must come with the realization that a shift to another provider will call for re-implementing the application or expending considerable effort to migrate to a new provider.
Architecture Will Shift
from Server-centric to
Service-centric
More mature tools, frameworks, and methodologies will pave the way toward
process-centric IT. Rapid advances in infrastructure will drive a new generation of
system architecture that will be reconfigurable at runtime to operate on different
infrastructures.

Information technology is evolving from a world that is server-centric to one that is service-centric. Companies are quickly moving away from monolithic systems that were wedded to one or more servers toward finer-grained, reusable services distributed inside and outside the enterprise.

The evolution is being driven by the ongoing maturation of supporting tools, frameworks, and methodologies. There is still much to be done to decouple infrastructure, systems, applications, and business processes from one another. This shift has major repercussions for all levels of the enterprise architecture stack, from infrastructure to applications. Decoupling will enable components to operate independently while making software architectures reconfigurable during run time to adapt to various environments and design objectives, which will increase the flexibility of application deployment and maintenance. Although dynamic reconfiguration is not a new concept in academia, advances in cloud technology at all layers of the stack create a burning platform for such architecture.

Dynamic reconfiguration will take place at the business process level – in which processes can switch to an alternate service provider in response to an outage or promotion – and at the infrastructure level, in which servers or nodes can be added through cloudbursting and similar techniques to handle temporary, peak processing needs, or discrete projects. In the future, business processes, not technology functionality, will dictate how or when you use these scaling mechanisms. Business rules will help make this a reality.

The most interesting part of this movement is that infrastructure – commonly viewed as a constraint to application functionality – will become the primary driver of architectural change. Infrastructure (e.g., computing, storage, and network connectivity) and applications have always been tightly coupled, even though their architectures evolve at dramatically different paces. The life span of an enterprise application is measured in decades – most mature companies have 45 years or more of intellectual property locked up in custom-built applications and data structures 3 – while dozens of start-ups promising infrastructure innovations pop up every month.

As a result, applications that are tightly coupled to their infrastructures often restrict access to new innovations in infrastructure. An application that relies on a single instance, for example, cannot take advantage of horizontal scaling technologies.

The case for decoupling

Why is it imperative that the application architecture be decoupled from infrastructure? Application design typically makes assumptions about certain characteristics of the infrastructure required to run it.
These assumptions tie the application architecture to the architecture of the infrastructure for which it was originally designed – assumptions that must be preserved even as the infrastructure improves. This is the curse of backward compatibility, in which support for new functionality must be sacrificed for the applications to continue to run reliably.

New architectures, on the other hand, allow – or even require – companies to relax many long-standing assumptions about application and infrastructure. For example, no longer will developers be restricted to a single box with fixed capabilities, a single instruction or processing stream, pure vertical scaling, or the co-location of processing and data. Instead, infrastructure is being provided as services that can be chosen, procured, and configured based on application requirements.

We are already seeing horizontal scaling at the chip level, the grid architecture level, and the server level. Parallelism and new processing mechanisms such as MapReduce, non-relational databases, virtualization, and fabric computing are rapidly becoming mainstream today. There is a reverse trend as well, as evidenced by the rise in special-purpose appliances where complete stacks have been collapsed into an appliance, effectively creating a service in a box. For example, SAP recently released SAP HANA (High-Performance Analytic Appliance), optimized for in-memory execution of applications such as its BusinessObjects offering. 4

IT organizations have been excited by the agility that service orientation, business process management, and other emerging technologies have provided to date. But they are seeking even greater agility. Innovations in lower levels of the stack require flexibility in the higher levels to deliver the true potential of service-centric IT.

In this new paradigm, applications will be immune to any underlying changes in data representation or infrastructure, resulting in scalability, increased performance, and cost reduction thanks to the adoption of new advances in infrastructure. This will speed the adoption of new technologies and enable companies to upgrade or migrate at will.

To fully reap the benefits of cloud-based infrastructure and service orientation, system design will have to adhere to well-established best practices that will enable systems to be dynamically reconfigurable. Some examples, illustrated in the sketch below, include: using parallelization frameworks for flexible scaling; avoiding explicit assumptions regarding service configuration in the service design; using intermediaries like a service bus to avoid direct communication between services and data stores; and isolating the stateful and stateless components of an application from each other.
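Here is a minimal Python sketch of two of those practices, with invented names: configuration is resolved at run time rather than assumed in the design, and the stateful component is isolated behind an interface so the request-handling logic stays stateless and horizontally scalable.

```python
# Illustrative sketch: isolating state and externalizing configuration.
import os

class SessionStore:
    """The stateful component, isolated so it can live anywhere
    (in-process today, a shared cache or database tomorrow)."""
    def __init__(self):
        self._data = {}
    def get(self, key):
        return self._data.get(key, 0)
    def put(self, key, value):
        self._data[key] = value

def make_counter_service(store):
    """Stateless request handler: all state goes through the injected store,
    so any number of identical replicas can run behind a load balancer."""
    def handle(user_id):
        count = store.get(user_id) + 1
        store.put(user_id, count)
        return count
    return handle

# Configuration is resolved at run time, not hardwired into the design:
replicas = int(os.environ.get("SERVICE_REPLICAS", "2"))
shared_state = SessionStore()
services = [make_counter_service(shared_state) for _ in range(replicas)]

# Requests can hit any replica and still see consistent state:
print(services[0]("alice"), services[1]("alice"))  # 1 2
```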
Ultimately, this means that a large part of what used to be hardwired into the system during design must be made configurable at run time so the system can adapt to the highly dynamic operating environment. The system may have to be designed around the possibility of more frequent failures, requiring more attention to managing state in order to recover from failures with a consistent state.

Innovations must trickle up the stack

Going forward, architectural heterogeneity will only increase as hybrid clouds, distributed data, parallel algorithms, non-relational databases, service-centric applications, and specialized appliance-based applications all coexist, and new processing paradigms emerge. Application architectures will continue to be decoupled from infrastructure and from business processes, resulting in applications that are self-describing, self-correcting, self-scaling, and self-modifying.

As layers of the architectural stack are decoupled from one another, languages and notations for formal communication between the layers will become necessary. Today, an advanced application might have logic embedded in it that triggers other actions as the application approaches peak capacity, for example. Tomorrow, communications between layers will alert higher levels of the stack to the fact that the application is approaching capacity, and a combination of business rules and business processes will handle the issue rather than the infrastructure layer making a decision in isolation.

Better inter-layer communications will enable dynamically reconfigurable architectures to be self-monitoring – meaning that they can generate events based on changes to the environment. These changes may take many forms. They may involve new demand—for instance, when the infrastructure is reaching capacity, as would be the case if a cloud provider has an outage. They could be configuration changes, such as new capabilities in underlying services, or organizational changes, as in the case of adding a new outsourcing provider. Business processes will have to respond to these events appropriately: by executing existing services differently or switching to new services, by invoking different business rules for evaluating decisions, or by modifying the underlying business process itself.
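A simplified Python sketch of that division of labor, with invented event types and rules: the infrastructure layer only reports conditions, and business rules at a higher level decide the response.

```python
# Illustrative sketch: infrastructure emits events; business rules respond.
RULES = []

def rule(predicate):
    """Register an action to run when an event matches the predicate."""
    def register(action):
        RULES.append((predicate, action))
        return action
    return register

@rule(lambda e: e["type"] == "capacity" and e["utilization"] > 0.9)
def burst_to_cloud(event):
    return f"cloudburst: add nodes for {event['service']}"

@rule(lambda e: e["type"] == "outage")
def switch_provider(event):
    return f"reroute {event['service']} to standby provider"

def on_event(event):
    """Higher layers evaluate the event; infrastructure decides nothing alone."""
    return [action(event) for predicate, action in RULES if predicate(event)]

print(on_event({"type": "capacity", "service": "orders", "utilization": 0.95}))
print(on_event({"type": "outage", "service": "billing"}))
```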
The approach to system and enterprise architecture will fundamentally change as we go from applications that are tightly coupled to infrastructure to those that are infrastructure-agnostic and, eventually, infrastructure-aware. Ultimately, applications will control the infrastructure instead of being constrained by it. These types of dynamically reconfigurable services are a key element of the next generation of software architecture that will increase IT’s agility – and thus boost its ability to innovate and deliver business value.

Software engineering becomes more science than art

Watch for software development to become a lot more predictable and easier to measure and manage. New and highly sophisticated tools, exploiting analytics and standardized instrumentation, are already becoming integral to the tooling used by leading developers.

Instrumentation is starting to provide richly detailed information across the entire development process. And analytic techniques, applied across multiple teams and projects, will allow longitudinal tracking and benchmarks against which new projects can be measured and evaluated.

Examples of tools that enable instrumentation and analytics include Rational Jazz, a framework for connecting to underlying tools such as ClearCase and Subversion; Microsoft Visual Studio’s Team Foundation Server; and Hackystat, an open source instrumentation framework.

These advanced tools make it possible to measure software development processes in systematic, consistent ways. As a result, software project managers will be able to estimate and plan development projects more precisely and with more predictability—for example, tracking team and individual productivity, or monitoring the quality of code over time. And when necessary, the new approaches will help managers to take corrective action much earlier.

Action step
Explore ways to begin decoupling applications from infrastructure as part of your life-cycle management strategy.
IT Security Will Respond
Rapidly, Progressively—and
in Proportion

In the past, IT has architected everything around the idea of “100 percent” security.
This fortress mentality must now give way to a realistic and practical approach to
IT security. What’s needed now is a cascaded, reflex-like security architecture that
responds proportionately to threats when and where they happen.

There is no such thing as watertight IT security. Yet for years, business and technology leaders have acted as if the only alternative to a “fully secure” state is an unacceptable “fully breached” state.

This “fortress mentality” is outdated—and no longer realistic or practical. Leading security specialists are devising reflex-like systems whose responses step up with the severity of the breach. In extreme cases, counterattacks may even become part of an organization’s repertoire of responses.

We believe that new security solutions and architectures will, like human reflexes, respond instinctively to the growing speed, scale, and variety of attacks. This implies that for the first line of defense, people will not be part of the decision loops; the speed and frequency of attacks dictate that human responses must make way for automated capabilities that detect, assess, and respond immediately. And the increasing “attack surface”—across more devices, more systems, more people, more business partners, and broader physical infrastructure—supports the case for automated capabilities that detect, assess, and respond to external threats immediately.

Leading organizations will understand the consequences of inevitable data leaks. For example, they will know that the leak of data about a major retailer’s transportation routes does not automatically mean that rivals can replicate the retailer’s supply chain. They will be able to think beyond the simple binary notion that their organizations are either secure or have been breached.

Those organizations will know that different levels of attacks require different speed, scale, and types of responses. A cascade of responses might, for example, involve the immediate shutdown of one portion of a network coupled with active monitoring, which, if it detects that the threat is large or moving fast, shuts down other parts of the network. More sophisticated responses will also place a premium on using intelligence-gathering and forensics techniques to learn about adversaries.

Overall, it will be essential to step up the organization’s collective understanding of what “security” really means. That will involve moving away from simple low, medium, and high brackets and assessments at the level of the individual object and toward the security of a network of interconnected objects. In essence, security will become a fluid continuum across the network.

The bottom line: Automation will quickly become a “must-have” component in the overall security strategy of every IT organization. There is simply no other way to detect threats swiftly enough, let alone to contain the damage and recover from it.
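To make the cascade concrete, here is a minimal Python sketch of a proportionate, automated response policy. The thresholds, segment names, and actions are invented placeholders, not recommended settings.

```python
# Illustrative sketch of a cascaded, proportionate security response:
# containment widens only as the observed threat grows.

def respond(threat):
    """Escalate in steps; humans are looped in only above the reflex tier."""
    actions = []
    if threat["confidence"] < 0.3:
        actions.append("log and keep watching")
        return actions
    actions.append(f"isolate segment {threat['segment']}")   # local reflex
    actions.append("enable active monitoring")
    if threat["spread_rate"] > 10:     # hosts/minute: large or moving fast
        actions.append("shut down adjacent segments")
        actions.append("start forensics capture")
    if threat["spread_rate"] > 100:
        actions.append("page security team")                 # humans join late
    return actions

print(respond({"segment": "dmz-2", "confidence": 0.8, "spread_rate": 40}))
```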
Teaming up to fight back

In extreme circumstances, it may be necessary to actively fight back. Given the speed and scale of today’s cyber-attacks and the growing significance of the inflicted damage, organizations can no longer stay only in defensive mode. Cyber crime accounts for an estimated $1 trillion in annual losses to businesses around the world. 5 Counterattacks, then, are likely to become part of the strategy for augmenting IT security. Organizations are likely to collaborate to counterattack their assailants, not least because few organizations have the resources needed to defend themselves.

Efforts are underway to develop the systems that can enable countermeasures. One example: Sypris Electronics recently unveiled plans to create an international “cyber range.” The company hopes the “range” will become the preferred practice battlefield for digital warfare where military, government agencies, and, later, businesses running critical infrastructure services can test their defensive – and offensive – firepower against cyber enemies. The range is expected to be operational by the first quarter of 2011. 6

In parallel, offensive “weapons” are being explored and tried out. Researchers successfully demonstrated counter-hacking techniques at the 2010 Black Hat security conference. They reverse-engineered a bot and looked for flaws in the bot itself to exploit, in the same way that the bot uses flaws in commercial software. From there they were able to retrace and, in theory, take control of the botnet itself. 7 At the same time, Microsoft and others have partnered to attack malware spread by registering the domain names necessary for continuing the exploits. A pseudorandom technique generates a large number of domains per day that the worm looks for in order to connect to the command-and-control structure. The team is either pre-registering, or at least flagging, those domains to block the spread of malware. 8
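The following Python sketch illustrates, in simplified form, how such a pseudorandom domain scheme can be turned against the malware: once the generation algorithm is reverse-engineered, defenders can compute tomorrow's rendezvous domains ahead of time. The hash choice, counts, and .example.com suffix are invented for this illustration and are not the algorithm of any real worm.

```python
# Simplified domain-generation-algorithm (DGA) illustration: worm and
# defenders derive the same domain list from the current date.
import hashlib
from datetime import date, timedelta

def domains_for(day, count=5):
    """Both sides can compute this list once the algorithm is known."""
    names = []
    for i in range(count):
        seed = f"{day.isoformat()}-{i}".encode()
        digest = hashlib.md5(seed).hexdigest()
        names.append(digest[:12] + ".example.com")
    return names

# Defenders generate the next day's rendezvous points ahead of the malware,
# then pre-register or flag them:
tomorrow = date.today() + timedelta(days=1)
blocklist = domains_for(tomorrow)
print(blocklist)
```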
The importance of being able to prove you’re you

Identity will become even more important in the future. And biometrics—already established as a security tool—will augment and steadily replace other methods of identification and authorization. The importance of identity will drive biometrics adoption in two very different directions: high-security uses in government and business, and convenience-driven uses for average individuals.

The catalysts for these twin tracks are clear. There is certainly a growing emphasis on identity for consumer and enterprise needs; mobility, health, and e-commerce all require strong forms of authentication in the face of increasing security threats. At the same time, biometrics solutions are becoming more affordable as deployment of high-value solutions brings down unit costs, enabling other business processes to take advantage of the technology. In short, organizations will have more options for acquiring and implementing biometrics at differing investment levels.

In the high-security realm, we expect biometrics to evolve from a high-end, James Bond–type specialist technology to the primary tool for high-volume applications with a strong security requirement – for instance, border control, voting, or police applications. We also anticipate that biometrics solutions will proliferate in the private sector, used everywhere from online banking and payments to securing electronic medical records and to help prevent clinical errors, prevent fraud, and protect patient confidentiality.

For consumers and individuals, biometrics will be used increasingly in less security-centric domains and developing countries to access benefits or services, from healthcare or food subsidies to unemployment benefits or banking. They will also help make things more convenient in low-security situations—for example, providing easier access to a gym or library. As device costs continue to fall, the technology will be increasingly integrated with products and processes – for fingerprint unlocking of laptops, for instance.

As a consequence, we predict that average citizens will soon start to see biometrics less as an intrusion on their privacy and more as a means of enhancing their privacy—securing their bank accounts or health records, for example.

The idea of counterattacks is fraught with policy and governance headaches. To begin with, the amorphous nature of the Internet makes it very easy for attackers to hide – and makes it equally easy for counterattacks to harm the innocent. Jurisdictions will complicate matters further. Whose laws apply if a Japanese company counterattacks a Latvian hacker who is using botnet computers in Norway?

Making sure every router and every chip is secure

We also believe that IT security will soon expand beyond just securing information systems to securing critical business processes. Think of this as a continuation of the conversation about closer alignment between IT and the business, so that IT staff have a better understanding of what’s important about key business processes and are able to quickly identify the security vulnerabilities of those processes and build defenses against them.

This more holistic view will use “integrity measurement” tools to reach into every corner of the IT realm. The tools will be used to gauge the trustworthiness of everything from processor chips to software to smartphones to servers and entire clouds. Already there are warnings that most cyber attacks are enabled by programming errors—and there are consequent calls from security experts for technology vendors to work within a strict set of software development security standards.

Counterfeit IT products

This far-reaching view of security extends to the proliferation of counterfeit products. As far as IT security is concerned, a worrying question is: “If we do discover fake products, how do we know they aren’t exporting data to a hacker cartel somewhere?” The counterfeiting problem is very real. For example, fake products—particularly counterfeit network equipment—have been the target of enforcement initiatives across several countries, leading to seizures worth more than $100 million.

More and more enterprises will shift a large part of their security spend to training and cultural change programs. The motivation is as profound as it is trivial: the recognition that people are the weakest link when it comes to security. We expect that organizations will make “security fluency” central to their corporate culture; in those organizations, even job interviews are likely to involve ways of gauging candidates’ security consciousness.

There is one other facet of IT security that is worth touching on. We expect leading organizations to revisit their approaches to federation of identity—to start thinking in terms of federated identity that extends beyond the boundaries of any one organization so that an employee or contractor’s identity can follow him or her from place to place.

Action step
Stop thinking in terms of watertight security—there is no such thing. Instead, begin planning for cascaded, reflex-like security systems that rely heavily on automation to respond immediately and locally—and then step up their responses as the severity and scope of the threat increases.
Data Privacy Will Adopt a
Risk-based Approach
Complete data privacy is a myth—all the more so in the WikiLeaks era.
Leading organizations already know that. They will be attuned to regulations
governing privacy and will develop a risk-based approach to data privacy.

In an age when WikiLeaks has become a household name, every business leader is right to be even more paranoid about data privacy. Just as leading organizations now realize there is no such thing as 100 percent IT security, so complete data privacy is being exposed as a myth.

In one study, the Wall Street Journal assessed and analyzed the cookies and other surveillance technology that companies use on the Internet. The study found that the nation’s 50 top web sites on average installed 64 pieces of tracking technology onto the computers of visitors, usually with no warning. A dozen sites installed more than a hundred each. 9

If you think privacy protection is important today...

It will not be enough simply to accept the reality of data leaks. It will require very proactive responses from organizations to understand the risks surrounding the use and misuse of personal data. And it will require constant vigilance because things are changing so fast.

To begin with, it will call for close attention to regulation—worldwide. Just one example: Authorities recently found that Google committed a serious breach of the U.K.’s Data Protection Act when its Street View mapping service collected personal information from unsecured wireless networks in England. 10 The U.K. was not the only nation whose privacy policies Google violated.

We predict that individual privacy will take center stage as a result of increased regulation and policy enforcement. Privacy outcries are getting louder. And governments are becoming considerably more active in enforcing compliance and investigating the flexibility of current policies in adjusting to emerging capabilities and business models.

The public is clearly becoming more sensitive: November 9, 2010, marked the two millionth consumer complaint filed with the Internet Crime Complaint Center (IC3) in response to suspected or actual online criminal activity. This milestone is especially notable because it took seven years for the IC3 to receive its first million complaints, between May 2000 and June 11, 2007. The second million arrived in less than half the time – just under three and a half years. 11

The privacy challenges may well become even more burdensome. In the United States, some politicians are proposing to fine technology companies up to $100,000 a day unless they comply with directives imposed by the U.S. Department of Homeland Security. The new bill is called the Homeland Security Cyber and Physical Infrastructure Protection Act (HSCPIPA). 12
Privacy by design

At the same time, the concept of “privacy by design” will become much more prominent; U.S. and European regulators expect technology companies to incorporate data privacy in the design of their products and services. But it will be some time before enterprises are rewarded for proactive privacy controls. The converse applies: They can expect to be punished for what is deemed to be poor privacy practices.

We expect that leading players will develop superior levels of understanding, enterprise-wide, about the distinctions between being a data processor—broadly, handling the personal data of others—versus being a data controller, thus lowering their risks of unwitting breaches of privacy regulations and perceptions of privacy breakdowns. We also expect the privacy exemplars to deploy the kinds of cascaded, reflexive, automated systems that the leaders will use as the backbones of their overall IT security strategies.

Action step
Given the difficulty of securing data long term, the questions to consider are how to plan the right responses to leaks, and whether the data should be created or acquired in the first place.
Social Platforms Will Emerge
as a New Source of Business
Intelligence
Social networks will evolve into “platforms” for reaching customers, tapping into
their social identity, and gaining information about them, and about competitors and
the market as a whole.

The rapid growth of social media has been eye-popping—especially so in the last few years. Facebook, founded in 2004, now has more than half a billion users and is spending heavily to accommodate more. Twitter’s service generates billions of tweets per month. Social networks are not just a product of and for the young consumer: Many of the world’s Internet users aged 50 and over are active users of social media. And increasingly, businesses and government organizations are using social media to connect their constituents in an effort to improve collaboration.

This is just the tip of the iceberg. The evolution of social media will continue to disrupt the way companies do business, posing new challenges to IT as it attempts to harness social media in the enterprise. The key driver of this change? The transformation of social networks into social platforms, each with its own ecosystem to fuel increasingly deeper levels of interaction.

Social platforms have three major dimensions: functionality, or the basic capabilities these platforms offer; community, or the groups of people who belong to them; and user identity, the unique name and associated information that characterizes an individual.

A virtuous cycle of growth

Only a small number of social communities will emerge as true social platforms with a large ecosystem of services built around them. We believe that social networks evolve through a virtuous cycle of growth: More features attract more customers, in turn attracting even more customers and making the whole ecosystem appealing for third parties to support with additional features, and so on.

Among the many candidates – Facebook, MySpace, Yahoo Groups, Google, Orkut, Twitter, LinkedIn, and Renren, to name some – we believe that Facebook has already made the final short list and provides a classic case study. By opening up its development platform, the company has encouraged third parties to build applications that augment its basic services. Facebook Connect, a mechanism that enables users to log into a number of other online communities with their Facebook identity, has made it doubly attractive for third parties to support and for users to join.

The cycle will eventually lead dominant players to squeeze out smaller networks and make it increasingly difficult for new social networks to join the space, and companies will look for ways to connect to these dominant platforms using APIs such as Facebook Connect and Google’s OpenSocial. Already, more than 250 million people are using Facebook Connect on third-party sites every month, and 10,000 new sites are adding Facebook Connect every day. 13 Companies are also “fishing where the fish are” by launching targeted communities inside Facebook’s walls, giving them access to the rich information and activities that flow through the platform.
Disintermediation is a good thing
One of the vaunted business beliefs has been that companies should own the relationship with their customer and never let any third party disintermediate between them and the customer. Social platforms will overturn that belief.

The rich history of information that individuals leave in social networks through their interaction with others will be a much more valuable form of identity – a “social identity” – than name, physical address, social security number, tax file number, driver’s license number, and other such isolated forms of identity. Through APIs, social identities can now be linked across the Web, providing a consistent and comprehensive view into individuals’ preferences, interactions with peers, and other activities.

For this reason, social identities will become much more valuable to businesses than getting an individual to register on the corporate web site. Social identity not only provides authentication (just like registration), but also a wealth of additional data about that person. That’s why more web sites, including leading media sites such as Reuters, CNN, and ABC, are allowing visitors to log in with various social accounts.
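
To make this concrete, here is a minimal sketch of the server-side half of a “log in with a social account” flow. The OAuth-style endpoints follow Facebook Connect’s Graph API conventions of the day, but the credentials, redirect address, and response parsing are simplified placeholders rather than a production recipe.

    import requests  # third-party HTTP client

    # Placeholder app credentials, issued when registering with the platform.
    APP_ID = "YOUR_APP_ID"
    APP_SECRET = "YOUR_APP_SECRET"
    REDIRECT_URI = "https://www.example.com/auth/callback"

    def login_url() -> str:
        # Step 1: send the visitor to the platform to authenticate and
        # grant access (URL-encoding omitted for brevity).
        return ("https://graph.facebook.com/oauth/authorize"
                "?client_id=" + APP_ID + "&redirect_uri=" + REDIRECT_URI)

    def fetch_social_identity(code: str) -> dict:
        # Step 2: exchange the code posted back to REDIRECT_URI for a token.
        # (Response parsing is simplified; the live API's format has varied.)
        token_resp = requests.get(
            "https://graph.facebook.com/oauth/access_token",
            params={"client_id": APP_ID, "client_secret": APP_SECRET,
                    "redirect_uri": REDIRECT_URI, "code": code})
        token = token_resp.json()["access_token"]

        # Step 3: the same token both authenticates the user and unlocks
        # profile data, the extra value over plain site registration.
        profile = requests.get("https://graph.facebook.com/me",
                               params={"access_token": token})
        return profile.json()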
A better source of business intelligence
The same wealth of information created by users and businesses in the social platform is also a valuable source of business intelligence. Think of it as an ongoing focus group, in which any interaction between users tells you something about your customers, the market, even your competitors. This customer intelligence – mined and analyzed at aggregate or individual levels – will help companies monitor their brands, develop more targeted promotions, and measure their performance more effectively against competitors.

The integration points that social platforms provide for this information will enable companies to communicate by design instead of by opportunity. Combinations of social platforms, devices, mobile apps, etc., mean that corporate web sites will lose their primacy as online destinations. As such, companies will begin placing less emphasis on search engine optimization and promotions designed to bring people to their sites, shifting their resources to programs for engaging users where they congregate online – that is, on social platforms.

Enterprises should be looking at these “social identity providers” to connect all of their interaction channels into a cohesive, multichannel customer experience. The winners will be those who recognize and serve both the short-term whims and the long-term goals of individuals and establish an ongoing relationship that transcends any single interaction.
Process-oriented collaboration inside the enterprise
Today, collaboration in enterprises is evolving from communication and channel integration (also known as unified communication) into process-based platforms where the underlying collaboration technology has knowledge of the business process in which the collaborating individuals are engaged and is specifically tuned to support it.

Architecturally, process-based collaboration will evolve in two distinct directions, making it necessary for IT organizations to develop clear and careful guidelines for when to adopt which direction. For mission-critical processes (say, CAD design in a high-tech company or software development in an IT organization) where the process burden outweighs the collaboration needs, niche vertical solutions that support the end-to-end process will be the preferred solution. However, for most simple processes, the collaboration burden will outweigh the process burden, making it both necessary and simpler to standardize on a corporate collaboration platform (such as Microsoft SharePoint or Lotus Notes) on which many specific processes can be implemented.

The pros and cons are obvious. Too many vertical solutions will lead to a proliferation of platforms that need to be licensed and supported, and users trained. Using a generic collaboration platform for complex, mission-critical processes like design or pharmaceutical drug approval may be architecturally simpler, but it will require considerable custom development and may suffer from poor usability because it is built on a generic, lowest common denominator. Essentially, social platforms call for clear guidelines about when to use which type of solution.

Collaboration platforms and analytics
Lessons learned from social platforms will lead to fresh perspectives on collaboration models inside the enterprise as well, eventually enabling more sophisticated and optimized process-oriented collaboration. Analytics, in combination with knowledge of the collaborative process, can help measure, reengineer, or tune the processes. As with social platforms, the internal collaboration platforms will provide more visibility into user activities. By analyzing this data, companies will be able to gain more intelligence and insights about their internal communities and collaborative processes. Extracting “employee intelligence” or “process intelligence” in the same manner that marketers extract customer intelligence from external communities will also enable companies to capture and preserve the organizational knowledge that is created and exchanged through these communities.
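
As a minimal sketch of what “process intelligence” could look like in practice: measure how long each step of a collaborative process takes, the raw material for reengineering or tuning. The event-log shape here is an assumed export format, not any platform’s actual schema.

    from collections import defaultdict
    from datetime import datetime

    def average_step_hours(events) -> dict:
        """events: list of {"case": id, "step": name, "time": ISO timestamp},
        one entry per completed step of a collaborative process."""
        by_case = defaultdict(list)
        for e in events:
            by_case[e["case"]].append(e)

        durations = defaultdict(list)
        for trail in by_case.values():
            trail.sort(key=lambda e: e["time"])
            # Time from entering a step to entering the next one.
            for prev, nxt in zip(trail, trail[1:]):
                t0 = datetime.fromisoformat(prev["time"])
                t1 = datetime.fromisoformat(nxt["time"])
                durations[prev["step"]].append((t1 - t0).total_seconds() / 3600)

        return {step: sum(hrs) / len(hrs) for step, hrs in durations.items()}

    log = [{"case": 1, "step": "draft",   "time": "2011-02-01T09:00"},
           {"case": 1, "step": "review",  "time": "2011-02-01T17:00"},
           {"case": 1, "step": "approve", "time": "2011-02-02T09:00"}]
    print(average_step_hours(log))  # {'draft': 8.0, 'review': 16.0}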
The next stage of the evolution of social networks—as they become social platforms—will bring users new levels of engagement and interaction. At the same time, it will transform the way in which businesses must think about social media. The changes will be much too important to ignore.

Action step
Build a case for accommodating social identities in your web site registration process, based on the additional insights your business will be able to capture. And begin designing the frameworks for next-generation enterprise collaboration models.
User Experience is What
Matters
The ability to create experiences by deeply engaging the user, using natural interfaces
and integrating processes and devices, will be what differentiates leading companies
and systems from the rest.

Today, business process design is driven by the need for optimization and cost reduction. But tomorrow it will be driven by the need to create superior user experiences that help to boost customer satisfaction.

In the future, great user experiences will require more layered approaches than what is typical today. Leading IT providers are thinking way beyond the next great touch-screen interfaces or gesture-driven devices. They are preparing to address three specific factors: the integrated user experience, with no cognitive cost of switching from one context to another; a compelling experience, which minimizes tedium and boredom; and a natural device interface – one that involves little or no learning time. Apple has mastered all three factors; for instance, its iPhone and iPod products can be used right out of the box, with little need to resort to a user manual.

Let’s look at each factor in turn. We expect that integrated experiences will be created by minimizing the context-switching cost for the user. Put another way, there will be further synchronization around a single identity—a customer-centric, follow-the-user approach, as seen today in Facebook Connect, which allows users to maintain their identity as they browse.

We predict that leading providers will offer ways to synchronize across multiple devices, multiple services, and multiple processes. For example, it will be possible to unify the user’s experience on Amazon.com’s Kindle platform across devices – the Kindle e-reader as well as the iPhone, iPad, etc. Similarly, it will be possible to participate in computer gaming across mobile devices and consoles.

At the same time, the application interface and the physical interface will gradually decouple. So, for example, a game on a mobile phone will not be constrained by the physical keys; it could be controlled by a blend of voice, touch, and gesture.

Design will be a multidisciplinary exercise: typically handled today by IT architects and business owners, tomorrow it will involve optimization from the perspective of the process actor, with the emphasis on simplicity and on removing inefficiencies. As such, it will call for the talents of sociologists and social anthropologists, among other less typical professions. Today, these talents, in connection with the user experience, are neither recognized nor easily available.
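
To make the synchronization idea concrete, here is a minimal sketch, in the spirit of the Kindle example, of a follow-the-user service that keeps a reading position consistent across devices. The conflict rule (furthest position read wins) and the data shapes are assumptions for illustration, not any vendor’s actual protocol.

    class ReadingPositionService:
        """Toy follow-the-user sync: every device reports progress;
        every device resumes from the furthest point read anywhere."""

        def __init__(self):
            self._positions = {}  # (user, book) -> furthest location seen

        def report(self, user: str, book: str, device: str, location: int):
            # `device` is accepted only for diagnostics in this toy version.
            key = (user, book)
            # Furthest-read-wins: a simple, deterministic conflict rule.
            if location > self._positions.get(key, 0):
                self._positions[key] = location

        def resume_point(self, user: str, book: str) -> int:
            return self._positions.get((user, book), 0)

    svc = ReadingPositionService()
    svc.report("ana", "moby-dick", device="e-reader", location=1042)
    svc.report("ana", "moby-dick", device="phone", location=980)  # older progress, ignored
    print(svc.resume_point("ana", "moby-dick"))  # 1042, on any device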
Experiences that truly engage
Second, leading providers will concentrate on experiences that “hook” the individual. That implies personal engagement—customizing the experience to that person’s interests, sense of what is fun, responsiveness to challenge, and social connection.

The customization aspect is crucial. Tomorrow’s leading providers will devise ways to reflect back the user’s sense of self—perhaps his reputation among his peers or an amusing aspect of her personality—and will ensure the right levels of socialization, teaming users with people they like. Similarly, it will be essential to emphasize the fun aspect, providing immediate gratification, appealing to the senses or to the desire for escapism, perhaps. Just one example: pop quizzes on airlines’ frequent-flier sites that provide participants with a few minutes of entertainment and the chance to win a prize.

Also important to engagement is the idea of a challenge with tangible goals and incentives and a gauge of progress, as the short sketch below illustrates. Challenges invoke competitive spirit and provide a sense of accomplishment. Interestingly, some features such as avatars or 3D space will be loosely translated as a focus on identity or structure, while others, like clear story lines, currency to drive incentives, or feedback that measures progress, will be more directly visible.
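
By way of a small sketch of that gauge-of-progress idea (the point thresholds and level names are invented for the example):

    # Hypothetical challenge levels for, say, a frequent-flier quiz program.
    LEVELS = [(0, "Newcomer"), (100, "Explorer"), (500, "Navigator"), (2000, "Ambassador")]

    def progress(points: int) -> str:
        """Report the current level and the distance to the next one,
        the feedback that keeps a challenge tangible."""
        current = LEVELS[0][1]
        next_gap = None
        for threshold, name in LEVELS:
            if points >= threshold:
                current = name
            elif next_gap is None:
                next_gap = threshold - points
        if next_gap is None:
            return current + " (top level reached)"
        return current + ": " + str(next_gap) + " points to the next level"

    print(progress(340))  # Explorer: 160 points to the next level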
Highly entertaining
The entertainment industry will be the trailblazer in terms of user experience. To a large extent, it already is. More and more consumers worldwide are using their TV sets not just for broadcast and cable viewing but to watch movies on demand—and to stream what’s on their computers to a bigger screen.

At the same time, entertainment is now free of the constraints of the home theater setup. The episode of your favorite comedy that you missed last week is seen as easily on your Motorola Droid phone as it is on your laptop or TiVo. North American and European consumers are getting used to the idea of decoupled content, broadcast, and device. Many of their counterparts in South Korea, Japan, and now China are already quite familiar with the concept.

The decoupling is moving fast. Already, Google TV promises “a full Web browser on your HDTV screen.” And more and more entertainment-center products—not only the TV set itself—can now connect to the Internet for fast access to movies, music, and more.

The point is that entertainment experiences will span more and more devices and tap into more and more sources of content. Increasingly, we will have to rethink our concept of “the TV” so that we separate the device from the service and the service from the process of streaming content. Broadcast as we know it will fade away, edged out by personalized services. The idea of watching football games or Formula One races, with surrounding advertising matched to your profile, is not so far away.

It’s clear that as the TV set, the content, and content-delivery processes become more digital, TV will become more Web-like. We are not talking about Web pages and hyperlinks; we mean “Web-like” in the sense of optimization, personalization, and advertising focused on what will appeal to you.

Interacting through more than touch
Third, more and more consumers will expect natural interfaces that require little learning and have few or no barriers to use. Touch screens, of course, have become familiar as standard interfaces for phones, airport check-in terminals, ticket vending machines, and tablet computers. Now we see touch screens migrating rapidly into laptops, desktops, and panel displays in public places. As a next step, we expect multiscreen interfaces—for instance, pairing an iPhone or iPad with a personal computer for additional input.

Gestures comprise a logical extension to touch interfaces. They are already becoming common on phones and tablets. And in the consumer realm, Nintendo’s Wii gaming system set the bar—and raised the kinds of “Why can’t we do that here?” questions that will help us use waving and pointing and more of these natural forms of expression to control our devices. Possibilities that are being explored include tilting, rotating, or waving a device to turn it off or on or to switch screens. Gestures will also be used to control common user-interface tasks such as scrolling and switching windows. And they will involve interesting combinations of intuitive and learned behavior—for instance, flicking left to right to move through lists.
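
As a minimal sketch of the kind of logic behind “flicking left to right”: classify a gesture from raw touch samples by distance, direction, and speed. The thresholds are invented for illustration; real platforms tune them per device.

    def classify_flick(samples):
        """samples: [(x, y, t_seconds), ...] from touch-down to touch-up.
        Returns 'flick-right', 'flick-left', or None."""
        if len(samples) < 2:
            return None
        (x0, y0, t0), (x1, y1, t1) = samples[0], samples[-1]
        dx, dy, dt = x1 - x0, y1 - y0, max(t1 - t0, 1e-6)

        # Invented thresholds: mostly horizontal, far enough, fast enough.
        if abs(dx) < 80 or abs(dy) > abs(dx) / 2 or abs(dx) / dt < 300:
            return None
        return "flick-right" if dx > 0 else "flick-left"

    # A quick, mostly horizontal 120-pixel swipe in 0.15 seconds:
    print(classify_flick([(10, 200, 0.00), (70, 205, 0.08), (130, 208, 0.15)]))  # flick-right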
Voice is the great hope for tomorrow. While voice inputs do well with limited vocabulary sets and are usually good enough for common command sets and simple scripted interactions, the technology still has not developed to the point where it is good enough for, say, dictation.

We are confident that in a few years we will be able to use a broader range of voice-controlled inputs to control our phones and car communications systems. But for several years to come, most of us will still be struggling to make ourselves understood when calling “customer service.”
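
Where voice already works, in other words for small command sets, the application side can be as simple as matching a recognizer’s transcript against known phrases. The recognizer itself is out of scope here, and the command table is invented for illustration.

    # Hypothetical in-car command set; a speech recognizer supplies `transcript`.
    COMMANDS = {
        "call home": lambda: print("Dialing home..."),
        "play music": lambda: print("Starting playlist..."),
        "navigate to work": lambda: print("Routing to work..."),
    }

    def handle(transcript: str):
        phrase = transcript.lower().strip()
        action = COMMANDS.get(phrase)
        if action:
            action()
        else:
            # Constrained vocabularies fail gracefully instead of guessing.
            print('Sorry, I did not understand "' + transcript + '".')

    handle("Call home")         # Dialing home...
    handle("write my memoirs")  # falls outside the supported set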
Action step
Start planning for superior user experiences that help to boost customer satisfaction—experiences that have little or no cognitive cost of switching from one context to another, that are very engaging, and that are entirely natural, requiring little or no learning time.

Seeing Beyond the Walls: The
Process of Change Begins Here
The trends discussed in this report will have
profound impacts on IT inside the enterprise.
But if they are to realize their full impact, they
must not be addressed in isolation.

The themes discussed in this report will have profound impacts on IT inside the enterprise. Faced with change on so many fronts, IT specialists have to be prepared to change too. When faced with many changes all at once, they can’t afford to become overwhelmed. Nor can they afford to haphazardly reap the benefit of one trend or other in isolation.

Instead, a more effective planning approach starts by actively looking for connections among the themes inside one’s own organization. The cloud, for instance, pervades several of the themes. It is a major influence on the distribution of data, it raises questions about information security, and it presents new ways to conceive of IT architecture. So the relevant question for an IT leader might be: How do your cloud strategy and your data strategy collide?

Essential IT capabilities
A useful next step is to understand how the trends apply when mapped against the capabilities that should be part of the fabric of every IT organization—capabilities in which high-performance IT groups excel:

IT governance
Perhaps the most significant changes requiring a reevaluation of IT policies are related to security and privacy. These can no longer be seen as black-and-white issues, but rather as shades of grey. IT governance thus should set clear policies and guidelines around risk tolerance, so that the IT organization can understand where it is positioned along the security and privacy continuum – and where it should be positioned.

Architecture
New infrastructure architectures will continue to emerge. That means system architecture should be decoupled from infrastructure architecture as well as from data architecture. You want your system to run on any architecture and to become reconfigurable. The goal is to set the enterprise on a course toward greater agility.

Information management
Distributed data requires good master data management, which is the foundation for better analytics and for managing data privacy – two crucial differentiators. Converting data to data services will also help you decouple the data from applications – the first step to achieving a true data platform.
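
As one minimal sketch of “converting data to data services”: place a small service contract in front of the master data so applications consume records through an interface rather than reaching into a shared database. Flask is used purely for brevity, and the record shape is invented for the example.

    from flask import Flask, jsonify, abort  # third-party micro web framework

    app = Flask(__name__)

    # Stand-in for a governed master data store.
    CUSTOMERS = {"c-1001": {"id": "c-1001", "name": "A. Smith", "segment": "frequent-flier"}}

    @app.route("/customers/<customer_id>")
    def get_customer(customer_id):
        # Applications depend on this contract, not on table layouts,
        # so the data platform behind it can evolve freely.
        record = CUSTOMERS.get(customer_id)
        if record is None:
            abort(404)
        return jsonify(record)

    if __name__ == "__main__":
        app.run(port=8080)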

Workforce and resource management
New and advanced skills will be needed for data management, architecture, security, and analytics. Obtaining these skills will require a clear workforce and compensation strategy that emphasizes hiring, training, retaining, or outsourcing to secure the best talent available. Just as importantly, your people must be willing to embrace change.

Security
The new challenges include the cloud, the consumerization of IT, and the complexity of cyber threats. High-performing organizations will take a practical rather than emotional approach to the new devices and the new IT reality, informed by their tolerance for risk and policies on risk management. Threat complexity, for its part, merits coordinating responses with partners in industry and government.

Solutions delivery
Business needs should motivate adoption of new technologies in your solutions. For example, a practical approach to security and privacy, and therefore cloud adoption, will help your businesses speed time to market. Focusing on decoupled architecture will increase agility, so that you can respond to market shifts quickly. And a focus on analytics will help the IT organization to become a close partner with business units in making better decisions that lead to improved business outcomes.

Service management
Although most technology trends are raising user expectations, the reality is that IT specialists actually have less control over the technology universe. As a result, it pays to understand user expectations, to improve the user experience with processes and interfaces, and to obtain the right providers motivated by the right incentives.

Outsourcing
Buying services involves different skills than buying products. In addition to the traditional emphasis on technology features, what matters in services are the viability and business practices of the provider. You need to understand the economics of sourcing, the trade-offs involved, and the strengths of various providers. Outsourcing has moved beyond commodity functions to include partnership with organizations that provide sophisticated skills in security, analytics, architecture, and other crucial activities. Sourcing and vendor management thus have become critical skills.

Strategic IT alignment
In the past, alignment referred to how the IT organization served the business’s needs. The new trends discussed here, and their accelerated pace, shift the alignment emphasis to educating the business about what new technologies can do and how IT can help improve execution of the chosen strategy. In that way, the IT function can move from its focus on service-level agreements and costs to being a creator of value. That is the next frontier of high-performance IT.

Accenture’s Technology Vision 2011 report describes discontinuous change and explains why IT leaders and their business colleagues now need to recast their approaches to technology in the context of these capabilities. Specifically, they must accept that everything – hardware, software, applications, data – will be distributed. They must accommodate the fact that everything – data from data representation, infrastructure architecture from application architecture, business processes from applications – will be decoupled from everything else. And they have to recognize that everything will be analyzed – structured data, unstructured data, meta-data, and even keystrokes on a web site.

The organizations whose IT and business leaders are quick to grasp those realities will be those that rapidly pull away from the pack.

Notes
1 Jeanne G. Harris and Allan E. Alter, “Cloudrise: Rewards and Risks at the Dawn of Cloud Computing,” Accenture Institute for High Performance, November 2010.
2 Accenture, “Mind the Gap: Insights from Accenture’s Third Global IT Performance Research,” November 2010.
3 Forrester Research, Inc., “A Workable Application Modernization Framework Is Job No. 1 Now,” April 26, 2010.
4 SAP press release, “New Reality of Real Time With Launch of SAP High-Performance Analytic Appliance,” December 1, 2010.
5 CQ Weekly, “Cybersecurity: Learning to Share,” August 1, 2010.
6 Sypris Electronics press release, “Sypris to Develop International Cyber Range to Host Cyber Warfare Testing for U.S. and Its Partners,” November 10, 2010; St. Petersburg Times, “Cyberspace War Games, For Security and Profit,” November 21, 2010.
7 Dark Reading, “Researcher Demonstrates How to Counterattack Against a Targeted Attack,” April 19, 2010.
8 PCWorld, “Conficker Worm Draws a Counter-Attack,” February 12, 2009.
9 Wall Street Journal, “The Web’s New Gold Mine: Your Secrets,” July 30, 2010.
10 BBC News, “Google in ‘Significant Breach’ of UK Data Laws,” November 3, 2010.
11 Internet Crime Complaint Center (IC3), “The Internet Crime Complaint Center Hits 2 Million,” November 15, 2010.
12 U.S. House, 111th Congress, “H.R. 6423: Homeland Security Cyber and Physical Infrastructure Protection Act of 2010” (introduced November 17, 2010).
13 Mashable.com, “Each Month 250 Million People Use Facebook Connect on the Web,” December 8, 2010.

Additional Resources
Mind the Gap: Insights from Accenture’s third global IT performance research study. Register to download the PDF on Accenture.com.

Computing in the Clouds, by Kishore S. Swaminathan, Chief Scientist, Accenture. Originally appeared in the May 2008 issue of Outlook, the journal of high-performance business. Read on Accenture.com.

Cloud Computing: Where Is the Rain?, by Kishore S. Swaminathan, Chief Scientist, Accenture. Originally appeared in the October 2010 issue of Outlook. Read on Accenture.com.

What the Enterprise Needs to Know About Cloud Computing, October 2009. Read on Accenture.com.

Research Methodology
For this year’s Tech Vision report, we cast the net wider and deeper than before. In late 2010, the Accenture Technology Labs developed hypotheses about information technology developments that will have a significant impact on Accenture’s clients in the next five years.

At the same time, a wide range of other sources was scanned to add ideas to the mix. The sources included the recent activities of commercial R&D labs, the flow of venture capital funding, trends highlighted by IT analysts, key themes at industry conferences, and the academic literature.

We also drew on Accenture’s High Performance IT research and on the findings from our annual IT Executive Forum. And we tapped the expertise of Accenture practices in areas such as analytics, IT security, and innovation.

The response—approximately 400 hypotheses with input from scientists, architects, and engineers—covered topics as well publicized as cloud computing and mashups, along with many others much less familiar, such as text mining and sensor fusion. The team then worked with the R&D groups to look for overlaps and redundancies, and to test each hypothesis against these six criteria:

• Certainty of transformational impact on corporate IT departments
• Velocity and scale of technology change
• Impact beyond any one IT “silo”
• More than a “one for one” replacement of an existing solution
• Being actively explored today and considered practical for the near future
• Transcends any one vendor or discrete “product” technology

Out of this process came more than 50 defensible hypotheses that were synthesized into the themes presented in this year’s report.

Contacts
Don Rippert
Chief Technology Officer and Managing Director – Technology

Dr. Gavin Michael
Global Managing Director, R&D and Alliances

Dr. Kishore Swaminathan
Chief Scientist

About Accenture
Accenture is a global management
consulting, technology services
and outsourcing company, with
approximately 211,000 people serving
clients in more than 120 countries.
Combining unparalleled experience,
comprehensive capabilities across all
industries and business functions,
and extensive research on the world’s
most successful companies, Accenture
collaborates with clients to help
them become high-performance
businesses and governments. The
company generated net revenues
of US$21.6 billion for the fiscal
year ended Aug. 31, 2010. Its home
page is www.accenture.com.

Copyright © 2011 Accenture
All rights reserved.

Accenture, its logo, and
High Performance Delivered
are trademarks of Accenture.
