
OCTOBER 2015 VOL. 13 ISS. 10
CYBERTREND.COM

INTEL
INTRODUCES
THE FUTURE
ITS FAST NEW CHIPS
STAND TO BOOST THE PC
MARKET & POWER A NEW
GENERATION OF DEVICES

ALSO IN THIS ISSUE

Special Section For IT Managers

Special Section For PC Enthusiasts

Volume 13 : Issue 10 : October 2015

8 INSIDE INTEL & ITS NEWEST, FASTEST PROCESSORS

18 PROTECT YOUR MOBILE ASSETS

4 NEWS
Business technology news and research

8 COVER STORY
Intel and its new 6th generation processors

12 BUSINESS
Tech strategies for smart business

14 CLOUD
Cloud computing and cloud-based services

18 MOBILITY
Mobile tech for doing business anywhere

22 COMMUNICATIONS
Communication and collaboration

24 DATA
Methods for leveraging data and analytics

28 ENERGY
Energy efficiency and the environment

31 IT
IT and data center concerns

40 NETWORKING
Wired and wireless networking

46 SECURITY
Solutions and best practices for security

54 ELECTRONICS
High-end consumer electronics

56 TIPS
Advice for mobile professionals

66 PROCESSOR
Special Section For IT Managers: special advertising and content from our Processor partners

77 COMPUTER POWER USER
Special Section For PC Enthusiasts: special advertising and content from our Computer Power User partners

CONTACT US
P.O. Box 82545
Lincoln, NE 68501
or
120 W. Harvest Drive
Lincoln, NE 68521

Advertising: (800) 247-4880


Fax: (402) 479-2104
Circulation: (800) 334-7458
Fax: (402) 479-2123
www.cybertrend.com
email: feedback@cybertrend.com

Copyright 2015 by Sandhills Publishing Company. CyberTrend is a registered trademark of Sandhills Publishing Company. All rights reserved.
Reproduction of material appearing in CyberTrend is strictly prohibited without written permission.

Spending On BPM Suites To Reach $2.7 Billion This Year

The growth of BPM (business process management) suites reflects the move toward digital processes, with Gartner forecasting a 4.4% increase in BPM spending this year. Maybe most interesting, Gartner sees a shift toward iBPMS (intelligent BPM software), which is designed to provide better insight into processes via analytics. "An iBPMS supports business responsiveness, often at the moment of truth in a customer interaction," says Rob Dunie, Gartner research director. Gartner identifies four significant trends in the iBPMS market: a focus on business transformation; integration with the IoT and digital processes; a shift away from transactional processes; and the assimilation of mobile, social, cloud, and analytic features into BPM.

Chief Data Officer Role & Benefits Derived From Data Grow In Importance Simultaneously

As organizations discover the increasing importance and value of data, there comes a more defined focus on getting the most relevant and usable information from that data, managing data practices so resources aren't wasted on low-value data, and using data with less risk and stricter adherence to regulations. Along with this focus, there is increased interest in the role of the CDO (chief data officer). For its new study, "The Chief Data Officer," Experian surveyed more than 250 CIOs and CDOs working for large organizations to get their take on these trends. "Businesses need evangelists for data and individuals with the intelligence to not only ensure information assets are governed, accurate, accessible and complete, but also promote the use of data for good across the business," says Thomas Schutz, senior vice president and general manager of Experian Data Quality. "The rise of the CDO puts that concept front and center." Below are some of the study's key findings:

Server Shipments & Revenues


Grew In Q2 2015

Manufacturers & Buyers Take


A Closer Look At 2-In-1 Devices

Ample Retail Opportunities Exist


For Internet Of Things

Worldwide server shipments grew 8%


and revenues increased 7.2% in Q2 2015
relative to Q2 2014, according to Gartner.
And while comparable growth occurred
among x86 servers, there were declines in
RISC/Itanium Unix servers. It is likely
that in anticipation of further currency
rate shifts that some organizations utilized
their budgets earlier in the year rather than
waiting until the third or fourth quarters
when their purchasing power may be further reduced by these relative currency
changes, says Jeffrey Hewitt, research vice
president with Gartner.

Despite an 8% decline this year in tablet


shipments overall, IDC expects the 2-in-1
segment, which includes devices with detachable screens, to grow 86.5% this year
compared with 2014. The volume itself
is low (14.7 million units) compared to
the overall tablet market, but the current
growth is significant. Commercial segments will play a crucial role in the future
of 2-in-1s, says Jean Philippe Bouchard,
research director with IDC, who expects
that IT departments will begin moving
from portable PCs and tablets to 2-in-1s
after evaluating upcoming device options.

In its latest look at how the IoT


(Internet of Things) is likely to affect
retailers, Juniper Research suggests that
those companies that are able to create
an ecosystem tied to IoT technologies
will gain the greatest market advantage. The joining of wearable devices
and other consumer electronics with
radio tags and beacons offers a prime
example of this. Juniper forecasts that
retailers will spend $2.5 billion on IoT
by 2020, about four times more than the
$670 million Juniper estimates retailers
will spend this year.

October 2015 / www.cybertrend.com

95% believe their organization is


changing in response to the
insights data brings

70% see the CDO as the trusted


advisor of data across the
organization

80% understand data is valuable


but believe their organization isnt
exploiting data to the fullest

63% of organizations without


a CDO are interested in
establishing the role of CDO

Report Looks At How We Use Smartphones To Disengage

Among the many things the Pew Research Center tracks is how we use our mobile gadgets. According to its latest report on etiquette in the digital age, for which 3,217 adults on the research center's American Trends Panel were surveyed, Pew says 89% reported having used their smartphones the last time they took part in a social gathering. If you've ever suspected that someone is using a phone to disengage, you could be right; 16% of survey respondents said they used their phone because they weren't interested in what the group was doing, and 10% wanted to avoid participation. On the bright side, respondents more typically used their phones to participate with the group; 41%, for example, used their phones to share something relevant to the gathering.

What Are The Social Folks Using? Facebook Is Still No. 1, But Instagram & Pinterest Are Growing

In early September, Facebook passed a milestone: 1 billion users logged in on the same day. In terms of growth, however, Facebook has remained fairly static since 2013, as has Twitter, according to the Pew Research Center, and LinkedIn declined by three percentage points this year. Instagram and Pinterest, on the other hand, have more than doubled in growth between 2012 and 2015. So what's the message for businesses? Take notice of these and other social media trends; although social is relatively new, things change, sometimes rapidly. (Remember reading about how to use MySpace to promote your business?) The chart below illustrates the findings of the Pew Research Center's spring 2015 telephone survey of 1,907 adults living in all U.S. states and Washington, D.C.
Percentage of online adults who say they use the following social media platforms, by year, according to the Pew Research Center:

Platform     2012   2013   2014   2015
Facebook      67     71     71     72
Pinterest     15     21     28     31
Instagram     13     17     26     28
LinkedIn      20     22     28     25
Twitter       16     18     23     23

Wireless Charging Gears Up For Mass Adoption

In its "Wireless Charging: Opportunities, Applications & Standards 2015-2020" report, Juniper Research says it expects about 40% of U.S. households and 20% in Europe will use wireless charging technologies within five years. "Wireless charging will ultimately be about more than the power and speed of charge," says James Moar, research analyst and report author. "The ability to pinpoint device location through data exchange enables all kinds of location-based activation functions around the home, the car and in the leisure industry."

Consumers Favor Chromebooks More Than Businesses Do

Most U.S. businesses aren't buying Chromebooks, despite Google's best efforts to promote them along with its Chromebook For Work productivity suite. Chromebooks are worthy of consideration among small to midsize businesses, especially those with few IT infrastructure resources, says Isabelle Durand, principal analyst at Gartner. "Chromebooks will become a valid device choice for employees as enterprises seek to provide simple, secure, low-cost, and easy-to-manage access to new Web applications and legacy systems," she adds, unless a specific application forces a Windows decision. Gartner says the education segment remains the largest buyer of Chromebooks worldwide, and more consumers buy Chromebooks in the U.S. than in other regions.

Wearables Industry Sees Significant Growth

IDC now forecasts that an estimated 72.1 million wearable devices will ship this year, representing 173.3% year-over-year growth, as 26.4 million units shipped in 2014. IDC anticipates the CAGR for shipment volumes will reach 42.6% over five years, with shipments reaching 155.7 million units in 2019. Demand for basic wearables, or those that do not run third-party apps, has been "absolutely astounding," according to IDC. Shipments of basic wearables are expected to reach 39 million units this year. By contrast, smart wearables, which run third-party apps, will ship only 33.1 million units this year, says IDC, but will take the lead over basic wearables next year; IDC forecasts 89.4 million smart wearables and 66.3 million basic wearables will ship in 2016. Smart wearables represent an approaching shift in computing and will present vendors, app developers, and accessory makers with numerous opportunities, IDC states.
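As a quick check on that growth math (our arithmetic, not IDC's): going from 26.4 million units in 2014 to a forecast 155.7 million in 2019 implies (155.7 / 26.4)^(1/5) − 1 ≈ 0.426, or about 42.6% compound annual growth, which matches the five-year CAGR IDC cites.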

Smartphone Sales In Emerging Markets Driving Growth

Emerging markets can make for volatile investments, but a recent Gartner study found that emerging markets were a key driver of worldwide smartphone sales in the first quarter of this year. In total, worldwide sales were up 19.3%, led by markets in Asia/Pacific (excluding China), Eastern Europe, the Middle East, and North Africa. These emerging markets had a 40% increase in sales in the first quarter of 2015. The Gartner report also examined the top mobile platforms, and iOS is chipping away at Android's worldwide lead.

Portable Computer Market Expected To Remain Flat In 2015

Despite Microsoft's introduction of Windows 10, ABI Research anticipates total shipments within the portable PC category will reach 165 million units this year, changing little compared with 2014 shipments. Laptops in particular are expected to decline 7% year over year, according to ABI. Other segments in the firm's portable PC category are netbooks, Chromebooks, and ultraportable PCs. Chromebooks should see a 335% increase this year to 7 million units, thanks in large part to the education market.

Advice For Retailers As Mobile & Online Transactions Rise

The number of purchases made online using desktop computers and mobile devices is likely to reach 125 billion worldwide by 2018, a figure more than 60% greater than this year's total, according to Juniper Research. "Commuter commerce" is also on the rise thanks to widespread Wi-Fi and 4G connectivity, the firm says. Retailers must be careful to avoid data breaches, however, says Windsor Holden, head of forecasting and consultancy at Juniper. "Consumers need to be reassured that their vital information is not being compromised or shared," he says.


STARTUPS
Startup Receives Investment For Cloud-Based HCM

Purchasing and managing employee health care and benefits packages is a complex business, and organizations are increasingly turning to cloud-based solutions for help. PlanSource is one such vendor, offering a cloud HCM (human capital management) platform that the company says is used by more than 8,000 companies. Private equity firm Great Hill Partners recently announced a $70 million investment in the Orlando, Fla.-based startup. In its press release, PlanSource indicated it will use the funds to drive growth in sales and marketing, invest in continued product innovation, and expand its operating and technology infrastructure.

Startup Carbon3D Introduces A New Kind Of 3D Printer Technology

Impressions (and often results) of 3D printing tend toward the simplistic. Some remarkable things can be made with the current technology, such as prosthetics, but there are sharp limits to what the technology can deliver. Carbon3D, a startup based in Redwood City, Calif., has developed an alternative called CLIP (Continuous Liquid Interface Production). According to Carbon3D, CLIP eschews the layer-by-layer approach today's 3D printers employ and instead "grows" parts. This method also allows for the use of a wider range of materials. Carbon3D announced a $100 million funding round led by Google Ventures, which company CEO and co-founder Joseph M. DeSimone says will bring it closer to delivering on its vision. "Carbon3D's technology has the potential to dramatically expand the 3D printing market beyond where it stands today and reshape the manufacturing landscape," says Andy Wheeler, Google Ventures general partner.

BlueData EPIC Offers Virtually Instantaneous Big Data Clusters

Big data projects can easily get bogged down and take longer than expected. Mountain View, Calif.-based startup BlueData seeks to address that problem with its EPIC platform, which enables users to create Hadoop and Spark clusters for analyzing unstructured data as needed and in very little time. In August Intel Capital led a $20 million Series C financing round for the company, which BlueData will use to continue its efforts. Doug Fisher, Intel senior vice president and Software and Services group manager, is joining the BlueData board of directors.

Intel Invests In Silicon Valley Startup Focused On OpenStack

OpenStack is an open-source platform for building and managing computing resources in the cloud, and Mirantis is a startup that offers an OpenStack distribution as well as related services, training, and support, with an emphasis on improving enterprise scalability. In August the Mountain View, Calif.-based company announced a $100 million funding round led by Intel, new investor Goldman Sachs, and numerous existing investors. With Intel as a partner, look for Mirantis to expand its reach among enterprises.

Mode Seeks To Bring Together & Streamline Analytics

The analytics experience can be frustratingly fragmented and inconsistent for organizations. In answer to this problem, San Francisco-based startup Mode offers what it calls a "collaborative analytics platform built by analysts, for analysts," which draws in all analytics workflows and delivers requested information in a streamlined, easy-to-understand manner. In August the two-year-old company announced it had completed a $7.5 million Series A investment round led by Foundation Capital. Mode will use the funds to further expand its platform.


Inside Intel & Its Newest, Fastest Chips


INNOVATION LIVES ON IN INTEL'S 6TH GENERATION CORE PROCESSORS

KEY POINTS
Intel has been developing processor technology and other products for nearly 50 years.
Today Intel also offers solid-state drives, server systems, networking and communication products, and a platform to support the burgeoning Internet of Things.
Intel's 6th Gen desktop processors enable multitasking, 4K HD video support, and more hardware-based security.
The laptop and mobile processors are smaller and more energy efficient for thinner devices and longer battery life.


AT INTEL'S VERY CORE, the company has always been about how to make processors smaller, more powerful, and more energy efficient. With the launch of Intel's 6th generation of Core processors, there's no better time to learn a little bit about where the company came from and how its history of innovation has helped to shape not only the company itself but the face of computing.

Intel's Processor Foundation

Intel was established in 1968 as NM Electronics by Bob Noyce and Gordon Moore after the two men left their positions at a semiconductor manufacturer. By 1969, the company renamed itself Intel (short for Integrated Electronics) and released its first product, the 3101 Schottky Bipolar RAM (random access memory). After creating the first ever metal oxide semiconductor in 1969, Intel would also enter into the memory systems market in 1970 with its MU-10 board. In that same year, the company would go on to establish its 1103 DRAM product as the industry standard for computer memory and never look back.

The year 1971 was one of more firsts for Intel as the company not only released its first microprocessor, the 4004, but also went public at $23.50 a share and earned $6.8 million. In 1972, Intel opened an assembly plant in Malaysia; entered the digital watch market by purchasing a company called Microma, which built prototype LCD watches; and made the first 8-bit microprocessor, called the 8008. Further supporting its microprocessor line, Intel released its Intellec 4-40 software development tool and PL/M programming language designed specifically for microprocessors.

By 1974, Intel would release its new 8080 microprocessors, which were said to be 10 times more powerful than previous models and were used in a wide range of products, including cash registers and traffic lights. And in 1975, the 8080 microprocessor was featured as part of the Altair 8800, a basic personal computer that served as a hobbyist kit for those interested in and dedicated to personal computing.

By 1976, Intel started packing more and more technology into its microprocessors and other computer products. The 8748 and 8048 were both microcontrollers, which put both the processor and the memory on one silicon chip. By bringing these two components together, manufacturers were able to add computing capabilities to even more types of products, including home appliances and even automobiles, among thousands of other examples. This was essentially the beginning of the smart era of technology, where even the most basic devices are given new capabilities through the use of small yet powerful chipsets.

Microprocessors For Early PCs

In 1980, Intel released its 8051 and 8751, which were both the best-selling microprocessors in the world at the time. And along with the Digital Equipment Corporation (commonly known as DEC) and Xerox, Intel established the cooperative Ethernet project as a way to develop ways for computers to connect to one another via LANs (local-area networks). 1981 was another big year, as Intel's 8088 microprocessor became the backbone of the IBM PC. This is important because in the following year, the company not only released its first 16-bit microcontroller but also its 16-bit Intel 286 microprocessor, which was part of the major personal computer boom of the 1980s.

Innovation in the microprocessor and microcontroller product lines continued throughout the 1980s and beyond by adding more performance while also reducing power consumption. This led to Intel hitting $1 billion in revenue by 1983. In 1985, Intel released its Intel 386 32-bit processor, which had the major selling point of being able to run more than one program at the same time. In 1985 and 1987, Intel made two big entries into the supercomputer market with its iPSC/1 and iPSC/2 supercomputers, which used Intel's 286 and 386 processors, respectively. And in 1988, which was the company's 20th anniversary, Intel created its first flash memory technology, called EPROM Tunnel Oxide.

Center Of The PC Revolution

Intel transformed the personal computing market with its Pentium chips, introduced in 1993. By 1994, Intel chips were used in more than 85% of all desktop computers, and the Intel brand became synonymous with this era's processors as the company ascended to become the undisputed market leader. The Pentium Pro followed in 1995, and in 1998 the company introduced its Intel StrongARM processor, which was designed specifically for smaller, handheld devices. Also in 1998, Intel released the Pentium II Xeon processor, which served a huge demand within the server and workstation market.

As the PC market solidified, with desktop and laptop computers growing smaller and becoming ubiquitous, the turn of the 21st century offered Intel a widening avenue of technology and market opportunities. For example, in 2000 Intel announced its new Pentium 4 processor as well as a wireless LAN PC card. In 2003, Intel launched its Centrino processor line, which included LAN support and was designed with laptops in mind. And in that same year, Intel also continued development in the mobile space with its PXA800F cellular processor.

In 2006, Intel released its Centrino Duo and Core 2 Duo processors and at the same time developed the world's first quad-core processor for servers and desktops. This blend of consumer, small business, and enterprise focuses helped Intel grow further and become the company we're familiar with today.

[Photo caption: Intel's innovative focus on the processor space means the company not only thinks about performance, but also size. Intel's mobile-focused microprocessors make it so that smartphones and tablets will not only be more powerful, but also thinner and more portable.]

True To Its Core

Much of what Intel does right now has origins that are traceable through the company's history as well as new capabilities that continue to push boundaries. Intel introduced its Core processor line in 2010 and continues building on that line, boosting performance and energy efficiency with each new generation.


Intel recently announced its 6th generation of Core processors (more on that later), and with that announcement illustrated precisely how diverse it can be in the technology world. Its processors are not only found in desktops, laptops, and servers, but also in smartphones, tablets, and wearable devices. But Intel's essential product portfolio extends beyond processors to include SSDs (solid-state drives), server systems, networking gear, communications products, and even sensors for the burgeoning IoT (Internet of Things). The remainder of this article takes a look at some of Intel's product lines today.

Solid-State Drives

Intel offers three separate SSD product families: one for data centers, one for professionals, and one for consumers. The Data Center Family prioritizes performance and capacity so the data center can keep up with the business's demands and support all of the necessary data and applications. The Professional Family focuses on security in addition to performance, with AES 256-bit hardware-based encryption to ensure that even when you're on the go, your data will be as secure as possible. And the Consumer Family is all about making PCs faster and easier to use, as well as more durable because SSDs, unlike traditional hard-disk drives, don't use moving parts.

Servers

Whether your company needs a relatively small 1U or 2U server setup or support for a much larger environment, Intel offers full-featured server systems as well as individual components for the best performance at any level. Overall, Intel offers server systems, server boards, server chassis, and RAID storage.

All of Intel's server products come with the company's quality guarantee. The products are tested and certified, and Intel continues improving existing technologies even as it develops new ones. Intel views its customers more as partners because its server solutions are designed to make businesses more competitive, help them bring their products to market faster, and build stronger customer relationships through powerful and reliable technology.

Networking & Communications

In addition to pure computing and processing products, Intel also makes a wide range of networking and communications solutions. For example, the company offers applications for improving network performance and maintaining a solid wireless networking experience. Intel also makes Ethernet Converged Network adapters, Ethernet Gigabit server adapters, and other cables, controllers, and switches.

The key to these product lines is that Intel is trying to increase networking speeds and improve performance at every level. From the individual component level with SSDs up to the high-tier computing level with server systems and the networking cables that connect them, Intel wants to make sure that data and applications will perform well and improve the overall employee and customer user experience.

Wireless & The Internet Of Things

Intel is also investing a great deal of resources in the IoT market. There are countless computerized devices and sensors out there, many of which need to communicate with each other or with corporate networks. The Intel IoT Platform is designed to help keep track of these different devices and make sure the connections between them and individual networks are secure and fast.

Through the use of hardware and software verification gateways, as well as end-to-end security, businesses can safely use data gathered from wireless sensors in data centers, industrial machinery, vehicles, and other areas to make more informed business decisions. Instead of simply focusing on the connections between these devices, Intel makes it so you can see the IoT as one cohesive network where organizations can perform data analytics and gain actionable insights along the way.

[Photo caption: Intel's new Core i7-6700K and Core i5-6600K processors are computing powerhouses focused on multitasking, Ultra HD 4K video playback, and increased hardware security to protect your system on every possible level.]

Intel's 6th Generation Processors

Intel's new 6th generation line of processors for 2015, code-named Skylake, builds on the company's roots and keeps the Core family alive and well. As with all of Intel's processor launches, the newest lineup includes models designed not only for different processing needs, but also for different types of computers. This year's chips focus heavily on desktops, laptops, and mobile devices, and multiple analysts have pointed to the speed and energy efficiency of these chips as standing to rejuvenate the sluggish PC market.

For desktops. Intel is known for pushing performance and speed to the limit, and this year's Core i7-6700K and Core i5-6600K desktop processors are no different. The Core i7-6700K in particular is a quad-core processor with 8MB of cache memory and the capability to support as much as 64GB of DDR4 RAM. These processors are built to handle the most resource-intensive applications, whether that be for graphic design, video editing, or other uses. Both the i7-6700K and i5-6600K come with Intel's Turbo Boost 2.0 Technology and Hyper-Threading Technology, which means you'll not only get a performance jolt across the board, you'll also be able to handle two compute-intensive tasks per core at the same time without missing a beat. With the Core i7 processor, you can run eight separate tasks on eight separate threads.

The i7-6700K and i5-6600K are also capable of supporting Ultra HD and 4K displays. With the help of Intel's Quick Sync Video technology, users can edit and/or watch high-quality HD videos and multitask without fear of dropping performance. There are obvious benefits for gamers and PC enthusiasts here, but the key for enterprises is understanding what applications your employees use; if those programs require a lot of processing power, then you may want to look into these new Intel chips.

Most consumers and businesses have their desktops covered with security suites, firewalls, and other software-based security tools, but hardware is another story. Intel's new processors provide hardware-level security to complete the picture. To protect systems and the data stored on them while they're in use, Intel offers its Software Guard Extensions solution and Memory Protection Extensions tool. And to protect your desktop while it boots up, Intel has built in its Device Protection Technology with BIOS Guard 2.0 as well as Device Protection Technology with Boot Guard. These features work together to ensure your desktop and its contents are safe and protected at all times.

For laptops and mobile devices. Intel recently released numerous other processors for specific device types, including Core Y-series processors for smaller 2-in-1 hybrid devices, Core U-series processors for ultra-thin notebooks and 2-in-1 hybrids, Core H-series for notebooks with larger screens (as well as a SKU for enthusiast notebooks), and Intel Xeon processors for mobile workstations. Plus, Intel's Core m3, m5, and m7 processors are designed for tablets, smartphones, and other mobile devices.

Intel says its 6th Gen i5 processors are as much as 60% more powerful than the previous generation, and that its Core m series are 33% smaller and consume 60% less power than previous models. This means that mobile devices using these chips will not only be smaller, but also less taxing on batteries.

As with the desktop alternatives, Intel's mobile-focused processors can handle high-quality video and 4K video playback, depending on the device. They also have the same security measures in place as their desktop kin, including the Software Guard Extensions and Memory Protection Extensions. Additionally, all of the new Intel processors support Thunderbolt 3, which uses the new USB Type-C connection, is up to eight times faster than USB 3.0, and can provide data transfer speeds up to 40Gbps.

[Photo caption: In addition to desktops and laptops, Intel's processors can be found in a wide range of hybrids, tablets, and smartphones.]

Heading Into The Future

Intel plans to release numerous other products in the coming months, including the Intel Xeon E3-1500M processor family for mobile workstations and the 6th Gen Intel vPro processors for enterprise uses. You will also see new variations on the 6th generation Intel processors in products that fall into the IoT category (impacting retail, manufacturing, automotive, and health care verticals, to name a few) and wearables.

If you plan to buy a new computer, mobile device, or other technology sometime in the next year, there's a good chance you'll find an Intel processor inside.

CyberTrend / October 2015

11

Software Piracy & License Agreements


LEARN HOW TO PROTECT YOUR INTELLECTUAL PROPERTY

WHETHER YOUR COMPANY develops software for its own internal use or for sale on the business or consumer market, it's vital that the end product be protected to the fullest extent possible. Software is, of course, IP (intellectual property). You invest a great deal of resources into its creation, and yet the fact that it is a digital product makes it highly vulnerable to misappropriation. This is why it's so important to take care in drafting a comprehensive EULA (end-user license agreement).

What's At Risk

In short, software is hard to make and easy to steal. "Despite the best preventive efforts, crack sites and warez groups sometimes manage to gain access to the developer's confidential keys and pass them along to almost anyone who asks," says attorney


Donald M. Gindy. "There is a strong belief among many that whatever is created for the Internet should be free. This ignores the mounds of work that went into the technological creation and how the law protects the software developer." Financial gain is also a huge motivator for software pirates.

Whatever the thieves' motivations, the risks related to pirated software cut in multiple directions, affecting the developer, the seller, and the user. The developer loses revenue, but the software's functionality can also be impaired. Software is sometimes distributed in its original state and sold with stolen keys. In this case the keys may not work, which protects the developer's IP but prevents the user (who may not know he is using a pirated product) from correctly installing the software. This can diminish the developer's reputation as the integrity of the software and the developer's security measures are called into question.

In many cases, stolen software is counterfeited or changed by malicious developers to include malware. Counterfeit or hacked software can cause significant harm to individual computers and corporate networks, as the software may not work correctly, may not include all of the features of the original software, may not update correctly, and may contain malware that goes undetected. As with pirated software, counterfeit or hacked software can negatively impact the developer's reputation, even though the developer has nothing to do with the pirated, counterfeited, or hacked software.

Pirated software exacts heavy costs on consumers and enterprises alike, and it might surprise you to learn how common such software is. According to a 2014 study by research firm IDC, the National University of Singapore, and Microsoft's Digital Crimes Unit, the best way to avoid malware is to buy only from name-brand computer vendors or national retail chains. Regardless, the study says, 62% of U.S. consumers and 23% of U.S. enterprises have bought PCs from unreliable sources, and 61% of those PCs were found to be infected with dangerous malware. Users can also get into trouble for simply using counterfeited or pirated software, and sometimes this usage is easily revealed. "Many employees in big companies believe that if they steal a program behind a firewall, they may never be found," Gindy says. "But many developers have tracking systems that identify who their visitors are and what they're viewing."

Additionally, sellers and users of pirated software alike are at risk when they run afoul of laws designed to protect the developer's IP. "Computer software is entitled to the protection of the Copyright Act of 1976," says Gindy. "If you violate a developer's rights under the Act, you may find yourself in the cross hairs of a lawsuit designed to compel your disgorgement of ill-gained profits or at least pay up to $150,000 as statutory damages."

How To Protect Yourself

The enforcement of laws protecting your IP is not in your control, but the onus is on you, the developer, to protect your software with a properly worded EULA. This way, if you find yourself working with authorities to combat IP theft, the law will be on your side.

The EULA is essentially an agreement between the software's developer and its user, establishing the terms and conditions of the software's use. Whenever a user initiates a software installation, the user must acknowledge that he has read, understood, and agreed to the EULA before the installation can continue. As an agreement between developer and user, Gindy says, there are benefits and penalties. But the EULA must not be mistaken for a contract for sale. It must be carefully drafted to reflect a license agreement and not a covenant to sell.

"It is with the end-user license agreement that the developer dictates what may or may not be done with the software. Violating the EULA may also be a violation of the Copyright Act and the Digital Millennium Copyright Act. If the user uses an illegally generated key to gain access, he violates the EULA and the DMCA; if he uses that key to unlock the software he violates both the EULA and the Copyright Act. In either case or both, the developer may be entitled to damages."
DONALD M. GINDY
Attorney
Law Offices of Donald M. Gindy

A solid EULA should contain specific elements that protect the developer and continue the practice of licensing, not selling, the work product, Gindy says. "It should state that the software, the trademarks or service marks, patents, all the intellectual property remains the property of the licensor. This license does not convey any rights to the licensee in the intellectual property."

The EULA should also clearly state that a violation of the EULA may result in a termination of privileges by rescinding access to the program, Gindy says. For example, a user cannot use illegally generated keys to gain access. "Only keys issued by the licensor or its delegate are approved. A further condition might include a statement that the licensee shall be restricted from certain uses of the software. These conditions go to the scope of the license and are particular to the nature of the software issued." It is established, then, that the user, or licensee, is prohibited or severely restricted from transferring the software. "The ability of a licensor to control his product is dependent to a large extent upon his ability to prevent the transfer of the software," Gindy explains. "Revisions to the code will be impeded if the licensor does not control distribution of the program."

Be Part Of The Solution

According to the study cited earlier, pirated software infected with malware was projected to cost enterprises $126.9 billion worldwide in 2014, $22 billion in North America alone. "Software piracy remains a menacing item in the world of technology," says Gindy. "Even the largest companies and government agencies find it difficult to prevent unauthorized persons from gaining access to closely held proprietary programs. However, with a well-drafted license agreement a licensor still has redress in federal court. The EULA demonstrates to the court that your licensee exceeded the scope of the license and you are entitled to damages for those illegal acts." So be proactive and protect your interests with a license agreement that will be honored by a court.


Make Cloud Storage Pay Off


ENSURE THE POSSIBLE BENEFITS MATCH THE ORGANIZATION'S NEEDS

KEY POINTS
Adopting cloud-based storage can mean immediate cost savings, but many companies experience unforeseen long-term complexities and costs.
Rather than for cost savings, organizations are increasingly adopting cloud storage for speed and agility reasons.
To see a long-term cloud storage payoff, pinpoint what the organization specifically hopes to achieve from cloud storage.
SLAs (service-level agreements), cloud storage models, and migrating to another provider can all generate unexpected costs.


FOR YEARS, A NOTION has existed that going the cheap and easy route with cloud-based storage was a no-brainer for an organization because it could see immediate and sometimes significant cost savings. While this was often true in the short term, many organizations found that over time they actually experienced complications and costs over the long term that they didn't see coming.

Using cloud storage should not be about cost, says Ashar Baig, Analyst Connection president, principal analyst, and advisor. "It's not a definite that if you go to cloud you're going to save money. That should be the No. 1 rationale of what you should not do when looking at cloud storage."

Today, in fact, many IT departments insist they can deliver internal storage at an equal or cheaper price than a cloud provider. Even so, there are still several viable reasons for using cloud-based storage, even if those reasons are not necessarily tied to cost savings. What's key is knowing what an organization should ask of itself and of potential cloud storage providers to determine how the organization can benefit and ensure it gets what it requires from a provider. The following explores these issues and others in order to forge a strategy that makes getting a payoff from cloud storage possible.

Cheap & Easy


Increasingly, organizations are realizing that cloud storage and cloud computing in general are about more than
just achieving cost reductions. As Henry
Baltazar, Forrester Research senior analyst, puts it, the real power of the cloud is
elasticity, or the ability to scale up and
scale down resources on demand. So,
even though IT might provision internal
storage more affordably than a cloud provider, it generally cant do so with near the

I believe that the transparency or abstraction of assets


held in the cloud can and often will lead to a sort of out of
sight, out of mind attitude among IT workers that will be
painful for many organizations.
CHARLES KING
President
Pund-IT

Executives must ask stakeholders what theyre specifically


looking for with cloud storage, be it performance, obtaining new abilities, etc. It doesnt make sense to launch a
cloud program without knowing what stakeholders actually want.
HENRY BALTAZAR
Senior Analyst
Forrester Research

quickness or ease. Further, provisioning


storage in-house means the organization
is stuck with that infrastructure and associated resources potentially for years,
possibly long beyond its really needed any
longer. This makes cloud attractive for
bursty or seasonal workloads for test/development environments, Baltazar says.
Charles King, Pund-IT president, says
with IT in general, a short-term, cheap
and easy approach usually leads to longterm costs and complexity. The cloud
is no different and, in fact, can lead to
further complications because data and
processes occur beyond direct oversight,
he says. Elsewhere, King says, generational
issues pose a constant challenge for IT in
terms of how management and staff turnover impact efficiency and a full understanding of assets and processes. I believe
that the transparency or abstraction of
assets held in the cloud can and often will
lead to a sort of out of sight, out of mind
attitude among IT workers that will be
painful for many organizations, he says.
Baig estimates he speaks with more
than 200 CIOs annually. Where cloud
storage is specifically concerned, he says,
the perspective you get from them is

totally different from any forecast or


survey youre going to read thats associated with IT and IT directors who dont
have visibility into the big picture.
Specifically, Baig says, while its possible for IT to deliver on-premises storage
below the cost of cloud if done efficiently,
cost benefits arent the primary goal when
CIOs choose cloud storage. Its always, always the agility. Thats because hardware
procurement-to-deployment times can
run into the months before completed,
he says. Conversely, procuring needed
storage via the cloud can literally take
minutes. I live and breathe cloud storage
every day, Baig says. To me, thats the
biggest misconceptionthat cloud is all
about economics. Cloud is actually all
about agility.

The Payoff

While some organizations use cloud providers for primary storage needs, various data and experts point to cloud storage as a good option for disaster recovery, backup, and archiving purposes. In other words, cloud storage is good for data that needs to be accessed easily but infrequently, if at all. King says cloud is also seen as offering the means to efficiently and cost-effectively address continuing storage growth and complexity, especially for ongoing problems such as backing up employee PCs and mobile devices.

Ensuring that an organization sees a long-term payoff from cloud storage can depend on several things, including the use case in question. When archiving, for example, it makes sense to analyze how often the company will need to access archived data, because there are bandwidth charges and longer access times for retrieving content from a cloud, Baltazar says. If looking at primary storage use cases, such as databases and other transaction-sensitive applications, expect to be charged for high-performance resources differently.

King says primarily, organizations should understand why they're contemplating cloud storage in the first place. This includes asking questions around data and/or processes, prices, and potential benefits. You should not proceed if you have any unanswered questions or doubts about those points, he says. How these points map against potential cloud storage providers/services should enable determining whether to proceed. King says such an approach is especially wise with regard to costs, because some cloud storage fees aren't always entirely clear. For example, some cloud storage archiving services offer low costs to upload data but charge considerably more to retrieve it, he says.
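To see how that asymmetry plays out, consider a worked example with hypothetical prices (ours, not any particular provider's rates): archiving 10TB (10,000GB) at $0.004 per GB per month costs about $40 a month, or roughly $480 a year, but if the same service charges $0.05 per GB for retrieval, a single full restore of that archive runs about $500, more than a year of storage fees. Retrieval frequency, in other words, belongs at the center of any cloud archiving cost analysis.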
Baltazar emphasizes that executives must ask stakeholders what they're specifically looking for with cloud storage, be it performance, obtaining new abilities, etc. "It doesn't make sense to launch a cloud program without knowing what stakeholders actually want," he says.

Among the finer details that a company should verify in a potential provider is the type of SLAs (service-level agreements) it offers and its reputation for reliability, because outages are likely to occur occasionally. Here, Baltazar says, as cloud strategies evolve and organizations use multiple clouds for their


workloads, it will be easier to compensate for a cloud outage.

Security is another factor that can differentiate one provider from another. Many companies never consider the cloud for storing critical business data due to security concerns, King says. While most objections against cloud storage continue to focus on security and compliance concerns, Baltazar says, these are gradually declining. Similarly, Baig says security is improving every year, but if security is a paramount concern, cloud storage may not be the right choice. Additionally, he says, the cloud might not be a good choice for companies with strong privacy and compliance concerns due to the often multi-tenant nature of the cloud.

Other issues companies should check out include the possibility of being locked in with a provider, because migrating to another provider's cloud is generally difficult. King says it's wise at the outset to consider what complexity and costs will be involved in ending a relationship with a vendor.

Overall, Baig says, the nature of the applications the organization is planning for cloud storage (whether they're bursting, consistent, etc.) and what the organization's internal capabilities are must come into play to choose the right provider. "What capabilities are you looking for? Static storage? On demand? Is the storage for real-time applications? How often are you going to access the data?" he says. Baig also notes that cloud storage isn't one offering. A major cloud service provider, for example, might offer scores of storage options, each with different pricing (dedicated, reserved, bidding, etc.) and SLAs associated with them (the shorter the SLA, the more you'll pay), Baig says.

Control Costs

A notable problem with cloud storage providers, Baig says, is that people "don't realize there are so many complex pricing models out there," everything from reserved instances to bidding on cloud storage on platforms. While various material is available to help explain how to intelligently buy cloud storage and the different models, there can still be considerable complexity involved. "It's not plain or black and white that you're necessarily going to save money," he says.

"A notable problem with cloud storage providers is that people don't realize there are so many complex pricing models out there. Everything from reserved instances to even bidding on cloud storage on platforms. . . . It's not plain or black and white that you're necessarily going to save money."
ASHAR BAIG
President, Principal Analyst & Advisor
Analyst Connection

Unforeseen costs can be a major issue with cloud storage. This can include costs related to bandwidth use and migrating data to another cloud. Rogue IT is another example. Baig says an average company may use seven or eight cloud storage options, with IT typically having no knowledge of who is using what. "The biggest challenge every CIO that I talk to has is the control of rogue IT," he says. "They say, 'We're trying to mandate to every department in the company that, yes, cloud storage is great. As long as you have the boss's credit card you can get cloud storage, but you have to go through us so we can get a proper ROI analysis.'"

If this doesn't occur, however, employees lacking proper technical knowledge may select expensive storage options. Even if they actually need that option at the time, they may not opt for a less expensive storage model when that need no longer exists. Choosing the right model isn't simple, and those wrong decisions "can cost you a lot of money down the road," Baig says.

In terms of determining how cloud storage needs will change in the future and how this will influence choices, meanwhile, there are a couple of factors to keep in mind. For example, an organization might transfer a large chunk of data to the provider initially in a backup or archive scenario. This will probably entail using a different transfer method (hard disks, for example) and pricing than what's required later when just making changes to data.

King says the first key consideration is knowing that on average, organizations are doubling the amount of storage they use every three to four years. At the same time, IT departments should have records, or at least a sense, of how much storage is being consumed and added annually, he says. Factors such as what processes and data the organization is targeting for cloud support can help refine these estimates.
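That doubling figure translates into annual terms easily enough (our arithmetic, not King's): doubling every four years works out to about 19% growth per year (2^(1/4) ≈ 1.19), and doubling every three years to about 26% (2^(1/3) ≈ 1.26), rates an IT department can plug directly into its capacity and budget planning.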
Another key is knowing that IT shouldn't be the only one responsible for anticipating future application or data usage; this must come from LOB (line of business), Baig says. A marketing department, for example, will need one cloud model early on in a campaign, when it will be frequently accessing data, but a different model when the campaign is over and it will access data less often.

"You have to have the right stakeholders or application owners as part of the decision-making process when you're setting SLAs and choosing different cloud storage options. You have to build a plan," Baig explains. When regulations are in play, companies have their own compliance and mandates to conform to, and all those become the feedback into the data-retention period.

The Power Of The Cloud Made Simple & Affordable

1&1 INTERNET TAILORED ITS CLOUD SERVERS FOR SMBs

1&1 INTERNET has, since its founding in


1988, dedicated itself to making it easy
for businesses and individuals to establish an online presence and monetize the
Web. One of its newest product lines, 1&1
Cloud Server, builds on that tradition by
providing SMBs (small and midsize businesses) with on-demand cloud servers that
are simple to use, inexpensive, and scalable. Javier Salcedo Gadea, 1&1 Internets
Head of Cloud and Servers, spoke with
CyberTrend to discuss the 1&1 Cloud
Server lines capabilities and benefits.

Identifying A Cloud Market Gap


In looking at the cloud computing landscape, it became surprisingly clear to 1&1
that SMBs cloud needs were not being met
by existing cloud providers. We saw two
different kinds of cloud providers, Gadea
says: too simple and too huge. There are
very simple cloud offerings for very cheap,
which is fine for simple use cases but arent
flexible and powerful. And on the other
side there is a mega-cloud market, where
giant companiesAmazon, Microsoft
Azure, Rackspaceoffer super-powerful
products with many capabilities, but which
are complex. To really use it well you need
to be an IT expert.
This recognition produced the spark
behind the 1&1 Cloud Server line. We
saw a very good marketing opportunity
to address the needs of those intermediate
companiesthe SMBs, developers, valueadded resellers, and othersthat want the
powerful capabilities of the cloud but in a

form that is easy to use and very affordable. So we are addressing that mid-segment, combining the best of both worlds.

Not One-Size-Fits-All

The ideal cloud solution for most SMBs, then, is one that provides ample capabilities (but not more than are necessary, and not so complex as to necessitate IT assistance), flexibility (for example, no vendor lock-in), and speed of deployment. Gadea says 1&1 Cloud Server delivers on these requirements, and includes multiple options for different sizes of companies and projects. "There are eight options available," he says, ranging from an entry-level option for not very complex websites to a high-end option, where you may have a database scenario with a need for at least 8GB of RAM, as well as a customizable Flex option.

Peak Performance & Security


In order to provide more than convenience, simplicity, and a good price, 1&1
endowed its cloud servers with solid-state
drives for faster performance, and uses
multiple security measures to ensure data
protection. We have in place an antiDDoS [distributed denial of service attack]
solution, a security event management
solution with an IDS [intrusion defense
system] and IPS [intrusion prevention
system] powered by McAfee, and firewalls
by CheckPoint, Gadea says. And on the
virtualization layer we are using VMware
as well as very specific security plug-ins to

JAVIER SALCEDO GADEA


Head of Cloud and Servers
1&1 Internet

prevent anyone from accessing customer


data. In terms of overall performance, a
third-party benchmark company, Cloud
Spectator, found 1&1 Cloud Server outmatched Amazon, Aruba, Azure, and
CloudSigma, taking into account processor, memory bandwidth, network
storage, and internal network features.

The Cloud Done Right For SMBs

The chief benefit of 1&1 Cloud Server, Gadea says, is that "you get all of the things you would expect in the cloud, all of the features that generate higher business activity, but at the best price on the market." Contact 1&1 Internet or visit the company website for further information about usage and pricing.

1&1 Internet | 877.461.2631 | www.1and1.com


What To Do About Leaky Apps


EMPLOYEE EDUCATION, STRONG POLICIES & SECURITY-INFUSED PROCESSES

BUSINESSES AND consumers have so much to worry about already when it comes to security that other areas can get overlooked. And it's those areas that hackers tend to gravitate toward, thriving on the lack of attention and innovating the most. Consider, for instance, that people are using their mobile devices more than ever before, and there are so many applications available through various app stores and from other sources that it's difficult to stay on top of what's new, let alone what's good and what isn't. Hackers understand this and can therefore design applications that appear harmless, or that seem to serve a benign purpose, but which include viruses, Trojan programs, and other malware that can harm the device itself or the corporate network the device connects to.

But these specifically designed applications are not the only ones potentially


putting you at risk. There are many leaky apps out there where you enter personal information thinking it's safe, only to find out (or not find out) that some other party has access to it. This happens regularly with mobile games that share information with advertisers, but recent reports indicate that the National Security Agency and other organizations have access to that information as well. In order to avoid these leaky apps and other app-focused threats, companies should combine policy, technology, and education to protect not only employees, but the business as a whole.

What Leaky Apps Are & Why They're Dangerous

According to Doug Cahill, senior analyst at Enterprise Strategy Group, there are typically two different types of leaky apps. There are those where the app truly leaks information, with data getting exfiltrated out of the environment, he says, and then there are other applications where a third-party entity is stealing your login credentials, whether they are stored on the mobile device itself or not. Cahill explains that these types of applications typically come from an untrustworthy source and not one of the official app stores, such as iTunes or Google Play, which are pretty well vetted by Apple and Google.

"The problem comes in if you get baited into downloading an application from a third-party source that's untrusted," says Cahill. "Those applications will masquerade as legitimate applications. There's one out there that's a wallpaper application. All of these mobile applications have manifests associated with them, which are basically lists of what services on the smartphone they need to access. If you think about it, a wallpaper application should have no need to access your log-in credentials, email credentials, or your contact information."
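To make the manifest point concrete, here is a hypothetical excerpt from an Android app manifest (AndroidManifest.xml). The permission names are real Android permissions, but the app and its package name are invented for illustration; a wallpaper app that requests the permissions flagged below should raise suspicion:

    <manifest xmlns:android="http://schemas.android.com/apk/res/android"
        package="com.example.wallpaper">  <!-- hypothetical package name -->
        <!-- Reasonable for a wallpaper app: -->
        <uses-permission android:name="android.permission.SET_WALLPAPER" />
        <!-- Red flags: a wallpaper app has no business requesting these -->
        <uses-permission android:name="android.permission.READ_CONTACTS" />
        <uses-permission android:name="android.permission.GET_ACCOUNTS" />
        <uses-permission android:name="android.permission.READ_SMS" />
    </manifest>

Reviewing this permission list before installing, or letting a mobile management tool do it automatically, is exactly the kind of check that catches an over-privileged app.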
These types of applications can usually be avoided by sticking to trusted app stores, but there are other applications out there that are much more sinister and can cause even more damage. As an example, Cahill points out that a common security best practice today is two-factor authentication, which involves a username and password but also another form of verification, such as a code that is sent to a smartphone. He says there's a recent trend where hackers install a Trojan program on your smartphone and on your desktop, so they can grab all of the necessary credentials in one fell swoop and gain access to a consumer's finances or, if the device is used by an employee, a business's internal systems.
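To see why that Trojan pairing works, consider how the smartphone-delivered second factor is typically generated. The minimal RFC 6238-style sketch below (Python; an illustration, not any vendor's implementation) shows that the one-time code is derived from a shared secret and the current 30-second window, so an attacker who captures the code in the moment, along with the stolen username and password, has everything needed to log in before the code expires.

    import hmac, hashlib, struct, time

    def totp(secret: bytes, interval: int = 30, digits: int = 6) -> str:
        """Generate a time-based one-time code: HMAC the current
        30-second counter with the shared secret, then truncate."""
        counter = struct.pack(">Q", int(time.time()) // interval)
        mac = hmac.new(secret, counter, hashlib.sha1).digest()
        offset = mac[-1] & 0x0F
        number = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(number % 10 ** digits).zfill(digits)

    print(totp(b"shared-secret"))  # e.g., '492039'; changes every 30 seconds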

How To Prevent Problems


& Mitigate Risks
Regardless of whether you're dealing with apps that are directly leaking information or ones that put your login credentials at risk, the first and perhaps most obvious step to avoiding such apps is to only download and install applications from official, trusted app stores, or ones that your company allows. Cahill points out that this is often easier for businesses that actually issue devices to their employees, because they inherently have a lot more control in maintaining a standard configuration of those devices. But if it's a BYOD (bring your own device) situation, then the problem is often a little more difficult. Still, "the first step there is clearly stating policies to end users on what they're allowed to run on those devices," he says.
The next step would be to implement some kind of software reputation service for mobile devices, such
as Appthority. These types of services
integrate with mobile device management [MDM] and mobile application

"Good policies, processes, and methodologies can really close gaps and reduce the attack surface area. It's a problem when companies are fast and loose, don't have an inventory of what apps are running on smartphones, and are doing software development without tight controls."
DOUG CAHILL
Senior Analyst
Enterprise Strategy Group

management [MAM] platforms "so you can get an inventory of all the applications running on your end users' mobile device and then vet the reputation of that software against a list of mobile app software," Cahill says. You're essentially running the inventory of apps on an employee's mobile device against a cloud-based list of approved and trusted applications to make sure none of them stand out as unauthorized or potentially malicious.
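Stripped to its essence, the check such a reputation service performs looks like the sketch below (Python; the app identifiers and trusted list are invented for illustration, and a real service would consult a continuously updated cloud database rather than a hard-coded set).

    # Stand-in for a cloud-hosted list of vetted, trusted applications.
    TRUSTED_APPS = {"com.example.mail", "com.example.calendar"}

    def vet_inventory(installed_apps):
        """Report every installed app missing from the trusted list."""
        return [app for app in installed_apps if app not in TRUSTED_APPS]

    inventory = ["com.example.mail", "net.unknown.wallpaper"]
    print(vet_inventory(inventory))  # -> ['net.unknown.wallpaper']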
Mobile devices are one thing, but it's also easy to download malicious software on desktops if employees aren't careful. For those situations, Cahill recommends taking a white list approach. Security suites and solutions that offer white list tools essentially allow you to approve certain software and only allow downloads and installations of applications on that list. "Instead of allowing everything to come in and then running a scanner to see what is known to be bad, they only allow what's known to be good to be installed and executed," says Cahill. "That's a really effective approach."
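The core of that default-deny idea fits in a few lines. In the Python sketch below, the approved-hash list and the explicit check are illustrative assumptions; commercial suites enforce this at the operating system level rather than asking applications to call a function.

    import hashlib

    # Hypothetical allowlist: SHA-256 hashes of approved installers.
    APPROVED_HASHES = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def may_install(path: str) -> bool:
        """Permit installation only if the file's hash is pre-approved."""
        with open(path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        return digest in APPROVED_HASHES

Everything known to be good is enumerated up front; anything else, including brand new malware no scanner has yet seen, is rejected by default.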
In addition to using white list solutions, there are also products, similar to the mobile device offerings, that will vet software for desktops or any other platform. "There are companies now like Veracode that will verify that a piece of software has been written with good security methodology, so basically incorporating good security practices [similar to how it's done with] software development," says Cahill. "It's an extra layer of protection to ensure that a piece of software has had security in mind from the very beginning."

Improve Your Internal


Development Process
The same idea of vetting software and taking a security-focused approach also extends to internal application and software development. Cahill says that most companies are caught in a cycle of continuous everything, where they are continuously developing, integrating, and delivering software applications at a fast pace. He adds that with cloud-delivery models becoming more popular, it's easier than ever to publish a piece of software without putting proper checks in place. In these hurried development processes, there is a higher chance of human error and a chance that security vulnerabilities will find their way into code.
The key is catching these vulnerabilities before the software is delivered and preventing unnecessary risks, which is another area where software vetting companies come into play. "What companies can do is submit their software to a cloud-based application security testing platform and get the security of that software vetted before they push to production to make sure they're not baking in any vulnerabilities that can get exploited in the future," says Cahill. But he also stresses the need to have good policies, processes, and methodologies in place, as well as to make security a high priority throughout the entire development process.


Mobile Security Best Practices


MOBILE DEVICES ARE TARGETS & ENTRY POINTS IN TODAYS THREAT LANDSCAPE

MOBILE SECURITY is more complicated


than ever, not because smartphones
and tablets are any less secure, but because many consumers and employees
have multiple devices that connect to
and communicate with each other on a
nearly constant basis. This proliferation
of devices and always-on usage provides hackers with more targets and
points of entry when attempting to steal
data or gain access to a corporate network. Many of these attacks are tied
to phishing schemes, which is why it's
important to focus on both device protection and security awareness.

Mobile Devices As Targets


Although it's not necessarily as prevalent as other types of threats, there are still situations where a user's mobile device is the main target in an attack. For example, there are certain mobile apps floating around the Internet that, once installed, will try to access a user's contact information and other data stored on the phone, says Doug Cahill, senior analyst at Enterprise Strategy Group. There are also mobile viruses that will steal photos from devices, and GPS-based mobile malware that can be used to track a user's location without his knowledge.
Cahill points out that Android is still more vulnerable than iOS because of its open operating system and open application store, so the greater the prevalence of Android devices in an organization relative to iOS, the greater the need for security controls on those mobile devices. This difference has long been acknowledged in the mobile world, so one might think the situation has improved, but there is still evidence that Android remains a major target for attackers.
For example, in a recent IDC research report entitled "Five Key Trends for Mobile Security in 2015," Stacy K. Crook and Charles J. Kolodgy write about a threat called BadNews, which is a malicious advertising network that essentially advertises malware through the use of legitimate applications. According to the report and Google Play statistics, there were 32 apps in Google Play tied to BadNews that were downloaded between two million and nine million times. The primary aim of BadNews was to either gather users' phone numbers and device IDs or to point users toward other monetized malware applications. And because the apps hadn't shown any previous signs of malicious behavior, they were able to pass a vetting process without any trouble.

Mobile Devices As Entry Points


As if attacks focused specifically on mobile devices weren't enough, the much more common type of mobile threat is one in which a hacker uses a smartphone and log-in credentials as a way to access Web-based applications, banking apps, or even the entire corporate network. Cahill explains that most users don't store sensitive data on their phones, but rather use them as portals for accessing company data and exchanging potentially sensitive information. This creates an opportunity for spear phishing attacks.
spear phishing attacks.
In a spear phishing attack, a hacker may use information from a public Web page (such as a user's LinkedIn or Twitter page) to craft an email and send it to the person's desktop. Once the recipient clicks a link within the email, a malicious application downloads automatically and surreptitiously. An application such as this might then steal the user's login credentials or other data. Cahill says that spear phishing attacks are highly targeted at individuals, using personal information to make the messages more believable.
So what does spear phishing have to do with mobile devices? If, for example, a Web-based application uses two-factor authentication as a security measure, the app might send a code to the user's phone after the standard username and password are entered, ostensibly to provide more secure access control for the app. If, however, a hacker is able to have both a keystroke logger running on a desktop and an app that sniffs out a user's two-factor authentication code, then that hacker can gain access to that application. This is relatively rare, but it can happen. For consumers, the result could be that a hacker gains full access to a bank account. And for enterprises, this could result in an outsider being able to access customer information or gain access to the corporate network.

Employee Education
& Awareness
The threats facing mobile devices are certainly worrisome, but before you start throwing out all of your smartphones and tablets, it's important to remember that there are ways to prevent these attacks from happening, or at least to give your employees more defensive tactics for thwarting attacks. The first step is to improve overall employee education and awareness. In the IDC report mentioned earlier, Crook and Kolodgy point out that many users see mobile devices as more secure than desktops or laptops, and they chalk that up to employees simply not being educated enough on what threats are out there as well as what threats are unique to mobile devices.

ORGANIZATIONS SHOULD STRIVE TO HAVE A HOLISTIC ENDPOINT SECURITY POLICY AND APPROACH THAT INCLUDES MOBILE DEVICES, SAYS ENTERPRISE STRATEGY GROUP'S DOUG CAHILL.
That's one reason why it's so important to develop employee training courses in which employees can learn about spear phishing schemes (how they are perpetrated and how they can be avoided) and to put policies in place regarding access to certain types of applications. With education programs and policies in place, it's then time to test employee knowledge and make sure employees actively use their training.
"The other thing companies can do is [initiate] a fictitious spear phishing attack," says Cahill. "You can have a third party come in and launch what's called a red team exercise, which is to test really how effective your education has been to your end users to not fall for clicking on a link. That's a really good practice for companies to do regularly." You need to do that on a regular basis, Cahill adds, because new vulnerabilities will pop up. "You have new people that join the company, you deploy new software, you change software or networking configurations, and new vulnerabilities will pop up."

Continuously Monitor Devices


& Applications
In addition to implementing strong mobile security policies and educating employees, Cahill recommends that companies take inventory of all of the devices and applications being used in the workplace and monitor them to make sure there's no malicious activity. He says there are network access control products on the market that will keep track of not only traditional endpoints, such as desktops or laptops, but also the vast array of mobile devices available to consumers and employees today. And with the help of MDM (mobile device management) and MAM (mobile application management) solutions, you can also get an inventory of every application used throughout your organization and vet the apps against a reputation database.
When it comes to the monitoring side of things, Cahill recommends putting in place a continuous monitoring solution that establishes a baseline and then detects any anomalous behavior. In the beginning, this may lead to a lot of false positives, but he says that vendors are working on introducing data science and analytics to cybersecurity as a way to pinpoint truly anomalous behavior.
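The baseline-then-detect idea can be illustrated with a toy example. The Python sketch below uses a simple standard-deviation rule, far cruder than the data science techniques vendors are introducing, and it also shows why early deployments produce false positives: any legitimate but unusual spike trips the same test.

    from statistics import mean, stdev

    def is_anomalous(baseline, value, threshold=3.0):
        """Flag a reading more than `threshold` standard deviations
        from the baseline established during normal operation."""
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma == 0:
            return value != mu
        return abs(value - mu) / sigma > threshold

    logins_per_hour = [4, 6, 5, 7, 5, 6, 4, 5]  # observed baseline
    print(is_anomalous(logins_per_hour, 48))    # -> True: investigate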
"The good news here is that everything I've mentioned are all things customers can do today, and then there is a lot of innovation by cybersecurity vendors around using data science to detect anomalous activity," says Cahill. "It's all about defense in depth at the end of the day. Security is layers."


Fostering Better Collaboration


FORGET UNIFORMITY & EMBRACE AN AD HOC APPROACH

THERE USED TO be a time when enterprises used collaboration management solutions and productivity suites to connect employees and empower them to do their jobs more efficiently. But now, with so many individual productivity and collaboration applications available for PCs and mobile devices, many employees are going outside the box, so to speak, to find the tools that work best for them or their team. And while this approach may seem to be the antithesis of what enterprises and IT teams are trying to accomplish, it's important not to discourage or prohibit these ad hoc solutions, but rather figure out how to incorporate them and use them alongside the systems you already have in place.

Employees Bring Their Own


Vanessa Thompson, research director at IDC, says that vendors such as Novell and IBM used to offer integrated, organization-wide collaboration solutions with tools like email, application development, tasks and scheduling, and more. But where there used to be large collaboration and productivity suites, "there are now just key apps that support messaging and are primarily about connecting people to people and to information," she says. And while there are still those types of suites out there for productivity, such as Office 365 or Google For Work, Thompson says that on the collaboration side, it's becoming much more of an ad hoc approach, and a larger collaboration management platform isn't as necessary as it once was.
"What's happened over time is that we've had more and more automation in the business systems that we use, so we actually don't need as much help to go through those structured business processes to get things done anymore," says Thompson. "Now, all we need is some support to manage the exceptions and ad hoc things that show up. That's why it's not as important to have those big solutions anymore."

Usage Of Large-Scale Solutions


Is Difficult To Maintain
Another barrier to using large-scale, company-wide productivity or collaboration solutions is that it's difficult to maintain usage across all departments. This is partly due to the aforementioned issue of employees wanting to use their own applications, but it also has to do with the fact that different departments aren't always incentivized to collaborate with each other, regardless of whether there's a solid collaboration platform in place or not.

"If you're in sales, for example, say you have a client that wants to sign a purchase order [PO]," says Thompson. "Before I get the PO to the client, I have to go ask finance how much money I need for it, legal to sign off on it for me if it's big enough, and my manager to sign off on it for me too. I've already involved two different business departments and they don't get anything for giving me information. They don't get incentivized on closing a deal. The salesperson gets a bonus, but they don't get anything, so there's actually no incentive for anyone in any other business unit to contribute."
In addition to the lack of incentives and people using their own applications, there's also the major challenge of finding one large solution that will fit the needs of every individual department. The job-related tasks of a sales and marketing person are very different from those of a customer service representative or HR employee. That's why ad hoc solutions are so prevalent in businesses today. If the overarching platform the company offers doesn't work for a given department, then that department will go out and find one that does.

Finding People & Data


Is Still A Major Challenge
Even if your office environment
is relatively harmonious and multiple departments actively collaborate with one another, there remains
the challenge of helping people find
each other or the data they need to
solve problems. Thompson performs
a yearly survey of executive-level decision-makers, and this year she focused on sales, marketing, HR, service
and support, and IT to get a mix of
contributors. When she asked what
the biggest challenge was for businesses today, the most common response was finding experts.

"On the enterprise decision-maker side, it becomes a question of, 'Can I give my employees something that's good enough to help them with those things they need to get done, and are we open and flexible enough to let them choose other [solutions] that will support them and help them manage that a little bit easier for themselves?'"
VANESSA THOMPSON
Research Director
IDC

In fact, her survey found that people typically spend eight hours a week searching for the people they need and an additional seven hours a week looking for the information they need. That means the average employee may spend 15 hours of a 40-hour week, nearly two out of five business days, searching for experts or data rather than actually getting any work done. And this applies to anything from reaching out to an IT employee to help fix a problem with a specific system to finding a data analytics employee who can pull specific figures for a project or report. What this means is that even though employees are finding their own productivity and collaboration applications, communication is still a major issue that companies need to address.

What Can You Do?


When approaching any problem, and especially one that has to do with collaboration, the key is to listen to your employees about their issues and challenges. Learn about the applications and platforms they like to use for productivity and collaboration, and ask for their opinions on the company-wide system you offer, if you have one. This dialogue is critical because instead of blocking every application your employees like to use and instituting a mandatory corporate system, you can instead strike a balance between the two and still leverage your existing investments.

"An organization, even if they put in a big unified communications system or have one of these larger productivity suites like Office 365, Exchange, or Google For Work, whatever they have, they probably still need some combination of additional tools to support their specific industry or different sets of users," says Thompson. Those suites and large UC systems are not going to meet all of your needs, and it's OK for organizations to fill in the gaps with these other smaller standalone solutions, because they need to help their users help themselves.
Thompson recommends that companies stop limiting their focus to collaboration, productivity, or project management, and instead go broader and focus on how users and teams manage work. She points out that vendors like Clarizen and Redbooth offer collaboration and project management suites designed to not only make projects go more smoothly, but also add a layer of social task management to make sure that employees can communicate well and work together. "Teams and business units just need some accountability and transparency around deliverables and a way to manage content," says Thompson. "They don't need a big solution for that. They need something agile and flexible." And in the end, it's going to take a combination of larger solutions and ad hoc applications to get the job done.


Fast Data Solutions


INITIATIVES THAT EMPHASIZE SPEED & REAL-TIME ANALYTICS

KEY POINTS
• Fast data takes the velocity part of big data and makes speed to capture and speed to insight its primary focus.
• Many vendors already offer fast data solutions, which can handle the streaming data capture and real-time analytics needed for a fast data project.
• Fast data can be used for sales and e-commerce, but it also fits into manufacturing, security, and many other areas.
• Fast data solutions will one day be able to perform data processing and analytics at the same time and in the same system.


MOST ORGANIZATIONS are aware of big data and how it's primarily focused on helping businesses process and analyze massive amounts of information in order to gain insights, but there is also a relatively new concept called "fast data" that's designed to handle streaming data and real-time analytics processes. Fast data isn't necessarily separate from big data, however; in fact, if you think about big data in terms of the three Vs (volume, velocity, and variety) typically used to define it, you can pick out velocity as the area where fast data fits.
"Every time we click a link, query a system, watch an online video, make a purchase in a shop; and every second that our cars, factories, airplanes and other systems operate; these actions generate massive data streams, which gush out at gigabytes per second," says Ganapathy Subramanian, vice president, Big Data & Analytics at Infosys. "This streaming data that must be processed in real-time as it comes in is called fast data."
In the ebook "Fast Data and the New Enterprise Data Architecture," Scott Jarr, VoltDB co-founder and chief strategy officer, writes that "data is fast before it is big." He points out that we are currently shifting into a new era where streaming data, or data in motion, can sometimes eclipse the amount of historical data. And it's because of this shift that companies are looking at ways to incorporate faster processes and analytics into their current big data approaches in an effort to keep up with the data in real-time and get those all-important actionable insights.
However, streaming data is only one part of the overall fast data definition. In fact, in addition to how quickly data is coming into your system, the "fast" in fast data can also apply to how fresh and up-to-date the data is when you're ready to do analytics, or how fast you can actually view
"Business users are no longer willing to wait for IT to give them answers in a timeframe of days or weeks. They want to ask a question, get an answer, and then ask the next question from the system, all without being constrained to a pre-determined set of questions. Fast data can not only accelerate or even automate decision making, but also empower users and create competitive differentiation in the way they understand and run the business."
GANAPATHY SUBRAMANIAN
Vice President, Big Data & Analytics
Infosys

those analytics, says Nik Rouda, senior analyst at ESG (The Enterprise Strategy Group). It's not necessarily just how fast the data is moving and how quickly you can capture it, but also how long it takes for you to analyze the data and actually put it to work.

What Fast Data Solutions


Should Do
Subramanian says the goal of any big data solution should be to reduce the gap between the time of transaction and the generation of insights. When looking for a solution, you need to focus on rapid capture and rapid insights. "Rapid capture is about ingesting data into a processing system as the data gets generated at the source," says Subramanian. "So, if a sensor fitted to certain machinery is generating data rapidly, a fast data solution should be able to ingest this data at a comparable rate." Rapid insights means being able to quickly transform or massage the captured data by applying the requisite algorithms to generate valuable insights or foresight out of that data.
To achieve this goal, you need to have a system that enables fast ingest, fast preparation, fast processing, and fast user reporting or output, Rouda says. The challenge here is that not all big data or analytics products are capable of handling every step in the chain, which means that as information is passed from one system to another, there is the potential for slowdowns or bottlenecks along the way.
Fortunately, there are a few vendors, with more on the way, that offer systems specifically designed for supporting fast data projects. Subramanian mentions some open-source options, such as Apache Kafka and Storm, or a combination of Hadoop with Spark, that can process the data in near real-time and in huge volumes. But there are also major vendors in the marketplace that offer fast data products.
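As a rough illustration of the capture side, the sketch below uses the open-source kafka-python client to consume events one at a time as they arrive, instead of loading them in batches; the broker address, topic name, and scoring stub are assumptions for the example.

    import json
    from kafka import KafkaConsumer  # pip install kafka-python

    def score(event):
        # Stand-in for the real-time logic applied to each event.
        print("processing", event)

    consumer = KafkaConsumer(
        "sensor-events",                     # hypothetical topic
        bootstrap_servers="localhost:9092",  # hypothetical broker
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    )

    for message in consumer:  # blocks, yielding each event as it lands
        score(message.value)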
Oracle, for example, offers a fast data solution that's built on the pillars of filter and correlate, move and transform, analyze, and act. Through the use of Oracle's Event Processing, Data Integrator Enterprise Edition, Business Analytics, Real-Time Decisions, and Business Process Management products, you can put together a fast data implementation with well-integrated solutions designed to move data through at a fast pace. All of these products come from the same vendor and were built to work well together, reducing the chance for bottlenecks and other performance issues.
TIBCO is another large vendor that offers fast data solutions. TIBCO's approach is interesting because it also factors in the idea of managing threats as opposed to just analyzing data for traditional sales and marketing purposes. TIBCO points out that through the use of fast data solutions, you can track potentially malicious activity, discover patterns, and act quickly before any damage is done.

Examples Of Fast Data Use Cases


In addition to using fast data to analyze incoming data or handle threat management, there are other examples that illustrate the diversity of fast data use cases. One example that Subramanian uses is an e-commerce site. "Thousands of consumers might be viewing hundreds of images, descriptions, reviews and other data at any second," he says. "This information is streamed to a system that can process it along with other information, such as past purchase behavior by that customer or even today's weather, to provide real-time recommendations or price discounts that are completely targeted and relevant to each individual."
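The flow Subramanian describes amounts to enriching each incoming event with what is already known about the customer before responding. A toy Python sketch, with the customer profile and discount rules invented for illustration:

    # Stand-in for historical data kept alongside the stream.
    HISTORY = {"cust42": {"favorite_category": "outdoor",
                          "lifetime_value": 1800}}

    def offer_for(event):
        """Pick a targeted offer for a single click-stream event."""
        profile = HISTORY.get(event["customer"], {})
        if profile.get("lifetime_value", 0) > 1000:
            return {"discount": 0.15,
                    "category": profile["favorite_category"]}
        return {"discount": 0.05, "category": event["viewed_category"]}

    print(offer_for({"customer": "cust42", "viewed_category": "tents"}))
    # -> {'discount': 0.15, 'category': 'outdoor'}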
And while it's easy to get hung up on the potential sales and marketing aspects of fast data, there are situations where you can use it to improve a business process or manufacturing process. For example, Subramanian says you could use fast data in a factory setting or in a facility that manufactures key machinery such as aircraft landing gear. These are specific scenarios where fast data is absolutely necessary to ensure that production lines are running on time or that manufactured products are safe and working properly.
"Machine sensors are fitted to this equipment, streaming data indicating the health of the equipment and various environmental observations," says Subramanian. "There might be 200 sensors to capture various data points in complex machinery. As data inputs are captured, algorithms are applied to identify if a particular parameter captured by a sensor is at an acceptable threshold level. If it is beyond the safe level, corrective action can be taken. This can help to preemptively predict faults, saving money, time and even lives."
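The per-reading rule itself is simple to express, as the toy Python version below shows (the sensor names and safe limits are invented); what makes this a fast data problem is applying such checks at ingest time, across hundreds of sensors, rather than in a later batch job.

    # Hypothetical safe operating limits per sensor.
    SAFE_LIMIT = {"bearing_temp_c": 90.0, "vibration_mm_s": 7.1}

    def check_reading(sensor, value):
        """Return an alert string if a reading breaches its safe level."""
        limit = SAFE_LIMIT.get(sensor)
        if limit is not None and value > limit:
            return f"ALERT: {sensor} at {value} exceeds safe level {limit}"
        return None

    for sensor, value in [("bearing_temp_c", 85.2), ("bearing_temp_c", 93.7)]:
        alert = check_reading(sensor, value)
        if alert:
            print(alert)  # corrective action would be triggered here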


How Fast Data & Big Data Are


Related
It's easy to look at fast data as the velocity component of big data, but there are situations where you might use one over the other, or both at the same time, to achieve a goal. For instance, Rouda says if your only task is to create quarterly reports and present them at a board meeting, you may not need fast data as much as you would a big data solution. However, in Subramanian's earlier example of customers on an e-commerce site seeking product recommendations, it's imperative that the data capture and analysis be done as quickly as possible to return usable information to the consumer.
Another instance where big data may be a better fit than fast data is with evaluating crop yields for corn. "If you're looking at each corn seed, how you watered, how you fertilized, and weather patterns, that's a big data problem," says Rouda. "You're not going to maybe respond on it in the moment. Maybe you'll respond tomorrow or two months from now when deciding when to harvest." However, if you are looking at something like balancing power grid consumption, you need to have instantaneous analysis and adjustments of electrical transmissions, he says. Using fast data or big data in a given situation ultimately depends on the application and the goals of the project.
But just because fast data and big data can be used in different ways doesn't mean there isn't overlap. In fact, there are quite a few situations where you'll want to compare your brand new streaming data with historical data to get better context. For example, you may be getting gigabytes upon gigabytes of information from customers shopping on your e-commerce site, but it may also be helpful to take that information and compare it against historical sales data to see how much the company has grown or to make sure past initiatives are having the desired impact.


"You see all kinds of vendors putting out benchmarks about being able to read 1 million rows per second, but being able to write and read 1 million rows per second in the same system is a lot more challenging, and it's a lot more useful too. You're using the information live as it hits the system and you don't have to do a big operation to move it somewhere else for analytics. Those hybrid workloads are going to need to evolve toward faster and easier combined systems."
NIK ROUDA
Senior Analyst
Enterprise Strategy Group

Subramanian points out that for many organizations it doesn't matter to the employee or the consumer whether a project is big data-focused or fast data-focused, because they'll demand it be done as quickly as possible. And in that case, you'll be using both together to achieve certain performance goals. "[Business users] want to ask a question, get an answer, and then ask the next question from the system, all without being constrained to a pre-determined set of questions," says Subramanian. "Fast data can not only accelerate or even automate decision making, but also empower users and create competitive differentiation in the way they understand and run the business."

Where Fast Data Is Going & How


To Prepare
A key point to keep in mind about fast data is that it is a relatively new concept, even if the core ideas behind it have been around for quite some time. Companies have always looked for ways to be faster and more efficient, but only now, with the growing popularity of big data, are they actually finding ways to achieve those goals. Rouda says there are applications that can do high-speed data capture, streaming analytics, and automated response, but the important thing is to figure out how those systems fit into your specific business environment. It may require changes to business processes and other activities, which means past experience won't necessarily ensure future success.
"Maybe you've already learned a bit, but it depends what steps you took," says Rouda. "You're not necessarily ahead of the game if you've made a huge investment in technology that is more oriented toward batch processing. That said, you may have already done a lot of really good groundwork for it, and that Hadoop data lake you built could feed nicely into an Apache Spark implementation. It depends on the choices and what you've learned, but certainly if you've spent a lot of time investing in and building skills around big data, those skills will be useful, even if you have to adapt the environment a bit."
Even then, implementing fast data initiatives will be a constantly evolving process that will require education along the way. Rouda says there will be new solutions in the near future that will handle processing transactions and analytics at the same time in the same system, a major shift that may make fast data much easier to implement. As an organization, all you can do is keep up with the changing technology, determine whether it fits in with your business objectives, and then implement it in a way that makes the most sense for you.


Greenovations
ENERGY-CONSCIOUS TECH

The technologies that make our lives easier also produce some unwanted side effects on the environment. However, many researchers, manufacturers, and businesses are developing solutions that are designed to keep us productive while reducing energy demands to lessen our impact on the environment. Here's a look at some of the newest such initiatives.

Have a Tesla Model S vehicle? There are now more than 500 Supercharger stations available
worldwide, and that number could double over the next year.

Tesla Offers More Supercharger Locations, Including Manhattan


Owners of Tesla Motors' Model S cars can stop at any Supercharger location and plug in their vehicles for a quick charge. There are more than 500 Supercharger stations around the world, each with between four and 12 Superchargers, providing more current (170A) than other high-voltage outlets and public charging stations offer (typically 30A or 40A). Tesla Motors continues to add Supercharger locations worldwide, and it is now introducing Destination Chargers around New York City. "The program consists of Tesla High Power Wall Connectors installed at popular destinations, adding 58 miles of range per hour to Model S," says Tesla spokesperson Alexis Georgeson. "Until now, Tesla has mainly partnered with hotels, resorts, and restaurants to install these connectors, but in an effort to tackle a need for urban charging, we've expanded the program and partnered with public garages that offer parking both by the hour or the month." As part of the new program, commuters simply drop off their Model S with the valet at one of the many garages offering Destination Chargers, and their vehicle will be fully charged while they work. Monthly parking spots and charging are also available for city residents.

U.S. Solar Industry Reaches Capacity Milestone


According to a new Solar Energy Industries Association report, the U.S. solar industry surpassed 20 GW (gigawatts) of total operational solar PV (photovoltaic) capacity during Q2 2015, a new record. "At over 20 GW of installed solar electric capacity, we now have enough solar in the U.S. to power 4.6 million homes, reducing harmful carbon emissions by more than 25 million metric tons a year," said Rhone Resch, SEIA president and CEO, in a press release. As of 2015, 40% of all new U.S. electrical capacity coming online comes from solar. The SEIA forecasts that the U.S. solar industry will reach 7.7 GW in new PV capacity for the year, another record.


Transforming Carbon From


The Air Into Raw Materials
CNFs (carbon nanofibers) are prized for their strength and durability. Carbon fibers are used in aerospace engineering and the transportation industry, but CNFs have been notoriously complex and costly to produce. Also complex are environmental researchers' efforts to scrub the air of carbon dioxide, or extract it in order to reduce carbon-related dangers in the atmosphere. Now, however, a group of scientists at George Washington University has demonstrated a method that addresses both of these issues. Their report describes how this method, called STEP (solar thermal electrochemical process), provides an inexpensive means for extracting carbon directly from the air or from exhaust systems, and transforms that carbon into metals, fuels, and other materials. The report concludes, "a range of carbon nanostructures is attainable and future studies will probe conditions to characterize and optimize growth of these structures." The report also emphasizes that the STEP process doesn't generate additional carbon dioxide.

Typical solar panels use flat-plate PV (photovoltaic) cell systems, which are tried and true but not
as efficient as they could be. Newer technologies stand to affordably improve PV systems.

New Solar Technologies To Receive $24 Million


Innovation Funding From Government
The most commonly used solar technology involves flat-plate PV (photovoltaic) cell systems; these are used in most fixed panel systems today. CPV (concentrated PV) technologies, by contrast, offer higher performance and greater
efficiency, but current CPV technologies require large, costly tracking systems.
CPV might not stay out of reach for too much longer, however, as the U.S.
Department of Energy's ARPA-E (Advanced Research Projects Agency-Energy)
announced $24 million in funding for 11 new projects, including projects focused on making CPV technology more accessible and affordable. Funding will
come through an ARPA-E program called MOSAIC (Micro-scale Optimized
Solar-cell Arrays with Integrated Concentration), with a stated goal of lowering
costs and improving efficiency of solar systems without increasing manufacturing costs.

Scientists in the chemistry department at George Washington University have developed a way to create CNFs (carbon nanofibers) from carbon dioxide drawn from the air and from exhaust systems.

Significant Growth Expected In Li-ion Batteries For Transportation

Li-ion (lithium-ion) batteries have become ubiquitous in mobile devices,


powering everything from tiny media players to smartphones to full-size notebook computers. The expansion of the Li-ion market among these devices is
clear, but a new report from Technavio asserts that device categories are rapidly
expanding to include such things as power tools and EVs (electric vehicles).
"With the decrease in cost of Li-ion batteries, manufacturers will increasingly
be able to leverage economies of scale with growth in market size and production scales," explained Faisal Ghaus, Technavio vice president, in a press release.
The market is shifting away from conventional graphite electrodes and toward
high-energy silicon electrodes, Ghaus adds, providing longer battery life for laptops and EVs. Technavio forecasts the worldwide market in Li-ion batteries for
transportation purposes will grow at a 31.8% CAGR from 2014 to 2019.


Recover Value From IT Assets


OPTIONS FOR DEALING WITH UNWANTED HARDWARE

IT HARDWARE ASSET recovery is often overlooked, despite the fact that it represents an opportunity for enterprises to recoup value from unwanted equipment. Furthermore, poorly implemented IT asset recovery can deal an organization's reputation and bank account a crippling blow in terms of noncompliance with environmental and data-privacy regulations. Broadly, IT asset recovery, or ITAD (IT asset disposition), involves securely repurposing, donating, recycling, or destroying IT equipment. It's important to note that a company's responsibility for its equipment doesn't end after that equipment exits its doors. A hard drive containing unencrypted personal data lost while in transit to an ITAD facility, for example, can mean lawsuits and fines if data is exposed. Equipment irresponsibly tossed in a landfill can lead to the same. This article explores why IT personnel and executives should take interest in IT asset recovery.

Recover Value
Hardware typically covered under
IT asset recovery includes PCs, laptops,
servers, monitors, fax machines, copiers,
printers, smartphones, and tablets.
Increasingly, wearables and IoT (Internet
of Things)-related devices are also included. For many companies, donating
such equipment to charities, schools, etc.
is a viable disposal option with possible
tax breaks. Further, says Sandi Conrad,
Info-Tech Research Group director, many
ITAD providers will manage the process,
including properly licensing OSes, ensuring equipment works, and transferring
the equipment.
Traditionally, though, IT asset recovery has meant getting value back from unwanted equipment. This is changing as useful life spans for equipment are extending. Companies, for example, are keeping PCs and servers five or more years vs. three. Thus, ITAD providers are receiving older assets with less resale value. Conrad says some disposal companies do work to recover value beyond equipment's seven-year range, though recovering significant value is less likely if extending the process this long.
Also impacting the recovery value of
PCs/laptops currently is that more companies are replacing them with mobile devices, thus driving PC/laptop prices down
and diminishing their recovery value, says
Rob Schafer, Gartner research director.
In general, recovering value may be possible for four- and five-year-old assets if an
ITAD provider excels at extracting value
from precious metals used in them, although the precious metal market is also
declining due to manufacturers using less
of such metals.
An oddball exception to the declining
recovery trend is mobile devices, Schafer
says, primarily because end users tend to
replace them after two or three years, well


short of their actual useful life. Thus, ITAD


providers receive younger equipment.
Further, mobile devices have smaller footprints, making them more economical to
ship to where resale markets reside. Many
larger ITAD providers have integrated mobile devices into their existing documentation, dismantling, recycling, and resale
processes, Conrad says.
Given the declining recovery value for most IT hardware, Schafer says, executives expecting to receive checks rather than invoices following recovery processes should change their thinking, because disposal costs may well outweigh recovery revenue. What's more, the costs of properly performing ITAD are increasing substantially, he says.

The Associated Risks


Although seeking recovery value is understandable, executives must also remain focused on their brands where risks associated with equipment recovery and disposal are concerned. Risks cover two primary areas: data security and environmental responsibility. Failure to safeguard either area could mean negative publicity that isn't good for the enterprise, Schafer says.
Until recently, it was fairly normal for organizations to just have IT equipment picked up, after which it wound up in a landfill in a third-world country where locals dismantled it for precious metals. Such practices led to poisons leaching into water supplies and other environmental hazards. More recently, laws have been enacted to curb these practices. It still happens, Conrad says, but companies that get caught breaking laws face huge fines.
For North America, e-Stewards and
R2 (Responsible Recycling) certifications
are the predominant guidelines for recycling electronics. When seeking an ITAD
provider, look for certification with one
or both.
Where data security is concerned,
companies should validate what process a provider uses to dispose of data on


"In general, check for equipment destruction, transfer, recycling, and other certifications; how equipment is packaged, refurbished, and shipped; if equipment is sold in bulk or individually; and how equipment is cleaned, licensed, and restored to peak quality. All these little things will increase the resale value."
SANDI CONRAD
Director
Info-Tech Research Group

memory and drives taken from PCs, laptops, servers, copiers, printers, and other equipment. Traditionally, DoD 5220.22-M was the standard in this domain, Schafer says, although NIST 800-88 has largely replaced it.
Also important is verifying the chain of custody, meaning the transportation logistics a provider uses for such drives. Schafer says these areas, which represent a bulk of total disposition costs, entail securely packing and shipping assets to the provider's facility. Chain-of-custody particulars also include the encryption the enterprise uses for drives. Preferably all enterprise data is encrypted for the process, Schafer says, because monitoring which drives are encrypted is a nightmare.
Logistics details can include whether a provider seals drives at the company and performs a one-to-one serial number match at its facility, which is expensive but secure, Schafer says. The inverse occurs when, for example, a provider packs drives on a furniture truck that makes 11 stops on the way, he says. "If half the assets show up, count yourself lucky." In other words, you get what you pay for security-wise. Because some ITAD providers use third parties for transportation, organizations should ensure that transport employees have been well-vetted and that background checks have been performed.

What's In A Provider
Among the positive traits to look for in an ITAD provider is its ability to help calculate the value an organization can expect to recover from its equipment. For example, a hardware manufacturer may offer to take equipment back for free at time of disposal as part of the purchase price, but this route could mean the manufacturer, rather than the organization, profits from disposed equipment. ITAD providers can generally help determine the right time to dispose of equipment for the most value, Conrad says.
Elsewhere, organizations should visit a prospective ITAD provider's facility, viewing how it handles and dismantles equipment and noting whether it follows environmental requirements. Also ensure a provider can supply a list of disposed equipment, complete with serial numbers, that notes the state of each piece of equipment. "In general, check for equipment destruction, transfer, recycling, and other certifications; how equipment is packaged, refurbished, and shipped; if equipment is sold in bulk or individually; and how equipment is cleaned, licensed, and restored to peak quality. All these little things will increase the resale value," Conrad says.
Location-wise, Conrad says a local ITAD provider isn't necessarily a worse choice than a national one, though local providers probably use partners that must be checked, and they lack the multiple facilities a larger provider likely possesses. A national provider also means just one contact and contract to manage vs. potentially many if using numerous local providers.

The European Data Center Migration


U.S. ENTERPRISES ARE BUILDING OVERSEAS FOR DATA PRIVACY REASONS

TWO THINGS BECOME apparent when speaking with analysts familiar with the European data center market. First, more U.S. businesses have built, or announced plans to build, data centers in Europe in recent years. Second, this increase is tied to concerns that European customers have regarding their data and privacy. Events such as Edward Snowden's NSA (National Security Agency) revelations and Microsoft's current court battle involving a U.S.-based warrant demanding it give up email data located in Europe are among the catalysts.
Another factor is the GDPR (General Data Protection Regulation) that's expected to supplant the EU Data Protection Directive. Proposed in 2012, the GDPR would implement one framework for data privacy protections covering all 28 EU member states. Adoption could happen yet this year, after which EU members have two years to ratify the GDPR.

In short, European customers of U.S.-based businesses are increasingly making it clear they desire their data to remain local and expect companies to protect it according to local laws. As Carsten Casper, Gartner Europe managing vice president, privacy and digital workplace security, says, while such new rules may not necessarily appeal to U.S. businesses, "what is appealing is to do business, or not to lose business, with European partners."

Rules Of The Game


Presently, the EU Data Protection Directive determines how companies operating in the EU must handle Europeans' personal data regardless of company headquarters. Enza Iannopollo, Forrester Research researcher serving security and risk professionals, dubs the directive the highest standard for protection of personal data. Notably, she says, each EU member country has a national version of the directive. "These pieces of legislation are similar but not identical, so variations at the national level apply," Iannopollo says.
Casper says the main principles of the current rules include needing permission to store personal data, ensuring that data is correct, securing data against unauthorized access, and deleting data if the reason it was collected ceases to exist. If sharing data, the partner must follow the same rules and provide the same level of protection, he explains. Currently, data residency is the main discussion point, he says. Personal data can only leave the EU if it's adequately protected. Just what "leave the EU" means is subject to intense debate, says Casper.
In general, Casper believes the beauty of the European system is that it's fairly harmonized across all EU member states. The GDPR will further emphasize this, he says. "Unlike the U.S., where privacy laws are federal, state-specific, or industry-specific, all EU states use the same directive as the basis for their national laws," he explains.

The Data Center Trend


To date, the largest U.S.-based technology companies have data centers in Ireland, Germany, France, the Netherlands, and elsewhere in Europe. Several companies have also recently announced plans to build more. Casper says a 2014 Gartner survey found 26% of 908 respondents from the U.S., U.K., Canada, Germany, India, and Brazil started data center operations outside of the United States following Snowden's NSA revelations. Rather than build huge new data centers, however, companies are often building facilities sufficient for only hosting an additional service in Europe, or leasing extra rack space, he says.
The key requirement is to be able to say that these U.S. organizations have their data in Europe, even though most would agree that this alone doesn't prevent data access from abroad, Casper says. Notably, it's both vendors and IT departments seeking this: vendors because they're concerned about losing existing deals and winning new ones, and IT departments because they're concerned about not meeting European compliance requirements.
Iannopollo says possibly more interesting than major U.S. companies making plans to build EU-based data centers is that Forrester is seeing strong demand for EU-based data centers coming from European cloud users. Similarly, Giorgio Nebuloni, IDC associate research director, European systems and infrastructure solutions, says large, multinational U.S.-based companies are seeking to build B2B (business-to-business) and B2C (business-to-consumer) cloud services. In recent years, IDC has seen U.S. companies changing their approach to Europe as a market for cloud services, he says.

Privacy's Influence
Nebuloni says what's driving this change in approach primarily is that

"Unlike the U.S., where privacy laws are federal, statespecific, or industry-specific, all EU states use the same
directive as the basis for their national laws.
CARSTEN CASPER
Managing Vice President,
Privacy & Digital Workplace Security
Gartner Europe

European customers tend to be fairly conservative about where their data is located.
Furthermore, they want to ensure companies they interact with have at least a
subsidiary in their country.
Following the PRISM scandal, Nebuloni says, European customers uncomfortable with their data residing outside their countries pushed for local data centers. Local service providers in France and Germany have tried to get an edge on U.S. companies by emphasizing their local ties, he says. In France, for example, an association of French cloud service providers has worked to provide French-certified cloud services, Nebuloni says. "So it seems like politics is intertwining more and more with IT services and IT markets where the cloud is concerned," he says.
Steve Wilson, Constellation Research vice president and principal analyst, says "the writing has been on the wall" in Europe for some time. He points to the Safe Harbor provision, which essentially allows some U.S. businesses to escape the full weight of EU expectations, as being on borrowed time. Wilson notes many non-European countries that also have strong privacy laws want their data processing to occur in Europe vs. the U.S. Many laws in such countries take the form of, "If you export personal information from our country, you must only send it to places that have equivalent data protections," he says.
In general, Iannopollo says, EU citizens view privacy as a fundamental right and part of their culture. History has shaped this relationship with privacy, she says, making Europeans different from citizens in other geographies. As customers, Europeans are increasingly aware that companies are seeking out their data because of its value; thus they expect companies to protect their data as one would protect valuable assets, she explains.
Iannopollo says for companies dealing with EU customer data, complying with the rules means mitigating the risk of fines but, more importantly, having the opportunity to build a trustworthy relationship with their customers, a stronger reputation for their brand, and ultimately a competitive differentiator for their business.

The Future
Nebuloni says many U.S. executives he
speaks with are cognizant of the great
regulatory and psychological problems
existing in Europe, particularly executives
involved with companies that offer cloud
products. Ultimately, Nebuloni believes,
larger U.S. companies will partner with
local providers to address privacy issues.
Casper, meanwhile, says the need to operate data centers in Europe is unlikely to go away soon. "How demand will change, however, will depend on the legislative initiatives on privacy in the U.S., in the European Union, and the privacy discussions between the two parties," he says. Similarly, Iannopollo says privacy and data protection are "here to stay and to change businesses' culture and modus operandi as necessary." She expects more non-EU companies will open European data centers and comply with EU data protection rules. "Moving forward, we also expect them to truly understand and operate against the cultural background of their European customers, partners, and employees," she says.

Laptop Improvement vs. Replacement


WHEN TO UPGRADE OR REFURBISH & WHEN TO BUY NEW

MANY COMPANIES probably picture desktop systems when they think about upgrading employee computers, but it's also possible to boost the performance of company-issued laptops to give them some extra life. Not only can you upgrade various components of a laptop, but you can also clean it up and give it a refresh for future use. But just because you can upgrade or refurbish a laptop doesn't mean you should, so it's important to put a solid life cycle management plan in place to ensure your enterprise laptops aren't overstaying their welcome.

What You Can & Can't Upgrade


Before you think about whether to upgrade or replace your enterprise laptops, it's important to understand which laptop components are upgradeable and which improvements will have the biggest impact. The No. 1 upgrade you can make to boost laptop performance is to add more memory. A new laptop will come with built-in RAM, but many models let you add more. For example, you could upgrade a laptop from 4GB of RAM to 8GB, doubling the available memory; the gain is most noticeable on systems that routinely exhaust their RAM and slow down as a result.
Storage is another area where you can gain some performance ground without much trouble. However, you might need to decide whether to prioritize capacity or performance. It's possible to just upgrade the hard drive to get more capacity, but if you want to actually increase read and write speeds for better OS (operating system) and application performance, then you might want to go with a solid-state drive instead.
It may seem obvious, especially since new OS releases are quite visible and well-advertised, but you shouldn't forget the OS when performing system upgrades. Sometimes older OSes have quirks or limitations that can cause a laptop to run more poorly than it should, so upgrading to the newest version could be what the system needs to get where it needs to be.
When upgrading laptops, it's important to understand that some components either can't or shouldn't be upgraded. For example, many processors and video cards are specifically designed for the laptops they're built into, so it would be difficult, if not impossible, to find a similar component that would work. If your company's laptops are

THE NO. 1 UPGRADE YOU CAN MAKE TO BOOST LAPTOP PERFORMANCE IS TO ADD MORE MEMORY.

CyberTrend / October 2015

35

falling behind mostly due to processor


power or other similar components,
then you may be better off going with a
full replacement.
Regardless of what changes you plan
to make, whether theyre physical- or
software-based, you should keep in
mind that all upgrades add up over
time. Its important to run a cost-benefit analysis each time you consider upgrading a laptop to make sure you are
not spending more money to refresh
your current model than you would to
buy a brand new system.
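To make that cost-benefit check concrete, here is a minimal sketch in Python; every dollar figure and lifespan in it is an illustrative assumption, not a quoted price.

# Hypothetical cost-benefit check: upgrade an existing laptop vs. buy new.
# All dollar figures and lifespan estimates are illustrative assumptions.

def cost_per_year(cost, years_of_use):
    """Simple amortization: total cost divided by expected years of service."""
    return cost / years_of_use

ram_upgrade = 60          # 4GB -> 8GB module (assumed price)
ssd_upgrade = 120         # replace hard drive with an SSD (assumed price)
labor = 50                # in-house or third-party labor (assumed)
extra_years = 2           # life the upgrades are expected to add

new_laptop = 900          # replacement cost (assumed)
new_laptop_years = 4      # expected service life of the new machine

upgrade_rate = cost_per_year(ram_upgrade + ssd_upgrade + labor, extra_years)
replace_rate = cost_per_year(new_laptop, new_laptop_years)

print(f"Upgrade: ${upgrade_rate:.0f} per year of extended life")
print(f"Replace: ${replace_rate:.0f} per year of service")
print("Upgrade wins" if upgrade_rate < replace_rate else "Replacement wins")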

The Laptop Refurbishing Process

In addition to full-on upgrades, another way to refresh your laptop fleet is to refurbish each device. The actual refurbishing process depends entirely on what's wrong with the laptop or what you'd like to improve. The first step in refurbishing a laptop is typically the cleaning process. It's important to clean the laptop inside and out, either with compressed air or laptop-safe cleaning chemicals, to remove as much dust and debris as possible. Build-up of dust and other debris can impact not only the cooling of the device, but also the connections between different components.

Another step in the refurbishing process is replacing damaged or broken parts. This can range anywhere from replacing a trackpad or a few keys on the keyboard to replacing a laptop's display. You may also find during this step that you need to replace memory, hard drives, or other components as you refurbish the laptop.

In addition to physical refurbishing, it's possible to take a more virtual approach. For example, you can go through the hard drive, find as many unused files and programs as possible, and delete them to free up capacity. There are also software tools available that will help you pinpoint these files and programs, and tell you whether it's safe to delete them and what impact removal would have on the rest of the system (a minimal example of such a scan appears at the end of this article). OS upgrades are also important at this step, because you could be missing out on performance improvements that will breathe new life into your laptop.

As for who should do the laptop refurbishing, that depends on your staff and whether someone has experience refurbishing and upgrading electronics. Although it's possible to clean the outside and even a select few internal compartments of a laptop on your own with compressed air, the work becomes much more troublesome when you get into replacing parts or cleaning internal components. If you don't have the necessary expertise in-house to perform these functions, look to a third-party provider that can refurbish laptops without causing any unintended damage.

Know When It's Time To Replace

While it is possible to upgrade or refurbish your enterprise laptops, there may come a point when it simply makes more sense to buy new laptops. Your first indicator will be employee complaints. If employees are able to use their laptops without problems and perform all necessary tasks, then you may be best served by doing nothing and maintaining the status quo. If they begin to complain consistently about the performance of their laptops, about applications they can't run, or about apps running slowly, then you need to listen to that feedback and determine whether it makes sense to replace the laptops.

This process will be different for every company. You may have employees, such as graphic designers or video editors, who need the latest and greatest tools, which could require a two- or three-year replacement strategy. At the same time, you may have general office employees who can use the same laptop for six years or more with no problems and who only need a memory or hard drive upgrade here and there to support new OSes or applications. If you have quite a few employees who seem to require upgraded systems on a frequent basis and don't need to travel often, then you may want to consider giving them desktop systems, which are typically much more upgradeable than laptops and have easily swappable components.

It's always disappointing to get the "it depends" answer, especially when it comes to product life cycle management, but with laptops, that's unfortunately the case. You may get six years out of a laptop or you may get two. It ultimately depends on how the system is used. However, you can put yourself at an advantage by setting up a proper life cycle management plan and refresh cycle. For example, you may be able to cycle laptops throughout the organization so that when graphic designers get their new laptops, you pass the older ones down to the next department, and so on. Come up with a strategy that works best for your organization, upgrade and refurbish along the way if necessary, and get a feel for when it makes sense to buy new equipment.
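As a postscript to the refurbishing discussion above, here is a rough stand-in for the file-cleanup tools mentioned: a short Python script that reports the largest files not accessed in about a year. The scan root and the one-year cutoff are assumptions to adjust, and some systems do not update access times reliably; the script only reports and never deletes.

# Report the largest files under a folder that haven't been opened in ~1 year.
import os, time

ROOT = os.path.expanduser("~")        # folder to scan (adjust as needed)
CUTOFF = time.time() - 365 * 86400    # "unused" = not accessed in ~1 year

stale = []
for dirpath, _, filenames in os.walk(ROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)
        try:
            st = os.stat(path)
        except OSError:
            continue                   # skip unreadable or vanished files
        if st.st_atime < CUTOFF:
            stale.append((st.st_size, path))

for size, path in sorted(stale, reverse=True)[:25]:
    print(f"{size / 1e6:8.1f} MB  {path}")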

What Is Shadow IT?


WORK WITH EMPLOYEES TO IDENTIFY & ADDRESS PROBLEMS

SHADOW IT is the practice of employees bringing in personal applications or hardware to solve a problem they're having at work. The trigger can be something as simple as a company-issued smartphone that's too old to function properly, or something more complicated, such as an entire sales platform. While it may seem obvious that shadow IT can introduce risks to the business, it's important to realize that there are also positives to take away from these situations. It all starts with understanding why shadow IT is showing up in businesses and then coming up with ways for IT to better meet the needs of the employees and the needs of the business.

Why It Exists & Common Examples

The main reason shadow IT tends to show up in an organization is that various areas of the business "either don't know or don't believe that IT can fill their needs," says David Yackness, director, CIO Advisory, at Info-Tech Research Group. In these situations, individual users or entire teams will seek out their own solutions to perceived problems, whether through software, services, or hardware.

Shadow IT commonly arrives with sales applications. "It grows very organically once it's in place and is extremely successful at what you would call the 'land and expand' model of sales," says Yackness. "They get a couple of adopters, and then it just expands quite extensively. That would be driven by the organization trying to sell its technology. If there isn't a good solution already available from IT, that's one way that it gains traction."
Cloud file-sharing services can be another type of shadow IT, because some companies simply don't have the infrastructure in place to offer proper sharing and collaboration tools for employees. Yackness points out that many file-sharing services are free, and those that have a cost are relatively inexpensive. This could be a problem for the IT team and the business as a whole if employees start storing sensitive or even confidential information on an unsupported platform (a simple discovery sketch appears at the end of this article).

It's important to remember that shadow IT is not a software-only phenomenon. In fact, with the influx of personal mobile devices in the workplace, IT teams are now struggling like never before to monitor, manage, and support the range of consumer devices that employees bring with them to the office. And if an employee isn't careful about what data he is accessing or storing on his mobile device, the information may be at risk of loss or theft.

Know The Risks

Perhaps one of the biggest risks of shadow IT is user error. In many instances employees are unaware of, or not educated about, how unsupported software and hardware can harm a business. When an employee resorts to shadow IT, he risks compromising crucial company data, as well as support and integration. Inexperienced employees may buy solutions that appear to solve problems on the surface, but those solutions may work in a silo, which could limit their effectiveness.

Using the sales application example, it's easy to see how quickly shadow IT can get out of hand. If your company currently has a well-integrated CRM system and the sales team decides to bring in its own sales platform, there's no way to know if the two solutions will communicate or work well with one another. "How do you get the data to actually traverse those different systems?" Yackness asks. "And how do you manage it in such a way that you don't end up with islands of data that can't be integrated or well understood?"

But there are also more pressing concerns regarding compliance and business continuity to consider, especially if you work in a regulated industry. "How do I protect [the data] and comply with legislation? If it's private information with respect to health, financial, or personal information, how do you protect all of that when someone has gone out and just done it without consulting the experts? It's almost like saying, 'OK, I'm going to go patent a piece of technology, but I really don't know anything about patents. I'm just going to put a copyright symbol on top of it and think that will be sufficient to protect it.' You have to talk to a lawyer about it."

Potential Benefits

Although there are quite a few risks associated with shadow IT, they really only come into play when your IT team doesn't handle these situations with care. In fact, Yackness goes so far as to say that shadow IT is always positive and is only negative when IT has to react to it. If IT can get engaged with shadow IT from the beginning, then the team can use what it learns to improve how it meets business needs and to find weak spots in its own technology approaches.

"They can identify the functionality that they need or look at ways that they can be more efficient or effective at work, but there may be different ways of delivering that capability to the end user that are more in line with IT's objectives," says Yackness. "I would almost look at it as an opportunity to rapidly identify problems to be solved and be able to collaborate very closely, IT with the users and the users with IT. It's just a matter of overcoming some of the trust barriers that can come up."

Look At Shadow IT In A New Way

Problems with shadow IT tend to crop up when IT serves as a gatekeeper to new technologies rather than as a partner with the business. Rather than outlawing the use of Dropbox across the board, for instance, and not offering an alternative solution, IT needs to work with employees who are using the service to figure out their true needs. From there, employees can determine whether they can come up with an internal solution or whether there's a way to incorporate that popular application into IT's existing approach.

"The biggest problems occur when people stop working together. The other challenge you have to worry about is when different areas of the business all come up with the same problem but have different solutions to it and all think that each solution is the best. Hopefully, standardizing on one of these different technologies can be done, and maybe IT can be the broker for that conversation with the different business areas and help them to better understand what their issues are and how technology A might be better or cheaper than technology B. It's a matter of facilitation and brokering as opposed to being a gatekeeper or policeman."

DAVID YACKNESS
Director, CIO Advisory
Info-Tech Research Group

"Stop looking at it as something that has to be squashed," says Yackness. "People have problems, and they're going to find ways to solve them, and a better lock builds a better thief. You're far better off trying to partner with the areas that are getting engaged with shadow IT, finding out what problems they're trying to solve, working with them, and realizing that the problems they have and the solutions the end user happens to find could be better than what IT can provide. IT has to disassociate itself from ownership of the technology and look at itself as the facilitator to access technology. It's really looking at ways to be positive about it."
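One practical starting point for the engagement Yackness describes is simply knowing where shadow IT is appearing. The sketch below tallies unsanctioned file-sharing domains in a Web proxy log; the log format (one "user,domain" pair per line) and both domain lists are invented for illustration, not taken from any product.

# Minimal shadow IT discovery: count hits on watched cloud services that
# aren't on IT's sanctioned list. Lists and log format are assumptions.
from collections import Counter

SANCTIONED = {"office365.com", "salesforce.com", "box.com"}
WATCHLIST = {"dropbox.com", "drive.google.com", "wetransfer.com"}

def unsanctioned_usage(log_lines):
    hits = Counter()
    for line in log_lines:
        user, domain = line.strip().split(",")
        if domain in WATCHLIST and domain not in SANCTIONED:
            hits[(user, domain)] += 1
    return hits

sample = ["alice,dropbox.com", "bob,salesforce.com", "alice,dropbox.com"]
for (user, domain), count in unsanctioned_usage(sample).most_common():
    print(f"{user} used {domain} {count} time(s)")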


You Can't Afford A Bad Network


DEMANDS WILL ONLY CONTINUE TO GROW, SO MAKE SURE YOU START PREPARING NOW

KEY POINTS
Make sure you're able to spot the tell-tale signs that your network is overtaxed, and be proactive in addressing issues.
Keep users and applications in mind when designing your network, and don't work in a vacuum or you could end up running into preventable issues.
Consider building a hybrid WAN so you can have multiple network connections in play depending on the use case.
WAN orchestration tools can help you better manage your network and control applications for the best performance.


MOST COMPANIES THESE days have strong and capable WANs (wide-area networks) in place to handle the day-to-day rigors of a bustling office environment. But when it comes to introducing technologies such as hybrid cloud computing, or when simply considering the sheer number of network-enabled devices out there fighting for bandwidth, you might find that your network is bumping up against maximum capacity much more quickly than you imagined.

As companies move an increasing number of workloads offsite, and as hardware and software solutions demand larger quantities of bandwidth and network resources, this problem is only going to grow. For that reason, you need to start preparing now for where you want your network to be in the future; otherwise you could end up in the troubling position of playing catch-up with the needs of your customers and employees.

Spot The Signs Of An Overtaxed Network

Before you can start deciding how much capacity you'll need in the future, you need to first understand what your limitations are right now. Network issues can manifest themselves in a number of ways, but one of the most common is applications performing poorly. Andrew Lerner, research director at Gartner, says that application performance issues often arise in remote offices, especially international ones. For example, a SaaS (software as a service) application may perform perfectly in your Dallas and San Francisco offices but just chug along in Singapore. That's simply the nature of increased latency, Lerner says.

Lerner says that newer technologies, including cloud computing, can also cause problems for networks and overall performance. "Traditionally, applications are run out of the data center, and then the WAN is designed to deliver traffic from a remote branch to a data center," he says. "Now, when you move the application out of the data center into a cloud provider, it changes the equation. In networking, you cannot overcome the speed of light, so if you add hundreds or thousands of miles between users and their applications, you can have poor performance as a result."
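Lerner's speed-of-light point is easy to quantify. The short Python sketch below estimates the best-case round-trip time over fiber for two routes; the distances are rough great-circle figures, and real paths add routing and equipment delay on top.

# Back-of-the-envelope minimum round-trip time over fiber, where light
# travels at roughly 2/3 of its vacuum speed. Distances are estimates.
SPEED_IN_FIBER_KM_S = 200_000

def min_rtt_ms(distance_km):
    return 2 * distance_km / SPEED_IN_FIBER_KM_S * 1000

for route, km in [("Dallas - San Francisco", 2_400),
                  ("Dallas - Singapore", 15_000)]:
    print(f"{route}: at least {min_rtt_ms(km):.0f} ms round trip")
# Dallas - Singapore comes to 150 ms before any routing or server time,
# which is why the same SaaS app can "chug" in a distant office.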
In addition to finding specific issues in your network infrastructure, you can also get a feel for just how well or poorly your network is performing by talking to your users. Andre Kindness, principal analyst at Forrester Research, says the key to determining the state of your network and figuring out how to fix it is to speak to customers and employees. "You have to base it on the employee experience, and the way you can do that is with the amount of tickets coming in for it, or set up metrics," he says. "But it fundamentally goes back to the customer or employee experience. That should be No. 1, and it should be the first place people go to."

Be More Proactive In How You Solve Networking Issues

Another concept network administrators need to embrace is being proactive, in general but especially in terms of responding to and solving network issues. This is where network monitoring tools come into play, which are solutions companies don't focus on enough, according to Kindness. "Typically, your fallback position is if you are having a lot of tickets coming in or if the business is complaining, then you start using monitoring and testing tools," he says. "The problem is that people always do it afterward, but monitoring money should be spent equal to what you spend on infrastructure or other things. You need to have a lot of good information about what goes on so you can solve the problems."
Kindness offers the example of a university that developed a unique application that not only offers information about the school, but also gives users a conduit from which they can send technical service tickets directly to networking teams. If a user is experiencing a spotty connection or poor performance, he can report it. At that time, the app sends the information directly to the networking department, and monitoring is automatically increased in that specific area. Using the app, network administrators can gather data about what the user was doing at the time of the incident and where they were in the facility. Imagine being able to pinpoint dead zones in your office or your data center using a similar application. It could drastically reduce the amount of time it takes to solve networking issues and give users the support they need.

"You have a bunch of players out there that are focused on WAN orchestration to make that much easier to do. . . . [Instead of] IPs, ports, and CLI, they basically give you a GUI so you can drag and drop YouTube. It's a combination of workflow, centralized intelligence and visibility, and automated changes to keep up with the application requirements. That's where orchestration would fit in."

ANDREW LERNER
Research Director
Gartner

Be Mindful When Upgrading Or Reconfiguring Your Network

Something else you need to consider once you start retooling your network is how important it is not to design in a vacuum. Lerner says that no matter how much it seems like common sense, network administrators don't pay enough attention to where applications and users are when designing the network. If you keep those facts in mind throughout the process, you can plan out your capacity and coverage accordingly and avoid potential issues in the future.

"We like to call it right-sizing," says Lerner. "The first step to right-sizing is figuring out where your users and applications are. Don't just upgrade your MPLS [Multiprotocol Label Switching] network to add bandwidth to it. Take a step back. Maybe you don't need to add bandwidth to your MPLS network; maybe you need to deploy Internet to your branches. It's really grassroots. Start with your user community and the applications you're running and take out a map. That's step one. It sounds like common sense, but you'd be surprised how many people don't do it."

Prepare Your Network Now For Future Growth

In the same way you need to be proactive when handling network performance issues, you also need to be proactive when designing and planning out your network. The key to doing this successfully, according to Kindness, is to get networking people involved early in the process whenever a new technology is implemented. He says that networking people assume they are involved in projects from the start 89% of the time when, in fact, app developers only tend to involve them from the very beginning 50% of the time. This creates a major disconnect between teams and can lead to poor planning and execution.

"What we recommend is that networking people get out in front of it and market themselves," says Kindness. "You have to have a mindset that you understand and are embedded in the business. It's more than just a network. It's understanding what customers do at a retail site as a networking professional. You have to work with the GM at the site and the developers creating an app for the retail store. Part of the networking job is being part of that business team out there, and helping set the overall strategy or direction of IT, and not come in afterward. Engaging and working with the end user is typically not done, but it should be done in today's world."

When it comes to actually implementing new networking approaches to meet capacity requirements and user demand, Lerner recommends looking into hybrid WAN layouts. A hybrid WAN is a combination of an MPLS connection, which is usually the primary network for a data center and business-critical applications, and the Internet, which is used for almost everything else. With a hybrid WAN, you can decide whether an MPLS connection or a regular Internet connection is a better fit for a specific site and make sure performance will always be on par.
Hybrid WANs are also important to consider because they can help you save money. "People are building hybrid networks and optimizing the speed, latency, and bandwidth out to their SaaS applications and IaaS (infrastructure as a service) cloud locations as well as to their corporate data center," says Lerner. "That's not just for application performance, because it's cost optimization as well. In North America, a T1 connection is $250 to $300 a month vs. getting residential broadband at 50Mbps for $70 a month. That's a hard conversation to have with your CEO, CFO, or CIO to justify the existence of MPLS, because the price per megabit is just so much higher than consumer Internet. People need a way to bridge those two together, and hybrid WAN is the current trend."

Lerner also envisions a future where companies don't have to centralize all of their network traffic at the data center, but can instead connect the home office or remote sites directly to the colocation facility that hosts a given SaaS application. He says there are somewhere between 50 and 200 shared-location data centers out there that many major SaaS vendors use to host their applications, so in the future you might be able to connect directly to such a facility for the lowest possible latency. "That's a very early-stage trend and not many people are doing that, probably less than one-tenth of 1%, but it's an early indicator of the way people might start to think about their WANs in the future," Lerner says.
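Lerner's pricing comparison reduces to simple per-megabit arithmetic, sketched below using the figures he cites (a T1 carries 1.544Mbps; the $275 is the midpoint of his $250-to-$300 range).

# Price-per-megabit comparison using the figures quoted above.
links = {
    "T1 / MPLS": (275, 1.544),           # ($ per month, Mbps)
    "Residential broadband": (70, 50.0),
}
for name, (dollars, mbps) in links.items():
    print(f"{name}: ${dollars / mbps:,.2f} per Mbps per month")
# The T1 works out to roughly $178 per Mbps vs. about $1.40 for broadband,
# which is the cost gap hybrid WAN designs try to exploit.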

Take Advantage Of WAN Orchestration Technologies

Once you have all of these networking pipelines in place, you need a way to manage them and route traffic for specific applications. That's where WAN orchestration solutions come into play. Using the hybrid WAN model as a baseline, for example, you can decide that you want your CRM system to run on MPLS while applications such as YouTube run on the Internet connection. Then, if the MPLS link fails, you want the CRM system to move over to the Internet and take precedence over YouTube.

Lerner says that granular policies like these were much more difficult to enforce in the past, but are more manageable now because of WAN orchestration. And the great thing about WAN orchestration solutions is that they are coming from startups and newer vendors as well as well-established incumbents, says Lerner, so you should have plenty of options to choose from.
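To illustrate the kind of policy logic an orchestration tool applies, here is a minimal sketch of the CRM/YouTube example above. It is not any vendor's API; the application names, link names, and priority scheme are assumptions.

# Toy hybrid WAN policy: preferred link per app, with failover for
# higher-priority traffic when a link goes down.
POLICY = {"crm": "mpls", "voip": "mpls",
          "youtube": "internet", "saas": "internet"}
PRIORITY = {"crm": 1, "voip": 1, "saas": 2, "youtube": 3}  # 1 = critical

def pick_link(app, link_up):
    """Return the link an app should use, failing over when needed."""
    preferred = POLICY[app]
    if link_up.get(preferred, False):
        return preferred
    # Fail over to any remaining live link; only business traffic gets
    # the failover space, so recreational traffic may simply be dropped.
    backups = [l for l, up in link_up.items() if up and l != preferred]
    if backups and PRIORITY[app] <= 2:
        return backups[0]
    return None

status = {"mpls": False, "internet": True}   # simulate an MPLS outage
for app in POLICY:
    print(app, "->", pick_link(app, status))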
The interesting thing about hybrid WAN and the idea of a two-lane network highway is that most companies already have those MPLS and Internet connections in place, but use one as a backup in case the other fails. Kindness says that in a perfect world, companies with this setup would be wasting 50% of their potential capacity; in the real world, companies underutilize their network connections so heavily that they may be using only about 11% of their primary connection, let alone the backup lane sitting there entirely unused.

"At any one time, you're not using all of the capacity," says Kindness. "You're only using about 11%, which is ridiculous. You can actually make both pipes smaller. If you have the ability to flip back and forth, you can optimize both of them and make the links smaller, because combined together, both links are for my worst-case scenario. But individually," he adds, "they can be smaller than what I have today and I can leverage both of them."

Once you start looking at those pipelines as two active connections rather than one active and one backup, you can take advantage of WAN orchestration and other helpful tools. In essence, you can pick and choose which applications run on which networks to find perfect matches, such as putting YouTube or a SaaS-based application on the Internet rather than the MPLS. "Those examples right there could save a lot of money and improve the user experience," says Kindness. "If you don't have recreational traffic on the link going back to the data center, then you're freeing up more bandwidth for the critical business apps."

SDN & Hardware


SHOULD YOUR COMPANY BUY NEW OR USE EXISTING EQUIPMENT?

SDN (SOFTWARE-DEFINED networking), by its very definition, is about making software the primary tool you use to manage your network. But just because you are adding more automation and removing some of the controls from the hardware itself doesn't make the networking infrastructure, switches, and other equipment any less important. In fact, depending on your approach, you may already have the hardware in place to properly handle an SDN implementation. The key is making sure that the hardware you use, whether new or existing, has the capabilities and features you need to move toward a software-defined future.

Everything Depends On Your Definition Of SDN

Dan Conde, an Enterprise Strategy Group analyst, says the definition of SDN has changed since its introduction, and many companies differ regarding the term's actual meaning. In the beginning, it was all about OpenFlow, one of the first SDN protocols, and making sure the hardware was capable of supporting it. But now, according to Conde, surveys are showing that businesses can't really agree on one definition and that SDN is being broken down into different categories.

For example, the traditional definition had to do with separating the controller from the networking hardware and moving it into a more centralized management software solution, but according to Conde that definition only fits the mold for 11% of companies. Others say SDN is about taking typically hardware-oriented network functions, such as load balancing, and moving them to a virtual machine as part of network function virtualization. Meanwhile, still others say that as long as there's a centralized network controller of some kind, it's SDN.

Depending on which category you fall into, your IT department will have to look at the hardware you have in place and determine whether it supports the type of SDN you want to implement. For some companies, it will be enough to use a virtualized network overlay to essentially replace a traditional network with a more software-oriented one without changing the underlying hardware. Others will require new equipment with capabilities built specifically with SDN in mind.

Buy New Or Use Existing Equipment?

Building on the idea of new vs. existing hardware, you not only have to decide what type of SDN you want to implement, but also how you want to use the network. For example, Conde says, approximately 10% of companies surveyed about SDN equated it with automation and programmability. What that means is that those companies would prioritize hardware that can be customized and coded in unique ways to fit their specific network environment. The same idea goes for businesses that have adopted the DevOps mindset of application developers and operations teams working more closely together. For those organizations, programmability and the potential for scripting are key, so you have to make sure your existing network switches support automation and programming; otherwise you may have to invest in new ones.

For companies that want to take the virtualized route and don't need those in-depth programmability functions, it's easier to overlay existing networks with a virtual network. "A lot of people have virtual switches that are inside of their hypervisors, like VMware vSphere, so in that case they can do overlay networks, and as long as they have a virtual switch that's inside the software, they can lay that on top of their existing switch, regardless of what the hardware is," Conde says. That offers quite a bit of flexibility and freedom in terms of choosing to use existing hardware or upgrade to new.

In the end, Conde says, much of the decision depends on the hardware vendors and the functions and capabilities they offer in their products. For example, Brocade, HP, and NEC all have solutions that support OpenFlow, so if you already have networking solutions from those vendors in place, then you may be able to take the virtual network overlay approach. Cisco, on the other hand, offers network switches that are highly programmable and can be used in those DevOps environments. "A lot of people have different definitions, and that's the tricky part, but I think most people just want a lot of flexibility and programmability so that if [they] want to make changes, [they] can just do that through software automation," Conde says.
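As a taste of what "changes through software automation" can look like, here is a minimal sketch that pushes a VLAN change through a controller's REST API rather than a switch CLI. The controller URL and endpoint schema are hypothetical, not any specific vendor's interface, and the call runs in dry-run mode by default so the script executes without a live controller.

# Hypothetical controller API sketch: one change, rolled out in a loop.
import requests

CONTROLLER = "https://sdn-controller.example.com/api/v1"   # hypothetical
HEADERS = {"Authorization": "Bearer <token>"}               # placeholder

def add_vlan(switch_id, vlan_id, name, dry_run=True):
    """Create a VLAN on one switch via the (hypothetical) controller API."""
    url = f"{CONTROLLER}/switches/{switch_id}/vlans"
    payload = {"id": vlan_id, "name": name}
    if dry_run:                       # print instead of calling a live API
        print("POST", url, payload)
        return
    resp = requests.post(url, json=payload, headers=HEADERS, timeout=10)
    resp.raise_for_status()

# The payoff is repeatability: apply one change across the estate.
for switch in ["edge-01", "edge-02", "edge-03"]:
    add_vlan(switch, 120, "pos-terminals")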

SD-WAN

A newer form of networking some companies are considering is SD-WAN (software-defined WAN), which typically focuses more on remote branches or offices that are connecting back either to the data center or to the cloud, Conde says. These networks differ from traditional SDN deployments in that they typically deal with more SaaS (software as a service) solutions and spread networking needs across both cloud and data center.

SD-WAN is particularly helpful in dealing with the issue of outstripping network capacity. Conde says most of these businesses use MPLS (Multiprotocol Label Switching) networks, which are private circuits that you buy from telecoms to connect remote offices to data centers. The problem is that adding MPLS lines can get quite expensive, so many companies turn to standard broadband Internet for some use cases. Using SD-WAN lets you decide which primary, mission-critical traffic, such as VoIP, should be on the primary MPLS line and which secondary or low-priority traffic can be moved to general broadband.

"From the hardware point of view, you could buy brand new software and hardware from companies that are specialists in SD-WAN, and there are a lot of startups like that, or if you're willing to configure classic hardware that's used for WAN, they're actually capable of doing SD-WAN, as well," Conde says. "They just don't have SD-WAN attached to them, but with enough configuration, they could do the very same thing. The new generation of SD-WAN does it right out of the box, though, so you just plug it in, and there you go."

Potential Challenges & Benefits

Whether you choose SDN or SD-WAN, the biggest challenge is scalability. Obviously, larger enterprises and cloud providers have the resources necessary to put these types of technologies in place with little trouble, but smaller organizations may assume they don't have the capability to support them. The good news is that SDN has the ability to scale; you just have to choose the right version of the technology. In fact, smaller companies, through the use of SDN, can start to take advantage of that DevOps mindset and develop new solutions that put them on a more even playing field in the Web-focused business world.

Plus, implementing SDN can save money in the long run, which means that smaller businesses will not only be able to keep up with larger enterprises, but they also won't have to break their budgets to do so. "The main reason that people want to go through SDN is because they want to do cost reduction," says Conde. "If they could replace, through network function virtualization, a lot of hardware with software-based systems, that would be good. And obviously cost is CAPEX and OPEX, so if you could automatically change networks, that's a huge OPEX savings. Cost reduction is the No. 1 driver for enterprise usage and adoption of SDN."


Include Software In Your DR Plan


WHY IT'S IMPORTANT TO BACK UP DATA & CORRESPONDING PROGRAMS

KEY POINTS
Back up both data and applications to ensure programs run properly after recovery.
Back up every install file, firmware update, patch, and customization, or the software may not work as intended.
Consider using a storage resource management platform or purpose-built backup appliance to back up virtualized and traditional software and workloads.
Prioritize the applications that need to be backed up and recovered first, and make sure you revisit your DR plan on a consistent basis.


THE PEOPLE IN CHARGE of backups, archiving, and DR (disaster recovery) for businesses are sometimes so focused on the data they are backing up that they forget to think about the software and applications that use that data. Although it may seem like enough to have a few copies of a particular program on hand so you can access company information in the future, quite a bit more goes into backing up software than other forms of data. You not only have to back up the software itself, but also every other component that's necessary to make that software run properly.

Why You Need To Include Software

Phil Goodwin, research director at IDC, says, "People often look at data recovery and data replication as disaster recovery," but it's much more than that; it actually involves recovering an entire workload, which includes your hardware, software, data, and network. Instead of focusing only on the data, you have to consider every individual component in your infrastructure that uses that data to make sure everything will work properly during the recovery process.

This idea applies to programs on your personal computer just as much as it does to software in an enterprise setting. Dave Russell, vice president and distinguished analyst at Gartner, uses the example of Quicken, which many people use as finance management software at home. While many Quicken users remember to back up the QDF data file the program uses, Russell explains that they often forget to back up the software, as well. "If your hard drive or computer bursts into flames and you buy a new one, great, you have your data file, but you have no way to actually access it without that software application," he says.

The same goes for major pieces of software and data in the enterprise. You may properly back up and protect your Oracle database, for example, but if you don't have every component of the application that uses that data up and running, then you're just stuck with an unusable database. Russell says that some of these applications also have built-in Web components, so to get everything up and running, you not only need to protect the database and application, but also any necessary Web-enabled functionality.

Special Considerations For Software

Backing up every component of a program goes even deeper, though, because you also have to account for every firmware update, OS (operating system) version, missing patch, and so on. "When it comes to disaster recovery, it's very detail-oriented, and you have to pay attention to the software stack, because the thing that tends to trip people up the most is when they get incompatibilities of software," says Goodwin. "There's just a tremendous number of details, and a lot of those details, frankly, are on the software side."

Goodwin says companies will often overlook these incremental changes and not realize that they can involve software as well as an OS. These little changes may seem so minor at the time that you either forget about them or ignore them, but he warns that they can become so frequent that a disaster recovery system gradually but quickly falls out of sync with the primary system. So when you try to recover your primary system using backed-up assets, you could be using incompatible versions of the software or OS, and workloads won't run properly.

"It's a lot of those 'oops' moments, when people try to do a test or, worse yet, an actual disaster failover, that they discover all of those little details they missed over time," says Goodwin. "It requires a high degree of process maturity for organizations: when we update our primary systems, we will also update our disaster recovery systems. It really is a process thing. Disaster recovery is what I call the classic triad of people, process, and technology."

"People do tend to overlook the notion of migrating an entire application stack, and I think once you start to think of disaster recovery in terms of workload migration rather than in disaster recovery terms, it tends to help people understand what's involved in DR. You have to migrate this workload. They kind of get it. It takes it away from the abstract and puts it into a much more concrete term."

PHIL GOODWIN
Research Director
IDC

To better understand what components you need to back up, Russell recommends working backward and determining all the steps you need to complete to get a workload running the way it should. Because enterprise infrastructure today is so interconnected, and multiple systems and platforms are integrated, it's necessary to trace the path as well as you can and gain a clearer understanding of how things work together.

For example, in a typical DR scenario, you'll need to think about the personnel and employees in charge of customer support. Russell suggests they may need their phones up and running as a priority "because customers can't get through to us any other way." If you have a VoIP system, for example, then you not only need the hardware up and running, but also the underlying applications. And if that phone system is connected to a larger unified communications platform, then you also need to make sure that system is protected, as well. Many of today's phone systems are so software-based and Web-enabled that it takes much more than hardware to actually make and receive calls.
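One way to turn Goodwin's keep-DR-in-sync discipline into a routine check is to describe each workload's stack as data and diff the primary site against the DR site, as in the minimal sketch below; the component names and versions are illustrative assumptions.

# Describe a workload's full stack as data, then diff primary vs. DR.
primary = {
    "os": "Windows Server 2012 R2",
    "db": "Oracle 12.1.0.2",
    "app_patch_level": "2015-09",
    "firmware": "2.4.1",
    "license_key_escrowed": True,
}
dr_site = dict(primary, app_patch_level="2015-06", firmware="2.3.0")

drift = {k: (primary[k], dr_site[k])
         for k in primary if primary[k] != dr_site[k]}

for component, (prod, dr) in drift.items():
    print(f"OUT OF SYNC: {component}: primary={prod} dr={dr}")
# Run a check like this after every change window, so mismatches surface
# in a report instead of during a failover.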

The Process Of Backing Up & Protecting Software

When backing up and protecting software, it's important to remember to keep everything in check. Russell once again stresses that it's not just a matter of backing up everything you downloaded or installed from a disc; you also need all of the updates, customizations, license keys, and every other component added to that software over its lifetime. Licenses are particularly important, because you need to ensure a vendor will honor a license and not charge for a new one should you need to completely reinstall a piece of software. And if you plan to run older software on newer infrastructure, you can't assume that it will be a simple plug-and-play process. A lot goes into this planning process that needs to be done up front so you aren't rushing to fix major issues after a disaster has already come and gone.

So, how do you go about protecting the software? What needs to be done? Fortunately for many organizations, according to Goodwin, a lot of it has been simplified by virtual computing. He says IDC estimates that fewer than 70% of all workloads in the x86 world are virtualized. What that means is that whether you're migrating one application or an entire workload over to a DR site, it can be done with the help of something like an SRM (storage resource management) platform or a similar solution.
SRM platforms are designed to improve the utilization efficiency of SANs (storage area networks). This means that if you want to protect your software workloads and they are already virtualized, you can just migrate them over to the secondary facility for quick backup and future recovery. "When people are doing backups, they have to do a backup of the virtual machine and virtual machine images to make sure the entire application workload stack is actually backed up and migrated from the primary systems to the DR systems," Goodwin says. If everything is already optimized and as efficient as possible beforehand, then backup and recovery will be much easier.

If you are backing up software that isn't virtualized, or you just want a different option, Goodwin points out that many companies use PBBAs (purpose-built backup appliances) to back up their software for DR purposes. "What those appliances do is deduplicate the data, compress it, and often encrypt it, which is an important point whenever you're moving it offsite," he says. "You can then replicate that to your disaster recovery site or some kind of offsite repository to reduce the risk of data loss."

How To Prioritize Which Applications To Back Up

Once you have a firm grasp on what components actually need to be backed up and what technologies you're going to use to get the job done, you can focus on prioritizing which applications to back up first and which ones need to be ready for recovery before others. As a start, Russell recommends performing a BIA (business impact analysis) on your software to figure out what is most critical to your organization.

"In a world without scarcity, you could apply the premium solution to everything," he says, but in the real world there is a finite amount of time and resources to work with. For that reason, you have to prioritize certain applications over others. A BIA will aid in this endeavor, but Russell warns that it's not always the same answer for every organization. For some companies, it's crucial to get CRM systems up and running to maintain customer contact, but for others it might be "getting the e-commerce site back up and running, because it's how we generate revenue," he says. You have to look at every software-based component of your business and build a tiered list of how long you could go without a certain application before it impedes productivity.

Goodwin agrees with the idea of prioritizing applications for backup and recovery, and says it's often as simple as "starting off with mission-critical, then you go to business-critical, and then the rest are operational." He says you "absolutely have to recover in priority order and apply service levels to those different things," meaning that you have to not only make sure something is up and running, but also that it will perform at an acceptable level.

Goodwin explains, "You might say, 'I need to be able to recover my mission-critical applications within four hours. My business-critical I can recover within 24 to 72 hours. All of the operational, maybe it's OK to recover them within five days.'" Oftentimes, it boils down to what generates your revenue. "If it's something that generates revenue," Goodwin adds, "it's mission-critical. Business-critical would be things like your back-office accounting. You have to do it, but if you do it today or tomorrow, it doesn't matter that much. More operational things might be file and print, reporting, and other stuff that is more back-office, periodic, once every 30 days."
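Goodwin's tiers translate naturally into a small, sortable table. The sketch below encodes his example recovery windows; the application names are made up.

# Recover in priority order, with a recovery-time objective (RTO) per tier.
RTO_HOURS = {"mission-critical": 4, "business-critical": 72,
             "operational": 120}

apps = [
    ("e-commerce site", "mission-critical"),
    ("back-office accounting", "business-critical"),
    ("file and print", "operational"),
    ("CRM", "mission-critical"),
]

recovery_order = sorted(apps, key=lambda a: RTO_HOURS[a[1]])
for name, tier in recovery_order:
    print(f"{name:25s} {tier:18s} recover within {RTO_HOURS[tier]} h")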

DR Isn't Set It & Forget It

One final note regarding software and disaster recovery, and one that Russell stresses, is that DR is by no means a static process; it's constantly evolving. "It's not that people should go and do a BIA and create a DR plan and then claim victory," he says. "It's something that needs to be revisited on an ongoing basis, annually at a minimum, but ideally twice a year. In your home life, you may occasionally review your insurance stance. Ten years ago you didn't have children and you hardly had any of the possessions that you have now. Are the same kinds of policies still appropriate? That's what we're dealing with here. These are insurance mechanisms for the data center, and priorities can change over time."

Beware Insider Threats


IMPLEMENT POLICIES & TECHNOLOGIES TO PREVENT ATTACKS

KEY POINTS
Insider threats can be disgruntled employees or third-party contractors, but they may also be the result of human error or spear phishing schemes.
UBA (user behavior analytics) and cloud-based application access solutions can help prevent insider threats.
Investigations are necessary to determine whether a threat was malicious or accidental and how the incident should be handled.
People-centric security makes your workforce the first line of defense against insider threats and other security risks.


EVER SINCE the Target breach in 2013, companies have been taking the concept of insider threats more seriously. Although organizations are still trying to get a better grasp on what constitutes an insider threat and how it can happen, it's clear that something needs to be done to prevent employees or third-party contractors from negatively impacting a company. Security technology is important, as are corporate policies that encourage safe user habits, but key to preventing insider threats, or dealing with their aftermath, is understanding that there are different types of threats.

Threat Types

One type of insider threat, according to Doug Cahill, senior analyst at Enterprise Strategy Group, is the disgruntled employee or insider with purely malicious intent. This employee's main goal is to either steal information or damage the company in some way from the inside.

Another type can stem from temporary third-party contractors. "One of the most prominent examples of a breach that happened via a third party is Target's HVAC contractor," says Cahill. "That's how [its] network became compromised."

Another type of insider threat is one in which an employee's login credentials are co-opted as the result of a targeted attack. An example of this, according to Cahill, can occur in a spear phishing campaign when someone receives what appears to be a legitimate email from his bank. Because it is a well-engineered email, he clicks an attached file or Web link within the message. Unbeknownst to the user, his actions result in the installation of a keystroke logger that records everything he types. A malicious person can use this same method to target organizations, gaining access to a variety of login credentials for cloud-based apps, internal corporate databases, and more.


Although it may appear as though an insider is accessing the information, in this type of threat an outside attacker is actually perpetrating the attack remotely.

Andrew Walls, managing vice president at Gartner, agrees that there are different types of insider threats, and notes that even organizations such as CERT.org define the term specifically as a malicious attack from an insider. And while many people tend to see insider threats in this light, Walls warns that security teams don't have the freedom to make assumptions about the intent behind a security event. He points out that there's no way to know whether an incident was a mistake or purely malicious until after an investigation is complete, so when it comes to insider threats, you shouldn't simply assume the worst.

Instead of putting labels on insider threats before they are known quantities, Walls defines them as anything that appears to be a violation of your desired security outcomes. Then, once you know what actually happened, you can decide what course of action to take. "Don't start with an assumption that it's only malice," says Walls. "Ninety-nine times out of 100, it's just people hitting the wrong key, trying to find a shortcut and gain new efficiencies in a work process, or just being sloppy. It's a very exciting, fun day when you actually have a real attacker who is really trying to do something bad, but most of the time it's just errors."

Important Technologies To Consider

When it comes to the technology side of preventing and managing insider threats, Cahill says it's important to prioritize your systems and applications to determine which ones require extra protection and authentication. He recommends organizations protect the systems that hold the most business-critical data with 2FA (two-factor authentication), which requires not only a username and password, but also a unique code that is sent to or generated on a separate device, such as a smartphone, thereby adding an extra layer of security. "We're in a world today where 2FA needs to become a standard authentication mechanism," says Cahill. "I know whenever I log into my online banking app, I use 2FA, because I want that extra layer of authentication. That's the first thing, and that's really table stakes relative to today's landscape."
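For readers curious what that "unique code" actually is, the sketch below generates a time-based one-time password (TOTP, RFC 6238), the same scheme most authenticator apps use, with only the Python standard library. It is an illustration, not a recommendation to build your own 2FA; the secret shown is a well-known test value.

# Minimal RFC 6238 TOTP: HMAC a 30-second time counter with a shared secret.
import base64, hashlib, hmac, struct, time

def totp(secret_b32, digits=6, step=30):
    key = base64.b32decode(secret_b32)
    counter = int(time.time()) // step              # 30-second time window
    msg = struct.pack(">Q", counter)                # counter as 8-byte int
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))   # same code an authenticator app shows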
The next technological step, according to Cahill, is to consider implementing a UBA (user behavior analytics) solution, which, he explains, is a new category of products that very often integrate with SIEM (security information and event management) products such as Splunk or ArcSight. UBA solutions help establish a baseline of normal usage for an employee and give you a foundation to work from when trying to identify potential anomalies. For example, maybe an employee typically logs into his email or CRM system at 8 a.m. from his desk, but one day he accesses it at midnight from a different location. The UBA system would log that activity and compare it against the user's past history. The UBA product "can also tag what parts of the database you're typically accessing," Cahill says, so if an employee is viewing a data set he doesn't typically access, you can investigate and determine whether it's a threat.
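The baseline idea behind UBA can be shown in a toy form: learn each user's normal login hours and locations from history, then flag logins outside them. Real products model far more signals; the data and thresholds below are invented.

# Toy UBA baseline: flag logins at unusual hours or from new locations.
from collections import defaultdict

history = [  # (user, hour_of_day, city)
    ("pat", 8, "Dallas"), ("pat", 9, "Dallas"), ("pat", 8, "Dallas"),
]

baseline = defaultdict(lambda: {"hours": set(), "cities": set()})
for user, hour, city in history:
    baseline[user]["hours"].add(hour)
    baseline[user]["cities"].add(city)

def is_anomalous(user, hour, city, slack=2):
    b = baseline[user]
    odd_hour = all(abs(hour - h) > slack for h in b["hours"])
    odd_city = city not in b["cities"]
    return odd_hour or odd_city

print(is_anomalous("pat", 9, "Dallas"))    # False: matches the baseline
print(is_anomalous("pat", 0, "Kiev"))      # True: midnight, new location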
The same idea also applies to user activity with SaaS (software as a service) and cloud applications, and it ties into the idea of shadow IT, where users bring in outside technologies to solve perceived productivity issues. Cahill uses the example of an employee using Dropbox to share not only personal information with friends and family, but also business information with colleagues. For organizations, this becomes a decision of allowing the activity to continue or putting a stop to it completely. Fortunately, there are quite a few solutions out there for companies to consider if they want to let employees use file-sharing services and other cloud-based applications.

"There's a whole new class of products that are centered on cloud access and control that allow IT organizations to get an inventory of what SaaS applications are in use, but then to also apply controls around those applications," says Cahill. Companies like Skyhigh Networks, Netskope, and CipherCloud "allow you to apply policy that says, 'OK, [this employee] is allowed to use Dropbox for file-sharing, but I'm going to look at the content that goes out and make sure there are no Social Security numbers or other personally identifiable information for which I could be held liable.'"
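The outbound content check Cahill describes is, at its core, pattern matching. Here is a deliberately tiny illustration that flags text containing a U.S. Social Security number pattern; production DLP engines handle many more data types and evasions.

# Tiny DLP-style check: refuse to share text containing an SSN pattern.
import re

SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def safe_to_share(text):
    return SSN.search(text) is None

print(safe_to_share("Q3 roadmap attached."))          # True
print(safe_to_share("Employee SSN: 123-45-6789"))     # False: block upload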

How To Deal With The Aftermath

When it comes to insider threats, prevention is obviously ideal, but there will be situations where an insider does have an intended or unintended security impact on your organization. First of all, according to Cahill, you need to have some kind of security policy laid out in your employee handbook and require that your employees certify they have, in fact, read the security policy. This is important because if you need to educate or train an employee after an incident, or use the policy as leverage in an investigation, you must have something solid to fall back on.
From there, you either have to perform an internal investigation or have a third-party investigator come in to determine what happened and what the ramifications are. This is the point in the process where you find out whether the attack had purely malicious intent or can be chalked up to human error. Walls says that the type of insider threat has a direct impact on what needs to be done: if it was an accident, it may only require some education and training, but if it was malicious, then you may need to involve law enforcement and consider criminal prosecution.

"At that point, it ceases to be a security matter and becomes an HR disciplinary matter, or in certain cases a law enforcement criminal activity matter," says Walls. "The investigating team will notify the correct person in HR, or legal counsel if they think there's criminal risk. There should be defined points of contact for them. Then, it's the organization's decision as to how they invoke disciplinary activities or legal proceedings, whether they want to ignore it or move on it. It's no longer a security matter."
These types of situations illustrate how important it is to have a UBA or similar solution in place to create an audit trail, "so that everything is recorded," Cahill says. You can go back and review that trail not only for forensics purposes, but also for compliance. And you can use these systems to decide what should happen following a certain type of threat. "Upon detecting inappropriate use of a system and data leakage, you can terminate access right away," says Cahill. "If it's an insider threat from a third party, all of those accounts should be time-bombed. If I have an engagement at a company for a two-week statement of work, my credentials should absolutely time out after those weeks, and then there should be a full audit of what I did during those two weeks."
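Cahill's "time-bombed" accounts amount to a simple rule: every third-party credential carries an expiry date derived from the statement of work, and access checks refuse anything past it. A minimal sketch, with invented account names and dates:

# Third-party credentials expire automatically at the end of the engagement.
from datetime import date, timedelta

contractor_accounts = {
    "hvac-vendor": date(2015, 10, 1) + timedelta(weeks=2),  # 2-week SOW
}

def access_allowed(account, today):
    expiry = contractor_accounts.get(account)
    return expiry is not None and today <= expiry

print(access_allowed("hvac-vendor", date(2015, 10, 10)))  # True
print(access_allowed("hvac-vendor", date(2015, 11, 1)))   # False: expired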

People-Centric vs. Technology-Centric Approach

While technology is a viable option for preventing insider threats, and one that many organizations embrace, it's also possible to take a more people-centric approach to security. "Instead of investing in more and more technology to detect what people are doing, you start investing in the people and building communities of trust where the people who are in the workforce have expectations of each other in terms of trustworthiness, performance, observation, and co-supervision of each other's activities, so that the people in the organization are your first line of detection, mitigation, and response," Walls says. In essence, you can make your employees the initial gateway to stopping insider threats and let them govern themselves in a more organic way.

Walls admits that technology-centric security and people-centric security are very different approaches, but they also don't have to be mutually exclusive. He warns that having a technocratic mindset can actually result in a more fragile organization. "Your people, especially these days with the advent of digital business, are expected to be flexible, adopt new patterns of work, act independently, collaborate and form teams on the spot, and employ whatever technology is necessary to complete their tasks in an efficient manner," says Walls. "They're being handed a lot of responsibilities, and if we constantly build rigid technological structures around them that inhibit their ability to do what they need to do, they will vote with their feet and work around those structures."

It's not a matter of choosing one approach over the other, but rather of striking a balance between technology and policy that makes sense. It's important to employ 2FA, SIEM technologies, and cloud-based access management solutions, but you also have to have the employee education, behavior influence, and personal responsibility programs in place that empower your employees to make their own decisions and put security first in everything they do. "This has to be a conscious investment, and it can't be something you stumble into," says Walls. "If you're taking the people-centric approach, the CEO needs to be on board. It's not buying a new next-generation firewall; it's changing the fundamental relationship between your employees and your security performance objectives as a corporation."

THE LATEST PREMIUM ELECTRONICS

Toshiba Announces A Small, Sleek Convertible Computer


WWW.TOSHIBA.COM
Waiting for a computer that takes advantage of the very latest HD (high-definition) display technology and Intels newest,
high-performance/low-power processors? You wont have long to wait, as Toshibas Satellite Radius 12 convertible laptop
fits the bill, and is set for availability in Q4 this year. Billed by Toshiba as the worlds first 12.5-inch convertible laptop with
a 4K Ultra HD display, the Radius 12 uses Intels 6th Gen Intel Core processors, runs Windows 10, and works as either a
laptop or a touchscreen tablet. Weighing just 2.9 pounds, the Radius 12 features a compact design, an aluminum construction for durability, and Gorilla Glass NBT for added screen strength. The IPS touchscreen has a native 3,840 x 2,160 resolution, which is four times sharper than a Full HD display. And to top it off, the Radius 12 is the first compact PC to use the
new Windows Hello authentication, which lets you use your fingerprint, iris, or face to log on.

54

October 2015 / www.cybertrend.com

Samsungs Long-Lasting Smartwatch Comes In Two Styles


WWW.SAMSUNG.COM
Just when you thought your smartphone was going to replace your wristwatch and your wallet, along came the smartwatch. You cant use just any smartwatch to pay the bills, however, and smartwatches can be a pain when you end up
having to charge them frequently. Samsung has you covered in these areas, though, with its new Gear S2 smartwatches,
which employ NFC (near field communication) technology for making payments (and perhaps even for replacing your remote controls and car keys) and hold a charge for up to three days. With wireless charging, you dont even have to attach
the Gear S2 to a charge cable. The Gear S2 uses a dual-core 1GHz processor and offers 4GB of storage, more than enough
to make it useful for managing communications, monitoring your health, recording voice memos, exchanging text messages, and more. The Gear S2 comes in two varieties: a classic version with a black case and leather band, and a streamlined version in dark gray or silver and white. As of press time, Samsung had not yet announced pricing or availability.

CyberTrend / October 2015

55

Images, clockwise from top left, courtesy of Apple (1), Samsung (2), Sony (3), Microsoft (4, 5), and BlackBerry (6)

Smartphone Tips
ADVICE FOR HANDLING EMAIL, CALENDAR & CONTACTS

WINDOWS PHONE
Delete Email In Bulk

Share Contact Information

To delete a single email, on Start tap


an email account that has the unwanted
email(s), tap the email you want to delete,
and tap Delete. To trash multiple emails
simultaneously, tap to the left of the email
author to display a row of checkboxes beside each email. Next, tap the checkboxes
that correspond to each email you want to
delete and then tap the Delete (trash can)
icon to complete the operation.

With Windows Phone 8, you can quickly share contact informationwhether


its yours or someones in your contact listwith someone else via text messaging.
Open the Messaging app, tap New, type the recipients information until the appropriate name or phone number appear, tap Attach, tap Contact, locate the contact
file youd like to share, tape Share, and tap Send. This method can serve as a quick,
convenient, and business card-free way to swap your own contact information with
someone else.

Get Help
Need more help? Tap the Help+Tips
tile. This app provides access to quick
hints, how-to articles, frequently asked
questions, and video walk-throughs.

56

October 2015 / www.cybertrend.com

Move Contacts To A New Nokia Phone


If you are switching from a non-Windows Phone smartphone to a new Windows
Phone 8 smartphone from Nokia, and both smartphones support Bluetooth, there
is a simple way to transfer all of your contact information from the old device to the
new one. To begin, switch Bluetooth on in both phones, making sure that the old
phone is discoverable. Launch the Transfer My Data app on the new Nokia phone,
tap Continue, and tap to connect to the old phone. Follow the on-screen instructions to complete the process of copying contact information. Finally, remove the
information from the old smartphone as appropriate.

ANDROID
Set Up Encryption

Keep Private Events Private

For added security, you can encrypt


all of the settings, account information,
apps, and other data on your Android
device or on a SD card installed in your
device. With encryption established,
you will be required to enter a PIN or
password every time you wish to access anything on the device or SD card.
Before setting up encryption, make
sure your device is fully charged and
is attached to a charger, as the entire
process can take an hour or more and
must not be interrupted to complete
successfully. To apply encryption, access the Settings screen, tap More,
and then tap the appropriate option:
Encrypt Device or Encrypt External SD
Card. Tap Set Screen Lock Type and
then follow the on-screen prompts to
complete the process. You can also use
encryption for email communications
on your Android device. System administrators have the option to activate the
encryption option when setting up your
device with a Microsoft Exchange or
ActiveSync account.

If you have added an event to your calendar, the entry can be public or private depending on the setting used. If there is an event in your calendar you
would like to keep private, launch Calendar, access the event, tap the event
name to open it, press the Menu key, tap Edit, and make sure the event is set to
Private rather than Public.

Get All Of Your Messages,


All In One Place
With the Hangouts app, Android
smartphone users can access all of
their messages in one place. Open the
Hangouts app to access text and MMS
(Multimedia Messaging Service) messages
sent and received via your smartphone,
as well as other Google Voice and video
calls. The idea behind the Hangouts app
is to unify the messaging screens into one
location and make it easier to keep track
of past and ongoing conversations with
specific contacts. With the app on your
smartphone, you can make free group
video calls with as many as 10 people.

Report Spam
Just because youre using your smartphone for a majority of your emailing
these days doesnt mean you have to put up with spam. To take care of unwanted
messages, just tap the offending piece of email to view it, press the Menu button,
and then tap Report Spam. Depending on your phone, you may need to tap the
More icon to see the Report Spam option.

Import Outlook Contacts


Adding your Microsoft Outlook contacts to your Android phone is easy. Launch
the application on your PC, click File, Open & Export, and Import & Export. Select
Export To File, click Next,
select Comma Separated
Values (Windows), click
Next, select your Contacts
folder, click Next, choose
a location for the exported
document, click OK,
click Next, and then click
Finish.
Now, launch a Web
browser and navigate to You can use the Web version of Gmail and Google Contacts to
your Gmail account. Click import Outlook contacts into your Android smartphone.
Gmail (on the left side of
your Gmail page below
the Google logo), select Contacts, click the More button, select Import, and click
Choose File. (If you are using the preview of the new Google Contacts version,
you will have to select Go To Old Contacts before you will be able to choose a file.)
Navigate to the CSV file you just exported from Outlook and click Open. Choose
whether or not to add the new contacts to a specific group using the checkbox, and
then click Import. If you chose to add your new contacts to a new group, youll be
prompted to name that group. Gmail will provide you with a report of how many
contacts were added and how many of the new contacts were merged with existing
contacts. Click OK to dismiss the report, click the Find Duplicates button, and
then click OK to merge as many repeated contacts as possible. Now, syncing your
Android-based phone with Gmail will bring your new contacts over.

CyberTrend / October 2015

57

iOS
Manage Your Email, Calendar
& Other Notifications
Are you getting so many unnecessary
notifications that your email and calendar
ones are getting lost in the shuffle? You
can control that. Access Settings and tap
Notifications. Tap Sort Manually, and then
tap Edit. Now you can slide app/notification types around to determine which are
(and arent) included in the notification
area and the order in which they appear.

Quickly Type Numbers


& Special Characters
When typing messages or filling out text
boxes, switching to the special characters
keys, by pressing the 123 button in the
bottom left corner of the screen, can involve two extraneous taps just to input
a single special character or number. To
quickly input numbers or special characters, especially if youre only entering one
at a time, just press and hold your finger
over the 123 button, drag your finger to
the number or special character you want,
then release.

Change Mail Settings To Save Battery Life


By default, your iPhones Mail accounts are set to push notifications, so you get
a heads up the instant an email arrives. But to save your battery you might want to
fetch at intervals of your choosing. Access Settings; tap Mail, Contacts, Calendars,
and Fetch New Data; and move the slider adjacent to Push to Off. Now you can
use the interval settings below to deliver mail less frequently. Bumping your mail
fetch setting to Hourly will help a bit, but fetching data manually can mean even
greater battery savings. To fetch new data manually, tap Manually from the Fetch
New Data screen.
If youre not inclined
to fetch less often, then
consider limiting the
email accounts that your
iPhone checks. To turn
off an email account, access Settings; tap Mail,
Contacts, and Calendars;
select an email account;
and then set the account
to Off. You can also delete
an account by accessing
Settings; tapping Mail,
Contacts, and Calendars;
selecting the unnecessary
email account; and tapping Delete Account.

Save More Email


Messages
Link Similar Contact Information
If you have duplicate or similar information for multiple contacts in your iPhone
(say, after importing Facebook contact
data), you can link those contacts together.
Open one of the relevant contacts, tap Edit,
scroll down until you see Linked Contacts,
tap Link Contacts, select the other contact you want to link, and tap Link. If you
should need to unlink contacts, open one
of the contacts, tap Edit, scroll down until
you see Linked Contacts, tap the red circle
containing a minus sign next to the contact
you wish to unlink, tap the Unlink button,
and tap Done.

58

October 2015 / www.cybertrend.com

By default, the iOS Mail


app only keeps a limited
number of incoming email
messages on the device for
Exchange users. In versions
prior to iOS 8, you could Disable Push to lighten your batterys burden.
increase the number of
messages retained on the
phone; in iOS 8, you can set a designated length of time for retaining messages. Start
by accessing Settings and tapping Mail, Contacts, Calendars. Tap the name of an
account and then tap Mail Days To Sync. Choose the length of time you would like
email messages to be available before they are no longer stored on or accessible from
your iPhone. For an Exchange account, for example, you can select 1 Day, 3 Days, 1
Week, 2 Weeks, 1 Month, or No Limit (watch your storage space if you choose the
last option); this only affects messages on the phone itself, not in Exchange.

BLACKBERRY
Add Contacts To Your
Home Screen
Launch the Contacts app and highlight the contact you wish to have
on your home screen. Press the
Menu key and select Add to Home
Screen. A small box will appear, with
an icon for the contact and the contacts name. You can change either
by tapping on it. When youre satisfied with the name and icon, tap the
Add button.

Merge Contacts & Calendar Items


If you find yourself with multiple contact and calendar entries on your
BlackBerry 10 smartphone, it could be the result of BlackBerry Link synchronizing items between your computer and your device but leaving behind items that you created on your smartphone. If thats the case, you can
rectify the situation by deleting the smartphone-only entries.
To do this, access Settings on your BlackBerry, select Accounts, select
the More (three vertical dots) icon, and then select Clear Local Contacts
and/or Clear Local Calendar. Keep in mind that this action permanently
deletes the entries that exist solely on the device.

Calendar View Hot Keys


If youre a regular user of the
Calendar app on your BlackBerry,
you may be tired of using the menu
and trackpad to switch between the
various calendar views. Try these
keyboard shortcuts instead:
Agenda View: A
Day View: D
Week View: W
Month View: M
Pressing the D key may not work,
because it may already be assigned
to the Quick Entry function. If you
would rather have the D key bring up
the Day view, follow these steps:
Launch the Calendar app
Press the Menu key
Select Options
Select the Calendar Display
Select Actions
Locate the Enable Quick Entry
option
Remove the tick mark from its
box

BlackBerry Link software simplifies the process of transferring data from another device.

Control Roaming Costs


Roaming, the ability of a phone to stay connected even when youre
outside of the service area offered by your mobile provider, is a handy option but it can also be an expensive one. In many cases, roaming can incur
additional costs, especially if you use data services when youre outside of
your carriers coverage area.
You can control the services that are available when youre roaming,
and either change the settings to have your BlackBerry turn data services off automatically or ask you what to do when you start roaming.
From your BlackBerrys Home screen, access Settings, select Network
Connections, and Mobile Network. Select the Data Services While
Roaming drop-down menu and tap either Off (to turn data services off) or
Prompt (to have your BlackBerry ask you what you want to do when you
start roaming).

CyberTrend / October 2015

59

PC Problems On The Road?


HERE ARE SOME QUICK FIXES

IF YOU HAVE USED a computer for any


amount of time, then you know that
PC problems can often occur with little
warning. Maybe you are having trouble
connecting to a Wi-Fi hotspot, or you
cant get your mouse to work. We explore how to troubleshoot these and
other common PC problems so you can
get back to work quickly.

Hotspot Troubleshooting
Ordinarily, when you carry your
laptop into an airline lounge, it will automatically connect to the available Wi-Fi
hotspot. But what if that doesnt happen?
First, check that your notebooks Wi-Fi
adapter is turned on. Often, youll see
a backlit Wi-Fi icon near the keyboard.
If the icon isnt illuminated, look for a
physical switch that you can flip to enable
the adapter. Sometimes, the state of your
network connection is easily determined
by an icon in the notification area of the

60

October 2015 / www.cybertrend.com

Taskbar. For instance, a red X on the network icon indicates the adapter is disabled while an asterisk means the adapter
is in the process of detecting the available
networks. You can right-click the network icon in Windows 7 or Win8 and
select Troubleshoot Problems. When the
Windows Net-work Diagnostics utility
opens, it will reset your connection, disable the wireless adapter, and then enable
the adapter again.
The utility will display descriptions of the problems it detects along
with some recommended solutions. In most instances the utility
will repair the connection and report
the issue as Fixed. To enable a disabled adapter, right-click the Network
Connections icon, click Open Network
And Sharing Center, select Change
Adapter Settings, and then right-click
the name of the wireless adapter. In
the resulting menu, you can choose to

disable or enable the adapter, connect to


or disconnect a network, and diagnose
problems, among other options. Click
Properties to access detailed options that
may help you troubleshoot the problem.
When your adapter is working properly, Windows may display a message
indicating there are several available
wireless networks. Select the message and choose a network SSID (service set identifier, or name) from the
list. (You may need to input a security
password.) To display a list of available
networks in Win8, go to the Settings
option in the charm bar and click the
Available Networks icon. If the adapter
is working and your system appears to
be connected, but you still cant access
the Internet, then check for a browserbased splash screen and/or a Terms Of
Use statement to agree to. Launch a fresh
browser session and click the Home icon
to redirect.

Fix Broken Outlook PST


& OST Files
The PST (personal storage table) file
and the offline OST (Outlook Data File) is
where Outlook stores messages, calendar
events, and notes specific to your email
account. If this file becomes corrupted,
you may find yourself ousted from
Outlook. There are a few things, however,
that you can do to get a foot in the door.
Scanpst.exe (Outlook 97-2003, 2007,
2010, and 2013), Microsofts Inbox
Repair tool, lets you solve busted PST/
OST problems quickly. To access the
tool, close Outlook and navigate to
C:\Program Files\Microsoft Office
\OFFICE12. (This last folder may have a
different number; for instance, our version of Office 2013 stores the utility in
the \OFFICE15 folder.) Double-click
Scanpst.exe. By default, the address for
our OST file was already listed, but if
the field is blank, look in the C:\Users\
USERNAME\AppData\Local\Microsoft\
Outlook\ folder. Click the Options
button to access Replace, Append, or No
Log functions and click OK. Click Start
to begin the scanning process. Windows
will inform you of any errors and prompt
you to perform a repair when the scan
is complete. Before clicking the Repair
button, make note of the scanned files
backup location. Click Repair and OK
when you see the Repair Complete message. Launch Outlook to see if this fixes
the problem.
If the file structure was corrupted beyond repair, Scanpst.exe resets your file
structure and rebuilds the headers. The
Recovered Personal Folders item in your
Outlook folders list, if it appears, will
contain all the data that is recovered. You
can then drag the data to your new PST
file and delete the Recovered Personal
Folders item from Outlook.

The Microsoft Outlook Inbox


Repair Tool (Scanpst.exe)
lets you quickly recover
corrupted Outlook PST
and OST files.

A Touchy Touchpad
If you use your laptop on a dock (and
use an external mouse and keyboard),
you can go weeks or months with a deactivated touchpad and never realize
it until you hit the road. If you find
yourself in this situation, you can activate the touchpad by pressing the Fn
(function) key simultaneously with the
F number key associated with the laptops touchpad (often labeled with an
image of a touchpad). Using this key
combination will either automatically
activate the touchpad or display a device settings dialog box that gives you
the option to enable your touchpad.
Alternatively, you can check the notification area in the lower-right corner
of the screen for a touchpad icon. Click
the icon and the touchpad control panel
appears where you can enable or disable
an input device.

An Unresponsive Keyboard
Or Mouse
If your programs and applications
dont respond to keyboard commands,
use your mouse to shut down the computer by clicking Start, then Shut Down
(in Win7) or tap the Power Button and

tap Shut Down (in Win8). Unplug the


keyboard from your PC and then reconnect it. Restart your PC to determine whether this process corrected the
problem. (If both input devices are unresponsive, you can press and hold the
Power Button on the tower to manually
shut down your system.)
If your mouse isnt responding, but
your keyboard is, press the Windows key
in Win7 to open the Start menu, use the
Right-Arrow key to select Shut Down,
and then press ENTER. In Win8, press
CTRL-ALT-DELETE, press the Tab key
until the power icon is highlighted, and
then press ENTER. Unplug your mouse
and then reconnect it. (If necessary, you
can press and hold the Power button to
shut down the PC.) Then restart your
computer to see if these instructions fix
your problem.
If youre using a wireless keyboard and
mouse, ensure that the peripherals are
synced and in range of the wireless receiver. You may also need to install new
batteries. If these steps dont enable peripheral communication with the PC, try
reinstalling device drivers. You can often
download these from the mouse and keyboard manufacturer websites.

IF YOU USE YOUR LAPTOP ON A DOCK, YOU CAN GO WEEKS OR


MONTHS WITH A DEACTIVATED TOUCHPAD AND NEVER REALIZE IT
UNTIL YOU HIT THE ROAD.

CyberTrend / October 2015

61

Laptop-Projector Setup Problems


TROUBLESHOOT COMMON ISSUES WITH THESE HANDY TIPS

YOURE READY TO give your presentation, but until that first slide appears on the big screen, you can never
be sure that your equipment has got
your back. We cant tell you not to
worry, but these handy tips should
help bail you out if your presentation
goes south.

Hardware & Cable Connections


It can be difficult to track down the
source of problems that occur when
you are connecting a notebook and
projector. Following are some things to
watch for.
Video. Turn off all equipment and
connect your notebooks video out port
to the projector. The usual connection
choices for a notebook are VGA (Video
Graphics Array), DVI (Digital Visual
Interface), HDMI (HD Multimedia
Inter-face), and DisplayPort. Many
projectors have VGA and one or more

62

October 2015 / www.cybertrend.com

digital connections. If possible, use a


digital connection for high quality.
Sound. Some HDMI and DisplayPort digital video connections can
carry audio through the same port,
but both notebook and projector must
support audio over the digital video
connection. Traditionally, audio is
connected using the notebooks audio
out jacks and the projectors audio in
ports; both of these are often RCA or
3.5mm. If youre not using the projectors built-in speakers, make sure you
connect your notebooks audio out to
the sound system you intend to use and
turn the volume down on the projectors speakers.
Mouse. If you are using a mouse, or
a remote mouse controller, make sure
the controller/mouse is connected, usually through the notebooks USB port.
If you are using a wireless device, make
sure the notebook has the appropriate

wireless connection enabled. This is


typically Bluetooth or a USB port wireless dongle.

Network Connection
Many venues supply network projectors, which are made available as
a shared resource. Making a connection to a network projector is as easy as
plugging your notebook into the corporate network via wired or wireless
Ethernet. Check with the companys
IT staff for specifics. Once connected,
use the network connection wizard in
Windows 7 to find the projector you
wish to use:
Click Start (the Windows button in the bottom-left corner
of the screen).
Click All Programs.
Click Accessories.
Click Connect To A Network
Projector.

The network connection wizard


may inform you that your
notebooks firewall is blocking
the ability to connect with the
projector. Click to establish the
network connection.
Either have the wizard search
for available network projectors
or enter the projectors address
manually if it is available.
Once the device is connected, a
Network Presentation window will
minimize to your Taskbar. When
youre ready to make your presentation, open the Network Presentation
window and select Resume. Your notebook will treat the network projector
like an external monitor.

No Video
In many cases, your notebook will
detect that you have a projector plugged
into one of its video outputs and will
automatically turn on the port. Not all
notebooks do this, however; and even
those that can still have missing video
if the notebook isnt set to duplicate the
Desktop or extend it to the secondary
monitor (the projector). Many notebooks use a function key combination
to toggle the projector port on or off
and set how you can use the display.
We recommend using the control
panels in Win7:
Right-click a blank area on the
Desktop.
Select Screen Resolution.
Select the second display from
the drop-down menu.
Select Extend These Displays
from the Multiple Displays
drop-down menu. Your Desktop
background should now appear
on the projector.
Win7 also has a pop-up display for
selecting the content that is sent to the
projector. Press the Windows-P keys
to bring up the four possible selections:

Disconnect Projector (turns the


projector display off)
Duplicate (mirrors your computers Desktop on the projector)
Extend (uses the projector as an
extension of your Desktop)
Projector Only (turns off your
notebooks display and uses the
projector as the main display)

Video Is Out Of Range


When the projector cant reconcile
a video signal from a notebook with
its preset resolution, it displays an
out-of-range message. To solve this
in Win7:
Right-click a blank area on
the Desktop.
Select Screen Resolution.
Select the display associated
with the projector.
Use the resolution drop-down
menu to adjust the resolution to
the correct value. Try 800 x 600
or 1,024 x 768 as these are resolutions that many projectors
can handle.

Display Turns Off


If the projectors display turns off
during your presentation, you'll want
to check your notebooks power management feature, especially if youre
running the notebook off of its battery. Whenever possible, use your AC
adapter to run your notebook.

Video Wont Display Or Is


Choppy
Your slide presentation works fine,
but when you try to show a video, all
you see is a blank window or a choppy
rendition of the video. Trying to display a video on two monitors can be
too much for a video card that has
marginal graphics capabilities. If video
isnt displaying correctly, change the
Display settings to make the projector
the primary display.

NOTEBOOK-PROJECTOR
TROUBLESHOOTING
TIPS
Turn off all equipment before
connecting the notebook to the
projector.
If possible, use a digital connection to ensure a high-quality
presentation.
If youre not using the projectors built-in speakers, turn
them down and connect the
notebooks audio out to the
sound system.
If youre using a wireless
mouse or controller, make sure
you can establish the wireless
connection.
Use the network connection
feature in Windows 7 to connect to a network projector.
No video? Check the ports and
Windows Screen Resolution
settings.
Adjust the screen resolution to
resolve out-of-range messages.
When a projected image isnt
proportionally correct, reposition
the projector and/or change the
projectors keystone setting.
If a display turns off during a
presentation, check the notebooks power management
settings.
If video isnt displaying correctly, change the Display settings to make the projector the
primary display.

CyberTrend / October 2015

63

Social Media Privacy Tips


TAKE CONTROL OF YOUR ONLINE PRIVACY

SOCIAL MEDIA IS ALL ABOUT sharing


our lives with friends and family, and
vice versa. From daily musings about life,
such as a friend thats excited about an
upcoming vacation, to important events,
like the birth of a new grandchild. And
although it might not seem like the news,
photos, personal achievements, failures,
and videos you post would be of much
interest to people you dont know, the
information could be useful to cybercriminals trying to steal your identity. The
default privacy settings on many social
media websites make it so your posts,
tweets, and photos are visible to the public.
Fortunately, its easy to adjust these settings, so that only the people you know will
see the updates.

Facebook
When setting up your Facebook profile,
the service will ask for a lot of personal
informationincluding education history,

64

October 2015 / www.cybertrend.com

workplace, and phone numberthat you


might not want visible to everyone. To
complicate matters, Facebook hasnt exactly been known for consistency when
it comes to users' privacy settings, as past
interface changes have reset settings and
forced users to continually ensure their
posts and personal information remain
private. To correct some of these issues,
Facebook has made changes in the last year
to simplify its privacy controls.
Click Privacy and youll see a list
of configurable options. For example,
in the Who Can See My Stuff? section,
manage who can see your future posts by
selecting Public, Friends, Friends Except
Acquaintances, Only Me, or Custom. This
way, you can make certain that your posts
won't be viewable to the public at large if
you forget to change the privacy settings
when you post an update. You can also
review the posts youve been tagged in, as
well as change the audience for updates

youve previously posted. This way, you


can control whether any old updates are
available to the public.
There are also Who Can Contact Me?
and Who Can Look Me Up? settings that
let you filter access to non-friends.
One easy way to assess the entirety
of your Facebook privacy is to use Facebooks Privacy Checkup (click the Lock
icon in the top-right corner of Facebook).
Select Privacy Checkup and, in the resulting pop-up window, Facebook shows you
the controls for who can see your posts.
If youre following our steps, youve already addressed this step. Click Next Step
to see what apps youve logged into with
Facebook. Delete the apps you no longer
use. When you're done, click Next Step.
Finally, Facebook will bring up the information shared on your profile. Here,
youll see options to add a phone number,
email, birthday, hometown, and other information. Click Finish Up to finalize your

new privacy settings. All of the information


in the last step can be found in the About
section of your profile, which also contains other information you might want to
make private. To do so, click your personal
timeline and select About. Under the tabs
for Work And Education, Place Youve
Lived, and Contact And Basic Info, you
can adjust the privacy settings for details
that werent part of the Privacy Checkup.

Facebooks primary privacy settings can be


found in the Privacy window.

Twitter
By default, Twitters account settings
make your tweets available for all to see.
The alternative is a protected mode, where
your tweets are only visible to your approved Twitter followers. Protected tweets
are not retweetable, so even approved users
cant share your tweets. You also cannot
share permanent links to your tweets with
anyone but approved followers. If you
want to use Twitter to drive Web traffic,
the restrictions in the protected mode
might undermine why you joined Twitter
in the first place.
If you want to adjust your tweet privacy level, or the other privacy controls
on Twitter, sign into Twitter and open
your account settings. Next, click Security
And Privacy and scroll down to Privacy.
If you only want approved followers to see
your tweets, click the Protect My Tweets
checkbox. You can also control who can
tag you in photos, whether your tweets
include a location, and how others can find

you. After making your privacy selections,


click the Save Changes button.

Google+
For Google+, privacy has been a key
consideration from the very beginning. For
example, youve always been able to assign
a privacy level for each post you share. And
based on the Circles (friend groups) youve
set up, its easy to share content with only
a specific crowd. Google+ also offers detailed privacy settings where you can control most every aspect of your profile. Visit
your Google+ page, click your name, select
the drop-down menu under the Google+
logo, and choose Settings.
In the Settings window, you can customize who can send you notifications,
comment on your public posts, and
manage subscriptions. If you want to
configure the audience settings for your
posts, photos, and profile updates, scroll
down to the Your Circles section and click
Customize. By default, Google+ pushes
updates to the people in your Friends,
Family, and Acquaintances groups. To
block a particular group, remove the check
from the checkbox. If you want to reach a
larger group of people, you might want to
add a check to the Following checkbox, so
followers of your Google+ profile will be
added to Your Circles list.
Next, scroll down to the Profile section where you can configure how people
are able to find your profile and control
what content displays in your profile. A
setting of interest for businesses is Allow
People To Send You A Message From
Your Profile, as this setting offers a way for
consumers to reach out to you. If the setting is limited to Your Circles or Extended
Circles, customers might not be able to
contact you.
If you use Google+ on your mobile
device, youll also want to examine the
Location Settings section. These settings
let you enable or disable location reporting
via your smartphone and tablet. If enabled,
you can control who can see your current city and/or exact location. The precise

Google+ offers a variety of privacy controls.

location is ideal for those who wish


to share their location with friends and
family. If thats something you dont plan
to do, then it might be best to disable location settings.

LinkedIn
The business-focused nature of LinkedIn ensures that privacy is a priority. To
examine your settings, log in to LinkedIn,
hover your pointer over your profile
photo in the right-hand corner, and select
Manage next to the Privacy & Settings option. In Privacy Controls, youll find a host
of options to control what others can see
on your profile and activity feed.
If you use LinkedIn to search for new
clients and key connections within an organization, you can opt to remain anonymous, so people wont know that you
looked at their profile. To do so, click
Select What Others See When Youve
Viewed Their Profile. There are two anonymous options, one where others will see
an industry and title, or you can opt to
be completely anonymous. You can also
manage who can follow your updates, edit
blocked connections, and shut down users'
ability to view your connections.

Manage All Your Online Accounts


Now that we've explored the basic steps
of managing your privacy settings, it would
be wise to check your privacy settings for
other social networks you might use. This
way, you can have a measure of control of
your publicly available online data.

CyberTrend / October 2015

65

PROCESSOR

SPECIAL ADVERTISING & CONTENT


FROM OUR PROCESSOR PARTNERS

Helping IT stay on pace with the

SPEED OF CHANGE

Processor is designed for the IT world, covering the hardware


and technologies that power todays data centers.

PROCESSOR
SIX QUICK TIPS

Detect Sources Of Excessive


Noise In The Data Center
Methods For Reducing Potentially Damaging Noise In The Data Center
Data center workers function under
conditions many others in the organization dont realize. Noise is one example. Anyone uncertain of just how loud
a data center can get need only to listen
to audio clips searchable online recorded in actual data centers. Depending on
the environment, the constant din that
power supply fans, UPSes, air handlers,
and other equipment produce can be
excessive enough to cause headaches,
fatigue, ringing in the ears, loss of concentration, and more.
Many estimates place a typical data
centers noise level between 70 and
80dBs. That falls below OSHA guidelines that state employees shouldnt
work in environments with a noise
level at 85dBs for more than 16 hours.
For data centers hovering around this
mark, the following strategies can help
detect and lessen noise.

Get Serious
If most of a companys operations
are performed remotely outside the
data center, excessive noise can be
viewed as being a minor concern, says
Roy Illsley, Ovum principal analyst.
Still, excessive noise can impact others outside the data center, potentially
requiring adding insulation to prevent

noise from interfering with employees


located nearby, Illsley says. Further,
noise can cause general mechanical vibrations that lead to equipment
wearing down, he says.
Ken Koty, PDU Cables sales engineer, says in data centers he has
worked in, noise was made a high priority. As part of standard procedure
while walking through sites, in addition to looking for weak points of failure and places where best practices for
energy saving werent being implemented, Koty used a decibel meter
to check throughout the environment.
When building new sites, he says, an
acoustic engineer was hired to deal
with noise in server rooms and noise

Go Remote
Although Roy Illsley, Ovum principal analyst, hasnt personally heard of
cases in which noise exposure has caused a data center employee to
suffer from long-term physical damage, hes sure there have likely been
cases of such. Positively, many data center operations are now performed remotely, he says, thus employees dont need to spend as much
time in the data center as previously.

created outside the building where


cooling towers, diesel generators,
cooling engines, and exhaust noise
were in proximity to neighbors.

Know Whats Acceptable


As mentioned, OSHA states that
a noise level topping 85db must be
addressed. Koty believes organizations should attempt to bring levels
lower when possible. Considering
that the dB level of normal conversation is around 60dB, he says, anything much above that will cause people to have to compete with the noise
in the room. Lower noise levels will
make a more desirable and productive
work environment.
Common advice for determining if
the noise level is accessible is simply
trying to hold a conversation in different locations in the data center.
Your ears can tell you if something
is noisy, Koty says. Employees will
also note that noise is a problem. If
its hard to communicate with each
other and people mention theyre not
comfortable working in the data center due to noise, get a decibel meter
and record noise levels in multiple
areas of the room, Koty says.

CyberTrend / October 2015

67

PROCESSOR
SIX QUICK TIPS

The Primary Causes


To illustrate how data center noise
is produced, many experts commonly say to imagine the noise that one
server fan generates and then expand
that to an entire room or floor of servers. Related to this, picture servers
sitting in open racks vs. servers sitting in closed cabinets.
Overall, Koty says data centers
are higher capacity entities, meaning more watts per square foot. Thus,
youre bound to have more noise,
he says. This will require more cooling for the data center. The two biggest culprits [for noise] are servers
and AC equipment.
After pinpointing equipment thats
causing excessive noise levels, general advice is to first ask the manufacturer or vendor if it has a remedy.
This may involve changing parts or
installing new fans. Elsewhere, Illsley
says hes seeing DCIM tools being
used to gather insight related to fans
drawing more power, from which
noise would be a possible byproduct.

What To Do
Among easier-to-implement steps
that many data centers already have
taken to combat noise is providing

BONUS TIPS:
Replace The Old With
The New
Ken Koty, PDU Cables
sales engineer, advises
that organizations replace
their older equipment that
is showing evidence of
wear and tear with newer
and quieter equipment.
For example, he says,
technological advancements have led to a generation of fans today being

68

October 2015 / www.cybertrend.com

workers and visitors with earplugs


and noise-cancelling headphones.
Other remedies include installing acoustic ceiling and wall tiles to
dampen sound and prevent sounds
from bouncing around. (Ceiling
options include tiles that can be suspended.)
Another option is using server
cabinets featuring noise-deadening
construction. This will greatly help
cut down noise levels, Koty says.
Reportedly, such cabinets can cut
noise in the 5dB range. Koty also
advises only using those AC units
needed to do the job. N+1 is OK,
but theres no need to run way more

units that necessary to cool the


room, he says.
Illsley believes the best approach
for lessening noise levels is immersed
liquid-cooling technology, which acts
not only as a cooling agent but also
insulates the noise, he says. Other
technologies include the greater use
of flash storage, and generally matching demand with supply so the data
center is operating optimally, or in
other words, ensuring the data center
isnt drawing more power than needed
and thus cooling more than needed.
DCIM helps with this by identifying
power and temperature so you can balance the data center, he says. P

Take The Talk Test


A common piece of advice for determining if the noise level in a data
center is problematic is simply trying to conduct a conversation in different locations throughout the data center. If its difficult to hear someone
or convey a message in person or via phone call, noise is a problem.
Important for data center managers to note is that besides being
uncomfortable to work in, a noisy environment can lead to less productivity among employees, including due to less concentration in completing tasks.

quieter than previously


generations. An indicator that a fan operating in
older equipment is showing some wear and is thus
a candidate for replacement is audible screeching
and whining coming from
the equipment. Notably,
in addition to being noisy,
a worn fan can also be an
early sign of the equipment its running in will
potential fail soon, he
says.

Considering Raising
The Temperature
Some experts suggest
that because cooling
systems play a significant role in generating
noise within data centers, including possibly
to the point that noise
exceeds acceptable
levels, organizations
should consider raising the temperature
in the data center.
The reasoning is that

doing so would mean a


HVAC unit, for example,
wouldnt need to run at
as an intense of level.
Key to this taking this
approach, however, is
to carefully monitor the
conditions after raising
the temperature and
ensuring that it doesnt
lead to unsafe operating levels for equipment
resulting in possible
damage or failure.

PROCESSOR
SIX QUICK TIPS

Make Your Network


Telecommuter & Mobile Friendly
Tips Thatll Help Remote Workers Be More Productive
An increasing number of employees are working away from the office
these days. And whether you have a
full-blown group of telecommuters or
simply a handful of people that want
to get work done via mobile devices,
its important that those workers can
easily connect back to the corporate
network. Here, well discuss some
things youll want to consider when
making your network more out-ofoffice friendly.

Single Sign-on & IAM


Also known as SSO, single sign-on
is one of the easiest ways to eliminate
steps when remotely accessing files and
data. Basically, its a centralized application that manages all your passwords
that prevents you from having to logon
over-and-over againonce youre
within the network, says Jon Arnold,
principal at J. Arnold & Associates.
SSO services are part of many
Windows enterprise editions, and allow
you to integrate sign-on with middleware applications, as well as front- and
back-end applications. This way, youll
have a true single sign-on for every

corporate PC, in addition to things like


domain boundaries and resources.
SSO services are also part of many
Identity and Access Management
(IAM) tools, which often extending
SSO capabilities for mobile devices.
An SSO-enabled IAM can help you to
further reduce the number of devices
and times employees must provide a
username and password. If you havent
updated your IAM approach in the past
few years, now might be the time to

What Do You Need For Single Sign-on?


In general, an SSO system will include a credential database, as well as a SSO
server (or servers). Many SSO systems also have a master secret server, which
holds the master secret codes and directs the various SSO servers in the network.
The credential database will typically store data necessary for affiliated applications, including encrypted user credentials that those applications might need
to use. An administrator, of course, will be in control of access rights and group
policies within the SSO environment. SSO might also be combined with tools for
web access management, identity federation, and cloud IAM (Identity Access
Management).

re-evaluate and see how modern IAM


tools can make it easier for remote
workers to get the job done.

The Future Is EMM


If youre managing mobility on a larger scale, you might consider going with
an Enterprise Mobility Management
(EMM) tool. This relatively new technology generally includes elements
of both Mobile Device Management
(MDM) and Mobile Application
Management (MAM), which make
EMM a more comprehensive approach
to connect with remote workers. Gartner
expects that, by 2017, EMM integration
will become a critical IAM requirement
for 40% of buyers, up from less than 5%
in todays enterprises.
Currently, one of the limiting factors
with EMM is that the present-day solutions might not have complete functionality with Windows operating systems.
Garter expects the EMM providers will
begin to offer more universal endpoint
management that will encompass traditional desktop, laptop, mobile devices,
and IoT (Internet of Things) hardware.

CyberTrend / October 2015

69

PROCESSOR
SIX QUICK TIPS

At this point in time, though, you might


need to use disparate IAM and EMM
controls to make everything work
seamlessly.

The Consequences Of
Complications
The whole point of SSO, IAM, and
EMM tools are to keep it simple for
the workforce. If what you provide
employee isnt simple, quick, or robust
enough, its likely theyll move to a
less secure alternative. Just as with
BYOD, they WILL ignore the rules if
you make it too hard for them to get the
job done, says Kate Lister, president at
Global Workplace Analytics. I cant
tell you how many times Im trying to
connect with someone, but they cant
get past the firewall, so they instead
switch to their personal technology.
Arnold echoes Listers thoughts.
IT must also understand that theyll
need to keep it simple, because the
alternative is consumer-grade applications. Arnold gave us the example of
an enterprise that has setup a secure
file-sharing servicebut the service
requires the user to logon each time and
the interface is difficult to useso the
workforce opts to use their unsecured
third-party personal account to share
files. And as soon as you steer workers
away from what youve setup, its hard
to get them back again, says Arnold.

BONUS TIPS:
Training
Kate Lister, president
at Global Workplace
Analytics, tells us that
Unless people are
adequately trained in and
regularly use the technologies offered, they will not
use them effectively or
efficiently, and she adds
that you should Make it
safe for them to say they

70

October 2015 / www.cybertrend.com

You also lose capabilities for integrating that data into the rest of the applications you use within the company.
Even worse, your organization might
have spent a good deal of money and
resources to create a secure solution. If
employees are going out of their way to
use something else, the time and funds
spent developing the solution becomes
a big waste. The lesson here is that
application development isnt complete
until youve made it easy for everyone
in the company to use.

Are Your Network & Devices


Compatible?
The regularity of mobile OS
updates can create issues with security and authentication controls. Those
upgrades have to propagate back to
IT, and if that upgrade isnt on your
network, you wont be able to prop-

erly support the phone, says Arnold.


With a list of supported devices, IT
can focus on securing everything, while
also providing users with a consistent
experience. Organizations experimenting with a Bring Your Own Device
(BYOD) policy might run into several
headaches when it comes to authentication, as you might have a wide-variety
of devices to secure, and each might
have their own quirks.
Even if everyone in the organization is
using the same devices, things not might
run smoothly. For example, employees
using an out-of-date version might not
be able to access important CRM and
other company resources, says Arnold.
Its important that you have a way
to push updates onto the devices and
ensure theres no issues. Alternatively,
youll need to implement a system that
works well with legacy technology. P

A Chance To Make A Good Impression


Ease-of-use is another way for IT to add value to the business, says Jon
Arnold, principal at J. Arnold & Associates. Upper management might not
ask many questions about the technology, but they care about the results
employees generate. If IT can make it easy for employees to do things that
they couldnt do before, management has a good example of how IT can
make a positive impact on the business.

need help. By advising


them both on how to use
the technology and why it
will help them, IT can help
daunted workers buy into
the process. One way to
standardize training is to
make the login processes
the same across all enterprise devices. Equip onsite workers the same as
virtual workerswhether
they are nine floors, nine
miles, or nine time zones

away, they are all connecting virtually, says Lister.

Biometric Authentication
The enterprise environment has long been on
the bleeding edge of
biometrics as a form of
authentication. By 2020,
Gartner expects that
advanced biometrics, such
as face recognition, voice
recognition, and keystroke
dynamics, will overtake

passwords and fingerprints


for access to endpoint
devices. Compared to fingerprints, Gartner points
out that these advanced
biometrics wont require
any specific sensors,
instead, youll need support from the endpoint OS.
And when combined with
single sign-on, biometrics
could become a powerful
tool to ease connectivity to
the enterprise.

PROCESSOR
HOW TO

Solve Common Wi-Fi Security


Problems
Tips That Will Help You To Transform Your Organizations Wi-Fi Protection
Wi-Fi Protected Access 2
(WPA2) Enterprise has been a stalwart standard for protecting wireless
networks since its release in 2004.
And while the use of Advanced
Encryption Standard (AES) algorithms
and Counter Cipher mode with BlockChaining Message authentication code
Protocol (CCMP) offer a solid foundation for encrypting the data going over
an organizations Wi-Fi network, its
just the starting point for a business
Wi-Fi security.

WPA2 Enterprise Advantages


WPA2 Enterprise is full of features
that will help you to block out and detect
intruders, yet not all enterprises implement or fully monitor these advanced
capabilities. SANS Certified Instructor

Larry Pesce, says Many organizations


favor WPA-PSK over Enterprise, dont
implement WIDS [Wireless Intrusion
Prevention System]/Rogue-AP detection, and dont monitor logs and alerts
nor does the staff do due diligence on
certificate validation on EAP [Extensible
Authentication Protocol] implementation.
The biggest issue with the WPA2
Personal standard is that its much
more difficult to manage at scale, and
when faced with complications, workers may connect to nearby unsecure
Wi-Fi networks, which might create
vulnerabilities in your overall network
security. WIDS-ready network devices
monitor the local area network for any
unauthorized access, as well as rogue
access points and other wireless attack

Key Points
Employ WPA2 Enterprise
encryption
Validate users and setup a
guest network
Eliminate weak points in your
network

tools. In many cases, the WIDS devices


will automatically prevent the threats
from acting on the network, and network administrators should immediately
follow up on WIDS alerts, rather than
assuming that the hardware will always
deal with the problem.
The EAP framework is one of the
most powerful tools an enterprise can

CyberTrend / October 2015

71

PROCESSOR
HOW TO

use. Employees are assigned login


credentials that theyll enter when
connecting to the network. A Remote
Authentication Dial-in User Service
(RADIUS) server performs the authentication, while your device serves as
the actual authenticator. In this way,
employees dont have to deal with
encryption keys, nor are the keys stored
on the device.

Check All The Boxes


When improving your Wi-Fi security,
its important that avoid shortcuts and
take the time to set it up correctly. I
often see wireless networks implemented with what seems like excellent security, but when you dig in to the details,
simple things have been missed, says
Pesce, who points out missed chances
for appropriate validation of TLS certificates with EAP types under Enterprise
WPA, failure to appropriately configure
captive portals and network services on
guest networks, self-signed TLS certificates on wireless resources, and even
IPSEC VPNs that allow split tunneling.

Reduce Weak Spots


A strong wireless security posture
within the enterprise network should
also be supplemented with policies for
secure access to sensitive data outside
the office. As an attacker, Im interested

in attacking clients at the area where


security is the weakest; following a high
value target to the coffee shop, on an airplane, or even at home, says Pesce. But
preventing users from reaching those
resources outside the office has its own
consequences. Pesce says Restricting
which networks corporate assets can
connect to is a challenge especially
where a mobile workforce is concerned
and this is where some of the largest
problems can arise.
With networks that are outside your
control, you might be able to create
some rules that will allow for people
to securely access the files and apps

they need. Security barriers are necessary, but are increasingly moving away
from the network, says Mike Battista,
secure manager and analyst, infrastructure, at Info-Tech Research Group.
For example, an individual application can only reveal itself to an authorized user, using a safe device, at the
right time. In-house apps might also
let you create restrictions about what
remote workers can do, or what types
of files they can access. Many enterprise mobility management (EMM),
cloud management, and identity management tools are available to help you
secure mobile devices. P

Action Plan
Isolate Guest Traffic. Enterprises need to allow the occasional guest to
have access, and while most enterprise WLAN access points support this
with a separate sign-in portal, youll need to manage the VLAN to ensure
they have limited file access. Typically, IT managers will limit visitors to
Internet access, unless guests have a critical need for corporate assets.
Another good idea is to set a time limit on guest accounts.
Monitor Guest Access. So, now that youve setup guest access, you
might considering if you need to monitor it for legal reasons. For example,
do you need to follow regulations that require you to collect the data for
a given amount of time? If so, you might need to to record what guests
access over your network. On the flip side, you probably also need to
stick within the privacy laws of your state.

Top Tips
BYOD & Wi-Fi Security. We all know that BYOD makes life tough for IT, and since most of these personal devices connect via Wi-Fi, it's important to utilize solid security and create detailed policies about how workers can connect their personal devices. Some enterprises opt to offer only guest access to BYOD, while others have employees register the device and use a corporate login for authentication. With the latter, IT can treat BYOD devices similarly to corporate-owned devices. Often, employees will need to sign an agreement that they will abide by organizational policies for what data will and won't be allowed on the personal device.
Hotspot 2.0. In an effort to better secure public Wi-Fi, the Wi-Fi Alliance has created a certification program for Passpoint devices that will allow Wi-Fi connections to be made in a similar way to cellular connections (and with similar security). Mobile devices can authenticate at hotspots over Wi-Fi using EAP, just like in your enterprise network, based on a Subscriber Identity Module (SIM), a username and password, or a certificate. All connections are automatically secured using WPA2 Enterprise. The Passpoint certification is based on the Wi-Fi Alliance's Hotspot 2.0 specifications (see the sample client configuration below). "While I don't quite think that it's ready for deployment across the board, I'm really excited about Hotspot 2.0 and EAP-SIM," says SANS Certified Instructor Larry Pesce. "I think that these two new upcoming technologies will eventually drastically change the way we authenticate to wireless networks of all types, including in the enterprise."
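For a sense of what Passpoint looks like on a client, a wpa_supplicant sketch might enable Hotspot 2.0 interworking and define a credential; the realm and account values are placeholders, and an EAP-SIM device would pull its credentials from the SIM instead of a password.

    # Sketch of a wpa_supplicant.conf fragment enabling Passpoint (Hotspot 2.0).
    # Realm and account values are placeholders.
    interworking=1
    hs20=1
    cred={
        realm="example.com"
        username="user@example.com"
        password="placeholder"
        eap=TTLS
    }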


PROCESSOR

FEATURED PRODUCT

Cabinet-Level Access Control


Austin Hughes InfraSolution X-800 Secures Your Cabinets,
Providing Monitoring, Control, Alarm, Reporting & More
Protecting your cabinets, and the equipment inside of them, is more important than ever. But it's about more than just securing the doors: it's about knowing who has opened the cabinet, what they did, and, perhaps most importantly, controlling who should even have access to that cabinet.
"More data center managers are recognizing this need and upgrading cabinets with smartcard access control," says Anthony Yim, general manager at Austin Hughes. The company's InfraSolution X-800 series goes beyond access control to provide a complete networked solution with monitoring, control, alarm, and reporting.
"You can expand the InfraSolution X to cover intelligent PDUs, fans, and environmental sensors and remotely manage those devices via the InfraSolution X Management software, saving travel time and costs and increasing data center efficiency," Yim says. For basic needs, Austin Hughes offers the InfraSolution S for rack access control with standalone smartcard handles.
The InfraSolution product offers several advantages over its competitors, Yim says, particularly when it comes to compatibility. The InfraSolution X-800 handle is designed for global IT-branded cabinets, he says, and its universal mounting cut-out allows easy integration with most third-party cabinets, avoiding costly and complicated door customization.
InfraSolution X comes with its own software for remote management, but if you have an existing building management system, InfraSolution X supports SNMP for easy integration. Plus, InfraSolution X works with proximity and MIFARE smartcards, the two most common formats, so you can keep using existing smartcards.
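That SNMP support means a building management system, or even a quick script, can poll the unit like any other networked device. A generic sketch using the pysnmp library follows; the address and community string are placeholders, the OID shown is the standard sysDescr object, and you would consult the product's MIB for its actual access-control objects.

    # Generic SNMP poll sketch using pysnmp; address/community are placeholders.
    from pysnmp.hlapi import (SnmpEngine, CommunityData, UdpTransportTarget,
                              ContextData, ObjectType, ObjectIdentity, getCmd)

    error_indication, error_status, _, var_binds = next(getCmd(
        SnmpEngine(),
        CommunityData("public"),
        UdpTransportTarget(("192.0.2.10", 161)),
        ContextData(),
        ObjectType(ObjectIdentity("1.3.6.1.2.1.1.1.0")),  # SNMPv2-MIB::sysDescr.0
    ))
    if error_indication:
        print(error_indication)
    else:
        for name, value in var_binds:
            print(f"{name} = {value}")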

Austin Hughes InfraSolution


Available both standalone and as a networked version with software for remote
management
Easy integration with most third-party
cabinets

(510) 794-2888
www.austin-hughes.com

PROCESSOR
IT & FACILITIES MANAGEMENT

Environmental Monitoring Without The Need To Plug In To The Network

New AVTECH Room Alert 3 Wi-Fi Monitor Expands Facilities Monitoring To New Markets
AVTECH Software's new Room Alert 3 Wi-Fi marks the latest in the company's long line of powerful and popular IT and facilities temperature and environment monitors. AVTECH, which has been in business since 1988, now has more than 130,000 customers in 180 countries. Michael Sigourney, president and CEO, says this new Room Alert model will reach an even bigger audience because it eliminates the need to physically plug in to the network.

Small Footprint
The unique footprint and features of Room Alert 3 Wi-Fi make it perfectly designed to assist with monitoring temperature and other environmental conditions where a small footprint is needed, where a wired connection may not exist, when the investment cost needs to be minimal, or where deployment volume may be high.
With one digital temperature sensor built in, users can expand monitoring by adding another digital sensor (e.g., temperature, humidity, or outdoor and fluid temperature) as well as a switch sensor for conditions such as flood/water, power, smoke/fire, airflow, room entry, motion, and more.

Use Anywhere There's Wi-Fi
Room Alert 3 Wi-Fi has many benefits and can be used anywhere a Wi-Fi connection is available. There are no cables to run, so you can use Room Alert 3 devices around the world and monitor them all together on a single screen through AVTECH's Device ManageR software (included free) or the GoToMyDevices (www.GoToMyDevices.com) cloud service. Advanced alerting, mapping, and graphing features provide easy overview and analysis. New customers receive a one-year Personal subscription to GoToMyDevices at no charge.

Affordable Price Opens New Markets, New Uses
Based on its ease of use and price of just $175, Sigourney says, Room Alert 3 Wi-Fi can help bring temperature and environmental monitoring into markets that have previously been slow to adopt this technology. Key markets include IT, medical, cold storage, housing, retail, food service, museums, public buildings, farming, transportation, warehousing, and distribution. AVTECH Room Alert 3 Wi-Fi is available direct and from professional resellers in 180 countries.

AVTECH Room Alert 3 Wi-Fi

Monitor, alert, log, graph, view, map, report, manage, and protect any facility.
Includes powerful Device ManageR software (free) and one year of GoToMyDevices Personal cloud service (free).
Ideal in areas that require a small footprint and have no network connection. Includes a built-in digital temperature sensor. Use in computer rooms, warehouses, medical, cold storage, restaurants, residential, and more.
$175 price makes it affordable to deploy or use in areas requiring a large number of devices. Over 30 sensor options.

(888) 220-6700
Sales@AVTECH.com
Go to AVTECH.com and click Store

FEATURED PRODUCT

A Purpose-Built JBOD
AIC SAS3 JBODs Deliver High Performance, Serviceability &
Reliability, Complete With Intelligent Enclosure Management
There are a lot of ways to expand storage capacity in the data center, and JBOD has always been one of the most inexpensive methods to do so. AIC optimizes its SAS3 JBODs with performance and reliability in mind, making them perfect for high-availability applications.
The J2012-01, for example, is a 2U, 12-bay JBOD that boasts 48Gbps connectivity per Mini-SAS HD cable. You can attach up to three Mini-SAS HD connectors per expander tray. Excellent scaling performance is handled by LSI's SAS3x28R expander chip. You'll be able to use almost any hard drives you want, as AIC's firmware and LSI's expander chip support mainstream HDDs, HBAs, and RAID controllers (SAS2 and SAS3).
For reliability, AIC provides redundant 549-watt hot-swap power supplies, which meet the 80 PLUS Platinum certification. There are also two hot-swappable 6038 fans. Just about every component in the J2012-01 is hot-swappable, and all the field-replaceable units are tool-less, so you can upgrade and fix issues without any downtime.
AIC also works to help you avoid downtime by including its Intelligent Enclosure Management and support for SCSI Enclosure Services (SES-2). You also have the ability to manage power on individual drives, which can help to reduce power consumption in warm or cold storage applications. To ease deployment, AIC provides external port self-configuration to host or expansion.
The J2012-01 is a mere 21 inches deep, making it perfect for data centers where space is at a premium. The short depth allows for increased density and ensures it will work with shallow racks.
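Because the enclosure speaks standard SES-2, generic Linux tooling can query it. The sketch below shells out to sg_ses from the sg3_utils package; the device node is a placeholder, and run without options sg_ses simply lists the diagnostic pages the enclosure supports.

    # Sketch: query an SES-2 enclosure from Linux via sg3_utils' sg_ses tool.
    # /dev/sg3 is a placeholder for the enclosure's SCSI generic device node.
    import subprocess

    result = subprocess.run(["sg_ses", "/dev/sg3"],
                            capture_output=True, text=True, check=False)
    print(result.stdout or result.stderr)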

AIC SAS3 JBOD


Available for use with either single or
dual expander modules.
Dual fans and redundant PSUs allow for
high availability.

(866) 800-0056
www.aicipc.com

(888) 865-4639
www.lindy-usa.com

USB Port Blocker


System administrators can physically prevent
users from connecting Pen Drives, MP3 Players
and other USB Mass Storage Devices to their
computers to copy data, introduce viruses, etc.
The USB Port Blocker is a combined key
and lock assembly which plugs into the USB
port. Simply plug the keylock into the port
and release the latch - the lock remains in place!
Plug the key back into the lock to remove. Easy!
Physically blocks access to a USB port
Consists of 4 locks and 1 key
5 different color code versions available:
Pink, Green, Blue, Orange, White

USB 2.0 Slimline Active Extension Cable


Compatible with both USB 2.0 and USB 1.1
devices
Connectors:
Type A USB Male to Type A USB Female

FireWire Active Extension Cable


FireWire Repeater Cable for extending the cable
length between FireWire devices over distances
greater than the IEEE 1394 specified limit of 4.5m

Wireless VGA Compact Projector Server


Ideal for use with PowerPoint and other
presentation applications
Connects to the VGA port on your projector

SPECIAL ADVERTISING & CONTENT FROM OUR COMPUTER POWER USER PARTNERS

Computer Power User offers technically sophisticated readers a unique blend of product reviews, PC industry news and trends, and detailed how-to articles.

Can We Tock?

Skylake Perfects Intel's 14nm Platform


The PC is undergoing a series of major technological shifts. Gaming at 2,560-wide resolutions is giving way to gaming at 4K, single-monitor setups are being replaced by triple-screen configurations, and DirectX 11.2 is stepping aside for DirectX 12. Additionally, DDR3 memory is being replaced by much faster DDR4, PCIe-based storage devices are becoming more common, and Windows 8.1 is passing the baton to Windows 10.
Whether you are planning a new build or upgrading your current system, it's important to choose a platform designed to embrace these new technologies and ensure the highest performance, the greatest flexibility, and the absolute best experience. The new 6th Generation Intel Core desktop processors and Intel's Z170 chipset represent the perfect platform to take advantage of all this new tech and change the way you game, stream, create, and work.

In other words, not only are


these blazing-fast processors
perfect for gaming, editing
video, working, multitasking,
and more, but they also
provide the highest level of
graphics output available
without a discrete GPU.

Skylake
Code-named Skylake, Intel's 6th Generation Core processors take Intel's 14nm manufacturing process and refine it for even greater performance. The flagship processor, the Core i7-6700K, is a quad-core CPU with Hyper-Threading technology that allows it to run up to eight instruction threads simultaneously. It has a stock clock speed of 4GHz and a maximum single-core frequency of 4.2GHz through Intel's Turbo Boost 2.0 Technology, and it's equipped with 8MB of Intel Smart Cache memory.
The second processor in the Skylake lineup, the Core i5-6600K, is also a quad-core chip. It has a base frequency of 3.5GHz (max Turbo frequency of 3.9GHz) and 6MB of Intel Smart Cache. Both processors provide support for up to 64GB of dual-channel DDR4 memory, both have a 91-watt TDP (thermal design power), and both are equipped with Intel's new HD Graphics 530 processor graphics. Intel HD Graphics 530 runs at a base frequency of 350MHz, with a maximum dynamic frequency of 1.15GHz, and provides 4K support at 60Hz, triple-display support, and support for DirectX 12 and OpenGL 4.4.

Z170
A good CPU needs a good chipset to unlock its full potential, and for 6th Generation Intel Core processors, that chipset is Z170. Designed to mesh perfectly with the Core i7-6700K and Core i5-6600K, the Z170 chipset provides dual-channel support for DDR4, the fastest desktop memory spec on the market today. DDR4 also operates at lower voltage settings than DDR3, which adds efficiency to this new platform's list of benefits.
Z170 comes equipped with Intel's new DMI (Direct Media Interface) 3.0, which provides four connection lanes between the CPU and PCH at 8GTps per lane, for a total of nearly 4GBps. Another upgrade from previous chipsets is Z170's larger Flex-IO hub, which increases the number of ports available to motherboard manufacturers for use as PCIe lanes, USB 3.0 ports, or SATA 6Gbps ports from 18 in Z97 to 26. In addition to allowing for greater flexibility in motherboard design, the new hub dedicates more bandwidth to PCIe devices for use with Intel's RST (Rapid Storage Technology), paving the way for more motherboards with RAID support for SATA Express and M.2 storage devices.
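As a quick back-of-envelope check on that "nearly 4GBps" figure, here is a sketch in Python, assuming DMI 3.0 uses the same 128b/130b line encoding as the PCIe 3.0 signaling it is based on:

    # Back-of-envelope DMI 3.0 bandwidth: four lanes at 8GTps each,
    # assuming PCIe 3.0-style 128b/130b line encoding.
    lanes = 4
    transfers_per_second = 8e9           # 8GTps per lane
    encoding_efficiency = 128 / 130      # 128b/130b encoding (assumption)
    gbps = lanes * transfers_per_second * encoding_efficiency / 8 / 1e9
    print(f"{gbps:.2f} GBps")            # ~3.94 GBps, i.e., "nearly 4GBps"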

Other supported technologies include Intel Ready Mode Technology (which can keep your system and its applications up-to-date via a constant connection), Intel Smart Sound Technology (an integrated DSP for audio offload and audio/voice features such as voice commands), Intel Device Protection Technology with Boot Guard (which protects your system from malicious code prior to the OS launching), and more.

Born To Overclock
Most power users are familiar with Intel's K series processors and their unlocked multipliers, but with Skylake and Z170, Intel has provided the highest level of overclocking control yet. For starters, you have access to unlocked core ratios (up to 83) in 100MHz increments, as well as complete Turbo overrides for voltage and power limits. But you also get enhanced full-range BCLK (base clock) overclocking, which allows for adjustments in 1MHz increments up to 200MHz or higher; some sources report frequency gains greater than 400MHz when using liquid-nitrogen cooling. And in some cases, Z170 motherboard manufacturers will include overclocking utilities that will give you even finer-grained control.
Additionally, the platform provides increased granularity in memory overclocking, allowing adjustments in 100/133MHz increments, as opposed to the previous generation's 200/266MHz steps. Available DDR frequency overrides allow for memory clocks of up to 4,133MHz and higher. (Intel reports non-typical results of up to 4,795MTps with LN2.) Of course, you will also have access to simplified memory overclocking controls via Intel's XMP 2.0 memory profiles.
Skylake and Z170 also come with an unlocked processor graphics ratio and unlocked voltage controls, the latter of which should allow experts to fine-tune increases in the performance of their CPU without also increasing temps more than necessary.
Look Inside
As the Tock to Broadwell's Tick, Skylake is a big step forward. When teamed up with a Z170 motherboard, the Core i7-6700K and Core i5-6600K give you improved performance, greater power efficiency, increased overclocking control, and support for the latest PC technologies.



The 805 features a see-through skeleton case structure design offering versatile HDD mounting positions for excellent flexibility. The 805 supports the all-new reversible USB 3.1 Type-C connector, which speeds up your high-speed data transmissions. The 805 has inherited a spirit of exquisite craftsmanship and innovative design, bringing not only visual aesthetics but also gaming performance.

Certified Piedmontese beef tastes great: lean and tender, juicy and delicious. But there's more to it than just flavor. Certified Piedmontese is also low in fat and calories. At the same time, it's protein-rich with robust flavor and premium tenderness. Incredibly lean, unbelievably tender: It's the best of both worlds.

piedmontese.com
