
SEPTEMBER 2015 | VOL. 13 ISS. 09 | CYBERTREND.COM

TIME TO
RETHINK
STORAGE?
SMART STORAGE
CHOICES BRING
BETTER SPEED,
RELIABILITY
& SECURITY

Volume 13 : Issue 9 : September 2015


BETTER STORAGE CHOICES CAN BRING BETTER SPEED, RELIABILITY & SECURITY

10 COVER STORY
this month's cover story articles delve into the
benefits of flash storage, cloud storage and
data security issues, purpose-built backup
appliances, and storage management
22 BUSINESS
we take a tour of NEC Corporation's
history and the focuses of NEC Corporation
of America, and we explore ideas for thinking
visually in online marketing
28 DATA
a look at the role of the data scientist within
today's data-focused organizations, and a
primer on data lakes

CONTACT US
P.O. Box 82545
Lincoln, NE 68501
or
120 W. Harvest Drive
Lincoln, NE 68521

NEC'S UNIQUE PATH THROUGH THE IT INDUSTRY

32 ENERGY
the latest news and research into
energy-conscious tech

50 STORAGE
how to make wise decisions when it comes
to long-term enterprise storage

34 IT
Web-scale IT and what it could mean to
your organization, and a look at Supermicro's
newest supercomputing servers

55 WEB
insights into social media usage for
professionals

38 NETWORKING
how to get more control over the corporate
network with software-defined networking,
and how to boost your cellular signal

58 ELECTRONICS
the latest in premium consumer electronics
60 TIPS
advice for mobile professionals

46 SECURITY
methods for getting a better understanding
of enterprise risk so you can more easily
avoid digital security headaches

Advertising: (800) 247-4880


Fax: (402) 479-2104
Circulation: (800) 334-7458
Fax: (402) 479-2123
www.cybertrend.com
email: feedback@cybertrend.com

Copyright 2015 by Sandhills Publishing Company. CyberTrend™ is a trademark of Sandhills Publishing Company. All rights reserved.
Reproduction of material appearing in CyberTrend™ is strictly prohibited without written permission.

Cloud IT Infrastructure Segment Outperforms Overall IT Market

Execs Say There's Work To Be Done To Improve Cybersecurity

According to recent data from IDC, cloud IT infrastructure growth continues to outperform growth of the overall IT infrastructure market. One major factor? The large number of workloads transitioning to cloud-based platforms. A recent IDC report indicates that vendor revenue for infrastructure products (including servers, storage, and Ethernet switches) for cloud computing-related IT (both public and private clouds) reached about $6.3 billion during the first quarter, which represents 25.1% year-over-year growth. Spending on cloud IT infrastructure totaled nearly 30% of overall IT infrastructure spending for the quarter, up from 26.4% last year, IDC states. The non-cloud IT infrastructure segment, meanwhile, increased 6.1% during the quarter, IDC states.

For its "Business Resilience In The Face Of Cyber Risk" report, Accenture surveyed 900 C-level executives to get first-hand information about how companies operate. Not surprisingly, 63% say their organizations face significant cyberattacks daily or weekly, but only 25% say their organizations have built cyberattack resilience into their overall operating models. And while 86% say they test their resilience to determine what changes might be necessary, only 9% run attack and failure scenarios continuously to ensure their systems can repel attacks. Brian Walker, managing director with Accenture, says, "While savvy executives know where their weak spots are, and work across the C-suite to prepare accordingly, testing systems, planning for various scenarios, and producing response and continuity plans that guide quick actions when a breach occurs, the data clearly shows that companies by and large have more work to do." Among the report's multiple suggestions: integrate resiliency into the operating model so that strategies, processes, solutions, and company awareness are united.

New Cloud-Based Solutions To Triple In Number

IoT Integration Driving Competition & Investment

Slowdown In China To Impact Smartphone Market & China

The simplicity and affordability of cloud-based services have driven their popularity sky-high, which in turn has prompted service providers to offer even more cloud products. A new report from IDC states that as big data, mobile, and various IT solutions increasingly move to the cloud, the number of new cloud products will triple by around 2020. The research firm adds that public cloud computing, which is particularly popular, will be a $70 billion industry this year, and that industry-specific applications will be a primary driver.

Data from ABI Research confirms the IoT (Internet of Things) growth hype. ABI expects more than 8 million BMSes (building management systems) will be integrated with an IoT platform by 2020. With support for BMS connectivity and third-party applications, BMS operations can incorporate a variety of external inputs. For example, sensors for changing weather conditions could help control variable energy pricing, and actuators inside a building could manage BMS processes based on space allocation, building occupancy, and other critical factors.

Market research firm IDC expects smartphone shipments to increase 11.3% this year, close to its 11.8% forecast. This marks a slowdown from 27.6% growth last year. Ryan Reith, program director at IDC, says a Q1 2015 smartphone shipment decline within China is partly to blame, showing that "the largest market in the world has reached a level of maturity . . . ." This has implications for Android because China has been a critical market for Android smartphone shipments in recent years, accounting for 36% of total volume in 2014.


Providers Search For The Right Strategy In Cloud IaaS Market
Spending on cloud IaaS (infrastructure as a service) should increase by 32.8% and reach $16.5 billion this year, based on Gartner's latest forecasts. However, the IaaS market is shaping up to be a competitive space for providers, as they launch new cloud IaaS platforms, augment current platforms, or migrate to providing services on leading cloud IaaS platforms. Although options are expanding and diversifying, Lydia Leong, Gartner vice president and distinguished analyst, urges caution when comparing and selecting providers: "Ask specific and detailed questions about the provider's road map for the service and seek contractual commitments that do not permit the provider to modify substantially or to discontinue the offering without at least 12 months' notice."

Add Chief Digital Officer To C-Level Position Opportunities


Between 2013 and last year, the number of CDOs (chief digital officers) doubled to almost 1,000. The CDO Club expects that the total number of global CDOs will also double to 2,000 by the end of this year. According to David Mathison, CDO Club founder, three CDOs took on the CEO title and four CDOs joined boards in this year's first quarter alone. The following statistics are from Mathison's 2014 keynote about the rise of CDOs:
- The advertising sector claims the largest percentage of CDOs, with 36%, followed by media (18%), publishing (13%), nonprofit (10%), retail (7%), financial (4%), and services (4%).
- Globally, 68% of CDOs are located in North America (significantly down from 88% in 2013), followed by 23% in Europe (up from 7%); 6% in Asia; 2% in the Middle East and Africa; and 1% in South America.
- 81% of CDOs are male, 70% are Caucasian, and 98% are between 30 and 59 years old, with 54% being between 40 and 49, about the same as last year.
- 54% of CDOs hold a BA degree, 12% an additional MA, 26% an MBA, and 9% a Ph.D.

Too Much Screen Time For Your Kids Or Yourself?

More Workers Rely Exclusively On Tablets

More Businesses Seeing Value In Digital Signage

It's old news that adults think kids these days spend too much time with an electronic screen in front of their faces. A recent YouGov/Huffington Post survey of 1,000 U.S. adults validated that, with 53% of parents saying their kids spend too much time with phones, TVs, and other devices. But what about you, the adult? In the survey, 54% admitted to spending too much time in front of screens themselves, and 43% of parents said they spend more time on digital devices than their kids, so perhaps entire families need more screen-free time.

Although a majority of enterprise tablet users still use at least one other device for business purposes, 40% use tablets as their sole business device, according to an IDC study of European IT decision-makers. The number of respondents using only a tablet increases considerably when hybrid two-in-one or convertible devices are considered, as these devices are being deployed to replace portable and desktop PCs because they include a keyboard. The study found that 64% of production workers, 38% of executives, and 44% of white-collar workers use only tablet slates.

As companies start to see the benefits of digital signage, the market for those products is changing, says IDC Vice President Keith Kmetz. "Current digital signage implementations are producing significant value by enabling content that is changeable and interactive for its target audience," he says. "This new level of communication is an effective broadcast medium that will continue to increase in usage." About half of all companies are using digital signage, IDC reports, and more than 80% of those companies are very satisfied with the technology.


Wi-Fi Networks To Carry More Mobile Traffic

New research contained in a "Mobile Data Offload & Onload: Wi-Fi, Small Cell & Network Strategies 2015-2019" report from Juniper Research estimates that Wi-Fi networks will carry about 60% of smartphone and tablet data traffic by 2019, amounting to more than 115,000 petabytes. That compares with less than 30,000 petabytes this year. Juniper Research states that mobile data offload, or data moving from a mobile network to a Wi-Fi network, presents multiple benefits to industry stakeholders, including the ability to address patchy coverage and potentially create new services, such as VoWi-Fi, or Wi-Fi calling. Juniper Research notes Wi-Fi offloading brings challenges concerning effective deployment and ROI.

2015 Global IT Spending To Decline

Gartner forecasts global IT spending will sink 5.5% from 2014 to total $3.5 trillion this year. In constant-currency terms, the market will climb 2.5%, Gartner states. John-David Lovelock, Gartner research vice president, says the decline is not a market crash, despite illusions created by large swings in the value of the U.S. dollar compared with other currencies. This year, communication services will be the largest IT spending segment among five IT sectors but will also experience the strongest decline. Gartner's global forecasts (in billions of U.S. dollars) for the five sectors include:

WORLDWIDE IT SPENDING FORECAST BY SECTOR (Billions Of U.S. Dollars)

Sector                     2014 Spending   2014 Growth (%)   2015 Spending   2015 Growth (%)
Devices                    693             2.4               654             -5.7
Data Center Systems        142             1.8               136             -3.8
Enterprise Software        314             5.7               310             -1.2
IT Services                955             1.9               914             -4.3
Communications Services    1,607           0.2               1,492           -7.2
Overall IT                 3,711           1.6               3,507           -5.5

Mobile Workforce To Surpass 100 Million By 2020

Tablet Market Losing Its Momentum, Says ABI

Apps For Wearable Devices Next Big Developer Frontier

By 2020's end, IDC expects mobile workers will make up nearly three-quarters of the total workforce in the United States, with the mobile work population forecast to expand from 96.2 million workers to 105.4 million. Factors contributing to the growth include the affordability of smartphones and tablets and the increasing acceptance of corporate BYOD (bring your own device) programs. IDC notes that according to a recent survey, 69.1% of enterprise mobility stakeholders experienced a decrease in OPEX or CAPEX costs by implementing BYOD programs.

There's no denying the tablet market is losing its momentum and leading vendors are feeling the squeeze, ABI Research reports. The research firm reports a 16% decline for year-over-year shipments, but adds that tablets are still popular with consumers and enjoy practical applications in business and education. Broadly, the tablet market is simply adjusting to tablets finding their niche, ABI reports. The firm expects the market will remain relatively stagnant until smaller vendors step up to shake up the market and forge a truly competitive landscape.

IDC anticipates a huge increase in the number of third-party applications for wearable devices by 2019. Currently, there are about 2,500 such apps, IDC says, and by 2019 there will be 349,000 apps. "To succeed in what we expect will quickly become a very crowded category, consumer-oriented app developers need to focus on intelligent service delivery and 'always on you' experiences that leverage the human factor improvements that smart wearable devices offer," says John Jackson, IDC's research vice president, mobile and connected platforms.


CORPORATE TRAVEL?
NEED A VACATION?
Let our #missionbird take you where you need to go.
Our diverse fleet of 22 aircraft offers a travel experience above the rest.

Ready when you are

STAjets' exclusive membership program allows our members to earn cash in addition to flying at an industry discount. It requires no complicated contracts, deposits, hidden fees, or blackout dates. We offer our exclusive members discounted flights while earning cash rewards on every flight.

IT'S THE LOWEST-COST MEMBERSHIP PROGRAM IN THE INDUSTRY AND THE ONLY PROGRAM THAT PAYS BACK!

949.756.1111 | charter@stajets.com | www.stajets.com

STARTUPS
PersistIQ Gets $1.7 Million For
Sales-Oriented Platform

Zeetings Wants To Add Zip To Your Meetings & Wake Up Your Audience

Whether your sales reps are hitting the pavement or trying to qualify incoming leads, PersistIQ claims its platform will help them be more efficient and not let potential follow-ups fall through the cracks. The San Mateo, Calif.-based startup raised $1.7 million in seed funding in a round that ended in July, from investors including Point Nine Capital and Y Combinator. Co-founder and CEO Pouyan Salehi says the added funds will accelerate the company's product development roadmap. "The strong traction and results we're seeing among PersistIQ's early customers reinforced our initial belief that today's sales organizations need better solutions for effective outbound sales," he says.

Wouldn't it be helpful to know if and when people actually opened the presentation deck you emailed to them? And wouldn't meetings be more interesting if presentations were interactive via any device, allowing for such things as on-the-fly feedback and surveys during the meeting? The Australian startup Zeetings offers a service that does these things and more, allowing people to build dynamic presentations that can include Web content and videos, and enabling both in-person and remote participation. As reported in TechCrunch, Zeetings has added analytics capabilities to its service to allow presenters to learn just how engaging their presentations are and track participation. Active participants, ratings, and interactions are a few of the statistics available to presenters.

Investors Pour $9 Million Into Amplitude's Mobile Analytics

DataScience Puts Multiple Data Scientists On Call For You

Nantero Leverages Carbon Nanotubes For Very Fast Memory

Amplitude began in 2012 by addressing two common mobile community complaints, says co-founder and CEO Spenser Skates: "They routinely complained about prohibitive volume-based pricing or analytics that couldn't scale to the data volumes they faced." Amplitude now offers real-time analytics, the ability to dig into the details of specific usage trends, and the ability to generally better understand the users of companies' mobile apps, all at a relatively inexpensive price. In August, Amplitude announced it raised $9 million in a Series A funding round.

Perhaps you'd like to harness your organization's data to find ways to increase efficiency or solve certain problems, but you don't want to hire a data scientist. DataScience hopes you'll consider its service, which can put multiple data scientists to the task of answering your most pressing data questions, such as why sales might have decreased during the last quarter. The startup, based in Culver City, Calif., announced it had closed a $4.5 million Series A round of financing in June. The company plans to build awareness for its personal brand of business intelligence.

NRAM (non-volatile random-access memory) is fast and in high demand, and Nantero fuels the market with its carbon nanotube-based NRAM. The result is memory that is extremely fast (100 times faster than standard flash memory), relatively inexpensive, and energy efficient (consuming almost nothing in standby mode), and it has captured investors' attention. Nantero raised $31.5 million in Series E funding in June, which company advisor Stefan Lai says will be used to ensure long-term success and continued advancement of Nantero's NRAM.


Why It May Be Time To Move To Flash
PERFORMANCE & TCO HEAVILY FAVOR FLASH STORAGE OVER HDD-BASED STORAGE

KEY POINTS
SSDs (solid-state drives) are
better in terms of latency and
performance, but they also
offer secondary cost benefits
that HDDs (hard disk drives)
can't match.
SSDs are more durable than
HDDs, but SSDs can wear out
with constant rewriting.
Total cost of ownership
should be a consideration when
comparing SSDs and HDDs.
All-flash arrays in the data
center are quickly becoming
the norm as companies are
seeing the TCO benefits.


WHEREAS IT USED TO BE more realistic to compare HDDs (hard disk drives) and SSDs (solid-state drives) in terms of cost and performance, we have reached the point where the benefits of SSDs far outweigh those of HDDs for most use cases. In fact, Deepti Reddy, SSD Product Marketing Manager at PNY Technologies, says "the bottleneck really is your spinning storage media. At PNY, we are working to create SSD solutions to accelerate applications." With that in mind, it may be time for you to reassess the types of storage solutions your company uses.

Latency, Read-Write Speeds & Secondary Economic Benefits

One of the biggest benefits of SSDs over HDDs, and one that has existed since the first time flash was introduced, is that of decreased latency, as SSDs don't have moving parts and HDDs include mechanical, rotating media. For that reason alone, SSDs have much faster response times than HDDs. Industry estimates suggest that every extra second it takes to load a page can cost a large online retailer billions in sales every year, and that an extra half second of search delay can cause traffic to a site to drop by 20%. Latency matters, and HDDs don't scale well with latency the way that SSDs do. Faster response time is a key advantage of SSDs.
Reddy also points out that SSDs offer read and write speeds that are superior, especially when it comes to random workloads. She says that HDDs work well with sequential data access. "But with more random workloads where you can't predict what type of data accesses your application is requesting, it's always better to use SSDs because SSDs boast ultra-low I/O latencies," Reddy says.
In addition to pure performance, flash storage also offers secondary economic benefits you may not think about right from the start. For example, in order to meet the same performance requirements for your applications, you'll end up buying 50% to 80% fewer SSDs than you would HDDs, according to Eric Burgener, research director at IDC. Storage performance is measured in IOPS (input/output operations per second), and Burgener points out that SSDs can offer as much as 10,000 to 20,000 IOPS compared to an HDD's 200 IOPS. He admits that not all of that performance will be utilized by applications due to other potential bottlenecks, but that the difference is so large that you have "performance to burn."
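To see why the drive count falls so sharply, it helps to run the numbers. The sketch below is illustrative only: the 200 and 10,000 IOPS figures are the article's ballpark numbers, and the 50,000 IOPS workload is a hypothetical target, not a measurement.

```python
import math

def drives_needed(target_iops, iops_per_drive):
    """Rough count of drives required to satisfy a target IOPS figure.
    Ignores RAID overhead, controller limits, and read/write mix, so
    treat the result as a ballpark rather than a sizing tool."""
    return math.ceil(target_iops / iops_per_drive)

# Hypothetical 50,000 IOPS workload, using the article's ballpark figures.
workload_iops = 50_000
print(drives_needed(workload_iops, 200))     # ~250 HDDs at ~200 IOPS each
print(drives_needed(workload_iops, 10_000))  # 5 SSDs at ~10,000 IOPS each
```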
Buying and running fewer disks leads to even more benefits, Burgener says. For example, when using fewer disks, you'll need less energy to power and cool them, he explains. And you'll also be saving quite a bit of space in the data center, so you might be able to consolidate servers and open up some floor space for other equipment. "Processors are so fast that they spend most of their time waiting for the storage to respond with HDDs," says Burgener. "If you put storage with much lower latency in there, you can keep those CPUs busier and the utilization goes up significantly. We're seeing a range of server reduction that typically [amounts to] between 5% and 30% fewer x86 servers needed once you deploy all-flash in order to deliver

You need to think in terms of TCO [total cost of


ownership] to see the bigger picture. With the influx of
cloud computing and hyperscale data centers, which are
specifically designed to scale quickly, companies need to
take into consideration power, cooling, and energy costs
when comparing storage media as well.
TONY CARBONE
Head of SSD Enterprise Development
PNY Technologies

the same performance experienced previously with HDDs."


Longevity & Endurance


Durability and longevity are also important considerations. Reddy says that when it comes to pure durability, SSDs trump HDDs simply because you've eliminated moving parts. When it comes to longevity, SSDs are prone to wearing out if data is written to the same location on the drive over a period of time. Reddy compares this to writing on a piece of paper with a pencil, erasing it, and then writing over that same area again and again until it wears out.
Using flash block management helps you avoid these issues, alternating where data is written and maximizing drive life cycles. "PNY SSDs have algorithms that will manage where data is written such that data isn't written to the same location again and again," says Reddy. "These days you could say that a lot of the intelligence for your SSDs is in the firmware and software algorithms. Over time, if you can tweak your firmware and have more intelligent flash management algorithms, you can increase the longevity of the SSD. In fact, our drives are warrantied for five years in the data center."
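The article doesn't detail PNY's firmware, but the general idea of wear leveling is easy to illustrate. The toy sketch below is a hypothetical mapper, not any vendor's actual algorithm: it steers every logical write to whichever physical block has the fewest erase cycles, so repeatedly rewriting one logical address still spreads wear across the whole drive.

```python
class ToyWearLeveler:
    """Minimal illustration of wear leveling: each logical write is
    steered to the physical block with the fewest erase cycles."""

    def __init__(self, num_blocks):
        self.erase_counts = [0] * num_blocks   # wear per physical block
        self.mapping = {}                      # logical block -> physical block

    def write(self, logical_block):
        # Pick the least-worn physical block for this write.
        physical = min(range(len(self.erase_counts)),
                       key=lambda b: self.erase_counts[b])
        self.erase_counts[physical] += 1
        self.mapping[logical_block] = physical
        return physical

leveler = ToyWearLeveler(num_blocks=4)
# Rewriting the same logical block repeatedly still spreads wear evenly.
for _ in range(8):
    leveler.write(logical_block=0)
print(leveler.erase_counts)  # [2, 2, 2, 2]
```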

Cost Comparison

If there has been one major sticking point in the debate of HDDs vs. SSDs, it's cost. And while it's true that the price of SSDs is typically higher than HDDs on a per-gigabyte basis, that should not be used as your only cost evaluation metric.
Tony Carbone, the Head of SSD Enterprise Business Development at PNY, recommends thinking in terms of TCO (total cost of ownership) to see the bigger picture. He explains that with the influx of cloud computing and hyperscale data centers, which are specifically designed to scale quickly, companies need to take into consideration power, cooling, and energy costs when comparing storage media as well. That's why large companies including Amazon, Facebook, Google, and Microsoft are moving to SSDs. They need to be able to provide highly scalable performance to their users, and quickly. Flash storage meets those needs much better than HDDs.
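One rough way to make that TCO argument concrete is to fold power and cooling into the purchase price. The sketch below is a back-of-the-envelope model with made-up prices, wattages, and electricity rates (none of them vendor or IDC figures); it reuses the drive counts from the earlier IOPS example to compare a five-year bill for an HDD farm against a handful of SSDs delivering the same IOPS.

```python
def five_year_tco(drive_count, price_per_drive, watts_per_drive,
                  dollars_per_kwh=0.10, cooling_overhead=0.5):
    """Back-of-the-envelope five-year TCO: purchase price plus power,
    with cooling modeled as a fixed overhead on the power bill.
    All inputs here are illustrative, not vendor figures."""
    hours = 5 * 365 * 24
    energy_kwh = drive_count * watts_per_drive * hours / 1000
    power_cost = energy_kwh * dollars_per_kwh * (1 + cooling_overhead)
    return drive_count * price_per_drive + power_cost

# Hypothetical: 250 HDDs vs. the 5 SSDs sized earlier for the same IOPS.
print(round(five_year_tco(250, price_per_drive=300, watts_per_drive=9)))
print(round(five_year_tco(5, price_per_drive=2000, watts_per_drive=6)))
```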

WHILE IT'S TRUE THAT THE PRICE OF SSDs IS TYPICALLY HIGHER THAN HDDs ON A PER-GIGABYTE BASIS, THAT SHOULD NOT BE USED AS YOUR ONLY COST EVALUATION METRIC.


Carbone uses Google as an example. In the past, you had to type an entire word or phrase into Google's search engine, and the HDD-based engine would slowly return results. But SSDs enabled the company to introduce Google Instant, which sped up searches significantly. "This led the company to a return on its flash storage investment in less than a month," Carbone says. "This is a perfect illustration of why per-gigabyte cost comparisons simply do not do justice to the debate anymore. You have to look at TCO and take potential secondary benefits into account."

Popularity Of Flash Storage Now & In The Future

Burgener says that the implementation of all-flash arrays has changed rapidly; they were new on the scene in 2007 and became more widely available in 2012. He says companies originally used these arrays for single, dedicated application use, maybe to run a database or a VDI (virtual desktop infrastructure) environment. But now, as IT teams have gotten more comfortable and familiar with SSDs, they've seen how much performance they can get out of all-flash arrays, which is opening up new use cases. Instead of one

Think about flash as writing on a piece of paper with a pencil and you keep writing to the same location, erasing and then rewriting. Over time, that area of the paper wears out. PNY SSDs have intelligent algorithms that manage where data is written to so data won't be written to the same location repeatedly. These algorithms help increase the longevity of our SSDs.
DEEPTI REDDY
Product Marketing Manager
PNY Technologies

application, companies are finding they can run as many as 10 to 20 applications on an array, Burgener says. Still, all-flash arrays, and especially all-flash data centers as a whole, are still developing and evolving. Burgener says IDC is very bullish on this, adding that by 2019 it's going to be rare for a company not to use SSDs and all-flash arrays for primary storage. He also points out that it may take some time for this to happen and that hybrid solutions may be used in certain situations so companies can incorporate both SSDs and HDDs in their environments. But as of now, Burgener hasn't talked to many companies that are all the way there in terms of using flash solely for primary storage. He says that a few companies are close, and more and more of them

I wouldn't say that most people today are using the four secondary economic benefits as a cost justification for an array purchase right now. Some are, but not very many. But clearly going forward, more will be using that. A lot more people this year are aware of those benefits and the magnitude of them and what they can bring to the data center than they were even a year ago.
ERIC BURGENER
Research Director
IDC


are trying to find ways to transition away from using HDDs.
The problem, Burgener says, is that people don't take the secondary economic benefits into account enough when making purchasing decisions. The key is to understand that there's more to SSDs than the up-front or per-gigabyte cost. And there has been some progress in that regard, because Burgener says people are generally more aware of these benefits than they were a year ago and that awareness will only continue to grow.
"The one challenge to overcome is this perception of higher cost," says Burgener. "Something people need to keep in mind is what sort of schedule they are on for a technology refresh. If they just bought a new HDD system a year and a half ago, they probably still have another two or three years left before it's depreciated. They may not be looking to replace that with an all-flash array in the near term," he explains. "Maybe they just want to buy a couple of SSDs and put them in that box to make it run a little faster than it used to. But when it comes time for the technology refresh, that's when these people will take a good look at the all-flash arrays, see what they can do to bring those in, and start putting a strategy together for moving more workloads onto them."

Do You Need Storage Management?
COMPANIES WITH MANY DIFFERENT TYPES OF DATA WILL BENEFIT MOST

AS THE STORAGE systems in data centers continue to grow and companies demand more and more capacity for running applications and storing important data, the need arises for a dedicated storage management solution. The challenge is determining whether or not your storage environment would actually benefit from such a solution and if you should consider implementing one now to prepare for the future. The key is to understand how storage management solutions can benefit enterprises in general and then decide if that fits your company's needs.

How Storage Management Helps Businesses

To better understand what storage management is today, it helps to look at the evolution of the technology over the past few years. For example, says Scott Sinclair, an analyst with Enterprise Strategy Group, when storage management solutions were first introduced, they were primarily used for alerting and error reporting, which are still important functions that are now grouped in with other capabilities. Now, a major component of these solutions is storage provisioning, whereby you can go to a specific storage device and determine how much of its capacity you want to use and whether you want to store data or workloads at that location. This gives storage administrators much more control over their storage solutions across the board.
Another major feature of newer storage management solutions is data availability and protection across multiple sites, Sinclair says. "If I want to replicate a volume from this storage system to another storage system for disaster recovery, I can do that." This is important because you can decide to have a high-performance storage solution as your primary storage and then go with a less expensive, or even older but still working, existing storage system for secondary storage purposes. And with that comes another newer capability for some storage management solutions, which is the ability to tier your storage, so you can actually categorize Array A as fast and Array B as affordable, Sinclair says. It's a way of prioritizing and organizing your storage systems so you can utilize them as efficiently as possible.

Who Needs Storage Management?

While it's true that companies with large amounts of data may intrinsically require storage management solutions, the overall complexity of your data also plays a role. What this means is that "if a company has a lot of different types of data for different workloads and they have different requirements, then they have a diverse infrastructure that may require a storage management solution as well," Sinclair says.
For example, he points out that if you have multiple petabytes worth of storage that is used predominantly by the same application or workload and with the same access patterns, then you could theoretically put it all in one large array and not have to worry much about additional storage management solutions. However, if you have that same amount of data, but every other petabyte, terabyte, or gigabyte is used by a different application that requires a unique storage characteristic, then you may need a storage management solution to keep everything under control.
This issue only gets more complicated when you take events such as mergers and acquisitions into account. "If a large company acquires a smaller company and they combine the two data centers, you could have a pool of resources and equipment from multiple vendors," says Sinclair. "Managing and capturing all of that can be a challenge. In addition, if you ever want to switch vendors, it's a challenge, too. Some of these storage management solutions provide that management or orchestration layer that presides over a whole bunch of different types of storage and allow you to get that single view into your storage management."

Who Should Be In Charge Of It?

Sinclair admits that it's easy to say it's the storage administrator's job to run a storage management solution, but he adds that this decision ultimately depends on the company's size and what roles employees fill on staff. For example, large organizations may have dedicated storage administrators onsite who are well-versed in running storage management solutions, Sinclair says, but smaller businesses may have to rely on IT generalists. Fortunately, storage management solutions are typically much easier to use than they have been in the past, so companies have the flexibility to decide who should be in charge.

In addition to high availability and replication, [there are] systems that provide deduplication, compression, and encryption at a single layer where you can provide some of those capabilities as an overlay on top of more affordable systems that don't offer those.
SCOTT SINCLAIR
Analyst
Enterprise Strategy Group

Another factor to consider is whether or not you use a significant amount of virtualized storage in your business. "I've heard of some companies offloading these storage provisioning duties to the VM [virtual machine] management team and having their storage management guys focus more on servicing, making sure the storage network is working OK, and focusing on data protection and backups rather than new data provisioning," Sinclair says. It ultimately comes down to looking at your storage environment and determining who would benefit most from storage management and who would have the greatest insight into how to best use it.

Vendors That Offer Storage Management Solutions

There are numerous major vendors out there, including EMC, IBM, Hewlett-Packard, and even Symantec, that offer storage management solutions. For businesses that are heavily invested in virtualization, Sinclair adds that VMware offers relevant tools that are built into vSphere (one of which is called vVOLS) which are designed to handle storage provisioning, even though they are technically called storage management solutions. But these aren't the only players in the market. Sinclair says that if a vendor offers a broad portfolio of storage offerings, they'd be likely to have a storage management framework. If you work primarily with a single storage vendor, then it wouldn't hurt to ask that vendor whether or not it has a storage management solution, because you'll know it will integrate and work well with your existing infrastructure.

Other Considerations

Another major consideration to factor into any storage management solution purchasing decision is whether or not your organization might move from one storage medium to another in the foreseeable future. For example, if you're planning to move from hard disk drives to solid-state drives, then a storage management solution will make that transition easier. You will know where every byte of data is stored throughout your organization, and then you can use provisioning, categorization, and prioritization capabilities to ensure that every workload gets moved to the correct storage medium. This line of thinking also applies to cloud computing, because many storage management solutions include cloud components that help make those environments extensions of your internal storage system.
Lastly, Sinclair stresses the importance of finding a storage management vendor you can trust. It can be a major challenge to move from one storage system to another, but the same applies to storage management solutions from different vendors. "That's one of the big challenges and a risk to picking a solution," Sinclair says. "That's probably also why you see [that] a lot of the guys providing it [are] the larger players," he adds, "because you don't want to pick the management console and then have that vendor go out of business."

Purpose-Built Backup Appliances
PBBAs INCLUDE BACKUP SOFTWARE & HARDWARE IN ONE SOLUTION

REGARDLESS OF what your current data center and business environments look like, every organization needs some form of backup in place to protect and restore assets in case of an outage, hardware failure, or disaster. One such backup solution is the PBBA (purpose-built backup appliance), which is helpful for companies that want to take care of the hardware and software components of data backup in one fell swoop. But it isn't the only appliance out there, and it will almost certainly need help from other tools, especially if you want to improve backup efficiency and support cloud backups.

PBBA Is One Of Four Common Data Protection Appliances

One of the major problems with the PBBA nomenclature, as with many other terms in the tech world, is that it's often overused and misunderstood. In fact, according to Jason Buffington, senior analyst at the Enterprise Strategy Group, purpose-built backup appliances are one of four potential categories for what he refers to as data protection appliances. PBBAs are appliances that contain both the software and hardware necessary to back up data. They can be used in either physical or virtual environments, but regardless of implementation, the primary role of these solutions is to back up data.
Where the confusion comes in is with other data protection appliances that serve roles other than PBBAs and offer capabilities not necessarily included in the PBBA package. "The other categories that complement this are deduplication appliances, which is where you have optimized storage but the backup software is still somewhere else and it has to be fed by something," says Buffington. "You have cloud gateways that have to be fed by a backup engine of some type. And then you have a failover appliance where instead of the primary job being to back up and then restore, the primary job is actually to resume functionality within the appliance."

What Makes PBBAs Unique

Because purpose-built backup appliances include both the software and hardware components all in one package, customers get a turnkey, self-contained, and complete solution, Buffington says. This is particularly beneficial to organizations because backup is a major priority for businesses of all sizes, and so being able to have both components in one removes some of the guesswork around integration and management. Plus, with PBBAs, the hardware is right-sized for the software, meaning you don't have to try to figure out "which pieces and parts are going to best operate together," Buffington says. You know for a fact that the hardware and software will work in harmony to give you the most reliable backup scenario.

"Especially for mid-sized organizations, there are a lot of reasons why you want to embrace partner services and third-party expertise, but having someone run setup.exe for you should not be one of them," says Buffington. "The requirements for just getting the core software installed and basic functionality is something you really want to get out of the business of doing sooner rather than later. Having the software pre-installed and knowing with confidence that it's built for the hardware that you've chosen not only helps with ease of deployment, it also gives ease of acquisition since all of the parts are already right-sized."
The fact that the backup hardware and software are bundled together can also provide some financial benefits. Buffington says it's a fine approach for some companies to buy backup appliances where the software is just pre-installed and preconfigured, but the added value really comes into play when the vendor goes above and beyond delivering basic software installed on generic hardware. "If it really has been right-sized for you, there's a single support channel, or if the vendor has added capabilities that may not have been available if you'd built it yourself, that's where the additional value comes from," he says.

PBBAs & Other Appliances Work Well In Virtualized Environments

As we've mentioned, purpose-built appliances are available both as physical and virtual solutions, but the PBBA concept also applies to other data protection appliances. If your company fully embraces virtualization and is trying to get away from all of the physical solutions that are in play, then turning around and adding a physical backup appliance may not be your preferred method, Buffington says. Instead, you could opt for an appliance that fully supports virtualized environments and gives you ease of deployment without adding another physical asset, he adds.

Backup is still absolutely a requirement in organizations of all sizes and what a PBBA gives you is a turnkey, self-contained and complete solution. It contains the software and the hardware, and more notably, the hardware is right-sized for the software. You don't have to try to figure out which pieces and parts are going to best operate together. It comes from the manufacturer as a turnkey solution.
JASON BUFFINGTON
Senior Analyst
Enterprise Strategy Group

Virtualized backup appliances can also aid in the consolidation process for data centers. Instead of introducing yet another space-consuming physical component, you can simply add a backup software layer that protects your virtualized assets. This is important to consider because even though it's a virtualized environment, the data still needs to be backed up.

How To Choose The Best Appliance

PBBAs may be turnkey, but that doesn't mean they're the only appliances needed to support a complete backup and recovery strategy. Buffington stresses that companies should look at backup appliances as building blocks or nodes that need to be deployed, but once they are in place, that's when the important part of configuration and customizing for your environment really starts. It's great to have one solution that does everything you need, but in some situations, you may need to come up with your own configuration in order to best match your unique requirements.
"Two of the other three categories are really around improving the efficiency of what you're already doing," says Buffington. "Whether it's a deduplication appliance, which has the primary job of providing you [with] a more efficient CAPEX problem so that you store data in less footprint, or a cloud gateway, where again, you're solving that storage consumption problem but instead of storing it locally, you're actually caching that to the cloud, both of those are really meant to supplement an existing backup solution and provide more efficiency to it."
Organizations essentially have two choices: They can either stick with their existing backup system and mix and match appliances as needed, or they can go with a vendor that offers add-ons. For example, if your existing backup solution is working well and you are mainly focused on improving efficiency, then you may want to add a deduplication appliance or a cloud gateway, as Buffington mentioned. He says what you choose ultimately depends on whether you want all the data on-premises or you want a hybrid [arrangement] of off- and on-premises. Regardless of how you go about it, Buffington stresses that mixing and matching appliances is a 100% viable approach if your existing backup solution is missing some capabilities.
However, if you do want to stick with a single brand, many backup appliance vendors are not only offering their base software, they're also offering add-on appliances that provide the same functionality as a deduplication appliance, cloud gateway, or failover appliance. Buffington says vendors are starting to bridge that gap where you can stay with that turnkey solution form factor, so you have lots of options.


Cloud Storage, Privacy & Legal Issues
WHAT TO KEEP IN MIND WHEN CHOOSING A CLOUD PROVIDER

MOVING DATA and applications to the cloud is a big step for many organizations. Rather than having all of those assets in-house for easier security and management, you are outsourcing them to a third-party vendor and making it their responsibility to protect your data. For that reason, it's important to understand what your cloud security options are and what you can do to maintain privacy, avoid legal issues, and enjoy the benefits of the cloud while introducing as little risk as possible.

What To Look For In A Cloud Service Provider

The first step in making sure your cloud data is safe is to compare available CSPs (cloud service providers) and find out which one offers the services you need. James McCloskey is director, advisory services for security and risk, at Info-Tech Research Group, and he recommends that companies look for vendors that provide SSAE-16 (Statement on Standards for Attestation Engagements No. 16), SOC (Service Organization Control) 2, or SOC 3 reports. Any of these reports will give you an idea of what controls a given CSP has in place.
In lieu of these reports, you can also look for CSPs that are certified in one security standard or another, depending on what type of data you plan to store. For example, some cloud vendors may support HIPAA (regulated health care) data, and other CSPs may be certified to handle PCI (Payment Card Industry)-related data. McCloskey adds that some CSPs are even FedRAMP-certified, which means that federal agencies in the United States "have this independent assurance that the service provider they're working with meets federal information security guidelines." Needless to say,

if a cloud provider is up to the task of storing and maintaining sensitive government data, then it should be able to handle your company's data as well.
Another important thing to remember, especially when looking at larger CSPs, is that they may actually be able to distribute the costs of security across hundreds or even thousands of clients, which means they are able to raise their security standards up to levels that the average corporation wouldn't be able to reach. This means that instead of only relying on encryption, firewalls, gateways, and other basic security tools, many CSPs also offer 24/7 monitoring of cloud environments and all of the security solutions that are in place.
On-staff experts can actually track potentially malicious behavior in a given cloud environment and thwart outside attacks. "If your organization is strapped for resources and doesn't necessarily have the priority placed on dealing with those vulnerability management concerns, you may find that the CSP is actually better positioned to be responsive to those types of issues," McCloskey says.

Enterprise-Level Encryption & The Cloud

For companies that want to store data in the cloud but still don't trust CSPs to handle security from every angle, it is possible to encrypt data before it goes to the cloud and then decrypt it when you need to use it again, but there are pros and cons to this approach. On the pro side, you can add an extra layer of security to your data that the cloud provider can't get through. This means that your data will be safe from prying eyes. There are many examples of products that can achieve this, but one example is Boxcryptor.
Not only does Boxcryptor offer a free version for private use, it also offers paid options for businesses of various sizes. A primary feature of this particular encryption tool is its Master Key, which lets approved users within an organization decrypt any needed file in a cloud environment with no hassle. Depending on the solution you choose, you'll find different features and different prices, but these offerings prove that there is a way to boost security via encryption and still take advantage of the cloud.
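The general pattern, independent of any particular product (and not a description of how Boxcryptor works internally), is simply to encrypt locally, upload only the ciphertext, and keep the key out of the provider's hands. A minimal sketch using the open-source Python cryptography package, with a hypothetical filename, might look like this:

```python
# pip install cryptography
from cryptography.fernet import Fernet

# Generate and keep this key locally; the cloud provider never sees it.
key = Fernet.generate_key()
cipher = Fernet(key)

# Encrypt the file locally, then upload only the ciphertext.
with open("quarterly-report.xlsx", "rb") as f:       # hypothetical file
    ciphertext = cipher.encrypt(f.read())
with open("quarterly-report.xlsx.enc", "wb") as f:
    f.write(ciphertext)

# Later, download the ciphertext and decrypt it with the same key.
with open("quarterly-report.xlsx.enc", "rb") as f:
    plaintext = cipher.decrypt(f.read())
```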
One disadvantage of encrypting data before it goes to the cloud is that this may ultimately be a redundant step with an unnecessary additional cost. Check to ensure that your cloud provider doesn't already have some form of encryption in place. Encryption can sometimes occur on the network layer and be practically invisible to the everyday user as data passes back and forth. Also, if you feel the need to encrypt your data before it goes to the cloud, then you may want to reevaluate whether or not you can trust that CSP, or whether that data should be stored in the cloud at all.

The cloud is a no-brainer for start-ups and small businesses, but for entrenched applications it can be difficult. Smaller businesses don't want to be heavily involved in computer operations and cloud solutions are probably appropriate for them, while larger groups are most likely already set up with backup and may be focusing instead on development and quality assurance.
DONALD M. GINDY
Attorney
Law Offices of Donald M. Gindy

Regulated Industries & Legal Concerns

Although the cloud offers many benefits, there are situations where companies should use it minimally or not at all. Donald M. Gindy, an attorney specializing in copyright law at the Law Offices of Donald M. Gindy, says that because regulated companies are those that must answer to agencies of the federal, state, or local government, their records must be accessible to those agencies, with the power of subpoena to compel records that have been withheld. As examples, Gindy points out that neither the Federal Reserve nor the Department of Agriculture uses the cloud, and that it's possible certain public utilities and banking institutions are barred from using an outside vendor to store their data.
In addition to regulatory compliance, companies also need to consider other legal issues and liabilities when comparing vendors or building SLAs (service level agreements). "In most instances, it is the vendor who bears liability; however, many vendors reject liability in their terms of use for loss of use, data, or profits," says Gindy. "You are compelled to agree to these terms as a condition of accessing their storage facility. Liability and damages may be limited or unavailable in certain circumstances. One has to take a look at the agreement to properly assess rights and duties." Moreover, Gindy adds, "the customer may be limited to arbitration in resolving any dispute with the vendor. Affirming the terms of use may constitute the waiver of a jury trial. Cloud storage may often mean you are stuck with a less than satisfactory judicial resolution. Your matter will be decided by an arbitrator rather than a jury of your peers and such hearings have, historically, favored the large entity over the individual."

Ensuring Proper Performance

Gindy also points out that companies should take performance into account when thinking about moving a large number of workloads to a third-party environment. "Imagine 20 servers are working together in support of a program," says Gindy. "You can't just move five of them to the cloud or it could cause the entire program to be slow. It's simply not as efficient as having the computers adjacent to one another." For this exact reason, Gindy says the cloud is essentially a no-brainer for start-ups and small businesses, but it may not be a fit for larger organizations with entrenched applications. In essence, the cloud is not a fit for every company in terms of security, but it also may not be a viable option if moving certain workloads to a cloud environment would cause performance degradation.


Storage Terms To Know
GET A BETTER UNDERSTANDING OF YOUR EXISTING & FUTURE SYSTEMS

MANUFACTURERS and experts throw around quite a few terms when talking about storage. Many of these terms can be difficult to decipher, and often it isn't made clear how various storage technologies apply to daily business life. If you're interested in getting a better grasp on storage technology generally, or if you need to know how to compare different systems, here are a few terms to keep in mind.

Deduplication

When storing or backing up data, you want to avoid copying the same information over and over again. One way to do this is to compress as much information as possible. This frees up valuable capacity on your storage systems. Deduplication identifies patterns, spotting redundant data and eliminating it so that you only store the version of that data you truly need. This process can either be done in real time as the data moves or after it's written (stored).
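To make the idea concrete, here is a toy block-level sketch (illustrative only, not how any particular product implements it): each fixed-size block is hashed, duplicate blocks are stored once, and a list of hashes is kept so the original data can be reassembled.

```python
import hashlib

def dedupe_blocks(data, block_size=4096):
    """Toy block-level deduplication: store each unique block once,
    keyed by its SHA-256 hash, and keep a list of hashes that can
    rebuild the original data."""
    store, recipe = {}, []
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)   # duplicate blocks stored once
        recipe.append(digest)
    return store, recipe

data = b"A" * 8192 + b"B" * 4096 + b"A" * 4096   # repeated content
store, recipe = dedupe_blocks(data)
print(len(recipe), "blocks referenced,", len(store), "stored")  # 4 referenced, 2 stored
```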

Fibre Channel

In a data center, storage systems are often connected using a SAN (storage area network) or other networking technology. For optimal efficiency, you need high-speed, high-performance cables. Fibre Channel supports speeds of 2Gbps (gigabits per second) up to 16Gbps, and work is being done to develop 32Gbps and 128Gbps speeds by 2016.

Provisioning

Storage provisioning is all about maximizing your organization's storage and making sure you use all the available capacity as efficiently as possible. This could be as simple as making sure backups are stored on HDDs (hard disk drives) and applications are run on SSDs (solid-state drives), or it could involve assessing each individual storage solution and how much capacity should be given to each workload.

IOPS

IOPS (input/output operations per second) is a measurement of the performance of storage systems. Interpreting an IOPS number is not always an exact science, but generally the higher the IOPS, the better the performance you'll get from the drive. HDDs typically top out at around 200 IOPS, whereas SSDs can easily reach 100,000 IOPS or significantly higher.

RAID

Using RAID (redundant array of independent disks) essentially means that instead of viewing an array as dozens or even hundreds of individual physical disks, you can view them in a more focused and centralized way, as a single logical unit. There are multiple RAID configurations; the ones most often used are RAID 0, RAID 1, RAID 6, and RAID 10.
There are pros and cons associated with each type of RAID configuration. For example, RAID 0 uses disk striping, which means that multiple hard drives will appear as one and can work together to run workloads. The downside of RAID 0 is that if one drive goes out, it will negatively impact the rest of the drives and potentially lead to data loss. RAID 1 includes replication and disk mirroring, meaning it can copy the same data to two separate drives simultaneously, so if one drive goes down, you still have another one running to prevent downtime.
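The difference between striping and mirroring is easy to picture in a few lines of code. This toy sketch (a simplification that ignores parity, chunk sizes, and real controller behavior) shows how RAID 0 spreads blocks across drives while RAID 1 duplicates every block on each drive.

```python
def raid0_stripe(blocks, num_drives):
    """RAID 0: spread blocks across drives round-robin; fast, no redundancy."""
    drives = [[] for _ in range(num_drives)]
    for i, block in enumerate(blocks):
        drives[i % num_drives].append(block)
    return drives

def raid1_mirror(blocks, num_drives=2):
    """RAID 1: every drive holds a full copy, so one failure loses nothing."""
    return [list(blocks) for _ in range(num_drives)]

blocks = ["b0", "b1", "b2", "b3"]
print(raid0_stripe(blocks, 2))  # [['b0', 'b2'], ['b1', 'b3']]
print(raid1_mirror(blocks))     # two identical full copies
```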

SATA

SATA (Serial Advanced Technology Attachment) is a storage standard for connecting hard drives and optical drives to computers and servers. Compared to its predecessor, Parallel ATA, SATA cables are smaller, data speeds are faster, and there's the ability to hot-swap drives, which means you can replace hard drives without turning the power off to the computer or server.

Tiered Storage

Tiered storage provides a way to categorize storage systems based on numerous factors, such as speed and performance or capacity. Using this approach, you can make sure that mission-critical data and applications are stored on the best possible systems, and then gradually move down the line as you consider where to store less critical workloads.

PCI Express

PCIe (PCI Express) is a standard used for connecting HDDs, SSDs, graphics cards, Ethernet cards, Wi-Fi cards, or other expansion cards within computers. The standard was developed by the PCI Special Interest Group, which continues to work on improvements and future PCIe iterations.

Unified Storage

If your company uses all three or any combination of NAS (network attached storage), SAN, or Fibre Channel, then a unified storage solution is one way to bring them all together and make everything more manageable. Rather than managing three disparate storage types, you'll be able to access them all from one centralized platform.

GO DEEP

If quality time with the latest, fastest home computing technologies is your idea of well-spent leisure time, CPU is the magazine for you. Each month CPU serves up how-to articles, interviews with tech industry leaders, news about cutting-edge research, and reviews of the newest hardware and software. Check out the latest edition right now at www.computerpoweruser.com or on your iPad via the iTunes Store.

Law Offices of Donald M. Gindy
1925 Century Park East, Suite 650
Los Angeles, California 90067
(424) 284-3123 www.gindylaw.com

A Unique Path Through The IT Industry
REGARDLESS OF INDUSTRY, MARKET, OR TECHNOLOGY, NEC MAKES ITS OWN WAY

KEY POINTS
More than a century ago, NEC
was founded by one of the few
Japanese innovators to have
worked with Thomas Edison.
The company started by
developing telephony components and eventually moved
toward displays, projectors,
storage, and much more.
NEC's enterprise products
cover a wide range of industries
and include ERP, CRM, and
other solutions.
NEC also offers smart energy
products designed to help organizations improve their energy
efficiency.


NEC CORPORATION is among those rare


organizations in the tech world that have
progressively evolved for so long that it
can be difficult to recognize it as the same
company that started well over a century
ago. In fact, aside from its stated focus on
product quality and customer satisfaction,
the only thing that has remained constant
about NEC is its name, which began as
the Nippon Electric Company, Limited,
in 1899 and was only shortened to NEC
about 30 years ago in 1983.
NEC started primarily as a manufacturer of phone and communications
equipment, and has evolved into a major
global provider of enterprise solutions,
displays, smart energy technologies, and
much more. Before we look at what NEC
is doing right now, however, we will take
a brief but fascinating tour of the company's history since it was established as the very first foreign capital joint venture in Japan's history.

When Iwadare Met Edison


As NEC began primarily as a telephony-focused company, it makes sense
that its founder, Kunihiko Iwadare, was
a student of telegraphic communications,
a precursor to telephones. But instead
of watching the development and evolution of telephony from the sidelines,
Iwadare decided to move to the United
States and work with Thomas Edison,
where he served as an apprentice and was
admitted to the Edison Machine Works
in Schenectady, N.Y.
With a wealth of knowledge and invaluable hands-on experience, Iwadare returned to Japan to become the first chief
engineer at Osaka Dento before eventually establishing his own business, Nippon
Electric Company, just before the turn of
the 20th century. This joint venture with
Western Electric Company, which is now
known as Alcatel-Lucent, resulted in a
company that was unique in Japan because

it focused heavily on the customer experience and follow-up service in a time when
companies primarily focused on product
production and then moved on to the next
project. It is this focus on the customer and
the accompanying dedication to quality
that helped NEC grow in interesting ways.

From Telephones
To Television & Beyond
The period from 1919 through the
1950s was a formative and foundational
one for NEC. The company was global
from the start, as it was a joint Japan-U.S.
venture, but because communication
technology was in its early stages, there
was much research and development to
be done. In 1919, NEC produced switchboards for long-distance toll calls. The
company began developing wireless technologies in 1924, and in 1927 NEC built a
PBX (private branch exchange) to help connect and centralize internal business lines.
In 1928, during Emperor Hirohito's Imperial Accession Ceremony, NEC-manufactured phototelegraphic equipment (a predecessor to fax machines) was used by newspapers to transmit photos all over Japan. This was NEC's first foray into image-related technologies, which would grow to become a sizable part of the company's product offerings.
From 1929 to 1956 and beyond, the
company continued improving its telephone technologies, including the first
Japan-made XB (crossbar) switching technology for PBX systems, which increased
automation within the call switching process. But in 1958, after only beginning research in 1954, NEC developed its first
computer, the NEAC-2201. This computer was unique in that it was highly reliable and also had what was then a high
memory capacity of 1,024 words.
In 1963 and 1964, NEC was instrumental in the development of TV broadcasting in Japan through the use of
satellite communication earth stations and
receiving equipment. This development
is especially noteworthy for the fact that it

was first used for a trans-Pacific test with


the United States to transmit the news that
President John F. Kennedy had been assassinated. But for NEC and from a purely
technological standpoint, it was another
step toward a more varied product lineup.
Other NEC highlights throughout the years include computer memory research and development in the late 1960s, the integration of C&C (computers and communications) in 1977 (which put an emphasis on global communication with the use of computers), and the development of the PC-8001 personal computer shortly thereafter. Leading up to the 2000s, NEC developed the PC-9801 personal computer in 1982, built the SX-2 supercomputer in 1985, created the first notebook-sized PC in 1991 (complete with a color LCD), and introduced the first 1-gigabit DRAM technology in 1995.

Since the turn of the 21st century, NEC has continued to innovate, expand, and diversify. For example, NEC designed the world's most compact mobile camera phone in 2003. It also served as a system integrator for the Hayabusa spacecraft, which was built to land on the Itokawa asteroid in 2005, examine it, and return collected samples; NEC delivered these samples in 2010.

Needless to say, from Edison to space travel, NEC has come a long way from its origins. At every point in its history, NEC has shown no sign of resting on its laurels. Instead, the company continues to grow, shift focuses, and provide customers with the products they need.

The NEC Way

Kunihiko Iwadare worked with Thomas Edison in the United States and later founded the Nippon Electric Company, Limited, in Japan in 1899. (Photo courtesy of NEC.)

Regardless of how far-reaching or high-profile NEC's projects were and continue to be, the company has always made sure that quality and customer satisfaction come first, which all culminates in the organization's values and ideals. The NEC Way comprises the company's Corporate Philosophy, Vision, Core Values, Charter of Corporate Behavior, and Code of Conduct. All of these pillars work together to support NEC and make sure that everything the company does meets the highest possible standards.

In fact, the company's corporate philosophy reads, "NEC strives through C&C to help advance societies worldwide toward deepened mutual understanding and fulfillment of human potential." And because 2017 marks the company's 40th anniversary of introducing the concept of computers-and-communications integration with its C&C initiative, NEC's vision is "to be a leading global company leveraging the power of innovation to realize an information society friendly to humans and the earth." For such a large and successful corporation, it's striking that its philosophical focus tends to be on the betterment of society rather than on revenues. Ultimately, the understanding is that as long as you develop products to a high quality and meet customer needs, the revenues flow naturally from that.
This idea meshes with NEC's core values, which comprise the company's overall motivation, the motivation of individuals, the motivation of teams within the organization, and the idea of striving

for customer satisfaction. As a company, NEC wants to solve problems and unite the world. For individual employees, NEC asks that they act with speed and work with integrity until completion. For teams, NEC demands that workers treat each other with respect, work with open minds, and collaborate in whatever way makes the most sense.

As for customer focus, NEC's mantra is "Better Products, Better Services." Putting all of these core values together starts to build a clearer picture of how NEC has grown into the company it is today: one that demands more from its technology and offers a broad range of solutions for consumers and enterprises alike.

NEC Global Enterprise Solutions

The best way to illustrate how much NEC has changed over the past century, and how its focus has shifted substantially in that time, is to look at the company's Global Enterprise Solutions. Broken down by industry, NEC offers a wide range of products and services designed to bridge the gap between internal business goals and the overall values of society. It's a unique approach that not only takes sales, revenue, and operational efficiency into account, but also safety, security, environmental concerns, and equality. Regardless of the product, NEC backs it up with easy-to-use tools and support based on years of experience and knowledge.

NEC Corporation has office locations worldwide. The NEC Corporation of America headquarters, located in Irving, Texas, is pictured here. (Photo courtesy of BOKA Powell, LLC.)

NEC's Global Enterprise Solutions focus on the needs of five industries in particular, offering solutions for the manufacturing, logistics, retail, hospitality, and automotive industries as well as cross-industry solutions. For businesses in the manufacturing space, NEC offers ERP (enterprise resource planning) systems to help improve all aspects of asset management, from purchase through maintenance; product life cycle management solutions; M2M (machine-to-machine) applications; and other products designed to improve the efficiency and operation of manufacturing facilities.

NEC serves the logistics industry with its LVS (Logistics Visualization System), which is designed for ease of use and offers in-depth search capabilities. Whether you're shipping products via ship, train, plane, or truck, LVS will help you track it. This also includes an inventory and supply chain management component to make sure that customers always have access to the products they need when they need them.

Retail-oriented offerings include solutions for improved in-store operation, merchandizing, CRM (customer relationship management), and payments. Having a strong focus on customer satisfaction enables NEC to deliver products that help companies better connect with customers and provide an improved customer experience. These solutions not only advance the operating efficiency of the retail store itself, but also help with such things as in-store organization and layout to ensure customers can more easily find the products they're looking for.

Similar in some respects to retail, the hospitality industry focuses almost solely on customer experience. Negative experiences, of course, can result in brutal reviews, with individuals often deciding never to use a company's services again. For that reason, NEC offers its Smart Hospitality Solutions, a portfolio containing multiple tools for improving hotel operations and the guest experience overall. The portfolio includes UC (unified communications) and PMS (property management systems), as well as technologies that let hoteliers take advantage of interactive digital signs and facial recognition.

For those in the automotive business, NEC offers a DMS (dealer management system), which is all about building a stronger sales network and reaching more customers. DMS allows dealers to view the sales of vehicles as well as maintenance records, so it's easier to track the life of a vehicle from beginning to end and better serve customers. And as an added bonus, dealers can use DMS to let customers know about potential new campaigns or even manufacturer recalls; this opens a communications channel for reaching customers about potential future sales as well as safety issues, helping to build stronger customer relationships.

NEC understands that various business categories and technologies aren't necessarily separate and self-contained, and that crossover is often a distinct possibility. To that end, the company offers solutions that aren't designed specifically for one industry over another. Along with the previously mentioned ERP systems that can be used across multiple industries, NEC also offers business consulting services so that it can share its knowledge and experience with other companies, regardless of industry.

Storage & Networking


In addition to more general enterprise-focused solutions, NEC also offers specific
products in the areas of storage and networking. For storage, NEC offers its M
Series and WB Series solutions for use in
SANs (storage area networks) as well as
its HYDRASTOR or HS series for backup
and archiving. The M Series of disk arrays
come in M110, M310, M510, and M710
varieties. The major differences between
these arrays are how many drives they can
hold and the maximum
total capacity. For example, the M110 supports up to nine disk
enclosures and 120
HDD (hard disk drive)
slots, whereas the M710
supports up to 80 disk
enclosures and as many
as 960 HDD slots.
Depending on your
drive configuration and
whether or not you decide to go with HDDs
or SSDs (solid-state
drives), you have the potential for storing multiple terabytes, if not
over a petabyte, of data.
Another important aspect of the M
Series of disk arrays is the software that
comes along with them. NEC offers integrated management, device management, performance management, and
cooperative management tools as well
as storage control, replication control,
resource control, disaster recovery, and
high availability tools that give you a complete overview of your storage environment and how best to ensure efficiency
and performance. The M Series software
adds value that you'd typically only find
by picking and choosing storage management tools from other vendors.
The WB Series solutions are actually
Fibre Channel switches designed to enable

high-speed data transmissions between


your storage equipment and servers. These
switches are important because sometimes IT teams can get so focused on the
specifications of the equipment itself that
they forget to think about the connections
between them. For example, if you have
a group of top-of-the-line, high-performance servers
but the cables
and switches
that connect
them are generic and not
optimized for

specific use cases, then it may result in performance degradation. WB Series switches ensure that the connection between equipment is just as powerful as the equipment itself.

In NEC Corporation's 116-year history, it has designed and manufactured products as diverse as communications switchboards and home energy storage systems, as well as displays, supercomputers, and more. (Photo courtesy of NEC.)

On the networking side, NEC is making great strides in SDN (software-defined networking). The main idea with SDN is to take the operating brain out of every individual piece of networking equipment and centralize it so that the network overall is easier to manage and more flexible. The goal of NEC's SDN products is to make the technology easier to grasp and easier to implement. In this area, NEC focuses largely on telecommunications carriers, but its SDN solutions are also a fit for a range of other organizations.

NEC has long been a supporter of SDN and, in fact, introduced the world's first SDN-specific solution in the form of the UNIVERGE PF Series for controlling the flow of communications. The company has also worked with multiple groups, including the Open Networking Foundation, Network Functions Virtualization group, OpenDaylight Project, and the Open Networking Research Center, to make sure its software and appliances are of the highest quality and interoperability. What's more, NEC's century of experience in the technology space and its innovations in the areas of communication and networking put the company in a strong position to usher in the future of SDN.
Smart Energy

NEC's commitment to environmental improvement solutions and green technologies extends at least as far back as the late 1990s, when NEC introduced the Earth Simulator to the world as a way to better understand how technology affects the environment and what can be done to minimize its impact. Today, NEC is working on lithium-ion batteries, energy storage systems, energy management systems, electric vehicle charging solutions, and other projects designed for utilities companies.

With its smart energy products, NEC is using both business- and consumer-focused initiatives to lessen technology's negative impact on the environment. This is part of NEC's aforementioned Group Vision 2017, and it runs through the core of what NEC has been about from the very beginning: bringing the world together and making it a better place through the use of technology.


Think Visually With Your Online Marketing


SHOW (DON'T TELL) YOUR MESSAGE TO PROSPECTIVE CUSTOMERS

A RECENT REPORT from Cisco predicts


that online video content is set for a 13-fold increase, accounting for nearly three-quarters of the world's mobile data traffic
by 2019.
Video traffic already accounts for 55% of the total mobile data traffic, so it's clear that video is a priority for apps and websites being viewed on tablets and smartphones. Graphics and video capabilities
on PCs continue to steadily improve, as
well. It has never been more important
that your website and online marketing
efforts feature a strong visual presence.

A Visual Evolution
Stanford Persuasive Technology Lab
has found that people often evaluate
the credibility of a website based on its
visual design. The images you use, the
layout, and even the typography matter.
Using graphics and visuals is the path
of least resistance to get consumers to see


your content, says Jason Knight, head


of planning at 180LA. For example, do
the graphics on your business websites
landing page effectively demonstrate the
services offered? What about evoking a
compelling response?
Social media websites, such as
Facebook, Twitter, and Instagram, offer
some of the better examples of how personal preferences have shifted toward visual content. Before social media, people
and organizations interacted via text-heavy websites and blog posts that were
often hundreds of words long. Now,
users are often limited to hundreds of
characters, while longer posts run the
risk of being ignored by those scanning through their social media feed.
Graphics and video, on the other hand,
can more rapidly provide information,
often in a more entertaining format. In
short, visual content allows us to go from
tell to show.

Traffic statistics tell a similar story.


A recent report from Shareaholic, for
example, indicates that social networks
drove 31.24% of all traffic to websites.
Facebook itself accounted for 24.63% of
traffic referrals, which was up from
15.44% in 2013. Overall, were relying
less on text-based searches, such as a
search engine query or directly visiting
a website, and more on organic referrals
from our social networking tools. With
that in mind, you can leverage the power
of visual marketing on your website and
in your social media efforts to better
reach customers.

How Graphics
& Videos Communicate
Youve likely heard the phrase a picture
is worth 1,000 words, and the same concept is true with graphics and videos. The
ability to create an emotional appeal, and
to do so quickly, is one of the key elements

of visual marketing. When videos address


a real-time interest or need, they can create
an emotional appeal, says Brian Wong,
co-founder and CEO of Kiip. The images
you create and share with others can help
to both connect with customers and promote your businesss brand.
When it comes to creating videos that
will capture and engage your audience,
Wong gave us a few keys. First, the video
should be relevant to the content it appears in. Second, the video should appear
to be serendipitous and without incentive. For example, Kiip delivered its own
form of video advertising, called Rewarded
Video, inside the popular mobile game
Into the Dead, where the ads showed a
trailer for the popular TV show The
Walking Dead, says Wong. The strong
relation to the game content helped the
videos to receive a 77% view-through rate.
Visual content also holds appeal for
busy people who want to make the most
of their limited time. Theres a societal
pressure to keep up with news, and quick
visuals and graphics are an entertaining
condensation of information, says
Knight, who went on to note that A GIF,
meme, or six-second video can all deliver
your brands message, but with minimal
effort by a consumer. You can blow up
text and blend it with images, too, so you
can incorporate infographics, checklists,
tutorials, or slideshares to a website. Or
maybe you just want to share an inspirational quote. The key is that you find a
quick way to share your message.
Whatever you choose to do, it should
be a seamless part of the online experience, whether it be an off-beat company
blog post connected to your website or
a critical video in the About Us section.


Of course, its also best if the customer


can stay on the website or product page
to view the visual marketing. Digital
platforms are increasingly incorporating
their own native video players and image
hosting, says Knight. Its important for
digital marketing to play naturally and be
contextual to each unique space.

Tell Your Story


Beyond branding, visuals can be used
to effectively convey your organizations
story, which is one of the easiest ways
to connect with your customers. Maybe
it's a holiday greeting snapshot with employees and their families, or a social post
commemorating a company anniversary, accompanied by a video that briefly
describes your organization's humble origins. Experts recommend that you follow
a traditional storytelling structure with
a beginning, middle, and end. Any good
story has some conflict, too, so dont shy
away from pointing out what your company has overcome to get where it is.
Social media, of course, works hand-in-hand with storytelling marketing efforts. Best of all, if readers like your
message, they might share it with others,
so the story is spread among trusted
members of potential customers peer
groups. Heres where quality content and
good storytelling come in. If its an engaging story, your message will likely be
distributed to many more people than


just those who follow you on social


media. If the story is mediocre, it won't hold people's attention for long and they'll simply move on or skip the video.

Technology, Social Media, & You


The video and display performance
of todays smartphones, tablets, and PCs
have created new levels of expectation
when it comes to graphics and video
quality. Spend some time working with
graphic designers and video production
specialists to create professional-looking
content that will truly have an impact
on those youre trying to reach. Check
out some well-known (or competitors)
brands that excel at social media to see
how theyre doing things and to get
some ideas about how you can share
your own story.
Keep in mind that social videos are
of a different nature than video content
developed for broadcast. Similar to the
brief text used in social media, the videos
you post shouldn't be more than a few minutes long, if that. You'll want to engage the audience to ensure the best effect, and part of the process is developing
a concise story that consumers will seek
out. If you want to present a longer narrative, break up the videos into several
small stories.
In addition to engagement, it can help
if the video or graphics you create are
participatory. For example, you can solicit input from viewers about the next
product or service youll offer. Or you
can offer a platform for viewers to post
short videos showing off products in use
for a chance to win prizes. With today's technology, you're really only limited
by your imagination and what the marketing team can handle.


What Are Data Lakes?


INCORPORATE UNSTRUCTURED DATA INTO YOUR OVERALL ANALYSIS APPROACH

IF YOUR COMPANY currently uses or is


looking to use big data and analysis technologies, then theres a good chance you
have heard the term data lake in recent
months. The technology is so new, however, that analysts and experts on each side
of the data lake coin are trying to determine whether the technology is a viable
option for enterprises. Although some
analysts say data lakes arent as beneficial
as vendors and stakeholders in the market
say they are, there is a chance that, with
proper implementation, many companies
could use data lakes for storing big data.

Filling In The Gaps


At its very core, a data lake is a data
repository like any other, which means
you can compare it to your other everyday
databases and enterprise data warehouses
and get a general understanding of how it
works. However, data lakes are primarily
designed for storing unstructured and


semi-structured data vs. enterprise data


warehouses, which are for storing structured data, says Daniel Ko, manager at
Info-Tech Research Group.
Data lakes take advantage of low-cost
Hadoop cluster storage, and for that
reason, companies can use them as repositories for data they arent using presently
but may need to access in the future. Its
almost like an insurance policy . . . and
in the case of a data lake, you are storing
some unused data and hoping that one
day you will be leveraging it in the future,
Ko says. This allows you to essentially save
data for later that you dont necessarily
want to store in your main storage arrays
but may want to access in the future.
Ko says there is a gap in the current
way companies use data warehouses and
other repositories, because they only deal
with the structured data that most businesses are familiar with. Data lakes help
fill those gaps by gathering and storing

unstructured data from things such as


Web logs, data center equipment messages, and other streams. There could
be a lot of hidden messages in those data
sources, but right now you dont have a
way to analyze them systemically, says
Ko. A data lake fills that gap by providing a repository where you can store
those data sets. And then once you have
a data lake program up and running,
you can find some specific people to analyze that data and give you some insights
that are not available in your enterprise
data warehouse.
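To make the "store it raw now, analyze it later" idea concrete, here is a minimal, hypothetical sketch of that kind of landing process. It assumes a Hadoop cluster with Spark available (the article only specifies Hadoop), and the job name and HDFS paths are invented for the example; a real deployment would add scheduling, partitioning, and access controls.

# Hypothetical sketch: land raw, semi-structured Web logs in a data lake.
# Assumes a Hadoop/Spark environment; the HDFS paths are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("weblog-landing").getOrCreate()

# Read the day's raw JSON logs as-is; the schema is inferred, so nothing
# has to be modeled up front the way a warehouse table would be.
raw_logs = spark.read.json("hdfs:///landing/weblogs/2015-09-01/*.json")

# Keep the records in the lake in a self-describing format so analysts
# can query them later, if and when a use case appears.
raw_logs.write.mode("append").parquet("hdfs:///lake/raw/weblogs/dt=2015-09-01/")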

From Marketing Jargon


To Real-World Use
There is an ongoing debate among analysts as to whether the term data lake is
simply marketing jargon or if it is truly a
beneficial technology enterprises should
explore. For example, the research firm
Gartner has released reports about data

lakes and how they dont necessarily live


up to their billing. Ko, however, says companies simply need to look at data lakes
just as they would any other immature
technology. They need to put the programs through their paces and make sure
the technology fits well into their overall
big data analysis approach. Before doing
that though, they need to understand what
data can go into a data lake.
A lot of people are using data lakes
as data landfills, where its just a mashup
of a bunch of data, says Ko. People are
putting too much stuff in the data lake
and turning it into a data landfill or wasteland. He suggests that companies start
by implementing a data pond in the beginning, instead of taking on a data lake,
and then they can pick one or two departments to start with.

Data Ponds vs. Data Lakes


The concept of a data pond is more
manageable than a data lake, thus its the
term you will want to use during the pilot
phase. The key to having a successful data
pond pilot is to focus on what you want
it to achieve. For instance, Ko says, you
can use improving revenue as a general
theme for your data pond. The IT team
can partner with the sales and marketing
departments, because they are on the
front line interacting with customers and
generating revenue, he says.
From here, you decide what types of
data will fit best into your overall data
pond theme. Ko recommends looking at
high-impact and high-visibility data first.
This means gathering information from
CRM (customer relationship management) systems, email exchanges, recorded
calls, or other records of customer interactions and placing them in the data pond.
By placing all of this unstructured and

"I'm advocating the term 'data pond' because I don't want people to get lost in the data lake. At this stage, because the technology is not too mature and you have a bunch of data in your organization, let's do a data pond before you do a data lake."
DANIEL KO
Manager
Info-Tech Research Group

semi-structured data in one location, you


can access the information when needed
without having to convert it into structured data your other systems can handle.
Ko thinks that taking this approach will
make data lakes a reality and allow people
to move past the jargon identifier attached
to the concept. He is currently working
with clients that are starting their data lake
journeys. He explains that these clients are
setting up some kind of Hadoop cluster
first, and then they plan to put some semistructured and unstructured data in before
running analytics.
"It's a starting point right now," says
Ko, "but it's turning from marketing
jargon to reality. And I can see some big
potential once people are having some
successes, because they will be using those
big data insights in their operations sooner
or later. The actual implementation will
be sometime next year or the year after,
and then they should be getting some successes within two years.

Put The Right People In Charge


Starting with a data pond and eventually
moving to a data lake doesnt mean much
unless you have the right personnel in
place to run the programs and sift the usable information out of the mounds of unstructured data. For that reason, Ko says,
you want to build a dedicated team of

data scientists and data analysts to analyze


your data before you actually hand that
data over to your everyday business users.
There are a few reasons for this, but
the major ones are possible security or
privacy issues related to the data pond
or lake, especially if it contains sensitive
customer information. You want to have
dedicated people, and they should sign an
advanced non-disclosure agreement so
they are safe to export that data and, in the
meantime, the data is being protected by
the NDA, says Ko. At the end, they will
have a bunch of data insights and those
insights will be actionable at the operating
level. In essence, your data analysis experts will sift through the data lake and
hand information over to the business in a
form it can use.
Another important step in setting up
and maintaining a successful data lake
strategy, according to Ko, is to make sure
you socialize the concept of a data lake
from your IT management up to your
senior management and board of directors. You need to make sure that everyone
throughout the organization understands
the value and benefit of a data lake as a big
data insurance policy that could bear fruit
in the future. And if you can establish that
foundation, theres a much better chance
that a data lake will prove beneficial to your
company as the technology matures.



Hiring A Data Scientist


LOOK PAST TITLES & FOCUS ON RELEVANT EXPERIENCE

THERE IS SIGNIFICANT debate around


the term data science and the role
of the data scientist in the enterprise.
Some experts say data science is a real
discipline, one that is instrumental in
ensuring the success of big data and analytics programs. Others say there is no
such thing as data science and that its
essentially just a fancier term for predictive analytics and data mining. The key
to this debate is to go beyond terms and
titles and dig deeper into what the role of
a data scientist actually is, whether you
choose to use that title in practice.

A Concrete Definition
Is Hard To Come By
For decades, if not centuries, businesses have used some form of analytics
to inform their futures. Going back to
the days of tracking sales in paper ledgers, analytics was a way for business
owners to keep track of what they sold


in the past and what inventory to stock


more of in the future. Although analytics
has been in use for quite some time, the
concept of big data analytics is a relatively new idea, and it has opened the
door for a whole new group of data analysts who are sometimes referred to as
data scientists.
Data scientists are defined by their
experience. Its possible they may have
a masters degree in predictive analytics
that gives them a strong foundation to
build on, or they may have worked in
and around data analytics programs
for years or decades and have become
experts in the field through real-world
practice. How a data scientist gains

experience matters little; the important


thing is that the experience is there.
At the very core, data scientists should
be able to look at data in a way that transcends the capabilities of the average IT
employee. They have to live and breathe
information to the point where they can
identify trends in massive amounts of
data with the help of in-depth analytics
tools. Any company can put a big data
analytics platform in place and try to
make these discoveries, but a data scientist should be able to put these platforms
to work and truly understand how the
insights they gather can be used to improve the business in terms of internal
efficiency and sales or revenue generating


opportunities. Whether an individual is


identified as a data scientist or not, what
matters is that he is able to perform the
required job at a high level.

Caveats Of Choosing
A Self-Proclaimed Data Scientist
The problem with data science being
a somewhat loaded term, or at least one
that can be used to describe a discipline
that already exists to some degree, is
that almost anyone can call himself a
data scientist. Although some universities offer courses that at least mention the term "data science" in some
capacity, you dont need a special degree or a certification to assign yourself
the title of data scientist. However, its
important to remember there are quite
a few unique roles in the big data analytics space, and not all of them fit into
the data scientist mold.
For example, there is a role in analytics that revolves around data mining,
which is the process of sifting through
large amounts of data to find patterns.
There are expert data miners who excel
at sorting and categorizing data in a
way that makes it usable for the enterprise, but that doesnt mean that they
also have the skills to fully analyze that
data and suggest ways for the business
to take advantage of those insights. For
that reason, data miners shouldnt technically be called data scientists because
they only handle one part of the process.
This same idea also applies to BI
(business intelligence) and system analysts, and to anyone else whose role
consists of sorting through data to find
actionable insights. BI analysts, for example, may be able to look at the insights gathered from a big data analytics
project and use them to improve the
business in some way, but that doesnt
necessarily mean theyd have the expertise to sort through that data on their
own and detect patterns in the same
way a dedicated data mining expert
would. If the data mining expert and

the BI analyst each identifies himself as


a data scientist and you choose to only
hire one of them, then you may end
up with gaps in experience that could
minimize the benefits of your big data
analytics program.

What To Do As A Business
Should you hire a data scientist
thinking he could handle every aspect
of an analytics project, but find out instead that he is only an expert in data
mining or BI, then you may not get the
return on investment you were expecting
from that hire. This is perhaps the biggest problem that organizations face
when trying to figure out the types of
employees they need to implement and
manage a big data analytics program.
While it is necessary to have the right
tools in place to run analytics projects,
you also need to employ experts who
know how to use those tools efficiently
and effectively.
The key for each business is to look at
what it hopes to get out of an analytics
program. This starts with determining
whether to use the program primarily
for internal or external use cases. For
example, if you want to use analytics
to improve internal business processes
and give your enterprise an organizational face-lift in some respects, then you
may want to go with a systems analyst.
A systems analyst looks at what could
be changed within to make an organization more efficient or employees more
productive, for instance, and then hands
those ideas over to other people in the
company who will design and implement those solutions as needed.
If you want to have a more external
focus for your analytics program, then
you may want to go with an analyst
who specializes in customer or sales analytics. These types of analysts will look
at a range of customer data, whether
from a customer relationship management system or other channels, such
as social media. Following evaluation,

CHIEF DATA SCIENCE


OFFICERS
In the same way that companies
have CIOs and CTOs, organizations
that place a heavy emphasis on
analytics may have a CDSO (chief
data science officer) position, as
well. CDSOs are typically only
found in larger organizations, and
they differ from other C-level executives in that they only focus on
the analytics and data science aspects of the business rather than
also focusing on information and
technology as a whole. The role
of the CDSO is to make sure that
every data science and analytics
project across the organization is
working toward the same goal of
meeting key business objectives
and ensuring the future success of
the company.

the analysts can then present that data


to the business as a way to improve advertising and marketing programs, increase market awareness of products, or
fine-tune sales strategies to make sure
the business can maximize revenue.
The big dream behind data science
and the data scientist role is that you
can get the best of both worlds by hiring
one person. In other words, the hope is
that a data scientist can look internally,
externally, and anywhere in between
when analyzing data and then decide
what types of data and insights fit each
individual use case. Because the role
of a data scientist is relatively new, its
important to take the title with a grain
of salt. And if you plan to hire a data
scientist or any other analyst sometime
soon, do your due diligence and make
sure he has the experience necessary to
run the type of analytics program you
want to support as an organization.


Greenovations
ENERGY-CONSCIOUS TECH

The technologies
that make our
lives easier also
produce some
unwanted side
effects on the
environment.
However, many
researchers,
manufacturers,
and businesses
are developing
solutions that are
designed to keep
us productive
while reducing
energy demands
to lessen our impact on the environment. Here's
a look at some of
the newest such
initiatives.

This 40-ton vehicle from BMW Group and SCHERM Group is the first electric truck of its size to transport materials
on public roads in Germany. When fully charged, the truck has a range of about 62 miles (100km).

40-Ton Fully Electric Vehicle Hits The Streets In Germany


Notably quiet and clean, the new fully electric 40-ton truck from BMW Group is
now in operation in Germany. Deployed in partnership with SCHERM Group, a
logistics, transportation, and real estate company, the truck makes the 2.5-mile (4km)
round-trip trek between BMW Groups Munich plant and the SCHERM Group eight
times per day, hauling materials from one facility to the other. The truck produces no
CO2 and almost zero particle pollution. Compared with an equivalent diesel truck, the
electric truck produces 11.8 tons less CO2 per year, according to BMW Group. It takes
three to four hours to charge the truck, and when fully charged it has a range of about
62 miles (100km). With this project we will gain valuable information on what will
be possible with electric trucks in the future for city logistics, says Jürgen Maidl, head
of logistics at BMW Group. The truck will operate on a one-year trial period; if successful, BMW Group and SCHERM Group will expand the project.

Working Toward A More Energy-Efficient CPU


Rex Computing is an unusual startup, founded as it was by a 19-year-old recipient
of a Peter Thiel investment (Thomas Sohmers) and another Thiel grant recipient (Paul
Sebexen). Rexs goal: To build a processor with a new architecture that is 10 to 25
times more efficient than current CPUs and graphics processors. By designing its Neo
chips with current supercomputer requirements in mind, Rex anticipates that the Neo
line will meet the requirements of all computers in the foreseeable future.


U.S. Department Of Energy


Invests In Algaes Potential
Algae holds a great deal of potential as a base material for biofuels, including alternatives to petroleum-based
diesel and jet fuel, according to the U.S.
Department of Energy. There are problems, however, related to affordably and
efficiently growing, harvesting, and converting algae. To counter these problems,
the DOE announced in July it would
provide $18 million in funding for six
algae-related projects. The goal for each
project is to push the price of algae-based
biofuels below $5 per gge (gasoline gallon
equivalent) by 2019, which feeds into the
DOE's bigger goal of reaching $3 per
gge by 2030.

Siemens SWT-7.0-154 offshore wind turbine has been installed in Østerild, Denmark, as a
prototype. It is capable of producing enough electricity to power 7,000 homes.

Denmark Wind Turbine Delivers 7 Megawatts Of Clean Energy

Solar Investments Dip,


But 2015 Remains A Good Year
Overall worldwide funding in the solar
sector declined a bit to $5.9 billion in Q2
2015 from $6.4 billion in Q1, according
to Mercom Capital Group, which takes
into account funding from venture capital,
private equity, debt financing, and public
market financing. Funding remains up for
the year, however; Q1 2015s $6.4 billion
almost doubled the $3.4 billion in funding
a year earlier, in Q1 2014. There are other
strengths to consider, as well. For example, says Raj Prabhu, CEO of Mercom,
Residential and commercial solar funds
continue to attract record funding as the
ITC [solar investment tax credit] expiration deadline approaches.

Denmark has traditionally been home to so many windmills for a reason. Today,
plentiful offshore winds combine with wind turbines to produce more than one-third
of the nations electricity. Among the newest installments is Siemens SWT-7.0-154,
a prototype of which is up and running in Østerild. This 7MW (megawatt) turbine is
capable of producing 32 million kilowatt hours of electricity, about enough to power
7,000 households, 10% more than its predecessor. Based on the reliable technology
and supply chain of our 6MW machine we have improved our flagship wind turbine
with stronger permanent magnets, optimized generator segments and upgraded converter and transformer units, says Morten Rasmussen, head of technology at Siemens
Wind Power and Renewables Division. With only these minor changes we expect to
get it ready for serial production within only two years.

Looking For Benefits From Smart Lighting? Gartner Cautions


That You Cant Cut Corners
To appreciate the full benefits of smart lighting technologies, organizations
must invest in a full implementation of the available technology, according to
Gartner. The research firm identifies five key strategic phases of smart lighting:
(1) LED (light-emitting diode) lighting, (2) sensors and controls, (3) connectivity,
(4) analytics, and (5) intelligence. Dean Freeman, research vice president with
Gartner, says, Smart solid-state lighting in office buildings and industrial installations has the potential to reduce energy costs by 90%; however, achieving these costs
takes more than just installing LED lighting. Smart solid-state lighting and
fluorescent lighting can cut energy costs by 50% and 25%, respectively, according to
Freeman. The key benefits dont come from the lighting, however, as much as from
the analytics. Gartner says if you stop at phase 3, you wont be able to analyze light
usage patterns and myriad other details that result in adjustments to improve usage.


Web-Scale IT vs. Traditional IT


TAKING CUES FROM THE CLOUD TO IMPROVE INTERNAL AGILITY & SCALABILITY

WITH THE popularity of cloud computing


higher than it's ever been, organizations are looking for ways to take the fundamental concepts behind the cloud and apply them to other aspects of business. This strategy makes sense considering that scalability and agility are the major tenets of the cloud, and they are also goals of any
IT team trying to improve performance
and efficiency. Web-scale IT aims to take
cues from the cloud and not only make it
easier to connect to and utilize cloud environments, but also introduce increased
scalability and agility to the data center,
application development, and the business
as a whole.

What Is Web-Scale IT?


Web-scale IT, according to Stephen
Hendrick, principal analyst at Enterprise
Strategy Group, is all about economies of
size and scale in IT. He says many organizations are thinking about or have already


moved some of what they do outside of the


data center and into the cloud, but when
it comes to the software dimension of what
goes on inside of IT, there are some significant implications for how you need to
think about developing and deploying software. What this ultimately means is that
IT teams and software developers must
focus on agility, which in this case means
being able to build applications faster and
scale them up quickly as needed.
One of the major challenges that companies face when adopting Web-scale IT
concepts is that they have to deal with the
resource side of the equation, which is all
about delivering capacity in response to
increasing demand, Hendrick says. The
best and most cost-effective way to do this,
he says, is to focus on improving the scalability of mission-critical and businesscritical applications first. This will require
a relatively high level of automation and
policy-driven management, so you can

respond as quickly and as close to real-time


as possible, Hendrick adds. This mindset
will help you identify patterns of demand
and make sure your software and other
aspects of IT are ready to meet those needs.
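As a toy illustration of what policy-driven capacity management means in practice (the thresholds and scale-out step here are invented, not drawn from any vendor's tooling), the decision an automated system makes on each evaluation cycle can be as simple as this:

# Illustrative only: a simple scale-out/scale-in rule of the kind Web-scale
# IT automates so capacity tracks demand without manual procurement steps.
def desired_instances(current, cpu_utilization):
    if cpu_utilization > 0.75:                  # demand is outpacing capacity
        return current + 2                      # scale out
    if cpu_utilization < 0.25 and current > 2:  # capacity is sitting idle
        return current - 1                      # scale back in, keep a floor of 2
    return current

print(desired_instances(4, 0.82))   # 6
print(desired_instances(4, 0.15))   # 3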
How these concepts fit into the idea of Web-scale IT, and how Web-scale IT takes cues from the cloud, is evident in how good IaaS (infrastructure as a service) vendors are at spinning up resources almost instantly as their clients need them.
whatever resources they need as they need
them. And while that should also be the
goal for IT teams when it comes to internal
resources, it hasnt always been the case.

Different From Traditional IT


The reason why the cloud was able to
arrive on the scene and help companies

Web-scale IT is important because, besides this whole


notion of economies of size and scale, a significant part
of what happens as a result of Web-scale IT is a dramatic
increase in agility. Agility can be defined in a lot of different ways, but I would define it as meaning essentially very
fast cycle times from the standpoint of being able to scale
up or develop applications.
STEPHEN HENDRICK
Principal Analyst
Enterprise Strategy Group

outsource so much of their infrastructure is that it can often offer performance and other benefits that internal IT
teams simply cant match. Traditionally,
Hendrick says, IT teams have had to deal
with a procurement process where they
have to go out there, attain the hardware,
and get it configured, which requires
quite a few manual activities to actually get
everything online and to move the necessary workloads.
In traditional IT, organizations always
had to look very carefully at trying to understand patterns of capacity and patterns
of demand and how to set capacity to ensure they met demand, says Hendrick.
There were a tremendous amount of
inefficiencies built into the IT stack. We
solved a whole variety of those with virtualization, were going to be able to solve
a whole bunch more with converged systems, and there will be even more that we
can solve with containers. There is a lot of
evolution in technology thats helping us
understand how to better right-size demand against capacity.
Meeting demand is one of the hardest
jobs of IT because it is dealing with more
sizable and less predictable workloads
than ever before. The role of IT has transitioned from maintenance and management of physical infrastructure to
essentially being service providers within
an organization. Hendrick points out
that if mission-critical applications even
have multi-second response times, people

arent going to put up with it. Web-scale


IT, especially when it comes to application development and even DevOps
initiatives, isnt just about building applications faster, but also about speeding up
the change management process.

Web-Scale IT Can Help You


Embrace Microservices
Microservices is basically a software
design philosophy that breaks applications into much smaller components
that developers can focus on and iterate
more quickly. Rather than having one
big, complicated application thats difficult to update and redeploy, individual
developers can instead focus on smaller,
more specific components of the larger
whole, make necessary improvements,
and then put it all together in the end.
This makes troubleshooting much easier,
because if one component of the application needs attention, you can fix it
without needing to put the rest of the
software on hold.
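As a rough sketch of what one such component can look like (the service, route, and data here are invented for illustration and not tied to any particular company's stack), a microservice is often just a small, independently deployable program that exposes one narrow function over HTTP:

# Hypothetical single-purpose microservice; requires the Flask package.
from flask import Flask, jsonify

app = Flask(__name__)

# Stand-in data source; a real service would call a database or another API.
PRICES = {"widget": 9.99, "gadget": 24.50}

@app.route("/price/<item>")
def price(item):
    # One narrow responsibility: answer price lookups, nothing else.
    if item not in PRICES:
        return jsonify(error="unknown item"), 404
    return jsonify(item=item, price=PRICES[item])

if __name__ == "__main__":
    app.run(port=5000)

Because the service owns only this one function, it can be fixed, tested, and redeployed without touching the rest of the application.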
What this does is it lets you get an
application into the marketplace more
quickly. You, of course, have to have
good design behind it in order to make all
of this work, but once you have the thing
out there in the marketplace, you can also
revise it much faster because its not a big
mega code base, because the components
are small and lightweight," says Hendrick.
" This idea of microservices has a real relevance here with Web-scale IT because it

allows organizations to build and revise


applications a whole lot faster.

Not Just For Large Enterprises


An important thing to remember with
Web-scale IT is that even though it seems
like a concept fit only for larger enterprises,
it can actually work for companies of any
size. In fact, the cloud has made it so that
smaller businesses and startups have access
to the same resources that larger enterprises do from the very beginning. This
makes the overall market more competitive, and it also means that innovation isnt
necessarily coming from larger enterprises
and trickling down.
In fact, Hendrick says that a lot of the
most innovative ideas are coming from
smaller businesses now that they have
access to better resources. For example,
he tells the story of a bike shop that had
a hobbyist application developer. The
store built a unique solution using IBMs
Watson facial recognition software with
a camera and monitor. Customers would
come in and answer a few questions and
during this time, the camera and software
would size up the customer and recommend a few types of bicycles based on all
of the gathered information. This system
helped customers and salespeople because
it ultimately got them through the decision-making process quickly.
One individual who worked for a major
car company happened to be in this small
bicycle shop, saw this system in action, and
then wanted to bring something similar
to his dealership. Here's an example
of innovation working in reverse where
something very clever was introduced in a
small business, and then made its way upstream, says Hendrick. I dont think this
whole idea of Web-scale is necessarily restricted to just big organizations. With the
agility and ease with which developers can
now have access to all different kinds of
very sophisticated technology, were going
to see a whole lot more very innovative applications coming out and being leveraged
by organizations of all sizes.


High Performance Computing?


This Is Why You Need Supermicro
THE NEXT-GENERATION X10 SOLUTION
DELIVERS TOP SUPERCOMPUTING PERFORMANCE
GO AHEAD, SUPERMICRO invites you
to compare its supercomputing servers
with competitors models. With an extensive range of SuperServer platforms
in 1U/2U/4U/7U form factors, says
Douglas Herz, product market manager
with Supermicro, with support for up to 30
GPUs in 7U on the companys SuperBlade
systems, Supermicro offers an unrivaled
range of flexible configurations to meet any
scale supercomputing challenge.

Power & Efficiency Inside


Leading the new X10 product line,
the SYS-1028GQ-TR(T) (shown above)
maximizes performance and density
through pioneering non-preheat GPU architecture and PCI-E direct connect (no
need for extension cables or re-drivers)

for lowest latency, says Herz. The system


supports dual Intel Xeon E5-2600 v3
processors (up to 145W), up to 1TB ECC
DDR4 2133MHz memory (16 DIMMs),
quad double-width GPU/Xeon Phi coprocessors (up to 300W), two 2.5-inch
front-facing hot-swap SATA3 drives and
two 2.5-inch internal SATA3 drives (hard
drives or solid-state drives), and dual-port
10GbE LAN. Servers in the X10 line are
ideal for oil and gas exploration, medical
image processing, and many other research and scientific applications that involve large data sets and require the highest
processing capabilities, says Herz.

Energy & Space Savings


The X10 lines streamlined architecture
is built to provide the best signal integrity


while minimizing cables, repeaters, and


other obstacles to achieve maximum airflow and cooling.
The servers use 2,000W fully redundant
digital power supplies for Platinum
Level high efficiency. Designed to take
full advantage of ultra-high-performance
GPU/Xeon Phi accelerators while minimizing power consumption, the servers
bring new levels of energy-efficient performance for compute-intensive data
analytics, deep learning, and scientific applications, says Herz.

Learn More
Supermicro offers multiple supercomputing servers to meet specific requirements. Contact Supermicro to find out
more about pricing and availability.

SUPERMICRO | 408.503.8000 | WWW.SUPERMICRO.COM


Mobility & The Network


HOW MOBILITY IMPACTS CORPORATE WLAN PERFORMANCE

KEY POINTS
Add more access points to
combat the strain that more mobile devices coming into the workplace are putting on networks.
If older devices on a corporate
network are slowing down other
devices, consider upgrading any
legacy devices.
When upgrading capacity, keep
in mind that users will likely connect more devices to the corporate
network in coming years.
Upgrading to 802.11ac technology will add extra capacity and
other improvements that should
alleviate some issues an organization may be currently experiencing.


ENTERPRISE NETWORKS ARE under duress. Virtualization, cloud computing,


applications, streaming video, real-time
communications, and other entities are
certainly contributing to the strain corporate networks are feeling, but where
corporate WLANs (wireless local-area
networks) are concerned, mobility and
the increase of tablets, smartphones, and
other wireless devices now in the workplace are producing a slew of headaches
for network administrators charged with
overseeing performance. In short, Andre
Kindness, Forrester Research principal
analyst, says for organizations supporting
BYOD (bring your own device), a WLAN
refresh is a common undertaking.
In addition to detailing the impact
more mobile devices are having on enterprise wireless network performance, the
following details problems that can result,
how network upgrades can alleviate the
pain, mistakes to avoid, and more.

Devices = Problems
Increasingly, WLANs are becoming
mission-critical infrastructures to organizations, to the point that wireless networks
are becoming the primary access layer in
many environments, says Mark Tauschek,
Info-Tech Research Group associate vice
president, infrastructure research practice. In other words, the vast majority of
users are relying on wireless connectivity
and not plugging blue cables in, he says.
For many enterprises this means simply
installing an AP (access point) in a boardroom for Wi-Fi access during meetings
isnt enough anymore. Workers need and
expect throughput in all locations.
Not long ago, network administrators
could roughly plan for one wireless device per user. With widespread tablet and
smartphone adoption, Tauschek says to
assume each user will now connect up to
three devices. As wearables, such as smart
watches, catch on and begin connecting


via Wi-Fi rather than predominately via


Bluetooth, he says, networks will take on
even more of a burden. In coming years,
one user could be connecting a tablet, a
smartphone, one or two wearables, and a
laptop, Tauschek adds.
Today, even small companies are upgrading their WLANs to multiple APs
to handle more devices. Chris DePuy, Dell'Oro Group vice president, says in terms of capacity, Dell'Oro has observed 50 to 100 connections on high-capacity APs. Once the number of high-bandwidth-consuming users tops 10 or so, however, some APs run out of capacity, he says, although this can vary widely depending on the AP.
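As a rough illustration of the capacity math above, the sketch below estimates how many access points a head count implies when each user brings roughly three devices and a high-capacity AP comfortably serves somewhere in the 50-to-100-connection range cited by DePuy. The exact per-AP ceiling and the headroom figure are assumptions chosen for the example, not vendor specifications.

```python
import math

def estimate_access_points(users, devices_per_user=3, connections_per_ap=75, headroom=0.25):
    """Rough AP count for a BYOD office.

    devices_per_user: Tauschek's planning figure of up to three devices per user.
    connections_per_ap: assumed comfortable load, within the 50-100 range DePuy cites.
    headroom: extra capacity reserved for growth (an assumption, not a standard).
    """
    total_connections = users * devices_per_user * (1 + headroom)
    return math.ceil(total_connections / connections_per_ap)

# Example: a 400-person office planned this way needs roughly 20 APs.
print(estimate_access_points(400))
```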
Although mobile devices dont typically use an enormous amount of data,
Tauschek says there can be significant
chattiness if each user has multiple devices connected, and that will impact performance. Applications can also impact
network performance, because he says
they are more media-rich. Tauschek adds
that developers typically dont consider
bandwidth implications because theres a
perception that bandwidth is not a limitation.

Problems At Hand
Older devices may pose wireless network-related problems, because they can
slow down other devices on the network.
Lee Badman, Syracuse University adjunct
and network architect, says that while
less powerful Wi-Fi radios dont equate
to slow, especially in a well-designed
WLAN environment, where slowness can
occur is from dated radio technology, such
as 802.11b/g, operating in a world thats

moving to 802.11ac technology. Dated


technology can be a real pain, Badman
says, especially where legacy devices that
demand data rates and security protocols
that are no longer supported exist.
Organizations can address this scenario
either by accommodating the old devices,
with the penalty being decreased performance for newer client devices and associated network complications, or they can
disallow the legacy stuff, Badman says.
The latter will hopefully identify better
gear for the people youre saying we will
not support that to, he says.
For many organizations, this situation
isnt much of an issue, Tauschek says. In
fact, most organizations wouldnt even
know that the network was adjusting itself to accommodate older/slower clients.
The only thing you can do to mitigate if
it is an issue is to upgrade older clients,
he says. Suffice to say that b clients will
slow down other clients and the network
in general, while g clients will impact
overall network speed but not other clients
because they contend for time slots at a
lower transfer rate.
In general, its often users that will tip
organizations off that wireless network
performance is lagging. Jim Rapoza,
Aberdeen Group senior research analyst,
encourages workers to not accept that
the network is slow. If poor performance
is making it hard to do your job, IT and
management need to know that.
Most organizations have an NMS (network management system) in place that is capable of monitoring network traffic and performance and viewing when bandwidth consumption is peaking and where bottlenecks and chokepoints exist. If not utilized fully or if missing, however, Tauschek says users who complain loudly and frequently will be the best indicator that the network isn't performing as it should.
In terms of scale, larger organizations
typically face the greatest hurdles with
wireless networks, including having to
scale to the most users and devices and
accommodate the largest areas and most
remote locations. That said, wireless coverage can be a complex issue for any sized
organization in terms of an APs location,
building layout, and building materials
impacting wireless performance. Often,
poor floor plans, multiple floors, wall
thickness, machinery, and other WLANs
located nearby can impact wireless performance.
Mike Fratto, Current Analysis principal
analyst, advises monitoring airspace for
interference from unauthorized APs on
the network, other radios in the 2.4GHz
or 5GHz range, and any other sources
of interference. Work with neighbors to
make sure there are no APs using overlapping channels, he says. APs that overlap
can degrade performance because the
overlapped channel is treated like interference and causes errors over the air. Letting
802.11 radios work out fair access will improve performance for everyone.
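To make the overlapping-channel point concrete, the short sketch below flags 2.4GHz channel pairs that would interfere. It relies on the commonly used rule of thumb that 20MHz-wide channels spaced 5MHz apart need to be at least five channel numbers apart (the basis of the familiar 1/6/11 plan); this is a simplification for illustration, not a substitute for a site survey.

```python
def channels_overlap(ch_a, ch_b, min_separation=5):
    """True if two 2.4GHz Wi-Fi channels are close enough to interfere.

    Channels are 5MHz apart but roughly 20MHz wide, so anything closer than
    five channel numbers overlaps (hence the 1, 6, 11 plan).
    """
    return abs(ch_a - ch_b) < min_separation

# Your AP on channel 6 vs. a neighbor on channel 8: overlapping, expect errors.
print(channels_overlap(6, 8))   # True
# Channels 1 and 6 stay clear of each other.
print(channels_overlap(1, 6))   # False
```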
Microwaves, says Tauschek, are one example of a non-Wi-Fi device that can cause interference. Numerous devices operating in the 2.4GHz range, he says, can potentially cause some interference. Having visibility into the spectrum, from a troubleshooting perspective, via a handheld tool or the AP will put the organization on a better path to identify sources of RF (radio frequency) interference, he says.

Avoid Mistakes
For organizations considering a wireless network refresh or upgrade, performing a thorough site audit up front
that covers device density, analyzing network traffic (applications in use, where
people use devices, etc.), WLAN security,


and other pertinent issues can help them


avoid many problems later.
Matthew Ball, Canalys principal analyst, meanwhile, says some organizations fail to recognize that there's a difference between treating a wireless LAN as an overlay network vs. deploying a converged fixed/wireless infrastructure with single-pane-of-glass management. He explains that overlay networks typically are treated as secondary to a primary fixed network, meaning they lack adequate capacity and coverage. This approach also means two points of management, two security solutions, etc., are required. The converged approach addresses these areas, Ball says. Organizations could also treat both the WLAN and fixed network as the primary network, providing both with adequate resources, he adds.
Today, largely due to BYOD and mobility initiatives, it's recommended that organizations build out wireless networks with a focus on adding capacity via more APs rather than stressing coverage, as was typical in the past. "Inevitably, if you build for coverage your network is very likely to fail," Tauschek says. "Rather than just meet immediate capacity needs, however, anticipate in your wildest dreams what capacity will be years out to reduce the odds of facing issues related to coverage caps, interference, channel allocations, power levels, and more."

For organizations that haven't already, an upgrade to the 802.11ac standard is likely in the immediate future. Compatible with the previous 802.11n and 802.11a standards, 802.11ac adds extra capacity via support for roughly gigabit Wi-Fi speeds and includes added beamforming abilities. (Essentially, beamforming targets signals at clients vs. sending them out across a wide area.) Daryl Schoolar, Ovum principal analyst, says while 802.11ac increases network throughput and handles more users, because it operates in the 5GHz spectrum its radio range isn't as great as that of 2.4GHz radios with 802.11n. Fratto adds that as existing APs are replaced, the 5GHz range will get congested, as well.

Tauschek says with the use of MIMO (multiple input, multiple output) and beamforming, many of the issues that once occurred with latency, delay, and multiple paths related to bouncing signals don't exist today. "You don't see with 802.11n, and certainly with 802.11ac, the same types of issues we used to have with g and a/b before that," he explains.


Overall, Tauschek says that when he identifies a trouble spot (maybe an organization is having difficulty getting coverage to a particular location), "my guidance is to add another access point. The best way to mitigate trouble spots is to add capacity and add an access point. Frankly, that's probably the cheapest way, too."
Elsewhere, Fratto says for organizations
using a controller-based WLAN, distributing controllers closer to APs can take

traffic off the network closer to where it


originates. Some vendors also split off
controller functionality, he says, which
maintains the benefits of centralized control of all APs while terminating WLAN
traffic on the AP and putting it directly
onto the wired network. This model also
maintains survivability because if the AP
loses touch with the controller, it will continue to operate with the last good configuration, he says.
Badman, meanwhile, says the best
thing organizations can do to improve
their WLANs is just ensure that all engineering, installation, or support staff
know what they are doing. From design
to daily operation, skilled expertise is the
difference between constant headaches
and user complaints and a system that
just works, he says. Beyond this, developing a WLAN that enforces and enables
the organizations business operational
goals is key, he says. Afterward, he cautions against doing weird stuff with the
WLAN, such as trying to make all sorts
of consumer-grade junk work on it at the
cost of reliability.
As a starting point in upgrading
WLANs, most experts recommend organizations work with their wireless
vendors, which have tools that can account for building layout, materials, etc.,
and that can model environments and
demonstrate coverage to pinpoint whats
needed to increase coverage areas, create
seamless integration among areas on a
campus, and more.

Boost Your Cellular Signal


GET BETTER RECEPTION AT HOME & IN LARGER BUILDINGS

ALTHOUGH CELLULAR coverage has improved substantially over the past few
years, there are still gaps in certain areas
that result in degraded or lost signals. And
in office buildings, sports arenas, and other
large facilities, there are often so many devices competing for a signal that performance isnt as good as it could be. If your
organization has such an environment, its
good to analyze the geography and determine what technologies or strategies will
work best to improve the cellular signal
overall and provide users with the best experience possible.

How To Measure The Signal


The signal from a cell tower is typically
measured in dBm (decibel-milliwatts).
This measurement is translated visually
into the bars that appear at the top of
your phones screen. The more bars, the
better. If you care to dig deeper, you can
find the actual dBm measurement, says

Todd Landry, vice president of product


and marketing strategy at JMA Wireless.
To do this, go to the network settings
on your Android device or, if you have
an iPhone, you may be able to download
apps that will tell you what your signal
strength is.
Landry explains that a good signal for a phone typically measures around -65dBm to -90dBm, and an excellent signal measures at or beyond -60dBm. However, he also points out that as the radio technology in phones has gotten better, these devices are able to derive a quality signal from a lower dBm. For example, your phone might have a -113dBm signal, but you could still be experiencing a relatively high-quality connection. That's important to keep in mind because it gives you a bigger range to work with when trying to improve a signal, whether it's at home with your own device or on your company's campus.
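Using the thresholds Landry describes, a simple classification of a dBm reading might look like the sketch below. The category boundaries come straight from his figures, and the caveat about newer radios is the one he raises, so treat the labels as rules of thumb rather than hard limits.

```python
def classify_signal(dbm):
    """Label a cellular reading using the rough thresholds quoted by Landry.

    dBm values are negative; closer to zero means a stronger signal.
    """
    if dbm >= -60:
        return "excellent"
    if dbm >= -90:
        return "good"
    # Below -90dBm: weaker, though modern radios may still deliver usable quality.
    return "weak (but newer phones may still perform well)"

print(classify_signal(-55))   # excellent
print(classify_signal(-75))   # good
print(classify_signal(-113))  # weak (but newer phones may still perform well)
```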

Boosting The Signal


In Your Home
Landry points out that if you simply
cant get a reliable signal in your home,
there are some devices on the market
that can give you a boost. For example,
WeBoost is a small antenna that is
mounted outdoors, which is then cabled to
a powered device placed indoors, he says.
This product actually increases the signal
from the local cell tower in your area, but
you have to be careful to make sure it supports the type of service you have, the carrier you use, and whether or not you have
4G/LTE capability.
Landry recommends that home users dealing with weak cellular signals start by speaking with the carrier to find out if they can help solve the problem before opting for a third-party solution. If you do go with something like WeBoost, you need to first tell your carrier, "because there are actually FCC regulations on transmitting, including boosters, on regulated radio frequencies," he says. And it's also important to remember, as previously mentioned, that cellphone technology is improving on a yearly if not monthly basis, so if your current device consistently fails to retrieve a good signal, then it may simply be time to upgrade your device and take advantage of new radio technology innovations.

Boosting The Signal In Larger Buildings

"Today, a good DAS [distributed antenna system] has the flexibility to adapt to a wide range of different environments. . . . Examples include combining different remote units with different power levels to fulfill larger or smaller areas; delivering DAS service to a combination of both indoor and outdoor areas, such as parking areas; and centralized DAS systems that make it so carrier cell equipment can be located offsite . . ."
TODD LANDRY, Vice President of Product & Marketing Strategy, JMA Wireless

There are two main technologies used to boost cell signals in buildings, arenas, and other larger facilities: small cells and DAS (distributed antenna systems). Landry says that small cell technology is a better fit for smaller buildings or offices that are less than 50,000 sq. ft. but usually no more than 100,000 sq. ft. For facilities that are around 100,000 sq. ft. up to 500,000 sq. ft., or even larger, Landry recommends that companies look into DAS solutions. It's essential to match the implementation with the size of the building or area, otherwise you could still end up with gaps in coverage that will result in a less than ideal user experience.

"It's important to note that small cells are typically limited to single (or a small number of) bands or frequencies, and in deployments [small cells] are limited to per-carrier scenarios," says Landry. "Therefore, if your venue needs broad coverage for multiple bands and multiple carriers, a distributed antenna system is the right answer. Note that [DAS] solutions are quite different from a home booster as they are much more sophisticated and offer much more control over delivering signals into a building. In every case, these are deployed with carrier participation."

Distributed Antenna Systems

Focusing specifically on those larger implementations, distributed antenna systems give large facilities and outdoor arenas a way to not only boost the coverage areas, but also the capacity. While it's possible to use a DAS to simply boost an existing signal, Landry says that in most cases, when it's deployed, a new set of cells is also employed. This means that in addition to installing antennas around the facility, "a good DAS solution will also bring in a dedicated cell or sector of a cell that is applied at the front of the DAS to enable the facility with good coverage and capacity," he says. In other words, you'll get much better data performance to go along with those extra bars.

The interesting thing about DAS is that a single implementation can be used to combine multiple carrier signals for the benefit of users. Landry says that a DAS unit is typically composed of a POI (point of interface) that works with equipment from cellular carriers, a master unit that does the work of combining and managing those signals, a remote unit that provides power to the antennas, and the antennas themselves, which emit and receive the RF [radio frequency] from the mobile phones.

Landry cautions that when looking for a DAS vendor and installer, it's important to make sure they can provide all four of these components if you need them, because some vendors offer some but not all of these components.

Depending on the size of your facility or how many buildings are located on a campus, you'll need to choose between implementing small cell technology or a DAS (distributed antenna system).

Software-Defined Networking
IS IT A GOOD FIT FOR YOUR COMPANY?

SDN (SOFTWARE-DEFINED networking)


is a major topic of conversation in the
IT world because it promises to give
companies more flexibility and agility
in their networks. Implementing SDN,
however, isnt just a matter of putting a
solution in place and turning it on. This
networking approach requires not just
a technological shift, but also a shift in
culture and staff expertise. What this
means is that while many companies
could benefit from SDN, it may not
necessarily be a fit for every organization. Whether SDN is ideal for your enterprise ultimately depends on the size
of your business, the current state of its
network, and your operational goals for
the future.

Larger Organizations Benefit


Most From SDN
SDN is the type of technology that
is built for scale, so the larger your net-

work, the more critical it is to your business, and the more you spend on it,
the bigger benefit youre going to get,
says Andrew Lerner, research director
at Gartner. Some of these benefits include a more agile and easier-to-manage
network as well as reduced OPEX and
CAPEX. Lerner says SDN also brings
on another, less obvious benefit in that
it enables market innovation. Because
you can do things faster and in more
unique ways, you can differentiate yourself in the market and offer more valuable services to your customers.
Lerner admits that SDN isnt restricted only to larger organizations and
that some smaller companies may benefit from its implementation. However,
because SDN is such a major technological change, it requires a different approach, mindset, and skill set that SMBs
(small to midsize businesses) simply
may not have in place. If you think

about it, what are the organizations that


have the technological wherewithal to
do that? Lerner asks. Its not your
SMB. Its your carriers and very advanced and large organizations. At this
point in the market, the folks who stand
to benefit the most from it are often the
folks that actually have the technical capability to do that.

Its A Big Technological Change


From a pure technology perspective, SDN can fundamentally alter the
way your network operates and how
the rest of the business interacts with
the networking approach. Lerner says
SDN essentially makes a network more
dynamic, more programmable, more
agile, and easier to manage. Rather than
having the network be its own separate
entity that applications simply utilize,
you can tie your applications to the network and improve overall integration.


"If you're an organization and your network is running perfectly fine, is highly automated, and highly agile because you've either done some stuff yourself or are using some automation tools, you may not necessarily need SDN. If you are an organization that is wanting to tie your application workloads much closer to the infrastructure so that the applications can move within the infrastructure and have it proactively adapt, that's a different conversation. Where you are and what your pain points are today in terms of your environment are really important."
ANDREW LERNER, Research Director, Gartner

For example, Lerner says, companies can connect their cloud management platforms directly to the network
so they dont need to make as many
manual changes. He adds that the same
idea can be applied to multifaceted applications with different networking
needs. Instead of having one base network for doing everything, an SDNenabled network can essentially adapt
to a given application and give it access
to the specific resources it needs.
Take an application such as Microsoft Lync, which can do voice calls, video calls, desktop sharing, instant messaging, and more. "Those all have dramatically different networking requirements," Lerner says. "The way networks are built today, it's just one network, and that traffic would all follow the same path. With SDN, that Lync server could send an API call to the SDN controller and say 'this is voice traffic, so I want to apply this set of policy to it,' or 'this is video and this is instant messaging, so I want to go this other way.' It creates this opportunity to dynamically change the network to the needs of applications, which is something we don't have in today's traditional networks."
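The Lync example describes an application telling the controller how to treat each traffic type through a northbound API. The sketch below shows roughly what such a request could look like; the endpoint, field names, and policy values are purely illustrative stand-ins, not the API of any particular SDN controller or of Lync itself.

```python
import json
from urllib import request

# Hypothetical northbound API endpoint; a real controller (and its schema) will differ.
CONTROLLER_URL = "http://sdn-controller.example.local:8181/policies"

def apply_traffic_policy(app_name, traffic_type, dscp, path_preference):
    """Ask the controller to treat one traffic class from an application differently."""
    policy = {
        "application": app_name,
        "match": {"traffic_type": traffic_type},
        "actions": {"dscp": dscp, "path": path_preference},
    }
    req = request.Request(
        CONTROLLER_URL,
        data=json.dumps(policy).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return resp.status

# Voice gets a low-latency path; instant messaging can take whatever is left.
apply_traffic_policy("lync", "voice", dscp=46, path_preference="low-latency")
apply_traffic_policy("lync", "instant-messaging", dscp=0, path_preference="best-effort")
```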
Just because SDN is a big change in terms of technology and introduces much more automation doesn't mean that you won't still require the deep networking knowledge of an IT team. What your organization will also require is a shift in thinking where your IT team gains cross-functional knowledge and a better understanding as to how the network connects to different parts of your infrastructure. "That's the evolution from a traditional IT networking person," Lerner says.

Lerner adds that networking practitioners are traditionally risk-averse, because


their primary incentive has always been
uptime. He says that network administrators arent typically rewarded for
saving money or being innovative, but
they may be punished or even fired if
the network ever breaks. For that reason,
many organizations have a culture of
evolution with incremental change
rather than pure revolution, which flies
in the face of what SDN is, Lerner says.
By its very nature, its a new way to design, build, and operate networks.
Thats why its so important to ease
into the implementation of SDN. You
have to make sure your company's infrastructure is ready for the shift and
your workforce will be able to adapt to
the changing roles and responsibilities.
Youre requiring IT organizations that
have operated in silos for a long, long
time to make a change, says Lerner.
Its easy to say you need to tear down
the silos, but youre dealing with peoples lives. Many people are resistant to
change, so you have deep staffing and
cultural issues.

Potential Challenges: Staffing & Culture Issues
On the subject of a change in thinking, Lerner says the No. 1 barrier to implementing SDN is typically staffing and
the fact that it requires a different skill
set. If you have a very traditional networking organization, you have to recognize that its going to be tough, youre
going to need to evolve your teams
skill sets, and its going to require crossfunctional communication, says Lerner.
Its not just a network thing anymore.
Youre going to have to pull in your application team and the rest of your infrastructure team. If youre not doing that,
theres going to be more friction to go
to SDN. If you are doing that, it doesnt
mean its a slam dunk, but youre at least
further along the way.

Not All Companies Need SDN

In the end, deciding whether to implement SDN comes down to how much your company will benefit from the change and whether your company really needs SDN. Lerner says companies with networks that are running perfectly fine
and are highly automated and highly
agile may not need SDN. In fact, introducing a major technological overhaul could upset the network balance
youve already established. However, if
youre looking to improve the way your
network operates, increase agility, and
improve overall management, then SDN
may be a fit for your company. You
have to understand the strengths and
weaknesses of your current networking
approach and determine if a move to
SDN would benefit both your IT team
and business.

Better Understand Enterprise Risk


MITIGATING RISKS REQUIRES COMPANY-WIDE POLICIES & A RISK-AWARE CULTURE

KEY POINTS
• Risk and security are actually business issues, so the business should manage them and have IT put solutions in place.
• Data loss and loss of brand reputation are among the most crucial issues you'll face if you don't take a risk-based approach.
• Consider implementing GRC platforms to better handle risk and compliance, and put strong quality assurance processes in place.
• Security and IT teams need to use business terms when communicating risks so execs and managers understand the importance of fixing those issues.


WHILE THERE ARE quite a few areas where


business and IT groups tend to have a disconnect, risk management is one area that
doesnt always get enough attention. Risks
are everywhere in the business world, and
they can negatively impact companies in
a variety of ways. If companies really want
to mitigate risks and avoid major issues,
then business and IT groups need to come
together and communicate. The first step
in ensuring your company has the right
approach for viewing and managing risk
is to make sure employees in all departments understand what their roles are in
the process.
Michael Versace, global research director at IDC, stresses that companies need
to have a strong culture in tone, from
the top of the organization, on everyones
role, which means that executives and
members of the IT team need to be on
the same page and understand that risk
touches every part of the enterprise. You

have to start by building this strong risk


culture because if it doesnt exist, then
whatever process, procedure, or technology you use underneath that is fundamentally going to be less successful,
Versace explains.
Its important to point out that even
though this risk culture has to start from
the top, it shouldnt just be executives
pontificating risk from on high, says
Renee Murphy, senior analyst, security and
risk, at Forrester Research. She adds that
everybody should be contributing to the
risk portfolio rather than one group being
in charge of everything. There has to be
some kind of universal definition for what
risk is and what constitutes risk, because
it becomes a problem when IT sees risk
and security differently than other parts of
operations, Murphy adds.
To help solve this problem, Versace
says companies need to make information security a business quality priority

and understand that its not something


that simply sits in technology. Companies
need to make sure that risk is a priority
and is included in every single policy and
process to minimize it as much as possible.
It also doesnt hurt to have regular conversations in the organization to make sure
everyone is on the same page.
If you want everybody, at least in IT, to
understand risk and have a risk-aware way
of working, I think you have to put committees together where youre all sitting at
the same level having the same conversation about risk and how it applies to your
downstream work, says Murphy. You
have a technical operations manager there.
The CISO is there. You have the help desk
guys there. And youre hashing out all the
risks. Thats a good way to get that started.
You certainly cant do it by yourself in security or operations.

Understanding Risk
Once conversations among the business
and IT groups begin, you can move on to
establishing the definition of risk and work
to spread that idea throughout the organization. The first step, according to Versace,
is to understand that risk and security are
actually business issues and not software or
technology issues. For that reason, he recommends companies consider moving security out of IT and making it a function of
the business instead. Using this approach,
the business can pinpoint enterprise risks,
and IT can help put software and technology in place to minimize those risks.
Ultimately, the end responsibility for security and risk, with incentives and compensation models around it, has to be established at the line of business level, says Versace. These folks have to have the ultimate endgame responsibility, and it's their responsibility really to bring development, operations, and security together on solving those problems for their products, services, and ultimately the line of business that they operate.
"[Companies need to make] information security a business quality priority. . . . It's not something that simply sits in technology. It's something that must sit in the line of business, and we see that. We see a whole bunch of trends indicating how much more business is influential in making technology decisions that relate to risk and security. Organizationally and policy- and process-wise, making these things a business responsibility, priority, and accountability with incentives is really important."
MICHAEL VERSACE, Global Research Director, IDC

Versace also recommends that companies put business training programs in place so their security and operations staff


can get a deep understanding of the products, services, customers, competitors, and
the market, in general. Then, they can use
training and educational resources to make
sure everyone is focused on protecting the
business. This can also lead to social collaboration programs where employees on
the business side and the IT side can communicate better and exchange ideas for
how to manage risk.

The Dangers Of Not Taking A


Risk-Based Approach
Without having a proper risk management strategy in place, you could be
opening your company up to dozens of
potential problems that will negatively impact your business in different ways. For
example, Versace says perhaps the most
obvious and fundamental outcome of not
managing risk is financial loss, whether
thats loss of business, loss of customer
trust, inability to meet deadlines, or loss of
position to competitors.
The idea of financial loss also leads to
potential loss of brand recognition and
trust. If someone decides to attack your
company and steal information, you will
end up losing reputation and your brand
could take a hit it wont be able to come
back from. Versace says it can be so damaging to find yourself on the front page of

a newspaper for a data breach that it may


cause a significant loss of value for the
company in the market.
In addition to financial loss and a damaged reputation, you also have to consider how a failure due to improper risk
management will impact your workforce.
Organizations tend to be less productive
when there are a lot of actions that have
to be carried out as a result of some failure
in a risk process, Versace explains. And
thats why its crucial that everyone within
an organization, regardless of position,
should constantly think about risk.
The concept of risk management is built
on the idea that every business decision
you make, every policy you put in place,
and every solution you implement should
be done so as to mitigate risk in one way
or another. Even if you think about something as simple as setting up a phone line,
you put that technology in place to avoid
the risk of customers or clients not being
able to reach you. It sounds obvious, but
not making risk a part of doing business
could make you vulnerable to attacks, data
loss, or other security issues.
For example, Murphy says, if your company is in the health care industry and you don't have a DLP (data loss prevention) system in place, you probably have leaks you don't know about. Data loss doesn't always come in the form of a virus stealing information or a person hacking your systems to see sensitive data. It can sometimes happen as a result of human error. That's why Murphy recommends companies keep risk in mind, regardless of project or process.

"Understand your role in the organization and leverage roles where you don't have a lot of strength," says Murphy. "Let's say I have a risk program and I think it's running pretty well, but I want to monitor all of the remediation. You should be tapping your audit department for that. They should be telling you whether your risk is appropriate based on the outcome of your remediation efforts. If you're not leveraging audit and operations, because they have a ton of data you need, and you're not leveraging security, and you're not all coming together to have that conversation about risk, then you're not a risk-centric organization."
If youre not doing this to mitigate risk,
why are you doing it then? Murphy asks.
The only reason you do anything should
be to mitigate risk. You have a risk and you
want to mitigate it, so you create a control.
Then, audit goes in and tests that control
to make sure the mitigation works. If you
ever have a control that isnt tied back to a
risk, you are literally doing it for no reason,
and thats the danger of not taking a riskbased approach to security.

Technologies & Strategies


To Consider
After you have the policies and concepts
established, then you can start looking at
the technologies you can implement to aid
in the risk management process. Murphy
says that some companies use survey tools
to get feedback from employees as to what
systems, processes, or potential issues keep
them up at night. Theyll give you this
litany of stuff and you just throw it into a
registry, she says. And with that registry,
you can start pinpointing areas that require


attention and then start figuring out what


technology to use.
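A survey-fed risk registry like the one Murphy describes can start out as nothing more than a scored list. The sketch below shows one minimal way to structure and rank the entries; the likelihood-times-impact scoring is a generic risk-matrix convention used here for illustration, not a prescribed GRC methodology, and the sample entries are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class RiskEntry:
    description: str   # what keeps the employee up at night
    owner: str         # business owner responsible for the risk
    likelihood: int    # 1 (rare) to 5 (almost certain) -- illustrative scale
    impact: int        # 1 (minor) to 5 (severe) -- illustrative scale
    control: str = ""  # the mitigation tied back to this risk

    @property
    def score(self):
        return self.likelihood * self.impact

registry = [
    RiskEntry("Patient records leave via email", "Line of business", 4, 5, "DLP policy"),
    RiskEntry("Backup tapes mishandled in transit", "IT operations", 2, 4, "Chain-of-custody log"),
    RiskEntry("Phone system outage blocks customers", "Facilities", 3, 3, "Redundant trunks"),
]

# Highest-scoring risks surface first, pointing at where technology spend should go.
for entry in sorted(registry, key=lambda e: e.score, reverse=True):
    print(f"{entry.score:>2}  {entry.description} -> {entry.control}")
```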
For example, if your company has to
consider regulatory compliance, you might
want to implement a GRC (governance,
risk, and compliance) platform that allows you to put all of that in place, put
workflow around it to resolve it, and do incident management and tracking through
it, Murphy says. You can also find GRC
platforms that include financial transaction
and security transaction monitoring to
get an in-depth view of your organization
and potential risks. They can be specific
to industries, and there are some related
to healthcare and financial, and there are
others that are like a Swiss Army knife that
you can use for everything, Murphy says.
However, its also important to remember that software doesnt solve everything. If you put risk management systems
in place, but you have no one in-house
who can validate or understand how to use
them, youll be hard-pressed to actually get
anything useful from them. Versace says
that every company with a significant IT
component needs to have a strong QA
(quality assurance) function and a set of
structured programming quality standards
that are evaluated, tested, and validated.
This means that when risk management

systems or any other solution are put in


place, they need to be checked for QA
throughout the development, testing, and
production processes, he says.
Versace stresses the importance of including QA as a part of the business continuity process as well, which can also
involve what happens when you make
changes to infrastructure. He says you have
to be able to evaluate how changes to an
operation or application or even to a significant business process might impact the
ability of the firm to recover from a minor
problem, like a software or server failure
or a storage array running low on space,
or a major failure that could be related
to a major issue in the data center or loss
of power.

Communication Is Key
Although the strong risk-aware culture
has to come from the top of the organization, there is also a huge amount of responsibility on the security team within
an organization to properly communicate
its concerns to executives. Murphy says that too often executives in companies are warned about security issues, but because the warning isn't given in business terms, it falls on deaf ears. It's important that the security team learns how to speak to the business and that the business listens to what the security team has to say.
"Even when you do bring them the risks, if management doesn't have an understanding, then that, to me, is the biggest disconnect," says Murphy. "Once you understand the risk, you have to understand how it actually feeds into the corporate risk and figure out how your security solutions and your security risks impact that business." Then, she adds, you can point it back to the business and have business conversations with the rest of the world. "You can have your conversations about security and technology internally, but the minute you take that out of your own department, understand that you need to speak in business risks, because technology risks will never win the day."

Long-Term Enterprise Storage


COMPARE THE ADVANTAGES & DISADVANTAGES OF TAPE, DISK & CLOUD STORAGE

KEY POINTS
• Tape is still the least expensive and one of the most reliable long-term storage options.
• High-capacity disk storage is a great fit for primary and secondary backup, but it isn't necessarily a fit for archiving in regard to price, as archives don't require quick recovery times.
• A disk and tape hybrid strategy is often best because you get the quick recovery time of disk for primary backup and the low cost and reliability of tape for archiving.
• Although long-term cloud storage is a hot topic, you often won't know whether data is stored on tape or disk due to abstraction.


WHEN IT COMES TO long-term storage,


most companies are content to find the
least expensive option with the highest
reliability and maintain it for as long as
possible. Although this approach certainly makes sense from a cost standpoint,
its also important to survey the storage
market to determine whether there are
other potential technologies that might be
better options in the future, or to identify
alternatives to the tried and true method
you already have in place. But before you
actually decide what type of storage medium you want to use, its best to understand what long-term actually means in
this context and what types of data qualify
for long-term storage.
John Sloan, Info-Tech Research Group
director, says that different companies will
define long-term storage in different ways
and it ultimately comes down to understanding the difference between an archive
and a backup. A lot of organizations get

into the practice of keeping all of their


backups forever and calling it their archive,
but thats not really an archive, says Sloan.
An archive is as much about deciding
what you can get rid of as deciding what
you can keep. An archive is where you look
at defining what documents, records, and
materials need to be kept for compliance
reasons or historical record, and making
sure a copy of those is kept. Just having
complete backups and keeping them forever is not an archive.
Using this thought process, youll be
able to determine that the proper use of
long-term storage is almost entirely based
around archives and the only reason
you should indefinitely keep backups is
if youre under some kind of legal requirement to do so, says Sloan. If not,
then youre free to keep your archive on
tape, disk, or in the cloud, depending on
the needs of your company. Each medium has its own unique benefits and its

own potential use cases that will make the


most sense for your business.

Tape Storage
Tape is by far the longest-standing type
of long-term storage available today. Even
though it has been around for about 60
years, a majority of companies still use the
technology today for archiving and other
long-term storage needs. Because it has
been in use for so many decades, vendors
have practically perfected the technology
to be cost-effective and reliable, but that
doesnt mean the technology is stuck in
neutral. Sloan says that many vendors are
adding LTFS (Linear Tape File System)
formatting to their tape storage solutions,
which essentially makes it much faster to
restore media than it was in the past because you can find out very quickly whats
on the tape and then pull it up.
In addition to being inexpensive, sometimes costing as little as pennies per gigabyte to store data, tape is popular for
its reliability and longevity, but only if it's properly maintained. "Tape has a long shelf life, and it can hold data for up to 30 years, which makes it ideal for longer-term storage," says Sloan. "But there is also a component of actually managing the media, moving it from place to place, putting it in proper storage, and having it properly indexed. And that has to be done well in order to get the value of tape as a long-term storage medium."
This is where tape starts to get a bad
rap in some circles. Sloan says hes talked
with organizations and heard horror stories about pulling up an important file
from a tape only to find it was corrupted.
However, he points out that this is often
caused by the user and not necessarily the
medium. Often the issue is not with the
tape itself, because the tape should be able
to last 30 years, he says. It may well have
to do with human error in the backup process or with the actual physical handling
and storage of the tapes.
"Depending on what your compliance requirements are, the cloud provider may be using tape and you're dealing with that abstraction. Some organizations have very stringent compliance requirements, down to knowing what disk the file is on, who has access to it, and that sort of thing. With your own tape library, it's much easier in an audited compliance situation to be able to show this is where it is, this is who has access to it, and this is how it's retrieved."
JOHN SLOAN, Director, Info-Tech Research Group

One last caveat when it comes to tape is that many people make the erroneous assumption that tape has no cost over that 30 years of shelf life, Sloan says. But the truth is that even though that tape won't cost you anything in terms of electricity over 30 years, or however long you store it, there are other costs to consider.
"The software that you use to do backups is going to change and be updated," says Sloan. "The tape drives and formats that you use are going to continue to develop. It's really unlikely that you're going to have a tape sitting there for 30 years without touching it. You're likely going to have to refresh that tape library as new technology comes along, so that tape might be refreshed or transferred to a different format, and that all has a cost."
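Sloan's point that tape is not free over its shelf life lends itself to a back-of-the-envelope model. The sketch below compares media-plus-refresh costs for an archive over 30 years; every number in it (media prices, refresh interval, migration cost) is a placeholder assumption chosen only to show the shape of the calculation, not market pricing.

```python
def archive_media_cost(tb, cost_per_gb, refresh_every_years, migration_cost_per_refresh, years=30):
    """Very rough media + refresh cost of keeping an archive for `years` years.

    All inputs are assumptions supplied by the caller; the point is that refresh
    cycles and migrations add cost even when the media itself sits idle.
    """
    refreshes = years // refresh_every_years
    media_buys = refreshes + 1                      # initial purchase plus each refresh
    media_cost = tb * 1000 * cost_per_gb * media_buys
    return media_cost + refreshes * migration_cost_per_refresh

# Placeholder figures: 500TB archive, tape at $0.01/GB vs. disk at $0.03/GB,
# tape refreshed every 10 years, disk every 5, plus a flat migration cost.
tape = archive_media_cost(500, 0.01, 10, 5000)
disk = archive_media_cost(500, 0.03, 5, 5000)
print(f"tape: ${tape:,.0f}  disk: ${disk:,.0f}")
```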

Disk-Based Storage
Another long-term storage option companies tend to consider is high-capacity
disk, but that doesnt necessarily mean they
should use it for archiving and similar use
cases. Tape got a bad rap, and somewhat
deservedly, in the 80s and 90s, as being
slow, dusty, and horrible to work with,
says Mark Peters, practice director and senior analyst at Enterprise Strategy Group.
There are many people who still view it in
that light, despite that not being the truth
these days. There is an emotional detachment from tape and an emotional attachment to disk, which leads to many people
incorrectly storing long-term data on disk,
even though thats probably a bad idea.

Perhaps the biggest knock against using


disk for archiving and other long-term
storage needs is cost. Peters says that in
the foreseeable future, tape will remain
less expensive than disk and in the road
maps for both disk and tape, the lines dont
cross. The reason why the extra cost of
disk doesnt make sense in an archiving
scenario is because you dont typically need
to pull that information up quickly, and
that stored data usually isnt accessed on a
regular basis.
Where disk really shines is with shorterterm backup scenarios. Sloan says the
strength of disk has always been on the restore side, which makes it more of a shortterm storage solution for more frequently
accessed data. Where youre seeing the
most displacement has been in that primary backup, he says. Thats where disk
has really sort of made its mark and replaced tape on that primary backup side.
But I dont see it as much on the longerterm cold storage going to disk. Its just not
as popular.

Tape & Disk Hybrid Approaches


If your company already uses tape for
archiving, but you also want to use disk
storage for primary and/or secondary
backup, then it makes sense to go with a
hybrid tiered approach to how you store
your data. Peters says that some data essentially goes from frequently used, to
only used every once in a while, to rarely

if ever used, which means the importance of recovery speed for that data becomes less and less important over time. For that reason, companies often use D2D2T (disk-to-disk-to-tape) or flash-to-disk-to-tape backup solutions in order to maximize the benefits of each specific storage medium and keep storage costs in line.

"There is always room for more tiers. Life isn't as simple as 'I like that shirt, I wear it every day because it's new,' and then 'I never wear it again but I'm keeping it in case the grandchildren want it.' Life moves from 'I wear this shirt every day because it's new and I like it,' to 'hey, I still wear it pretty often,' to 'hmm, it's seasonal,' to 'hmm, I wear it when I do yard work,' until eventually it gets put somewhere else. There are lots of grades of usage."
MARK PETERS, Practice Director & Senior Analyst, Enterprise Strategy Group
Peters says companies usually start
with flash or disk storage as their primary
backup because of the immediate recoverability where, if theres a failure, it is
quicker to get that going from disk. But
if that data is not in use for longer periods
of time, many companies will choose to
trickle it off or tier it down to tape. On one
hand, this is an effort to save money, because from a total cost of ownership
perspective, its much cheaper on tape,
says Peters. On the other hand, tape is still
often used at the bottom tier of the storage
ladder because, Peters adds, despite what
people think, tape is more reliable longterm than disk.
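The tiering Peters describes (flash or disk for fresh data, tape once it goes cold) is often implemented as a simple age- or access-based policy. The sketch below shows one such policy in miniature; the thresholds are arbitrary examples, and real backup suites express this through their own retention settings rather than hand-written code.

```python
def storage_tier(days_since_last_access):
    """Pick a tier for a backup set based on how recently it was touched.

    Thresholds are illustrative; the pattern mirrors D2D2T or flash-to-disk-to-tape.
    """
    if days_since_last_access <= 30:
        return "flash/disk (primary backup, fast restore)"
    if days_since_last_access <= 365:
        return "high-capacity disk (secondary backup)"
    return "tape (archive: cheapest and most reliable for cold data)"

for age in (7, 120, 1800):
    print(f"{age:>4} days -> {storage_tier(age)}")
```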
"These days, if you look at the raw reliability of tape, in other words the likelihood of uncorrectable bit errors, that likelihood is much lower on tape than it is on disk," he says. "We place other software or code in front of disks, RAID specifically, to mitigate some of those problems, so I'm not suggesting that you see problems all the time on disk, but the fact is that they still break, and as much as we can mitigate it, if you have long-term storage of something that's not going to be used very often, it's probably smart to put it on the most reliable media rather than something that's more prone to error."
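Peters' reliability point can be made with a quick arithmetic check on unrecoverable bit error rates. The figures below (around 1 error in 10^15 bits for a typical enterprise disk spec and 1 in 10^19 for recent LTO tape) are commonly published order-of-magnitude numbers and should be treated as assumptions for the comparison rather than guarantees for any particular product.

```python
def expected_unrecoverable_errors(terabytes_read, uber):
    """Expected unrecoverable errors when reading a given volume at a given UBER."""
    bits_read = terabytes_read * 1e12 * 8
    return bits_read * uber

# Reading back a 100TB archive once:
print(expected_unrecoverable_errors(100, 1e-15))  # disk-class spec: ~0.8 expected errors
print(expected_unrecoverable_errors(100, 1e-19))  # LTO-class spec: ~0.00008 expected errors
```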

Cloud Storage
Sloan and Peters agree that companies
need to look at the cloud as a delivery
method or a consumption model rather
than an entirely new storage medium. The
reason why its important to make this distinction is because the cloud is abstracted
and a cloud service could very well be
using tape, but you just dont know whats
on the back end, says Sloan. This could be
a benefit for your company because you
dont have to worry about what medium

your data is stored on as long as its accessible, but for other companies, this abstraction could be a disadvantage, especially
when it comes compliance and auditing
where you have to know exactly where
data is being stored at any given time.
Companies may also look toward the
cloud for long-term storage because as
Sloan says, there is a veritable price war
going on with major public cloud providers. But just because the cloud is the
least expensive on paper, doesnt mean
it will always be that way. Those incremental costs add up over time, says Sloan.
People want to use it as a long-term
storage option, but when they make out
the numbers, they realize its actually going
to cost them a bit over time. The cloud has
not always been the most economical solution in the long-term.
Whether you go with the cloud or stick
with a more traditional storage medium,
its important to compare feature sets
and find the technology that will fit your
needs. Dont forget that there is no such
thing as the cloud, and what I mean by
that is that everything about tape and disk
applies to the cloud, says Peters. The
only difference is that the cloud is owned
and operated by someone else. They will
be, and should be, going through the exact
same decisions.

OTHER STORAGE OPTIONS

Mark Peters, practice director and senior analyst at Enterprise Strategy Group, points out a few other long-term storage options for companies to consider, even though they aren't widely used by any means and aren't necessarily in the short-term conversation for competing with tape, disk, or the cloud. One example is a disc that essentially looks like a Blu-ray but is made out of stone and has a guarantee to hold data for 1,000 years.

Peters also mentions that many large organizations, such as Facebook, are pushing the long-term storage industry forward because they're asking for less expensive flash storage to store the massive amounts of data that they need to keep but that doesn't need to be accessed on a regular basis. These requests are essentially promoting R&D in the storage industry and pushing companies to try to find new technologies that could someday replace tape or disk, although it's important to keep in mind that, at least for now, such a technology isn't on the horizon.


Social Media For Professionals


SOCIAL OUTLETS CAN BENEFIT MARKETING, RECRUITMENT, SUPPORT & MORE

ALTHOUGH SOME organizations have


fought against the use of social media in
the workplace for some time, networks
such as LinkedIn and Twitter are blurring the lines between business and personal use. In fact, some companies have
gone so far in the other direction that
they encourage the use of social media to
aid in marketing campaigns, CRM (customer relationship management) interactions, and other business tasks. Social
networking is unique in that individuals
can use it to further advance their careers
or make the entire company more successful. The key is to understand how professionals are using these different outlets
and determine whether such use cases
could fit into your environment.

Networks Professionals Use & Why


Jenny Sussin, research director at
Gartner, says that while there are currently
more social networks than ever before,

many businesses and professionals are


turning their attention to LinkedIn and
Twitter and moving away from Facebook,
Pinterest, and Instagram. For Sussin, its
a matter of constituencies and personas
where LinkedIn is supposed to be for
your professional persona and Facebook
is more for your personal persona.
Its you with your suit on vs. you with
your sweatpants on, says Sussin. Because
you might be connected with colleagues,
friends, and family, she says, the types
of posts that go up there and the types of
posts youre looking to consume when
youre on Facebook are different than the
types of posts that go to LinkedIn where
everybody is wearing their suits.
This same idea goes for Twitter, but
the lines are a bit blurrier. While many
people use Twitter for personal interactions, there is a major business component to the service as well, and it can serve
as a way for companies to interact with

customers and for professionals to interact


with one another. With Twitter, unlike
with Facebook and oftentimes unlike with
LinkedIn, theres a lot more discoverability because profiles are default set to
public, says Sussin. There is a lot of networking opportunity there, whereas you
dont necessarily see that in Facebook or
LinkedIn where theyre more closed off."

Growing Acceptance Of Social


Media In The Workplace
Sussin also points out that the acceptance of social media within businesses has changed in the past few years. Where enterprises used to be afraid of social media and tried to block it at every turn, they are now more open to using social media platforms. One reason, according to Sussin, is because many organizations are now building social media clauses into their ethical and press guidelines, so they're covered legally if they need to take HR action. This approach makes it less risky for companies to allow the use of social media in the workplace.

Another reason social media is somewhat more accepted in the workplace is because many companies have essentially thrown their hands up in the air and said OK, because their employees have personal cell phones and don't need the company network to access their social media profiles. "The idea that it's something where people are going to be wasting time is something that still exists in many organizations, but it's becoming less prevalent and more people are opening up," Sussin says.

"Facebook insights and Twitter analytics are nice, but they don't answer all of the questions. If people need free tools, they'll work with those tools. But for a lot of what businesses need, to understand how social actually impacts their business and not just the amount of engagement going on, they really do need heftier tools. A lot of what the social networks offer for free don't cut it in an enterprise environment. For SMBs, it may end up looking a little different, and there may be more interest, but where I'm at with most enterprises, that stuff doesn't cut it for them."
JENNY SUSSIN, Research Director, Gartner

Social Media As A Tool


For Recruitment
Some social networking services have
been known to benefit professionals and
businesses alike. LinkedIn is a perfect example because many professionals use it
as a way to find job openings to apply for
specific positions directly from the site.
Recruiters can use sites such as LinkedIn
to find prospective candidates for open
positions. In fact, Sussin says, companies
use social networks for 30% or more of
their recruiting efforts when compared
with going through agencies or other
traditional methods. LinkedIn profiles,
for example, let recruiters see a persons
work history as well as endorsements and
references from colleagues and peers. Its
unique in that it serves a dual purpose
and gives businesses and individuals access to the information they need all in
one place.

Expertise, Thought Leadership


& Troubleshooting
Building on the idea of using social
media for personal advancement, Sussin
points out that many professionals use
social networking, and specifically services
such as LinkedIn and Twitter, to assert
their expertise in a certain area. There
are other times when people use it for
research purposes, says Sussin. Maybe


theyre looking into a new industry and


want to see what people in that industry
have to say about it. Theres the expertise part of it on both the person whos
asserting their expertise and the person
whos seeking that expertise.
This idea can also apply to product
recommendations and troubleshooting.
By having access to a social network
full of professionals and experts, you
can seek out the vendors to compare
and the tools to buy to complete specific tasks. And if for some reason the
product youre currently using isnt
working as well as planned, you may be
able to find an expert in your network
who can help.

Marketing & Sales


Focusing on the business side of things,
people who work in marketing not only
use social media to improve future campaigns, but also for market research purposes and from an individual point of
view. Lets say I want to run a campaign
and I need to find out what women in
North Dakota are saying about breakfast
cereal, says Sussin. Thats something
they could do through social media.
Sales goes hand-in-hand with marketing, because the goal of any campaign

is to ultimately drive sales up, so it makes


sense that social networking can be used
to help that purpose, as well. Not only
can salespeople use LinkedIn or Twitter
as a way to prospect for potential clients,
but they can also use those sites to maintain business relationships. CRM systems
are ideal for keeping internal records, but
having outlets where you can directly connect to and interact with customers is invaluable for many organizations.

Customer Support
Another great use case for social networking, and one that builds on the idea
of customer outreach, is customer service
and support. The old days of customers
calling in or sending emails asking for
product support or troubleshooting are
fading as many users are now reaching out
to the social media presences of businesses
and expecting a quick response. For that
reason, some companies even have dedicated customer support Twitter handles
where users can go to ask questions and
get answers. In this regard, you need to
look at social networks as two-way streets
where you can reach out to customers and
market to them, but you also need to have
a system in place to support them in the
long run.


THE LATEST PREMIUM ELECTRONICS

A Turbo-Charged Smartphone
WWW.VERIZONWIRELESS.COM
What does a 15-minute charge get you these days? If you're using the Verizon Wireless Droid Turbo by Motorola with its Turbo Charge technology, it gets you eight hours' worth of battery power. And on a full charge, the smartphone lasts for up to 48 hours. You can even charge the Droid Turbo wirelessly. The smartphone sports a turbo-fast 2.7GHz quad-core processor for fast app-switching, and it includes Verizon's Advanced Calling feature so you can make high-definition voice and video calls to other Advanced Calling-enabled phones while continuing to use data. The phone has a 21MP (megapixel) camera with two LEDs for an extra-bright flash, and it can be used to record video at 4K resolution. There's also a
2MP front-facing camera. The Droid Turbo runs Android 4.4.4 (aka KitKat) and includes 32GB or 64GB of storage (no SD
card slot), as well as a Corning Gorilla Glass 3-covered 5.2-inch Quad HD display. The case is sturdy (reinforced by DuPont
Kevlar), water-repellent, certified as energy efficient, and recyclable.


Cool, Quiet & Ergonomically Brilliant


US.COOLERMASTER.COM
For those who have uttered common complaints about their laptop computers ("My laptop gets too hot," or "Typing on my laptop puts a kink in my neck," or "My laptop doesn't have enough USB ports"), Cooler Master offers a solution in the form of its NotePal ErgoStand III. An expert in taming overheated computers and components, Cooler Master makes laptop
cooling a primary feature of the ErgoStand III, equipping the metal mesh device with a quiet 9-inch fan, a dial to control fan
speed, and a removable cover to simplify cleaning. Cooler Master says the device provides up to 72 cubic feet of air flow
per minute and produces just 21 dBA in noise. Ergonomics is another key feature of the device, as its name implies; it features six height settings so it can be adjusted to meet any comfort level. The ErgoStand III weighs a little over 2.6 pounds,
supports laptops with screen sizes up to 17 inches, and includes anti-slip holders so it can be used with a variety of tablets
and other handheld devices. It also includes a USB hub that converts one laptop USB port into four.



Smartphone Tips
A ROUNDUP OF HANDY ADVICE

WINDOWS PHONE
Let The Local Scout
Find Places For You
If you're looking for someplace to eat
or something to do nearby, press the
Search button on your Windows Phone
device and tap the Scout (three buildings) icon, and then flick left or right to
view choices under Eat + Drink, See +
Do, Shop, and For You. For personalized results, choose the For You category. By default, the results are sorted
by distance, so whatever is closest to
you is at the top of the list; tap Sorted
By Distance to choose a different sort
order or to filter the results. Tap any
item to find out more. To find places
in a location you are heading to, find
the location using the Maps app before
launching the Local Scout.


Have NFC? Use It To Share Web Pages


If your smartphone includes NFC (Near Field Communication) short-range wireless functionality, you can (among many other things) share a Web page with other
Windows Phone users, provided they're nearby and also have NFC-capable phones.
First, make sure each user turns on NFC; tap Settings (gear icon), tap Tap + Send,
and move the switch to On. Then, in Internet Explorer, access the Web page you wish
to share, tap More (three dots in a horizontal row), tap Share Page, tap Tap + Send,
and touch the other person's smartphone with your smartphone.

Take Advantage Of Auto-Correct


As with other mobile operating systems, Windows Phone offers numerous automatic text correction features that are designed to speed up typing. When you
reach the end of a sentence, for example, tap Space twice to automatically place a
period at the end of the sentence and capitalize the next word if you should decide
to type another sentence. Windows Phone also adds accents and apostrophes
where it seems they might be needed (for example, it will change werent to weren't, and will even try to determine whether the word well is meant to be left as it is or changed to we'll).

ANDROID
Dont Settle For
The Standard Notifications
If you like the array of items that
appear in your Android smartphone's
Notification panel, you're all set. If
you never want to use a certain feature, however, such as Driving Mode
or Multi Window, or if you'd like to
add a feature to the Notification panel,
such as NFC or Smart Stay, you can
adjust the panel to your liking. Access
Settings and tap Notification Panel
to get started. Youll see two panes at
the bottom of the Notification Panel
setting screen: one for active notifications and another for available buttons. Simply press and drag the icons
from one area to the other for the features youd like to add to or remove
from the Notification panel, and then
exit Settings.

Enable Notifications
For New Messages
By default, the Gmail app only
displays a notification for each new
email thread, not for new messages
that occur within a thread. For those
who watch their email like a hawk,
leaving this default setting turned
on is akin to putting blinders on as
countless new messages roll in unnoticed. To correct this situation,
open the app tray, tap the Gmail icon,
press the Menu key, tap Settings, select an account, scroll to and select
Labels To Notify, and then tap Inbox.
On the Notifications For Inbox window, uncheck the Notify Once setting. Tap OK, and then press the
Back key until you return to the
Home screen. Now you just have to
make sure you keep up with the influx of notifications.

DIY Wi-Fi Hotspot


You can turn your phone into a Wi-Fi hotspot, which lets you connect other
devices, such as tablets and notebooks, to your cellular network. To do this on
an Android smartphone, press the Menu button from the Home screen; tap
Settings, Wireless & Networks, More, and Tethering & Portable Hotspot. Tap
Configure Wi-Fi Hotspot to enable the feature and configure the SSID, security
settings, and more. Return to the Tethering & Portable Hotspot menu and tap
Portable Wi-Fi Hotspot to enable it. Once set up, you can use your other device and connect to the network as you would when connecting to a public or
private hotspot. Most carriers charge an additional fee for tethering; if your cellular contract doesnt support tethering, youll probably just hit a splash screen
that tells you how to add tethering to your account.

Try Google Voice Typing


If you're in a situation where it's easier to speak into your Android smartphone than it is to type, try out the Google Voice Typing feature. While viewing a screen where you would normally enter text, swipe down from the top of the screen to view the Notifications panel, tap Google Voice Typing, and speak into your phone. The text of what you say (or at least the text of what the Google Voice Typing feature thinks you're saying) appears on the screen. If the text is correct, tap Done; you will then see the text you entered and the onscreen keyboard. If the text is incorrect, tap Delete and try again.

Is there something you need to type when typing is inconvenient? Try saying it instead.

Delete Browsing History


Deleting your mobile Chrome browser history eliminates some potential
privacy issues and helps keep your Android device clutter-free. In Chrome,
tap the menu icon, Settings, (Advanced) Privacy, and Clear Browsing Data to
remove browsing history, site data, and related information.


iOS


Reminders Not Updating?

The Reminders app for iOS helps you


sync reminders across linked devices via
iCloud. The app also syncs with Exchange
and integrates with Microsoft Outlooks
Task feature. Although synchronization
is designed to work seamlessly, hiccups
can occur. If changes you make on your
computer or other device don't show up
on your iPhone, there are a few things
Apple recommends to remedy the situation. First, open the Calendar app, tap
Calendars at the bottom of the screen to
show a list of your calendars, and pull
down from the center of the screen to
refresh the calendars; doing this also refreshes Reminders.
If that doesn't work, stop the Reminders app from running and then restart it. To do this, press the Home button
twice and scroll to the right until you see
the Reminders app icon. If you are using
iOS 7, swipe the app icon upward to dismiss it; if you are using iOS 5 or iOS 6,
press and hold the Reminders app icon
until it wiggles, and then tap the circle
containing an X to turn off the app. As
a last resort, you can try the option that
solves so many problems: reboot. Simply
power down your iPhone and then turn
it back on.

Don't See What You're Looking For In Search?

What or who was that you were looking for? Was there something (a document, say, or contact information) stored on your iPhone? No? If you are performing a search on your iPhone (which, by the way, you can initiate by swiping down from any main screen and typing) and don't find what you're looking for as you type the search term, you can tap Search Web to instantly change your iPhone search into a Web search.

Change Your Default Safari Search Engine

When you first get a new iPhone, the default search engine for the Safari mobile Web browser is Google. You can,
however, change the search provider to
Yahoo!, Bing, or DuckDuckGo if you like.
To do this, access your iPhone's Settings,
tap Safari, tap Search Engine, and then tap
your search provider of choice. Changing
these settings doesn't affect other browsers
you may have installed.

Dismiss Currently Running Apps

The iPhone's multitasking feature is great for quickly switching back to a recently used app, but it also opens the door for multiple apps to run in the background. Apps running in the background don't usually hamper performance, but they can, and restarting the iPhone doesn't shut down the apps. To view all currently running apps, press the Home button twice quickly. Swipe left or right to locate each app you would like to shut down, and simply swipe the app's thumbnail screen upward to dismiss it from the list and stop it from running. Doing this does not cause any data loss. Press the Home button again to return to the normal view.


Quickly Enter .com & Other Domains
Since iOS 7, the Safari
browser no longer has a
.com key for quickly entering a domain. With current iPhones, when you
type a Web address directly
into the address bar, as
soon as you reach the point
at which you need to enter
a domain simply tap and
hold the period key. This
brings up a small menu
that lets you choose from
a short list of common domains: .com, .us, .org, .edu,
and .net. If the website you
want to visit uses a different
domain, you'll have to type it manually.

Tap and hold the period key for a selection of domains.

BLACKBERRY

Quick Key Shortcuts

There are dozens of shortcuts in BlackBerry 10 that let you type specific words or press certain keys to quickly perform an associated action. Here are some single-key shortcuts that are especially handy for business users.

Calendar
A - switch to agenda view
C - create new calendar item
D - switch to schedule view
M - switch to month view
S - search calendar items
T - switch to current day
W - switch to week view

Contacts
B - jump to bottom of contact info
E - edit contact information
T - jump to top of contact info

Web Browser
H - access browsing history
I - zoom in
K - access bookmarks
L - refresh the Web page
N - go to next page (forward)
O - zoom out
P - go to previous page (back)
S - search for text on a Web page

Transfer Data From A Non-BlackBerry Device

You can use BlackBerry Link software (available via the BlackBerry website) to transfer data from a non-BlackBerry device to a device running BlackBerry OS 10. If the other device is an iPhone, sync it with iTunes, disconnect it from your computer, download and install BlackBerry Link, connect the BlackBerry using a USB cable, select iTunes as the music source, and access the desired views (Pictures and Video, Music) in order to select and drag files from your computer to the BlackBerry. If you have an Android device or a feature phone and a Windows PC, you can simply connect both the non-BlackBerry device and the BlackBerry to the PC using the corresponding sync cables, open Windows Explorer (click Start, Accessories, Windows Explorer), and then drag files directly from the Android device or feature phone to the BlackBerry.

BlackBerry Link software simplifies the process of transferring data from another device.

Connect To A Hidden Wi-Fi Network
Sometimes you know a Wi-Fi network exists but it is hidden from view.
To access such a network, go to the
Home screen, swipe down, select Wi-Fi, and select the Add (plus sign) icon.
Enter the exact name of the Wi-Fi network in the SSID field, select the type
of network security that is in use, enter
additional information if required, and
tap Connect.

BlackBerry Balance Separates Work & Personal Activities

The BlackBerry Balance feature, which was introduced in BlackBerry OS 10,


lets system administrators easily establish two separate personalities for a
BlackBerry device: one that is maintained and somewhat controlled by the company's IT staff, and one that the user can control. This permits companies to push
work-related apps and updates directly to employees' phones, as well as wipe data
remotely if necessary, while allowing users to install and use their own apps and
data in a separate location for personal use. For the user, switching between these
personalities is as simple as swiping down and tapping either the Personal or the
Work button. In terms of social networking, this work/personal split is helpful
because it enables users to maintain multiple accounts simultaneously. If, for
example, a user has a personal Facebook account but also maintains a company
Facebook account, he can run two separate applicationsone on the Personal
side, one on the Work sideand switch between them quickly.


Quick Cloud Collaboration


KEEPS PROJECTS IN SYNC

AS THE NUMBER OF employees doing


business outside the walls of the traditional office environment increases,
companies of all sizes are adopting new
ways of getting work done. Namely,
theyre moving toward more flexible,
efficient cloud-based services. Although
the purposes of online SaaS (software as
a service) options vary, users are taking
advantage of seamless conferencing, file
sharing, idea generating, and so much
more. Read on to find a service that suits
your collaborative needs.

Take Documents Offline

It seems inevitable that wireless Internet availability determines when and where you edit online documents while you are on the road. But with the help of the right device-specific offline app, you don't have to postpone work until you are within range of a Wi-Fi hotspot. Some basic apps primarily let you read docs offline, whereas more feature-packed options let you edit and save changes to collaborative documents, spreadsheets, and presentations. Microsoft, for instance, provides a solution for offline workers via Office 365's (products.office.com/en-US/business) SharePoint Online. Using the program's MySite tool, you can create copies of documents on your PC and work on them when you are offline. Then, when you connect to the cloud again, SharePoint automatically syncs your work.

Don't Forget Your Webcam

Collaboration is accomplished on an international level these days, which means that face-to-face conversations with globetrotting team members are commonly conducted via LCD touchscreens. Whether you're working on a smartphone, tablet, laptop, or PC, using your webcam as a collaboration tool connects you to colleagues and clients
more intimately than the routine conference call. We suggest using a videoconferencing app or software that
supports multiuser conversations. Some
options let you incorporate shared
whiteboards and simultaneous document editing.

Consider Using File-Sharing Tools


If you need to share documents that
don't contain particularly sensitive
data, you can do so using a file-sharing
service. Most file-sharing services let
you securely upload and store a limited number of gigabytes (2 to 5GB is
common) of data. Some services also
give you the tools to organize your files.
Sharing from your mobile device makes
on-the-go collaboration convenient, so
it's beneficial to check out file-sharing
apps appropriate for your device.

Consider Online
Productivity Tools
A plethora of Web apps fall under
the umbrella of productivity, but in
no way is that a bad thing because there
is an app for practically every task, priority, project, and goal. For instance,
you can use project management tools
to juggle deadlines, manage to-do lists,
track workflows, and more. Adding to
these capabilities, Microsoft Office 365
gives team members shared access to
master documents via user-created intranet sites, so they can edit in real-time
and manage file access among customers
and partners.

With a cloud service such as Microsoft Office 365, you can co-author Word documents, Excel sheets,
and other files with colleagues. Unlike traditional Office products, you don't have to save a separate
version for yourself or wait until another person closes the file.

Use Whiteboards
When you can't meet in person,
members of your virtual team can interact and brainstorm on full-featured
online whiteboards. Browser-based
whiteboards typically let you invite
meeting participants to create and sketch
on the same board. A number of whiteboard apps also support real-time collaboration in which everyone in the
session is an equal participant. This is
a good tool for tablet users who want
to share ideas on the go but need input
from others.

Accomplish More With Web Apps That Combine Different Capabilities
Multitaskers take note: Not only can
you collaborate with more team members in the cloud than ever before, but
you can also complete more tasks within the same service. Want to walk your
team through a live slideshow from a
presentation sharing service? No problem. Need to create flow diagrams and
share relevant images with your colleagues online? There's a service for that. And, if your team and a third-party developer are working on a website, for
And, if your team and a third-party developer are working on a website, for
example, you can work together in a virtual space where anyone can add comments, crop images, and more.

If you're a Windows Phone user, you can easily access Office 365 apps from your device. Specifically,
you can start a new OneNote page, create a new Office document, or edit files saved in SharePoint.

Manage Time & Tasks


Organizing schedules and all
the associated meetings, deadlines,
projects, and so forth can become a
daunting task. Among the available
cloud-based sites and mobile device
apps, you can find apps and services
that will help you manage your work
life. Consider utilizing event-based
planners, group-oriented reminder
apps, services for meeting coordination, and visual to-do lists to keep
your busy life on track.

Print Documents
When you need to print content
from your mobile device, you can use
one of many available apps to print documents to supported printers anywhere in the world. For example, if you are working on a presentation on your
tablet while traveling and need to distribute copies to colleagues, you can
print the presentation to a printer in
your main office. Some mobile printing
apps let you search a directory for
nearby printers (such as those in hotels
or airports) or locate a printer via GPS,
so if you need to print a boarding pass
or other content from your device while
traveling, you can do that, too. Some
cloud-based printing apps and services
also provide the option to print by
sending an email attachment to a supported printer, or to print documents
saved in an online storage service.


Mobile Data Best Practices


SYNC & BACKUP OPTIONS FOR YOUR TRAVELS

THE THEFT OR LOSS OF a laptop, tablet,


smartphone, or other mobile device
ranks among the worst productivity
catastrophes that can befall a traveling
professional. For all intents and purposes, our devices are our offices when
we travel, and losing them disrupts our
ability to work and communicate. There
is an obvious financial hit associated with
the loss of hardware, but there is a potentially greater hit that occurs in the loss of
corporate data. It's important, then, to know where your data is at all times, so in the event that you no longer have access to your devices, you'll know what is lost and what is accessible elsewhere. And, if you follow a few mobile best practices, you'll never have to worry about losing much data at all, if any.

Know What Gets Backed Up Automatically
Depending on your smartphone's or tablet's OS (operating system), there
is a certain amount of device data that
automatically gets backed up on a regular basis. If you use a USB cable to directly sync your iPhone or iPad with
your computer, for example, the sync
process backs up all of the OS and app
data stored on that device; there is an
option to encrypt and password-protect
the backed-up data, too. If you use the
iCloud service with your iOS device, specific sets of data will automatically be
backed up in the background as long as
your device has a Wi-Fi Internet connection, is plugged in to a power source, and
has a locked screen; backed up data can

include camera roll images, documents,


audio, and settings, depending on the options you choose.
Android users can manage incremental backups for apps and device
settings by signing into the associated
Google Account from their smartphones
or tablets. The Android Auto Sync feature routinely syncs in the background;
how and what it syncs partly depends on
the options you choose, but by default
the feature backs up OS data, contact
information, documents, and select app
data (such as Facebook and Twitter).
If you have a device running one of
the latest versions of Windows Phone,
you can sync documents stored on your
device with Microsoft's OneDrive cloud
storage solution; you can also retrieve


documents from OneDrive that were uploaded from a different source. To sync
all of the photos, audio files, and videos
stored on your Windows Phone device,
you must install Microsoft's Zune software on your computer and connect the
mobile device to the computer via USB.

Don't Forget Your App Data


App data encompasses a broad
range of digital information, but in our
context it means third-party apps and
the content you create using those apps.
Consider, for instance, note-taking services that exist as both cloud services
(where all of the information that is associated with those services is stored in
the cloud) and applications (where your
app-related information is stored locally). As you take notes with the app,
it stores those notes locally and in the
cloud simultaneously and in real-time.
Such an app-service combination is different from a note-taking app that does
not have an associated cloud service; with
this type of app, everything you add is
stored only in the device and is therefore
vulnerable to loss. Make sure you know
how your apps work so you don't get
caught unawares.
Also keep in mind that some apps are
more flexible than others. Apple's Notes

app in iOS, for example, can keep your


notes on the device only or on both the
device and in the cloud, depending on
how you set it up.

Be Careful When Traveling


If you travel frequently, you probably
have quite a few travel-related routines.
When it comes to keeping all of your
data intact, though, it's important to remember that travel disrupts the routines you've established at the office. For example, if you regularly sync your tablet
and smartphone with your computer
but typically leave the computer behind
when traveling, the backup that otherwise occurs with every physical sync
wont take place during your travels. If
you keep that sort of thing in mind while
traveling, you will remain aware of what
data resides in the danger zone (i.e.,
stored on your device, but not backed up
anywhere else) in the event your device
gets lost or stolen.

Use Cloud Services, At Least Temporarily

If you're reluctant to sync key data to a cloud backup or storage service on a regular basis, consider using an alternative cloud solution, at least temporarily, to meet specific requirements while traveling. For example, you could set up an account with a major online storage provider to use
with only a handful of files that are necessary for a specific trip. Providers offering
this type of service typically also offer a
mobile app that makes the service more
useful on your mobile device. And some
major storage services also sync with productivity apps you might already have installed on your devices.
Another stop-gap alternative is to use
a Web-based email service to email documents to and from a corporate account.
Doing this ensures that a copy of the document is maintained on the corporate
network even after you delete the associated email from the Web email account.

Physical Backup
Finally, you cant sync a certain
amount of valuable device data to the
cloud (or to your main computer via the
cloud), so be sure to back up that data as
often as possible to a second device (such
as a laptop) or storage solution (such as a
microSD card or portable hard drive).

To locate the Storage & Backup menu on your iOS 7 device, tap Settings, iCloud, and Storage & Backup (choose Backup in iOS 8). From this screen you can view available storage and switch iCloud Backup off and on.

You can customize which apps are backed up in iCloud by toggling the ON/OFF button next to each app. Be sure to activate the Find My iPad feature in case you need to locate a lost iOS device.


Excel Formulas
MAKE THEM WORK FOR YOU

EXCEL SPREADSHEETS ARE useful for


tracking finances, storing important
figures, or even creating databases
of information. But the only way to
take full advantage of Excel is to use
functions and formulas. Whether you
simply want to find the sum total of a
column of numbers or calculate compound interest, formulas are the best
way to transform your data. Here are
examples of formulas that might save
you time.

Calculate Compound Interest

Because Excel doesn't have a built-in function for calculating compound interest, Microsoft provides a formula that will get you the results you need using present value (PV), interest rate (R), and the number of investment periods (N). So, if you make an investment of $100 and want to see how much money you'll have in 10 years with a 4% interest rate, you can plug those numbers into the =PV*(1+R)^N formula. In our example, your formula would be =100*(1+.04)^10. Note that you need to change the 4% figure into a decimal number; otherwise you might expect a larger-than-life return on your investment. Calculate the formula and you'll see that over 10 years your initial $100 investment will grow to $148.02.

Excel doesn't have a built-in compound interest function, but you can use this relatively simple formula to get the same result.

Calculate Percentages

You can calculate percentages in a variety of ways using Excel, depending on the information you already know. For instance, you can use a simple division formula to find a comparison between two numbers. If you shipped 25 products and only one of them was returned, you can simply enter =24/25 (or use cell coordinates) to get a figure of .96, or 96%. If you want to calculate the change between two numbers (200 to 250, for example), you can use the formula =(250-200)/ABS(200) to get a growth rate of .25, or 25%.

Sum Of Totals Across Multiple Worksheets

Let's say you keep track of sales figures over the years using the same Excel document. Not only do you want a record of your current year's sales, but you also want your sales figure from the previous year at the top of each sheet. This will require the use of the SUM function as well as some cross-sheet calculation. Using the SUM function, =SUM(Sheet1!A1:A6) for instance, you can take numbers from the first sheet, add them together, and display them in a cell on the second sheet.

Cross-sheet calculation makes it possible to link formulas across multiple sheets in the same workbook, so you don't have to copy and paste information or calculate figures outside of Excel.

The MATCH function is helpful if you want to find a specific figure in a long column of numbers. It shows you where your query is located in relation to the array you provide in the formula.

MATCH Function
Excels MATCH function makes it
easier to find the location of a specific
figure relative to its order in a column.
For instance, if you are searching
for the number 780 in a column of
30 cells, you can type the formula
=MATCH(780,B1:B30,0) to find
your exact match. If the information
is located in the 15th cell, for instance,
you'll receive the result of 15 from the formula. You can also use 1 or -1 in place of the 0: 1 finds the largest value that is less than or equal to your figure (with the column sorted in ascending order), while -1 finds the smallest value that is greater than or equal to it (with the column sorted in descending order).
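If it helps to see the lookup logic spelled out, the following Python sketch mimics an exact-match MATCH; the sample values are hypothetical, and Excel's approximate-match modes are not reproduced here:

    # Return the 1-based position of an exact match, like =MATCH(780,B1:B30,0)
    def match_exact(value, column):
        for position, cell in enumerate(column, start=1):
            if cell == value:
                return position
        return None  # Excel would show an #N/A error instead

    column_b = [120, 450, 780, 310]
    print(match_exact(780, column_b))     # prints 3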

Round Up Or Down
If you work with figures that have
multiple decimal numbers and need to
round up or down to a specific decimal
place, then Excel has two easy functions you can use to get the job done:
ROUNDUP and ROUNDDOWN.
For example, take a number you want to round up, such as 12,345.678, and decide what decimal place you want to round to. Then, use the function =ROUNDUP(12345.678,2) and Excel will automatically round it up to 12,345.68.
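For comparison, here is a rough Python equivalent of ROUNDUP for positive numbers (a sketch only; Excel's ROUNDUP also handles negative values by rounding away from zero):

    import math

    # Round a positive number up to the given number of decimal places
    def round_up(number, digits):
        factor = 10 ** digits
        return math.ceil(number * factor) / factor

    print(round_up(12345.678, 2))         # prints 12345.68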

WORKDAY Function
WORKDAY lets you take a start
date and a number of days to determine what your end date will be
with weekends and holidays taken
into account. For example, enter the DATE formula (we'll use =DATE(2015,4,1)) into the A1 cell and a specific number of days (we'll use 18) in the A2 cell; you can then use the formula =WORKDAY(A1,A2) to find your end date, which in this case is April 27, 2015. You can also add holidays to the formula by entering the dates into cells and adding them to the end of the formula, =WORKDAY(A1,A2,A3:A9), which will change the end date.
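To make the weekend-and-holiday logic concrete, here is a small Python sketch that steps through the same example; it assumes, as WORKDAY does, that Saturdays and Sundays are skipped, and it accepts an optional collection of holiday dates:

    from datetime import date, timedelta

    # Count forward a number of workdays from a start date, skipping
    # weekends and any dates supplied as holidays
    def workday(start, days, holidays=()):
        current = start
        counted = 0
        while counted < days:
            current += timedelta(days=1)
            if current.weekday() < 5 and current not in holidays:
                counted += 1
        return current

    print(workday(date(2015, 4, 1), 18))  # prints 2015-04-27, matching the example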

Display Current Date & Time


Excel's NOW function is a quick and easy way to display the current date and time in your spreadsheet. Type =NOW() into a field and the date and time will appear. This information doesn't update continuously; rather, it updates every time you make a calculation within the spreadsheet, as well as every time you open that particular Excel document.

REPT Function
Typing the same thing over and over
can quickly get repetitive, especially if
you need 32,767 instances of the same
information. If you think that number is oddly specific, you're right. It's the maximum length, in characters, of the text the REPT function can return, according to Microsoft. To use the REPT function, simply take a word, number, or other entry ("Repeat," in this instance) and tell Excel how many times you want it repeated by typing =REPT("Repeat",5) into a cell. You can also use this function to better visualize data. For instance, you can use symbols to represent sales figures or your number of customers and watch your growth over time.
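In most programming languages the same trick is simple string repetition; this Python sketch shows the idea, with made-up sales figures standing in for real data:

    # REPT("Repeat",5) is equivalent to repeating a string five times
    print("Repeat" * 5)

    # Repeated symbols can serve as a quick, text-only bar chart
    sales = {"Jan": 4, "Feb": 7, "Mar": 9}   # hypothetical figures
    for month, units in sales.items():
        print(month, "|" * units)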


Rootkit Attacks
WHAT TO DO TO FIGHT BACK

EVEN SEEING THE WORD rootkit can


send shivers up the spine of someone
who has suffered through the inconvenience and damage a rootkit can
exact. According to Dan Olds, principal
analyst at Gabriel Consulting Group,
rootkits are some of the most insidious and dangerous pieces of malware
out there today. That's due to the fact
that rootkits are both extremely difficult to detect and get rid of completely.
Therefore, the more you know about
rootkits, the better.

What Is A Rootkit?
A rootkit is software that infects and
gains privileged access to a computer.
"This means it can perform administrator-level type tasks," says Michela Menting, digital security research director with ABI Research. "The primary feature is that it can hide itself in the system and remain undetected."


One way to think of how a rootkit wreaks havoc, says Jim O'Gorman, an instructor of offensive security measures, is to envision that you are driving a car but someone else is intercepting all your movements and deciding if he should pass them on to the car or not. "In some cases, he might decide to just insert some of his own commands, as well," O'Gorman says.
Although rootkits are similar to viruses or Trojans, says Chris Hadnagy,
a security training professional, viruses
and Trojans usually delete data, stop
services, or cause harm while a rootkit
provides an attacker system access to
get at data. Not all rootkits are malicious (a company might install one to remotely access and control employee computers, for example); however, Menting says they are extremely popular with malicious hackers and cybercriminals, which is why they have such
a negative connotation.

The Damage
Essentially, rootkits give an attacker
free rein to perform any task desired, including installing software; deleting files;
modifying programs; transmitting data;
and using spyware to steal credit card
numbers, passwords, keystrokes, etc. A
rootkit's ability to modify existing programs and processes, says Menting, enables it to avoid detection by security
software that would normally catch such
software.
"There really aren't any limits to how much damage it can do to a PC," Olds says. "It can delete data files and then rewrite gibberish on the hard drive to ensure that the data can't be recovered, or it can quietly work in the background and log user keystrokes, eventually capturing workplace, e-commerce, or banking usernames and passwords." Ultimately, a rootkit can route that data to a hacker to plunder accounts or gain access to a corporate network, Olds explains.
network, Olds explains.
Beyond software-based rootkits there are hardware-based rootkits, says Hadnagy. These, like software rootkits, "give the attacker full admin access to a machine, compromising everything on it and even at times the network it's connected to," he says. For users, O'Gorman says, a rootkit destroys all trust with the computer. "You can't know what is private, what is not. All integrity is gone."

How You'll Know


There are several ways a rootkit can
find its way into a computer. A downloaded program file a user believes to be
legitimate, for example, may have a rootkit
embedded within it. Menting says rootkits generally enter a system through existing vulnerabilities and are loaded by
malware, which can infect computers via
downloads, email attachments disguised
as genuine communication or documents,
websites with unpatched vulnerabilities,
USB thumb drives, or mobile devices.
To the average user, abnormal computer behavior is the best indicator a
rootkit might be present; warning signs
include files spontaneously disappearing
or appearing, a sluggish Internet connection, and slow-loading programs. Such
behavior can indicate other programs are
running in the background. Menting advises checking the Task Manager to detect
which applications or processes are running and using significant memory. For
the non-tech user, it may be difficult to
understand, she says. But users should
familiarize themselves with how their Task
Manager looks when its running on a
clean system so that when it actually is infected, the user can spot some differences
when looking at the tasks.
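For readers comfortable with a little scripting, this kind of spot check can also be automated; the sketch below uses the third-party psutil library (our choice for illustration, not something the experts quoted here recommend) to list the processes using the most memory:

    import psutil  # a third-party library; install with: pip install psutil

    # Collect (memory in bytes, process name) pairs for every running process
    processes = []
    for proc in psutil.process_iter(['name', 'memory_info']):
        mem = proc.info['memory_info']
        if mem is not None:
            processes.append((mem.rss, proc.info['name']))

    # Print the ten largest memory consumers so unusual entries stand out
    for rss, name in sorted(processes, reverse=True)[:10]:
        print(f"{rss / (1024 * 1024):8.1f} MB  {name}")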
That said, detecting a rootkit is still
generally difficult. "This is due to how adept the rootkit is at installing itself and hiding its presence in a way that is virtually undetectable by your system software," Olds explains. In this case, the


only way to find the rootkit, he says, is


to boot the system using a CD/DVD or
thumb drive that has special diagnostic
routines designed to find and remove
rootkits. Hadnagy says if a systems OS
is compromised, it cant be trusted to
find flaws in itself.In this event, it may
be necessary to boot a self-contained OS
running from a CD/DVD or USB drive
and run malware detection and removal
software from a clean environment.

What To Do
For typical users, arguably the worst
news concerning rootkits is that getting
rid of one can be beyond their scope.
Olds says, in fact, most users should
probably seek an expert's help if they
suspect a rootkit infection. Though
some security programs can detect and
remove specific rootkits, Menting says,
there are so many variants that it can
be impossible to detect and remove
them all. Often, she says, getting rid of
a rootkit requires a radical solution.
If a user suspects a rootkit, he should
first disconnect the system from the
Internet to cut off possible remote access
and prevent data from leaking, Menting
says. Next, remove data from the infected
computer and scan it for malware on another device. (Menting notes that if the
data contains unknown [or zero-day]
malware, this step may not guarantee the
malware is eradicated.) Finally, the computer should be purged: wipe the hard drive and reinstall everything, she says.

O'Gorman, in fact, says starting over is the only real solution, because "really, you can't trust cleanup methods, as you are never really sure if they worked."

How To Protect Yourself


The first defense against rootkits (and malware in general) is keeping the operating system and all software, especially security software, up-to-date and fully patched. Completely relying on antivirus software is a mistake, however. O'Gorman explains there's always a lag between the time a new threat pops up and the point at which antivirus software can detect it. "The best way to avoid issues is to not engage in risky activities," he says. "Run trustworthy, current software that's kept patched. Don't go to shady sites with out-of-date browsers and plug-ins. Don't run software that doesn't come from trustworthy sources."
"Unfortunately, the likelihood of being hacked or unwittingly downloading malware on a computer is extremely high," Menting says. "Especially in the network-connected environment of a company, even if you take all precautions necessary, someone else may not have and you get a virus from them internally."
Menting suggests using different passwords for all logins, encrypting sensitive
and confidential data, staying constantly
on the lookout for odd system behaviors,
and securing mobile devices, particularly if
they are connected to a company network
or business computer.

