DECEMBER 2015 VOL. 13 ISS. 12
CYBERTREND.COM

ALSO IN THIS ISSUE

Special Section For IT Managers

REIMAGINING
EVERYTHING
VERIZON'S TAKE
ON DATA SECURITY,
ENTERPRISE MOBILITY,
& THE FUTURE OF THE
INTERNET OF THINGS

Special Section For PC Enthusiasts

Volume 13 : Issue 12 : December 2015

VERIZON ON SECURITY, MOBILITY & THE FUTURE OF THE INTERNET OF THINGS

4 NEWS
Business technology news and research

10 COVER STORY
How Dell became a one-stop enterprise shop

16 CLOUD
Cloud computing and cloud-based services

22 MOBILITY
Mobile tech for doing business anywhere

26 DATA
Methods for leveraging data and analytics

30 ENERGY
Energy efficiency and the environment

32 IT
IT and data center concerns

42 NETWORKING
Wired and wireless networking

48 SECURITY
Solutions and best practices for security

56 ELECTRONICS
High-end consumer electronics

58 TIPS
Advice for mobile professionals

KEEP YOUR BUSINESS SECURE, PREPARED & RESILIENT
71 PROCESSOR
Special advertising and content from our
Processor partners

Special Section For IT Managers

81 COMPUTER POWER USER
Special advertising and content from our Computer Power User partners

Special Section For PC Enthusiasts

CONTACT US
P.O. Box 82545
Lincoln, NE 68501
or
120 W. Harvest Drive
Lincoln, NE 68521

Advertising: (800) 247-4880


Fax: (402) 479-2104
Circulation: (800) 334-7458
Fax: (402) 479-2123
www.cybertrend.com
email: feedback@cybertrend.com

Copyright 2015 by Sandhills Publishing Company. CyberTrend is a registered trademark of Sandhills Publishing Company. All rights reserved.
Reproduction of material appearing in CyberTrend is strictly prohibited without written permission.

Most Companies Planning Big Data Investments, With Focus On Improving Both Customer Experience & ROI
In June, Gartner surveyed 437 Gartner Research Circle members, a panel that covers multiple industries worldwide, to take the pulse of their big data investments. The research firm found that although the pace of growth was slowing, the number of big data projects has continued to rise, with 75% of companies in 2015 planning to start big data projects over the next two years, up three percentage points from 2014. The hype around big data, it seems, has given way to a sense that big data projects are needed as a matter of course. "This year begins the shift of big data away from a topic unto itself, and toward standard practices," says Nick Heudecker, research director with Gartner. "The topics that formerly defined big data, such as massive data volumes, disparate data sources, and new technologies, are becoming familiar as big data solutions become mainstream." For example, among companies that have invested in big data technology, 70% are analyzing or planning to analyze location data, and 64% are analyzing or planning to analyze free-form text. Here are a few highlights from the research:

Whereas big data projects used to be the domain of CIOs, now more unit heads are initiating them.

64% of organizations have enhanced customer experience as the primary goal of their big data projects.

The pursuit of big data for better security capabilities has been influenced by a rise in data breaches.

Uncertainty about effectiveness remains, as 43% of those investing in big data aren't certain ROI will be positive.

Manufacturing Offers Biggest Growth Opportunities For Mobile

Demand For Storage (Especially Cheaper, Simpler Storage) Is Up

Vehicle Telematics Over 3G Networks Are Big Business

In its latest report on business mobility, IDC points out that the use of mobile technology has leapt from the concept of facilitating on-the-go productivity to enabling a wide range of powerful, industry-specific capabilities. IDC identifies manufacturing as the sector with the biggest growth opportunity for mobile tech thanks to its sizable economic footprint and global operations. IDC says organizations spent $901 billion on mobile tech in 2014, and will spend $1.2 trillion in 2019. Other prominent growth areas include consumer services, media, and banking.

Global factory revenue from enterprise storage systems grew 2.1% year-over-year in Q2 2015, reaching $8.8 billion, according to IDC. "Companies are increasingly using new project initiatives and infrastructure refresh as an opportunity to deploy new storage technologies that are able to drive cost and complexity out of their existing storage resources," says Eric Sheppard, IDC research director. Spending on cloud-based and software-defined storage, integrated systems, flash-optimized systems, and other storage is edging out traditional arrays, Sheppard adds.

Most discussions about the cellular business these days gravitate toward 4G, but 3G still plays a dominant role in vehicular telematics, the technology behind services for navigation, roadside assistance, diagnostics, and more. According to ABI Research, telematics is the revenue leader among IoT (Internet of Things) segments, standing to generate $60 billion in service revenues in 2020. Although 4G and the connected car seem to go hand-in-hand, the telematics business will continue to thrive on 3G for the next five years, says ABI.

Where The Revenue Streams Are Flowing In The Internet Of Things

The Cloud Just Keeps Getting Bigger, & Companies Keep Building The Infrastructure To Support It

Professional services, such as application development, systems integration, and consulting, are currently the main drivers of IoT (Internet of Things) revenues, according to ABI Research. However, says ABI, revenues will likely shift to vendors offering software platforms and analytics solutions designed to simplify the process of deriving the most useful information from all of the raw data that machines, sensors, and other connected devices provide. "Ultimately, the goal of IoT connections is the data," says Dan Shey, vice president and IoT practice director with ABI Research, "and the IoT data and analytics market is set to grow the fastest of any of the major IoT revenue categories." Well-established vendors will do well, but so will startups, Shey adds.

Global spending on the infrastructure required to continue building out public and private cloud environments (servers, storage, and Ethernet switches) grew 25.7% year-over-year between Q2 2014 and Q2 2015, to $6.9 billion, according to IDC's latest Worldwide Quarterly Cloud IT Infrastructure Tracker. Spending on public cloud infrastructure is growing faster, up 30.4% to $4.1 billion, than spending on private clouds, up 19.5% to $2.8 billion, says IDC. Cloud computing continues to be the name of the game compared to non-cloud IT, as infrastructure spending not related to the cloud dropped 3.5% year-over-year. "As cloud service providers continue to expand their datacenter footprints to meet growing cloud services demand, customers increasingly rely on a variety of as-a-service offerings and traditional hosting to help meet the performance, manageability, time to deployment, and TCO requirements of their organizations," says Kuba Stolarski, research director for servers and hyperscale infrastructure with IDC. "Both private and public clouds will continue to see growing demand from customers who look to optimize their workload deployments based on their own uniquely varied requirements."

New dtSearch Solution Runs In Azure Cloud & Via RemoteApp

Banks Increase Innovation Investments To Counter Startups

Companies Turn Their Social Media Sights To Instagram

As the amount of data enterprises store continues to grow, it's vital that search functions be fast and reliable enough to keep pace. dtSearch recently launched a .NET solution with these demands in mind. Capable of searching terabytes of data in less than a second, the solution runs the dtSearch Engine in Microsoft's Azure cloud and uses RemoteApp for anywhere access. dtSearch supports popular file types, emails, databases, and Web data, and includes 25-plus search options with highlighted hits.

According to a new study from Infosys Finacle (part of EdgeVerve Systems), 72% of the retail banks surveyed worldwide are on high alert for technology competitors. The report cites retailers, telecoms, and technology companies as chief innovation competitors, and also emphasizes that startups are striving to out-innovate brick-and-mortar banking institutions. The study also shows that 69% of banks feel startups have a high or very high impact on innovation. In response, 84% of banks are increasing their innovation investments and 82% are spending more on customer service.

Social network marketing has "reached a saturation point," declares a recent eMarketer report. The market research firm says that 88.2% of U.S. organizations are using at least one of the major social media platforms as of this year. Of all of the platforms, however, companies are increasingly taking a shine to the photo-sharing service Instagram. According to eMarketer, 32.3% of companies surveyed are using Instagram for marketing this year, 48.8% will do so in 2016, and in 2017 the figure will reach 70.7%, surpassing use of Twitter.

New Low-Power Wi-Fi Standard Has Difficult Path In M2M Market

Everybody Has A Smartphone . . . So What Else Is New?

The key feature of IoT (Internet of Things) and related M2M (machine-to-machine) communications is connectivity. Manufacturers have multiple options when it comes to enabling devices to transmit data to each other, including the emerging low-power Wi-Fi standard 802.11ah. The new standard is designed to provide ample coverage while consuming very little energy, but, says ABI Research, it faces stiff competition in a market with existing wireless alternatives. Andrew Zignani, research analyst with ABI, says that 802.11ah development will have to finish soon lest its window of opportunity close, as technologies such as ZigBee and Thread are already gaining momentum in the smart home and other IoT verticals.

Pew Research Center recently released its latest big (26-page) report about digital device ownership in the U.S., and while some things aren't so surprising (most of us own smartphones), other changes are more interesting. Here are some of the essential stats:

1 Billion Mobile Banking Users By Year's End

Securing The Internet Of Things A Goal For Enterprises

Wi-Fi & BLE Battle For Proximity-Based Marketing

As 2015 draws to a close, Juniper Research reports that mobile banking has finally started taking off, with more than 1 billion people using mobile banking by the end of the year. The research firm points to mobile banking uptake in emerging markets as a major cause for the high figures. Furthermore, Juniper Research says that around 19% of households worldwide will pay bills online this year, and that about half of bank customers will use online banking in 2016. Juniper forecasts the number of mobile banking users will rise to 2 billion by 2020.

Because IoT (Internet of Things)-related hardware is not all new, but rather a mix of old and new devices configured to connect to the Internet in new ways, securing these devices and their connections will likewise require a variety of old and new technologies, according to a new report from Gartner. But security is key, the research firm says, and is of particular interest among enterprise companies. Gartner forecasts that more than 20% of enterprises will have cybersecurity initiatives devoted to IoT by the end of 2017.

As a concept, proximity-based services hold great appeal for retailers wishing to gain the attention of customers with special deals and targeted offers while they shop, and to gather information from those customers. The technologies behind such services, however, are shifting. According to ABI Research, many companies have already invested in BLE (Bluetooth Low Energy), giving BLE an edge over Wi-Fi alternatives. This makes it appear as if Wi-Fi Aware is too late to the game, but ABI suggests Wi-Fi has greater potential for long-term growth.

Digital Device Ownership Among U.S. Adults In 2015

Cell Phone (including smartphones): 92%
Desktop/Laptop Computer: 73% (down from 88% in 2010)
Smartphone: 68% (almost double the ownership in 2011)
Tablet Computer: 45% (tapering off, but way up from 3% in 2010)
Game Console: 40% (like MP3 players, this percentage is steady)
MP3 Player: 40% (consistently around 40% since 2008)
Ebook Reader: 19% (down from 32% in early 2014)
Portable Gaming Device: 14% (yes, 14% of adults still own these things)

Source: Pew Research Center

STARTUPS

Startup Seeks To Optimize The Digital World, Raises $58M

Mast Wants To Give Your Smartphone A Second Line

It's an understatement that most websites could probably be better, both for the visitors and the folks running the sites. For businesses seeking to get the most out of their websites and provide customers with an engaging online experience, optimization is key. The San Francisco-based startup Optimizely is on a mission to optimize everything for its clients, from overall website layout (so it works on all platforms) to personalization (so customers get more of what they want) to testing and analytics. The startup recently earned $58 million in a Series C funding round.

The one-phone-number-per-phone paradigm is outdated, says Mast, a New York City-based startup launched by former Virgin Mobile executives David Messenger (Mast CEO), Peter Lurie (CBO), and David Dawson (CTO). The answer, the company says, is its two-number solution, which has caught the attention of investors and earned the company more than $7 million in funding so far. Aimed at business users who seek to further separate their work and personal lives, Mast enables employers to pay strictly for the business end of their employees' phones. "Mast lets employees work dynamically both inside and outside the office exactly the way they want," says Messenger. "Mast gives employees the control and privacy they crave, and businesses an inexpensive solution that boosts employee performance and enhances insights."

Cloud-Based Predictive Marketing Startup Raises $65M

Improved Scalability For Real-Time Big Data

Advanced Detection For Advanced Attackers

Why focus on questionable leads when you could be going straight for the leads that are most likely to become customers? San Mateo, Calif.-based startup EverString offers a cloud-based service that aims to remove much of the guesswork from the sales process with its predictive scoring. The company's solution is geared toward B2B companies and also helps them find new prospects and market products more effectively. Just 15 months old, EverString recently announced it had raised $65 million in a Series B funding round led by Lightspeed Venture Partners.

Real-time big data solutions rely on relational databases, and as organizations and their big data project requirements grow, scalability headaches often ensue. San Francisco-based startup Citus Data offers a solution that uses the open-source PostgreSQL database, popular for real-time big data solutions, to help companies scale as an alternative to starting over, and to combine multiple non-relational and relational databases. To further its efforts, the company recently raised $9.5 million in Series A funding.

Old-guard security measures are no longer up to snuff, according to Tel Aviv-based startup Illusive Networks. "Today's headlines confirm the need for a completely new approach to stopping cyber attackers," says the company's CEO, Shlomo Touboul. To combat advanced attacks, Illusive's Deceptions Everywhere products pepper the entire network with data designed to deceive and snare attackers. In October the company announced it had raised $22 million in Series B funding, and that it plans to grow its staff and open a New York City office.

Reimagining Everything
VERIZON TAKES ON COMMUNICATIONS, SECURITY, THE INTERNET OF THINGS & MORE

KEY POINTS

Verizon's 4G LTE network covers most of the United States, including both highly populated and rural areas.

Verizon recently released a pair of new DROID devices and also unveiled an Advanced Calling feature for HD voice and video calls.

Verizon's enterprise mobility services cover not just the network, but mobile device and application management.

Verizon also offers cloud, colocation, hosting, and security services, among other enterprise offerings.


VERIZON IS ONE of those companies that seems like it has been around for a long time, even though it was officially established in 2000. In the 15 intervening years, the company has gone from being a major player in the wireless carrier market to one with the highest-rated network in the United States, according to RootMetrics. But perhaps the most interesting thing about Verizon is that its products and services extend far beyond the wireless spectrum. Although it started off primarily as a wireless carrier, it's now a major provider of enterprise solutions and services as well.

"A lot of people would view Verizon based on the business they may do with them," says George Fischer, senior vice president of global sales at Verizon Enterprise. "For example, if you're a wireless phone customer, you'd know us. But we provide networking, security, and collaborative communication for companies all over the world, and of course, the data center work, everything from colocation to very sophisticated application management. If you're watching TV and you're seeing the 'Better' campaign talking about how we're better for our wireless network, you may not get the whole breadth of what we do."

Mergers, Acquisitions & Building The 4G LTE Network

Verizon was born out of a $52 billion merger between Bell Atlantic and GTE that closed in 2000. At the time of the merger, GTE was one of the most successful telecommunications organizations in the world, with over $25 billion in revenue in 1999 alone, 35 million access lines throughout the world, and 7.1 million wireless customers in the U.S. Bell Atlantic saw even greater success, with $33 billion in revenue in 1999 and 7.7 million customers in the U.S. It was one of the biggest business deals in U.S. history and led to Verizon Wireless becoming established as a leader in the wireless carrier market.

In 2002, the newly branded company launched its 3G wireless network in the U.S., which ushered in the age of high-speed Internet access on mobile devices. You can trace the way smartphones are used today back to those early days of 3G, when mobile phones evolved into mobile computers capable of handling Web surfing as well as streaming video and music.

In addition to offering a fast and reliable wireless network, Verizon placed a strong focus on customer experience, even for customers moving from one carrier to another. This was proven in 2003 when the company promoted a Local Number Portability program that allowed users to keep their phone numbers when switching carriers.
Over the next few years, Verizon acquired numerous companies in order to expand its reach and network offerings. Verizon acquired MCI in 2006 to better serve the SMB (small to midsize business) market and government sector, Cybertrust in 2007 to offer managed security services to businesses and government agencies, and Rural Cellular Corp. in 2008 to expand its network into more rural markets and help fill the coverage map. It was also in 2008 that Verizon won a bid for a larger wireless spectrum footprint in the U.S., which allowed it to begin work on what would later become its LTE (Long Term Evolution) network.

2009 and 2010 may have been two of the most important years for Verizon. During this time, Verizon partnered with Google to support the Android mobile operating system, and released its first DROID smartphones in November of 2009 with the HTC DROID Eris and the Motorola DROID. This line of devices was so successful that Motorola decided to manufacture DROID phones solely for Verizon. Also in 2009, Verizon acquired Alltel and became the largest wireless provider in the United States. A year later, Verizon launched its 4G LTE network in 39 markets. Verizon's LTE network provided coverage for more than 110 million people from day one, and went on to cover the rest of the United States within a few years.

"We're a Fortune 15 company with about 177,000 employees. Our capital invested in the business is $17.2 billion in 2014. . . . [If] you're a wireless phone customer, you'd know us. But we provide networking, security, and collaborative communication for companies all over the world. And of course, the data center work, everything from colocation to very sophisticated application management."
GEORGE FISCHER
Senior Vice President, Global Sales
Verizon Enterprise

The Importance Of 4G LTE For Consumers & Businesses

The history of 4G LTE is interesting because it launched with a few USB modems in 2010 and was an immediate success even before LTE-capable phones arrived several months later. Since the launch, Verizon has been working tirelessly to get coverage to as many users as possible throughout the U.S. and to ensure the best possible performance and reliability across the board. The combination of testers traveling from tower to tower to prevent blackouts and customer service experts working both online and onsite in retail stores means that Verizon is working behind the scenes and directly with customers to make 4G LTE the best possible experience.

Fortunately for Verizon and its customers, all of this hard work is paying off. Consider the company's most recent 4G LTE coverage map and it's clear Verizon is making more strides than any other carrier to cover not only highly populated residential areas, but also as many rural areas as it can reach. A recent report from RootMetrics tested four different mobile networks from January to June of 2015. Verizon finished first in network speed, network reliability, data and call network performance, and overall network performance. This illustrates that Verizon's 4G LTE network not only covers more customers than other carriers' networks, but also that Verizon's network is backed by reliability, speed, and overall performance.

The growth of Verizon's network has been crucial for consumers and businesses alike. The availability of high-speed mobile Internet access has essentially created a mobility explosion over the past few years that has led to more innovations in mobile devices and how they're used. Take enterprises, for example. To support remote workers and business travelers, companies used to rely solely on Wi-Fi or wired Internet connections to get anything done. Now they can use the same network they use for voice not only to complete important business tasks from anywhere at any time, but also to give customers new ways to interact with their products and services. Consumers can now do banking from their mobile devices, shop online, and reach out on social media, all using fast 4G LTE.

Verizon has taken great strides to make sure its 4G LTE network is available in as many regions as possible, from large metropolitan areas to remote, rural areas. This map shows U.S. coverage as of 2015.

An Intriguing Future

Perhaps more interesting than how Verizon's 4G LTE network is being used today is how it could be used in the near future. Verizon recently launched a mobile-first video streaming solution called Go90 in an effort to offer its customers more entertainment options, but the company isn't strictly focused on making existing LTE uses better. In fact, Verizon has an entire internal team dedicated to furthering the company's overall IoT (Internet of Things) strategy and exploring how people can use the existing network in new and exciting ways.

For example, Verizon's recently announced Hum service, which came out of the company's telematics group, enables customers to plug a device into their vehicle and view diagnostic data and safety information. Verizon is also leveraging its network for IoT with its remote maintenance technologies. An example of this is ATM servicing. Instead of having an entire fleet of experts travel from one machine to the next, it's possible to have one expert in a centralized headquarters who receives schematics or pictures of problem machines and can then respond. It's a much more efficient way to deploy resources and service equipment, and it's made possible by the availability of a reliable 4G LTE network in nearly every corner of the U.S.

New Devices & Unique Services

In addition to new types of devices and new ways to use its network, Verizon is also improving the experience for more traditional mobile devices. This includes not only smartphones; you can also purchase tablets and laptops with built-in 4G LTE capabilities and then subscribe to data plans that make it easy to use these devices on the go. This is particularly helpful for business travelers because there are some situations where a safe and reliable Wi-Fi connection is difficult to find. With 4G LTE, you can achieve speeds that are sometimes even faster than Wi-Fi and even use one device (say, a smartphone) as a Wi-Fi hotspot for another (such as a full-sized laptop PC).

Verizon differs from other wireless carriers in that it not only supports devices from other manufacturers, including those that run Android, iOS, and Windows Phone; it also has its own dedicated line of DROID mobile devices that it continues to update on a consistent basis. In fact, the company recently announced two new devices: the Motorola DROID Turbo 2 and the DROID Maxx 2. The DROID Turbo 2 is particularly interesting because it sports a Moto ShatterShield display. Motorola claims this is the world's first shatterproof screen, and the phone comes with a four-year no-crack, no-shatter display warranty to back it up.

Verizon also seeks partnerships to bring unique content to its customers. The company partnered with the National Football League, for example, to offer live game coverage via NFL Mobile, so that even if you're in an airport lounge that's not showing your favorite team play, you can still catch all the action. Verizon also offers the Verizon Messages service, which works through an application for use on smartphones, tablets, and PCs. This messaging platform enables different devices, regardless of operating system, to communicate with each other via wireless networks, so even if you don't have access to Verizon's 4G LTE network, you can still send text messages over Wi-Fi.

Among the most useful communications features Verizon offers for its mobile devices is its Advanced Calling service. This is essentially the LTE equivalent of VoIP (voice over IP) calling. At no additional cost, Verizon customers can add Advanced Calling 1.0 functionality to their devices, and as long as they are communicating with compatible devices, they can then conduct voice or video calls in the highest possible HD quality. Customers don't have to sign up for a separate service to get access to Advanced Calling because it's already built into supported devices.

Enterprise Mobility

Best known for its consumer-oriented mobile offerings, Verizon is a major player in the enterprise space as well. The company's enterprise mobility portfolio includes MAM (mobile application management) solutions, which help organizations better manage and control how applications are used and secured, and Verizon's Mobile Workforce Manager, a cloud-based service that helps companies track, monitor, and manage all mobile devices throughout the organization. All of these mobile-specific tools are designed to give enterprises a better overall view of their mobile fleet and to help enforce mobile policies across the board.

"There's mobile device management and the provisioning of devices in a way that tracks assets in an organization, makes sure costs are managed correctly, and BYOD programs are managed very effectively," says Fischer. "We work with the largest companies in the world to provide cellular devices to their employees, and they expect the same customer service. We provide a high-touch service for corporations to manage their mobile devices."

In addition to specific mobile management tools, Verizon offers services that are difficult to pin down and categorize because they help companies in multiple ways and at so many different levels. When it comes to providing network capabilities to applications and mobile devices, Verizon helps its customers customize their technology to fit any specific need. For example, a retailer may need network-enabled mobile devices not only to manage inventory, but also separate devices to scan customer loyalty cards. This same idea applies to ruggedized devices used by shipping and delivery companies so they can capture customer signatures and update tracking information on the fly.

Fischer says that Verizon is a major proponent of IoT technologies as well, because there are so many opportunities for these connected devices. IoT is going to play a major role in enterprise mobility going forward, as companies will be encouraged to support more types of wireless-enabled products than ever before. To help support companies in this journey, Verizon pushes beyond basic networking to ensure reliable connectivity in nearly any situation.

"There are very few organizations not looking at IoT, and [they] require an understanding of it," says Fischer. "We've deployed 4G LTE, but we also have private network wireless backup that's very powerful. We call it VoLTE, and it's the idea of combining the regular network with a wireless network for business applications, backup, and other items. For example, with an ATM, if the network was to go out, you could recover with a private wireless line connected to the ATM. That's another part of it. It's not just devices. It's the network itself, in a way, that provides for business continuity and a better experience."

PROTECTING ENTERPRISE DATA

In researching data breaches for its 2015 Data Breach Investigations Report, Verizon found, with the help of 70 contributing organizations, that there were 79,790 security incidents that resulted in 2,122 confirmed data breaches in 2014 alone. These breaches hit a wide range of industries across the globe, but among the most targeted were the public sector at 303 data breaches, the financial services industry at 277 breaches, and the manufacturing industry at 235 breaches.

This illustrates just how important it is for organizations to make security a part of everything they do. With this in mind, Verizon offers security services designed to help companies before, during, and after a cyberattack, even if it involves a full-scale data breach. Specific solutions and services include asset and exposure management, security enforcement and protection, identity and access management, and risk and compliance management.


Cloud & IT Infrastructure Services

In the enterprise space, Verizon particularly illustrates its versatility with its cloud and IT infrastructure services. Fischer says there are three major business outcomes that enterprises are thinking about today: improving the customer experience, driving growth in business performance, and managing risk. Verizon's IT and cloud services help companies achieve these goals as well as meet requirements for specific vertical industries, such as health care, retail, and manufacturing. "In the old days, the telephone company was a one-to-many," says Fischer. "We're much more focused on bespoke solutions and capabilities based on the business outcome and the industry. We're global, we're vertical, and we're business-oriented."

In addition to offering global and local network capabilities with the ultimate goal of providing customers with a high-speed network available anytime and anyplace, Verizon also offers cloud services, or what Fischer refers to as solutions for highly network-reliant workloads.

IN-HOME SOLUTIONS

In addition to mobile solutions, Verizon offers in-home services for customers in certain regions. The company offers phone, Internet, and television services as part of its Fios product family, all backed by Verizon's fiber network. These are all offered as separate services, or they can be bundled together in what Verizon calls the Triple Play. This allows consumers to pay one bill for all of their in-home services rather than having to pay three separate ones, and it's a helpful option for existing Verizon customers who still want to have a dedicated phone at home.


With its cloud offerings, Verizon specializes in mission-critical applications that, if they were to go down, would cause great harm to the business itself and to the brand name. One example Fischer uses is an airline application that requires enough network and compute flexibility to make users aware of issues due to weather or as part of a flash marketing opportunity, for instance.

The idea of making sure mission-critical applications are always available also feeds into Verizon's colocation and hosting services. Fischer says that one of the most important features that enterprises are looking for in a hosting provider is the ability to decide how and where applications are hosted. Whether they want to colocate an application near a specific customer location or need to store their data in a certain country due to compliance or safe harbor rules, Verizon has 50 data centers around the world to solve those issues. And to protect that data wherever it's stored, Verizon offers advanced network security, intrusion prevention, and scanning capabilities for spotting issues or dealing with them after the fact.

"Our ability to provide a completely secure network is a huge advantage. We have very specific and advanced security capability to secure our network. That's why we have so many large bank and government customers that require that level of security. We see probably 75% of all Internet traffic, so we have a vast amount of information around the behavior of the network and any anomalies that could potentially occur. That can help us either in advance, during, or certainly in a forensic mode to understand what happened when it was a network-related attack."

Another area that Verizon is particularly excited about is SDN (software-defined networking). Fischer says traditionally it has been difficult for companies to change the network, security, and configuration to specifically address application and user requirements. SDN promises to make those tasks much easier, and that's why Verizon has partnered with several large network equipment providers to drive forward the transition to SDN. It's just another example of how Verizon always makes sure it supports the technology its customers need when they need it.

"The SDN generation that we're deploying now allows for a lot more flexible and sophisticated capability at the edge," says Fischer. "It leverages the Internet, broadband, and secure networks in a way that's cost-effective, but also protects everything we've been talking about around performance and security. If you were just to go naked into the Internet, you could possibly end up being compromised. SDN is really a cost, performance, and technology enabler that's going to open up even better opportunities for everybody to deploy applications that are network-reliant. It's a big deal. It's gone from being a future thing to being right now. Deployments are happening."

Verizon employs 117,900 employees in 150 locations around the world, but calls Basking Ridge, N.J., home.

Managing Cloud Disruption


GET AHEAD OF POTENTIAL INTERNAL CONFLICTS

KEY POINTS

Many organizations aren't prepared for the disruption that cloud adoption can bring.

Some companies may find they need to adapt their operating model and company culture to shift to a cloud model.

Shifting to a cloud model can impact all employees, but it will likely impact IT staff the most.

Despite the internal changes that may accompany cloud adoption, experts generally agree that adopting a cloud model is still ultimately worth the effort.


ALL COMPANIES establish an operating model, either explicitly or by default. The clearer the model, the more efficiently the organization operates, says Ed Anderson, Gartner research vice president. The older the organization, the more entrenched the operating model is in company culture, to the point it can become part of the company's identity.

Now, introduce cloud computing to such organizations and watch what wrinkles surface. While cloud adoption is becoming increasingly common, and arguably even mandatory to stay competitive, many companies aren't prepared for the internal disruption adopting a cloud model can introduce, including to company culture. When cloud services are introduced, organizations have to rethink their operating processes if they want to recognize the full benefits of the cloud, Anderson says. Whereas cloud services are dynamic, flexible, available on-demand, etc., many organizations don't operate assuming their technology has these attributes, he says.

Thus, companies adopting cloud solutions must ask how they need to operate differently to fully capitalize on cloud, Anderson says. Those that do can potentially bring about a new era of innovation, agility, growth, and competitiveness. Those that stick to old, established operating models likely won't reach their true cloud potential. No matter what, introducing cloud services will challenge the status quo and almost always create some tension and disruption in the organization, he says. The following explores such disruption and how companies can anticipate and react to it.

The Coming Storm

In a survey released in November 2014, Cliff Grossner, Ph.D., research director at Infonetics Research/IHS Research, found that enterprises look to cloud adoption for numerous reasons, including improved application performance, quicker access to new technologies, better agility in responding to business needs, and faster application deployment. Among the barriers to adoption, however, were the necessary changes to internal procedures and compatibility with in-house infrastructure. Often, Grossner says, enterprises unprepared for cloud adoption find that cloud-based systems make certain assumptions about enterprise operations and employee workflows. Organizations that aren't open-minded and ready to adapt find the cloud model doesn't allow for much customization, he says.

Arguably, adopting a cloud model most greatly disrupts existing IT staff who are used to configuring and provisioning IT equipment in a traditional way. The cloud changes this, says Clive Longbottom, Quocirca founder, leaving some staff with little to do, ultimately trying to do things they shouldn't be doing, and possibly impacting the running of the cloud. Similarly, Dave Bartoletti, Forrester Research principal analyst, says that while members of IT are increasingly sensing how the cloud can benefit their companies and are acquiring more cloud-related skills, some may question their job security when their company starts managing services, applications, or possibly infrastructure in the cloud.

Adopting a cloud model can also disrupt a company's technology. For example, Anderson says, organizations that operate an internal data center may make various assumptions and take shortcuts related to the applications and data the data center provides, including ones concerning security, funding IT, and infrastructure capabilities (such as network bandwidth). A company, for example, may need to change funding models to make an on-premises data center more cloud-like. Bartoletti says not understanding how much a company must change technology budgeting and funding for the cloud can lead to significant obstacles. Many companies, for example, use standard budget cycles in which someone estimates at the start of the year how much more technology the company will need, and then allocates that spend throughout the year. A cloud model, however, entails consuming only what's needed and paying only for what's used. In general, Anderson says, when extending services that are running on-premises to an external cloud service, the assumptions companies hold fall apart and can potentially cause numerous problems.

"Constant learning is now part of the job, and that's absolutely important."
DAVE BARTOLETTI
Principal Analyst
Forrester Research
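To make the budgeting contrast concrete, here is a minimal sketch in Python. The rates and usage figures are entirely hypothetical (none come from the article); the point is only the shape of the comparison between a fixed annual allocation and metered, pay-per-use spend:

```python
# Minimal sketch (hypothetical numbers): a fixed annual technology
# budget vs. metered, pay-per-use cloud spend.

MONTHLY_FIXED_ALLOCATION = 10_000.0   # capacity bought up front, used or not
RATE_PER_INSTANCE_HOUR = 0.25         # assumed metered rate

# Assumed demand profile: instance-hours actually consumed each month,
# with two seasonal spikes (months 4 and 11).
monthly_usage = [8_000, 9_500, 12_000, 30_000, 11_000, 7_500,
                 7_000, 8_200, 9_000, 10_500, 26_000, 14_000]

fixed_total = MONTHLY_FIXED_ALLOCATION * len(monthly_usage)
metered_total = sum(hours * RATE_PER_INSTANCE_HOUR for hours in monthly_usage)

print(f"Fixed budget for the year:  ${fixed_total:,.2f}")
print(f"Metered spend for the year: ${metered_total:,.2f}")

# The fixed budget over- or under-shoots month by month; metered
# spend simply tracks demand, which is what breaks the old assumptions.
for month, hours in enumerate(monthly_usage, start=1):
    metered = hours * RATE_PER_INSTANCE_HOUR
    print(f"Month {month:2}: metered ${metered:8,.2f} "
          f"vs fixed ${MONTHLY_FIXED_ALLOCATION:8,.2f}")
```

Under a fixed cycle, the spike months blow through the allocation while the quiet months strand it; under the metered model the bill follows usage, which is exactly the mismatch Bartoletti warns about.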

The Impact

Broadly, implementing a cloud model can potentially disrupt everyone in an organization, although if done correctly, the disruption can prove beneficial to purchasing, skills enablement, architecting solutions, IT management, and enabling IT to help employees work differently. Longbottom says processes can become far more dynamic, thus impacting how marketing and sales approach their markets; workers can more easily try, drop, or continue new functions and capabilities, thereby impacting business development; and developers can move to a continuous delivery, DevOps model to meet the needs of a highly dynamic business.

Typically, cloud disruption most acutely impacts IT; application migration and compatibility; data migration, management, and integration; and security, Anderson says. Beyond adapting to manage technological aspects of the cloud, IT will need to manage relationships and details concerning external services and the cloud providers delivering those services, which may require developing new skills.

Bartoletti says organizations may need to increase investments in training for everyone from IT to developers, or provide them the time to train themselves. "Constant learning is now part of the job, and that's absolutely important," he says. Fundamentally, cloud usage means more technology assets the company owns and doesn't own, making management key. Further, the tools used to manage cloud resources may be completely different from what IT uses for in-house resources. The tools may also be consumed in the cloud. Currently, Bartoletti says, hiring cloud-savvy employees is difficult and pretty expensive, so investing in training for current employees to learn new tools makes sense.

Anticipate & React

Change management is often a key part of projects that is not handled properly, Longbottom says. "Inherently, people don't like change," he says. To effectively anticipate and react to the transformational stresses that adopting a cloud model can introduce, he recommends involving IT in pre-planning and encouraging IT to be more responsive to business needs. Longbottom explains that IT's thought process must become: Of the cloud options available to us, which meets the business's needs in the best business fashion? The business, meanwhile, must shift back to an IT-first model instead of embracing shadow IT or viewing IT as just another supplier, which fragments the business and creates information silos, he says.

Before proceeding with a large-scale cloud project, Anderson recommends that organizations complete a list of core tasks that, if done well, can result in cloud usage built on optimization rather than constant planning and troubleshooting. These tasks include identifying all of the technical and non-technical stakeholders cloud adoption will impact; outlining desired outcomes; establishing an architecture that details every element of the cloud environment; outlining the business terms of the cloud service and how to measure and enforce them; and establishing exit criteria for business terms and technology aspects.
Companies should also build an operations plan that includes management, monitoring, and governance; build a business-process model that takes advantage of the cloud solution; and build a financial model that's aligned with budget cycles and that projects costs out at least three years to eliminate surprises. Anderson also recommends conducting a security audit that ensures the cloud solution meets security needs and expectations, and that the auditing process/auditors sign off on any regulatory requirements. Finally, train everyone impacted by cloud adoption.
Bartoletti, meanwhile, advises that companies start by focusing conversations on applications. Specifically, they should concentrate technology efforts on delivering terrific application experiences no matter how they're sourced, he says. A company should take a fresh look at each and every app its business depends on, determining how critical it is to the business and if it's a real differentiator that provides a competitive advantage. Then, consider whether you can move the app completely to the cloud or if an MSP (managed service provider) can run the app. This kind of application analysis will lessen the fear of cloud impact, he says.

"Enlisting a business champion can drive a shift to the cloud. Arguing the case on technical grounds will never work. What's needed is someone who can argue the case at a business level."
CLIVE LONGBOTTOM
Founder
Quocirca

For example, say a company relies on 100 applications. Upon evaluation, management might determine that 20 of those apps are critical, with heavy customization, and should be kept in-house. Of the remaining 80 apps, officials would want to analyze which are decent cloud options and assign a team to investigate a cloud migration. Bartoletti says setting a cloud plan by identifying which parts of the business change or move quickly and which applications it depends on does two things: It lets employees know the company has a plan, and, for employees focused on those critical applications not moving to the cloud, it lets them know the company values those apps.
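Bartoletti's 100-app example amounts to a simple portfolio triage. The sketch below is illustrative only; the criteria, their weights, and the category labels are hypothetical, not drawn from the article:

```python
# Minimal sketch of app-portfolio triage for cloud migration.
# The criteria and thresholds here are hypothetical examples.

from dataclasses import dataclass

@dataclass
class App:
    name: str
    business_critical: bool    # core to the business?
    heavily_customized: bool   # deep in-house customization?
    cloud_ready: bool          # supported stack, no hard on-prem dependencies?

def triage(app: App) -> str:
    # Critical, heavily customized apps stay in-house (the "20 of 100").
    if app.business_critical and app.heavily_customized:
        return "keep in-house"
    # Straightforward, cloud-ready apps become migration candidates.
    if app.cloud_ready:
        return "assign team to investigate cloud migration"
    # Everything else needs a closer look (e.g., an MSP could run it).
    return "evaluate MSP or re-platforming"

portfolio = [
    App("order-management", business_critical=True,
        heavily_customized=True, cloud_ready=False),
    App("marketing-site", business_critical=False,
        heavily_customized=False, cloud_ready=True),
    App("legacy-reporting", business_critical=False,
        heavily_customized=True, cloud_ready=False),
]

for app in portfolio:
    print(f"{app.name}: {triage(app)}")
```

Publishing even a rough classification like this is what gives employees the "the company has a plan" signal Bartoletti describes.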

Changing The Culture

To align company culture with a cloud model, operating like a cloud company before adopting cloud services is one option. "Use chargeback. Delegate accountability for IT budgets. Think in terms of self-service. Expect IT to operate in a scalable, elastic, and on-demand manner," says Anderson. "Think about all IT capabilities as services."
Identifying innovators open to trying different things, as opposed to those who milk certain skills they've developed, is another option Bartoletti cites. Specifically, he says, reinforce that the cloud isn't a competitive threat but a new means for doing jobs better. Similarly, Longbottom says enlisting a business champion can drive a shift to the cloud. "Arguing the case on technical grounds will never work. What's needed is someone who can argue the case at a business level," he says. This might mean the CIO enlisting the COO, CFO, or another executive to emphasize how adopting a cloud model will position the business to better deal with modern markets, change more rapidly to meet new needs, and cope with highly flexible market conditions.

Ultimately, it's likely companies will question whether the effort is worthwhile considering the disruption a cloud model can bring about. Anderson believes it's a question every company must ask itself. Benefits aren't always obvious, and other factors like industry, geography, company culture, etc., all play into the potential for success, he says. Still, most companies will eventually find value in cloud services and deal with the resulting disruption, he says.

Longbottom agrees, noting that a majority of non-cloud IT platforms run at about 10% server utilization, 30% to 60% storage utilization, and 30% network utilization. "The wastage in energy, licenses, space, and resources to run such underutilized systems is horrendous," he says. "Cloud allows utilization rates to be upped to the 80% levels in most cases." Further, because public cloud entails sharing resources, prices should be more predictable and result in a platform cheaper than one operated in-house.

Grossner points to survey results as evidence that enterprises intend to increase cloud service adoption. Specifically, he forecasted the North American off-premises cloud services market would climb from $40 billion in 2013 to $200 billion in 2018. "I don't think we'd be seeing that rate of adoption unless they believed the challenges were worth it," he says.

The Philosophy Of As-A-Service


WHY ENTERPRISES ARE INCREASINGLY EMBRACING XAAS-CENTRIC STRATEGIES

KEY POINTS

XaaS helps companies gain agility and flexibility and reduce infrastructure responsibilities.

Some experts believe that to be competitive in their markets, vendors using traditional licensing models will need to add as-a-service offerings.

Noteworthy XaaS offerings include business process, database, desktop, security, and unified communications.

Adopting an XaaS-centric strategy can enable internal IT organizations and businesses to focus their efforts on what's core to their business.

SOME COMPANIES' USAGE of cloud computing services consists of three letters: S, P, and I. That is, S for SaaS (software as a service), P for PaaS (platform as a service), and I for IaaS (infrastructure as a service). The alphabet soup of as-a-service possibilities for other companies, however, is much broader, including the likes of BP (business process), C (communications), D (desktop), DB (database), M (monitoring), N (networking), SEC (security), UC (unified communications), and others.

The list of as-a-service offerings is growing so long, in fact, it seems anything is available as a service today. Thus the XaaS acronym, which stands for anything (or everything) as a service and under which all these as-a-service offerings reside. It's worth noting that for some companies XaaS is more than an acronym; it's a preferred approach to meeting infrastructure and service needs.

Often what's top of mind among executives when it comes to accessing all these service types, says Colm Keegan, Enterprise Strategy Group senior analyst, is determining which of them have demonstrative value in terms of helping their companies do things faster and with greater agility. "CEOs are seeing the market changing so quickly, and their businesses are entering spaces they've never been before," he says. Their motivation in embracing XaaS is to help ensure their companies can react as opportunities emerge and compete effectively, he says.

Undoubtedly, embracing an XaaS mentality has its benefits, including helping companies reduce capital investments in infrastructure and enabling quicker times to market, greater flexibility and ease in provisioning, and easy remote access to resources. Although SaaS, PaaS, and IaaS are the most familiar as-a-service examples of XaaS, this article explores several newer trends concerning XaaS that are making an impact.

"And there's no right or wrong way necessarily. It's about what your individual unique needs are and best suiting those services to meet those needs."
COLM KEEGAN
Senior Analyst
Enterprise Strategy Group

Setting The Scene

In short, XaaS collectively refers to a growing list of cloud services delivered over the Internet, typically via a subscription or metered model. To qualify as cloud, Ed Anderson, Gartner research vice president, believes a service must possess certain characteristics: it must be self-service, elastic and scalable, shared among consumers, metered, and accessible using standard Internet technologies. Some technologies are better suited to as-a-service delivery models than others, he says. Further, Anderson suspects most companies will pragmatically choose technologies or services that best fit their needs and then build hybrid management and integration around the broad, disparate collection of stuff they use.

Dave Bartoletti, Forrester Research principal analyst, views XaaS as a broad trend away from obtaining fixed, licensed services and toward using elastic, rented services. "[XaaS] is affecting every software category and, increasingly, every hardware category, as well," he says. Companies are shifting, and will continue to shift, to more XaaS-centric strategies so they can get products to market faster, enter new markets quicker, reduce up-front capital investments, and lessen their need to build out internal data centers, he says.

To Bartoletti's last point, XaaS makes IT services consumable. Compared with having to provision and configure internal infrastructure and load data over the course of days, weeks, or months, XaaS effectively says: the infrastructure is already here, in a model from which you can specify what you need, when you need it, for how long, and on a metered or subscription basis.
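Anderson's characteristics amount to a quick qualification test for "cloud." As a toy illustration (the criteria names come from the article; the function itself is just a sketch):

```python
# Toy sketch: checking a service against Anderson's "cloud" criteria.
# The criteria come from the article; this check is illustrative only.

CRITERIA = ("self_service", "elastic_and_scalable", "shared_among_consumers",
            "metered", "standard_internet_access")

def qualifies_as_cloud(service: dict) -> bool:
    missing = [c for c in CRITERIA if not service.get(c, False)]
    if missing:
        print("Not cloud; missing:", ", ".join(missing))
        return False
    return True

# A hosted app with fixed, manually resized capacity fails the test.
hosted_app = {
    "self_service": True,
    "elastic_and_scalable": False,
    "shared_among_consumers": True,
    "metered": True,
    "standard_internet_access": True,
}
print(qualifies_as_cloud(hosted_app))  # False: not elastic and scalable
```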

Exciting Times

A key component driving companies toward meeting their needs in an XaaS-centric manner is the desire to focus their efforts on what's core to their businesses as opposed to infrastructure. Anything that isn't core is often just as easy to outsource, Keegan says. Moreover, doing so can free a company to seek out new areas to enter, whether related or peripheral to its own market or outside it entirely. "It's kind of a de-cluttering process," he says. There are easier, more consumable ways of obtaining services as reliably as, or arguably more reliably than, companies can provide them themselves, Keegan says.

Interestingly, some XaaS offerings, such as health care as a service, are extremely industry-specific. Although such offerings have a narrower consumption base, this doesn't necessarily make them less valuable. "That's really what it comes down to," says Keegan. "If there's a service you can consume that will help you save money and enable you to have a laser focus on your business, there are probably some legs to it."
Something to note concerning XaaS, however, is that just because a provider dubs its service "cloud," that doesn't make it so. Anderson says he believes many vendors brand services "cloud" as a marketing attempt to position offerings in a progressive way. XaaS services that Anderson does deem truly cloud and finds noteworthy include BPaaS, the delivery of a business process as a cloud service. Essentially, BPaaS entails elements of SaaS, PaaS, and IaaS, and puts a business process (order management, for example) in the service provider's hands to perform and monitor. Anderson also tabs UCaaS, DaaS, DRaaS (disaster recovery as a service), and ITaaS (IT as a service) as newer XaaS trends worth noting.

Bartoletti also cites BPaaS and DaaS as interesting as-a-service trends, as well as MBaaS (mobile back-end as a service). "Nearly every data center technology can now be delivered as a service for at least a subset of use cases," he says. The most exciting as-a-service developments are happening in the cloud and mobile spaces, where time to market is critical. MBaaS and SECaaS are especially attractive to companies building hyper-scale Web and mobile applications that don't have the time or budget to implement the back-end technologies themselves, he says.

SECaaS is a market Gartner predicts will extend beyond $3 billion this year. The appeal of SECaaS rests largely in the ability it provides to offload security tasks from internal IT while simultaneously providing security features that are often better than what a company could implement and manage itself. Seemingly, any security component an enterprise could want is available in cloud form, including SIEM (security information and event management); Web application firewalls; advanced threat management; and capabilities addressing email and Web security, virus scans, Web content and URL blocking, spam, IAM (identity and access management), and application security testing.
Like Bartoletti and Anderson, Keegan views DaaS as something worth considering. Essentially, DaaS is a form of VDI (virtual desktop infrastructure) that a cloud provider hosts and offers on a subscription basis. Unlike a traditional on-premises solution in which internal IT would manage the VDI, DaaS typically entails a provider managing the company's storage, backups, data security, upgrades, application support, etc. Furthermore, DaaS enables users to easily access data and applications from anywhere and from any supported device. Overall, Keegan says, ESG still sees desktops as something largely managed on-premises. "I think there's a market for DaaS, but it's probably going to be applicable to newer companies that are walking in without a sunken investment in that infrastructure," he says.
Another as-a-service trend Keegan believes has possible staying power is DBaaS (database as a service), due to the practical uses and flexibility it offers. "For example, depending on the database vendor," he says, "you're not locked into either way. You can consume it any way you want, including as a hybrid model. I think that's what a lot of businesses are shooting for. They want a combination of things they can manage as well as things they can push into the cloud."
Keegan points to one vendor's DBaaS offering as an example. A company looking to perform, say, testing and development could launch an instance of the vendor's database in its cloud space in conjunction with what the company has running on-premises. Unlike traditional IaaS, in which a company acquires compute, storage, etc. but must still load development tools and data before getting started, the vendor combines IaaS with PaaS and a cloud database instance.

"[Many companies] love it when they can consume their technologies as a service. [This makes XaaS offerings] an important part of the market opportunity."

ED ANDERSON
Research Vice President
Gartner

Thus, the vendor's developer tools are pre-integrated in the cloud instance, "so you're pretty much off and running as soon as that infrastructure gets provisioned," he says. When testing and development is over, the company can spin down the instance and stop incurring any more costs. "And you can ingest that data back from the cloud into your on-premises location once the development cycles are done and start running the application from your on-premises instance," Keegan says.
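
As a concrete illustration of that spin-up, spin-down pattern, the following sketch uses Amazon RDS via the boto3 Python library as a stand-in for the unnamed vendor's DBaaS; the instance name, size, and credentials are placeholder assumptions, and the actual offering Keegan describes may expose a very different interface.

    import boto3  # AWS SDK for Python

    rds = boto3.client("rds", region_name="us-east-1")

    # Launch a throwaway database instance for a test/dev cycle.
    rds.create_db_instance(
        DBInstanceIdentifier="dev-test-01",  # placeholder name
        Engine="postgres",
        DBInstanceClass="db.t2.medium",      # placeholder size
        MasterUsername="devadmin",
        MasterUserPassword="change-me",      # use a real secret store
        AllocatedStorage=20,                 # GB
    )

    # Block until the instance is available, then hand it to the dev team.
    rds.get_waiter("db_instance_available").wait(
        DBInstanceIdentifier="dev-test-01"
    )

    # ... run the testing and development cycle against the cloud instance ...

    # Spin the instance down so it stops incurring charges; snapshot first
    # if the data needs to be ingested back on-premises.
    rds.delete_db_instance(
        DBInstanceIdentifier="dev-test-01",
        SkipFinalSnapshot=True,
    )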
Two additional as-a-service offerings growing in popularity are CaaS (communications as a service) and UCaaS. The former essentially provides various communication options, such as VoIP (voice over IP), IM (instant messaging), and videoconferencing, that the provider owns, manages, and delivers via the Internet. The allure for customers is the ability to easily add and reduce devices and features as demand necessitates, all while greatly reducing capital investments and management responsibilities.
UCaaS, meanwhile, provides essentially the same unified communications abilities companies have traditionally run on-premises but moves them to a cloud-hosted provider to operate and manage. Companies typically pay a monthly, per-user fee that can fluctuate depending on the number of users and the UC features each uses. Conveniently, these features can differ per user. This compares to buying and managing an internal solution, something that historically only larger enterprises could afford due to the complexity and costs involved.

Depending on the solution, communication features included in a UCaaS product can include voice, messaging (email, voicemail, unified messaging), desktop video, IM, audio/video/Web conferencing, presence technology, contact center abilities, and collaboration features. Some companies decide to deploy only certain point UC components to the cloud rather than obtain a complete UC solution.
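
The per-user, per-feature arithmetic behind that monthly fee is simple enough to sketch in a few lines of Python. The tier names and prices below are invented for illustration and are not drawn from any actual UCaaS price list.

    # Hypothetical monthly per-user prices for three UCaaS feature tiers.
    TIER_PRICES = {
        "voice_only": 10.00,      # voice and voicemail
        "standard": 20.00,        # adds IM, presence, and conferencing
        "contact_center": 45.00,  # adds contact center abilities
    }

    # Each user is billed at whichever tier matches the features they use.
    users = {"voice_only": 120, "standard": 60, "contact_center": 8}

    monthly_bill = sum(TIER_PRICES[tier] * count
                       for tier, count in users.items())
    print("Monthly UCaaS bill: $%.2f" % monthly_bill)  # $2760.00

Because the bill is recomputed each month from current head counts, adding or removing users (or moving a user between tiers) changes costs immediately, which is exactly the elasticity that distinguishes the model from a capital purchase.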

A Change In Tradition
In addition to influencing companies that are adopting as-a-service strategies, the XaaS concept is impacting vendors that have traditionally sold products vs. individual services. "Traditional software companies must rethink licensing models," says Bartoletti. "Pay-per-use and self-service access are the hallmarks of as-a-service, and depending on the age and penetration of an existing product, it can be quite difficult for a vendor to make the switch to completely new licensing models."
Overall, many companies now simply prefer purchasing a subscription and "love it when they can consume their technologies as a service," says Anderson. This makes XaaS offerings "an important part of the market opportunity," he explains. In fact, he adds, as-a-service is growing at rates much faster than the traditional markets. "If you extrapolate the trends, it's unlikely that technology vendors will be able to compete over the long term without an as-a-service option for their offerings," he says.


The Problems With BYOD


DESPITE SOME SUCCESSES, BYOD GROWTH IS WANING AMONG LARGE ENTERPRISES

KEY POINTS

BYOD (bring your own device) growth hasn't been as dramatic as expected, especially in larger enterprises.

Many companies don't have mobility or BYOD teams in place, and therefore don't have formal BYOD policies.

Security and compliance are two major barriers to BYOD adoption, but cloud integration is also contributing to the slower growth of BYOD.

Businesses using BYOD are seeing improved employee productivity, better IT support efficiency, and cost savings.


BYOD (BRING YOUR OWN DEVICE) policies are often discussed because they have the potential to save money and minimize the burden on IT teams. And although many businesses have found success with BYOD, the adoption of such programs isn't growing as much as some anticipated.

In its recent "Building Digital Organizations" report, CTIA points out that the number of companies with no BYOD policy at all is trending upward. In fact, from 2013 to 2015, the percentage of organizations that don't allow BYOD rose from 34% to 53%. And while there are a variety of reasons why businesses are finding it difficult to adopt BYOD, some of the most basic challenges relate to the immaturity of BYOD policies in general, and the fact that some enterprises simply aren't prepared to do everything it takes to support such a policy.

The Current State Of BYOD


According to Terri McClure, senior analyst with the Enterprise Strategy Group, the state of BYOD is "a little bit all over the place right now" and is essentially "the Wild West." One of the reasons why BYOD adoption is slow, or seems slow, she says, is that a lot of organizations don't have teams in place that are formally responsible for a mobility policy, and many of them are only just now starting to build those teams. Due to BYOD's relative immaturity, organizations are still figuring out the best path to making it part of their overall business strategy.
Still, even with some uncertainty around BYOD, the splits are relatively even between companies that embrace BYOD fully, partially, or not at all. "We found that actually a third of the organizations we spoke with have formal BYOD policies in place that allow employees to bring in their own devices for business-related needs," says McClure. "Another third have no policy in place, but they do allow it. And then another third says they don't have a policy in place and employees aren't allowed to use their own devices."
Some of this split is due to companies from different industries being unable to implement BYOD for compliance reasons or due to the nature of the industry. "If you're in a highly regulated industry, like banking, financial, or health care, then BYOD is probably not going to be the solution that your organization is looking for," says Bryan Bassett, research analyst at IDC. "You have regulatory statutes and HIPAA, so having an employee bring their own device in to work probably isn't going to be the best solution." The same could be said for industries like manufacturing or shipping and logistics. "These devices are mission-critical to the company and they need to be used in fairly unforgiving environments, so you might not want to bring your phone that's worth hundreds of dollars into that type of environment."

Barriers To BYOD Adoption


Security is another major barrier preventing BYOD from taking off, especially in larger enterprises. McClure says that with desktops and laptops, IT can have total control because everything is inside the VPN (virtual private network) and inside the firewall, but with BYOD the device is outside of IT's control. She says businesses struggle to convince employees to put MDM (mobile device management) solutions on their personal devices for security, because employees don't want the company controlling, seeing, and managing all of the data that's on there.

"[With cloud storage, it's] good if the employee can access that data without storing it locally on their device, but at the same time, if that device were to fall into the wrong hands and the cloud data isn't properly secured, then somebody probably has access to more than they should."

BRYAN BASSETT
Research Analyst
IDC
To combat this issue, companies are
starting to take advantage of application
management, app wrapping, and containerization technologies to essentially separate personal and corporate data on the
device. However, many of these technologies are relatively new, so many organizations are just beginning to investigate how
to use them to help embrace enterprise
mobility and BYOD.
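
In practice, that separation is typically expressed as a policy the MDM or containerization product enforces on each enrolled device. The sketch below models such a policy in plain Python; the field names are invented for illustration and are not taken from any real product's schema.

    # A toy model of a BYOD containerization policy: corporate apps and data
    # live in a managed "container," while the personal side of the device
    # is left untouched.
    corporate_policy = {
        "container_apps": ["mail", "calendar", "crm"],  # managed apps only
        "require_passcode": True,
        "encrypt_container": True,
        "block_copy_paste_out": True,           # keep corporate data inside
        "remote_wipe_scope": "container_only",  # personal data is never wiped
    }

    def can_wipe(target):
        """IT may wipe corporate data, but not the employee's personal side."""
        return (corporate_policy["remote_wipe_scope"] == "container_only"
                and target == "container")

    print(can_wipe("container"))        # True
    print(can_wipe("personal_photos"))  # False

The "remote_wipe_scope" line captures the bargain that makes BYOD palatable to employees: the company can protect and erase its own data without ever touching personal photos, messages, or apps.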
Another barrier to BYOD adoption relates to general integration into the IT infrastructure as well as integration with the cloud. According to the CTIA report, mobility and the cloud are growing together and often working hand-in-hand, which is introducing new infrastructure-related challenges. The report says that IT teams are struggling to keep up, both in terms of time and money, to properly implement cloud and mobile technologies. In essence, because the cloud and mobility are developing together, businesses are finding it difficult to put the resources in place to support both initiatives.
It's also with the integration between mobile devices and the cloud that security issues come into play. Bassett says employees accessing corporate files from the cloud should be a great idea, in theory, because the files aren't stored locally, but there's more to it than that. "If that device were to fall into the wrong hands and the cloud data isn't properly secured, then somebody probably has access to more than they should," says Bassett. "I certainly think that the cloud will play a big part in it."

These charts indicate the percentage of companies moving away from BYOD (bring your own device) programs as the primary device method. (Reprinted from the "Building Digital Organizations" report with permission from CTIA.)

The Future Of BYOD


One interesting aspect about how BYOD is playing out is that it's gaining more ground among smaller businesses than larger organizations. McClure says that SMBs (small to midsize businesses) have an advantage in this area because they often have only one person or a handful of people in charge of the cross-functionality of IT within the business. Large enterprises, on the other hand, often have desktop admins, network admins, security people, governance people, and legal people that all have clearly defined responsibilities, she says. This makes it even more difficult for enterprises to get everyone on the same page to not only build strong BYOD policies, but also to enforce them once they're in place.

"Organizationally, enterprises aren't set up to support mobility. . . . Server virtualization touches servers, networking, and storage, and there are wars between the storage admins and the server admins. You get the same thing here. You're trying to balance security and productivity and you're trying to let people use their mobile devices and meet the very strict security environments. If you lock down your security requirements, people are just going to go rogue and start using their own thing."

TERRI MCCLURE
Senior Analyst
Enterprise Strategy Group

Even with all of this difficulty and the major challenges, businesses are finding success with BYOD, which could lead to future growth as policies and technologies mature. In her 2015 research on IT spending intentions, McClure and ESG found that 18% of companies that had a formal BYOD policy saw a significant positive impact on employee productivity and that IT was dealing with fewer endpoint device issues. In addition, 49% of those companies with a formal BYOD policy saw a positive impact on both employee productivity and IT support calls. What this shows is that nearly 70% of businesses that do fully support BYOD believed that it had a positive impact on the business. Bassett's research at IDC also jibes with the idea of BYOD seeing success, as a recent study found that 69% of enterprises that went with BYOD saw marked savings in their bottom lines.
These figures show that BYOD isn't going anywhere, but at the same time, that doesn't mean that alternative models will fall away. In fact, Bassett believes that BYOD will continue to grow alongside corporate-liable, CYOD (choose your own device), and COPE (corporate-owned, personally enabled) policies. It all comes down to finding the approach that best fits your business and making sure your employees have access to whatever devices will help them be the most productive.

IT teams are having difficulty adopting both cloud and mobile technologies for a variety of reasons, which could be a contributor to the slow growth of BYOD (bring your own device) policies. This chart illustrates why integration remains challenging when it comes to cloud/mobility adoption. (Reprinted from the "Building Digital Organizations" report with permission from CTIA.)



The Ethics Of Data Analytics


BUSINESSES MUST FOLLOW THEIR OWN POLICIES & CODES IN ADDITION TO THE LAW

KEY POINTS

Ethical data usage concerns range from privacy issues and making poor business decisions to potentially endangering lives.

Following the law is a good start, but maintaining legality won't necessarily prevent you from being unethical.

It's important to not only create a strong internal ethics policy, but to also build a culture of honesty and loyalty.

Data scientists should consider following a professional code of conduct to help them avoid unethical practices and properly guide the business.


THE GROWTH OF THE big data analytics market is exciting for businesses because it promises to bring new insights and potential competitive advantages to the forefront, but it can also introduce certain ethical challenges. In fact, a recent Gartner study showed that by 2018, half of all business ethics violations will happen as a result of improper big data analytics usage. For that reason, companies need to be very careful that as they embrace the idea of using analytics to make business decisions, they don't put themselves or their customers in jeopardy.

Potentially Harmful Analytics


Using data, especially the personal data of consumers, for analytics and making business decisions can quickly become unethical or immoral if proper care isn't taken regarding the types of information gathered and used. If a business gets too personal with customer information, it may be viewed as an invasion of privacy and could result in the loss of a customer. And even though the U.S. doesn't have the strongest laws around privacy, the European Union and other countries have more stringent privacy laws, says Michael Walker, co-founder and president of the Data Science Association. This means, depending on where you're practicing analytics, that it can be easy to shift from unethical to illegal.
Although privacy is a major concern for some consumers, it isn't necessarily the most significant risk associated with big data analytics. In fact, according to Walker, perhaps the biggest issue has to do with improper use of data and making faulty assumptions that lead business leaders to make ill-informed decisions. However, this works both ways. On one hand, a data scientist may make a mistake and give the executive team bad information, which will lead to a less than sound business strategy. On the other hand, there may be a strategy that the executive wants to implement, but they need justification, so they go to the data scientist and tell them to find information that supports the initiative and essentially ignore any detracting data.
The problem with poor decision-making based on bad data is that it sometimes not only affects the business, but can also be passed along to the public. One example Walker uses is that of dietary guidelines established in the 1950s. The U.S. government warned the public about the dangers of high-cholesterol, high-fat diets and instead recommended a high-carbohydrate diet. However, Walker says these reports were based on weak correlations, and we've only recently started to fix those mistakes.

"Now they're finding out that you can eat a high-fat diet . . . ," says Walker. "It's the high-carbohydrate diet that really creates a lot of ill effects, and you're better off eating more of a Mediterranean diet or other types of diets that are lower in carbohydrates. I think a side effect from that is that people became obese . . . and I think that creates a lot of the environment for heart attacks and other types of bad health effects. That all stems from negligent science, and they're just now coming around to correct it."
There are countless other examples of data, in general, being used in unethical ways, "from judges allowing junk science into the courtroom that can skew a lot of legal cases to quantitative analysts (aka quants) on Wall Street building flawed predictive analytic models that led to the housing market crash and economic crisis in 2008," Walker says. However, some of these examples cross the line from unethical into illegal territory, which is a distinction businesses need to learn to identify and take into consideration.

Illegal, Unethical & Everything In Between

The problem with thinking in legal terms with data analytics is that because the U.S. doesn't have stringent privacy laws, simply following the law isn't enough to prevent the unethical use of information. In fact, there was a recent case about the National Security Agency collecting data from phones that was first deemed illegal by one court and then deemed legal by another court later on, which shows how the law can be a major gray area. There are obvious situations where gathering data is illegal, such as hacking into devices and stealing information directly from them, but when it comes to business analytics and big data, it's better to not only rely on the law, but also come up with your own reasons for why you should or shouldn't use data in a certain way.

"There is no consensus whatsoever on what the right thing or wrong thing is to do and there are many ways of looking at that, but there is sort of a consensus that there's a hierarchy of motivation for why you do something or don't do something. The hierarchy of motivations we just walked through, from compliance to values, is a very good measure to figure out whether you're relatively mature or relatively immature in your discussions on ethics."

FRANK BUYTENDIJK
Research Vice President & Distinguished Analyst
Gartner
Frank Buytendijk, research vice president and distinguished analyst at Gartner, points out four ways to look at digital ethics, both generally and specifically as it relates to data usage. According to Buytendijk, the lowest level, interestingly enough, is the legal one. The reason for that is that there are strategies that may keep you out of legal trouble but that still aren't necessarily ethical. For that reason, you have to consider the other three levels of data ethics in order to make a more well-informed decision. That starts with the second layer after legal, which is deciding whether or not to do something because of risk. "You're afraid of offending someone or you're afraid that there are negative consequences for you if you do," says Buytendijk. "It's acting out of some kind of fear. That's also in the lower stages of moral development."
The next highest level has to do with using or not using data out of differentiation. In this instance, a company may look at its competitors that do collect and sell information from consumers, and create a specific policy where it promises it won't sell consumer data or even gather it at all. "With all of the personalization that's going on in health insurance at the moment, I could totally understand if there was a large health insurance company that says, 'With us, we don't even do big data,'" says Buytendijk. "That could be a very competitive positioning."
The highest level for deciding whether or not to use data in certain ways has to do with your organization's values. Although Buytendijk admits that there is no consensus on right or wrong, there is a hierarchy of motivation that should aid in the decision-making process. "The hierarchy of motivations we just walked through, from compliance to values, is a very good measure to figure out whether you're relatively mature or relatively immature in your discussions on ethics," he says.

Building An Internal Data Ethics Policy & Strong Company Culture

Buytendijk says because there is no universally accepted common ground for what is right vs. wrong regarding data analytics, and because there are so many different schools of thought on the matter, the best thing you can do when building an internal ethics policy is to listen to all opinions and try to create something that is defendable.
Instead of avoiding policy implementation because it's too difficult, simply
do your best to build a policy that
you believe in and keep in mind that
someone somewhere will undoubtedly
disagree with it. And in the event of
a disagreement, you can at least trace
good intentions, address unintended
consequences as they arise, and make
changes to the policy to lessen the
chance of future disagreements.
In fact, Buytendijk says, there are only three ideas that he would consider to be nearly universally accepted (meaning he couldn't find a situation where he wouldn't recommend them). The first is understanding that there is no universal truth; the second is building that defendable policy; and the third is performing the mirror test when making a decision and building an ethics policy. "If you look at yourself in the mirror and it doesn't feel right, then it probably isn't right," says Buytendijk. "How would I feel being treated like this as a citizen, consumer, or just as a person? How would my mother, father, grandparents, or children feel? Make it personal."

But in addition to creating policies, you also need to have a strong company culture to back them up. "If there's a bad organizational culture, I don't know that a code of ethics is going to help at all," says Walker. "Some of these investment banks and hedge fund managers in the financial sector are just bad people and have bad cultures. A code of ethics isn't going to matter because they're going to use [data scientists] to justify whatever it is they want to do. The number one thing that an organization has to have is a good solid culture where you respect people's work and a culture of respect and honesty."

"There are going to be bad actors out there that want to misuse it, but we need to try to isolate them and create a culture where people are going to use it for the betterment of all. We want data science to be used for the majority of people, not just the minority."

MICHAEL WALKER
Co-Founder & President
Data Science Association

Choose A Code & Follow It


While it's important for businesses to have strong policies and cultures in place to help ensure proper ethics, it's also important for data scientists and analysts to have their own ethical codes so they can steer the business in the right direction. The Data Science Association, for example, has a Data Science Code of Professional Conduct that offers an outline for data scientists to follow in their day-to-day lives. And while there are quite a few concepts covered in the code, there are three major ideas that Walker likes to highlight: competence, client confidentiality, and loyalty to the scientific method.

Competence. Walker says there are quite a few garden-variety analysts who like to call themselves data scientists in order to get a larger salary but aren't fully trained in the scientific method. He says that his team gets calls from businesses all the time saying that their internal data science team has been leading them astray and asking for help. Unfortunately, there are no standards or best practices in place to establish a minimum level of competence required to be considered a true data scientist, but he says it's being worked on and is maybe a couple of years away from reality.
Client confidentiality. Walker says that businesses are often concerned about hiring expert data scientists, who are less likely to make mistakes and false assumptions, because data scientists are in such high demand and could get snatched up by a rival company. The worry is that if that were to happen, the data scientist could potentially share sensitive corporate information with a competitor, which is of course highly unethical and possibly illegal in some instances. "As an external data scientist, I have a duty of loyalty to my client," says Walker. "If my team does work for that client, they own it. I am not allowed by my ethical code, and sometimes by law, to take that and give it to one of my other clients. That's extremely important," he adds, "and one of the dirty little secrets is that it happens all the time."
Loyalty to the scientific method. "You're using the scientific method in your predictive or prescriptive models, and you're using the appropriate statistical power laws," he says. "When you're dealing with big data, and there's a lot of hype around big data, you're going to get a lot of patterns and correlations. That's where a lot of these data analytics teams go wrong. They start seeing these patterns and correlations and say, 'Wow, look at all of this.' The problem is that there's very little causality. You really have to find causality to find high value. Not that patterns or correlations don't have value (they do, and they can be very useful) but you can't start making decisions or be certain that this result is realistic and high value until you have causality."
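
A quick numerical sketch makes Walker's warning concrete: two series that have nothing to do with each other can still correlate strongly, which is exactly the trap pattern-hunting in big data can fall into. The example below, a minimal sketch using Python and numpy, generates two independent random walks and measures their correlation; nothing here is drawn from any real dataset.

    import numpy as np

    rng = np.random.default_rng(seed=7)

    # Two independent random walks -- no causal link whatsoever.
    walk_a = np.cumsum(rng.normal(size=500))
    walk_b = np.cumsum(rng.normal(size=500))

    # Trending series like these frequently show large spurious correlations,
    # even though neither one "explains" the other.
    r = np.corrcoef(walk_a, walk_b)[0, 1]
    print("Correlation between two unrelated series: %.2f" % r)

Rerun the script with different seeds and the correlation swings widely, sometimes strongly positive, sometimes strongly negative, which is the point: correlation alone, without a causal mechanism, is a fragile basis for a business decision.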

Greenovations
ENERGY-CONSCIOUS TECH

The technologies that make our lives easier also produce some unwanted side effects on the environment. However, many researchers, manufacturers, and businesses are developing solutions that are designed to keep us productive while reducing energy demands to lessen our impact on the environment. Here's a look at some of the newest such initiatives.

If current projections hold true, the world's top energy consumers will also greatly increase their renewable energy and nuclear energy production over the next 15 years.

World's Top Energy Consumers Will Also Increase Their Renewable & Nuclear Energy Output

According to the World Resources Institute, Brazil, China, the European Union, India, Indonesia, Japan, Mexico, and the United States are the top energy consumers, together accounting for 65% of the world's demand. This group also represents eight of the top 10 greenhouse gas emitters, per the WRI. However, the WRI asserts that this group will also greatly increase their renewable energy supplies (and, in some countries, their nuclear energy supply) by 2030 if stated plans come to fruition. China, for example, will produce a 76% (or 2,800 TWh [terawatt-hour]) increase in its energy supply between 2012 and 2030, and the European Union will produce a 112% (or 2,570 TWh) increase over the same time period. All of these figures are above the projected numbers. The map shown here illustrates the production increases.

Green IT Is Becoming Central To Organizations' Core Business Operations, Gartner Says

Corporations' strides to go green are becoming less an exercise in social responsibility and more an integral part of business operations, according to a new Gartner report. "Green IT is moving beyond the environmental characteristics of IT equipment, allowing organizations to improve their environmental footprint by using equipment and services that have a low carbon footprint themselves," says Vishal Tripathi, research director with Gartner. As a result, the identification of wasteful and inefficient (or worse, environmentally harmful) consumption is becoming essential. Gartner adds that many data center solutions now include technology that reduces energy consumption, particularly for equipment in idle mode, as a matter of course.


Making Fuel From Thin Air


A new power plant in Squamish,
British Columbia, performs an intriguing feat: It captures about a ton of
carbon dioxide from the air per day for
a process that turns that captured CO2
into fuels. If all goes well, the fuels will
be used to power the city's transit buses.

Lawrence Livermore National Laboratory postdoc Jianchao Ye (foreground) and Morris Wang are
two members of the 13-scientist team who developed a method for improving the lifespan and
performance of lithium-ion batteries. (Image courtesy of Livermore.)
A prototype for a new wind turbine design cuts
wind energy costs to about 12 cents per kWh.

Researchers Discover A Way To Improve The Performance Of Lithium-Ion Batteries

Lithium-ion batteries are found in nearly every kind of mobile device these days, offering the best all-around option available in terms of energy capacity, rechargeability, small size, and cost. In the evolution of portable rechargeable batteries, li-ion versions are significantly better in many circumstances than the nickel-cadmium and lead-acid batteries developed before them. There remains, however, the pursuit of a better, longer-lasting charge. With that goal in mind, scientists at the Lawrence Livermore National Laboratory have discovered a way to make li-ion batteries operate faster and last longer: use hydrogen.

One problem with the large-scale production of li-ion batteries, which relies on the use of graphene materials, has been that the chemical process used in manufacturing leaves leftover atomic hydrogen, according to the Livermore scientists. "We found a drastically improved rate capacity in graphene nanofoam electrodes after hydrogen treatment," said Brandon Wood, co-author of the paper about this discovery at Livermore. In other words, adding hydrogen improves the speed and life of li-ion batteries. "The performance improvement we've seen in the electrodes is a breakthrough that has real-world applications," says lead author Jianchao Ye. Overall, this discovery dashes earlier suggestions that the capabilities of li-ion batteries were rapidly approaching their limits.

Paper co-author Jianchao Ye holds up a hydrogen-treated li-ion battery. (Image courtesy of Livermore.)

New Floating Platform Design Reduces Cost Of Wind Energy

In the Canary Islands, wind energy costs about 24 cents per kilowatt-hour to produce, according to the Universitat Politècnica de Catalunya's Department of Civil and Environmental Engineering. But researchers there have devised a new floating structure for supporting offshore wind turbines that cuts that cost in half. The prototype for the structure, called WindCrete, is cylindrical in shape and includes a float and a ballast to remain both afloat and stable. In addition to the innovative shape, the structure uses concrete instead of the steel that floating wind turbines typically use. The researchers behind the project say that concrete fares better than steel in a marine environment (lasting about 50 years), and that the construction cost for the concrete prototype is roughly 60% of what it would cost were it made of steel. The researchers recently announced the design had been built and patented.


The Challenges Of Outsourcing IT


THE COMPLEXITIES INVOLVED IN USING EXTERNAL PROVIDERS FOR IT OPERATIONS

KEY POINTS

Outsourcing IT is on the rise, especially as companies acquire new technologies.

Cementing a solid customer-provider relationship with an IT outsource provider is key to successful outsourcing.

Although IT service providers have technical expertise, some lack specialized business knowledge, which can impact business-specific IT functions.

One complexity in outsourcing IT: integrating the external provider's services into the organization's existing environment for managing services.


IN MANY RESPECTS, outsourcing IT operations has become so prevalent that some executives don't give much thought to what the process actually entails or to the various complexities and challenges it can present their organizations. Most likely, this is partially due to the fact that on the surface, IT outsourcing really amounts to nothing more than an organization hiring an external provider to perform an IT function on its behalf. Below the surface, however, things get more complicated.

Arguably, knowing the complexities and challenges that IT outsourcing poses has never been as important, due to the increasing need for organizations to stay up-to-date with new IT technologies in order to gain or maintain a competitive business edge. Having access to an infrastructure that can enable taking greater advantage of the organization's data and applying analytics to that data on various fronts is a notable example.

Increasingly, companies that can't tap into such abilities via their current IT outsource provider are jumping ship for those that can, which coincidentally introduces its own set of complexities. Overall, even executives with a solid grasp on what it means to outsource IT likely have something to learn about the difficulties the process can introduce. This article illustrates some of these challenges.

A Current Perspective

Organizations outsource IT operations for numerous reasons, including to save costs and improve their technology foundation in an affordable manner. Others seek to revamp their infrastructure to launch a new business initiative, or simply to place development, maintenance, support, and management duties in someone else's hands in order to concentrate their expertise and resources on the business's core competencies. Commonly, organizations outsource IT operations to multiple providers, and some outsource most, if not all, IT tasks.

"Today, companies outsource everything from security to storage, monitoring, IT access management, network management, and hosting of cloud applications," says Charles Weaver, MSPAlliance CEO. "These are the more common services, but there are dozens, if not hundreds, of types of managed IT services offered globally," he says.
Wolfgang Benkel, Forrester Research principal analyst, counts infrastructure services (service desk, end device services, data center and server services, network services, etc.), application services (application maintenance, development, etc.), and consulting services as common processes being outsourced. Forrester is also seeing main vendors offering business services (applications plus infrastructure services), such as BPO (business process outsourcing).
In general, organizations primarily outsource services that are more commodity-oriented and technical in nature while keeping business-relevant functions and services in-house, Benkel says. In terms of service life cycle functions (design, plan, build, run, etc.), he says build and run functions are the main candidates for outsourcing, while design and plan functions are often performed in-house.

Benkel describes sourcing itself as deciding on the right sourcing model, be it in-house, staff augmentation, managed services, or cloud computing, based on a sourcing strategy, business relevance, and internal skills and experience. Sourcing itself is increasing, he says. Of these sourcing models, Benkel counts staff augmentation, managed services, and the cloud as outsourcing, as the resources reside outside the company.

"At the extreme, if you're using Google for corporate searches, then you're outsourcing that global search capability to Google through its data centers and software. These all have an impact on IT. Are they IT outsourcing, process outsourcing, or business outsourcing? Should there be any differentiation these days?"

CLIVE LONGBOTTOM
Founder
Quocirca
"Increasing technology complexity and accelerating business dynamics and changes are forcing companies to use more external resources to master the future services," Benkel says. Companies' main objectives or expectations in doing so include gaining greater flexibility (CAPEX vs. OPEX, for example), agility (faster provisioning of additional capacity to support growth), and reduced costs. "Outsourcing is increasing, but the way to do it is changing. The definition of what is in and what is out, and why, is much more a decision process based on criteria than what companies did in the past," he says.
Pinpointing which IT operations are most common for outsourcing is difficult, as different companies (particularly those of different sizes and in different countries) have varying needs and solutions available, says Clive Longbottom, Quocirca founder and principal analyst. Longbottom also believes IT outsourcing is increasing, although in what areas and by how much depends on how outsourcing is defined. For example, a majority of companies now outsource break-fix functions, and increasingly more systems are being managed externally as system management tools become more capable of running remotely via a lights-out operation, he says.
Depending on the definition, Longbottom also cites cloud computing, business processes (expense management, payroll, etc.), colocation (in terms of outsourcing the facility), hosting (in terms of outsourcing the facility and hardware), and SaaS (software as a service, or Web-based software) as IT outsourcing examples. "At the extreme, if you're using Google for corporate searches, then you're outsourcing that global search capability to Google through its data centers and software," he says. "These all have an impact on IT. Are they IT outsourcing, process outsourcing, or business outsourcing? Should there be any differentiation these days? It's difficult to avoid outsourcing some part of IT these days."
Weaver sees IT outsourcing as being very much on the increase, pointing to managed services and cloud computing as components that are particularly on the rise, taking a greater percentage of overall IT services spending. "We have external data to show that companies are spending on managed services and that it encompasses a variety of vertical markets. Some are more aggressive in growth than others, but it's very prevalent, and it happens at all levels of the spectrum: enterprise, mid-market, and SMBs," he says. Reasons why this is the case are plentiful, but the generic answer is that it's "easier, cheaper, and better to use an outsourced service provider to do the work than take on the responsibility and headache of doing it internally," Weaver says.

The Complexities

IT-related headaches that outsourcing can alleviate generally fall into a few specific categories. One is scalability. For many companies, it's easier to ramp up an IT services department that's outsourced vs. one that's in-house. Another headache is that IT operations consume in-house resources that could otherwise be devoted to the organization's core competencies. "You can't be an expert on all things," Weaver says. "Running an IT department is something that demands expertise, and if you don't have that expertise, why do it? You still rely on IT, so why not outsource it?"
Compared to decades ago, when starting a business essentially could mean renting an office and buying file cabinets and folders, a telephone, and a typewriter, today computers, servers, and mobile devices have replaced most, if not all, of those things. "Because of that change, you now need providers that can help you set up and manage not only the devices but the data that sits on or goes through those devices," Weaver says. "That's why I think you're seeing so many MSPs [managed service providers]. They're simply responding to what has always been a flourishing of small business throughout the world." Today, however, even those small businesses can have demanding infrastructure needs that extend beyond their in-house abilities.
Benkel says increasing technology complexity, accelerating business changes, and high competition have led to gaps in the capabilities of existing IT organizations, gaps they must resolve with external resources and services: not just paying consultants for a service or project, but much more buying services (managed services) or cloud solutions to obtain all the benefits of outsourcing. Benefits include cost reduction; cost flexibility and elasticity; vendor expertise; and investment in automation, provisioning platforms, and tool frameworks for continuous improvement.
The main complexity involved with outsourcing is integrating the outsourcing provider's services into the organization's existing service-management environment, as well as managing multiple sourcing or vendor service environments. "The technical complexity is easier to solve than the complexity of the management and the identification of the right responsibilities that clients should keep in-house," he says.
While hiring a consultant to ease such complexities and help devise an effective IT outsourcing strategy can be worthwhile, it's worth noting that while external IT service providers are technical experts, they may lack certain business knowledge. Benkel says providers' weakness in the area of customers' business, or business understanding, is why most sourcing strategies dictate that business-relevant and business-strategic functions be done in-house while technically relevant functions be outsourced. Conversely, many organizations lack expertise in new technologies to efficiently implement infrastructures or automate service delivery. "These are strong domains of most external providers," Benkel says.

"We have external data to show that companies are spending on managed services and that it encompasses a variety of vertical markets. Some are more aggressive in growth than others, but it's very prevalent, and it happens at all levels of the spectrum: enterprise, mid-market, and SMBs."

CHARLES WEAVER
CEO
MSPAlliance

Weaver similarly says MSPs have largely been technical organizations for the last 20 years, with most coming from engineering or IT backgrounds but lacking training and skills in business areas. They've had to learn the business side to supplement what they already know, he says. "As an industry, MSPs are making very good strides, however, and are learning to translate the sometimes very technical into a non-technical manner and explain why what they do is very valuable," he says.
Another complexity of outsourcing IT is managing the relationship with a provider. A company that's poor at managing third-party relationships may have difficulties with even the best provider. Weaver stresses the importance of making the company-provider relationship the foundation of IT outsourcing. "What you see most commonly are customers that aren't sure what the capabilities of the provider are," he says. For example, a provider may indicate what functions it will perform for the company, but it doesn't end up being exactly what was described. Thus, the company ends up saying, "We really didn't get what we wanted, so we're going to another MSP," Weaver says. MSPAlliance is seeing organizations now on their third, fourth, and sometimes fifth MSP for such reasons.

As a result, organizations are becoming more aware of how to validate and qualify IT outsource providers against their needs. Weaver sees this type of educated consumer market as extraordinarily positive, in part because what providers do is highly technical in some cases and it can be difficult for a company to know if the provider really is managing its IT operations effectively. If a company doesn't take the initiative to educate itself, the only basis it has for knowing it's getting what it should be getting is to ask, "Does the stuff work when I need it to?" Weaver says.
Longbottom believes that as long as companies enter into IT outsourcing with their eyes open, the process doesn't have to be complex. He advises first ensuring the organization knows exactly what is going to be outsourced. "Try and make it process-based, so that you can define pretty closely what the outsourcer has responsibility for," he says. "Never outsource the strategy." For example, if the organization feels a need to migrate to a more modern OS, the outsourcer must respond to that unless it has a very valid reason not to. By allowing the provider decision-making power, it may take a status quo approach, as that costs less in terms of updating staff skills, he says. Further, the provider can cascade less capable staff into your environment, Longbottom says.

Longbottom also recommends conducting constant reviews, as an outsourcing deal even only six months old may no longer fit a particular purpose. "Ensure up-front that flexibility isn't going to result in hidden costs; that the outsourcer doesn't double its prices just because you asked for a small change," he says.

Specific To IT

Compared to outsourcing other types of processes or roles, outsourcing IT can be more complex, although Weaver believes the differences are minute. "The similarities are what we should pay attention to," he says. Specifically, no matter what's being outsourced, organizations aren't abdicating their rights to own that process. If a company outsources its taxes, for example, it is still responsible for its taxes. "If you outsource IT, you still need to know what's happening. You still need to be responsible for the IT assets you own. In that regard, I don't think it's any different than outsourcing something else," Weaver says.

"All vendors and client organizations know how to manage and set up servers or PCs and do this in the same way. Application services are more client-individualized, however, meaning more complexity for vendors in terms of learning the application and other issues."

WOLFGANG BENKEL
Principal Analyst
Forrester Research
Benkel, meanwhile, says infrastructure services are more commodity in nature (standardization, mature pricing models, etc.) than application services. "All vendors and client organizations know how to manage and set up servers or PCs and do this in the same way. Application services are more client-individualized, however, meaning more complexity for vendors in terms of learning the application and other issues," he says. "This is the service perspective, but we also see that complexity is increasing if responsibilities aren't clearly defined." If organizations aren't willing to transfer more responsibility to vendors, for example, finger-pointing, tedious negotiation processes for improvements, and frustration for everyone can result.
In the long term, companies that outsource IT services must ensure services will evolve, improve, and refresh over time so that those services are always up-to-date and can meet rapidly changing business needs, Benkel says. Innovation for new services and solutions is also key, as is choosing IT providers with the capability and expertise to support these processes.
Similarly, ensuring IT outsourcing providers know what's occurring in technical markets, and what impact this has on their environments, is vital. Longbottom says if too much is outsourced, it's difficult to track what's moving from physical to virtual to cloud platforms and what it means to the business or IT itself. In general, aim to choose a provider that's honest, will regularly convey what it sees happening in the markets, and can spell out what options this presents to the business and what the costs of changing or not changing will be.
Overall, an organization that is unable to gain visibility into what a vendor is providing should ask more questions. "Even the garden-variety MSP should be able to say, 'This is what we're doing. This is how we're earning our paycheck every month,'" Weaver says. This includes providing external data, reports, and other evidence indicating what's been patched, monitored, backed up, and so on.

Conversely, it's the organization's responsibility to use that information to drive accountability. "We hear a lot from MSPs that 'We give customers a lot of data, but they don't care. They're not reading it. They're not paying attention to it.' That falls on the end users because they have responsibility, as well," Weaver says.


Data Center Consolidation Done Right


PROPER PLANNING IS KEY TO A SUCCESSFUL MIGRATION

KEY POINTS

Make sure you have a plan in place that details what needs to be done and who is in charge of each step along the way.

Cost of moving and lifecycles are two ideas to consider when deciding whether to keep existing hardware or buy new.

Use a combination of physical tapes and over-the-network updates to migrate data and apps from one location to another.

Consolidation offers many benefits, but without proper guidance and communication, it may take longer than expected and you may not meet your goals.


THE MASSIVE amount of data flowing in and out of data centers, along with the increased storage demands of employees and customers, has led many organizations to build additional data centers or buy rack space in third-party facilities in recent years. In an effort to grab onto as many resources as possible, some of those companies may have oversubscribed themselves and ended up with more infrastructure or more facilities than they need. This is where the concept of data center consolidation comes into play. Whether an organization decides to pare down the infrastructure in one data center or combine multiple data centers into one, consolidation is a complicated process with multiple steps that requires putting a solid plan in place and sticking to that plan.

Reasons To Consolidate

Companies decide to consolidate data centers for different reasons. Perhaps the most common reason is to reduce costs. David J. Cappuccio, vice president and distinguished analyst at Gartner, says many companies find themselves in a position where, either through organic growth or because of merger and acquisition activity, they end up with multiple (up to a dozen or more, in some cases) data centers. Because these separate and often disparate data centers run at the same time and sometimes perform the same services as one another, according to Cappuccio, they may create unnecessary redundancy and, in turn, unnecessary operating costs. In an effort to reduce hardware and software costs, companies will turn to consolidation as a way to combine multiple data centers into one.
In addition to cost reduction, "there's also the opportunity to get better efficiency, so you don't need as much physical infrastructure," says Colm Keegan, senior analyst with the Enterprise Strategy Group, which can also lead to further cost-saving benefits, including reduced power and cooling expenses. Even within the same facility, a company can consolidate a large number of smaller servers into a handful of larger, more efficient, and better-utilized systems. Keegan points out that doing this not only leads to improved power and cooling efficiency, but also makes day-to-day management and maintenance of those systems much easier.

Aside from the physical reasons, you also need to look at security and compliance when considering consolidation. "You can have smaller office locations where maybe the carrying costs aren't that high relative to a full-blown, raised-floor data center, but you may have concerns around data security and data protection," says Keegan. "You may be driven by a need to comply with regulatory statutes where data that is protected under those statutes needs to have a chain of custody, and you need to prove that you've maintained that chain of custody. It's a little easier to do that in a central location than to have it in multiple sites."

Prepare For Consolidation


After you have a firm grasp on why you want to consolidate and what your ultimate goals are, you have to get deep into the planning process to make sure you have a clear path to follow. The first step, according to Cappuccio, is to figure out exactly what objectives you want to meet and what metrics you want to track. It isn't enough to say that you want to save money, because that means other aspects of the process may get lost in the fray. For example, what happens to the people running those data centers you plan to consolidate? In an effort to save money across the board, you may end up weakening your workforce because valuable employees may leave your organization due to fears of losing their jobs.

"It's like anything else; anything that's well thought out and planned beforehand, generally the better off you'll be after the fact, as opposed to rushing headlong into something because there are some compelling events. You may be at the end of a five-year lease at a major data center, you're behind the 8-ball, and you're rushing to get out because you don't want to incur penalties for overstaying your lease. Those are the realities. As much as you'd like to prepare . . . sometimes it just doesn't happen that way."

COLM KEEGAN
Senior Analyst
Enterprise Strategy Group

"The first mistake people make is they tend to not get HR involved early enough," says Cappuccio. If a company is going to consolidate five sites down to two, for example, he says someone needs to go through a list of all the employees supporting the three sites being closed, note which ones are important to keep or involve in the consolidation process, and then talk to them before an announcement is made. "Tell them what's happening, how you're going to make their life whole, how their job is or isn't going to change, and make sure that's all resolved up front before you even start," says Cappuccio.
After you get employees squared away, you can start to think about the infrastructure and making sure that what you consolidate down to is still capable of supporting the needs of your employees and customers. Beware the potential of consolidating too much and not taking into account future growth. "You could have a situation where there's some contention there, where the resources are being shared across these environments and you have to plan how to [distribute them] in a way that's equitable," says Keegan. "That can be tricky, and it's something you're going to want to put some thought into."

Keegan's advice is to not rush the process if at all possible. He says that organizations sometimes get into situations where they're at the end of a five-year data center lease and need to move quickly so they don't incur penalties for overstaying. The key is to prepare as far ahead as possible to give yourself time to properly plan. If not, you could end up with a situation where, for instance, you don't properly bake BCDR (business continuity and disaster recovery) into your consolidation plan, so you end up not having enough redundancy in place and your BCDR team has to work on a shoestring budget, Keegan says.

Impact On Physical Infrastructure


Once consolidation begins, there are decisions to be made in terms of keeping or replacing equipment. Cappuccio says a lot of these decisions come down to product lifecycles and that, in many cases, people don't relocate equipment because "you're talking about not a few miles but sometimes hundreds or thousands of miles, so it's just not worth shipping that equipment, stabilizing it, recertifying it, and reinstalling it." He says there are situations where companies have no choice but to move big systems, like mainframes or large UNIX systems, and that some companies opt to move entire data centers. He points out that it's often easier to bring in a third party to sell the equipment for you after the move.
Keegan agrees that the ideal case for physical infrastructure in a data center consolidation is to buy new equipment for the new facility and not move any of the older equipment over. However, as all businesses know, ideal cases are few and far between. For that reason, Keegan says it's often a mixture of both, where you replace equipment that's end of life and move equipment that still has operational value. "It really depends on what the remaining life is of the asset that will drive that decision," says Keegan. "You look at what the cost is for the labor to move it, transport it, and the risk of that system getting damaged in transit. Nothing ever goes as planned and there are always going to be hiccups. The less you have to physically move, the better off you are."

"A lot of times when people consolidate, they think they'll save a lot of money on staff and they don't. In fact, in most cases, we've seen staff costs either stay fairly flat or sometimes go up slightly. . . . When you use IT staffing as a rationale for consolidation, it's very difficult to justify that you've won at the end of the game, unless you have a really good understanding of what the true business costs are before you start consolidation."

DAVID J. CAPPUCCIO
Vice President & Distinguished Analyst
Gartner

Migrating Data & Applications

Another facet of the consolidation process that sometimes gets overlooked is the migration of data and apps from one location to another. In some cases, you may be able to send data over the network, depending on how much bandwidth you have, but Keegan says the most cost-effective way is typically to use backup tapes. "Say you cut a backup on a Friday night, and then overnight the tapes," he says. "They arrive on Saturday and you start doing the restore. The restore could last multiple days if not multiple weeks. Basically, what you have is the baseline copy at that point, then you say, 'OK, we have changes to apply to that underlying data.' It could be several days' or several weeks' worth of changes, but at least you have the bulk of the information there already, and then you can just push the changes over the network."
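To see why tapes often win for the baseline copy, it helps to run the numbers. Below is a minimal sketch of that comparison; the data-set size, link speed, and utilization figure are hypothetical assumptions, not figures from Keegan.

```python
# Rough comparison: bulk restore over the WAN vs. overnighted backup tapes.
# All figures below are illustrative assumptions, not vendor numbers.

def wan_transfer_days(data_tb: float, link_mbps: float, efficiency: float = 0.7) -> float:
    """Days to push `data_tb` terabytes over a link at a given utilization."""
    bits = data_tb * 8 * 10**12                       # terabytes -> bits
    seconds = bits / (link_mbps * 10**6 * efficiency)
    return seconds / 86_400

baseline_tb = 50    # hypothetical data set
link_mbps = 100     # hypothetical dedicated WAN link

print(f"WAN only: {wan_transfer_days(baseline_tb, link_mbps):.1f} days")

# Tapes carry the 50TB baseline overnight (~1 day); the WAN then only
# moves the few days of changes made since the backup was cut.
delta_tb = 1.5
print(f"Tapes + WAN delta: {1 + wan_transfer_days(delta_tb, link_mbps):.1f} days")
```

With these assumptions the WAN-only path takes more than two months, while the tape-plus-delta path finishes in about three days, which is exactly the trade-off Keegan describes.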
After you get that data in place, Keegan recommends baking time into your plan to test systems before you put them into real-world operation. He says the typical process is to gradually bring up less risky systems before moving on to the more customer-facing applications that need to work properly from the moment they're turned on. "You also want to keep both environments running in tandem for a period of time, just in case there are any unforeseen issues that crop up at the new site," says Keegan. This process can take weeks or even months to get through, but once you feel comfortable with how the new facility is running, you can go through preparations for shutting down your old data center.

Advantages & Potential Disadvantages Of Consolidation

There are plenty of benefits and advantages to strive for when consolidating data centers. You have the opportunity to improve energy efficiency and reduce energy consumption by as much as 50% to 60%, according to Cappuccio. You can cut costs across the board and make management easier for your IT team by standardizing and centralizing the locations of things to get control of the architecture itself. These are all potential benefits of data center consolidation, but only if you go about the process in the right way. There are a few pitfalls as well that could cause problems at different points throughout the process.
For instance, Cappuccio says, if the project isn't managed from a very high level, then the individual business units may become the biggest problems. What he means is that many of these remote sites or secondary offices evolve to a point where they have their own ways of operating, so "they'll keep finding reasons why you should consolidate someone else," which can potentially extend the time it takes to consolidate by months or even years. "Having that central authority who will put a stake in the ground and say, 'This is what we're going to do and this is the timeline,' and stick to it, that's the only way to make these things really work well," Cappuccio says.
One of the biggest challenges of data center consolidation for Cappuccio, and one that needs to be addressed to ensure the success of a project, is communication. He says many people don't realize the consolidation process can take as long as two to four years, and even though companies typically start with a meeting to discuss the plan, it's hard to keep people engaged over a long period of time. It's not enough to send out email reminders and updates about progress.

"What works well is creating some kind of portal that is essentially a communications log of everything that has to do with the consolidation from end to end," says Cappuccio. "Let's say it's role-based, so based on my level of authority, I can log in and get different levels of information about the project: where it is, what's happened so far, what's coming up, what the timelines are, who the new people involved are, what the phone numbers are to call, etc. I have a central source of authority to go to. If I don't have that, then rumors quickly start flying and it gets really chaotic. I see communication being one of the biggest problems that most of these projects have."

Desktop Improvement vs. Replacement

KNOW WHEN TO UPGRADE & WHEN TO BUY NEW

IF YOUR ORGANIZATION'S policy is to replace its PC fleet on a constant basis, you may find the task can become quite expensive and impractical. The truth is that not everyone in your organization needs a new computer every two or three years. In fact, some employees may be able to keep their existing PCs for five years or more with the help of simple upgrades and refurbishing.

What Components To Upgrade & Why

Compared to laptop computers, desktops provide IT departments with a greater number of upgrade options. As with laptops, upgrades to a desktop computer's memory or storage offer the simplest and often the most productive opportunities for improvements. Unlike laptops, however, desktops allow for significant upgrades, such as changes to motherboards, graphics processors, optical drives, and power supply units. You can even replace a desktop's cooling system by swapping out fans for liquid cooling or vice versa. Each one of these components will have a different effect on the performance of the desktop, which means that you may need to replace only one or multiple parts to get the computer back up to speed. The key to a successful upgrade is pinpointing the specific problems users are having and then choosing the right components to upgrade.

Some performance issues are obvious and easy to pinpoint, such as a hard drive that's close to full capacity. In that situation, you would want to either replace the internal hard drive with a larger one or buy an external hard drive to add more capacity without having to make any changes to the PC's internal components. However, even with something as simple as a hard drive, you'll have some decisions to make. Do you want to stick with an HDD (hard-disk drive) or go with an SSD (solid-state drive)? HDDs are typically less expensive than SSDs, especially when you get into the larger capacities, but SSDs can give you some additional boosts, such as faster operating system boot times and better overall performance. Not all systems support this upgrade, however.
Another relatively simple upgrade you can make is to your desktop's memory. If you haven't bought new computers for quite a few years, then chances are they could be skating by with as little as 2GB or 4GB of memory. Fortunately, the price of memory has dropped in recent years, which makes memory a more reasonable purchase for organizations to make. Even a quick NewEgg.com search returned results of 8GB of memory that cost as little as $40 or $50 and 16GB of memory that cost around $70 to $100. Keep in mind there are different types of memory, such as DDR3 and DDR4, so you need to double-check what type of memory your system's motherboard supports.
When considering upgrades to other components, keep in mind that your desktop's motherboard may only support certain processors, memory types and capacities, graphics cards, and inputs and outputs. In other words, you may run into situations where the entire motherboard needs to be replaced to support the upgrades you need.

When you start digging into the advanced components, that's when it's most important to decide which employees require computer upgrades and why. The average office worker who only uses email, Microsoft Office, and an Internet browser probably won't need the highest-performance desktop PC available, so a few simple storage or memory upgrades may do the trick. But for video editors, graphic designers, engineers, and others with more demanding computer requirements, you may need to institute a consistent upgrade path to make sure your desktops are capable of handling the latest and greatest software. In those situations, you may need to buy new motherboards every year or two to keep up, or you may just need to buy new PCs altogether (more on that later).
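A simple way to keep that discipline is to compare the running total of planned upgrades against the price of a new machine. The sketch below is a toy decision rule; the prices and the 60% threshold are hypothetical placeholders, not recommendations from the article.

```python
# Toy upgrade-vs-replace check: if the parts needed to fix a PC's
# bottlenecks approach the cost of a new machine, buy new instead.
# Prices and the 60% threshold are illustrative assumptions.

NEW_PC_COST = 800

def decide(upgrades: dict, threshold: float = 0.6) -> str:
    total = sum(upgrades.values())
    if total >= threshold * NEW_PC_COST:
        return f"Replace (upgrades ${total:.0f} vs. new ${NEW_PC_COST})"
    return f"Upgrade (${total:.0f} in parts)"

print(decide({"16GB RAM": 90, "500GB SSD": 120}))           # Upgrade ($210 in parts)
print(decide({"motherboard": 180, "CPU": 250, "RAM": 90}))  # Replace ($520 vs. $800)
```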

Cleaning, Repairing & Refurbishing

A desktop's tower is essentially a case that houses the computer's components. One nice aspect of this type of setup is that towers are easy to open for cleaning, repairing, and refurbishing purposes.

Dust is sometimes the No. 1 enemy of desktop computers because it can collect on individual components, cables, and connectors, generally gumming up the works and causing potential performance issues. For that reason, you'll want to open your desktop's case every so often and use a can of compressed air to clear away dirt and debris.

Apart from basic cleaning, there are also situations where you may need to open the case in search of causes to specific problems. For example, if applications are performing poorly, there could be an issue with your system's memory. Most memory is separated into modules (i.e., two 4GB modules to make up 8GB of total memory), so it's possible only one module is to blame. If you can easily identify the damaged module or determine the troublemaker by trial and error, then you can replace it and solve the problem quickly. On the other hand, if you are investigating what is causing your system to overheat and turn off automatically, you could be dealing with something as simple as a broken fan, or it could be something more complicated.

The important thing to keep in mind when repairing and/or refurbishing desktops is to know your limits. If you have an IT expert (or team) who is familiar with repairing and upgrading PCs, then you may be able to rely on him to troubleshoot issues and make the necessary fix. But if you don't have that onsite expert to rely on, it may be best to speak to the manufacturer about potential fixes for the desktop, send it to a professional refurbishing company, or have a third party come in to service the system. If you let someone with little experience tinker around inside the desktop case, then he may cause more problems than he solves.

Buy New When The Situation Calls For It

The key with upgrading desktops is to not go overboard. You may get so involved in the process that you end up replacing each individual component when all is said and done, which can quickly become expensive. Instead, you need to pick your battles and only upgrade components that make financial and practical sense. And if it gets to the point where your desktops are five or six years old and simple upgrades aren't doing the trick anymore, then it may be time to buy new desktops. Once you make the decision to buy a new system, a whole host of options will open up: You can buy prebuilt desktops from a manufacturer with lower or higher specifications, depending on the use case, or you can customize desktops to your exact needs.

If your company is interested in virtualization or moving more workloads to the cloud, then you may not need to have as much firepower in your office desktops. In fact, since much of the processing will be done in the cloud or on a virtualized server, you can essentially buy less expensive commodity desktop hardware and save money on new PC purchases. In this case, it would be worthwhile to investigate VDI (virtual desktop infrastructure) and thin-client computing options.

Ultimately, the path you choose depends on the needs of your workforce, so you can't force one solution on every employee company-wide. Identify specific use cases and then go from there to ensure successful upgrades or new desktop purchases.

Add A Guest Wi-Fi Hotspot

ESTABLISH A SEPARATE, PUBLIC WIRELESS NETWORK FOR GUEST ACCESS

FOR BUSINESSES, guest Wi-Fi hotspots are great for providing customers and visitors with an Internet connection without giving them access to the company's private network. Home users can even benefit from setting up a separate Wi-Fi hotspot to provide wireless access to their Internet connection when friends or family come over to visit. Although this article is geared toward small to midsize businesses, it includes information that is also applicable to larger businesses and consumers. We will describe what you need to get started, as well as what to expect along the way. Setting up a guest Wi-Fi hotspot is a relatively simple task, whether you want to use an existing router or consider a new model that includes features that make it more appropriate for sharing a Wi-Fi connection.


Internet Connection

A strong, stable Internet connection is the foundation on which you will set up your guest Wi-Fi hotspot. Having a good connection that works properly will stave off potential questions and complaints, particularly if you go out of your way to advertise your free wireless network to customers or visitors. Pin down the approximate number of simultaneous connections you expect the hotspot to support at any given time and use that figure to determine (1) what type of Internet connection you need and (2) what router capabilities are necessary to handle the expected amount of traffic.

Not all Internet services are created equal. For example, the most basic Internet services available from ISPs (Internet service providers), either broadband or DSL (digital subscriber line), offer speeds that tend to start at between 5Mbps (megabits per second) and 10Mbps. In some cases, the reported speeds are just the peak; actual average speeds can be much lower. In the context of the modern Internet, loaded with streaming media and multi-tab browsing, these speeds are on the lower end of being acceptable. To give you an idea just how slow they are, a 25MB video downloaded on a 5Mbps connection could take as long as 40 seconds. Using a 50Mbps connection, that video will arrive in as little as 4 seconds.
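The arithmetic behind those figures is straightforward: divide the file size in megabits by the link speed. A quick sketch using the article's own numbers:

```python
# Download time = file size in megabits / link speed in megabits per second.
# 25MB is roughly 200 megabits (1 byte = 8 bits).

def download_seconds(size_mb: float, link_mbps: float) -> float:
    return size_mb * 8 / link_mbps

for mbps in (5, 50):
    print(f"25MB file at {mbps}Mbps: {download_seconds(25, mbps):.0f} seconds")
# -> 40 seconds at 5Mbps, 4 seconds at 50Mbps
```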
On the high end, broadband speeds can reach data transfer speeds as high as 150Mbps or more, which is more than enough to handle five or more simultaneous connections. Most wireless routers will limit the number of connected users that are permitted to use the guest network simultaneously.

If you anticipate that your guest Wi-Fi hotspot will attract a large number of users, you may consider upgrading your service to fiber optic or designating multiple dedicated broadband connections. Note that fiber optic Internet service may have limited availability in your area.

Wireless Router

If you plan to use one wireless router to support two separate networks, look for a business-class router that supports a maximum data transfer speed of at least 300Mbps; some models will support much faster rates. The most recent Wi-Fi standard available as we went to press is 802.11ac, which can offer real-world data transfer rates of 600Mbps and better when communicating with devices using 802.11ac adapters.

A dual-band router is a good option because it operates on both the 2.4GHz frequency (which most current and older devices support) and the 5GHz frequency (which 802.11n/ac devices support, and which offers the fastest data transfer speeds). Most dual-band routers offer the choice of broadcasting in 2.4GHz, 5GHz, or both simultaneously to support the widest range of devices and prevent signal interference.

Many consumer- and business-oriented routers let you easily create two separate networks: one you can use for your company's internal network, and one you can use to provide wireless Internet access for guests with Wi-Fi-enabled devices. Wireless routers and access points can range in price from $20 to as much as $300 or more depending on their speed and feature sets, so make sure you only pay for functionality that you and your customers or visitors are likely to use.

Software

All wireless routers come with a software or firmware-based user interface that enables you to change the router's settings. Some software might be included on a disc or accessible only via a Web browser and the router's default IP address. When attempting to access the settings menu, always follow the instructions specific to your device.

Most often, you can launch a Web browser on the computer connected to the router, type the IP address for your router into the address bar (a commonly used IP address is 192.168.1.1, but this can vary by device), log in, and then manage the router's settings as desired. If the router your organization uses doesn't support this feature, you may be able to use a third-party firmware to set up standard and guest networks and tap into numerous additional features. CoovaAP (www.coova.org) and DD-WRT (www.dd-wrt.com) are two examples of free firmware that you can install on select routers.
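If the documentation is missing, you can quickly check which default address your router answers on before resorting to third-party firmware. Here is a minimal sketch using only the Python standard library; the address list covers a few common factory defaults and is not exhaustive.

```python
# Probe common default gateway addresses for a router admin page.
# Addresses here are typical factory defaults; yours may differ.
import urllib.request
import urllib.error

COMMON_GATEWAYS = ["192.168.1.1", "192.168.0.1", "10.0.0.1"]

for ip in COMMON_GATEWAYS:
    try:
        # Routers usually answer (or demand a login) on HTTP port 80.
        urllib.request.urlopen(f"http://{ip}", timeout=2)
        print(f"{ip}: admin page reachable")
    except urllib.error.HTTPError as err:
        # A 401/403 still proves something is listening there.
        print(f"{ip}: responded with HTTP {err.code}")
    except (urllib.error.URLError, OSError):
        print(f"{ip}: no response")
```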

Setup

Once you have all of the hardware and software you need for a Wi-Fi hotspot, it's time to get it up and running. Most mainstream routers sold in the U.S. have software that is intuitive to use and makes it simple to add a guest hotspot. With some routers, it's as easy as clicking Yes during setup to enable and allow guest access, but with other routers you may need to follow more steps or launch a special setup wizard. Because there are variations in this process depending on the device and manufacturer, check the manual for instructions specific to your router.

In the process of using the router's software to establish a guest Wi-Fi hotspot, you will discover relevant settings that provide you with further control over the hotspot. Some devices allow you to set specific days of the week and ranges of time during which the guest network can be accessed. Use these settings to make sure the hotspot is available when you're open for business and not available when you're closed; this prevents unwanted, unauthorized access.

Security

Perhaps the most important thing to think about when setting up a public Wi-Fi hotspot is to make sure that your guest hotspot is separate from your company network. Most routers support WEP (Wired Equivalent Privacy), WPA (Wi-Fi Protected Access), and WPA2 technologies, which provide for encryption and password protection. Use one of these settings as a minimum safeguard from potential unauthorized access or abuse; we recommend using WPA2 as it provides the best security.

From a guest's perspective, Wi-Fi security means that they will have to select the SSID (service set identifier, or network name) if it's visible, or type it in when connecting if the SSID is hidden, as well as enter a password in order to log on to the network. To simplify matters, make sure your guests know how to obtain a password, and change the password on a regular basis. Requiring guest users to accept a ToS (Terms of Service) agreement can also be beneficial. You can do this by employing a captive portal, a common feature of business-class routers, which is essentially a splash page users will see on their device screen when logging on to the network.

Other Considerations

Using router settings, you can impose bandwidth controls to protect your other networks from experiencing a bottleneck. You can also set guest connection time limits, designate which websites or applications are permitted on the hotspot, and even charge fees for using the hotspot. Once you have installed any necessary hardware (routers, range extenders, and access points), adjusted the settings, and turned on your guest Wi-Fi network, it's ready to be discovered and used.


The Road Ahead For Wi-Fi

PRESENT & FUTURE TECHNOLOGIES ARE ABOUT MUCH MORE THAN SPEED

OVER THE PAST 15 years, Wi-Fi technology has changed dramatically. Early Wi-Fi specifications 802.11a (using the 5GHz frequency) and 802.11b (2.4GHz) supported maximum speeds of 54Mbps and 11Mbps, respectively, and had limited range. Current and upcoming standards, however, are pushing Wi-Fi well beyond those limits. The now-popular 802.11ac, for example, uses the 5GHz frequency and offers speeds up to 1.3Gbps. Although real-life data transfers are much slower than these theoretical maximum speeds, the leaps between 802.11b, 802.11n, and 802.11ac are quite noticeable, and near-future specifications stand to leave existing Wi-Fi standards in the dust.

Alongside the changes in speed and range, the Wi-Fi Alliance continues to add features and functionality to Wi-Fi networks via its Wi-Fi CERTIFIED programs. A Wi-Fi CERTIFIED logo indicates the product has undergone rigorous testing and meets specific performance requirements. The Wi-Fi CERTIFIED concept, however, goes beyond speed, range, and reliability, as this article explains.

Wi-Fi Direct

Wi-Fi CERTIFIED Wi-Fi Direct is a P2P (peer-to-peer) networking technology, meaning that it doesn't require a router or wireless hotspot to connect devices. With Wi-Fi Direct, you are essentially creating a dedicated Wi-Fi bridge between two devices that's protected by WPA2 security, so you can safely send information back and forth. To establish a Wi-Fi Direct connection, users can tap two NFC-capable devices together, such as a pair of smartphones. You can also connect mobile computing devices, such as smartphones and tablets, to compatible peripherals, such as printers and displays.

Business users in particular might appreciate the benefits of using Wi-Fi Direct, because they can share company-related information between their devices and avoid prying eyes from seeing or intercepting data over a public hotspot or potentially insecure network connection.

Miracast

Building on Wi-Fi Direct (and in fact using that exact P2P technology as its foundation) is Miracast. The Wi-Fi Alliance created Miracast as a certification program specifically for streaming images and video between devices (such as between two laptops, or from a tablet to a large-screen display). Miracast is open to any vendor, and there are scores of vendors and thousands of products that are Miracast-certified. Smartphones, tablets, ereaders, set-top boxes, laptops, projectors, cameras, gaming devices, and displays are among the types of products that manufacturers can design with Miracast certification in mind.

There are clearly many entertainment uses for Miracast. In the business world, Miracast enables you to share your device display directly with co-workers in a meeting room without having to worry about network access, quality, or speed. As another example, you can connect Miracast-certified laptops and projectors to share presentations and other information quickly and easily.

WMM Programs

Another way the Wi-Fi Alliance improves the quality of and control over streaming media is with its WMM (Wi-Fi Multimedia) programs. WMM is essentially a quality of service function that's built into Wi-Fi networks to help administrators monitor and prioritize certain types of traffic. For instance, voice and video require a steady connection with low latency, whereas email or other less demanding applications don't need quite as much bandwidth.

Two other aspects of WMM are WMM-Power Save and WMM-Admission Control. WMM-Power Save serves two functions: it improves the battery life of a mobile device and makes voice calls over Wi-Fi more reliable. WMM-Admission Control focuses on improving the quality of real-time data. It provides bandwidth management tools, building on the basic foundation of WMM, and adds advanced features and controls to categorize and prioritize network traffic in a way that makes sense for the business or consumer.
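Conceptually, WMM's traffic classes behave like a priority queue. The sketch below is only an analogy under that assumption: the four access categories (voice, video, best effort, background) are WMM's real traffic classes, but actual prioritization happens in the radio's contention parameters, not in application code.

```python
# Toy model of WMM-style prioritization: higher-priority access
# categories are drained first. Real WMM works at the radio layer
# with per-category contention parameters; this is only an analogy.
import heapq

# WMM access categories, highest priority first.
PRIORITY = {"voice": 0, "video": 1, "best_effort": 2, "background": 3}

queue = []
for seq, (ac, frame) in enumerate([
    ("background", "file sync chunk"),
    ("voice", "VoIP packet"),
    ("best_effort", "email fetch"),
    ("video", "conference frame"),
]):
    heapq.heappush(queue, (PRIORITY[ac], seq, frame))

while queue:
    _, _, frame = heapq.heappop(queue)
    print("transmit:", frame)
# -> VoIP packet, conference frame, email fetch, file sync chunk
```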

Voice Programs

The Wi-Fi Alliance's Wi-Fi CERTIFIED Voice Programs also take advantage of the WMM programs but focus primarily on voice quality, power consumption, and security for both enterprise and personal voice applications. For the Wi-Fi CERTIFIED Voice-Personal program, voice applications are tested in the same types of environments that you might find in your home or in a small office. To receive certification under this program, devices and access points must be tested and demonstrate that they operate above specified thresholds for jitter, latency, and packet loss. Only hardware that is Voice-Personal certified can use the related enhanced applications.

Augmenting the Voice-Personal program on the enterprise side is Wi-Fi CERTIFIED Voice-Enterprise certification. This works in conjunction with the WMM, WMM-Power Save, and WMM-Admission Control programs to not only ensure the best possible quality of service, but also beef up security and add management options.

Passpoint

A Wi-Fi hotspot is a good way to get a solid network connection when you are in a public place, such as a restaurant, coffee shop, hotel, airport, or even a retail store. However, it's sometimes difficult to find the correct hotspot, let alone make a connection and know that it is secure. Wi-Fi CERTIFIED Passpoint serves as a solution to that problem by offering hotspot operators a way to provide a visitor with a secure and seamless connection.

The idea behind Passpoint is that a mobile user signs up online for a given operator's network and uses that account and its built-in credentials to automatically connect to a WPA2-protected wireless hotspot without having to search for it and log in every time. This technology also enables the idea of Wi-Fi roaming, where users can take their smartphone, tablet, or other device with them and automatically connect to networks as they move along.

Passpoint is particularly important for retail stores to take advantage of because it can encourage loyalty and completed purchases. In fact, a study conducted by Wakefield Research on behalf of Wi-Fi Alliance revealed that 69% of consumers have used a mobile device while shopping, and if a service like Passpoint were available, 28% would stay longer, 29% would return more often, and 41% would research purchases.

WiGig CERTIFIED

Wi-Fi speeds have continued to increase quite a bit over the past few years, and there are no signs of that stopping any time soon. In fact, the Wi-Fi Alliance anticipates there will be WiGig CERTIFIED products capable of supporting multi-gigabit network speeds (potentially as high as 7Gbps or more) in 2016. The Wi-Fi Alliance and WiGig Alliance have been working together on this project to make sure the WiGig CERTIFIED products and technologies are not only fast, but also stand up to the same security and performance standards as traditional Wi-Fi CERTIFIED solutions.

Wi-Fi Speeds
How data transfer speeds have changed with the release of new technologies.

Standard              | Max Data Transmission      | Frequency
802.11a               | 54Mbps                     | 5GHz
802.11b               | 11Mbps                     | 2.4GHz
802.11g               | 54Mbps                     | 2.4GHz
802.11n               | 450Mbps                    | 2.4GHz/5GHz
802.11ac              | 1.3Gbps                    | 5GHz
WiGig (2016 release)  | Potentially 7Gbps or more  | 60GHz


The Case For Network Virtualization

HOW ENTERPRISES CAN BENEFIT

FOR THOSE WHO lack an extensive background in networking, NV (network virtualization) can be a difficult concept to grasp. There is also considerable confusion about its connection to SDN (software-defined networking) and NFV (network functions virtualization). Still, there's good reason why executives should familiarize themselves with NV. Although enterprises' interest in NV has stemmed primarily from a technical perspective, that's changing as vendors are positioning themselves to make business cases for adopting their NV offerings.

Mark Tauschek, Info-Tech Research Group associate vice president, infrastructure research practice, says in the near future, as enterprises ready themselves to refresh their network infrastructures, any vendor they work with will have NV on the table. "Anything you're going to purchase in your next refresh cycle is going to support NV, and likely SDN, more broadly," he says. "You definitely need to understand the benefits. I think it will be a good fit for pretty much any organization."

A Complicated Existence

Deciphering where NV's relationship with NFV and SDN starts and stops can be perplexing. NV is commonly described as enabling an enterprise to slice a physical network into several virtual ones that reside on the same infrastructure but remain isolated. This is similar to how server virtualization permits creating multiple virtual server instances on one physical server. Matthew Ball, Canalys principal analyst, describes NV as enabling the creation of these dedicated, separate networks for different uses, applications, departments, and more. For example, an enterprise could create one virtual network for testing and development purposes and others to meet specific QoS (quality of service), performance, and security requirements for different business groups.
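One way to picture that slicing is as tagged, isolated segments carved out of one shared fabric, much as VLAN or VXLAN overlays work. The toy model below is purely conceptual; the class and method names are invented for illustration and are not any vendor's API.

```python
# Toy model of network slicing: one physical fabric, several isolated
# virtual networks identified by a tag (as VLAN/VXLAN overlays do).
# Class names and behavior are illustrative only.
from dataclasses import dataclass, field

@dataclass
class PhysicalFabric:
    slices: dict = field(default_factory=dict)

    def create_slice(self, tag: int, name: str) -> None:
        self.slices[tag] = {"name": name, "hosts": []}

    def attach(self, tag: int, host: str) -> None:
        self.slices[tag]["hosts"].append(host)

    def can_talk(self, a: str, b: str) -> bool:
        # Traffic stays inside a slice: two hosts communicate only if
        # some virtual network contains both of them.
        return any(a in s["hosts"] and b in s["hosts"]
                   for s in self.slices.values())

fabric = PhysicalFabric()
fabric.create_slice(100, "test-and-dev")
fabric.create_slice(200, "finance")
fabric.attach(100, "dev-vm1"); fabric.attach(100, "dev-vm2")
fabric.attach(200, "fin-db")

print(fabric.can_talk("dev-vm1", "dev-vm2"))  # True: same slice
print(fabric.can_talk("dev-vm1", "fin-db"))   # False: isolated slices
```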

Tauschek considers NV the "last man standing" in terms of data center infrastructure yet to be widely virtualized. As he sees it, NV allows for taking a commodity piece of hardware or component of hardware and easily creating separate, logical connections on it. For example, an enterprise could provision a segment of a port on the network to a particular resource as opposed to dedicating a full port to a physical server or physical NIC (network interface card).

While NFV often gets mixed in with NV discussions, Ball says NFV is just the virtualization of network-based services delivered on x86 servers in data centers. Andre Kindness, Forrester Research principal analyst, says when he speaks with business professionals, he usually describes NFV as the software version of network hardware, similar to what a word processing program is to a typewriter. Enterprises can buy switches, routers, load balancers, WAN optimization controllers, and firewalls all in software form. "Basically, any network hardware has a software counterpart," he says.

SDN, meanwhile, is considered to have more to do with delivering automation and programmability in complex network environments by decoupling software (the control plane) from hardware (the data forwarding plane). Tauschek considers NV a subset of SDN. "NV isn't really about running applications on your network infrastructure," he says. "SDN is also about essentially providing a single point of control and policy control for the network infrastructure. Obviously that makes use of NV, but NV is really a subset of SDN."

Benefits & Problems

Testing is often cited as a benefit of NV in the sense that enterprises using NV can fairly easily create testing environments not dependent on the physical world. Kindness says this is similar to CAD or solid modeling seen in manufacturing or design sectors. Overall, NV and NFV offer numerous benefits, Kindness says, including a low-cost alternative to purchasing hardware, with less associated setup time required. IT, for example, can download software and begin using it rather than order the physical network infrastructure and wait for it to arrive before configuring the components.

NV and NFV can also be well-suited to small environments. Physical network components, for example, can be larger than what a smaller organization needs. Specifically, most switches are 24- or 48-port devices, but a small site may only need four ports, Kindness explains.

NV's ability to simplify architecture is another positive. Networks have generally become ugly messes due to the need to attach firewalls and other services in certain areas, Kindness says. Add in redundancy and networks can transform into complex, costly infrastructures. With NV, software isn't physically constrained, which can simplify architectures. "Network virtualization allows services to exist right next to applications and data in the virtual world instead of appliances dangling off the network," Kindness says.

In his column, Lee Doyle, principal analyst with Doyle Research, writes that implementing NV can introduce scalability and multitenancy benefits. Specifically, each application, or tenancy, can have its own network and security policy thanks to NV's ability to isolate network traffic. Technically, Doyle notes that NV offerings are available from numerous suppliers, each putting its own spin on NV with associated strengths, weaknesses, protocols, and pricing options. NV is also available in various open-source options.

There are a number of people who believe some performance abilities and security controls with NV don't equal those of hardware. Here, Kindness says organizations should have solid processes and controls in place before using NV. Because NV is nebulous, he says, more due diligence around procuring, deployment, and management is required.

As with server or storage virtualization, monitoring can be an issue with NV in terms of usage and running up against capacity, which will impact performance. "If you go this direction, you have to monitor your infrastructure more closely," Tauschek says.

Where security and NV relate, arguments go both ways, says Tauschek, who tends to believe NV is better for security, as most problems we have with security are human error. Often, for example, security problems result from someone misconfiguring something. Standardizing policies across different groups of applications, servers, and infrastructure is easier to manage with NV than with individual switches, where if someone makes a mistake with, say, an ACL (access control list) on one switch, "you just open the door to problems," he says.

The Market

Implementation- and technology-wise, NV is still in its early stages. Doyle writes, however, that because using NV software as an overlay to existing network infrastructure offers a fairly easy option to alleviate virtual machine networking challenges, he expects NV adoption to strongly increase throughout 2015 and 2016. Doyle explains that adoption will depend on pricing, standards, the open-source impact on the market, and other variables. He projects global spending on the SDN software market, of which he counts NV a subset, to reach about $1.2 billion by 2018.

Tauschek says Info-Tech is seeing more client interest in NV and SDN, though the technologies are emerging within enterprises primarily in test beds, lab environments, or safe segments of networks. That said, executives should be learning more about NV now.

"Anything you buy now is going to be proprietary SDN/NV-ready, or it's going to be OpenFlow-ready [an open networking standard for configuring switches], or both," Tauschek says. "What's really driving interest now is vendors." Here, few independent NV solution startups remain, as large network vendors have all bought into the technology with acquisitions, he says. Now, a lot of the vendors are saying to their customers, "Look at what we can do. Isn't this cool? You should buy it."


Keep The Business Running

GAIN A DEEPER UNDERSTANDING OF BUSINESS CONTINUITY

KEY POINTS
• BCDR is just as much about protecting your business from small failures as it is about preparing for major disasters.
• Performing a business impact analysis and risk assessment will help you prioritize systems and applications to meet recovery time objectives.
• There are different BC solutions to choose from, including internal software and cloud-based alternatives.
• SMBs may require more help and expertise from providers than large enterprises, but the base needs are similar.


IT'S SAFE TO SAY that BC (business continuity) is a major concern for every organization in existence. You work so hard to build up your company and make it a success, so of course you'll do anything in your power to keep it operating at a high level. Unfortunately, no matter how well-prepared you are and what precautions you have in place, there will undoubtedly be instances where you need to recover from an outage or a full-blown disaster as quickly as possible. It's in those situations where your BC plan and incident response will be put to the test. How well you respond and rebound ultimately depends on how you have prepared and what solutions or services you have in place, as well as the resiliency of your infrastructure and employees.

What BC Actually Entails

Most people understand the basic concept of BC as keeping the business up and running, but there's much more to it than that. "Business continuity is the ability for an organization to respond, recover, and restore business operations in a timely fashion," says Roberta J. Witty, research vice president at Gartner. It's not just about preventing your company's infrastructure from going down, because that will happen from time to time. It's more about having a plan in place so you know how to respond to an outage or disaster once it happens and how long it takes you to recover.

"Different parts of the organization will have different time frames of needs," says Witty. "If you look at banking or capital markets trading where they could lose billions of dollars in 15 minutes, they need to be real-time for the most part. Their operations can't go down, necessarily, whereas, accounts payable in an organization, you could usually wait a week or even up to a month to restore that function without a huge negative impact to the business."

Depending on the business operation and various aspects of impact, you put your recovery plans and solutions in place to recover from business disruption.

"It's always a matter of zeroes. A smaller company isn't as impacted as a large company is as far as how much money they'll lose. But the more important point is that there are absolutely technologies available for small and midsize organizations that are right-sized. They have less complexity and less cost because they are not quite as quick or as expansive, and that's OK. There really are right-sized technologies for small and midsize organizations, because they are just as dependent on their IT services as large enterprises are."

JASON BUFFINGTON
Senior Analyst
Enterprise Strategy Group

The ideas of planning and recovery, however, don't paint the full picture. BC is a huge umbrella term that's composed of several components. Phil Goodwin, research director at IDC, says that backup and recovery, data protection, high availability, and DR all play a part in BC. "The term business continuity means that the IT organization can continue to provide services to the business regardless of what kind of interruption might happen," says Goodwin. "All of those other techniques I mentioned are designed to address specific failures, but in totality, they should allow the organization to continue to function and operate almost regardless of circumstance."

In addition to the more technical side of BC, you also need to keep the nontechnical aspects in mind. "Not only do you need to make sure your IT is durable and resilient, but also that you establish the process, people, and culture to ensure that the business is ready to take advantage of whatever those IT processes are," says Jason Buffington, senior analyst at Enterprise Strategy Group. After all, BC isn't a fully automated process, so you need to make sure you have the right people in place to manage it and see any potential recovery scenarios through to completion.

Why BC Is So Important

Buffington says that an overly simplistic answer to why BC is important would be to say that it's like choosing not to have insurance. He says most responsible people understand that while they would rather not pay for insurance, it is better to have it and not need it than to need it and not have it. But Buffington also points out that BCDR (business continuity and disaster recovery) isn't always about site-wide or regional calamities. It's sometimes something as small as a single server or a network access point going down that can cause issues. "You don't build BCDR because you're worried about a flood or a hurricane," he says. "You build it because servers go down and wires break. A responsible approach for BCDR will let you accommodate crises of all sizes."

That last part is particularly important because being able to handle a crisis of any size requires a larger scope in terms of how you look at BC. For example, if your company suffers from a natural disaster or major power outage, you may not be the only facility suffering. "Resources may not be available, especially if you're not the only one impacted," says Witty. "There may be other organizations that have gotten to different providers before you have and therefore the recovery resources aren't available for you, which only means you're going to delay your ability to recover even further. It's about availability of resources and the investment that you make post-event vs. pre-event from a planning perspective."

Planning is a key part of BC. The fact that many businesses don't plan enough is probably the No. 1 reason why it is so important to focus on BC. Goodwin estimates that 50% of organizations could not survive if they experienced a disaster disruption. Depending on how real-time your products or services are, being down for an extended period of time could mean you not only lose revenue, but you also could lose everything.

"In many cases, it literally could mean the survival of the organization," says Goodwin. "If you can't bring the systems back within a reasonable period of time, then you simply cease to be able to conduct business, and that is probably going to be the biggest motivator. I would say that most users are really looking for less than true business continuity, but it is that planning that is really critical to getting the infrastructure right-sized based on the business requirement."

Developing A BC Plan

Crafting a strong BC and DR plan is clearly important, but what actually needs to be included in a plan? For starters, it should outline the specific actions you're going to take when that disruption occurs, Witty says. What some organizations may not realize is that each may require its own separate plan, such as a damage assessment plan, crisis communication plan, IT service recovery plan, business recovery plan, or supplier recovery plan. "You need to know what kind of plan you're talking about, then the next thing is, in the document itself, you need to have some of the basics, like who owns this document?" Witty says. "What is the recovery point objective that you're dealing with in regards to the scope of that plan?" You need to identify the location of the recovery site, and there needs to be a short summary of how you are recovering. You also need to have all necessary contacts detailed in the plan and have a firm grasp on what employees make up your recovery team, what their roles are, and how they will keep employees, customers, and, in some cases, even the press updated on the recovery progress.

Once you have a firm grasp on who's going to be in charge of the operation and what actions to take, you need to look at the infrastructure. Buffington says many companies perform a BIA (business impact analysis), where they look at their IT platforms and determine what the cost would be if they went down for a day, a week, or longer. "What systems would be a problem if they went down?" Buffington asks. "Rank them. Even if it's an unofficial rounded guess, put a dollar value on that. There are 100 people and the average knowledge worker in America makes $70,000 a year, divide that by X number of hours. Just do some basic math and say what would happen if these top 10 systems went down. What you'll very quickly find is that those numbers add up much faster than you think."
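Buffington's back-of-the-envelope math is easy to put into a few lines. The sketch below uses his headcount and salary figures; the 2,000 working hours per year is an assumption added for the calculation.

```python
# Back-of-the-envelope BIA: what does an hour of downtime cost in
# idle labor alone? Headcount and salary from Buffington's example;
# 2,000 working hours/year is an assumed round number.

headcount = 100
salary = 70_000
hours_per_year = 2_000

hourly_labor_cost = headcount * salary / hours_per_year
print(f"Idle labor: ${hourly_labor_cost:,.0f} per hour of downtime")
# -> $3,500 per hour; an 8-hour outage burns $28,000 before lost
#    revenue, SLA penalties, or recovery labor are even counted.
```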
Once you're done with the BIA, the next step is risk analysis. Buffington uses the example of a house on a 500-year floodplain. While there's a chance a flood could hit the house, it's very slim, so flood insurance would be relatively inexpensive. However, in a location such as New Orleans, for example, "you can't buy flood insurance, because [flooding is] so likely that no one is willing to invest in that," Buffington says. The key is balancing the two extremes and figuring out which systems fall into those categories.

"If you run those two pieces of math, what you come up with is effectively the cost of your problem," says Buffington. "Figure out what the likelihood is and the business impact of your top 10 systems failing. If you could spend a dime to save a dollar and the odds are you're going to get bit by something every 10 years or less, that's good money. If you had to pay a quarter to save a dollar, you may not be as interested. And if you had to pay a penny to save a dollar, you probably wouldn't think it's worth it. Take a zero off of what you think the cost of the problem is and that should be your starting budget."

"Any data protection or business continuity solution has got to start with what service levels are required. If your business is truly a 24/7/365 organization, then that's going to determine a lot about what kind of infrastructure you have to put into place to maintain it. Other organizations may be 16 hours a day, five days a week. Others may be eight hours a day, seven days a week. It depends on what those service levels are. Business continuity for one of those organizations that, say, is eight hours a day, seven days a week, would really mean being able to keep the system up for those working hours and not necessarily 24 hours, seven days a week. There's a big difference in the money that would have to be spent to maintain those different things."

PHIL GOODWIN
Research Director
IDC
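The same goes for the dime-to-save-a-dollar test and the take-a-zero-off budget rule. A minimal sketch, with a hypothetical outage cost and likelihood:

```python
# Buffington's heuristics in code form. Inputs are illustrative.

problem_cost = 280_000   # hypothetical: cost of a multi-day outage
odds_per_year = 1 / 10   # hypothetical: one such event per decade

expected_annual_loss = problem_cost * odds_per_year
starting_budget = problem_cost / 10   # "take a zero off"

print(f"Expected annual loss: ${expected_annual_loss:,.0f}")
print(f"Starting BCDR budget: ${starting_budget:,.0f}")
# Spending ~$28,000/yr against an expected $28,000/yr loss sits in the
# "dime to save a dollar" zone; pay much more per dollar saved and the
# protection stops making financial sense.
```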
Then, once you've categorized your IT platforms, you'll need to dig a little bit deeper and prioritize specific applications and equipment. Goodwin recommends using the tiered method of having Tier 1, Tier 2, and maybe even Tier 3 applications, where Tier 1 is mission-critical, Tier 2 is business-critical, and Tier 3 is operational applications that, if they're not brought up in five or seven days, you can still survive. This is the part of the process where RTOs (recovery time objectives) come into play, because you have to decide which systems you need right away and which ones you can live without for an extended period of time. Prioritizing in this way is crucial because, in the event of a disaster, it allows you to get the most important systems up and running first, so you can continue to operate, and then slowly bring other systems online as you move through the recovery process.
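In practice, that tiering can be captured in something as simple as a list sorted by tier and RTO to produce a recovery order. A minimal sketch; the application names and RTO values are made up for illustration.

```python
# Sort applications into a recovery runbook order: lowest tier first,
# and within a tier, tightest RTO first. Sample data is hypothetical.

apps = [
    {"name": "order processing", "tier": 1, "rto_hours": 1},
    {"name": "reporting",        "tier": 3, "rto_hours": 120},
    {"name": "email",            "tier": 2, "rto_hours": 8},
    {"name": "customer portal",  "tier": 1, "rto_hours": 2},
]

for app in sorted(apps, key=lambda a: (a["tier"], a["rto_hours"])):
    print(f"Tier {app['tier']}: {app['name']} (RTO {app['rto_hours']}h)")
```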

BC Solutions & Services To Consider

In addition to a strong BC and DR plan, you need to have a series of software solutions or services in place to help you get through an outage when it occurs. Witty says there are three common types of software or systems that need to be in place to ensure proper BCDR. One is an emergency mass notification service or automated call tree. Witty says many organizations outsource this service to a third party, because it should be considered a mission-critical application. "When something bad happens, you need to have access to it right away," she says. "If that something bad that's happened is your own data center and you haven't made provisions for recovering the notification system, then what's the point of even having it?"

"Your security incident response teams really need to work directly with IT disaster recovery, business recovery, and crisis management so that there is a whole of organization response to what's taking place. That gets really nuanced because if you have an attack that's being perpetrated by an employee, there are certain things you don't say. Do you bring down your services and alert the attacker that you know about them? Those are all questions that need to be discussed, because it may not be in your best interest to do a traditional recovery."

ROBERTA J. WITTY
Research Vice President
Gartner

You'll also want BC management planning software. This is the tool you'll use to perform a business impact analysis and risk assessment as well as to put your recovery plan in place. "The business continuity management planning software tools help you automate all of that," says Witty. "Having all of your recovery plans in a single repository is actually the gold mine of business operations. I find that more progressive business continuity management programs are actually integrated into strategic planning for the company and strategic IT planning. They're leveraging this data that they've amassed about how the business works into much more proactive and strategic activities for the organization."

Crisis or incident management software is another important tool to have. This is the platform from which you'll manage the activities around recovering from a business disruption. "You get a view of what's actually taking place," says Witty. "A plan is a good starting point, but every event has its own uniqueness to it, so you need to manage that situational awareness against your plan. There may be actions that you're not going to take or actions you didn't know you'd have to take but now need to in the face of the event. Those crisis management tools become critical in helping you execute your plans, keep track of activities, and keep track of expenditures, contacting people, and all of that."
If you feel like you can't handle every aspect of BCDR in-house, then you can always consider engaging with a third-party or cloud provider to help with certain components. Specifically with disaster recovery, you may want to offload backups or even some infrastructure to a third-party DR facility so you don't have to own and operate a second data center just for DR. And there's also a newer technology called DRaaS (disaster recovery as a service) that promises to take some of the burden of recovery off of the organization and the IT team. It's especially important for SMBs (small and midsize businesses), because some of these companies can't afford their own offsite facilities. With DRaaS, they get access to cloud-based DR and a faster way to recover in case of an outage.

"I'm extremely excited to see where DRaaS technologies are going because it encapsulates the two things that midsize organizations need," says Buffington. "It encapsulates the second site, which is arguably as, if not more, reliable than the primary site, and it encapsulates expertise. If the only thing you had was a cloud provider and you still had to figure out how to do it all, nothing would really change. But the good DRaaS providers not only provide the tech, but they also have the expertise to help you get where you want to go."

How Company Size & Industry Impact BC

You may be surprised to find out that, according to Goodwin and IDC research, small and medium businesses have exactly the same service level requirements as large-scale organizations. "It has more to do with the industry your company is in than strictly size, because financial services or some retailers have more stringent requirements than a CPA or law firm," Goodwin says. Especially if your organization is in a regulated industry, you'll need to think about what systems need more protections to ensure you're always in compliance.

But that doesn't mean size doesn't matter at all when it comes to BC, because it's certainly a matter of scale. Protecting everything in a larger organization will inherently cost more than protecting the same systems in an SMB, so even if the solutions and services are similar, it's important to make sure they're right-sized to a given company's needs. For example, a smaller organization may not have access to the same internal resources as a larger organization, so it may require a little extra help from a BCDR provider.

"In a large enterprise, you have Clark Kent somewhere in the IT team, and when something bad happens, he goes into a broom closet and comes out as Superman to save the day," says Buffington. "Midsize organizations don't have Clark Kent. They're lucky if they have Jimmy Olsen. They don't have somebody who can don a cape and solve the problem, so it's extremely important that midsize organizations partner with a provider that has that expertise as well as the technical capabilities of actually doing delivery. That's probably even more important than the tech itself."


Disaster Recovery As A Service

TAKE ADVANTAGE OF THE CLOUD TO PROTECT YOUR BUSINESS

KEY POINTS
• DRaaS (disaster recovery as a service) can be a less expensive alternative to traditional DR, but it depends on how many workloads you back up.
• Many traditional DR providers are moving customers to DRaaS, but some newcomers offer different levels of service.
• Customers have the opportunity to save money by paying monthly and not purchasing their own DR-specific hardware.
• DRaaS is currently best for smaller businesses, but a near-future market shift could open it up to large enterprises, as well.


IF YOUR BUSINESS has embraced the cloud and understands the benefits of offloading certain workloads to a third-party provider, then you're probably looking for the next frontier in the cloud. Although backing up data and applications to the cloud is nothing new, the idea of doing most (if not all) of your disaster recovery in the cloud is a novel concept many smaller companies are exploring, and it could have potential with large enterprises in the future. DRaaS (disaster recovery as a service) promises to make DR easier by removing some of the hassle of traditional DR and giving companies access to their critical assets in the cloud rather than solely on third-party hosted hardware.

DRaaS vs. Traditional DR

The main difference between DRaaS and traditional DR is that DRaaS adds a cloud component to the equation and tends to involve virtual machines and virtualized environments. Instead of owning a separate data center or buying hardware to host in a third-party facility, you can essentially send your data and applications to a DRaaS provider, and then access them via the cloud in the event of a disaster. What that means, especially for SMBs (small and midsize businesses), is disaster recovery becomes attainable. No longer do SMBs have to eschew building a solid DR plan due to the high cost of buying all of the backup servers, software solutions, and facilities space necessary to protect data and get the business back up and running.

"Virtualized DRaaS uses vast and scalable infrastructure, and allows virtual access of assets, with little or no hardware and software expenditures," says Monolina Sen, senior analyst for digital security at ABI Research. In other words, DRaaS requires far fewer operational resources. "This results in significant savings in software licenses and hardware, allowing the organization to increase its budget in other areas of operation. Traditional DR services can prove to be expensive, thus scaring off small and medium-sized businesses, but DRaaS affords providers the opportunity to offer a less-expensive alternative for cost-conscious markets."

Another benefit of the cloud aspect of DRaaS is that it gives organizations the ability to tier applications and workloads as a way to prioritize them for recovery. This means mission-critical apps will take precedence over less critical ones, so you can get those workloads up and running first to lessen the impact of a disaster on the business. Plus, the cloud-based nature of DRaaS means that backup and recovery assets can be easily accessed and initiated in the case of a disaster, Sen says. And for that reason, "disaster recovery as a service is most beneficial to those organizations who need a recovery time objective [RTO] or a recovery point objective [RPO] that's measured in minutes, rather than hours or days," she says.
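The RPO side of that claim is easy to reason about numerically: worst-case data loss is roughly the gap between copies. A minimal sketch with illustrative intervals:

```python
# Worst-case RPO is roughly the replication interval: data written
# just after the last sync is lost if disaster strikes before the next.
# Figures below are illustrative, not from any provider.

scenarios = {
    "nightly tape backup":          24 * 60,  # minutes between copies
    "hourly cloud backup":          60,
    "continuous DRaaS replication": 5,
}

for name, interval_min in scenarios.items():
    print(f"{name}: worst-case RPO ~{interval_min} minutes of lost data")
```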

DRaaS On The Provider Side


From the provider point of view, Werner Zurcher, research vice president at Gartner, says DRaaS is not that much different from what regular DR service providers were doing before. He uses the examples of IBM and Sungard, currently the two biggest players in the DRaaS market, because they've been providing disaster recovery services in some shape or form for some time. Zurcher says in the old days customers would pay quarterly for services that provided hardware when they needed it, and then they would typically get two DR tests per year on that equipment. One major differentiator is that traditional DR customers had to physically ship backup tapes on a consistent basis to the providers.
Now, those same companies have shifted many of their customers to DRaaS and are using the cloud to remove some of the more costly and time-consuming aspects of traditional DR. For example, instead of shipping backup tapes regularly, you may do that once to set up a solid foundation, and then replicate or back up data and applications via the cloud from that point forward. Then, instead of paying quarterly, you pay monthly as you would for any other SaaS (software as a service) solution.

"Pretty much all the DRaaS customers these days are customers that don't have multiple data centers. If they have multiple data centers, they're still typically doing DR in one of those data centers. For those people that don't have a data center, it gives them the ability to, in effect, rent hardware only when they need it, as opposed to having to procure the hardware themselves."
WERNER ZURCHER
Research Vice President
Gartner
Things get more interesting when you compare what the established providers offer their customers with what some of the newer providers bring to the table. Zurcher says Gartner is currently aware of 240 providers in the market, and, he says, there are new ones every week. Zurcher adds that some of these vendors take a very simplistic approach and have you send in your backups as usual, migrate network addresses, and then start up your services if and when you need to. There are others, however, that offer more support throughout the process and even try to automate as much of the DR process as possible.
The differences in these providers can be traced back to their target markets. For example, providers such as Axcient and Datto are primarily focused on small company backup and recovery, according to Zurcher. He says these vendors typically provide the least amount of services and primarily focus on simple backup to the cloud and recovery to the cloud, with a portal where customers can monitor their data and start the recovery process, if necessary. Then there are the larger providers that offer more services and a level of automation that you may not get from smaller vendors.
"At the other extreme, you get the IBMs and Sungards that have the most services to operate, and in the case of Sungard, try to automate the DR as much as they can when it's done as DRaaS," says Zurcher. "A lot of the high-end DRaaS providers support replication of the data to cloud as opposed to just backups, which can get you running a lot faster than backups can. What you get in terms of services varies significantly between the different providers."
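Zurcher's point about replication is easy to see with rough arithmetic. The sketch below is ours, not Gartner's; the data size and restore rate are hypothetical and simply show why copying data back from backups dominates recovery time, while a continuously updated replica is ready almost immediately:

```python
# Rough arithmetic behind "replication gets you running faster than backups":
# a restore must copy data before services can start, while a replica in the
# cloud is already current. The numbers are hypothetical, for illustration.

data_tb = 2                  # assume 2TB of protected data
restore_rate_mb_s = 100      # assume a 100MB/s effective restore rate

restore_seconds = (data_tb * 1024 * 1024) / restore_rate_mb_s
print(f"Restore from backups: ~{restore_seconds / 3600:.1f} hours before services start")
print("Failover to a replica: minutes, since the data is already in place")
```

Under those assumptions the backup restore alone takes nearly six hours, which is exactly the gap a replication-based service is selling against.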

DRaaS On The Customer Side


Where the provider side of the equation mainly relies on a shift to cloud computing, the customer side can be quite different, which is why it hasn't exactly taken off with large enterprises yet. In fact, Zurcher says, 90% of the estimated 30,000 organizations that use DRaaS today are small businesses with fewer than 50 VMs (virtual machines), and as much as 80% of that 30,000 total is made up of small businesses with fewer than 25 VMs. The reason is that with so few workloads compared with larger organizations, it doesn't make as much sense for these businesses to continue shipping backup tapes to vendors.
"Instead of doing tape backups or just holding them in a closet, they are now shipping their data to the cloud and are able to recover their services in that cloud," says Zurcher. "It's a much better solution, and often when I talk to customers, it's for no more money than what they were paying to ship tapes to [traditional providers]." Most users of DRaaS are smaller organizations that have replaced tape shipping with sending their data to a place where they can actually recover their services.

"Businesses using DRaaS do not have to worry about backup servers, as their service provider will be supported by a data center, along with enterprise-grade bandwidth and computing power. Compared to more traditional methods of backup, DRaaS offers more flexibility. The various DRaaS services offer clients more options in how to handle different business systems. Any enterprise using DRaaS solutions can select from a variety of recovery scopes depending on the type of the disaster. DRaaS can support a diverse range of systems, from virtual machines to a variety of user endpoints, including smartphones and tablets; scalability is a big advantage of DRaaS."
MONOLINA SEN
Senior Analyst For Digital Security
ABI Research
Sen agrees that DRaaS is particularly beneficial for small businesses because it often gives them better capabilities than they could set up for themselves with a limited budget, including better security, flexibility, and recovery in general. And DRaaS is flexible enough to support not only onsite workloads, but also those hosted in private and hybrid cloud environments.
Still, even though DRaaS has some automation to it, that doesn't mean the customer role in the process is completely hands-off. "They're always going to be involved in actually initiating the disaster [recovery] and then, to an extent, cutting over," says Zurcher. "It doesn't just pop in the cloud automatically for you. The question is how manual vs. how automatic it is. It's usually fairly manual. You still need to have your backup administrators start up VMs in the cloud. It's very rare that it's completely automated."

Current Market & Future Consolidation
The interesting thing about DRaaS is that it's actually growing in the small business market and hasn't made a lot of headway with large enterprises as of yet. DRaaS primarily benefits small businesses with 25 to 50 VMs because it means they get solid recoverability without having to own multiple data centers or host hardware in a third-party facility. They get to rent hardware and only use it when they truly need it, which makes sense for smaller businesses but, on a cost basis, doesn't necessarily translate well for larger enterprises.
"Cloud DR isn't cheap," says Zurcher. "We did a survey and asked 11 of the top providers for their prices for 50-, 250-, and 500-VM configurations with various amounts of disk space. For 500 VMs, the average cost was $160,000 a month, so almost $2 million a year. The range of prices was from a little over $1 million to $4 million a year. Big companies aren't doing this yet because A) they have multiple data centers and B) it gets quite expensive. For small companies, it's still cheaper and they don't have to acquire additional hardware, so they like going that route."
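As a quick sanity check on those figures, the arithmetic works out as follows. The monthly average comes from Zurcher's quote above; the per-VM breakdown is our own division, not additional survey data:

```python
# Back-of-the-envelope check on the survey figures quoted above.

vms = 500
avg_monthly_cost = 160_000  # survey average for a 500-VM configuration

annual_cost = avg_monthly_cost * 12
per_vm_monthly = avg_monthly_cost / vms

print(f"Annual cost:         ${annual_cost:,}")       # $1,920,000 -- "almost $2 million a year"
print(f"Per-VM monthly cost: ${per_vm_monthly:,.0f}") # $320 per VM per month at the average
```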
However, in addition to IBM and Sungard, there are other big players taking steps to move into the DRaaS space and shake up the market. For example, Microsoft's acquisition of InMage in July 2014 shows it is taking DR and business continuity seriously, potentially for use with its Azure cloud offering. "InMage is a provider of converged disaster recovery and business continuity solutions, with a focus on continuous data protection," says Sen. "The company is known for its high-end Scout line of disaster recovery appliances. A week after closing the deal, Microsoft reported that its InMage Scout software appliances for Windows and Linux physical and virtual instances will be included with Azure Site Recovery subscription licenses."
Currently, Azure Site Recovery is typically only being used with Hyper-V, which is a system most larger companies aren't using, according to Zurcher, but it's certainly an early sign of things to come. Microsoft, Google, Amazon, and other companies are all working on some form of DRaaS that could be used by larger organizations at some point. And as those companies enter the market in a real way, Zurcher expects it to change significantly.

"Going forward, we're going to see the 200-plus providers that we track getting significantly consolidated," says Zurcher. "And there is significant price pressure that these new vendors, and in particular AWS [Amazon Web Services], are exerting on vendors. To date, the 240 different vendors haven't had to compete with Microsoft Azure, VMware, or Google, and as a result the prices have been kind of high. We think the price pressure is going to make it so that it becomes more affordable over the next two years for organizations, and maybe even larger organizations."

THE LATEST PREMIUM ELECTRONICS

DROID Turbo 2 Shatters Expectations With Shatterproof Screen


WWW.VERIZONWIRELESS.COM
A shatterproof screen has been a long time coming, as engineers have made great strides in developing bendable displays. But the unbreakable screen has finally arrived in the form of Motorola's Moto ShatterShield technology, featured for the first time on Verizon's DROID Turbo 2 by Motorola. While some phones have a specific high point or two (performance, durability, long battery life, design, great camera), the DROID Turbo 2 offers all of the above. Key specs include Qualcomm's 2GHz quad-core Snapdragon 810 processor (complete with a 600MHz graphics processor), 3GB of RAM, 32GB or 64GB of integrated storage with a microSD card slot that accommodates as much as 2TB more, and twin cameras (21MP rear, 5MP front) with high-end photography features. Along with the 5.4-inch shatterproof 1440p Quad HD screen, another cutting-edge feature is TurboPower charging (ideally up to 13 hours of life on a 15-minute charge, per the manufacturer) and as much as 48-hour battery life.


Dell Out-Innovates The Competition With New XPS Models


WWW.DELL.COM
Dell has struck again with its XPS family of laptops, which are now available in three models: the XPS 12, XPS 13, and XPS 15. (Each model number corresponds to its screen size in inches.) Introduced at the 2015 Consumer Electronics Show, the XPS lineup was the first to use Dell's InfinityEdge display, which sports a bezel so narrow it makes for a nearly borderless screen experience. All models within the current XPS family include Intel's new 6th Generation Core processors and Windows 10. The XPS 12, the sole 2-in-1 hybrid in the bunch, is the first 2-in-1 on the market to feature a 4K Ultra HD display. The XPS 15 (pictured here) is the lineup's high-end powerhouse model and presently the world's smallest 15-inch laptop. The XPS 15 accommodates up to a quad-core Intel Core i7 processor, 16GB of RAM, and 1TB of storage. It uses the Thunderbolt 3 interface for super-fast data transfers and runs for up to 17 hours on a charge. Prices for these models vary depending on the configuration, but start at $999 for the XPS 12, $799 for the XPS 13, and $999 for the XPS 15.


Smartphone Tips
GET THE MOST OUT OF THE LATEST ANDROID & IOS VERSIONS

ANDROID

Track Down A Lost Android

Missing your Android smartphone? Go to a Web browser, log into your Google account, and visit the Android Device Manager website (www.google.com/android/devicemanager). Doing this allows you to view the precise location of your smartphone on a map. If you've simply left it at home, no worries; you can retrieve it at your earliest convenience. If you discover that the phone is somewhere nearby, select the Ring function in Android Device Manager so your phone can make its presence known. If the phone turns out to be stolen or left in a public place, you can use the Android Device Manager to lock the phone or, in a worst case scenario, erase all of its data.

Use Automatic Backup To Google Drive

Automatic backup to Google Drive is not new to Android, but with the introduction of Marshmallow (Android 6.0), your data and applications are automatically backed up. To view and control what exactly is getting backed up, open Google Drive on your smartphone, tap the menu icon (three bars), tap Backup & Reset, and tap Back Up My Data to ensure that the feature is active. Go back to the previous screen and select Manage Backup to view all of the apps that are currently backed up to your Google Drive account in the cloud. Here you can change the setting for each app. According to Google, only a set amount of data is backed up per app, and the data isn't included in your Google Drive storage limit.


Enjoy Full-Sized Browser Features On Android

Since KitKat (Android 4.4), the native WebView browser in Android devices has been based on the same open-source project behind the desktop Chrome browser. Many Android apps rely on the WebView browser for all or some of their functionality. Now, with the introduction of Chrome Custom Tabs in Marshmallow (Android 6.0), you'll see a more streamlined use of such features as automatic sign-in, form text auto-fill, and saved passwords.

ANDROID
Make Your Android Even Sleepier

There's sleep, and then there's deep sleep. When it comes to preserving battery life, the more time your smartphone spends in deep sleep, the longer your charge will last. That's the logic behind the Doze feature new to Marshmallow (Android 6.0). Doze lulls your smartphone into a deeper sleep when, using your phone's motion sensors, the feature detects the phone hasn't been used for a while. According to Google, this feature improves your smartphone's standby time by about a third. Google also says that this deeper sleep won't interfere with such things as urgent incoming calls.

If you want some control, though, over which apps are affected by Doze, access Settings, tap Battery, tap the menu icon (three bars), and tap Battery Optimization. Tap the down arrow and All Apps, and then select any app you want to control. Select either Optimize or Don't Optimize to include or exclude it in your phone's Doze and low-power mode features, and tap Done.

Android Wants You To Know What's On Tap Nearby

What do you do when you need to look something up on the go? If you're like most people, you google it. Messaging someone about where there might be a good place to eat nearby? Google it. Want to know whether a business is currently open? Google it. Well, with Marshmallow (Android 6.0) comes the introduction of Now On Tap, a feature designed to help you out quickly during these google-able moments.

To call Now On Tap into action, tap and hold the home button on your smartphone. Now On Tap recognizes what you're doing at any given time, so if, for example, you are viewing an email discussing travel plans for an upcoming event, the Now On Tap feature will offer a few suggestions related to that. For example, it might serve up the address for the event site (with an option to open the Maps app to get directions), a phone icon so you can call the site directly, any apps you might have that are related to the event or location, and an option to create a new Calendar item about the event.

Safely Replace A Memory Card

If, for whatever reason, you would like to replace the memory card in your Android smartphone, don't just open the case and remove the installed card. First go to the Home screen, and then tap Menu, Settings, Storage, and Unmount SD Card. When you receive a warning message, tap OK and wait until you see "SD card safe to remove" on the screen. At that point you can remove the smartphone's cover, release the SD card (this can involve moving a small guard, pressing down on the card to unlock it, or a similar action), remove it, and replace it with another SD card.

Get A Turbo Charge

The kinds of power features found today in the Motorola DROID Turbo family of smartphones from Verizon Wireless may become de rigueur for phone manufacturers in short order, but for now you'll find them in the DROID Turbo and DROID Turbo 2 models. Billed by Motorola as the world's fastest charging smartphone, the Turbo 2 can gain up to 13 hours' worth of battery life on a 15-minute charge. A fully charged battery in either model will last as long as 48 hours, and both models offer wireless charging as well. A combination of battery design and software makes this quick charging possible.

The Motorola DROID Turbo smartphones from Verizon include Motorola's TurboPower feature, with cooler charging temps.


iOS
Get Started With iCloud Drive

If you haven't used Apple's online storage service yet, it's a good time to start along with your upgrade to iOS 9. There are two reasons for doing so: to use iCloud storage itself for backing up data and applications stored on your phone, and to use other online storage services (such as Box or Dropbox) more easily within the operating system.

To get started, access Settings, tap iCloud, and look for iCloud Drive. The iCloud Drive option appears at the top of the list, and if you aren't currently using it the selection will read Off. Tap iCloud Drive, tap Upgrade To iCloud Drive, read the message about multi-device access to iCloud, and tap Continue. The operating system will then return you to the previous Settings page, and the word Upgrading will appear while Apple does its work in the background. This can take a few minutes or more to complete, after which Upgrading will change to On. You can then tap iCloud Drive and select which apps you want the service to automatically back up for you.

It's easy to get started with iCloud Drive, but it can take some time to complete.

Kick Poor Wi-Fi Hotspots To The Curb

New to iOS 9, the Wi-Fi Assist feature comes automatically turned on in new devices running iOS 9 or later, or when you upgrade to iOS 9. The feature is designed to detect whenever you are using a poor quality or slow Wi-Fi connection, and automatically switch the device to cellular usage. This can, of course, come in handy. If, for example, you're in the middle of sending or receiving an email attachment, or trying to find a location on the map, and the app you're using is struggling to use the established Wi-Fi connection, Wi-Fi Assist will keep your device on task. But if you want to avoid inadvertently using cellular data services, you can turn Wi-Fi Assist off. To find the feature, open Settings, tap Cellular, scroll all the way down to the bottom (past the potentially long list of all of your installed apps that may use cellular data), and look for the Wi-Fi Assist toggle.

Apple's Wi-Fi Assist feature, new to iOS 9, helps you avoid slow or poor quality Wi-Fi Internet connections by diverting your connection to cellular when necessary. Slow-loading Web page due to Wi-Fi? It will speed up when Wi-Fi Assist automatically activates.

Try A Different Search Engine

Apple provides four search engine options for its Safari browser: Google, Yahoo, Bing, and DuckDuckGo. If you aren't familiar with it, DuckDuckGo, despite its unusual name, is a swift search engine whose claim to fame is that it doesn't track your every move while you're browsing, and it packages search results a little differently (some say better) than Google. Access Settings, tap Safari, and tap Search Engine to try a different search engine.

iOS
More Email Attachment Options

With iOS 9, you can attach more than just photos from your camera roll when you're composing an email message, provided that you have iCloud Drive in use on your device. In the body of the Mail message you want to send with an attachment, press an empty area until you see a menu appear that reads Paste and Quote Level. Tap the arrow to the right of these words to reveal more options, including Insert Photo Or Video and Add Attachment. The former option is pretty clear, but the one we're looking for is Add Attachment. When you tap Add Attachment, the iCloud Drive opens, providing you with the option to select any file to attach. (Keep in mind that these files are coming from the cloud, so if you wish to attach a large file it could take some time, and eat up some data, to download and attach to your message.) If the file you want to send is in iCloud, select it, tap Done, and send your message when it's ready. If you need to attach an item that is stored in one of the other cloud storage services you use, when the iCloud Drive opens tap Locations in the upper right portion of the screen. Tap the three buttons next to the word More, and you can then include items from your other services.

You can now add Mail attachments from iCloud Drive and other online storage services.

Find Apple Pay-Friendly Retailers

Apple Pay is gaining steam with businesses and users alike. According to Apple, there are hundreds of banks and credit unions that work with Apple Pay, in addition to a rapidly growing number of retailers. If you're using Apple Pay, there's a quick way to find out whether an establishment works with Apple Pay. Open the iOS Maps app, search for the location you're interested in, and when you spot the little call-out on the map for the appropriate location, tap it to reveal details. The results will include basic location information and hours, but look in the Category section for the Apple Pay icon. Only the presence or absence of the Apple Pay icon indicates whether the business takes Apple Pay, so if you don't see the icon you're out of luck.

Look up any business in the iOS Maps app and you can see whether or not it accepts Apple Pay.

Turn Your iPhone Into A Personal Hotspot

The process for transforming your iPhone into a personal hotspot for sharing Internet access is an easy one, but bear in mind that the connection will be slow and your iPhone's battery will drain quickly, so only use it when absolutely necessary. If you have a good cellular connection (not Wi-Fi), access Settings, tap Cellular, tap Personal Hotspot, and follow the instructions. You can then share with a PC or other devices via Wi-Fi, Bluetooth, or USB.

There are multiple advantages to using your iPhone as a personal hotspot. Doing so offers a more secure connection when you're in a location where you don't necessarily trust the security of the available Wi-Fi hotspots, for example. Establishing a personal hotspot also offers a way for you to share your connectivity with others nearby.


Isolate Malware
HOW TO COMBAT ATTACKS

AN UNFORTUNATE FACT ABOUT using an Internet-connected computer these days, whether it is a personal or company-issued notebook, is the constant threat of malware infection. Even when taking preemptive action to combat malware attacks, there's a fair chance one will eventually hit your notebook anyway, if for no other reason than the sheer volume of malware that attackers introduce daily. Frighteningly, a leading security software maker reportedly detected more than 20 million new malware strains between January and March 2015 alone. Of this number, Trojan horses accounted for 72.75% of all newly detected malware threats and were responsible for 76.05% of all global computer infections.

What's startling is that these attacks included zero-day threats in which, as the name suggests, zero days expire between when a given vulnerability is discovered and when attackers release malware targeting the vulnerability. With malware being so prevalent and persistent, a large part of combatting it is being able to recognize signs that a system may be infected and then knowing how to troubleshoot the problem. Also important is knowing what security tools are available to detect, protect against, and remove malware. The following details these issues and others for notebook business users.

The Warning Signs


Although new malware variants are constantly being developed and released, malware is generally categorized into several common groups, including viruses, worms, rootkits, spyware, Trojans, keyloggers, adware, and ransomware. What these groups have in common is an aim to infect a user's notebook to steal personal or company information, hijack the system outright, or cause other types of damage. Malware infections can transpire in numerous ways, including when you visit an infected website, install software or an app with malware hiding inside, click links or open attachments in email, or insert an infected USB thumb drive.

Though warning signs that malware may be present can differ depending on the malware type, there are some primary indicators to look for. Michela Menting, digital security research director at ABI Research, says the most common include applications and programs running noticeably more slowly, slower Internet performance, and data or files that are unexpectedly deleted or altered. A notebook running more slowly, for example, could indicate malware is stealing computing resources to fuel whatever activity the malware was designed to execute, such as hijacking the system to help generate and spread spam to other systems.

Some specific examples of changes in notebook performance to watch out for include programs, files, and folders that take longer to open or that don't open at all, and the notebook taking exceedingly long to shut down or not shutting down at all. Menting says an easy way to check for system performance issues on Windows notebooks is to look at the processes running in the Task Manager and pay particular attention to memory or CPU resources. "If users regularly check the Task Manager, they may be able to more easily spot when something looks different from normal," she says.
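For readers comfortable with a little scripting, that same spot check can be automated. The following is a minimal sketch, not a tool mentioned by Menting; it assumes Python with the third-party psutil library installed, and simply lists the processes using the most memory so changes from your usual baseline stand out:

```python
# Minimal sketch: list the top processes by memory use, similar to
# eyeballing the Task Manager. Assumes the third-party psutil library
# (pip install psutil); the count of ten shown here is arbitrary.
import psutil

procs = []
for p in psutil.process_iter(["name", "memory_info"]):
    mem = p.info["memory_info"]
    if mem is None:
        continue  # process exited or access was denied; skip it
    procs.append((mem.rss, p.info["name"] or "?"))

# Show the ten largest consumers so anything unusual is easy to spot.
for rss, name in sorted(procs, reverse=True)[:10]:
    print(f"{rss / (1024 * 1024):8.1f} MB  {name}")
```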
Other odd or strange system-related occurrences that can signal possible malware activity include the notebook's battery draining more quickly than normal, beeps or alarms sounding unexpectedly, and internal fans speeding up for no obvious reason. Elsewhere, the sudden and constant appearance of error messages can be a clue that malware is present, as can a Web browser's home page changing or new toolbars appearing in the browser without the user's involvement. Additionally, an inability to access various system tools; messages that report that administrator rights have been denied; and a sudden disappearance or appearance of unfamiliar icons, shortcuts, folders, photos, and file types are all other possible malware warning signs.

Pop-up messages, including those that appear out of the blue when a Web browser isn't even open, are another indication that malware (particularly adware and Trojans) may be present. An especially cruel type of malware-related pop-up is one that warns a user of security vulnerabilities on his notebook and recommends that he download or buy the suggested security software (which happens to be fake). Another indicator to watch for is phony social network posts that the user appears to initiate and share with his contacts.


Immediate Response

When you suspect malware has infected your notebook, Menting advises turning off its Internet connection. "Most malware will use the Internet connection to send information back or infect other computers on a network," she says. "Isolate the laptop and then run an antivirus scan." Additionally, ensure that antivirus software on the notebook is up-to-date with the latest malware signatures. "If not, then copy a free AV program onto a USB thumb drive and use it to install [the software] on the disconnected infected PC," she says. More sophisticated malware, Menting says, may be able to obfuscate its presence, and others, such as zero-days, have simply not yet been uncovered by security firms and, therefore, an antivirus [program] will not help. In such cases, Menting says the best option may be to wipe the hard drive clean and reinstall the operating system.

Means Of Prevention

As a means of prevention, Menting says, at the least you should ensure that a firewall is running and working properly. Generally, she says, most operating systems have built-in security features that users should activate. Additionally, numerous programs (including PDF and document-creation programs) provide options to password-protect files. "These are really useful for protecting sensitive documents," she says. "On browsers, there are a number of security features that can also be activated or increased."
Malware Removal Tools

Beyond built-in tools, numerous malware-removal tools are free for download and use, as are numerous useful and easy-to-use program-based, on-the-fly encryption tools and antitheft products. Menting says, "Users should definitely consider protecting their data as well as their devices." She says specific features and abilities to seek out in such tools include antivirus, antispam, antiphishing, and antispyware; firewall and intrusion prevention systems; email, browser, chat/instant messaging, and application protection; privacy, ID, and online transaction protection; encryption and password management; antitheft and remote locate/lock/wipe; and cloud-based services and backup platforms.

Usage-wise, routinely run antivirus scans and avoid opening email and attachments or clicking links within messages from senders you don't recognize; don't reply to suspicious email; avoid visiting suspicious or unknown websites; don't click pop-ups that appear suspicious, and consider using a pop-up blocker; and don't download and install software from suspect sources. Additionally, keep software, including Web browsers and security programs, updated; back up data regularly; and report suspicious activity to your company's IT department.


Social Media Privacy Tips


TAKE CONTROL OF YOUR ONLINE PRIVACY

SOCIAL MEDIA IS ALL ABOUT sharing our lives with friends and family, and vice versa, from daily musings about life, such as a friend that's excited about an upcoming vacation, to important events, like the birth of a new grandchild. And although it might not seem like the news, photos, personal achievements, failures, and videos you post would be of much interest to people you don't know, the information could be useful to cybercriminals trying to steal your identity. The default privacy settings on many social media websites make it so your posts, tweets, and photos are visible to the public. Fortunately, it's easy to adjust these settings so that only the people you know will see the updates.

Facebook
When setting up your Facebook profile, the service will ask for a lot of personal information (including education history, workplace, and phone number) that you might not want visible to everyone. To complicate matters, Facebook hasn't exactly been known for consistency when it comes to users' privacy settings, as past interface changes have reset settings and forced users to continually ensure their posts and personal information remain private. To correct some of these issues, Facebook has made changes in the last year to simplify its privacy controls.
Click Privacy and you'll see a list of configurable options. For example, in the Who Can See My Stuff? section, manage who can see your future posts by selecting Public, Friends, Friends Except Acquaintances, Only Me, or Custom. This way, you can make certain that your posts won't be viewable to the public at large if you forget to change the privacy settings when you post an update. You can also review the posts you've been tagged in, as well as change the audience for updates you've previously posted. This way, you can control whether any old updates are available to the public.

There are also Who Can Contact Me? and Who Can Look Me Up? settings that let you filter access to non-friends.
One easy way to assess the entirety of your Facebook privacy is to use Facebook's Privacy Checkup (click the Lock icon in the top-right corner of Facebook). Select Privacy Checkup and, in the resulting pop-up window, Facebook shows you the controls for who can see your posts. If you're following our steps, you've already addressed this step. Click Next Step to see what apps you've logged into with Facebook. Delete the apps you no longer use. When you're done, click Next Step. Finally, Facebook will bring up the information shared on your profile. Here, you'll see options to add a phone number, email, birthday, hometown, and other information. Click Finish Up to finalize your new privacy settings. All of the information in the last step can be found in the About section of your profile, which also contains other information you might want to make private. To do so, click your personal timeline and select About. Under the tabs for Work And Education, Places You've Lived, and Contact And Basic Info, you can adjust the privacy settings for details that weren't part of the Privacy Checkup.

Facebook's primary privacy settings can be found in the Privacy window.

Twitter
By default, Twitter's account settings make your tweets available for all to see. The alternative is a protected mode, where your tweets are only visible to your approved Twitter followers. Protected tweets are not retweetable, so even approved users can't share your tweets. You also cannot share permanent links to your tweets with anyone but approved followers. If you want to use Twitter to drive Web traffic, the restrictions in the protected mode might undermine why you joined Twitter in the first place.

If you want to adjust your tweet privacy level, or the other privacy controls on Twitter, sign into Twitter and open your account settings. Next, click Security And Privacy and scroll down to Privacy. If you only want approved followers to see your tweets, click the Protect My Tweets checkbox. You can also control who can tag you in photos, whether your tweets include a location, and how others can find you. After making your privacy selections, click the Save Changes button.

Google+
For Google+, privacy has been a key consideration from the very beginning. For example, you've always been able to assign a privacy level for each post you share. And based on the Circles (friend groups) you've set up, it's easy to share content with only a specific crowd. Google+ also offers detailed privacy settings where you can control most every aspect of your profile. Visit your Google+ page, click your name, select the drop-down menu under the Google+ logo, and choose Settings.

In the Settings window, you can customize who can send you notifications, comment on your public posts, and manage subscriptions. If you want to configure the audience settings for your posts, photos, and profile updates, scroll down to the Your Circles section and click Customize. By default, Google+ pushes updates to the people in your Friends, Family, and Acquaintances groups. To block a particular group, remove the check from the checkbox. If you want to reach a larger group of people, you might want to add a check to the Following checkbox, so followers of your Google+ profile will be added to Your Circles list.

Next, scroll down to the Profile section, where you can configure how people are able to find your profile and control what content displays in your profile. A setting of interest for businesses is Allow People To Send You A Message From Your Profile, as this setting offers a way for consumers to reach out to you. If the setting is limited to Your Circles or Extended Circles, customers might not be able to contact you.

If you use Google+ on your mobile device, you'll also want to examine the Location Settings section. These settings let you enable or disable location reporting via your smartphone and tablet. If enabled, you can control who can see your current city and/or exact location. The precise location is ideal for those who wish to share their location with friends and family. If that's something you don't plan to do, then it might be best to disable location settings.

Google+ offers a variety of privacy controls.

LinkedIn
The business-focused nature of LinkedIn ensures that privacy is a priority. To examine your settings, log in to LinkedIn, hover your pointer over your profile photo in the right-hand corner, and select Manage next to the Privacy & Settings option. In Privacy Controls, you'll find a host of options to control what others can see on your profile and activity feed.

If you use LinkedIn to search for new clients and key connections within an organization, you can opt to remain anonymous, so people won't know that you looked at their profile. To do so, click Select What Others See When You've Viewed Their Profile. There are two anonymous options: one where others will see an industry and title, and one where you are completely anonymous. You can also manage who can follow your updates, edit blocked connections, and shut down users' ability to view your connections.

Manage All Your Online Accounts

Now that we've explored the basic steps of managing your privacy settings, it would be wise to check your privacy settings for other social networks you might use. This way, you can have a measure of control over your publicly available online data.


Rootkit Attacks
WHAT TO DO TO FIGHT BACK

EVEN SEEING THE WORD rootkit can send shivers up the spine of someone who has suffered through the inconvenience and damage a rootkit can exact. According to Dan Olds, principal analyst at Gabriel Consulting Group, rootkits are "some of the most insidious and dangerous pieces of malware out there today." That's due to the fact that rootkits are extremely difficult both to detect and to get rid of completely. Therefore, the more you know about rootkits, the better.

What Is A Rootkit?
A rootkit is software that infects and gains privileged access to a computer. "This means it can perform administrator-level type tasks," says Michela Menting, digital security research director with ABI Research. "The primary feature is that it can hide itself in the system and remain undetected."


One way to think of how a rootkit wreaks havoc, says Jim O'Gorman, an instructor of offensive security measures, is to envision that you are driving a car but someone else is intercepting all your movements and deciding whether he should pass them on to the car or not. "In some cases, he might decide to just insert some of his own commands, as well," O'Gorman says.

Although rootkits are similar to viruses or Trojans, says Chris Hadnagy, a security training professional, viruses and Trojans usually delete data, stop services, or cause harm, while a rootkit provides an attacker system access to get at data. Not all rootkits are malicious (a company might install one to remotely access and control employee computers, for example); however, Menting says they are extremely popular with malicious hackers and cybercriminals, which is why they have such a negative connotation.

The Damage

Essentially, rootkits give an attacker free rein to perform any task desired, including installing software; deleting files; modifying programs; transmitting data; and using spyware to steal credit card numbers, passwords, keystrokes, etc. A rootkit's ability to modify existing programs and processes, says Menting, enables it to avoid detection by security software that would normally catch such software.

"There really aren't any limits to how much damage it can do to a PC," Olds says. "It can delete data files and then rewrite gibberish on the hard drive to ensure that the data can't be recovered, or it can quietly work in the background and log user keystrokes, eventually capturing workplace, ecommerce, or banking usernames and passwords." Ultimately, a rootkit can route that data to a hacker to plunder accounts or gain access to a corporate network, Olds explains.

Beyond software-based rootkits there are hardware-based rootkits, says Hadnagy. "These, like software rootkits, give the attacker full admin access to a machine, compromising everything on it and even at times the network it's connected to," he says. For users, O'Gorman says, a rootkit destroys all trust in the computer. "You can't know what is private, what is not. All integrity is gone."

How You'll Know

There are several ways a rootkit can find its way into a computer. A downloaded program file a user believes to be legitimate, for example, may have a rootkit embedded within it. Menting says rootkits generally enter a system through existing vulnerabilities and are loaded by malware, which can infect computers via downloads, email attachments disguised as genuine communication or documents, websites with unpatched vulnerabilities, USB thumb drives, or mobile devices.

To the average user, abnormal computer behavior is the best indicator a rootkit might be present; warning signs include files spontaneously disappearing or appearing, a sluggish Internet connection, and slow-loading programs. Such behavior can indicate other programs are running in the background. Menting advises checking the Task Manager to detect which applications or processes are running and using significant memory. "For the non-tech user, it may be difficult to understand," she says. "But users should familiarize themselves with how their Task Manager looks when it's running on a clean system so that when it actually is infected, the user can spot some differences when looking at the tasks."
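Menting's advice about learning your clean-system baseline can be put into practice with a short script. The following is a minimal sketch rather than a vetted security tool (a rootkit that hides from the OS will hide from this, too); it assumes Python with the third-party psutil library, and baseline.txt is a hypothetical file you create while the system is known to be clean:

```python
# Minimal sketch: compare running process names against a baseline
# snapshot taken on a known-clean system. Assumes the third-party
# psutil library; baseline.txt is a file you create with save_baseline().
import psutil

def current_process_names():
    return {p.info["name"] for p in psutil.process_iter(["name"]) if p.info["name"]}

def save_baseline(path="baseline.txt"):
    # Run this once while the system is known to be clean.
    with open(path, "w") as f:
        f.write("\n".join(sorted(current_process_names())))

def check_against_baseline(path="baseline.txt"):
    with open(path) as f:
        baseline = set(f.read().splitlines())
    for name in sorted(current_process_names() - baseline):
        print("Not in baseline:", name)

# save_baseline()          # first, on a clean system
# check_against_baseline() # later, when something seems off
```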
That said, detecting a rootkit is still generally difficult. "This is due to how adept the rootkit is at installing itself and hiding its presence in a way that is virtually undetectable by your system software," Olds explains. In this case, the only way to find the rootkit, he says, is to boot the system using a CD/DVD or thumb drive that has special diagnostic routines designed to find and remove rootkits. Hadnagy says if a system's OS is compromised, it can't be trusted to find flaws in itself. In this event, it may be necessary to boot a self-contained OS running from a CD/DVD or USB drive and run malware detection and removal software from a clean environment.

What To Do

For typical users, arguably the worst news concerning rootkits is that getting rid of one can be beyond their scope. Olds says, in fact, most users should probably seek an expert's help if they suspect a rootkit infection. Though some security programs can detect and remove specific rootkits, Menting says, there are so many variants that it can be impossible to detect and remove them all. Often, she says, getting rid of a rootkit requires a radical solution.

If a user suspects a rootkit, he should first disconnect the system from the Internet to cut off possible remote access and prevent data from leaking, Menting says. Next, remove data from the infected computer and scan it for malware on another device. (Menting notes that if the data contains unknown [or zero-day] malware, this step may not guarantee the malware is eradicated.) Finally, the computer should be purged: "wipe the hard drive and reinstall everything," she says.

O'Gorman, in fact, says starting over is the only real solution, because "really, you can't trust cleanup methods, as you are never really sure if they worked."

How To Protect Yourself

The first defense against rootkits (and malware in general) is keeping the operating system and all software, especially security software, up-to-date and fully patched. Completely relying on antivirus software is a mistake, however. O'Gorman explains there's always a lag between the time a new threat pops up and the point at which antivirus software can detect it.

"The best way to avoid issues is to not engage in risky activities," he says. "Run trustworthy, current software that's kept patched. Don't go to shady sites with out-of-date browsers and plug-ins. Don't run software that doesn't come from trustworthy sources."

"Unfortunately, the likelihood of being hacked or unwittingly downloading malware on a computer is extremely high," Menting says. "Especially in the network-connected environment of a company, even if you take all precautions necessary, someone else may not have and you get a virus from them internally."

Menting suggests using different passwords for all logins, encrypting sensitive and confidential data, staying constantly on the lookout for odd system behaviors, and securing mobile devices, particularly if they are connected to a company network or business computer.


Laptop-Projector Setup Problems


TROUBLESHOOT COMMON ISSUES WITH THESE HANDY TIPS

YOU'RE READY TO give your presentation, but until that first slide appears on the big screen, you can never be sure that your equipment has got your back. We can't tell you not to worry, but these handy tips should help bail you out if your presentation goes south.

Hardware & Cable Connections


It can be difficult to track down the source of problems that occur when you are connecting a notebook and projector. Following are some things to watch for.

Video. Turn off all equipment and connect your notebook's video out port to the projector. The usual connection choices for a notebook are VGA (Video Graphics Array), DVI (Digital Visual Interface), HDMI (High-Definition Multimedia Interface), and DisplayPort. Many projectors have VGA and one or more digital connections. If possible, use a digital connection for high quality.

Sound. Some HDMI and DisplayPort digital video connections can carry audio through the same port, but both notebook and projector must support audio over the digital video connection. Traditionally, audio is connected using the notebook's audio out jacks and the projector's audio in ports; both of these are often RCA or 3.5mm. If you're not using the projector's built-in speakers, make sure you connect your notebook's audio out to the sound system you intend to use and turn the volume down on the projector's speakers.

Mouse. If you are using a mouse, or a remote mouse controller, make sure the controller/mouse is connected, usually through the notebook's USB port. If you are using a wireless device, make sure the notebook has the appropriate wireless connection enabled. This is typically Bluetooth or a USB port wireless dongle.

Network Connection
Many venues supply network projectors, which are made available as a shared resource. Making a connection to a network projector is as easy as plugging your notebook into the corporate network via wired or wireless Ethernet. Check with the company's IT staff for specifics. Once connected, use the network connection wizard in Windows 7 to find the projector you wish to use:

Click Start (the Windows button in the bottom-left corner of the screen).
Click All Programs.
Click Accessories.
Click Connect To A Network Projector.
The network connection wizard may inform you that your notebook's firewall is blocking the ability to connect with the projector. Click to establish the network connection.
Either have the wizard search for available network projectors or enter the projector's address manually if it is available.

Once the device is connected, a Network Presentation window will minimize to your Taskbar. When you're ready to make your presentation, open the Network Presentation window and select Resume. Your notebook will treat the network projector like an external monitor.

No Video

In many cases, your notebook will detect that you have a projector plugged into one of its video outputs and will automatically turn on the port. Not all notebooks do this, however, and even those that do can still have missing video if the notebook isn't set to duplicate the Desktop or extend it to the secondary monitor (the projector). Many notebooks use a function key combination to toggle the projector port on or off and set how you can use the display. We recommend using the control panels in Win7:

Right-click a blank area on the Desktop.
Select Screen Resolution.
Select the second display from the drop-down menu.
Select Extend These Displays from the Multiple Displays drop-down menu. Your Desktop background should now appear on the projector.

Win7 also has a pop-up display for selecting the content that is sent to the projector. Press the Windows-P keys to bring up the four possible selections (a scriptable shortcut to the same modes appears after this list):

Disconnect Projector (turns the projector display off)
Duplicate (mirrors your computer's Desktop on the projector)
Extend (uses the projector as an extension of your Desktop)
Projector Only (turns off your notebook's display and uses the projector as the main display)
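The scriptable shortcut mentioned above: Windows-P is a front end for the built-in DisplaySwitch.exe utility, so the same four modes can be selected from a script. This is a minimal sketch assuming Windows 7 or later with DisplaySwitch.exe on the system path:

```python
# Minimal sketch: switch display modes by invoking Windows' built-in
# DisplaySwitch.exe utility, the same program the Windows-P pop-up drives.
# Assumes Windows 7 or later with DisplaySwitch.exe on the system path.
import subprocess

MODES = {
    "disconnect": "/internal",      # notebook display only (projector off)
    "duplicate": "/clone",          # mirror the Desktop on the projector
    "extend": "/extend",            # use the projector as a Desktop extension
    "projector_only": "/external",  # projector becomes the main display
}

def switch_display(mode: str) -> None:
    subprocess.run(["DisplaySwitch.exe", MODES[mode]], check=True)

switch_display("extend")  # e.g., extend the Desktop onto the projector
```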

Video Is Out Of Range

When the projector can't reconcile a video signal from a notebook with its preset resolution, it displays an out-of-range message. To solve this in Win7:

Right-click a blank area on the Desktop.
Select Screen Resolution.
Select the display associated with the projector.
Use the resolution drop-down menu to adjust the resolution to the correct value. Try 800 x 600 or 1,024 x 768, as these are resolutions that many projectors can handle.

Display Turns Off

If the projector's display turns off during your presentation, you'll want to check your notebook's power management feature, especially if you're running the notebook off of its battery. Whenever possible, use your AC adapter to run your notebook.

Video Won't Display Or Is Choppy

Your slide presentation works fine, but when you try to show a video, all you see is a blank window or a choppy rendition of the video. Trying to display a video on two monitors can be too much for a video card that has marginal graphics capabilities. If video isn't displaying correctly, change the Display settings to make the projector the primary display.

NOTEBOOK-PROJECTOR TROUBLESHOOTING TIPS

Turn off all equipment before connecting the notebook to the projector.
If possible, use a digital connection to ensure a high-quality presentation.
If you're not using the projector's built-in speakers, turn them down and connect the notebook's audio out to the sound system.
If you're using a wireless mouse or controller, make sure you can establish the wireless connection.
Use the network connection feature in Windows 7 to connect to a network projector.
No video? Check the ports and Windows Screen Resolution settings.
Adjust the screen resolution to resolve out-of-range messages.
When a projected image isn't proportionally correct, reposition the projector and/or change the projector's keystone setting.
If a display turns off during a presentation, check the notebook's power management settings.
If video isn't displaying correctly, change the Display settings to make the projector the primary display.


GENERATORS

AUTOMATIC TRANSFER SWITCHES

UPS
PRE-OWNED GEN SETS
20-3000KW
LOW HOUR WITH WARRANTY

We buy and sell complete systems.


CALL FOR PRICING AND SPECIFICATIONS.
www.empire-cat.com

INQUIRIES
Kris Davenport: 602.622.5619
kris.davenport@empire-cat.com

PROCESSOR
SPECIAL ADVERTISING & CONTENT FROM OUR PROCESSOR PARTNERS

Helping IT stay on pace with the SPEED OF CHANGE

Processor is designed for the IT world, covering the hardware and technologies that power today's data centers.

PROCESSOR
IT & FACILITIES MANAGEMENT

Environmental Monitoring Without The Need To Plug In To The Network
New AVTECH Room Alert 3 Wi-Fi Monitor Expands Facilities Monitoring To New Markets

AVTECH Software's new Room Alert 3 Wi-Fi marks the latest in the company's long line of powerful and popular IT and facilities temperature and environment monitors. AVTECH, which has been in business since 1988, now has more than 130,000 customers in 180 countries. Michael Sigourney, president and CEO, says this new Room Alert model will reach an even bigger audience because it eliminates the need to physically plug in to the network.

Small Footprint

The unique footprint and features of Room Alert 3 Wi-Fi make it perfectly designed to assist with monitoring temperature and other environmental conditions where a small footprint is needed, where a wired connection may not exist, when the investment cost needs to be minimal, or where deployment volume may be high.

With one digital temperature sensor built in, users can expand monitoring by adding another digital sensor (i.e., temperature, humidity, or outdoor and fluid temperature) as well as a switch sensor for conditions such as flood/water, power, smoke/fire, airflow, room entry, motion, and more.

Use Anywhere There's Wi-Fi

Room Alert 3 Wi-Fi has many benefits and can be used anywhere a Wi-Fi connection is available. There are no cables to run, so you can use Room Alert 3 devices around the world and monitor them all together on a single screen through AVTECH's Device ManageR software (included free) or the GoToMyDevices (www.GoToMyDevices.com) cloud service. Advanced alerting, mapping, and graphing features provide easy overview and analysis. New customers receive a one-year Personal subscription to GoToMyDevices at no charge.

Affordable Price Opens New Markets, New Uses

Based on its ease of use and price of just $175, Sigourney says, Room Alert 3 Wi-Fi can help bring temperature and environmental monitoring into markets that have previously been slow to adopt this technology. Key markets include IT, medical, cold storage, housing, retail, food service, museums, public buildings, farming, transportation, warehousing, and distribution. AVTECH Room Alert 3 Wi-Fi is available direct and from professional resellers in 180 countries.

AVTECH Room Alert 3 Wi-Fi

Monitor, alert, log, graph, view, map, report, manage, and protect any facility. Includes powerful Device ManageR software (free) and one year of GoToMyDevices Personal cloud service (free).
Ideal in areas that require a small footprint and have no network connection. Includes a built-in digital temperature sensor. Use in computer rooms, warehouse, medical, cold storage, restaurant, residential, and more.
$175 price makes it affordable to deploy or use in areas requiring a large number of devices. Over 30 sensor options.

(888) 220-6700
Sales@AVTECH.com
Go to AVTECH.com and click Store


PROCESSOR
SERVICE TO WATCH

GIGABYTE GSM Program Caters To Business Customers
Replacement Program Helps Companies Save Money & Time On Motherboards

Motherboard management is often a pain point for IT departments, system integrators, and VARs (value-added resellers). Companies requiring a long-term stable supply of motherboards can expend an abundance of resources on life cycle management tasks. For these companies, whether they are large organizations or small to midsize businesses, GIGABYTE's GSM (GIGABYTE Stable Models) program guarantees 14+ months of stable motherboard supply.

What The Program Does

The GSM program eliminates the headaches, time, and costs involved in maintaining a steady supply of reliable motherboards. Customers entering into the GSM program enjoy a minimum 14-month production and supply, as well as cross-shipping service (your replacement is sent before the item is received), with no minimum orders.

GIGABYTE guarantees that its GSM products have a life span that's longer than other models, and many motherboards are available for more than one year. To ensure that its customers know what to expect in advance, GIGABYTE provides all of the updated data and information online at businesscenter.gigabyte.us, including a Product Roadmap (which documents release dates and additional information for each motherboard), current promotions, and product change notification updates.

Thanks to its partnership with Intel, GIGABYTE offers a reward program for Intel Technology Providers. GIGABYTE also offers rebate programs for certain Intel and AMD purchases.


Overall Benefits
GIGABYTE provides a threeyear warranty for all of the
motherboards in the GSM program. All products have passed
WHQL testing and are certified for Windows. GIGABYTE
assigns each customer a dedicated sales representative to ensure
the best service possible. The
company also offers dedicated
technical support and service, a
phone hotline for immediate service (Monday through Friday, 9
a.m. to 6 p.m. Pacific Time), and
BIOS customization support.

Advanced Replacement Service
One key feature of the GSM program is the GSM ARS (Advanced Replacement Service), which is available to customers at the Valued Partner and Premier Partner membership levels. Customers at these membership levels can use the advanced swap service for three years from the purchase date, enjoy a one- to two-business-day RMA process time, and can choose from multiple shipping options (outbound, overnight inbound, and ground).
Customers at the Standard Partner level, by comparison, can use the advanced swap service depending on availability. They can expect an RMA process time of two to three business days and can use either outbound or ground shipping. All customers benefit from GIGABYTE's three-year product warranty.

How To Become A Member
To enter the GSM program, customers must first purchase a GSM motherboard from an authorized distributor (ASI, Avnet, D&H, Ingram Micro, LeaderTech, Ma Laboratories, or Synnex). Customers may then register for an account within 30 days. It's as simple as that. GIGABYTE re-evaluates all memberships on a quarterly basis. Call the phone number or visit the website below for further details.

GIGABYTE GSM Program
Extends product life cycles
In conjunction with Intel Technology Provider Program
Ideal for system integrators and VARs
Includes a wide range of options for business and industrial users

(626) 854-9338
businesscenter.gigabyte.us

PROCESSOR
VIDEO WALL & DIGITAL SIGNAGE MANAGEMENT

Overcome Configuration Challenges Of Multi-Screen Displays & Signage
ATEN's VM Seamless Switch Series Makes Managing Video Walls More User-Friendly
Founded in 1979, ATEN Technology
has long been a global leader in the
KVM switch market. The company is
based in Taiwan, with two subsidiaries
in the United States and many offices
around the world; this gives ATEN the
ability to serve a worldwide customer
base with its innovative products. In
addition to KVM switches, ATEN also
specializes in high-end audio/video
switches, including its VM Seamless
Switch Series, which is specially
designed to make the configuration and
management of multi-display video
walls and digital signage much easier
for users in a wide range of industries.

Hardware & Software In One
ATEN's VM Seamless Switch Series acts as the brain behind video walls and other multi-display digital signage. The product combines hardware with an integrated software interface that enables users to create custom configurations for how digital content will appear on connected screens. The switches support applications with anywhere from two video sources and two displays up to 16 sources and 16 displays. The VM1600 16 x 16 Modular Matrix Switch is particularly useful because you can mix and match modular I/O boards to support multiple inputs and outputs, including VGA, DVI, HDMI, and HDBaseT. For distributing the highest possible video quality, the VM6404H HDMI Matrix Switch supports up to four video inputs and four video outputs at 4K resolutions.
The software provided by ATEN is available for PCs and tablets, and it's specifically designed with the user in mind so both experts and novices can quickly and easily make custom profiles from anywhere. The VM Seamless Switch Series is flexible and can be used in any multi-screen application and in a variety of situations. From airports, traffic control centers, and meeting rooms to sports stadiums, casinos, and almost any other location you can think of, ATEN's VM Seamless Switch solutions simplify the process of creating eye-catching visual displays for informative purposes or for digital marketing opportunities.

On The Horizon
Looking forward to early 2016, ATEN plans to launch its VM5808D and VM5404D Seamless Matrix Switches, which are designed to work with DVI signals. The VM5808D supports up to eight DVI video sources and eight DVI displays, and the VM5404D supports four DVI video sources and four DVI displays. These new models are particularly beneficial for traffic control centers and other larger control rooms where DVI displays are still widely used. You can also create 8 or 16 connection profiles, depending on which model you choose, so you can customize layouts and quickly access them as needed.
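
To make the routing concept concrete, here is a minimal, self-contained sketch of what a saved "connection profile" boils down to: a mapping from each display (output port) to a video source (input port). It is purely illustrative; it is not ATEN's software or API, and all class and method names are invented for the example.

import java.util.HashMap;
import java.util.Map;

// Toy model of a matrix-switch connection profile: each output port
// (display) is assigned an input port (source). Real video wall software
// layers a GUI, signal switching, and scaling on top of this same idea.
public class MatrixProfile {
    private final Map<Integer, Integer> routes = new HashMap<>(); // output -> input

    public void route(int output, int input) {
        routes.put(output, input);
    }

    // "Apply" the profile by listing each display and its assigned source.
    public void apply(String name) {
        System.out.println("Applying profile: " + name);
        routes.forEach((out, in) ->
            System.out.printf("  display %d <- source %d%n", out, in));
    }

    public static void main(String[] args) {
        // Example: a 2x2 video wall showing source 1 on all four displays.
        MatrixProfile wall = new MatrixProfile();
        for (int display = 1; display <= 4; display++) {
            wall.route(display, 1);
        }
        wall.apply("2x2 wall, single source");
    }
}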

VM Seamless Switch Series
Video wall: provides 2x2 to 16x16 video wall capability
Seamless switching: world's fastest switching speed between different digital video sources
Optimum video performance: designed to handle different formats of digital video sources and sinks

888-999-ATEN (2836)
www.aten-usa.com


Pegasus Has You Covered
SOLD & SERVICED IN THE U.S.
We Sell, Service & Purchase: Kronos Time Clocks, POS, Barcode & Surveillance Systems
Pegasus Computer Marketing, Inc.
Repair, Maintenance & Sales, All From Our Texas Office, Since 1987
(800) 856-2111
www.pegasuscomputer.net

FEATURED PRODUCT
Custom Solutions For All
AIC's Server & Storage Products Are Building Blocks For Companies Of Any Size, In A Wide Range Of Industries
AIC's vast array of server and storage solutions gives it the versatility to serve a wide range of markets, including data centers, network security, media and entertainment, industrial PC, and surveillance. And because AIC is an OEM/ODM, it's essentially a one-stop shop for all of your product support and replacement part needs.
Regardless of the size of your company or what industry you fall into, AIC has a solution that will meet your requirements. For proof, consider AIC's storage and enclosure products. With JBODs available in 2U, 3U, and 4U form factors and support for 12 to 60 drives, as well as 30 models of rackmount chassis to choose from, you can easily configure a solution that fits perfectly into your data center environment.
Every product that AIC offers serves as a building block for a highly customized system. The company offers platform solutions featuring MAX I/O technology for improved performance, high-availability storage servers with dual controllers for better reliability, and network solutions that are FIPS- and/or NEBS-ready. Add to that AIC's infrastructure appliance solutions, which are designed to improve the management and efficiency of data centers and cloud environments, and you can build custom solutions that will continue to serve your company for years to come.

AIC Server & Storage Solutions
Six product categories for five markets
Building blocks for customized systems
OEM/ODM support and expertise

(866) 800-0056
www.aicipc.com

CLOUD-BASED SEARCH

dtSearch Solution For Azure & RemoteApp
.NET Solution Offers Instant Cloud-Based Searching Across Terabytes Of Data In Nearly Any Data Type, With Look And Feel Of A Native Application On Nearly Any Computer Or Device
dtSearch now offers a new .NET solution for running the dtSearch Engine fully online in the Microsoft Azure cloud. The solution uses RemoteApp for secure data access of nearly any data type from nearly any computer or device. The solution enables cloud operation of all dtSearch components, leveraging Microsoft's new Azure Files feature for dtSearch index storage.
Searching (including all 25+ dtSearch search options) runs via Microsoft's RemoteApp. Using RemoteApp gives the search component the look and feel of a native application running under Windows, Android, iOS, or OS X.
Developers using dtSearch's core developer product, the dtSearch Engine, can find a direct link to the CodeProject article (including complete Visual Studio 2015 .NET sample code) at top right at dtsearch.com/contact.html.

Instantly Search Terabytes Of Data
dtSearch is a leading supplier of general enterprise and developer text retrieval software. dtSearch's document filters support popular file types, emails with multilevel attachments, databases, and Web data. The product line can highlight hits in all data types.
dtSearch products offer over 25 different search options, including faceted searching, federated searching, advanced data classification options, special options for forensics users, and much more. For developers, the dtSearch Engine has APIs for .NET, Java, and C++, as well as SDKs for multiple platforms.
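
For readers curious about what indexed text retrieval looks like in code, here is a minimal, self-contained sketch of the inverted-index idea that engines in this category build on: index once, then answer queries without rescanning the text. It is purely illustrative; it is not dtSearch code and does not use the dtSearch Engine API.

import java.util.*;

// Toy inverted index: map each term to the set of document IDs containing
// it, then answer an AND query by intersecting the term sets. Products
// like the dtSearch Engine apply the same principle at terabyte scale,
// adding document filters, ranking, and dozens of search options.
public class TinySearch {
    private final Map<String, Set<Integer>> index = new HashMap<>();
    private final List<String> docs = new ArrayList<>();

    // Add a document and record every lowercase term it contains.
    public void add(String text) {
        int id = docs.size();
        docs.add(text);
        for (String term : text.toLowerCase().split("\\W+")) {
            if (!term.isEmpty()) {
                index.computeIfAbsent(term, k -> new HashSet<>()).add(id);
            }
        }
    }

    // Return the documents containing every query term (AND search).
    public List<String> search(String query) {
        Set<Integer> hits = null;
        for (String term : query.toLowerCase().split("\\W+")) {
            if (term.isEmpty()) continue;
            Set<Integer> postings = index.getOrDefault(term, Collections.emptySet());
            if (hits == null) hits = new HashSet<>(postings);
            else hits.retainAll(postings);
        }
        List<String> results = new ArrayList<>();
        if (hits != null) for (int id : hits) results.add(docs.get(id));
        return results;
    }

    public static void main(String[] args) {
        TinySearch ts = new TinySearch();
        ts.add("Instantly search terabytes of data");
        ts.add("Document filters support popular file types");
        ts.add("Search options include faceted and federated search");
        System.out.println(ts.search("search options")); // matches the third document
    }
}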

Visit dtSearch.com for hundreds of reviews and case studies, as well as fully functional evaluation downloads.

dtSearch
The Smart Choice for Text Retrieval since 1991: dtSearch has provided enterprise and developer text retrieval along with document filters for more than 24 years.
(800) IT-FINDS or (301) 263-0731
www.dtsearch.com

Don't compromise your network security: Let LINDY help protect your most valuable asset!
Quickly block open network ports and easily prevent users from connecting cables and devices or inserting foreign objects without permission. Helps protect against unauthorized access to a network or system, as well as prevents unintentional or malicious damage to ports.
Quick, easy & simple to install
Block physical access to RJ45 network ports
Prevent damage to ports
Visual security deterrent
10 x Blockers and 1 x Key

USB Port Blocker
System administrators can physically prevent users from connecting Pen Drives, MP3 Players, and other USB Mass Storage Devices to their computers to copy data, introduce viruses, etc. The USB Port Blocker is a combined key and lock assembly which plugs into the USB port. Simply plug the keylock into the port and release the latch; the lock remains in place! Plug the key back into the lock to remove. Easy!
Physically blocks access to a USB port
Consists of 4 locks and 1 key
5 different color code versions available: pink, green, blue, orange, white

(888) 865-4639
www.lindy-usa.com

SPECIAL ADVERTISING & CONTENT FROM OUR COMPUTER POWER USER PARTNERS
Computer Power User offers technically sophisticated readers a unique blend of product reviews, PC industry news and trends, and detailed how-to articles.

Intel NUC

Tiny. Powerful. Complete.

Intel's diminutive NUC mini-PCs have always been a great option for lots of users in all kinds of situations who need an ultra-compact PC that is flexible and highly functional but doesn't use much power or take up much space. But up until now, they've always been barebones PC kits that required you to install your own memory and storage in order to function. The latest NUC, the NUC5PGYH (code-named Grass Canyon), changes all of that.

Everything Inside
The latest NUC is powered by Intel's Pentium N3700 processor, a quad-core CPU with a base frequency of 1.6GHz and a burst frequency of 2.4GHz. The N3700 is built on a 14nm manufacturing process, has 2MB of L2 cache, and supports up to 8GB of SODIMM DDR3L-1600 system memory.

The chip also comes with Intel HD Graphics onboard, with a 400MHz base frequency and a 700MHz burst frequency, support for DirectX and OpenGL, and support for a host of proprietary video features including Intel Quick Sync Video, Intel InTru 3D Technology, Intel Clear Video HD Technology, Intel Wireless Display, and more.
As powerful and feature-packed as the Pentium N3700 is, it still manages an incredibly stingy TDP (Thermal Design Power) of just 6 watts, which means your NUC will stay cool, even when running high-end, resource-intensive applications.
Other components that make up this impressive mini-PC include 2GB of DDR3L, a soldered-down 32GB eMMC (embedded MultiMediaCard) module with Windows 10 preinstalled, and an external 12V-19V DC wall-mount power adapter.
The NUC5PGYH also includes onboard Wi-Fi 802.11ac and Bluetooth, a Gigabit LAN port, an infrared sensor, four external USB 3.0 ports (two on the front and two in the back; the top port on the front is a chargeable port that will charge mobile devices even when the NUC is powered down), two USB 2.0 ports via an internal header, a headphone/microphone jack, a VGA port, an HDMI port (this provides up to 7.1 surround audio, by the way), a headphone/TOSLINK jack, and an SDXC slot. The unit also includes an internal bay for a 2.5-inch SSD or hard drive, allowing you to greatly expand the NUC's storage capabilities.
If you're keeping score at home, this parts list means one very important thing: the NUC Mini-PC (NUC5PGYH) is ready to power up and go, right out of the box. Although you can easily expand its onboard SODIMM DDR3L memory and its available storage space via the SDXC card slot and the internal 2.5-inch SSD/hard drive mount, you don't have to add anything, as in the past, to get this NUC up and running.

Flip Your Lid
As with recent NUC kits, the NUC5PGYH comes with a VESA mount that lets you put your NUC just about anywhere (including the back of your monitor if you want), as well as a set of multi-country plugs for international use. But one of its coolest and potentially most useful additions is its removable lid. The black lid that comes with the NUC looks great and does its job admirably, but in the event that you'd like to, you can spice things up with a replacement lid, either from a third-party vendor or using one that you make yourself.
To that end, Intel's NUC website (www.intel.com/nuc) includes a mechanical drawing of the lid and files you can download and use to create your own lid with a 3D printer. Replacement lids can be as simple as a custom lid with a different color or the decorative pattern of your choice, or as involved as a lid that includes an NFC (near-field communications) or wireless charging header.

Ready When You Are
The newest member of the NUC Mini-PC family, the NUC5PGYH, is the first Intel NUC that's available as a complete system and is ready to go the moment you get it home. And despite its four-inch-square footprint, the NUC offers quad-core Intel computing performance and the power of Intel HD Graphics, as well as full wired and wireless connectivity and a host of peripheral and accessory options. Whether you're looking for a mini-PC for work or play, Intel NUC serves up desktop PC performance and a full Windows 10 experience in a tiny package.


ONE AIO TO RULE THEM ALL!
PREDATOR
Pre-Filled CPU Xpandable Liquid Cooling
CPU Cooler compatible only with*
Designed and Made in Europe
predator.ekwb.com

CORPORATE TRAVEL? NEED A VACATION?
Let our #missionbird take you where you need to go. Our diverse fleet of 22 aircraft offers a travel experience above the rest.
Ready when you are
STAjets' exclusive membership program allows our members to earn cash in addition to flying at an industry discount. It requires no complicated contracts, deposits, hidden fees, or blackout dates. We offer our exclusive members discounted flights while earning cash rewards on every flight.
IT'S THE LOWEST-COST MEMBERSHIP PROGRAM IN THE INDUSTRY, AND THE ONLY PROGRAM THAT PAYS BACK!
844 FLY-STA1 (359-7821) | charter@stajets.com | www.stajets.com


