
MARCH 2015 | VOL. 13 ISS. 03 | CYBERTREND.COM

MORE SERVERS, FEWER HEADACHES
HOW TO LEVERAGE COLOCATION SERVICES SO YOU CAN FOCUS ON YOUR CORE BUSINESS

Volume 13 : Issue 3 : March 2015

COLOCATION SERVICES LET YOU ADD CAPACITY IN DIFFERENT LOCATIONS
BETTER METHODS FOR VISUALIZING YOUR DATA

9 COVER STORY
making sense of your colocation service provider options

14 BUSINESS
the big picture at Samsung, and how bots interfere with cybermarketing initiatives

24 CLOUD
why cheap and easy cloud storage may not pay off in the long run

28 MOBILITY
mobile platforms in the enterprise, and extending wireless connectivity everywhere

32 DATA
data visualization makes it easy to get a handle on your information

36 ENERGY
the latest news and research into energy-conscious tech

38 IT
managed IT services benefits, and why many companies are building data centers in Europe

44 NETWORKING
the evolution of virtual private networking and how it helps you today

47 SECURITY
better authentication options for better security, and how to handle a cyberattack

52 WEB
fog computing, and overcoming hindrances to seamless content delivery

58 ELECTRONICS
the latest in premium consumer electronics

60 TIPS
smartphone, social media, and business travel tips

CONTACT US
P.O. Box 82545, Lincoln, NE 68501
or
120 W. Harvest Drive, Lincoln, NE 68521

Advertising: (800) 247-4880 / Fax: (402) 479-2104
Circulation: (800) 334-7458 / Fax: (402) 479-2123
www.cybertrend.com
email: feedback@cybertrend.com

Copyright 2015 by Sandhills Publishing Company. CyberTrend™ is a trademark of Sandhills Publishing Company. All rights reserved. Reproduction of material appearing in CyberTrend™ is strictly prohibited without written permission.

IDC: CIOs Need Clear Plan For Strategy & Innovation

The 3rd Platform, typically defined as including cloud services, mobile devices, social technologies, and big data, has business leaders struggling to keep pace. "Because so much of the 3rd Platform relies on adopting emerging technologies effectively, there is an added emphasis on the CIO's ability to deliver innovative IT solutions that support and, in some cases, anticipate enterprise strategic goals," says Fred Magee, adjunct research analyst at IDC. To stay ahead of business needs, CIOs need long-term guidance and commitment to building strategic skills, Magee says, in addition to a clear plan for strategy and innovation. To get started, IDC recommends tech leaders benchmark their companies in four areas: leadership, planning, change management, and financial management.

IT Will Need To Strengthen Ties With Marketing In The Near Future

Gartner is encouraging CIOs and CMOs to develop stronger relationships in the near future so that businesses can reach full business value through their investments. The projects these leaders will collaborate on will involve performance, security, functionality, and scalability solutions; source data; and application portfolio management, among others. "Marketing continues to be a hot area of IT investment and technology innovation with a rapidly growing application portfolio that demands greater integration and focused investment," says Kimberly Collins, research vice president at Gartner. "IT leaders supporting marketing will need to develop a strong relationship with marketing leaders to help marketing derive the full potential from its IT investments." The following three predictions from Gartner indicate upcoming changes in the next four years:

• By 2018, CIOs who build strong relationships with CMOs will drive a 25% improvement in return on marketing technology investment.
• Through 2018, VoC (voice of the customer) initiatives that don't share data across the enterprise will compromise customer satisfaction and loyalty measures by 30%.
• By 2018, B2B sellers that incorporate personalization into digital commerce will realize revenue increases up to 15%.

Organizations With IoT Plans Anticipate Results

In October 2014, Gartner surveyed 463 IT and business leaders in organizations that have some sort of IoT (Internet of Things) strategy, and found that 40% expect significant new revenue or cost-savings opportunities within the next three years, while 60% expect such an impact in five years or more. According to Nick Jones, Gartner vice president and distinguished analyst, "The survey confirmed that the IoT is very immature, and many organizations have only just started experimenting with it." Gartner found that 35% of those who expect transformative results from IoT work for organizations with designated IoT leadership.

The 2014 Third Quarter Was A Boom For Cloud IT Infrastructure

Cloud infrastructure revenues experienced 16% year-over-year growth, eventually reaching $6.5 billion in revenue in Q3 2014, based on statistics in IDC's recent Worldwide Quarterly Cloud IT Infrastructure Tracker. "Whether internally owned or rented from a service provider, cloud environments are strategic assets that organizations of all types must rely upon to quickly introduce new services of unprecedented scale, speed, and scope," says Richard Villars, vice president, data center and cloud research, at IDC. "Their effective use will garner first-mover advantage to any organization in a hyper-competitive market."

Technology Disrupters Will Alter How We Work & Connect

The rise of the connected worker is changing the way work, collaboration, and connections occur in the marketplace. In fact, IDC research indicates that the impact spans employees, customers, partners, and suppliers; and access to important information is vital in making task-based decisions for these groups. Vanessa Thompson, research director at IDC, says, "The major technology disrupters (mobile, cloud, big data, and the Internet of Things) have become intertwined with social workflow and communications. This will continue to have a dramatic impact on the way we get work done and ultimately how we connect with employees, customers, partners, and suppliers."

March 2015 / www.cybertrend.com

Tablet Users To Top 1 Billion This Year

More than a billion people will use a tablet this year, according to eMarketer, equating to about 15% of the world's population. Although the number of tablet users has more than doubled in the past three years, eMarketer expects the growth will slow dramatically starting this year, with the number of users reaching 1.43 billion by 2018. The research group notes three key factors for the slowing growth: the perception that tablets are luxury items, increased competition from smartphones and other connected devices, and the lack of a compelling use case for tablets in markets with robust smartphone and phablet use. "The shared nature of tablets and increased competition from other connected devices reduce the likelihood that the tablet audience will match the size of the smartphone audience worldwide," says Cathy Boyle, senior analyst at eMarketer.

Marketing A Message? Having A Process Is Key

According to a report from marketing and sales communications firm Corporate Visions, about 75% of companies, to their detriment, lack a formal process for getting out their marketing message. The report points to lack of consistency in both the messaging and the dissemination of that messaging as problems for companies. "Implementing a consistent, structured methodology that focuses on identifying customers' unconsidered needs and creating a buying vision that defeats their status quo bias will help marketers prepare their salespeople for the conversations that matter most," says Tim Riesterer, chief strategy and marketing officer with Corporate Visions. The accompanying figures indicate the perceptions of the 500 B2B (business-to-business) marketing and sales personnel surveyed for the report.

28.7% Everyone follows a well-established message development process
35.1% Have an established message development process, but it is not applied consistently
13.2% Have a message development process, but it is rarely followed because people are unaware or feel unaccountable
12.2% Don't have a formal process for message development; we hire people and expect them to do the right thing, but it's hit-or-miss in terms of execution
10.8% Don't know what the company does

Smart Glasses Set For Growth In Enterprise Use

The consumer use case for general-purpose smart glasses is weak, says Nick Spencer, senior practice director at ABI Research. Last year, he says, smart glasses were hyped as replacements for smartphones. "However, 2014 showed the use case for smart glasses is task specific, for example remote assistance, security (facial and number plate recognition), augmented reality, and virtual reality," he says. ABI Research expects unit shipments of smart glasses to increase 150% this year, with almost all attributed to the enterprise and public sector for tasks such as remote assistance, police and military, security, and warehouse and barcode scanning.

5G Adoption Will Likely Take Two Years Longer Than 4G

Although 4G subscriber growth took off quickly thanks to increasingly powerful smartphones and access to 4G devices, ABI Research says 5G will reach the 100 million subscriber milestone after five years, which is two years longer than it took 4G to reach the same goal. "5G will be a spectrum of evolution to revolution; it will be an evolution of the way the core network and network topology is transforming now, but it will be clearly delineated as a fifth generation mobile air interface on which the mobile network of the 2020s and 2030s will be built," says Philip Solis, ABI Research research director.

Online Companies Racing To Improve Authentication

Usernames and passwords only go so far in protecting us online. Alternative methods for ensuring that a person using a login is not an imposter include two-factor and multi-factor authentication, in which one or more factors (such as a smart card or a fingerprint scan) are required, usually in addition to a username and password. Interest in this practice is growing, as ABI Research has found. According to ABI, the worldwide market for multifactor authentication software and services will reach $13.2 billion in 2020, compared with a worth of $1.6 billion forecast for the end of this year.
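A common concrete form of that extra factor is the TOTP (time-based one-time password) code an authenticator app displays. As a rough sketch of the mechanism — this is the standard RFC 6238 algorithm, not any particular vendor's product — the whole scheme fits in a few lines of Python:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, interval=30, digits=6, now=None):
    """RFC 6238 time-based one-time password over HMAC-SHA1."""
    key = base64.b32decode(secret_b32, casefold=True)
    # The moving factor is the number of 30-second intervals since the epoch.
    counter = int((time.time() if now is None else now) // interval)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)
```

The server and the user's device share the secret once (typically via a QR code) and then compute the same short-lived code independently, so an attacker with only the password still fails the second check.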


Age Of The Online Employee

The Internet has been profoundly influential on the way we work, and there is still a great deal of debate over the extent to which the Internet helps or hurts productivity. There is, of course, no definitive answer that serves every company and every situation, because the Web's usefulness is entirely dependent on context. The Pew Research Center surveyed 535 working adults to help define our current views on the impact of technology in the workplace. For example, office-based workers found email (78%) and the Internet (68%) very important to their jobs, while non-office-based workers said they relied on email (25%) and the Internet (26%) less. Phones are important across the board, according to the survey; 37% of office workers and 31% of non-office workers found landline phones vital, although more non-office workers (28%) than office workers (22%) found mobile phones vital. The accompanying charts indicate how often employees are working outside of the office and workers' perception of how the Internet has impacted their productivity.

WORKING OUTSIDE THE WORKPLACE: 41% Never; 15% Less Often; 10% A Few Times A Month; 13% A Few Times A Week; 8% Almost Every Day; 13% Every Day

INTERNET'S IMPACT ON PRODUCTIVITY: 46% More Productive; 46% No Effect On Productivity; 7% Less Productive; 1% Don't Know/No Answer

More Shares For Facebook, Fewer For Twitter

Businesses weighing the potential impact of various social media outlets will be interested in a recent report about social sharing from ShareThis. The firm analyzed 2 million U.S. social media users during Q4 2014 and found that more social sharing occurred on Facebook than any other outlet. Facebook accounted for 81% of all social shares, an increase of 8.2% over January 2014. The only other increase was seen in email shares, at 0.3%, while there were declines in all other sites. The sharpest decline was seen in Twitter, with 3% fewer shares compared with January 2014.

83% Have Difficulty Using Intelligent Devices

Of those purchasing intelligent devices such as smart thermostats and wearable fitness trackers, 83% have had some sort of trouble using them, according to a recent survey by Accenture. Of intelligent device owners, 21% found the devices overly complicated, 19% had trouble with device setup, 19% claimed the devices didn't work as advertised, 19% couldn't connect the device to the Internet as intended, and 17% didn't appreciate the device appearance or design. Because 33% prized ease of use as a top purchase consideration, Accenture advises manufacturers to pay close attention to these survey results.

Mobile Commerce To Match E-Commerce

Mobile e-commerce figures are on the rise, according to the latest findings from Gartner, and they are becoming more than a small piece of the overall e-commerce pie. Indeed, the research firm forecasts that by 2017, m-commerce will account for 50% of all U.S. e-commerce sales compared with about 22% today. In related findings, Gartner anticipates that by the end of 2016, online sales completed with the help of mobile assistants such as Apple's Siri, Google Now, and Microsoft's Cortana will surpass $2 billion. Gartner suggests developing cross-functional teams that include IT, sales, customer support, and legal to meet the rise in m-commerce demand.


STARTUPS

Startup Offers Encrypted Collaboration

The Vancouver-based startup Witkit recently launched a customizable social collaboration platform designed to replace all aspects of project communication, including document collaboration, file sharing, messaging, videoconferencing, and calendar. The key selling point for this business-focused platform is that Witkit uses the company's own WitCrypt encryption service, which encrypts data on one user's device and then decrypts the data on another user's device so that unencrypted information doesn't pass through the cloud or reside on Witkit's servers. President and CEO Sean Merat says ". . . in the unlikely event that the Witkit servers are compromised, there will be no decrypted data to be found or at direct risk."
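The architecture described here is standard end-to-end encryption: keys live only on user devices, and the service in the middle stores ciphertext it cannot read. The data flow can be illustrated with a toy sketch (the throwaway HMAC-based stream cipher below is for brevity only; a real system such as WitCrypt would use vetted primitives like AES-GCM and a proper key exchange):

```python
import hashlib
import hmac
import os

def _keystream(key, nonce, length):
    # Toy HMAC-SHA256 counter-mode keystream -- illustration only,
    # not a substitute for a vetted cipher.
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(8, "big"), hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt_on_device(key, plaintext):
    nonce = os.urandom(16)
    ct = bytes(p ^ k for p, k in zip(plaintext, _keystream(key, nonce, len(plaintext))))
    return nonce + ct  # this opaque blob is all the cloud relay ever sees

def decrypt_on_device(key, blob):
    nonce, ct = blob[:16], blob[16:]
    return bytes(c ^ k for c, k in zip(ct, _keystream(key, nonce, len(ct))))

shared_key = os.urandom(32)  # in practice, agreed between devices via key exchange
blob = encrypt_on_device(shared_key, b"draft contract, internal only")
assert decrypt_on_device(shared_key, blob) == b"draft contract, internal only"
```

The point of the design is that a server-side breach exposes only `blob`, never the plaintext, because decryption keys never leave the endpoints.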

Carry-On Luggage Gets Very, Very Smart

What might the suitcase of the future be able to do? Let you know its location whenever you might be looking for it, regardless of where you're at? Tell you how much it weighs to spare you any unpleasant surprises when checking in at the airport? Alert you (and lock itself) when it's no longer nearby? Bluesmart, a startup founded in 2013 and headquartered in Mountain View, Calif., has designed a suitcase that possesses these and other features. Billed as the world's first smart, connected luggage, the Bluesmart TSA-approved hardshell suitcase also includes a battery charger and a complementary mobile app that controls the suitcase's digital lock, helps you plan your trip, and provides the aforementioned vital statistics for your suitcase. The company recently announced it would receive $2 million in funding from its Y Combinator backers.

A More Exclusive Professional Networking Solution

Unlike social networking services that can at times feel like spam delivery services, the business-oriented social service Shapr is designed to operate on mutual trust by ensuring that users get exclusively introduced to others by people they already know. The goal is the same as other business networking solutions (to help people find collaborators, employees, and jobs) but the functionality is meant to reduce the number of contacts who aren't especially helpful in achieving these goals. Based in New York and Paris, Shapr recently launched and announced it had raised $3 million from private investors in a seed round.

Startup Offers E-Discovery Via The Cloud

"The traditional approach to e-discovery is unsustainable," says Monica Enand, CEO of Portland, Ore.-based e-discovery firm Zapproved. Citing exponentially growing amounts of data that sit in disparate places and a "collect everything" mindset, Enand offers her company's Web-based services as a way for corporate legal departments to gain control over all of that data and find the precise information needed for quick and inexpensive resolutions to civil litigation. Founded in 2008, Zapproved has raised $20 million so far, including a recently announced $15 million round. The startup plans to use the funding to continue its growth pattern.

Startup Dedicated To Smoother Email Integration

If your company's developers have had difficulties creating apps that work smoothly with multiple email, calendar, and contact platforms, you're not alone. Nilas, a year-old San Francisco-based startup, offers a platform of the same name (originally called simply Inbox) that's designed to make it easier for developers to incorporate support for Microsoft Exchange, Gmail, and many other services into their apps. The company, co-founded by Michael Grinich and Christine Spang, MIT alumni who worked for Dropbox and Oracle, respectively, recently raised $8 million for its cause in Series A funding led by Formation 8.



Data Center Colocation Options

TODAY'S MARKET OFFERS MORE PROVIDERS & ADDED SERVICES

KEY POINTS
• While data center colocation generally means leasing data center facility space, it's not uncommon today for colo providers to offer a variety of supporting services.
• Add-on colocation features include managed and hosted services, cross-connects with other tenants, and industry-specific expertise.
• Colocation pricing plans have shifted from space-based charges to power-based charges.
• Using one or more data center colocation providers can enable your organization's internal IT staff to focus more of its attention on projects strategic to the business. This practice also improves your data security.

THERE HAS ARGUABLY NEVER been a more opportune time for organizations to move portions of their IT infrastructure into a colocation data center facility. Data indicates more vendors have entered the market and have thereby driven colocation prices down. At the same time, vendors are offering more unique features and services to entice organizations.

In research conducted through 2014, Sophia Vargas, a researcher at Forrester Research, found 1,430-plus facilities operating in more than 330 U.S. cities. She also found 68 million square feet of reported colocation space and up to 120 million square feet of total estimated space. A forecast from Research and Markets analysts, meanwhile, projects the North American colocation services market to increase at a compound annual growth rate of 13.36% from 2012 to 2016, driven in part by organizations seeking to reduce their own CAPEX and OPEX facility costs.

For companies eyeing colocation, the following provides a broad look at options the market now offers, costs, benefits, factors to consider, and more.

The Basics

At its core, colocation entails a provider leasing data center space in which an organization can place its own infrastructure. The vendor generally provides the mechanical and electrical equipment (power, cooling, physical security, power distribution units, etc.) and the rack space, and the organization maintains responsibility for procuring, configuring, and maintaining the compute, storage, networking, and security infrastructure. This arrangement, says Lynda Stadtmueller, cloud services vice president at Frost & Sullivan, grants organizations considerable management control over workload performance and security but without the costs of building out a private data center.


Beyond these basics, many vendors offer additional services, such as helping enterprises move hardware into the facility. Many providers also offer "remote hands," or service personnel who will, for example, reboot a router or install a server for the organization, a potentially valuable service for businesses that aren't located near the facility.

Primary purposes for which organizations are currently using colocation services include backup and disaster recovery, says Kelly Morgan, research director for North American data centers at 451 Research. After a provider has proven its worth, she says, sometimes enterprises swap, or place their DR equipment in their own data centers and use the colocation facility for their primary sites. SMBs (small to midsize businesses), meanwhile, are using colocation in combination with managed hosting, managed firewalls, and/or other services a vendor may provide, she says.

John Dinsdale, chief analyst and managing director at Synergy Research Group, says technological advances are propelling companies to become much more data-oriented, thus vastly increasing the workloads they must support. "Running large data centers is a highly specialized operation requiring dedicated real estate and specialized staff," he says. "This is also expensive and is now seen by many companies as being a distraction from their main focus of running a business."

"Using colocation facilities extensively enables a company to manage its data center requirements more cost-effectively and flexibly," Dinsdale says. "Apart from advantages in flexibility and speed, this also replaces some heavy capital expenditure with ongoing monthly leasing costs, so there can be substantial financial advantages."
In general, more organizations are
now adopting hybrid IT strategies,
combining an in-house data center
with cloud services and hosting and
colocation facilities. Stadtmueller explains that as part of a hybrid strategy,
colocation offers a less capital-intense
alternative to an on-premises data
center for workloads that require high
levels of control (security, compliance,
etc.) for which the on-premises facility
may lack adequate space or sufficient
resiliency or physical security.

Make A Case

Among the additional services that colocation providers may offer are office space, conference rooms, and various amenities (access to food, break rooms, etc.), Morgan says. Other offerings are more operationally focused. Facilities described as colocation interconnect facilities, cross-connects, or "carrier hotels," for example, provide organizations with direct access to multiple carriers and cloud service providers. By avoiding the last mile of network access and connecting directly with high-speed carrier backbones, organizations can minimize network delay and costs for their latency-sensitive workloads, Stadtmueller says. Financial services, gaming, media, and entertainment industries in particular can benefit from such services.


Vargas, meanwhile, says that cross-connects (methods for connecting an organization into the provider's environment) have always been around, but providers are starting to showcase them more as a feature that enables customers to tie into another tenant's services. If a cloud provider resides in the same colocation facility, for example, the colocation vendor can cross-connect the organization's and the cloud provider's environments. "Now, you have a direct link into the [cloud provider's] public cloud environment, not over the Internet but a private link with little to no latency because you're literally within the same facility," Vargas says.
Elsewhere, providers may offer managed and hosting services. These can include managing an organization's operating system, storage, backup/data recovery services, or software; hosting/managing its website; and managing security services. "IaaS [infrastructure as a service] is also increasingly popular," Morgan says. "For enterprises that don't have a large IT staff, these services can be very helpful. In addition, with a good, experienced provider you get staff who have worked with a lot of companies and have seen a variety of problems, so they can help head off problems before they happen."

Some providers have developed expertise in the industries their valued customers represent and make that expertise available. Commonly, these include online gaming and entertainment, financial, and media industries. In some cases, the provider develops an industry-specific sales channel to address specific content-distribution needs for an industry, Stadtmueller says. In other cases, a provider might develop industry-specific offers or SLAs (service-level agreements). Understanding and responding to industry-specific needs and challenges allows a colo provider to bring additional value to the tenant, she says.

"Look beyond the marketing brochures to focus on what's important to your business. . . . If you're like most organizations, you care less about the credit associated with SLAs than about keeping your workloads running," Stadtmueller says.

Typically, providers offering managed services possess certifications (HIPAA, PCI, etc.) that most industries require, Vargas says. This is fairly middle-of-the-road, she says, although where vendors start going above and beyond is in adding flexibility into contracts. Generally, this requires some forecasting in terms of the organization's move-in requirements progressing year on year so that by the end of, say, a five-year contract it will want to reserve a given amount of space, Vargas says.
"I've seen some vendors be a little more accommodating with users when they're not certain about what their capacity requirements are going to be," Vargas says. "So, instead of having to penalize them for having to expand or retract parts of the contract, they're being a little more flexible in those terms up front."
Elsewhere, Vargas says, providers are offering more variability in terms of resiliency levels. The old colocation model, for example, saw a provider build up its facility so everything was at, say, a Tier III-plus level. Now, however, providers realize Tier III-plus might be overkill for less-critical applications, and some organizations don't want to overpay for an environment they don't necessarily require. Thus, a vendor may have multiple rooms within the same facility that enable customers to spread out across different redundancy and resiliency levels and tier their workloads and hardware to what makes the most sense.

The Costs

In terms of infrastructure (servers, storage, security appliances, networking, etc.), using a colocation facility usually requires the same capital investments as an on-premises facility. Additionally, tenants may pay a fee covering space (by the rack, cage, or floor), heating/cooling, power, physical security, and network access. Companies must also budget for IT personnel to provision and maintain the infrastructure as they would for on-premises circumstances, Stadtmueller says.

Beyond charging by the square foot or rack, providers may charge by the kilowatt used, Morgan says. For smaller deployments entailing a few racks, she says, organizations often pay a flat fee that grants access to a certain amount of power and bandwidth and sometimes a set number of IP addresses. Larger deployments may entail paying a flat fee for space plus a fee for power used and network connectivity charged separately, she says.
Today, Stadtmueller says, more providers are likely to charge by the power used. Vargas says the change runs in tandem with racks becoming denser. "So the constraint now becomes how much power can you draw to the building and not how much space can you provide in the building," she says. Vargas says pricing methodologies in general are changing due to the different types of services now being offered.
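The practical difference between the two billing models is easy to see with hypothetical numbers (the rates below are illustrative, not market quotes): under space-based pricing a dense rack and a sparse rack cost the same, while power-based pricing tracks actual draw.

```python
# Illustrative rates only; real colocation quotes vary widely by market and term.
PER_RACK_MONTHLY = 1200.0   # $ per rack per month, space-based plan
PER_KW_MONTHLY = 175.0      # $ per kW per month, power-based plan

def space_based(racks):
    """Monthly charge when the provider bills by rack (space)."""
    return racks * PER_RACK_MONTHLY

def power_based(racks, kw_per_rack):
    """Monthly charge when the provider bills by kilowatt drawn."""
    return racks * kw_per_rack * PER_KW_MONTHLY

# Ten racks, billed both ways:
print(space_based(10))        # 12000.0 regardless of density
print(power_based(10, 4.0))   # 7000.0  for low-density racks (40 kW total)
print(power_based(10, 12.0))  # 21000.0 for high-density racks (120 kW total)
```

The crossover is why denser racks pushed providers toward per-kW plans: space stopped being the scarce resource before power did.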
In terms of the cost benefits of using colocation, for organizations facing storage and compute growth requirements, colocation is often a better economical choice due to the high costs of expanding an existing data center, Stadtmueller says. Furthermore, colocation facilities are also generally available in much shorter time frames, something important in businesses where market responsiveness is vital.

Morgan says the cost benefit comes from turning CAPEX into OPEX, or spending for a service rather than spending on capital. "The cost to build a data center can be anywhere from $10 to $30 million for a 1-megawatt facility that's roughly 10,000 square feet of raised floor," she says. This doesn't include the operational cost of paying engineers to ensure the facility stays operational or for security, janitorial, and other staffing.
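Morgan's build-cost range makes the CAPEX-to-OPEX trade-off easy to frame: amortizing the construction bill over the facility's life gives a rough monthly figure to weigh against a colocation lease. This is a simplified straight-line sketch that ignores cost of capital, and the 15-year life is an assumption, not a figure from the article:

```python
def build_monthly_equivalent(capex_dollars, life_years=15, monthly_ops=0.0):
    """Straight-line monthly equivalent of a data center build-out."""
    return capex_dollars / (life_years * 12) + monthly_ops

# Morgan's range for a 1 MW facility: $10M to $30M, before staffing.
low = build_monthly_equivalent(10_000_000)
high = build_monthly_equivalent(30_000_000)
print(round(low))   # 55556  -> roughly $55.6K/month of construction cost alone
print(round(high))  # 166667 -> roughly $166.7K/month
```

Staffing, security, and maintenance come on top of that, which is the ongoing operational cost the text notes the build price does not include.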
For organizations that need a large data center or already have expertise on staff, building a data center can be cheaper, Morgan says. However, she adds, large data center providers do get some economies of scale when they build, and those with large campuses can save quite a bit on operations costs because their engineers can be busy taking care of multiple facilities at a site. Larger providers can sometimes negotiate better power rates with the electrical company, as well.

Matters Of Location

A key factor in choosing a colocation provider is location itself. This pertains both to the organization's geographic focus and the provider's footprint, which can range from multinational to local. Currently, Dinsdale says, there aren't many choices for companies needing a truly multinational provider. Conversely, for companies that only need a provider in a single U.S. metropolitan area, the list of potential providers is substantial.

Generally, organizations first choose a broad geographic location, such as one near their customers, partners, branch locations, or interconnect facilities in other global regions, Stadtmueller says. They then seek a specific provider. Stadtmueller recommends reviewing the features a vendor supports at its individual locations, including whether the facility is built along a fiber network route or has access to two separate fiber routes for high-availability workloads.
For workloads requiring the highest possible uptime, Stadtmueller recommends determining whether the facility is protected from power outages through access to power substations on two grids, noting, however, that not all workloads require the highest levels of access, security, or resiliency. Thus, be sure to consider price performance factors in decisions.

Other location-specific considerations include noting how often IT staff will need to visit the facility, how easy it is to travel and deliver parts to, and if there are accommodations for staff, Morgan says. Additionally, assess the facility's exposure to natural disasters and whether it's located far enough from the company's own data center so that both sites aren't impacted by the same disaster. "You need to have at least two data centers in geographically diverse areas in order to reduce disaster risk," she says.
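That two-site rule can be checked mechanically: compute the great-circle distance between candidate sites and compare it against a minimum separation. The 400 km threshold below is an illustrative assumption; pick one that clears your region's shared-disaster footprint (hurricane tracks, fault lines, a common power grid):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometers."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def geographically_diverse(site_a, site_b, min_km=400.0):
    """True if two (lat, lon) sites are at least min_km apart."""
    return haversine_km(*site_a, *site_b) >= min_km

# Hypothetical pair: Lincoln, NE vs. Denver, CO -- comfortably separated.
print(geographically_diverse((40.81, -96.68), (39.74, -104.99)))  # True
# Two facilities in the same metro area are not.
print(geographically_diverse((40.81, -96.68), (40.85, -96.60)))   # False
```

Distance is only a proxy, of course; two far-apart sites on the same flood plain or grid interconnect can still fail together.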


To ensure a facility is prepared to handle a natural disaster, find out whether the facility is on a priority electrical grid, such as the type hospitals use; make sure that there is good access to diesel fuel delivery; and ensure routes and contracts are in place. Furthermore, assess how important water is for cooling, where it comes from, and if there's a backup source. Overall, Morgan says, find out whether the facility is near major highways, flight paths, or railroad tracks; whether it is located in a flood zone; and if risk-mitigation methods are in place. Also check the neighborhood crime rate.
"Data centers are often not in the best areas," she says. Thus, note whether IT staff will be comfortable driving to the facility at night, and whether there's gated parking if the neighborhood is rougher.
Typically, real estate taxes, labor costs, and utility costs are top considerations for most customers, Vargas says. If considering facilities in remote areas, ensure there's access to appropriate space, labor, bandwidth, and free cooling. If considering facilities outside of the country, check local regulations. "In Germany, for example, you have to keep any customer data for German clients within country," she says. "So if you want to do business there you have to have a facility based in Germany or some sort of resources."

Take Responsibility

A chief concern for any organization using colocation is what becomes of the colocation agreement should the organization become unhappy with the services, or if the provider goes out of business. In this area, Stadtmueller says you'd better get it right before signing the contract. Her recommendations: When researching prospective providers, check their stability (time in business, ownership, financial strength, etc.), tour their facility to note how they operate, and ask for customer references.
Morgan says to ask the provider to produce incident and maintenance logs, noting its procedures, how often equipment maintenance occurs, and what types of incidents have occurred. "The Uptime Institute has launched an M&O [Management and Operations] stamp of approval designed to certify when a provider follows industry best practices to ensure high availability," Morgan says. "You should also ask their customers how the firm handles incidents, how its communication is, whether you can get someone on the phone when there's a problem, and how quickly it has worked to resolve issues."
"The cost to build a data center can be anywhere from $10 to $30 million for a 1-megawatt facility that's roughly 10,000 square feet of raised floor."
KELLY MORGAN
Research Director, North American Data Centers, 451 Research

Stadtmueller emphasizes the importance of reading contracts down to the small print to ensure there are no misunderstandings about who will be responsible for what. "Look beyond the marketing brochures to focus on what's important to your business," she says. For example, pinpoint what the actual uptime is vs. the SLAs. "If you're like most organizations, you care less about the credit associated with SLAs than about keeping your workloads running," she says.
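The gap between an SLA figure and real-world availability is easier to judge once the percentage is converted into concrete downtime. A minimal sketch (the SLA values below are illustrative, not any provider's actual terms):

```python
def allowed_downtime_minutes(sla_percent: float, days: int = 365) -> float:
    """Convert an uptime SLA (e.g., 99.99%) into the minutes of downtime
    it permits over the given period."""
    return (1 - sla_percent / 100) * days * 24 * 60

def meets_sla(observed_downtime_min: float, sla_percent: float) -> bool:
    """Check measured downtime against the contracted SLA."""
    return observed_downtime_min <= allowed_downtime_minutes(sla_percent)

# "Four nines" still permits roughly 52.6 minutes of downtime per year,
# while 99.9% permits nearly nine hours.
for sla in (99.9, 99.99, 99.999):
    print(f"{sla}% uptime allows {allowed_downtime_minutes(sla):.1f} min/yr")
```

Comparing numbers like these against a provider's incident logs makes the difference between an SLA credit and actual workload availability concrete.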
Another concern is how using a provider will impact the organization's own IT personnel. If bypassing remote hands services, IT will maintain responsibility for procuring, provisioning, and maintaining infrastructure in the colocation facility, meaning personnel need to be located within easy access to the facility. Stadtmueller says it also makes sense to install a monitoring platform in the organization's data center to track performance of the colocation infrastructure. Furthermore, IT will need to perform vendor-management tasks, such as ensuring SLAs are met.


As Vargas notes, colocation overall essentially means the company is outsourcing the facility itself, thus eliminating various onsite facility management responsibilities. This may lead to repurposing parts of the facility and redistributing internal staff. "That's one of the key value propositions of the outsourcing movement in general," Vargas says. "Organizations are realizing if they're running hospitals or restaurants, they're not in the business of running data centers, so why maintain that expertise when they could use those folks to do something more strategic to the organization."




Samsung In Living Color


FROM BLACK-AND-WHITE DISPLAYS TO 4K TVS, VR HEADSETS & ENTERPRISE SERVICES

KEY POINTS

Samsung started off as a trading company but quickly expanded into the electronics and home appliance markets in subsequent decades.

Samsung is commonly known for its consumer products, including its TV sets, smartphones, tablets, and cameras.

Samsung 360 Services for Business gives you 12/5 or 24/7 support for mobile devices, security, and more.

For enterprises, Samsung offers mobile, networking, telephony, and marketing solutions. All of these products build off of Samsung's heritage but show the company's ability to adapt and innovate.

WHERE DO YOU START with Samsung (www.samsung.com)? The company has


been around since 1938 in one form or another and is primarily known as an electronics manufacturer, but its story is much more interesting and complex than that. Samsung's televisions, smartphones, and other consumer devices are some of the most innovative products currently on the market, which makes it hard to believe that it originally started out as a trading company before transitioning into a major manufacturer of products for a wide variety of markets. Samsung's longevity is something that many companies strive for, and the secret to its long-standing success in electronics and other areas lies in its ability to adapt to market changes and give customers what they want regardless of the industry.

A Diverse Portfolio

It's much easier to look at Samsung as a group of business units rather than as a singular corporation, or at least it was in the past before the company split up into more manageable segments. It started off in a familiar place, with black-and-white TV sets, starting production in 1970 and selling them in 1972. Samsung would go on to produce lifetime totals of 1 million black-and-white TVs by 1976, 4 million by 1978, and 10 million by 1982. Within that time frame, the company started producing washing machines and refrigerators (in 1974), its first microwave ovens (in 1979), and a line of air conditioners (in 1980).
In addition to consumer electronics, Samsung also established multiple business units spread across different industries. Some examples include Samsung Heavy Industries and Samsung Petrochemical in 1974 and Samsung Fine Chemicals and Samsung Construction in 1977. The introduction of new business units and product lines would become a running theme for Samsung over the next few decades as the company continued to move beyond its perceived comfort zones to venture into new arenas.

Although Samsung had subsidiaries and business units in a wide breadth of industries at that time, electronics would ultimately be Samsung's bread and butter. In 1977, it started exporting color televisions and would go on to produce 20 million of them by 1989. Samsung also started producing personal computers in 1983, began exporting VCRs to the United States in 1984, and created a 4mm video tape
recorder in 1986. All of these different
product categories would
eventually lead to where
Samsung is today with its
LED TVs, laptops, tablets,
and Blu-ray players, as
well as its digital cameras
and HD video cameras.

Innovation In The 1990s

Samsung would continue to innovate on its electronics product lines throughout the 1990s, building foundations for future mobile devices, too. For example, in 1992, Samsung not only built a 250MB hard drive and the world's first 64MB DRAM technology for use in computer memory, but also started working on a mobile phone system. A year later, in 1993, the company would introduce the first DVD-R recorder.

Innovation continued at an exponential pace as Samsung developed a 256MB version of its DRAM technology in 1994 and a 1GB version in 1996. Also during this same time period, it produced its 30 millionth microwave, announced a 33-inch TV, and introduced its Alpha chip, which was the world's fastest CPU at the time in 1996.

From 1997 to 1999, Samsung continued to make strides in the television and computer markets. It developed a 30-inch LCD screen in 1997 and began mass producing digital TVs in 1998, which would lead to the development of flat-screen TVs. In 1999, Samsung was the first in line to offer digital TVs and introduced the first 3D LCD monitor to the world. The company also started working on its first smartphone in 1999, which was designed to be a wireless-based, multi-function device from the beginning.

Samsung In The New Millennium

Samsung's line of televisions would see major changes throughout the 2000s. In 2000, Samsung developed a high-definition digital TV and then developed a 40-inch LCD TV for the first time in 2001. From 2002 to 2004, Samsung developed a 54-inch LCD TV and would go on to release a 46-inch LCD TV as well as the PDP-TV, which was the thinnest monitor at the time. In following years, Samsung transitioned from LCD and plasma televisions to LED alternatives with 4K capabilities (more on that later).

Throughout the 2000s, Samsung designed many new home entertainment products. It launched an all-in-one DVD player in 2000, an HD DVD combo in 2003, and an optical Blu-ray disc recorder in 2004. In 2006, Samsung released the world's first Blu-ray player, which was capable of playing higher-capacity Blu-ray discs with the highest visual and audio fidelity for home entertainment.

Perhaps the only product line that would rival Samsung's televisions would be its line of cell phones. In 2000, the company introduced a PDA phone to the market and in 2002, it launched a cell phone with an HD LCD display. By 2004, Samsung had sold over 20 million cell phones in the U.S. alone. This innovation streak continued with a 7MP camera phone in 2005, a 10MP camera phone in 2006, and the OMNIA line of phones in 2008. In that same year, Samsung became the number one manufacturer in the U.S. cell phone market and would continue to introduce not just mobile phones, but a wide range of other popular consumer devices for years to come.

Consumer Devices

A company's present and future is shaped by its past, and Samsung is no different. Decades of innovation led to the introduction of devices that would help consumers in the home, and that would follow them wherever they might go.

Samsung's newest Galaxy tablet, the Galaxy Tab 4, runs the Android 4.4 (aka KitKat) operating system and keeps a long charge (10 hours of continuous Internet use) for both consumer and professional use.

Mobile Devices & Laptops

With its Galaxy-branded lineup of smartphones and tablets, Samsung is blurring the line between traditionally segmented electronics markets and making it possible for consumers to do more with less. Take the Galaxy Note, for example, with its 5.6-inch AMOLED display, 16MP camera with Optical Image Stabilization, 3GB of RAM, and 32GB of onboard storage. It's somewhere between a phone and


a tablet (sometimes referred to as a phablet) and even comes with an S Pen stylus so you can hand-write notes or draw images on the fly. The sheer computing power coupled with the high-quality camera and larger-than-average screen means that for some users, the Galaxy Note is a potential 3-in-1 device that covers the smartphone, tablet, and digital camera categories in one fell swoop.

For those who require a dedicated tablet experience or a larger display, Samsung offers numerous options. The Galaxy Tab comes in a 7-inch model that's only slightly larger than the Galaxy Note, with a screen as large as 12.2 inches. You can also choose between Wi-Fi and 4G LTE (Long Term Evolution) models, so consumers and business users alike can have access to email, Internet, and applications regardless of their location. In addition to screen size, you also have options in terms of onboard storage capacity from 8GB up to 32GB and beyond, with the ability to expand capacity using additional SD memory cards.

While some users might be able to get away with a Galaxy Note or Galaxy Tab to perform most of their necessary tasks, some people still need a dedicated laptop for those more resource-intensive projects or just for the sake of convenience. For those people, Samsung offers ATIV Book 9 laptops, which also come in multiple styles and configurations. Screen sizes range from 12.2 to 15.6 inches and you can choose from various processor, memory, and hard drive options as well. For example, you might choose the 12.2-inch ATIV Book 9 with an Intel Core M processor and 128GB SSD drive or you might go with a 15.6-inch ATIV Book 9 with an Intel Core i7 processor and a 256GB SSD drive, depending on how you intend to use the laptop. The former configuration is fine for watching videos and browsing the Web, while the latter is better for performing processor-intensive tasks such as video editing and running database applications.

Wearable Technology

In addition to its traditional offerings, Samsung's many forays into wearable technology can't be ignored. The company offers smart watches that interact with your phone or work as standalone voice-enabled devices, and also fitness bands designed to help users exercise more effectively and track progress. Samsung even has its own VR (virtual reality) headset that works hand-in-hand with its Galaxy Note 4. Developers can leverage this to develop new immersive games and applications, and consumers can use it to watch videos and engage in other unique experiences. This technology is on the cutting edge right now, so use cases are still being explored for consumers and business users alike.

Whether you're new to photography or a professional, Samsung offers digital cameras with a wide range of features to support your needs. The Samsung NX1, for example, is a 28.2MP camera with a 3-inch Super AMOLED Tilt display and touch panel.

Home Entertainment

In addition to plasma and LED TVs, Samsung has fully embraced the 4K future and currently offers lines of UHD (ultra high definition) and SUHD (super UHD) TVs for home entertainment enthusiasts who want the clearest possible picture and the fullest feature sets. Alongside more affordable alternatives, Samsung offers some of the most high-quality and high-cost televisions available in the market. For example, the 4K UHD S9 Series Smart TV is a $40,000 85-inch behemoth with 4K Ultra HD visuals, Internet-based Smart TV applications for streaming movies and TV shows, and even a built-in camera for video calls. It's 3D-capable and has four HDMI ports, three USB ports, an Ethernet port, built-in Wi-Fi, and much more.

Samsung also offers multiple devices to go along with its televisions, including DVD players, Blu-ray players, and a Smart Media Player. Most of these devices have built-in Wi-Fi for streaming applications, but will also support physical media such as DVDs and Blu-ray discs. Some of Samsung's Blu-ray players also support 3D and 4K UHD upscaling, so you can get the most out of your 4K TV. The Smart Media Player is interesting for consumers because even though it doesn't play physical media, it does support most of the major video streaming services out there today and can even replace your cable box in certain situations. If you don't own a smart TV and want a relatively inexpensive way to convert it, the Smart Media Player can help. It not only supports the aforementioned streaming applications but also gives you access to a Web browser, so you can download applications directly to the TV.

Because audio is also crucial to the home entertainment experience, Samsung offers more options in this area than ever before. Whether you want an audio dock to play music from


your mobile device or a fully featured


surround sound system with built-in
Blu-ray support and other features,
Samsung has something for you. Most
of these devices also support Wi-Fi and
Bluetooth, so in some situations, you
might not even need to use cables of
any kind to stream content from your
device to your speakers.

Digital Photography

In the realm of digital cameras, Samsung offers models as sharp as 28MP or greater, with bright displays and touch panels. Samsung's digital camera range includes simple point and shoot models and cameras that support interchangeable lenses. Whether you're a digital photography novice or a professional, Samsung offers cameras with a wide range of features that support your specific expectations.
On the point and shoot side, Samsung offers cameras with built-in Wi-Fi for easy sharing, as well as models with dual-view LCD screens so you can see what you're shooting from the front of the camera as well as behind it. And for the savvier photographer,
Samsung offers high-end NX system
cameras that offer multiple lens options,
so you can fine-tune your configuration
for the best possible image regardless of
distance or lighting conditions.

Home Appliances & Beyond


Samsung has also remained true to
its home appliance roots. The company offers microwaves, full-sized
ovens and ranges, refrigerators, dishwashers, washers and dryers, and
vacuum cleaners, among other products designed for home and office use.
Samsung even offers LED lighting fixtures for indoor and outdoor use.
In this area, as with all of its other
products, Samsung offers consumers
a wide variety of options when it
comes to colors, capacities, and more.
Samsung also continues to develop new ways of improving home energy efficiency, infusing these and other products with the company's unique sense of style.

Samsung 360 Services For Businesses

Aside from catering to businesses large and small with some of its consumer-oriented devices, such as tablets and smartphones, Samsung also offers a collection of services designed to help businesses run more efficiently. Samsung 360 Services for Business is a portfolio of solutions that is currently in a pilot phase but should soon expand to reach a greater number of clients. And all of these services can be used by companies in different industries, whether it's finance, manufacturing, or elsewhere.

Included in Samsung 360 Services for Business is a technical support help desk that can operate as either a 12/5 or 24/7 setup, depending on the needs of your company. The help desk includes support for enterprise mobility management licenses and platforms, among other features. You can also utilize the included customer support portal for quick and easy access and take advantage of remote support, so you can get help regardless of where you're located.

Samsung's Galaxy S III smartphone features a 4.8-inch HD Super AMOLED display and includes cutting-edge features such as S Beam for transferring files by simply touching one Galaxy S III to another, built-in virtualization features for secure access to the corporate network, and pre-installed security solutions.

Samsung experts can also help you with applications, whether you're trying to develop them on your own or just deploy existing ones throughout your organization. Samsung offers enterprise application developer training and consulting and support for application migration and enterprise file sync and share. Regardless of what stage you're in with the application development or deployment process, Samsung has a service that will support your needs.

Speaking of deployment, Samsung also helps companies better manage the deployment of mobile devices, which includes provisioning, maintenance, and upgrade support. This process is particularly difficult for larger organizations that have to deploy as many as three devices to potentially hundreds of employees. You can also take advantage of MobileCare, which is a device repair and device warranty solution that helps you protect your device investments.

Samsung also offers solutions to help you manage your devices. In addition to EMM (enterprise mobility management) license support, Samsung supports maintenance, migration, and upgrades for your EMM implementation.

Security is another major pillar of Samsung's approach to business services. Solutions including Mobile Health Check, Mobile Architecture, Security Review, Mobile Security Assessment, and Mobile Security Policy Assessment are all designed to help cover mobile security from every angle. And because Samsung


approaches the issue from both the device angle and the mobile infrastructure angle, you'll know that your devices will be protected and that you'll have a reliable mobile backbone to properly support those devices.

Enterprise Solutions

Samsung offers numerous enterprise-focused solutions that range from mobile management and networking to printing, unified communications, and marketing. These solutions cover a wide range of industries, including health care, education, and hospitality, to name a few, and all of them are designed to be easy to use.

Mobile Solutions

While Samsung's mobile devices are powerful enough for enterprise, the company supports those devices with business solutions. For example, Samsung Mobile Manager is a retail-oriented, real-time analytics solution that takes advantage of Samsung devices and partner software to deliver information to salespeople on the fly. With Mobile Manager, employees can access data and applications via a virtual desktop, and store managers can have access to inventory and sales data wherever they are.

Samsung also offers mobile solutions that are less about specific use cases and more for general protection and security. Samsung Knox is a security platform designed for Android devices that works with containerization technology to protect specific data and applications on a given mobile device. In essence, Knox provides a safe zone on the mobile device where you can securely work on company projects. Knox also offers in-depth monitoring and management tools. With this two-pronged approach, you get security on the device itself plus a means of identifying potential threats to corporate assets available to the device.

Samsung Knox is a mobile security solution that features robust on-device security, including fingerprint scanning, as well as mobility management tools for administrators and secure access to approved mobile business-oriented apps.

Another powerful mobile technology within Samsung's enterprise portfolio is the Solutions Exchange, which provides access to more than 130 applications for use on SAFE (Samsung for Enterprise) smartphones and other devices. These applications are made available through 16 different business categories and help your company achieve a mobile-first mentality, which is vital for success in today's business world.

Networking Solutions

Because Samsung has been extraordinarily successful in the mobile arena, it should perhaps come as no surprise that Samsung also offers a line of WLAN (wireless local-area network) solutions and management tools. The Samsung WLAN solutions are designed to help you build and manage a strong wireless network spanning your entire organization. Samsung WLAN products include access points and controllers, while the Wireless Enterprise WLAN Manager makes your network accessible to users in a safe and secure manner.

With Samsung's WLAN solutions in place, customers get seamless connectivity and a wider coverage area. Employees will not, for example, have to manually change their wireless network connection as they move from one part of a building to another, as the WLAN detects a user's movement from one coverage area to another and automatically connects a given device to the right access point. This is particularly important for employees who use mobile devices on a regular basis, because they can start working on a project in one location while connected to the Internet and move to another without being disconnected.

On the network administration side, Samsung's WLAN solutions are designed especially for easy initial setup and include support for ongoing management. The WLAN Manager user interface, for instance, provides quick access to all of the configuration options required to easily identify and respond to potential issues. This centralized management platform gives administrators greater control over how employees and their devices connect to the network depending on their location. All of this

prevents or reduces downtime and ensures the network performs at the highest level possible.
Samsung points out that its
WLAN solutions have multiple
industry-specific use cases, as
well. Using Samsung WLAN
access points and the WLAN
controller in a hotel, for example, ensures that guests will
be able to access the Internet
wherever they are in the
building, and the network expert onsite can manage those
connections. The same idea
goes for schools as they move
more and more toward using
computers and tablets. You
can place access points in every
room and make sure that every
teacher and student has access
to the resources they need.

Samsung's Videowall displays can be arranged in various configurations in order to make the most spectacular impression.

Telephony, Unified Communications & More

Samsung's telephony and UC (unified communications) services for enterprises include the Samsung WE VoIP application for smartphones and Samsung's other CTI applications, which combine to ensure employees and customers can communicate regardless of where they are. The WE VoIP solution is particularly helpful because employees can use it as they travel to stay in touch with colleagues back at the office or with customers at any location. You can even start a conversation at the office and then transition it to your smartphone as you leave.

In order to use these features, however, you need to have a strong PBX foundation in your company. SCM (Samsung Communication Manager) supports up to 3,000 users and offers enterprises a powerful phone system for use in a UC-focused environment. In addition to the SCM solution, Samsung offers a line of IP (Internet-based) phones that are built to take advantage of Samsung's VoIP features. It's these products that integrate with the mobile WE VoIP application and allow you to collaborate across multiple devices and continue phone calls regardless of location.

Marketing Solutions

Given the company's history in televisions and displays, it makes sense that Samsung would create unique ways of using those technologies to interact with potential customers. This is where Samsung's SSP (SMART Signage Platform) comes in. Perhaps you've noticed those backlit signs in airport concourses or in the shopping mall. Samsung's SSP displays appear similar, but allow people to fully interact with the display, and companies can create custom content for them. Imagine being able to deliver an interactive advertisement to potential customers in high-traffic areas and actually get them to stop and interact with your brand. SSP makes this possible.

You can also take this a step further with Samsung Videowall displays, which enable the display to become part of a room's architecture. The Videowall comprises separate HD displays that can either be used to show different content or to show a larger image spread across them. And because the system involves separate panels, they can be arranged in whatever way makes the most sense for the space, allowing for some truly unique configurations. The content backbone of these large display solutions is MagicInfo, which is a creation platform that helps you make innovative and interactive content for marketing and sales purposes. As a customer, you receive full control over how your SSP display will look and how users will interact with it.

Solutions such as these make it possible to create colorful content that catches the attention of potential customers and draws them in. You might say that concept is close to Samsung's original idea when it first started manufacturing products, including its many lines of TV sets and displays. Given the advances in Samsung's display technologies and the ever-more striking visuals they produce, it's safe to say that Samsung has come a long way from its black-and-white past, and that the company has clearly never forgotten its roots.


Bots & Cybermarketing


DETER & DETECT FRAUD

KEY POINTS

In 2014, 56% of all Web traffic came from bots, and 29% of all traffic came from bad bots. Impersonators, bots that look like real traffic at first glance, accounted for 22% of all traffic.

You might be paying for clicks and ad impressions that come from bot activity, essentially wasting those dollars.

Digital advertising and marketing efforts can be tough to track. You'll want to find a marketer that can provide you with verifiable results.

To determine the actual traffic for visitors to your website, you'll need some hardware that can sort the bad bots from the good ones.


FOR MANY BUSINESSES, online marketing is a cornerstone of the overall marketing strategy. A traditional appeal of digital marketing and advertising is the ability to track the number of clicks and redirects. It's smart, of course, to spend marketing dollars based on results, but the online traffic you're paying for might not be a true representation of who's viewing your message.

A recent study from Incapsula found that bot traffic accounted for up to 56% of all Web traffic in 2014. And while not all bots are bad (we'll get into this later), the Incapsula study indicates that 29% of all website visits were bad bots, which included impersonators (22%), hacking tools (3.5%), scrapers (3%), and spammers (0.5%). Here, we'll examine how bots affect the value of your cybermarketing efforts.

What Is A Bot?
If you're a regular CyberTrend reader, you're likely familiar with bots from their roles in DDoS (distributed denial of service) attacks and other forms of spam. "A bot is a small program tasked with performing a specific function on the Internet in an automated manner," says Chris Rodriguez, network senior industry analyst at Frost & Sullivan. Bots are able to perform tasks repeatedly, quickly, and with predictable results, making them perfect tools for both bad and good guys.
The Incapsula report found that impersonator bots are the most common form of bad bots. Impersonator bots include programs for fake search engines, DDoS attacks, and spyware, all programs aimed to circumvent server, network, and PC security.
Malicious goals of impersonator bots can include downtime, data theft, site/server hijacking, and overall degradation of service. Worst of all, the use of impersonator bots is on the rise. Incapsula reports indicate that impersonator bots accounted for 19% of Web

traffic in 2012, 20.5% in 2013, and 22% in 2014.
As we mentioned earlier, not all bots are malicious. "Some bots are used for legitimate purposes," says Rodriguez. "For example, Google uses bots to crawl the Web and index websites to make their searches more efficient and accurate." Anyone who's done basic research into SEO (search engine optimization) knows that blocking all bot activity on your website is a bad idea. Rodriguez notes, "A business that would block Google Web crawlers would make it hard for customers and partners to find the business online. A searchable website is absolutely critical in this day and age."

Bots' Effect On Marketing
Bot activity can blind companies as to the effectiveness of their social media campaigns and Web presence, says Rodriguez. The false information can also mean that you're not receiving the full value of your advertising dollars. Jared Belsky, president of the digital marketing agency 360i, says, "A recent report by the ANA [Association of National Advertisers] in partnership with White Ops indicates that bots will account for roughly $6.3 billion in unrecoupable ad spend."
Almost unbelievable is that the ANA says more than half of third-party-sourced traffic is fraudulent. Traditionally, third-party sources are used to drive unique visitors to your website, but unless the third party can provide you with stats and analytics to verify the additional traffic, you might not want to rely too heavily on the data. The ANA/White Ops study also indicates that 23% of video ad impressions and 11% of display ad impressions were classified as bot fraud.
Why such high percentages for bot fraud levels? Belsky provides some valuable insight: "Reasons include porous data, long tail inventory, and other black hat tactics that focus on campaigns with soft metrics and goals," he says. "While not all within the display/video space, current concerns seem to be centric to areas where cookies can be gamed, inventory is masked, and where standards are in their infancy." The study found that the marketing areas most likely affected are ad exchanges and video inventory.
The significantly boosted traffic and impression numbers mean you might be paying an awful lot for Web traffic that isn't generating any interest in your business. "The motive for click fraud is that popular websites generally receive more advertising revenue compared to sites with fewer followers," says Rodriguez. That being said, some marketers might exclude costs for clicks that have been identified as invalid. Google AdWords, for instance, displays how many invalid clicks have been filtered from your account.
Complexity is one of the biggest issues when it comes to digital tracking. "Digital advertising presents a whole new set of challenges compared to broadcast, which is protected by the trifecta of inventory scarcity, visibility, and standardized controls," says Belsky, who goes on to note, "Digital has the opposite trifecta in play: near infinite inventory, difficult visibility, and hundreds of different standards for reporting and controls, which have made the space both ripe with opportunity and a breeding ground for fraud to grow unchecked."
The data from the ANA study is backed up by a research report from Solve Media, which found that, on retail-specific sites, 55% of the traffic was suspicious, with 34% confirmed as bot traffic. Solve Media estimates that $4.6 billion of the $11 billion spent on digital advertising in 2014 could have been wasted. Overall, the study found suspicious Web traffic at the end of 2014 was up 36% from the second quarter, and bot traffic increased by 34%.

True Numbers
So how do you go about finding real, human digital traffic? Belsky says, "Marketers invest millions of dollars in advertising, from banner ads to video, websites, and more, but often spend far less on accurate measurement of those assets." Additionally, advertisers need to ensure their media investments are driving real value. Belsky notes, advertisers need to take better care to
WHILE YOU WERE SLEEPING
The ANA (Association of National Advertisers) worked with fraud detection firm White Ops to study bot fraud in digital advertising. One of the more interesting takeaways was that peak activity for bot fraud occurred between midnight and 7:00 a.m. Because of this, it may be wise to concentrate your advertising during waking hours. For example, you might use dayparting as a way of reducing time-of-day fraud spikes.
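Before adopting a dayparting schedule, an analytics team could first measure how much of its click volume actually lands in that overnight window. The Python sketch below is a rough illustration (not part of the ANA study); the log format and campaign name are hypothetical stand-ins for an ad server or analytics export.

```python
from collections import Counter
from datetime import datetime

# Hypothetical click log entries: (timestamp, campaign). In practice these
# would come from an ad server or analytics export.
clicks = [
    ("2015-03-01 01:14:05", "spring_promo"),
    ("2015-03-01 02:12:33", "spring_promo"),
    ("2015-03-01 03:40:22", "spring_promo"),
    ("2015-03-01 09:05:10", "spring_promo"),
    ("2015-03-01 14:30:45", "spring_promo"),
]

def overnight_share(clicks, start_hour=0, end_hour=7):
    """Return the fraction of clicks falling in the high-fraud overnight window."""
    by_hour = Counter(
        datetime.strptime(ts, "%Y-%m-%d %H:%M:%S").hour for ts, _ in clicks
    )
    overnight = sum(n for hour, n in by_hour.items() if start_hour <= hour < end_hour)
    total = sum(by_hour.values())
    return overnight / total if total else 0.0

print(f"{overnight_share(clicks):.0%} of clicks fall between midnight and 7 a.m.")
```

An overnight share far above what real customers could plausibly generate would argue for pausing or discounting those hours.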


make marketers know that their investments are safe, real and tangible, and that their digital dollars aren't going up in fraudulent smoke.

ADWARE
Adware is a type of spyware that can boost clicks by running videos or ads in the background of an infected PC. Adware-enabled activity may not be classified under bot fraud, because adware does not use bots to create the additional traffic. The impressions are still fraudulent, but the adware traffic doesn't typically appear on users' PCs. Instead, it will run in the background. For example, a pop-up may appear that's showing a video ad. The user will close the ad, but the adware will loop it in the background with silenced audio. Because the ad is always hidden, you're seeing no viable impressions from the traffic.

When working with an advertiser, you'll want to make sure that it can verify audience measurement. The advertising industry must provide more proof and better metrics that both the media and the website itself are working, says Belsky. In general, you should not rely on the reputation of a publisher alone as a way to predict the effectiveness of a given campaign.
There are a few key ways to get proof. "Enlist third-party partners including, but not limited to, White Ops, Integral Ad Science, and Double Verify to protect against fraudulent inventory and validate their media buys," Belsky suggests. Third-party fraud detection services can help to substantiate ad and impression numbers, and because bots are capable of hundreds of impressions per second, it's critical to establish the scope of any click fraud.
Another way to validate numbers is with private ad exchanges, "digital marketplaces that allow advertisers to buy and sell ad space across a range of sites vs. working with select publishers" that have fraud mitigation practices in place, says Belsky. By connecting with direct sources, you won't have to deal with third-party ad networks. Private exchanges can utilize a variety of fraud mitigation practices, such as active black and white lists, as well as lists of known bot/spider publishers.
Finally, Belsky advises, "Take measurement beyond just display, and across other channels including email, social, and mobile to protect your advertising investments." A variety of online services are available to detect fraud. "With search campaigns, Google can catch fraud as it happens via the AdWords interface," says Belsky. You'll also want to make certain the marketer has tools to identify and police fraud after it happens.

Website Traffic
When it comes to figuring out the actual amount of traffic for your business website, you'll need some hardware to sort the bad bots from the good ones. "Some basic bot detection capabilities are included in many endpoint security suites today that look for bot signatures," says Rodriguez. Otherwise, he adds, finding and blocking bot activities usually starts with identifying communications and control channels. A bot without instructions or communications capabilities is nothing.
Your business might already have some IPS (intrusion prevention system) devices, including NGFWs (next-generation firewalls) and Web application firewalls, that can prevent malicious bots from doing damage. "More advanced solutions utilize different methods to identify signs of infection and bot activity," Rodriguez says. There are also services available that will provide you with multiple ways to detect and block bots, particularly if you are concerned about such things as DDoS attacks.
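As a simple illustration of the sorting Rodriguez describes, the Python sketch below triages visitors using two signals that detection tools commonly combine: a list of known-crawler user-agent strings and an inhuman request rate. The thresholds and user-agent substrings here are hypothetical, and because impersonator bots fake crawler strings, production tools also verify crawlers through steps such as reverse-DNS lookups.

```python
import re

# Substrings identifying well-known legitimate crawlers (hypothetical short
# list; impersonators fake these, so real tools confirm them independently).
GOOD_BOT_UAS = ("Googlebot", "Bingbot", "Slurp")

# Obvious scripted clients that rarely represent human visitors.
SCRIPTED_UA = re.compile(r"curl|wget|python|scrapy", re.IGNORECASE)

def classify_visitor(user_agent, requests_per_minute):
    """Roughly triage one visitor from its UA string and observed request rate."""
    if any(bot in user_agent for bot in GOOD_BOT_UAS):
        return "good-bot"      # known crawler, pending verification
    if SCRIPTED_UA.search(user_agent) or requests_per_minute > 60:
        return "suspect-bot"   # scripted client or inhuman request rate
    return "likely-human"

print(classify_visitor("Mozilla/5.0 (compatible; Googlebot/2.1)", 30))
print(classify_visitor("python-requests/2.5.1", 5))
print(classify_visitor("Mozilla/5.0 (Windows NT 6.1) Chrome/40.0", 2))
```

Subtracting the two bot buckets from raw visit counts gives a closer estimate of the human traffic your marketing spend is actually reaching.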

Beat The Bots
There are many different ways for bots to disguise traffic and ad impressions. To make sure you're getting the most out of your advertising dollars, you'll need to work with a marketer that can help identify and resolve ad fraud. Belsky advises, "In 2015, make a resolution to beat the bots by becoming more vigilant and cutting off fraud in the planning cycle, by investing in media that is quality, verifiable, and productive." Feel free to announce your intent to conduct audits of supply chain partners, but it's also a good idea to monitor covertly, as bot traffickers might drop bot use if they become aware of active audits. With a dual-monitoring strategy, you can get a better sense of the digital marketing and advertising impact.


Make Cloud Storage Pay Off

ENSURE THE POSSIBLE BENEFITS MATCH THE ORGANIZATION'S NEEDS

KEY POINTS
Adopting cloud-based storage can mean immediate cost savings, but many companies experience unforeseen long-term complexities and costs.
Rather than for cost savings, organizations are increasingly adopting cloud storage for speed and agility reasons.
To see a long-term cloud storage payoff, begin by pinpointing what the organization specifically hopes to achieve from cloud storage.
SLAs (service-level agreements), the cloud storage model, and migrating to another provider can all generate unexpected costs.


FOR YEARS, A NOTION HAS existed that going the cheap and easy route with cloud-based storage was a no-brainer for an organization because it could see immediate and sometimes significant cost savings. While this was often true in the short term, many organizations found that over the long term they experienced complications and costs they didn't see coming.
"Using cloud storage should not be about cost," says Ashar Baig, Analyst Connection president, principal analyst, and advisor. "It's not a definite that if you go to cloud you're going to save money. That should be the No. 1 rationale of what you should not do when looking at cloud storage."
Today, in fact, many IT departments insist they can deliver internal storage at an equal or cheaper price than a cloud provider. Even so, there are still several viable reasons for using cloud-based storage, even if those reasons are not necessarily tied to cost savings. What's key is knowing what an organization should ask of itself and of potential cloud storage providers to determine how the organization can benefit and ensure it gets what it requires from a provider. The following explores these issues and others in order to forge a cloud storage strategy that makes getting a payoff from cloud storage possible.

Cheap & Easy
Increasingly, organizations are realizing that cloud storage, and cloud computing in general, are about more than just achieving cost reductions. As Henry Baltazar, Forrester Research senior analyst, puts it, the real power of the cloud is elasticity, or the ability to scale up and scale down resources on demand. So, even though IT might provision internal storage more affordably than a cloud provider, it generally can't do so with near the quickness or ease. Further, provisioning storage in-house means the organization is stuck with that infrastructure and associated resources potentially for years, possibly long beyond when it's really needed. "This makes cloud attractive for bursty or seasonal workloads, for test/development environments," Baltazar says.
Charles King, Pund-IT president, says that with IT in general, a short-term, cheap and easy approach usually leads to long-term costs and complexity. The cloud is no different and, in fact, can lead to further complications because data and processes occur beyond direct oversight, he says. Elsewhere, King says, generational issues pose a constant challenge for IT in terms of how management and staff turnover impact efficiency and a full understanding of assets and processes. "I believe that the transparency or abstraction of assets held in the cloud can and often will lead to a sort of out of sight, out of mind attitude among IT workers that will be painful for many organizations," he says.
Baig estimates he speaks with more than 200 CIOs annually. Where cloud storage is specifically concerned, he says, the perspective you get from them is totally different from any forecast or survey associated with IT and IT directors who don't have visibility into the big picture.
Specifically, Baig says, while it's possible for IT to deliver on-premises storage below the cost of cloud if done efficiently, cost benefits aren't the primary goal when CIOs choose cloud storage. "It's always, always the agility." That's because hardware procurement-to-deployment times can run into months, he says. Conversely, procuring needed storage via the cloud can literally take minutes. "I live and breathe cloud storage every day," Baig says. "To me, that's the biggest misconception: that cloud is all about economics. Cloud is actually all about agility."

The Payoff
While some organizations do use cloud providers for primary storage needs, various data and experts point to cloud storage as more commonly being a good option for disaster recovery, backup, and archiving purposes. In other words, cloud storage is good for data that needs to be accessed easily but infrequently, if at all. King says cloud is also seen as offering the means to efficiently and cost-effectively address continuing storage growth and complexity, especially for ongoing problems such as backing up employee PCs and mobile devices.
Ensuring that an organization sees a long-term payoff from cloud storage can depend on several things, including the use case in question. "For archiving, for example, it makes sense to analyze how often the company will need to access archived data because there are bandwidth charges and longer access times for retrieving content from a cloud," Baltazar says. If looking at primary storage use cases, such as databases and other transaction-sensitive applications, expect to be charged differently for high-performance resources.
King says that primarily, organizations should understand why they're contemplating cloud storage in the first place. This includes asking questions about data and/or processes, prices, and potential benefits. You should not proceed if you have any unanswered questions or doubts about those points, he says. Mapping these points against potential cloud storage providers and services should determine whether to proceed. King says such an approach is especially wise with regard to costs, because some cloud storage fees aren't always entirely clear. For example, some cloud storage archiving services offer low costs to upload data but charge considerably more to retrieve it, he says.
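King's point about unclear fees is easy to model. The Python sketch below compares one year of two hypothetical storage tiers; the prices are purely illustrative, not any provider's actual rates: a cheap archive tier with expensive retrieval versus a pricier standard tier with cheap retrieval.

```python
def annual_cost(stored_gb, retrieved_gb_per_year,
                store_per_gb_month, retrieve_per_gb):
    """Yearly bill for one tier under flat, illustrative pricing."""
    return (stored_gb * store_per_gb_month * 12
            + retrieved_gb_per_year * retrieve_per_gb)

STORED_GB = 10_000     # 10TB kept in the cloud
RETRIEVED_GB = 50_000  # heavy, repeated retrieval over the year

# Cheap to hold but costly to read back, vs. the reverse.
archive = annual_cost(STORED_GB, RETRIEVED_GB,
                      store_per_gb_month=0.01, retrieve_per_gb=0.09)
standard = annual_cost(STORED_GB, RETRIEVED_GB,
                       store_per_gb_month=0.03, retrieve_per_gb=0.01)

print(f"archive tier:  ${archive:,.2f} per year")
print(f"standard tier: ${standard:,.2f} per year")
```

With frequent access, the "cheap" archive tier comes out more expensive under these assumed rates ($5,700 vs. $4,100 per year), which is exactly why analyzing access frequency before picking a tier matters.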
Baltazar emphasizes that executives must ask stakeholders what they're specifically looking for with cloud storage, be it performance, obtaining new abilities, etc. It doesn't make sense to launch a cloud program without knowing what stakeholders actually want, he says.
Among the finer details that a company should verify in a potential provider are the type of SLAs (service-level agreements) it offers and its reputation for reliability, because outages are likely to occur occasionally. Here, Baltazar says, as cloud strategies evolve and organizations use multiple clouds for their workloads, it will become easier to compensate for a cloud outage.


Security is another factor that can differentiate one provider from another. "Many companies never even consider the cloud for storing critical business data due to security concerns," King says. While most objections against cloud storage continue to focus on security and compliance concerns, Baltazar says, these are gradually declining. Similarly, Baig says security is improving every year, but if security is a paramount concern, cloud storage may not be the right choice. Additionally, he says, the cloud might not be a good choice for companies with strong privacy and compliance concerns due to the often multi-tenant nature of the cloud.
Other issues companies should check out include the possibility of being locked in with a provider, because migrating to another provider's cloud is generally difficult. To this, King says it's wise at the outset to consider what complexity and costs will be involved in ending a relationship with a vendor. The primary point in all this is to limit or eliminate the possibility of nasty surprises, he says.
Overall, Baig says, the nature of the applications the organization is planning for cloud storage (whether they're bursting, consistent, etc.) and the organization's internal capabilities must come into play to choose the right provider. "What capabilities are you looking for? Static storage? On demand? Is the storage for real-time applications? How often are you going to access the data?" he says. Baig also notes that cloud storage isn't one offering. A major cloud service provider, for example, might offer scores of storage options, each with different pricing (dedicated, reserved, bidding, etc.) and SLAs associated with them (the shorter the SLA, the more you'll pay), Baig says.

Control Costs
A notable problem with cloud storage providers is that people don't realize there are so many complex pricing models out there, Baig says. "Everything from reserved instances to even bidding on cloud storage on platforms." While various material is available to help explain how to intelligently buy cloud storage and the different models, there can still be considerable complexity involved. "It's not plain or black and white that you're necessarily going to save money," he says.
Unforeseen costs can be a major issue with cloud storage. These can include costs related to bandwidth use and migrating data to another cloud. Rogue IT is another example. Baig says an average company may use seven or eight different cloud storage options, with IT typically having no knowledge of who is using what. "The biggest challenge every CIO that I talk to has is the control of rogue IT," he says. "They say, 'We're trying to mandate to every department in the company that, yes, cloud storage is great. As long as you have the boss's credit card you can get cloud storage, but you have to go through us so we can get a proper ROI analysis.'"
If this doesn't occur, however, employees lacking proper technical knowledge may select very expensive storage options. Even if they actually need that option at the time, they may not opt for a less expensive storage model when that need no longer exists. Choosing the right model isn't simple, and those wrong decisions can cost you a lot of money down the road, Baig says.
In terms of determining how cloud storage needs will change in the future, and how this will influence choices, there are a couple of factors to keep in mind. For example, an organization might transfer a large chunk of data to the provider initially in a backup or archive scenario. This will probably entail using a different transfer method (hard disks, for example) and pricing than what's required later when just making changes to data.
King says the first key consideration is knowing that, on average, organizations are doubling the amount of storage they use every three to four years. At the same time, IT departments should have records, or at least a sense, of how much storage is being consumed and added annually, he says. Factors such as what processes and data the organization is targeting for cloud support can help refine these estimates.
Another key is knowing that IT shouldn't be solely responsible for anticipating future application or data usage. "This must come from LOB (line of business)," Baig says. A marketing department, for example, will need one cloud model early in a campaign, when it will be frequently accessing data, but a different model when the campaign is over and it will access data less often. "You have to have the right stakeholders or application owners as part of the decision-making process when you're setting SLAs and choosing different cloud storage options. You have to build a plan," Baig explains. "When regulations are in play, companies have their own compliance and mandates to conform to, and all those become the feedback into the data-retention period."


Wireless Connectivity Everywhere

INNOVATIVE METHODS COMPANIES ARE EXPLORING TO SPREAD INTERNET ACCESS

ALTHOUGH IT MAY SEEM SO, wireless connectivity isn't ubiquitous yet. For millions living in rural and underdeveloped parts of the world, wireless (or wired) Internet connectivity remains a "one day soon" notion. Scores of technology companies, however, are working to spread wireless to as many locations as possible, using such approaches as making every customer's router a public Wi-Fi hotspot. More innovative approaches seek to employ drones and balloons. The following explores such efforts.

Air & Ocean
Rapidly accessing a wireless connection while afloat is becoming more affordable and comparable to land-based connections in terms of reliability and reduced latency. One major cruise line, for example, detailed in late 2014 the first hybrid wireless network at sea. Set for inclusion in all the company's ships, the network combines land-based antennae lining cruise routes with advanced satellites to offer connectivity


touted as being approximately 10 times faster than previously available.
Another major cruise line, meanwhile, launched an effort in 2012 to bring faster Internet speeds by partnering with global satellite provider O3b Networks. In March 2014, testing of a custom-built antenna installed on a ship used with the O3b Maritime system reportedly matched land-based broadband-like connections. The system, which pairs medium-Earth-orbit satellites with ship-based antenna arrays, reportedly makes it feasible to download streaming video and post video clips from onboard. Further, capacity is said to have exceeded 500Mbps in testing, more than enough bandwidth to go around for passengers.
In-flight Wi-Fi service via ground-to-air and satellite-based systems, meanwhile, has become increasingly available globally. Reuters reports that 40% of U.S. jetliners now have Wi-Fi service. Worldwide, the number of commercial aircraft with Wi-Fi service, cellular service, or both is expected to more than triple in the next decade to span 14,000 planes.

Among the more compelling developments in this sector is the testing that in-flight Wi-Fi provider Gogo is conducting involving a satellite-based Wi-Fi system that reportedly will enable connections reaching 70Mbps. That's described as enough to download a 2-hour HD movie in roughly four minutes, though real-world results will likely be slower as passengers share bandwidth.
Arguably even more compelling are Google and Facebook projects involving satellites, drones, and balloons. While some naysayers deem the companies' efforts risky due to potential regulatory challenges and costs reaching into the billions, for both companies there's the promise of adding millions of new users to their customer bases.
By most estimates, roughly half the world's population remains offline, sometimes due to difficulties and costs associated with building traditional network infrastructure in rugged and underdeveloped regions. "There are definitely large percentages of the world's population, mainly in emerging economies, that aren't served by some type of Internet connection, whether it's broadband or narrowband," says Chris DePuy, vice president at the Dell'Oro Group. There are also altruistic reasons for spreading wireless connectivity to such areas. Doing so would likely foster education, produce jobs, raise standards of living, and improve economies.
DePuy believes that ultimately, major technology companies and content providers desire to get into the local Internet access business in order to get some leverage on the other last-mile providers and Internet pipe companies. Some broadband providers are acting similarly, he says. Getting in on the act wirelessly would be more cost-effective than rolling out high-speed broadband lines. "The underlying driver for these alternatives to wired connections (satellites, drones, balloons, etc.) is they're cheaper," DePuy says.

The Projects
Whatever the motivation, Google's and Facebook's projects are intriguing. Google and Fidelity, for example, recently provided $1 billion in funding to SpaceX, the rocket and spacecraft company led by CEO and CTO Elon Musk. The result for each company was roughly a 10% stake in SpaceX, which reportedly will use the funds to assist in spreading Internet access globally via 4,000 low-Earth-orbit satellites.
Previously, Google had launched its own wireless connectivity projects, including a satellite effort that took a hit when Greg Wyler departed his leadership role to start OneWeb. That company is seeking to launch roughly 650 low-orbit, lightweight satellites to provide wireless access. OneWeb counts Virgin and Qualcomm as backers. Unlike SpaceX, OneWeb already owns rights to radio spectrum for transmitting data.
Outside of satellites, Google also has Project Loon, a project that aims to spread wireless connectivity via high-altitude balloons floating in the stratosphere, or twice as high as airplanes and the weather. Via partnerships with telecom providers, the project would enable users to connect to the network directly from 4G LTE (Long-Term Evolution)-enabled devices, with signals essentially traveling in a mesh network style among balloons before traveling back to the global Internet on Earth. Reportedly, Google has worked with providers in several countries on testing. Balloons are said to stay afloat 100-plus days and provide 3G-like connectivity.
Facebook, meanwhile, announced its Connectivity Lab last year, in which dozens of experts are working to provide wireless connectivity via high-altitude long-endurance planes, satellites, and lasers. The lab is part of a larger Internet.org mission that Facebook CEO Mark Zuckerberg announced in 2013 to bring Internet access to those lacking it.
Notably, Facebook has acquired a company with expertise in creating solar-powered drones, one example of the innovative ways companies are exploring to spread wireless Internet to areas that lack connectivity. Ultimately, the aim is to launch remotely piloted drones able to stay in flight for years. With wingspans comparable to a 747's, the drones would fly at about 65,000 feet and use mesh networking, Wi-Fi, land-based antennae, and low-orbit satellites with lasers to send transmissions. A white paper on the topic states a drone flying at such a height could beam a signal powerful enough to cover a city-sized area with a midsize population. Facebook is also considering deploying low-Earth-orbit and geosynchronous-Earth-orbit satellites to compensate for the limitations of drones.

Doubts & Alternatives
Joe Hoffman, ABI Research practice director for mobile networks, believes that in the end, these are all technologically feasible. "Even 10,000 balloons is not a major tracking issue. Airlines, ships, and governments/communities are potential market segments," he says. One major challenge, Hoffman says, is whether there's enough revenue "at the bottom of the pyramid to support this sort of thing."
Hoffman does see Project Loon as possessing an appealing and elegant simplicity. Beyond launch costs (two people and a truck) being minimal compared to satellites, balloons lack the extra launch weight associated with powered flight, he says. A possible downside is that with potentially thousands of balloons in flight, a take-down facility might be required should a rogue balloon drift into commercial airline traffic lanes, he says.
DePuy is somewhat skeptical of drones' and balloons' potential. "The problem is that they move," he says. Specifically, he says, keeping a drone in the sky requires power, and a balloon moves away. "So you need to continue to launch balloons, and you need to continue to fuel drones, and that to me sounds much more expensive than popping a device on some higher-elevation system like a flag pole, water tower, hill, or building in a town," he says.
Examples of such devices include low-cost, modified wireless LAN chips, now available, that provide broadband-like wireless access, DePuy says. "Mom and pop organizations are using the technology in emerging markets and outside city areas to accomplish what I think these satellites, balloons, and drones are going to do in the future, but it's happening now," he says.


Mobile Platforms In The Enterprise


DEVICE & OS VARIETY ADD COMPLEXITY TO MANAGEMENT & SECURITY

MOBILE DEVICES are much more than sidekicks to computers. In fact, in some organizations, tablets and hybrids are overtaking desktops and laptops as the primary computing platforms. While this is going on, smartphones are continuing to get more powerful and varied in terms of form factors, features, and operating systems. In other words, the mobile landscape within organizations is more varied and complex than ever before, which makes it that much harder for IT to manage and secure these devices. That's why it's so important for IT teams to not only recognize this growing diversity, but also to take steps to change how they manage devices and design their networks.

Variety Means Complexity

Traditionally, IT teams have only had to deal with one or two operating systems covering a vast array of devices. From desktops and laptops to servers, almost every computer or piece of equipment ran some form of Windows that IT was comfortable with,
so it didn't matter as much what company manufactured the product. Now, there are so many different operating systems to consider, including iOS and Android, as well as proprietary variants, that IT teams often struggle to keep up.

Stacy K. Crook, research director at IDC, points out that iOS is relatively easy to manage and control because it's only used on one manufacturer's selection of products. "But when it comes to Android, not only do you have multiple form factors and operating systems out there, but then you have different versions of the operating system by manufacturer, because Android is a customizable, open-source OS." Add to this the fact that some companies still use BlackBerry and Windows-based mobile devices and it's easy to see why the job of IT teams is much more complex than it once was.

Problems With BYOD

As if operating system variety weren't enough to contend with, companies that offer BYOD (bring your own device) policies are making it even harder for IT teams to properly manage and secure mobile devices. "Businesses all liked that initial idea of 'great, we don't have to buy phones for our employees or pay for their plans,' without thinking about all of the complexity that came with it and the fact that yes, now you're dealing with a million different devices you don't know anything about," says Jim Rapoza, senior research analyst with the Aberdeen Group. And because with BYOD the devices are wholly consumer-owned, it can be difficult for IT to convince employees to put "some heavy MDM [mobile device management]" on their personal device that has remote wipe and potential remote snooping capabilities, he adds.

For this reason, Rapoza says many companies are considering going back to the traditional company-issued model, where the organization pays for and owns all of the devices and therefore can install whatever level of security it sees fit. But companies should also think about going with a CYOD (choose your own device) policy that helps eliminate some of the platform variety. "What you're seeing now is a lot of businesses looking at CYOD, where employees can have their own device as long as it's on this list of preapproved devices," Rapoza says. This gives employees the opportunity to use their own devices for business-related tasks and makes it so IT only has to work with the mobile devices it can comfortably handle.

Security Considerations

As we mentioned before, due to their open-source nature, some operating systems are more vulnerable to security threats than others, but that doesn't mean that any one OS is 100% out of harm's way. After all, if a device is jailbroken, or modified by the user to add otherwise unavailable functionality to the device, then it doesn't matter what OS it's running.

Even the newest version of iOS could cause potential issues with the amount of inter-application data sharing it allows. "One of the big things with iOS 8 is that it enables a greater level of data sharing between applications," says Crook. "From a consumer point of view and a usability point of view, that's great. But from a security point of view, that's problematic because I'm making it even easier for data to escape its safe parameters."
To help overcome these issues, companies are either using fully featured enterprise mobility management solutions or going with more granular security approaches. For example, Crook says, some companies are using VPN (virtual private network) solutions that enable you to pinpoint specific corporate applications on a device and permit only those applications to access internal company resources. Companies can also use containerization to completely separate corporate and consumer applications and make sure no sensitive data is shared between them.
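The per-app VPN approach described above boils down to an allowlist check: only designated corporate apps may reach internal hosts through the tunnel, while everything else bypasses it. A minimal sketch of that policy, with app IDs, the corporate domain, and the function name all invented for illustration (real EMM products enforce this inside the OS networking stack):

```python
# Hypothetical per-app VPN policy: only allowlisted corporate apps
# may reach internal resources; all other traffic bypasses the VPN.
CORPORATE_APPS = {"com.example.mail", "com.example.crm"}

def route_for(app_id: str, destination: str) -> str:
    """Decide how to route a connection from a given app."""
    internal = destination.endswith(".corp.example.com")
    if internal:
        # Internal hosts are reachable only by allowlisted apps.
        return "vpn" if app_id in CORPORATE_APPS else "blocked"
    # Personal apps and public destinations skip the VPN entirely.
    return "direct"
```

Containerization takes the same separation a step further: instead of routing decisions per app, corporate and personal apps live in isolated workspaces that cannot exchange data at all.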

Network Issues

In addition to affecting IT's security strategy, the sheer number of mobile devices out there is making it harder to offer safe and reliable wireless coverage to all employees. Rapoza says users can sometimes struggle just to connect their new device to the network in the first place, which often causes them to go to a guest network or use 4G and sidestep the main corporate network altogether. And there are some situations where a company's campus has spotty network connectivity in certain areas or the system forces employees to disconnect from one network and connect to another as they move around.

"Get educated about how mobile security differs in this new world of mobile devices. It's moving very quickly and it's very hard to keep up with all of these operating systems and vulnerabilities, so staying educated about all of that and investing in a solution that can help you with that is important, because trying to do it on your own is going to eat up most of your day. I would invest in a solution that's going to extract some of that complexity."
STACY K. CROOK
Research Director : IDC
To help solve these issues, Rapoza says, IT needs to design its wireless networks with mobility in mind. For example, BYOD onboarding is one kind of tool that organizations are using to obtain a greater amount of information about any device that's connecting to the network. Additionally, this type of tool automatically determines the security clearance level a given device has within the organization. This makes the authentication process faster and practically invisible to the end user. Companies can also incorporate geolocation-based functions into their networks to track mobile devices regardless of their location on the campus, know when a device is off-campus, and then disconnect it from the network.

Rapoza adds that some businesses are in the process of implementing virtual LAN and SDN (software-defined networking) solutions as well. With approaches such as these, employees essentially have their own individual networks and can't actually see any others. This makes it much easier to segment employees into different groups and control what parts of the network they are allowed to access.
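The segmentation idea above can be sketched as a simple mapping from employee group to VLAN, with each VLAN exposing only its own resources. The group names, VLAN IDs, and resource labels here are invented for illustration; in practice a RADIUS server or SDN controller assigns these attributes during authentication:

```python
# Hypothetical group-to-VLAN assignments; a RADIUS or SDN
# controller would return these attributes at login time.
GROUP_VLANS = {"engineering": 110, "finance": 120, "guest": 200}

# Resources visible per VLAN; devices on one VLAN cannot see
# resources assigned to any other segment.
VLAN_RESOURCES = {
    110: {"build-server", "code-repo"},
    120: {"erp-system"},
    200: {"internet-only"},
}

def resources_for(group: str) -> set:
    """Return the set of resources a member of this group can reach."""
    vlan = GROUP_VLANS.get(group, 200)  # unknown users land on the guest VLAN
    return VLAN_RESOURCES[vlan]
```

The design payoff is that access control becomes a property of the network segment rather than of each individual device, which scales much better as device counts grow.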

The User Experience

Crook warns that there is a delicate balancing act that needs to happen between security and usability because, typically, the more secure something is, the less usable it tends to be. One way companies strike this balance, according to Crook, is to either add custom security capabilities to native applications or build entirely new applications for employees to use that are even better than the native alternatives. This makes it so companies can fine-tune the user experience of each individual application while still maintaining the necessary security and management functions.

Crook adds that some companies use EMM (enterprise mobility management) solutions to improve the user experience through more automation. "Single sign-on is another capability that's end user-focused to make the end user's life easier so they can sign into one app and not have to sign into every single other app, which, in a mobile context, can be frustrating," she says. Biometrics-based systems allow users to sign into devices and applications with a fingerprint scan rather than having to manage dozens of passwords.
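The single sign-on flow Crook mentions reduces to one idea: authenticate once, receive a token, and let every participating app trust that token instead of prompting again. This is only a toy sketch with invented names and no real credential checking; production SSO relies on standards such as SAML or OpenID Connect:

```python
import secrets

# Toy single sign-on: one login issues a session token that every
# participating app accepts, so the user authenticates only once.
_sessions = {}

def sign_in(user: str) -> str:
    """One interactive login (password or fingerprint) yields a token."""
    # A real identity provider would verify credentials here.
    token = secrets.token_hex(16)
    _sessions[token] = user
    return token

def app_accepts(token: str) -> bool:
    """Each app checks the shared token instead of prompting again."""
    return token in _sessions
```

Because every app consults the same session store, the user signs in once and the mail app, CRM app, and any other enrolled app all honor that single authentication.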
On the networking side, Rapoza once again stresses the importance of changing the way you design your network altogether. "If you're going to build your network to mainly be good for laptops and desktops and [you] don't care about how well mobile connects, you're going to be in big trouble," he says. "If it's within your policy, whether you have full BYOD or are using CYOD, you want the users to be able to log in easily, stay connected, and have a good experience, because any time a user can't get to a key business app or can't connect to a client, you're losing productivity and potentially losing clients."


See The Big Picture


HOW DATA VISUALIZATION TOOLS ARE IMPACTING BI & ANALYTICS

KEY POINTS

It is believed that people can find patterns and trends more easily in visually presented data than they can in number-filled reports.

Some BI (business intelligence) and analytics solutions include tools that are capable of automatically recommending the best visualization type for a given set of data.

Many experts believe adopting data visualization tools is no longer an optional frill, but rather a necessity in order to remain competitive.

Today's data visualization tools aren't perfect. Many are wary of flaws such as lack of governance and data management capabilities, so watch for positive developments in this area.


IT'S GENERALLY ACCEPTED that humans process data faster when the data is presented visually vs. textually. "People are just naturally better at recognizing patterns, trends, groupings, and exceptions in a visual format as opposed to rows and columns of numbers," says Nik Rouda, Enterprise Strategy Group senior analyst.

Now, consider the massive amounts of big data enterprises are accumulating and trying to extract useful bits from. It makes sense that companies are increasingly eyeing the DV (data visualization) tools that BI (business intelligence) and analytics vendors are offering. Touted for making it easier to recognize trends, correlations, and patterns residing in huge amounts of divergent information, DV tools are also enabling business users to interact with what they find and communicate its relevance more easily.

Boris Evelson, Forrester Research vice president and principal analyst, says DV is already a key part of most leading BI and analytics platforms. Forrester views DV as an integral part of agile, self-service BI, he adds. Similarly, Cindi Howson, Gartner research vice president, says DV is essential both now and tomorrow. "I don't consider it optional, and for organizations that lack this module in their toolkit, they'll be at a competitive disadvantage," she says.

Visualizing Data

Rouda describes visualization in general as an approach that assists in understanding data and models more readily, often at a glance. Though not all analytics tools possess visualization abilities and not all data lends itself to visualization, DV is considered important to sharing insights after analysis, he says.

Howson draws a distinction between a simple visualization and what the BI market segment of visual data discovery entails. Generally, she says, when data is presented graphically, we retain that information better than when it's presented as a dense page of numbers. Sometimes, she says, you need the precise number, and sometimes you want more of the pattern, the trend. Compared to a page filled with numbers, visualizations can reveal patterns and trends more effectively, and thus speed up the time to insight.

PC-Doctor, which makes computer diagnostics software used by leading PC manufacturers worldwide, is a company that uses visualization tools to help its customers (including consumers, OEMs, and support professionals) make rapid decisions based on large amounts of data. "As a diagnostic tool installed on millions of computers, PC-Doctor is in a unique position to gather failure data and system information in near real time, and our enhanced data visualization tools allow analysis of hardware failure rates," says Bob Zaretsky, director of business development for PC-Doctor. "Such insight helps OEM customers make informed decisions when purchasing components like memory and hard drives, maximizing system reliability and minimizing support costs."

New Tools Offer More Options

Visualizations have existed for many years, but vendors' tools are adding a spin, such as by recommending the best visualization to use for the data at hand. "If your data, for example, has ZIP codes, data may be rendered as a map immediately, rather than as an afterthought," Howson says. Tools are enabling users to obtain new data sources with less rigid, formal IT processes, thus providing businesses more agility, she says.
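The recommendation behavior Howson describes can be approximated with simple heuristics over the column types in a data set. The type labels below are invented for illustration, and commercial tools apply far richer rules, but a sketch might look like this:

```python
def recommend_chart(column_types):
    """Suggest a visualization from the kinds of columns present.

    column_types: list of labels such as "zip_code", "datetime",
    "category", or "number" (hypothetical labels for illustration).
    """
    if "zip_code" in column_types:
        return "map"         # geographic data renders as a map first
    if "datetime" in column_types:
        return "line chart"  # time series suggest a trend line
    if "category" in column_types and "number" in column_types:
        return "bar chart"   # measures per category fit bars
    return "table"           # fall back to plain rows and columns
```

The point of such defaults is that the user sees a sensible chart immediately and only overrides it when a different view suits the question better.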
Heat maps, fever charts, dials, and geographic maps are examples of DV. Features can include animation, interactive abilities to dig deeper into data, collaboration tools, real-time data updating, and linking to related graphs. Jeff Cotrupe, global program director at Stratecast | Frost & Sullivan, says DV helps users who aren't data scientists quickly form queries, see results from those queries, and run what-if analyses. "It also enables people to use their natural ability to see patterns, identify trends, and gain insights from data without requiring complex queries, wizards, or scripts to do it," Cotrupe says. A related benefit of DV is speed, as such solutions enable users to react quickly to large amounts of data.
For example, PC-Doctor's ISH (Instant System History) tool provides instant access to PC users' system history data, allowing support professionals to make prompt decisions and view trends over time. "PC-Doctor uses real-time data collection and reporting to quickly alert customers to increased failure events, system crashes, and blue screens," says Zaretsky. "In a recent case, a corrupted firmware update caused a spike in SSD [solid-state drive] failures. PC-Doctor identified the problem and sent a targeted message to those with affected systems via IMS (Intelligent Messaging Service), a cloud-based service that delivers timely, contextual messaging to end users directly from the OEM. This rapid and effective action saved countless dollars in support and warranty costs, and reduced downtime for the end users."

Building On BI's Benefits

DV has become a welcome addition to many existing BI solutions, as well. Daniel Ko, Info-Tech Research Group senior consulting analyst, says that while BI provides a window to underlying data, it has shortcomings. BI can offer reporting, dashboards, and alerts, for example, but lacks an ability to present data in a consumable and digestible format, he says. DV is designed for this, and many organizations use DV tools to complement existing analytics and BI solutions that serve users, especially those who appreciate interactive data, Ko says.

Brad Shimmin, service director for business technology and software at Current Analysis, sees lightweight data discovery and DV tools (which go hand in hand) as having a major impact on BI and analytics solutions. Such tools help validate investments companies make in BI/analytics solutions by helping modernize them, he says. "I think they're in some ways pushing and pulling those solutions forward," Shimmin says. "Today, BI systems are being modernized for the cloud and mobile devices and employing (either by borrowing or evolving into) these lightweight DV and discovery tools as the driving user experience."

Shimmin sees these lightweight tools as a response to how difficult traditional BI solutions had become in terms of accessibility, getting answers, and determining which questions to even ask, as well as to the sudden influx of data everyone has affectionately called big data. "All of these factors converged to form this small cadre of software vendors finding a terrific home in the enterprise selling these DV and discovery tools and thus helping enterprises use the data and investments they already made," he says.

The DV components of PC-Doctor's software have assisted OEMs and repair professionals in dealing with device failures, application crashes, and similar circumstances. "Traditionally, data collection focused on end-user systems," says Zaretsky. "Recent improvements have enabled collection in factory and repair centers, allowing for the tracking of systems through the full PC life cycle. PC-Doctor customers use this information to identify factories with higher success rates, providing better focus on improvements where they are needed."

Visualization Advances

Ko goes as far as viewing DV as almost an industrial revolution in a BI context. "Users must understand and make sense of data before they can take action on it," he says. "Data visualization is plugging that gap by allowing users to consume data. Visualization widgets, like maps and heat maps, are so intuitive, they allow average business users to get something from the data."

Increasingly, DV is associated with self-service analytics and assisting employees who are business-savvy but not necessarily tech-savvy to dive into big data and drive decisions. Ko says DV and self-service analytics go hand in hand. "Users can interact with the data and make sense from it," he says. Traditional reporting and BI push

data to end users, whereas in data visualization users pull insights by interacting with the data. Pushing equates to teaching, while pulling equates to self-learning, he says. Both approaches can help disseminate knowledge, but pulling reinforces it whereas people tend to forget what they've learned if it's pushed, Ko says.

Cotrupe stresses that while DV is key to business users for such reasons, it's good for all users. "IT specialists and the data science team may be proficient in getting information from systems, even if forced to use the most cryptic and non-intuitive mechanisms to do so, but why make them do it?" he says. "Why not help them get above system logistics so they can focus on more strategic issues?"

The visualization component of PC-Doctor's Instant System History tool lets technicians quickly view a system's history of diagnostic failures, operating system crashes, application crashes, and more.

Changing Times

DV is undoubtedly impacting organizations' interest in BI and analytics solutions. One reason for this, Rouda says, is that presenting reports and data visually readily enables a more engaging response for discussion. While Rouda says it's unclear whether DV is actually directly increasing sales of BI/analytics solutions, he does consider DV a necessary part of any end user analytics interface now.

Ko says Info-Tech has spotted traditional BI/analytics vendors moving toward DV in two ways: either by introducing DV products or modules in their portfolios, or by injecting DV capabilities into existing products. Incorporating DV capabilities, in fact, has been a lifesaver for some vendors that were losing market share due to reports that their BI/analytics products were not easy or intuitive to use.

Similar to enterprises' interest in DV changing, DV itself is changing, including by becoming more automated. Rouda says better DV tools can recognize what types of data and models lend themselves better to various formats and display as such by default. They can also resize and reshape the display for different dashboards and devices, like smartphones and tablets, he says. Generally, most modern DV tools allow for user interaction, not providing a static view but letting the user change filters, time ranges, and data sets on the fly to answer more probing questions, Rouda says.

Howson says early DV attempts simply inserted visualization into BI but weren't necessarily effective. You don't want your dashboard to look like a pinball machine with unnecessary bling, she says. Today, there's greater emphasis on brain research on the best graphic displays, default colors, and unnecessary options; things like gradients and 3D effects are rarely useful, she says.

Ko says while DV excels at making data consumable and digestible, users are now asking, "So what?" As a result, vendors are adding value by integrating such features as collaboration, predictive capability, mobile, and self-defined visualization, he says. Mobile, for example, can help users take and use DV anywhere. Many vendors aren't keeping up with the demand for new visualizations, so some create a marketplace for the user community to create, share, and sell new visualization creations, he says.

Take Heed

While DV is generating excitement, it has some limitations. For example, Shimmin says the smartphones and tablets used to visualize what DV tools display impact the ability to understand what's going on with data. "The visualizations we have right now are actually constrained by the devices they're on. And to take the next step, we're going to have to change how we interact with data," he says.

Shimmin also says lightweight data discovery and visualization tools lack data management and governance focus. Thus, users might generate ideas or insights based on false premises or incorrect data. Similarly, Ko says poor data will provide incorrect or incomplete insights, no matter how good DV tools are. Overall, he says, if end users don't understand the data behind the data visualization tools, they aren't going to have meaningful insights pulled out of it. It boils down to dedication to data quality and data training provided to the end users.

Howson sees governance and governed visual data discovery as the next phase of maturity for DV tools. "If not, we risk replacing spreadsheet chaos with visual data discovery chaos," she says. After exploring and manipulating data that users want to repeatedly share with others, she says, there needs to be software capabilities and organizational processes to allow for promoting content.

Elsewhere, Evelson says, all DV (and BI) platforms are limited by their underlying data architecture. Essentially, it's difficult to analyze or visualize what you haven't modeled yet, he says. Rouda, meanwhile, says in terms of platforms, where DV tools should be tightly integrated with the rest of the technology stack, many still seem somewhat detached.

Increasing Importance

Because DV is becoming a component in the BI stack, some clients are acquiring a combination of BI and DV tools, which creates a management nightmare, Ko says. Ideally, the data visualization concept should be absorbed and integrated with BI into a single analytics tool. This process, which Ko calls the unified theory of analytics, has already begun in the sense that BI vendors are incorporating DV functions into products and DV vendors are improving on BI capabilities. Ko expects this process will be mostly completed in two to three years, with heaven being the ability for organizations to work with just one analytics platform vs. several.


Greenovations
ENERGY-CONSCIOUS TECH

The technologies that make our lives easier also produce some unwanted side effects on the environment. Fortunately, many researchers, manufacturers, and businesses are working to create solutions that will keep us productive while reducing energy demands to lessen our impact on the environment. Here, we examine some of the newest green initiatives.

Lucid Energy worked with the city of Portland to install large pipes containing turbines into some sections of the
city's water system. Flowing water spins the turbines, which in turn generate electrical power for consumers.

Lucid Energy Installs & Begins Testing Turbines As Source Of Sustainable Energy In Portland's Gravity-Fed Water System

Lucid Energy, a Portland, Ore., company that has developed a system for generating sustainable power by placing turbines in gravity-fed water pipes, has completed the commercial installation of one of its LucidPipe systems in the water system for its hometown. Lucid says the four turbines in the Portland installation are now starting to generate power, and that after an initial testing phase lasting up to 60 days, the system will begin operating at full production. At that point, the system should generate enough energy to power more than 150 homes. The LucidPipe Power System in Portland uses 42-inch turbines, but the company says the system will work in pipe diameters ranging from 24 inches to 96 inches. Each turbine needs to be spaced about three to four "turbine widths" apart in the pipe. Lucid says a single mile of LucidPipe pipeline the same diameter as that used in Portland has the potential to generate as much as 3 megawatts of electricity without having any impact on the environment.

Bacteria That Feed On Wastewater Now Power A Fuel Cell

Sintef, a Norwegian scientific research organization, recently announced it had created a biological battery: a fuel cell powered by bacteria. Fuel cells contain two electrodes (positive and negative) and an electrolyte, which is the conducting material that carries electrons. In the Sintef experiment, the researchers placed bacteria into wastewater taken from a local dairy farm. The bacteria fed on the wastewater and produced electrons and protons that could be used to create a circuit. As the bacteria fed, they also provided an additional benefit by cleaning the water. Although this initial research is still in the early stages, Sintef scientists say their experiment shows that it may be possible to create a biological battery that purifies water to clean up the environment and produces energy as a by-product, all at the same time.


UM Researchers Use Kevlar To Improve Lithium Battery Safety

Researchers at the University of Michigan are investigating how to make lithium-ion batteries safer during operation. Structures called dendrites can build up in the batteries during use; if these dendrites pierce a barrier between the electrodes, the batteries may short out or even catch fire. The UM researchers created a new type of barrier based on nanofibers extracted from Kevlar, a material used in bulletproof vests. The new material is permeable to lithium ions but impermeable to dendrites. The research team has formed a new company called Elegus Technologies to bring the new technology to the marketplace.

Michigan researchers created a Kevlar-based material that helps prevent short circuits in lithium-ion batteries.

SSA Consolidates Data Centers Into New, Energy-Saving Facility

U.S. government officials say the Social Security Administration has built a new 300,000-square-foot Tier 3 data center and is in the process of consolidating a number of other data centers into its new facility. The new data center, located near Baltimore, has been built with the latest energy-conserving technologies and is expected to help the SSA reduce energy consumption by as much as 30% compared with the average data center.

A new product on Kickstarter, the Kraftwerk, provides energy for mobile products by creating electricity from gas such as butane or propane.

Kraftwerk Is A Fuel Cell-Based Charger For Portable Devices

eZelleron is a startup company in Dresden, Germany, with 25 employees that is attempting to crowdfund its first consumer product: a portable power generator, small enough to fit in your pants pocket, that converts liquid petroleum gas such as butane or propane into energy for recharging mobile devices. The company says it has been working on the fuel cell technology behind its Kraftwerk device since 2008 and has applied for 27 patents. The device works by converting the gas in the small internal tank to electricity that can recharge your phone, tablet, or other mobile device via a standard USB port. When the tank in the Kraftwerk is empty, you refill it from a reusable cartridge much as you refill a lighter using a butane cartridge. The company says the device can charge an iPhone 11 times before needing a refill. (It also claims Kraftwerk is approved for carrying on aircraft, but that may not be the case with the cartridges.) The company quickly surpassed its initial funding goal of $500,000 on Kickstarter.com and also has reached its first "stretch" goal of $1 million. More than 9,500 people have signed up so far to fund the product. See the Kraftwerk page on Kickstarter for more information.

SunShot "Race To 7-Day Solar" Contest Updates Rules On $10


Million Prize To Speed Up Deployment Times For Solar Installations
The U.S. Department of Energy launched the SunShot Initiative in 2010 with the
goal of increasing the number of solar installations in the country by reducing the
overall cost of solar energy. Now the department wants to use the initiative to help
reduce the amount of time it takes individuals or companies to research, buy, install,
and start using, the latest solar power technologies. Toward that end, SunShot is
now offering $10 million in cash awards to individuals, organizations, and communities that create a plan to reduce the time it takes from buying to deploying
solar installations. The winning plans will serve as blueprints for other communities
to follow. SunShot released information showing that it often takes six months or
longer from the time a permit for a solar installation is requested to the time it is
installed. The initiative hopes to reduce this "permit-to-plug-in" time to seven days
(for smaller systems) or seven weeks (for larger installations). Individual prizes up
to $4 million are being awarded in each category, and the competition will run for at
least 18 months. The proposed rules for the contest were originally posted at the end
of 2014 so SunShot could gather feedback from the public and make adjustments.
Now the final rules have been posted; see energy.gov/sunshot for more details.


Managed IT Services
BRIGHT PROMISES & CAUTIONARY TALES

KEY POINTS

Organizations typically seek a managed service provider to obtain an ongoing service delivered at a high quality that meets required service levels.

Using a managed IT service can be attractive when there's a need for a service that isn't currently provided internally or if a service isn't being performed adequately.

Many companies are attracted to managed IT services by the potential to reduce costs and the daily IT management responsibilities for internal IT teams.

Over time, many industry experts believe that managed IT services will transform or evolve into cloud computing solutions.


THERE'S A GREAT DEAL about managed IT services to entice organizations to seek out an MSP (managed service provider) and forge a relationship with one, including the potential to significantly cut costs and alleviate the burden on overtaxed IT departments struggling to keep pace with daily management chores.

There are a few cautionary tales to consider, however. For example, as Andy Woyzbun, Info-Tech Research Group executive advisor, points out, although there's potential to save money using a managed IT service, it can actually increase costs. Further, even if a company does save money, embracing managed IT services may adversely impact IT's morale. To that end, the following explores the state of managed IT services and various pros and cons of using them.

The Basics

Essentially, managed IT services are just another form of outsourcing. In short, says Dan Kusnetzky, analyst and founder of Kusnetzky Group, organizations hire an outside company, or MSP (managed service provider), to provide administrative and operational support for an application, a complete multi-application workload, or perhaps the organization's entire IT portfolio, in hopes of reducing overall costs. This is related to, but different from, colocation services and cloud services, he says. The MSP provisions these IT support services to a defined SLA (service-level agreement).
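To make the SLA idea concrete, an availability percentage translates directly into the downtime a provider is permitted per period. This is plain arithmetic, not a term from any particular contract:

```python
def allowed_downtime_minutes(availability_pct: float, days: int = 30) -> float:
    """Minutes of downtime permitted by an availability commitment.

    For example, a 99.9% SLA over a 30-day month allows about
    43.2 minutes of downtime (43,200 total minutes x 0.001).
    """
    total_minutes = days * 24 * 60
    return total_minutes * (1 - availability_pct / 100)
```

A jump from 99.9% to 99.99% cuts the monthly allowance from roughly 43 minutes to about 4.3, which is one reason the last "nine" in an SLA tends to be the most expensive to buy.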
Woyzbun outlines three characteristics of managed IT services, the first being a focus on the (hopefully high-quality) delivery of ongoing services vs. completing a special project or performing one-time activities. "The relationship and contract typically extend over several years," he says. Second, the service is complex and therefore nontrivial. "High quality of execution is important, and not every aspiring vendor is created equal," Woyzbun says. Third, organizations can opt to continue to deliver the relevant services in-house as well as via the MSP, though typically once the service is transferred to a managed service provider, it's hard to go back, he says.
To his last point, Woyzbun says specifically it can be difficult for an organization to take back responsibility for a service. "If you make a commitment to managed services, then to a large degree you've made a strategic commitment to doing business that way," he says. If a manufacturer subcontracts the manufacturing of assemblies to a third party, for example, it's essentially lost that capability, Woyzbun says. Thus, while managed IT services may seem the intelligent thing to do for various reasons, know that getting a service back is really difficult, he says.

Why Do It

Many organizations use managed IT services because they lack the time and expertise needed to manage hardware, software, networks, etc., whereas an MSP can deliver those abilities while meeting required service levels and security, privacy, and governance policies, often at a lower cost. Kusnetzky says that as organizations face more requirements to reduce costs while offering more and better IT services, managed services (along with cloud computing) will look increasingly attractive.

Another driver is when there's a need for a service that IT doesn't currently provide. Although building the service internally is possible, it could take considerable time and the quality of results may be uncertain, Woyzbun says. In other cases, IT might provide a service that's viewed as inadequate. A common example has to do with consistently applying patches, updates, and other support to a software package. "You screw things up once or twice and people are going to start to say, 'Do we have an alternative here?'" Woyzbun says.

"If you make a commitment to managed services, then to a large degree you've made a strategic commitment to doing business that way."
ANDY WOYZBUN
Executive Advisor : Info-Tech Research Group

"Trust and acceptance that the provider can deliver the service is absolutely important. It's difficult if the technician who did the job before is now managing the provider who is doing the job now."
WOLFGANG BENKEL
Principal Analyst : Forrester Research
Domain expertise is another motivator, says Clive Longbottom, analyst and Quocirca founder. For example, locating, hiring, and retaining up-to-date security skills is expensive and difficult for most organizations. A security provider, however, chiefly focuses on security to the point of understanding it in minute detail, Longbottom says.

Furthermore, using an external security specialist enables sharing skills among all the security provider's customers, Longbottom says, making it far more affordable. Typical managed services, says Wolfgang Benkel, Forrester Research principal analyst, are commodity services, or those that are well-known and highly standardized. "Based on the shared resource model, cost can be reduced with outsourcing into a managed service model," he says.

Where devices are concerned, Longbottom says central IT groups primarily need to know where a device is, and what it is, and apply controls. An MSP can identify BYOD (bring your own device) devices on a public network that are seeking to connect to corporate services, ensure devices are correctly configured, and apply technologies (containerization, data wipes, etc.) more effectively, he says.
Longbottom says predictability in pricing is another attraction. Up to 70% of IT budgets are devoted to "keeping the lights on," he says, thus anything that helps avoid or control such costs should be considered. Fixed-price break/fix, patching, and upgrade maintenance is a great way of doing this, he says, as long as the organization maintains control over overall strategy.

Woyzbun cautions that reducing costs isn't a certainty. For example, although an MSP may perform a service at lower cost, it's possible an organization has locked itself into other costs that won't go away upon using a managed service. Such costs could include licensing and human resources, as well as assets that aren't fully depreciated. "To really save money with a managed service, you better really figure out how you're going to reduce those costs you were spending yesterday," he says. "I think that's really something organizations don't think about." Ultimately, unless reducing staff or moving them

MANY ORGANIZATIONS USE MANAGED IT SERVICES BECAUSE THEY LACK THE TIME AND EXPERTISE NEEDED TO MANAGE HARDWARE, SOFTWARE, NETWORKS, ETC. . . .


into positions that have already been approved, you're adding costs, not reducing them, he says.

"There are too many cowboys out there taking companies for every penny they can, but there are also some very strong players doing a very good job."
CLIVE LONGBOTTOM
Analyst & Founder : Quocirca

Heed This Advice

From the organization's side, Benkel says, managed services will be a mind change. "Trust and acceptance that the provider can deliver the service is absolutely important. It's difficult if the technician who did the job before is now managing the provider who is doing the job now," he says. "Training for the service managers is necessary."

Longbottom, meanwhile, says don't believe everything an MSP tells you. "Ask for reference sites. Talk to people," he says. Additionally, don't outsource company strategy. When the company needs changes made, Longbottom says, ensure the MSP will be there to make them in a timely manner. Along the same lines, Woyzbun says, no contract will protect you from a marginally competent supplier.

Longbottom also recommends choosing providers that can help identify opportunities to adopt new technologies and capabilities and that have a plan B. Further, ensure the organization has its own plan B. In the first case, make sure the MSP has full plans in place, predominantly for business continuity but also for disaster recovery, he says. In the second, accept that in any market companies will fail, and this could apply to the MSP.

Benkel stresses the need for strong service levels because "the client is focused [on] the outcome/output of the service only." "It's a very important element that the output/outcome control is strong, perhaps combined with a penalty clause (perhaps with incentives, as well) to focus the vendor to the right requirements," he says.
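The output/outcome control Benkel describes is typically codified as measurable SLA targets with credits or penalties attached. As a minimal sketch, assuming hypothetical uptime thresholds and credit tiers (not figures from any real contract), a monthly SLA check might look like this:

```python
# Illustrative SLA compliance check: the thresholds and credit tiers
# below are hypothetical examples, not values from a real contract.

def monthly_uptime(total_minutes: int, downtime_minutes: int) -> float:
    """Return achieved availability as a percentage of the month."""
    return 100.0 * (total_minutes - downtime_minutes) / total_minutes

def service_credit(uptime_pct: float) -> float:
    """Map achieved uptime to a penalty credit (fraction of the monthly fee)."""
    if uptime_pct >= 99.9:    # target met: no credit owed
        return 0.0
    elif uptime_pct >= 99.0:  # minor breach
        return 0.10
    else:                     # major breach
        return 0.25

# A 30-day month has 43,200 minutes; 90 minutes of downtime is ~99.79% uptime.
uptime = monthly_uptime(43_200, 90)
print(f"uptime: {uptime:.2f}%, credit owed: {service_credit(uptime):.0%}")
```

The point of such a sketch is that both sides can verify the outcome mechanically, which is what makes penalty and incentive clauses enforceable in practice.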


For the MSP, it's important that an organization is clear about what it expects the MSP to deliver, Woyzbun says. Smaller providers, while possibly cheaper and more responsive, typically have limited capabilities. For this reason, confirm expectations with potential providers. Benkel says it's also important that an MSP is granted enough responsibility and accountability to manage services in a best-practice way. This means the provider can use its own processes and tools, he says, as most global providers are working on their own improvements (global delivery models, standardization of processes and tools, etc.), and if providers don't have the responsibility to use their process and tool framework, clients cannot participate in the benefits.

FOR THE MSP, IT'S IMPORTANT THAT AN ORGANIZATION IS CLEAR ABOUT WHAT IT EXPECTS THE MSP TO DELIVER.

From a business standpoint, recognizing the impact using a managed service will potentially have on IT is also key. "If you, as a manager, are proposing to move some services externally, whether to managed services, the cloud, or outsourcing, what you're saying to the IT group is, 'I don't want you to do that anymore. I want someone else to do it,'" Woyzbun says. Regardless of whether the motivation is financially based, he says, the interpretation could be, "You don't trust me, you don't like me." Understanding that this hit to IT's morale or confidence is possible, he says, you need to play it intelligently.

The Market

Among some experts, there's a belief that managed services will eventually morph or evolve into cloud solutions. With a managed service provider, Woyzbun says, the organization still needs to worry about various aspects of technology change. Cloud vendors reduce the need for internal involvement even further. Benkel adds that with increased standardization and a willingness on the client side to be open, more vendor- or market-defined managed services will evolve to highly shared service delivery models (or public cloud). Some cloud solutions today, such as private hosted clouds, are similar to managed services, he says.

Elsewhere, Woyzbun believes managed service providers must standardize to become better and cheaper than internally managed services. "The increased effectiveness or efficiency of these models is available only to clients willing to align to a limited available set of options," he says. So the trend will continue toward homogenization of systems capabilities. Few organizations will have the capability and appetite to differentiate their IT capabilities from their competitors, and there will be more cloud/managed service providers where the markets are sufficiently large.

Longbottom believes that while the managed IT services market is doing pretty well, it should be doing better. "There are too many cowboys out there taking companies for every penny they can, but there are also some very strong players doing a very good job," he says. They will need to adapt to new markets, however. "Public cloud, IaaS [infrastructure as a service], PaaS [platform as a service], and SaaS [software as a service] will start to eat away at their addressable markets, and they'll need to identify new markets, such as becoming cloud aggregators/brokers offering similar support services to the composite application built up from a range of functions from different suppliers."

The European Data Center Migration

U.S. ENTERPRISES ARE INCREASINGLY BUILDING OVERSEAS FOR DATA PRIVACY REASONS

TWO THINGS BECOME apparent when speaking with analysts familiar with the European data center market. First, more U.S. businesses have built, or announced plans to build, data centers in Europe in recent years. Second, this increase is tied to concerns that European customers have regarding their data and privacy. Events such as Edward Snowden's NSA (National Security Agency) revelations and Microsoft's current court battle involving a U.S.-based warrant demanding it give up email data located in Europe are among the catalysts.
Another factor is the GDPR (General Data Protection Regulation) that's expected to supplant the EU Data Protection Directive that's now in place. Proposed in 2012 by the European Commission, the GDPR would implement one framework for data privacy protections covering all 28 EU member states. Adoption could happen yet this year, after which EU members would have two years to ratify the GDPR.
In short, European customers of U.S.-based businesses are increasingly making it clear they desire their data to remain local and expect companies to protect it according to local laws. As Carsten Casper, Gartner Europe managing vice president, privacy and digital workplace security, says, while such new rules may not necessarily appeal to U.S. businesses, "what is appealing is to do business, or not to lose business, with European partners."

Rules Of The Game

Presently, the EU Data Protection Directive determines how companies operating in the EU must handle Europeans' personal data regardless of company headquarters. Enza Iannopollo, a Forrester Research researcher serving CIOs, dubs the directive the highest standard for protection of personal data. Notably, she says, each EU member country has a national version of the directive. "These pieces of legislation are similar but not identical, so variations at the national level apply," Iannopollo says.
Casper says the main principles of current rules include needing permission to store personal data, ensuring that data is correct, securing data against unauthorized access, and deleting data if the reason it was collected ceases to exist. If sharing data with a partner, the partner must follow the same rules and provide the same level of protection, he explains. Currently, data residency is the main discussion point, he says. Personal data can only leave the EU if it's adequately protected. Just what "leave the EU" means is subject to intense debate, says Casper.

In general, Casper believes the beauty of the European system is that it's fairly harmonized across all EU member states. The GDPR will further emphasize this, he says. "Unlike the U.S., where privacy laws are federal, state-specific, or industry-specific, all EU states use the same directive as the basis for their national laws," he explains. There is no best location

"Unlike the U.S., where privacy laws are federal, state-specific, or industry-specific, all EU states use the same directive as the basis for their national laws."
CARSTEN CASPER
Managing Vice President, Privacy & Digital Workplace Security
Gartner Europe


for a data center. "Dublin, Amsterdam, Frankfurt, and many other cities are often chosen as data center locations, and this creates a healthy competition." Because locations are largely the same from a privacy law standpoint, decision makers can focus on taxes, available workforce, connectivity, and other factors, he says.

The Data Center Trend

To date, the largest U.S.-based technology companies have data centers in Ireland, Germany, France, the Netherlands, and elsewhere in Europe. Several companies have also recently announced plans to build more.

Casper says a 2014 Gartner survey found 26% of 908 respondents from the U.S., U.K., Canada, Germany, India, and Brazil started data center operations outside of the United States following Snowden's NSA revelations. Rather than build huge new data centers, however, companies are often building facilities sufficient for only hosting an additional service in Europe or leasing extra rack space, he says.

"The key requirement is to be able to say that these U.S. organizations have their data in Europe, even though most would agree that this alone doesn't prevent data access from abroad," Casper says. Notably, it's vendors and organizations seeking this requirement: vendors because they're concerned about losing existing deals and winning new ones, and IT departments because they're concerned about not meeting European compliance requirements.

Iannopollo says possibly more interesting than major U.S. companies making plans to build EU-based data centers is that Forrester is seeing strong demand for EU-based data centers coming from European cloud users. Similarly, Giorgio Nebuloni, IDC research manager, European data center group, says large, multinational U.S.-based companies are particularly seeking to build B2B (business-to-business) and B2C (business-to-consumer) cloud services. In recent years, IDC has seen U.S. companies changing their approach to Europe as a market for cloud services, he says.

Privacy's Influence

Nebuloni says what's driving this change in approach primarily is that
European customers, particularly in France, Italy, Germany, and Spain, tend to be fairly conservative about where their data is located. Furthermore, they want to ensure that companies they interact with have at least a subsidiary in their country.
Following the PRISM scandal, Nebuloni says many European customers uncomfortable with their data residing outside their countries pushed for local data centers. Especially in France and Germany, local service providers have tried to get an edge on U.S. companies by emphasizing their local ties, he says. In France, for example, an association of French cloud service providers has worked to provide French-certified cloud services, Nebuloni says. "So it seems like politics is intertwining more and more with IT services and IT markets" where the cloud is concerned, he says.
Steve Wilson, Constellation Research vice president and principal analyst, says "the writing has been on the wall" in Europe for some time. He points to the Safe Harbor provision, which essentially allows some U.S. businesses to escape the full weight of EU expectations, as being on borrowed time. Wilson notes many non-European countries that also have strong privacy laws want their data processing to occur in Europe vs. the United States. Many laws in such countries take the form of, "If you export personal information from our country, you must only send it to places that have equivalent data protections," he says.
In general, Iannopollo says EU citizens view privacy as a fundamental right and part of their culture. History itself has shaped this relationship with privacy, she says, making Europeans different from citizens in other geographies. As customers, Europeans are increasingly aware companies are seeking out their data because of its value, thus they expect companies to protect their data as one would protect valuable assets, she explains.

Iannopollo says for any company dealing with EU customer data, complying with rules means mitigating the risk of fines but, more importantly, having "the opportunity to build a trustworthy relationship with their customers, a stronger reputation for their brand, and ultimately a competitive differentiator for their business."

The Future

Nebuloni says most U.S. executives he speaks with are cognizant of the great regulatory and psychological problems existing in Europe, particularly executives involved with companies that offer cloud products. Ultimately, Nebuloni believes, larger U.S. companies will partner with local providers to address privacy issues.

Casper, meanwhile, says the need to operate data centers in Europe is unlikely to go away soon. How demand will change, however, will largely depend on "the legislative initiatives on privacy in the U.S., in the European Union, and the privacy discussions between the two parties," he says.

Similarly, Iannopollo says privacy and data protection aren't merely 2015 trends. "They're really here to stay and to change businesses' culture and modus operandi as necessary," she says. She expects more non-EU companies will open European data centers and fully comply with EU data protection rules. "Moving forward, we also expect them to truly understand and operate against the cultural background of their European customers, partners, and employees," she says.

Greater Throughput, Lower Latency. Meet The New Cutting Edge.

THE LATEST ENTERPRISE-CLASS SERVERS FROM SUPERMICRO USE NEW TECHNOLOGY FOR ADVANCED PERFORMANCE

SPEED IS CRITICAL in business, especially where database systems and Web applications are concerned. The advent of flash-based storage introduced a new era of speed and reliability for servers and storage systems, and in recent years SSD (solid-state drive) prices have dropped significantly. Now a new interface technology called NVMe (Non-Volatile Memory Express) reduces latency and increases throughput for systems using SSDs. Supermicro, a leader in the global server market, is now offering servers that use NVMe technology and options from the Intel Xeon processor E5-2600 v3 product family.

Reduce Latency

No one, whether customers or employees, likes to wait for technology to work. A chief benefit of NVMe is the reduction in latency, enabling systems of all types, including mission-critical systems, to operate with greater efficiency. In performance benchmarking tests, Supermicro's new SuperServer NVMe-capable servers were found to operate with 7.18 times lower latency when used with NVMe SSDs compared to SAS3, the fastest common storage interface prior to NVMe.

Increase Bandwidth

When considering the overall performance of your organization's database, Web, and other mission-critical systems, another important factor is bandwidth. In benchmarking tests, Supermicro's NVMe servers exhibited bandwidth improvements up to 5.93 times greater than servers using SATA3. These improvements in latency and bandwidth translate directly into more work done in less time, for greater ROI.
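To see what multipliers like these mean in absolute terms, a quick back-of-the-envelope calculation helps. The SAS3 and SATA3 baseline figures below are assumptions chosen only for illustration, not benchmark results published by Supermicro or Intel; real numbers vary by drive, queue depth, and workload.

```python
# Hypothetical baselines, used only to illustrate the quoted multipliers.
sas3_latency_us = 200.0       # assumed SAS3 round-trip latency (microseconds)
sata3_bandwidth_mbps = 550.0  # assumed SATA3 sequential throughput (MB/s)

nvme_latency_us = sas3_latency_us / 7.18           # "7.18 times lower latency"
nvme_bandwidth_mbps = sata3_bandwidth_mbps * 5.93  # "5.93 times greater bandwidth"

print(f"NVMe latency:   {nvme_latency_us:.1f} us (vs. {sas3_latency_us:.0f} us)")
print(f"NVMe bandwidth: {nvme_bandwidth_mbps:.0f} MB/s (vs. {sata3_bandwidth_mbps:.0f} MB/s)")
```

Under these assumed baselines, latency drops from hundreds of microseconds to tens, and throughput moves into the multi-gigabyte-per-second range, which is the practical substance behind the multipliers.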

INTEL, THE INTEL LOGO, XEON, AND XEON INSIDE ARE TRADEMARKS OR REGISTERED TRADEMARKS OF INTEL CORPORATION IN THE U.S. AND/OR OTHER COUNTRIES.

First To Market

Ease of use and ease of maintenance also contribute to greater ROI. With its SuperServer NVMe-capable servers, Supermicro is first to market with NVMe hot-plug capability, which means IT personnel can upgrade drives, add capacity, or replace failed drives without shutting down their systems. This prevents downtime. Factor in improved power optimization features, and the value of Supermicro's NVMe servers becomes more considerable.

Learn More
There are now over 30 Supermicro
SuperServer NVMe systems in addition
to the SYS-6028R-TDWNR model shown
above, and the number is growing rapidly. Contact Supermicro to find out more
about pricing and availability.

Supermicro | 408.503.8000 | www.supermicro.com


The Evolution Of VPN

FROM THE '90S TO TODAY, THE TECHNOLOGY STILL HAS A PLACE IN MANY ENTERPRISES

KEY POINTS

• Mobile devices have changed networking requirements, and VPN is evolving along with them.

• The cloud has forced companies to decide how they want to use VPN in the future and whether they need third-party VPN solutions.

• The future of VPN depends on how well it interacts with MDM (mobile device management) and UTM (unified threat management) security solutions and how its unique capabilities can supplement those solutions.

• Whether you actually use VPN onsite ultimately depends on the size and age of your company as well as your infrastructure needs.


AN IMPORTANT thing to remember in the IT world is that just because a technology isn't talked about on a regular basis doesn't mean that it's dead or devoid of innovation. When it comes to VPN (virtual private network) technology, the concept was so well-established from its inception in the '90s that most people simply stopped paying attention because nothing was really changing, says Brad Casemore, research director at IDC. In fact, VPN had such strong fundamentals built into it from the very beginning that it was difficult to make many changes to the formula that would be evolutionary enough to make much of a splash in the market.

Still, when VPN technology was first introduced, it was a major revolution in how businesses connected computers to one another. Where slower, less reliable, and more expensive dial-up connections were the norm for most organizations, VPN came in to make the process much easier and more secure. Companies could give users more access than ever before to crucial resources without having to pay for rented network lines that they had little or no control over.
Traditional VPNs were also established to help companies give users secure access to internal networks regardless of location. Instead of giving users access to the entire network, and thus opening up the company to potential security attacks, VPN connections made it so that a single user could access the corporate network on a secure line without fear of others being able to hijack the signal. This concept was especially popular for larger corporations with remote offices or employees working from home who still needed access to internal company resources to do their jobs.
The interesting thing about VPN is that it is still useful in today's business world, and many of these use cases from the past are still relevant. Employees are demanding more connectivity than ever before, and they want to be able to access internal company resources from their desktops, laptops, tablets, and smartphones. For that reason, VPN technology has had to evolve over the years. And even though it may not seem like it, VPN is still very much alive and will continue to be a part of corporate networking for years to come.

"You are not just concerned about having a VPN or MPLS connections between your headquarters, data centers, and major branches. Now, you begin to cater to users who can be at all sorts of endpoints around the Internet, wired or wireless, and on all sorts of devices."
BRAD CASEMORE
Research Director : IDC

Adjusting To New Networking Needs

Casemore points out that where IPsec (Internet Protocol security) and SSL (Secure Sockets Layer) VPN were the status quo for a long time, what's happening now with the correlated developments of cloud and mobility is changing the game. As previously mentioned, employees want to be able to work wherever and whenever they want, which means that companies need to be able to offer access to internal resources essentially on demand. "You are not just concerned about having a VPN or MPLS [Multiprotocol Label Switching] connections between your headquarters, data centers, and major branches," says Casemore. "Now, you begin to cater to users who can be at all sorts of endpoints around the Internet, wired or wireless, and on all sorts of devices."

What's happening now is that VPN technologies are combining multiple protocols and capabilities, some of which have been around since the beginning, to cover the widest range of networking needs possible. "It's now more around policy-based orchestration of your WAN, including your VPN capabilities," Casemore says. In other words, what used to be called VPN, or certain functions that used to be performed by a standalone VPN solution, are now included in other types of technologies. Some of these newer solutions work like SDN (software-defined networking), where the control and data planes are separated, which makes it easier for companies to manage individual connections and ensure secure network access across the board.

Where VPN-style capabilities are really seeing changes is with WANs (wide area networks). "You used to have WAN connections that were pretty well stratified," says Casemore. "You had your core enterprise applications in your data center and you needed people in the branch to access them. Typically, you'd look at your WAN connection and say, yeah, I'll get a leased line or carrier service and MPLS can handle that and I'm covered. We've seen a lot of the technology innovation that's occurred in the data center moving out to the WAN very quickly now."

Once again, this innovation Casemore refers to is around abstracting the control and data planes and introducing much more automation than before. "It's being attacked from several different angles," he says. "The router vendors are doing it. They had an overlay for network virtualization in the data center and now they're extending it over the WAN. You have WAN optimization vendors who realized that in this new world, WAN optimization is not enough, because you're dealing with applications that are in different clouds from different service providers and different data centers. The endpoints are mobile and not static, and you have to find a way to extend what you do in WAN optimization to accommodate that with things like intelligent path selection."

VPN & The Cloud

In addition to the WAN and mobile device connectivity, VPN technology has also had to change because of the influx of cloud environments and services.
Casemore says that the amount of responsibility you'll have over VPN in the cloud scenarios ultimately depends on how much you plan to use it. For instance, startups that host almost everything in the public cloud won't have to worry about VPN as much, because the cloud service provider will more than likely handle it on its end. Alternatively, companies that go with a hybrid cloud strategy "have to think more about how they want to secure and facilitate communication between their branch offices and headquarters in this new cloud context," Casemore says. For those latter companies, adjusting to using SaaS (software as a service), or cloud-based applications, may mean they need to overhaul their VPN approach.

For example, Casemore explains that one approach is to use an IaaS (infrastructure as a service) provider's VPN solution to handle cloud-based apps and then maintain your own VPN approach when it comes to internal resources and legacy applications. However, he calls this a piecemeal approach, which means it may not work for companies that want something more cohesive. In those situations, you're going to have to find a third-party vendor who offers a VPN-style solution that can handle both internal and cloud-based environments in a similar fashion. "A big question now is, can the carriers and managed service providers still provide these third-party VPN services as we move into this cloud era?" Casemore


asks. "How is that market going to shake out? It's not really clear, at least to me at this point, exactly how quickly it will consolidate and who the winner will be. That's the dilemma right now."

The key, according to Casemore, is to keep in mind why you're using VPN in the first place and use that as a way to compare your potential options. "It's all about delivering applications," he says. As the workloads change and become more varied, some of them may eventually be migrated to the cloud and not exist anymore in the data center. For that reason, you need to be flexible in how you handle VPN, understand that data and apps will be stored in a wide variety of locations, and pay attention to the marketplace. Regardless of which industry segment it comes from, "there will be vendors that can help you create secure envelopes for these applications to be delivered to users and potentially customers," Casemore says.

Improved VPN Security & Management

Connectivity is only one part of the VPN equation, because you also need to consider the security aspect of it and how much that's changed over the years. As previously mentioned, SSL and IPsec VPNs used to be the norm, but vendors are now finding new ways to incorporate VPN functions into other security solutions. For example, James McCloskey, director of advisory services, security and risk, at Info-Tech Research Group, says that many vendors are introducing VPN capabilities into their NGFWs (next-generation firewalls) and other solutions.

"If you're in the largest enterprises where you have multiple dedicated firewalls around the world operating in a way that balances load, then maybe you're in the market for a dedicated VPN solution," says McCloskey. "But for a vast majority of organizations out there, they're really going to benefit from getting an integrated VPN solution with a perimeter security device, like an NGFW or a UTM (unified threat management) solution."


"One of the aspects that needs to be well-considered and maintained is this mapping of users to access groups to be able to restrict them to specific resources on the internal network. Having . . . restrictions set up in the groupings is really a powerful way to further reduce the risk of someone unauthorized coming in and accessing content or someone who is authorized for certain aspects of access to go beyond the scope of what they should be doing."
JAMES McCLOSKEY
Director Of Advisory Services, Security & Risk : Info-Tech Research Group

Another area where VPN is changing quite a bit is in how it interacts with MDM (mobile device management) as a way to check on a device before it connects to your network, according to McCloskey. "The VPN infrastructure could be enforcing certain health checks, but they're not necessarily the best at doing that for random mobile devices that might be connecting," he says. "These MDM solutions are designed to take that into account directly and basically say that for mobile devices that are registered with the MDM system, the VPN can automatically be assured of a certain level of security on those devices without having to go to the more invasive inspection of those devices."
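McCloskey's point about MDM-attested devices can be sketched as a tiered admission check at the VPN gateway. Everything below is illustrative — the attribute names, the registry shape, and the three outcomes are invented for the sketch; real NGFW/MDM integrations use vendor-specific APIs:

```python
# Health checks the gateway requires before trusting a device; names are hypothetical.
REQUIRED_CHECKS = {"os_patched", "disk_encrypted", "screen_lock"}

def vpn_admission(device: dict, mdm_registry: dict) -> str:
    """Decide how the VPN gateway treats a connecting device."""
    record = mdm_registry.get(device["id"])
    if record is None:
        # Unmanaged device: fall back to invasive inspection or block outright.
        return "deny"
    if REQUIRED_CHECKS - set(record["attested"]):
        # Managed, but failing one or more health checks.
        return "quarantine"
    # MDM attestation stands in for per-connection inspection.
    return "allow"
```

A registered, healthy device lands in the "allow" branch without the gateway probing it directly, which is the shortcut McCloskey describes.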

Size & Industry Affect VPN Requirements
The future of VPN really lies in its ability to interact with your other security solutions and supplement them. And when trying to determine what place VPN will have in your business, it ultimately depends on the size of your company, the industry you're in, and your specific security needs. Smaller companies and startups, for example, have to decide if they want to invest in IT personnel and infrastructure or see if they can do everything in the cloud, Casemore says. And on the other end of the spectrum, larger enterprises or ones that have been around for decades have to strike a balance between newer approaches and traditional ones.

"If you have a lot of customer-facing activities and you have stores like retail and you're a hospitality business or other businesses that have external consumers of technology, you're obviously going to have a different strategy in place for that," says Casemore. "Some of it is vertically conditioned and some of it is conditioned by how mature you are as a company and how many legacy apps you have. If you have a lot of legacy apps and they're business critical or there are compliance issues, you're probably going to stick with the secure pipes you use today, VPN or otherwise, at least for those applications."
McCloskey agrees that company size and industry will have an effect on what type of VPN approach you take, but you also have to take basic infrastructure and security needs into account. "Having mentioned things like MDM and VPN interoperability, those are aspects where the organization really needs to explore to see if their remote access requirements are modest enough and match up against those," he says. "For them, that might be a reasonable approach to take. But if they're looking for a more general solution that's going to enable their remote workforce, third parties that they need to have connect in a controlled way, and mobile users, they're probably going to look at a UTM-based solution."

Better Authentication
DOUBLE DOWN ON YOUR ONLINE SECURITY

IF YOU'VE NEVER BEEN hacked, consider yourself lucky. A May 2014 Ponemon Institute study found that over a 12-month period 47% of U.S. adults have had personal data exposed. The report points to a number of high-profile breaches that potentially provided cybercriminals with information to steal login credentials and/or credit/debit card information, including up to 110 million Target customers' information, 38 million active Adobe users' usernames and passwords, data from a significant number of AOL's 120 million accounts, and data from possibly all of eBay's 148 million users.
Of course, smart business executives like yourself have already protected your online accounts with long, random, nearly impossible-to-remember passwords. (Right?) But even ironclad passwords may not be enough as hackers accumulate usernames and passwords stolen from online databases. In August 2014, Hold Security reported that a Russian cybercrime gang was in possession of a staggering 1.2 billion unique username/password combinations.
It's always nice to be in complete control of your own security, so it's upsetting, to say the least, when data breaches and cybercriminal activity put you at the mercy of hackers. Thankfully, there are plenty of ways to further secure your accounts via multi-factor authentication, which adds another layer (or two) to the login process. Multi-factor authentication greatly reduces the likelihood that a cybercriminal can break into an online account using a compromised username and password. We'll take a look at the current trends in multi-factor authentication.

Multi-Factor Passwords
Most online services that offer multi-factor authentication use an OTP (one-time password) in conjunction with the tried-and-true username and password. Arguably the most popular OTP delivery method is a simple text message.

"The high level of penetration of cellular phones makes it a convenient way to do two-factor or multi-factor authentication," says James McCloskey, director of advisory services, security and risk, at Info-Tech Research Group. "People tend to always have their phone with them," he adds, and just as they wouldn't think of leaving the house without their keys, they also won't leave without their smartphone. "That's a strong authentication mechanism."
Once you receive the PIN or code, you'll just need to enter the OTP as prompted by the website. The multi-factor authentication prompt will typically occur after you enter your username and password. Using the SMS (Short Message Service) format is also advantageous because you don't have to own a smartphone to receive the code. "Mobile devices can provide a huge boost to multi-factor authentication, since even feature phones can be used to deliver multi-factor authentication codes," says Michela Menting, practice director at ABI Research.

Sending a code via text message might also be more convenient than other OTP delivery methods, which can include email, a discrete app on your smartphone or tablet, or a hardware token for your key chain. You might also be able to have an OTP issued via a phone call, which is obviously ideal for people with phone plans that have text message limits. An email OTP can be helpful if you have easy access to your email client, for example.
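On the service's side, delivery-based OTPs generally follow a simple issue-and-verify cycle with two key properties: a short lifetime and single use. A minimal sketch — class and method names are invented here, and real implementations would also rate-limit attempts and avoid storing codes in plain form:

```python
import secrets
import time

class OTPIssuer:
    """Issue short-lived one-time codes, e.g. for delivery by SMS or email."""

    def __init__(self, ttl: int = 300, digits: int = 6):
        self.ttl = ttl                        # seconds a code stays valid
        self.digits = digits
        self._pending = {}                    # user -> (code, expiry time)

    def issue(self, user: str) -> str:
        code = "".join(secrets.choice("0123456789") for _ in range(self.digits))
        self._pending[user] = (code, time.time() + self.ttl)
        return code                           # hand off to the SMS/email gateway

    def verify(self, user: str, attempt: str) -> bool:
        code, expiry = self._pending.pop(user, (None, 0))   # single use
        return code is not None and attempt == code and time.time() < expiry
```

Because the pending code is popped on the first verification attempt, an intercepted code is useless once the legitimate user has logged in.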
If entering a code each and every time you log in sounds exhausting, you're not alone. "Two-factor authentication is still an extra step people have to take, and many will simply find it a hassle and not bother," says Menting. Aware of this, many services that require logins are simplifying the process, so you'll only need to enter a PIN if the website detects a login attempt from an unfamiliar device or location.

"To different degrees, banks and social networks are making use of context-aware, adaptive techniques and phone-as-a-token solutions," says Ant Allan, research vice president at Gartner. For example, if you try to access Facebook from an unknown endpoint, it can prompt multi-factor authentication.
Another alternative is the TOTP (time-based OTP), which you can access via an app on your smartphone or tablet. OTP apps typically require you to configure the device with your account credentials. Once complete, the software token app generates the PIN based on an algorithm, the time on your mobile device, and a seed record. The latter is a secret key that's stored on the online service's server.

A TOTP app generates a new code periodically, so even if a hacker intercepts the code, it'll only be good for a short period of time. Google Authenticator is a popular TOTP app that you can use with Google's various offerings, as well as some other prominent online services, such as Amazon Web Services, Dropbox, and LastPass. The app creates a new PIN every 30 seconds and is available for Android, BlackBerry, and iOS devices.
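The derivation such apps perform is standardized as TOTP in RFC 6238, which builds on the HOTP construction of RFC 4226. A minimal sketch of the PIN computation:

```python
import hashlib
import hmac
import struct

def totp(seed: bytes, at_time: float, step: int = 30, digits: int = 6) -> str:
    """Derive a time-based one-time password (RFC 6238, HMAC-SHA1)."""
    counter = int(at_time) // step                    # 30-second time window
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(seed, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

The server holds the same seed and runs the same computation; a login succeeds when the codes match, usually with a one-step window of tolerance for clock drift. RFC 6238's published test vectors (seed `12345678901234567890`, 8 digits) reproduce exactly with this function.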


The big benefit of the TOTP app, compared with an SMS message, is that it works even if your smartphone doesn't have a data or cellular connection.

Regardless of delivery method, OTP will continue to be a popular option for two-step verification, according to experts. With online services, a username

GOOGLE & TWO FACTOR

Google is well-known for making the vast majority of its services available online, including Gmail, Google Drive, and Google Calendar. Naturally, you can improve the security of these services by using two-step verification. The easiest way to set up multi-factor authentication is to go to https://www.google.com/landing/2step, click Get Started, and log in to your Google account. Next, click Start Setup, and Google will ask you to provide a phone number to which it can send a numeric code or a voice call that verbalizes the PIN. After choosing a method, click the Send Code button. On the next screen, enter the code and click Verify. Next, Google will ask if you want to set up the PC as a Trusted Computer. This way, you can access Google accounts without entering a verification code every time you log in to that system. Finally, Google informs you that you'll be asked for a code whenever you sign in from an untrusted computer or device. Click Confirm, and the two-factor authentication is ready to go.

Because many Google services are also offered as apps, which don't have a good way to support the second factor, there are some extra steps required to validate the apps with two-step authentication. After you click Confirm, a pop-up window will appear that provides directions to reconfigure app access. Just select the app of choice, such as Mail, and your device, such as iPhone, from the drop-down menus, and Google will provide you with a replacement password that you can enter. Google can provide codes with apps for its services on iPhone, iPad, Mac, BlackBerry, Windows Phone, and Windows PCs.

When you first log in using two-factor authentication on a PC, you'll see an option to remember that particular computer. This way, you won't have to deal with re-entering a new verification code, but you'll still be protected against hackers trying to access your account from another PC.

Google's default options for two-step authentication are SMS (Short Message Service) or voice call. To use Google Authenticator instead, access your Google account settings and select the Security tab. Then, click the Settings option next to 2-Step Verification. Click the Switch To App button. You will receive a prompt to use your phone's camera to scan the on-screen QR code, which will produce the 6-digit verification code in Google Authenticator. Enter the code on the Google account Web page to enroll your mobile device with the Google Authenticator app and you're all set.



and password with a code that's delivered to your smartphone is rapidly becoming the standard combination, says McCloskey.

We found that some banks offer customers a hardware token, but there are some challenges involved with using hardware tokens with online services. McCloskey explains, "Practically speaking, the hardware tokens aren't something that will take off, because you'd have a token for every online service that you're working with." Therefore, most online services will continue to send passcodes via SMS or work with a soft token app in the near future.

Biometrics
With fingerprint readers and high-resolution cameras now available on smartphones and tablets, you would think that biometrics would be an up-and-coming method for multi-factor authentication. Unfortunately, there are some major obstacles to overcome when it comes to biometrics as an extra layer of authentication. "It really becomes a privacy issue," says McCloskey.

For example, you can always change your password, but if a cybercriminal were to steal your thumbprint or voice identification, he could use the compromised biometric data to impersonate you forever. "In what format would Apple or Google share the biometric information? How would you do the registration of that biometric with a third party?" McCloskey asks. Until the privacy challenges are resolved, it would be difficult for online services to accept a fingerprint.
When it comes to securing physical access to a location, biometrics have been a proven option for businesses with sensitive data, such as data centers utilizing fingerprint readers or retinal scanners. But in these situations, the person providing the biometrics has also had his identity verified in a traditional way. He might be an employee, or the organization might have already verified his identity with a Social Security number, for example.

With something like a Google account, registration doesn't typically require any proof of identity when you set up the account. McCloskey says, "It's almost backwards to expect a stronger identification mechanism that would uniquely identify you. That's why a smartphone is a better thing to bring than a fingerprint or face, which is a private piece of information, to enroll with an online service."

Still, it's not impossible to imagine biometrics catching on outside of casino vaults and secret government bunkers. Mobile devices have only recently started including biometric sensors, so there hasn't been much time for third parties to develop policies to manage a fingerprint or iris template, nor to determine how the biometric data will be securely transmitted to the server. Time will tell if online services will incorporate biometrics into their authentication methods.

You might see biometrics used as authentication for other tasks. "Biometrics will play an important role in digital identification cards most likely, notably government and health care identity cards," says Menting.
Besides using fingerprint and iris or face recognition, online services might one day be able to use behavioral authentication as a second factor. Some examples of behavioral traits that can be used include the way you speak a phrase or the speed at which you type on a keyboard. "Multimodal face plus keyboard/gesture dynamics could provide continuous passive authentication, providing at least a medium level of trust with a great user experience," Allan says.
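As a toy illustration of the typing-speed trait, a verifier could compare the average inter-key gap of a login sample against an enrolled profile. The function names and the 25% tolerance are invented for this sketch; production keystroke-dynamics engines model far richer features (per-digraph timings, key-hold durations) with statistical or machine-learned classifiers:

```python
def mean_interval(press_times: list[float]) -> float:
    """Average gap between successive key-press timestamps, in seconds."""
    gaps = [b - a for a, b in zip(press_times, press_times[1:])]
    return sum(gaps) / len(gaps)

def matches_profile(profile: list[float], sample: list[float],
                    tolerance: float = 0.25) -> bool:
    """Accept when the sample's typing rhythm is within 25% of the enrolled profile."""
    expected = mean_interval(profile)
    observed = mean_interval(sample)
    return abs(observed - expected) / expected <= tolerance
```

A sample typed at roughly the enrolled rhythm passes; a markedly slower or faster one fails, which is what makes the signal usable as a passive second factor.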

Multi-Factor Authentication Setup
Although a variety of online services now offer multi-factor authentication, it's almost always optional. And you must register and configure multi-factor authentication before securing your accounts. Many of today's most popular online services, including Google, Microsoft, Facebook, PayPal, Dropbox, Apple, Evernote, Yahoo!, and LinkedIn, offer some type of two-factor authentication.

If you haven't already made the move, we recommend that you consider the additional security of two-factor authentication for any online service that stores your personal information. Most online services offer a way to save a device as a trusted source, so the process isn't too time-consuming after the initial setup. In the digital security arms race, it's nice to have any option that puts us in more control of our account security.


Cyberattack: Weather The Storm


HOW TO RESPOND TO A DDOS ATTACK

AS EVIDENCED BY A FLOOD OF recent headline-making news, the cyberattack du jour appears to be the ominous DDoS (distributed denial of service) attack. Once the sole domain of highly skilled cybercriminals with access to their own private army of infected PCs, aka a botnet, DDoS attacks are now available as kits and can be obtained fairly cheaply by just about anyone with a mind to wreak havoc. Whether your organization's websites have faced this form of attack or not, there are a handful of cautionary tales to consider: common reactions that could make things worse. In this article, we'll clear up some misconceptions and provide some tips on what to avoid doing and how to mitigate further damage.

How A DDoS Attack Works


For the most part, DDoS attacks are used to extort money from a business or organization, disrupt normal operations, incite anger among a company's user base, or act as a distraction while cybercriminals exploit a security loophole with the intent to steal data. A DDoS attack begins when an attacker uses a powerful computer or series of computers, such as those in a botnet (a network of computers infected with malware designed to create such a network) or voluntary botnet (a network of computers whose owners have volunteered them to become part of a botnet), to target a particular IP address with multiple simultaneous requests and other commands to the point that the server is no longer able to keep up with the requests. When this happens, the attacked server, such as a Web server or authentication server, will crash or be forced offline, bringing with it the company website and/or the services that its legitimate users and customers rely on. The immediate and long-term fallout of such attacks can cripple a business, and sometimes the organization never fully recovers.

For the most part, every DDoS attack falls into one of two categories: an application layer attack or a network layer attack. As you might expect, application layer attacks occur when cybercriminals target your organization's applications, including your website, FTP server, and DNS (domain name server). These attacks disrupt by preventing your customers and employees from accessing


your services. Network layer attacks attempt to overwhelm the networking equipment to prevent it from maintaining a connection. A third category of DDoS attack uses a combination of the two aforementioned techniques, earning it a "hybrid attack" designation. In these instances, criminals vary and alternate their attacks to achieve their goals.

The Dos and Don'ts


According to Chris Rodriguez, senior industry analyst, network security, for Frost & Sullivan, there are a number of common mistakes companies make regarding the threats posed by DDoS attacks. One of the biggest is simply assuming that your company is not a target. "DDoS attacks are used by a range of different threat actors with widely varying motivations," says Rodriguez. "This has made it hard to predict who might be targeted in the past. But we're also seeing many more cases of DDoS attacks used by rival companies."

Rodriguez also cautions that there is no minimum size of organization for DDoS targets. "It's not just large enterprise or high-tech companies that are at risk," he says. "Other examples of attack victims include an online floral service targeted the week of Valentine's Day [and] e-commerce sites targeted during Black Friday."
To avoid becoming a statistic, Rodriguez recommends studying DDoS attack trends. A good metaphor might be to compare this process to monitoring weather forecasts, taking note of more serious conditions that indicate a coming storm. DDoS attacks can also be performed by groups or individuals with opposing ideological, political, or other views, so any high-profile public conflicts can also be a risk factor. Businesses of all sizes should track and be aware of these and other risk factors.

Don't Wait
Rodriguez suggests that now is the best time to come up with a DDoS mitigation strategy. Investing in preventive measures will cost significantly less than paying for the emergency attack mitigation services that many vendors offer. "Likely, a DDoS service provider will be able to offer a standby option that is more cost-effective than an emergency response service," he says. Your preemptive steps will also prove to be more effective in the long term, unlike emergency procedures, which provide more limited immediate benefits. Make sure to have escalating response plans. Evaluate the options early: whose responsibility is it, and who should respond? Will the ISP mitigate some attack traffic, and if so, then what kind, how much, and for how long?

Referring to an incident where a DDoS attack was followed by SQL (Structured Query Language) injection attacks, Rodriguez revealed that, despite DDoS attacks' reputation for being largely about interrupting availability, more and more cybercriminals are using them as a distraction to give data thieves an opportunity to covertly make off with private data and valuable intellectual property. Also, this particular attack lasted for over a month straight. Although these long-spanning attacks are rare, they can cause an unprecedented amount of disruption for any organization.

Pushing The Bandwidth Envelope

DDoS attack strength is often measured in terms of bandwidth, and some of the biggest attacks have involved traffic that exceeded 300Gbps (gigabits per second) and even 400Gbps. Flooding networks with this much traffic is an order of magnitude greater than typical traffic. However, despite these high figures, most DDoS attacks are much smaller and shorter-lived. But ultimately these attacks are unpredictable, and businesses should have some sort of plan for the worst.

How Much Protection Is Enough?

According to Rodriguez, the key aspects of your best DDoS defense include mitigation strategies designed to prevent or minimize the damages caused by most attacks, with some options available for the less commonplace large-scale and long-term attacks. "This is most often achieved through a hybrid solution, where an on-premises product blocks the majority of attacks, but can switch traffic to a cloud scrubbing service after a certain threshold is reached." Rodriguez also recommends deploying a DDoS strategy that includes a WAF (Web application firewall), which can help pinpoint and prevent application-layer attacks that occur during network traffic overload. "As a result, we're seeing more cross-pollination and integration between WAF and DDoS solutions."
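The threshold trigger in the hybrid approach Rodriguez describes can be sketched as a sliding-window rate monitor. The class name, labels, and numbers below are invented for illustration; in practice, diversion to a scrubbing center is carried out via DNS or BGP route changes rather than a return value:

```python
from collections import deque

class HybridDDoSSwitch:
    """Route traffic on-premises until load passes a cloud-scrubbing threshold."""

    def __init__(self, threshold_rps: float, window: float = 10.0):
        self.threshold_rps = threshold_rps   # requests/second that trips diversion
        self.window = window                 # sliding window length in seconds
        self._arrivals = deque()             # timestamps of recent requests

    def route(self, now: float) -> str:
        self._arrivals.append(now)
        while self._arrivals and self._arrivals[0] < now - self.window:
            self._arrivals.popleft()         # drop timestamps outside the window
        rate = len(self._arrivals) / self.window
        return "cloud-scrub" if rate > self.threshold_rps else "on-premises"
```

Normal traffic stays on the on-premises appliance; a burst that pushes the windowed rate past the threshold flips routing to the scrubbing service, matching the standby model Rodriguez recommends evaluating in advance.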


Fog Computing
PROVIDING A BRIDGE BETWEEN END USERS & THE CLOUD

KEY POINTS

Fog computing is a developing model for connecting devices lower to the ground as a complement to cloud computing.

Fog computing would offload processing, storage, and networking workloads occurring in cloud services and place them at the edge of the network in distributed local resources.

Reduced latency would be a primary benefit, which will be vital when factoring in the billions of connected devices the Internet of Things entails.

Fog computing would put real-time data closer to users, providing an opportunity for users to perform more relevant, time-sensitive analysis and make intelligent business adjustments.


METAPHORICALLY, THE TERM fog computing is spot on. In relation to cloud computing, which is often described as residing in the sky, fog computing sits closer to the ground and closer to users. Conceptually, fog computing also poses interesting potential that, if realized, will place data closer to users and alleviate issues the IoT (Internet of Things) is expected to produce.

The IoT presents an attractive framework for businesses and consumers alike, whether in the form of improved operational efficiencies, new revenue streams, and/or big data analytics, says Ryan Martin, Yankee Group associate analyst. For this to happen, IT infrastructures will require the ability to manage the vast amounts of information the IoT will generate. Today, most of this processing happens locally or at the cloud level, Martin says. Fog computing would change that by offering the ability to "compute, direct, process, and store data at the edge of the network, providing a virtual extension of today's cloud services," he says.

Though fog computing is very much in its early stages, work is underway to bring it to life. The following explores what fog computing entails and proposes to do, and how businesses could benefit.

See Through The Fog


Fog computing is a term and paradigm that Cisco is credited with coining and leading the charge for. Rather than replace cloud computing, fog computing is meant to complement it, primarily by offloading part of the processing, storage, and networking workloads from cloud services and placing them at the network's edge in a geographically distributed fashion. Essentially, fog computing would enable distributed local resources (routers, mobile devices, etc.) to do some heavy lifting vs. all of it happening in the cloud.

A chief benefit would be reduced latency, something particularly noteworthy when factoring in the scores of Internet-connected devices projected to exist in coming years that send data to the cloud. One estimate suggests there will be 50 billion such connected objects (including sensors used in association with manufacturing processes, medical equipment, in-vehicle systems, consumer products, etc.) by 2020. Martin says the volume of devices with embedded connectivity poised to hit the market will be commensurate with the need for high-bandwidth, low-latency data services. "Fog computing aims to alleviate some of the resulting burden on the network by adding an intelligent layer between devices and the cloud," he says.
Jet engines are commonly used as
an example to demonstrate how fog
computing could prove useful. A jet
engine can produce about 10TB of data
related to its performance and condition in just 30 minutes. Continually
transmitting that much data to the
cloud for processing and then sending
back responses is time-consuming and
taxing on bandwidth. In a fog computing scenario, chunks of processing
would happen locally to combat this
issue. Another example is a connected
thermostat that constantly logs data,
of which a fair amount might never
change but that is continually sent to
the cloud for processing. If processed
locally, however, the data that is of
little consequence could be extracted,
thus saving bandwidth.
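The thermostat case amounts to report-by-exception filtering at the network edge: forward a reading to the cloud only when it differs meaningfully from the last value sent. A minimal sketch, with the function name and tolerance invented for illustration:

```python
def report_by_exception(readings: list[float], tolerance: float = 0.5) -> list[float]:
    """Keep only the sensor readings worth forwarding to the cloud."""
    last_sent = None
    forwarded = []
    for value in readings:
        if last_sent is None or abs(value - last_sent) >= tolerance:
            forwarded.append(value)   # meaningful change: ship to the cloud
            last_sent = value         # everything else is dropped at the edge
    return forwarded
```

A thermostat hovering around a set point would forward almost nothing, while a real temperature swing still reaches the cloud promptly, which is exactly the bandwidth saving the fog model promises.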
Although cloud computing has brought IT vendors and companies various advantages, including cost savings, Faisal Ghaus, TechNavio vice


president, says cloud computing has also posed various challenges, the most pertinent being that the resources an individual requires reside in another location (the cloud). "How quickly a user can access those resources depends completely on accessible bandwidth," he says. Furthermore, bandwidth might vary depending on how many users are accessing that cloud resource and the daily activities occurring in an office operating on the same bandwidth. As more users and organizations enter the cloud framework, this challenge will increase, he says.
To counter this problem, fog computing would make data available as close as possible to users, Ghaus says, either by putting it at the network's edge as an intermediate stop between the cloud and user or placing it at various access points with the aim of reducing latency as much as possible. Fog computing could also make it more viable for mobile device users to access the cloud, he says, because it allows for select data or applications to be at the end-user point itself for faster processing.

In addition to being organizationally located below the cloud, René Buest, senior analyst at Crisp Research, describes the fog as serving as an optimized transfer medium for services and data within the cloud. "Conceptually, fog computing builds upon existing and common technologies like CDN [content delivery networks], but based on cloud technologies it should ensure the delivery of more complex services," he says.

As users' data requirements increase, Buest says, concepts to reinforce the cloud idea and empower companies and vendors to provide content over a widely spread platform will be needed. "Fog computing should help to transport the distributed data closer to the end user, and thus to decrease latency and the number of required hops, and better support mobile computing and streaming services," he says. Alongside the Internet of Things, the rising demand of users to access data at any time, from any place, and with any device is another reason why the idea of fog computing will become increasingly important.



The IoT Connection


Steven Hill, Current Analysis senior analyst, says that proposed fog computing would extend network connectivity to billions of intelligent devices without dramatically increasing the strain on the Internet while providing better access to the data the devices provide. "Like the cloud, the fog is a distributed model, but instead of using the endpoints as computing elements contributing to a greater task, fog computing focuses on reporting the data created as part of the tasks already underway on a device," Hill says. This could cover a huge range of information, he says, including metrics from industrial processes; the fuel efficiency of truck fleets; and data from browsing habits, social media use, and cell phone locations.

Beyond growing in number at an incredible rate, network-connected devices represent a wealth of data Hill says we're only now beginning to appreciate. "Most people don't realize that a huge percentage of the microprocessors made today are actually used for embedded applications rather than for computers," he says. Currently, accessibility to embedded processors is limited because they were traditionally dealt with on a vendor-by-vendor basis, he explains. Fog computing proposes an environment where most of these types of applications could instead be directed through the Internet, he adds.
From a business perspective, "the IoE [Internet of Everything] represents a relatively untapped mountain of data just waiting to be harvested for analysis, and fog computing can actually bring you much closer to these real-time data collection opportunities," Hill says. Theoretically, the fresher the data, the more relevant it would be to an organization's analytical goals. The closer the organization is to the data, the quicker it could adjust to current conditions.
Matt Hatton, Machina Research founder and CEO, believes that while there's a benefit to a fog computing-like approach of processing data on edge devices vs. sending every bit of data to the cloud, handling processing at appropriate levels is important. A connected vehicle, for example, might generate gigabytes of data per minute, some of which holds value to car makers. Here, doing a significant amount of processing and data management on the device makes sense in terms of determining what data is useful and extracting it so that "you're delivering by exception vs. delivering all available data," he says.

A particular connected medical instrument, meanwhile, might only generate a small amount of data, but that data might require serious crunching, "to the point you may want it all to go into the cloud," Hatton says. And there are cases that fall between those extremes. Hatton says there's such a variety of applications and services included with the IoT, it's nearly impossible that a variety of approaches won't be required to address them due to the volume of traffic various applications create, the varying importance of that differing data, and the speed at which organizations may need to respond to data.

Pluses & Minuses

Mike Sapien, principal analyst with Ovum, believes there are two issues fog computing can realistically address: business application performance and sustainability. "Generally, putting more application intelligence and user content at the network edge will improve performance and enhance availability of business applications if a cloud platform were down or slow," he says. Considering the long term, Sapien cites disaster recovery and allowing some data collection and applications on fog computing nodes to enable better performance, increase data available for analytics, and increase uptime as potential real-time analytics and business-specific benefits.
Buest, meanwhile, says the more
services, data, and applications that
are deployed to end users, the more
vendors must find ways to optimize
the deployment processes. This means
delivering information closer to users
and reducing latency to be prepared
for the IoT, he says. Theres no doubt
that the consumerization of IT and

BYOD [bring your own device] will


be increasing the use and, therefore,
the consumption of bandwidth, he
says. Increasingly, rich content and
data is delivered over cloud computing
platforms to the edges of the Internet
while users needs are simultaneously
growing exponentially, he says. With
the increasing use of data and cloud
services, fog computing will play a central role and help to reduce the latency
and improve the quality for the user,
he adds.
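Hatton's "delivering by exception" idea described earlier (filter at the edge, forward only readings that matter) can be sketched in a few lines. This is an illustrative sketch, not any vendor's implementation; the threshold values and telemetry format are assumptions:

```python
# Hypothetical edge-node filter: forward only out-of-range readings
# ("delivery by exception") instead of every raw data point.

def filter_exceptions(readings, low=60.0, high=90.0):
    """Return only the readings outside the expected [low, high] band."""
    return [r for r in readings if r["value"] < low or r["value"] > high]

# Simulated minute of engine-temperature telemetry from a connected vehicle.
telemetry = [
    {"t": 0, "value": 72.1},
    {"t": 1, "value": 71.8},
    {"t": 2, "value": 95.4},   # overheating spike worth reporting
    {"t": 3, "value": 72.3},
]

to_cloud = filter_exceptions(telemetry)
print(len(telemetry), "readings collected,", len(to_cloud), "sent upstream")
```

With these sample numbers, only the spike at t=2 is forwarded, cutting upstream traffic from four readings to one while the routine readings stay on the device.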
Jagdish Rebello, senior director at IHS, finds what fog computing proposes promising and says it makes sense for mission-critical and latency-sensitive applications. For a majority of applications, however, there will be a question of balancing increased costs against gains in reduced latency, he says. Further, he believes it's worth asking what fraction of applications really requires reduced latency, as many critical or time-constrained applications are probably handled right on the device itself. In other words, the relevant question is this: how many applications will benefit from reduced latency but still essentially lend themselves to a basic cloud architecture? "In my opinion, right now, that's small," Rebello says.
For Hill, the key to fog computing lies in the level of intelligence possessed by the devices at the network's edge and by the network model for communicating with those devices. "The real value to business lies in the work product provided by the endpoint devices, which makes the term fog computing somewhat more relevant," Hill says. From a typical business perspective, this comes down to the same challenges as any analytics initiative: understanding your business goals, defining the types of data most relevant, identifying the best sources for the relevant data points, and then streamlining the communications with those sources.
Currently, a key challenge for fog computing in general lies in providing greater network connectivity with embedded technologies and convincing the vendors of those embedded products to agree on a unified strategy for communicating with them, Hill says.
Although Faisal Ghaus, vice president at TechNavio, thinks it's too early to have a solid understanding of what problems could stem from fog computing, knowing exactly what data will be stored at end-user access points, or at intermediate locations end users access, will be key. Further to this, what about the security implications? "There has always been a concern about the security aspect of data being stored in a cloud, and now with fog computing, the data isn't exactly stored in a cloud but in an intermediate location, which makes the question of security even more pertinent," he says.
Ghaus adds that storing data on end-user access points also raises the question of how much storage will be required for these access points. "One of the biggest advantages cloud computing offers is doing away (to a certain extent) with storage at end-user points," he says. "However, with fog computing, storage is required again, and the amount of storage will depend on what type of applications will be stored. This would bring about an increase in cost again," he says.
Privacy is another issue, including in terms of drawing a firm line between the purely functional monitoring of things like industrial processes or public transportation systems and the collection of personal data from mobile devices, Hill says. He cites a recent social network experiment regarding the effect of news feed contents on users' emotional state as an example. Although the results of the experiment were significant and useful, the covert methodology used to collect data walked a fine line, Hill says. What's considered visibility to one person may well be intrusion to another, and the increasing use of mobile devices as data-gathering points could easily face similar backlash, he says.
Hill says the types of embedded, intelligence-gathering opportunities are multiplying at an amazing rate, and most don't present users an opportunity to opt out or even become aware of their existence. If they do, actual intent is buried in a multipage EULA (end user license agreement), he says. "Of course, this still occurs in the absence of fog computing, but you must admit that the dramatically increased connectivity offered by a foggier environment will only add to these problems," he says.

ITS TOO EARLY TO HAVE A SOLID UNDERSTANDING OF WHAT


PROBLEMS COULD STEM FROM FOG COMPUTING, KNOWING EXACTLY
WHAT DATA WILL BE STORED AT END-USER ACCESS POINTS OR AT
INTERMEDIATE LOCATIONS END USERS ACCESS WILL BE KEY.

CyberTrend / March 2015

55

Content Delivery Challenges


HOW VIDEO, MOBILE, THE CLOUD & MORE ARE INFLUENCING CDN DECISIONS

CONTENT DELIVERY NETWORKS are nothing new. For roughly two decades, CDN providers have helped speed up delivery of customers' content (software, e-commerce, media files, documentation, etc.) and distribute various services, including streaming media. Dan Rayburn, Frost & Sullivan principal analyst, says major companies across an assortment of verticals are all using CDNs. Options in the market are numerous, Rayburn says, and they're extremely stable and affordable and offer great performance and coverage. Pricing for many services, in fact, continues to decline at a 20% to 25% rate annually, he says.
Many CDN vendors today are extending their reach to differentiate themselves from competitors. Furthermore, major cloud service providers are enticing customers with their CDN offerings. This article details these trends as well as the challenges organizations using CDNs face in delivering content on a medium-to-large scale or when looking to expand their CDN abilities.
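To put that pricing trend in perspective: a steady 20% to 25% annual decline compounds, so per-unit delivery prices fall by roughly half within three years. A quick sketch (the $0.10/GB starting rate is illustrative, not actual CDN pricing):

```python
# Compound effect of a constant annual price decline on a unit CDN rate.
def price_after(start, annual_decline, years):
    """Price after `years` of a constant fractional annual decline."""
    return start * (1 - annual_decline) ** years

# Hypothetical starting rate of $0.10 per GB delivered.
for rate in (0.20, 0.25):
    p = price_after(0.10, rate, 3)
    print(f"{rate:.0%} annual decline: ${p:.4f}/GB after 3 years")
```

At a 20% annual decline the $0.10 rate drops to about $0.051 per GB after three years; at 25%, to about $0.042.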

Common Challenges
To better communicate with and engage customers, companies are adding more video and personalized content to Web pages. A challenge here, says Jim Davis, 451 Research senior analyst, is preparing that video content for delivery by a CDN. The processes for managing video assets and encoding video in various formats at a proper quality level are specific stumbling blocks. Positively, Rayburn says, "CDNs are agnostic, so they'll deliver video in any type of protocol you want. Most of the industry has already started to standardize on HTTP protocols for video delivery."

Typically, Davis says, CDN vendors partner with OVP (online video platform) providers or integrate abilities to transcode video to the various file formats that computers and mobile devices use. Rayburn says companies generally need only check a bunch of boxes to designate how they want video encoded and what they want it to play back on. From there, back-end systems transcode it into the proper file formats for delivery, he says.
Video, which is making a considerable splash on mobile devices, is one area where CDNs can prove beneficial through Web performance optimization. Mark Grannan, analyst with Forrester Research, says Web performance optimization basically leverages contextual techniques to overcome mobile latency and the fact that mobile devices are moving in and out of coverage and changing connection speeds in real time.
Philipp Karcher, Forrester Research senior analyst, says whatever the content, CDNs compress or deliver the right size or quality format to optimize the mobile device experience given the network connection. "Part of that is based on the traditional CDN technology of storage and delivering the content closer to the mobile device, and part of it is actually transforming the content itself, which is more of a newer capability for CDNs," he says. Companies aren't taking advantage of this much yet, but Forrester believes they will because mobile is increasingly becoming a tremendous priority, Karcher says.
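The "right size or quality format" decision Karcher describes can be illustrated with the kind of rendition-selection rule adaptive streaming clients use: pick the highest-quality version the measured connection can sustain, with headroom for bandwidth dips. The rendition ladder and safety margin below are made-up values for illustration, not any CDN's actual configuration:

```python
# Pick the best video rendition a connection can sustain, leaving
# headroom so playback survives small bandwidth dips.

RENDITIONS = [  # (label, required kbps) -- hypothetical ladder, best first
    ("1080p", 5000),
    ("720p", 2500),
    ("480p", 1200),
    ("240p", 400),
]

def choose_rendition(measured_kbps, margin=0.8):
    """Return the highest-quality rendition fitting the usable bandwidth."""
    usable = measured_kbps * margin
    for label, needed in RENDITIONS:
        if needed <= usable:
            return label
    return RENDITIONS[-1][0]  # worst case: lowest rung

print(choose_rendition(4000))  # strong Wi-Fi link
print(choose_rendition(900))   # weak cellular link
```

With these numbers, a 4,000Kbps link gets the 720p rendition (5,000Kbps for 1080p exceeds the 3,200Kbps usable budget) and a 900Kbps link falls back to 240p.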
Grannan dubs mobile "absolutely imperative" for commerce and says it should be a critical priority for any marketer or digital media company putting video on the Web. In terms of bounce and conversion rates, Web performance on mobile devices correlates directly with revenue growth, he says.
Elsewhere, cloud-based services consumed on workers' mobile devices can benefit from CDNs. Maintaining performance of cloud services for end users is a tough nut to crack because latency over the wireless network can be highly variable, Davis says. Numerous CDN providers offer Web performance optimization services to enhance application performance on devices, he says.

Provider Traits
To determine which CDN provider is appropriate, understanding that CDNs have different strengths for different services is key. One may excel in streaming video, while another has more advanced mobile content optimization or dynamic site acceleration services. Selecting a provider should entail marketing and IT teams first deciding how much video they'll be producing, how frequently the site's content is being changed, and to what degree the audience will be viewing content on mobile devices, Davis says.
If looking for a commodity provider, Rayburn says, there's little difference among major CDNs. Many large enterprises, in fact, split their traffic between two providers, he says. Providers attempt to distinguish themselves through value-add services (security, commerce, advertising networks, etc.). One major provider, for example, offers full-scale enterprise DDoS (distributed denial of service) attack support, Grannan says.
All CDN vendors are at least talking about security, Karcher says, whether making acquisitions in the space or not. Part of the new value proposition around security is the fact that CDNs have inherently always mitigated some security concerns by distributing the load and adding capacity and scalability in case of DDoS attacks, he says. Acquisitions are helping CDNs better monitor and manage security threats on enterprise sites, he says.

How To Choose
When eyeing vendors, ask what content the organization wants to deliver, Rayburn says. If it isn't new content, ask how it's delivered currently. If already using a CDN, ask if there are problems that require a move. Additionally, ask whom the content must reach, on what devices, and in what regions. Also understand how the vendor works with its customers and what verticals its customers address. Many large organizations change CDN providers every 18 to 24 months, Rayburn says, for price or performance reasons or to move half of their traffic to another provider.


THE LATEST PREMIUM ELECTRONICS

For Work Or Play, The GS60 Ghost Delivers Full-System Power


WWW.MSI.COM
OK, there are no wrong answers here: Are you a weekend PC gamer? Does your work require high-end graphics capabilities? Even if you answered "no" to both of these questions, read on. What does matter is that you're interested in a laptop computer that offers powerhouse performance. The GS60 Ghost Pro 4K (starting at $1,599.99) from MSI includes special features with PC gaming in mind, but its top-of-the-line components make the GS60 perfect for professional use as well. The system includes the latest 4th generation Intel Core i7 processor, up to 16GB of memory plus 3GB of graphics memory, and as much as 1TB of hard drive storage plus a 512GB SSD (solid-state drive). The GS60 has a brilliant 15.6-inch 4K UHD (Ultra High Definition) widescreen display, a GeForce GTX 970M dedicated graphics card, and a Dynaudio sound system with 7.1-channel S/PDIF output and Audio Boost technology. An intelligent fan system keeps the laptop cool. The GS60 Ghost uses the latest Wi-Fi (802.11ac) and Bluetooth (4.0) standards and includes three USB 3.0 ports and an HDMI output. The laptop weighs 4.2 pounds, features a backlit keyboard, and comes with Windows 8.1.


Portable Lightning
For Your iPhone
WWW.PNY.COM
Your iPhone 5 or 6 might pack more of a charge
than its predecessors, but that doesn't guarantee
your battery won't run low right when you need
it most. The PowerPack L3000 ($49.99) from
PNY features a retractable Apple Lightning connector and holds enough juice to deliver up to
two full charges, depending on the iPhone model.
The L3000 itself is charged using a Micro-USB
cable (included) and sports an LED battery level
indicator so you always know where you stand.
The device uses the same 5-volt, 1-amp flow for
both input and output, and charges one iPhone
at a time. The PowerPack L3000 is pocket-sized
for easy carrying and comes with a three-year
warranty. And as long as Apple continues using
the Lightning connector, the L3000 will work with
future iPhones, as well.

Looking For Some


DIY Satisfaction?
WWW.GIGABYTE.US
If you're looking for a high-power, low-profile computer and you're the slightest bit technically inclined, consider the somewhat-do-it-yourself BRIX Pro (model GB-BXi7-4770R;
starting at around $649) from Gigabyte. The
small form-factor (2.4x4.3x4.5-inch) case
comes with a choice of Intel Core i7 4770R
processors and a Wi-Fi Mini PCIe module
installed for robust performance. You choose
and install your own memory (maximum
16GB), 2.5-inch SSD or hard drive, and operating system. The case includes a Gigabit
LAN port as well as HDMI, Mini DisplayPort,
and four USB 3.0 ports for adding peripherals.
You provide the keyboard and display. Other
BRIX models with different base specs are
also available.


Smartphone Tips
ORGANIZE CONTACTS & GET SOCIAL

BLACKBERRY


Add Contacts To Your Home Screen

Since the introduction of BlackBerry 6 and through to BlackBerry OS 10, the BlackBerry Home screen has been able to contain icons for things other than apps, including Web pages and documents. One often-overlooked use for this capability is to add one of your contacts to the home screen. (Just don't let the rest of your contacts know that they didn't make the cut.)
Launch the Contacts app and highlight the contact you wish to have on your home screen. Press the Menu key and select Add to Home Screen. A dialog box will appear, with an icon for the contact and the contact's name. You can change either by tapping on it. When you're satisfied with the name and icon, tap the Add button.

Filter Your BlackBerry Contacts For Quicker Access

The Contacts app in BlackBerry 10 does an excellent job of bringing all of your contacts together into one place. You can even tap into social media contact information to access photos that you can then use as the main photo for a given contact. After you import contacts from multiple sources, however, you might find that some people have been pulled into your contact list whom you don't want in that list. This can especially be an issue if you follow numerous people via social networking services but you don't know all of them personally. One method for filtering social media contacts out of your contact list is to open Contacts, swipe down to access Settings, find Show Accounts In Contact List, and switch an account type to Off; you can hide Twitter contacts in this way, for example.
Another technique for filtering contacts in BlackBerry 10 is to limit the view to show only those contacts you typically connect with, say, via BBM (BlackBerry Messenger). With the Contacts app open, tap the Contact icon and then tap to choose the set of contacts you want to view. When you're looking at any set of contacts (whether it be all contacts, all Facebook contacts, or some other view), you can organize them further by accessing Settings and changing the option under Sort Contacts By to First Name, Last Name, or Company Name.


ANDROID

Automatically & Manually Merge Contacts

Android users can make short work of merging contacts, or combining the multiple duplicate entries for a single person that commonly occur when you import contacts from multiple sources. To automatically combine contacts, log into your Gmail account on a PC, click Contacts, click the More Actions button at the top of the screen, and then select Find & Merge Duplicates. When the action has been completed, Gmail provides you with a report of how many contacts were added and how many of the contacts were merged. Typically, a handful of contacts will not be merged, but you can manually link these from your Android device. Start by tapping the Contacts icon on the Home screen, tap a contact you want to manually merge with another, press the Menu key, tap Link Contact, begin typing the name of the second contact, and then scroll to and tap the appropriate entry on the screen. Repeat this process for each of the remaining contacts you need to manually link.

Send Annoying Calls To Your Voice Mail

We all have at least one: that person or company you don't need or want to speak with but who keeps calling throughout the day or even into the evening. You can ignore the call, and eventually the caller will either hang up or be routed to your voice mail. But why not save time (your time, that is) and send the call to your voice mail to begin with? In order to do this, the incoming caller's phone number needs to be in your contacts list, so create the contact if you haven't already. Once that's done, open the contact from your list, tap Menu, and tap Options. Place a check mark in the Incoming Calls: Send Directly To Voicemail box.

Dial Phone Number Extensions Automatically

Your Android smartphone lets you place calls directly to the extension of business colleagues, family, or friends by adding soft pauses to a contact's phone number. To set up this feature, tap the phone icon on the Home screen, select the appropriate contact from the Contacts tab, press the Menu button, and tap the green plus sign to add a new number. Input the number as you would normally, with the area code first, and then tap the More icon (three dots) at the bottom of the screen and select 2-Sec Pause. You should then see a comma following the phone number; this represents the pause. Finally, enter the extension. Now, when you dial this number, the extension will be dialed automatically after a two-second pause.
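Under the hood, the dialer simply stores the pause as part of the number string: each comma means "wait two seconds, then keep dialing the digits that follow." A small sketch of building such a dial string (the helper function and phone numbers are hypothetical, not part of Android's API):

```python
# Build a dial string in which each comma represents a two-second soft
# pause before the digits that follow it (the dialer convention the
# 2-Sec Pause option produces).

def dial_string(number, extension, pauses=1):
    """Append `pauses` two-second pauses, then the extension digits."""
    return number + "," * pauses + extension

print(dial_string("14025551234", "301"))            # one 2-second pause
print(dial_string("14025551234", "301", pauses=2))  # waits about 4 seconds
```

Dialing the resulting string rings the main line, waits through the pauses, then sends the extension digits as touch tones.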

Get All Of Your Messages,


All In One Place
With the Hangouts app, Android
smartphone users can access all of
their messages in one place. Open
the Hangouts app to access text
and MMS (Multimedia Messaging
Service) messages sent and received
via your smartphone, as well as other
Google Voice and video calls. The
idea behind the Hangouts app is to
unify the messaging screens into one
location and make it easier to keep
track of past and ongoing conversations with specific contacts. With the
app on your smartphone, you can
make free group video calls with as
many as 10 people.

Google continues to update the Hangouts app to add new


features, such as improved notifications.

Connect Your Phone With Your Social Networks

Because Android is a Google product, Android devices make it easy to link with your Google+ social networking account: just launch the Google+ app on your device and, assuming you have signed in to a Google account, the app will access that account immediately; if you are signed into multiple accounts, you can choose which Google+ account you want to use. Android also plays nice with other social networking services, but accessing the services directly is largely a "there's an app for that" affair. That is, download the app for your device, sign in, and you're good to go.
Android's social networking integration doesn't end there, however. You can, for example, view Facebook pictures and video in the Gallery app, and you can share content from the Gallery app directly to Facebook, Twitter, and other networks. You can also share voice mail messages to Facebook or Twitter, and integrate Facebook contacts with your Gmail and other contacts.


iOS

Link Similar Contact Information

Link Similar Contact Information

If you have duplicate or similar information for multiple contacts in your iPhone (say, after importing some of your Facebook contact data), you can link those contacts together. Open one of the relevant contacts, tap Edit, scroll down until you see Linked Contacts, tap Link Contacts, select the other contact you want to link, and tap Link. If you should need to unlink contacts, open one of the contacts, tap Edit, scroll down until you see Linked Contacts, tap the red circle containing a minus sign next to the contact you wish to unlink, tap the Unlink button, and tap Done.

Use Custom Vibrations To Identify Callers

Setting your iPhone to vibrate is the considerate thing to do when you're in an environment where ringtones would be distracting or disruptive. But in doing so, you lose any unique ringtones you may have set up to identify certain callers.
That doesn't have to be the case. Your iPhone allows you to set up custom vibration patterns keyed to your contact list. To turn on this feature, open Settings and tap General. Scroll down and select Accessibility. Scroll down and tap to turn on Custom Vibrations. With Custom Vibrations enabled, you can turn to your Contacts app to set the vibration pattern to use for selected individuals.
Locate an individual in the Contacts app. Tap the Edit button and scroll until you see Vibrations. You will find a list of standard vibrations you can assign by tapping them. You can also create a new vibration pattern by tapping Create New Vibration. To record your new vibration, simply start tapping the iPhone's screen with the pattern you want to use. When you're happy with the pattern, tap the Stop button. You'll be asked to give the new pattern a name. Once it has a name, the pattern will be added to the list of available vibrations and automatically assigned to the individual you were editing. You can repeat this process for any additional contacts for whom you want to assign custom vibrations.
(Caption: Setting your iPhone to vibrate doesn't mean you'll never know who's calling. If you enable Custom Vibrations, you can set the vibration pattern to use for select friends and colleagues.)

Send A Caller To Voicemail

Here's a quick one. If you have your iPhone with you but you aren't using it (say, when attending a meeting or having a face-to-face conversation) and you notice the buzzing of an incoming call, you can quickly press your phone's Wake button twice to stop the vibrating and send the call directly to your voicemail service.

So When Did You Send That Text Message?

Recent versions of Apple's iOS have eliminated the persistent time stamp alongside texts in the Messages app, but that doesn't mean the time stamps have disappeared entirely. When a string of texts is open in the Messages app, swipe to the left and don't let go; doing this reveals the dates and times that messages were sent and received.


Block Calls, Texts & FaceTime

Accessing the block list on iOS 7 or iOS 8 requires only a few simple steps. Access Settings, scroll down, and tap Phone, Messages, or FaceTime. (It doesn't matter which option you select; you won't receive calls, messages, or FaceTime connections from any contact you add to the Blocked list.) Next, tap Blocked in any of the three categories and choose Add New to select whom to add to your Blocked list.

Add A Photo To A Contact


To put a face with the name, open Contacts, find the appropriate contact, and
tap Edit. Tap the Add Photo circle in the top left portion of the screen. Tap Choose
Photo if you have a photo of the contact already stored on your iPhone, or tap Take
Photo to snap a picture on the spot. If you have a print photo, you can snap that.

WINDOWS PHONE
Set Up New Email Accounts

Moving your email activity from one device to a new Windows Phone 8 smartphone is as simple as adding the email account information to your new smartphone. Initial phone setup involves adding a Microsoft account, but if you still need to add one (or would like to add another), access Settings, tap Email + Accounts, tap Add An Account, choose Microsoft Account, tap Next, and enter your email address and password to sign in. To choose any other type of account, access Settings, tap Email + Accounts, tap Add An Account, choose the relevant account type (Outlook, Google, or Yahoo), and follow the on-screen instructions.

Combine Duplicate Contacts

Windows Phone 8 does a great job of collecting all of your contacts into one place so they're easy to search for. If you find you have multiple instances of the same contact, however, drawn in from various email and social media accounts, you can use the Link feature to combine them on your smartphone. Find one of the contact instances, tap the Link icon, and then either select another contact instance to connect it to or tap Choose A Contact to find the correct contact to link.

Define Your Inner Circle

You can use Cortana to define quiet hours, or times you would prefer not to be notified if someone is contacting you. To activate the feature on the fly or set up a schedule, access Cortana's Notebook, tap Quiet Hours, and adjust the settings. While you're using the Notebook, also tap Inner Circle. This is where you can establish which contacts you would like to hear from, even during quiet hours. After tapping Inner Circle, simply tap the Add icon and select a contact to add, and then repeat as necessary.

Add Social Networking Accounts To Your Windows Phone

Windows Phone integrates social media functionality into multiple areas of the operating system, primarily in the People hub but also (depending on the social network) in the Photos hub, the Me card, and elsewhere. To add a Facebook account to your phone, tap Settings, Email + Accounts, Add An Account, and Facebook; enter the required login information; tap Sign In; and, if you would like to link your Facebook and Microsoft accounts, tap Connect. There are similar steps for adding a LinkedIn or Twitter account: tap Settings, Email + Accounts, LinkedIn or Twitter, and Connect; you will then be redirected to a mobile Web page to log in.

Need to add a social media or other type of account to


your Windows Phone 8 smartphone? It may not seem
intuitive, but the way to add such an account is to go
through Settings and Email + Accounts.

Get Audible Email Notifications

Waiting for a phone call, but don't want to keep looking at your phone? One alternative that Windows Phone 8 provides is the option to set a specific sound that will play when an email arrives. To set this up, access Settings, tap Ringtones + Sounds, and tap Sound in the New Email section. Then simply select the notification sound you want to attach to incoming emails.

Transfer Contact & Calendar Info

If you are switching from a non-Windows Phone smartphone to a new Windows Phone 8 smartphone, there are different ways to transfer your contact and calendar data. If you use Outlook.com, Google, or another major service, you can set up your new phone to work with that service; access Settings, tap Email + Accounts, tap Add An Account, choose the type of account, and follow the on-screen instructions to complete the procedure. Or, if you have transferred a SIM card from your previous phone to your new Windows Phone 8 smartphone, you can tap Start, People, More, Settings, and Import SIM Contacts, and then follow the on-screen instructions to import all contacts or only specific contacts; calendar information cannot be transferred using this method. If those options don't work for you, Microsoft offers a Windows Phone Sync wizard (bit.ly/1k8JJmu) to walk you through an alternate process.


Rootkit Attacks
WHAT TO DO TO FIGHT BACK

EVEN SEEING THE WORD rootkit can send shivers up the spine of someone who has suffered through the inconvenience and damage a rootkit can exact. According to Dan Olds, principal at Gabriel Consulting Group, rootkits are some of the most insidious and dangerous pieces of malware out there today. That's because rootkits are extremely difficult both to detect and to get rid of completely. Therefore, the more you know about rootkits, the better.

What Is A Rootkit?
A rootkit is software that infects and gains privileged access to a computer. This means it can perform administrator-level tasks, says Michela Menting, practice director with ABI Research. The primary feature is that it can hide itself in the system and remain undetected.
One way to think of how a rootkit wreaks havoc, says Jim O'Gorman, an instructor of offensive security measures, is to envision that you are driving a car but someone else is intercepting all your movements and deciding if he should pass them on to the car or not. "In some cases, he might decide to just insert some of his own commands, as well," O'Gorman says.
Although rootkits are similar to viruses or Trojans, says Chris Hadnagy, a security training professional, viruses and Trojans usually delete data, stop services, or cause harm, while a rootkit provides an attacker system access to get at data. Not all rootkits are malicious (a company might install one to remotely access and control employee computers, for example); however, Menting says they are extremely popular with malicious hackers and cyber criminals, which is why they have such a negative connotation.

The Damage Done

Essentially, rootkits give an attacker free rein to perform any task desired, including installing software; deleting files; modifying programs; transmitting data; and using spyware to steal credit card numbers, passwords, keystrokes, and more. A rootkit's ability to modify existing programs and processes, says Menting, enables it to avoid detection by security software that would normally catch such software.

NOT ALL ROOTKITS ARE MALICIOUS, HOWEVER, ABI RESEARCH'S MICHELA MENTING
SAYS THEY ARE EXTREMELY POPULAR WITH
MALICIOUS HACKERS AND CYBER CRIMINALS,
WHICH IS WHY THEY HAVE SUCH A NEGATIVE
CONNOTATION.

"There really aren't any limits to how much damage it can do to a PC," Olds says. "It can delete data files and then rewrite gibberish on the hard drive to ensure that the data can't be recovered, or it can quietly work in the background and log user keystrokes, eventually capturing workplace, e-commerce, or banking usernames and passwords." Ultimately, a rootkit can route that data to a hacker to plunder accounts or gain access to a corporate network, Olds says.
Beyond software-based rootkits there are hardware-based rootkits, says Hadnagy. These, like software rootkits, give the attacker full admin access to a machine, compromising everything on it and even, at times, the network it's connected to, he says. For users, O'Gorman says, a rootkit destroys all trust in the computer. "You can't know what is private, what is not. All integrity is gone."

How You'll Know

There are several ways a rootkit can find its way into a computer. A downloaded program file a user believes to be legitimate, for example, may have a rootkit embedded within it. Menting says rootkits generally enter a system through existing vulnerabilities and are loaded by malware, which can infect computers via downloads, email attachments disguised as genuine communication or documents, websites with unpatched vulnerabilities, USB thumb drives, or mobile devices.
To the average user, abnormal computer behavior is the best indicator a rootkit might be present; warning signs include files spontaneously disappearing or appearing, a sluggish Internet connection, and slow-loading programs. Such behavior can indicate other programs are running in the background. Menting advises checking the Task Manager to detect which applications or processes are running and using significant memory. "For the non-tech user, it may be difficult to understand," she says. But users should familiarize themselves with how their Task Manager looks when it's running on a clean system so that when it actually is infected, the user can spot some differences when looking at the tasks.

"Unfortunately, the likelihood of being hacked or unwittingly downloading malware on a computer is extremely high. Especially in the network-connected environment of a company, even if you take all precautions necessary, someone else may not have and you get a virus from them internally."
MICHELA MENTING
Practice Director : ABI Research
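Menting's Task Manager advice boils down to knowing what normally runs on a clean machine. As a purely illustrative sketch (ours, not a tool the experts above name, and Linux-only since it reads /proc), the same kind of baseline can be captured programmatically by listing processes sorted by memory use:

```python
# Sketch: list the top memory-consuming processes on Linux by reading /proc,
# a rough stand-in for eyeballing Task Manager on a clean system.
import os

def top_processes(n=5):
    if not os.path.isdir("/proc"):  # non-Linux system: nothing to scan
        return []
    procs = []
    for pid in filter(str.isdigit, os.listdir("/proc")):
        try:
            with open(f"/proc/{pid}/status") as f:
                fields = dict(line.split(":", 1) for line in f if ":" in line)
            # VmRSS is resident memory in kB; kernel threads omit it entirely
            rss_kb = int(fields.get("VmRSS", "0 kB").split()[0])
            procs.append((fields["Name"].strip(), rss_kb))
        except (FileNotFoundError, PermissionError, ValueError, KeyError):
            continue  # process exited or is inaccessible; skip it
    return sorted(procs, key=lambda p: p[1], reverse=True)[:n]
```

Run once on a known-clean system, save the output, and compare it against later runs when something feels off.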
That said, detecting a rootkit is still generally difficult. This is due to how adept rootkits are at installing themselves and hiding their presence in a way that is virtually undetectable by your system software, Olds says. In this case, the only way to find the rootkit is to boot the system using a CD/DVD or thumb drive that has special diagnostic routines designed to find and remove rootkits. Hadnagy says if a system's OS is compromised, it can't be trusted to find flaws in itself. In this event, it may be necessary to boot a self-contained OS running from a CD/DVD or USB drive and run malware detection and removal software from a clean environment.
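Scanning a suspect disk from separately booted media often comes down to comparing files against known-good copies. One low-tech building block, sketched here as our own illustration (not a technique the experts above prescribe), is a checksum baseline recorded while the system is known clean; modified system files then show up as changed hashes on a later pass:

```python
# Sketch: record SHA-256 checksums of files while a system is known clean;
# re-run later (ideally from a clean, separately booted OS) and compare.
import hashlib

def sha256_of(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Hash in chunks so large files don't have to fit in memory.
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def baseline(paths):
    # Map each file path to its digest; diff two baselines to spot changes.
    return {p: sha256_of(p) for p in paths}
```

This only helps if the baseline itself is stored somewhere the compromised machine cannot alter, such as removable media.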

What To Do
For typical users, arguably the worst news concerning rootkits is that getting rid of one can be beyond their scope. Olds says, in fact, most users should probably seek an expert's help if they suspect a rootkit infection. Though some security programs can detect and remove specific rootkits, Menting says, there are so many variants that it can be impossible to detect and remove them all. Often, she says, getting rid of a rootkit requires a radical solution.
If a user suspects a rootkit, he should first disconnect the system from the Internet to cut off possible remote access and prevent data from leaking, Menting says. Next, remove data from the infected computer and scan it for malware on another device. (Menting notes that if the data contains unknown, or zero-day, malware, this step may not guarantee the malware is eradicated.) Finally, the computer should be purged: wipe the hard drive and reinstall everything, she says. O'Gorman, in fact, says starting over is the only real solution, because "really, you can't trust cleanup methods, as you are never really sure if they worked."

How To Protect Yourself

The first defense against rootkits (and malware in general) is keeping the operating system and all software, especially security software, up-to-date and fully patched. Completely relying on antivirus software is a mistake, however. As O'Gorman says, there's always a lag between the time a new threat pops up and the point at which antivirus software can detect it. The best way to avoid issues is to not engage in risky activities, he says. "Run trustworthy, current software that's kept patched. Don't go to shady sites with out-of-date browsers and plug-ins. Don't run software that doesn't come from trustworthy sources."
"Unfortunately, the likelihood of being hacked or unwittingly downloading malware on a computer is extremely high," Menting says. "Especially in the network-connected environment of a company, even if you take all precautions necessary, someone else may not have and you get a virus from them internally."
Menting suggests using different passwords for all logins, encrypting sensitive and confidential data, staying constantly on the lookout for odd system behaviors, and securing mobile devices, particularly if they are connected to a company network or business computer.
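Menting's first suggestion, a different password for every login, is easy to automate. A password manager does this for you, but as a minimal sketch using only Python's standard library:

```python
# Sketch: generate a distinct random password per login, in the spirit of
# "use different passwords for all logins." secrets uses a CSPRNG, which
# is the right source for security-sensitive randomness (unlike random).
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + string.punctuation

def make_password(length=16):
    return "".join(secrets.choice(ALPHABET) for _ in range(length))
```

Generating a fresh value per site means one leaked credential no longer unlocks every account.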


Mobile Data Best Practices

SYNC & BACKUP OPTIONS FOR YOUR TRAVELS

THE THEFT OR LOSS OF a laptop, tablet, smartphone, or other mobile device ranks among the worst productivity catastrophes that can befall a traveling professional. For all intents and purposes, our devices are our offices when we travel, and losing them disrupts our ability to work and communicate. There is an obvious financial hit associated with the loss of hardware, but there is a potentially greater hit that occurs in the loss of corporate data. It's important, then, to know where your data is at all times, so in the event that you no longer have access to your devices, you'll know what is lost and what is accessible elsewhere. And, if you follow a few mobile best practices, you'll never have to worry about losing much data at all, if any.

Know What Gets Backed Up Automatically
Depending on your smartphone's or tablet's OS (operating system), a certain amount of device data automatically gets backed up on a regular basis. If you use a USB cable to directly sync your iPhone or iPad with your computer, for example, the sync process backs up all of the OS and app data stored on that device; there is an option to encrypt and password-protect the backed-up data, too. If you use the iCloud service with your iOS device, specific sets of data will automatically be backed up in the background as long as your device has a Wi-Fi Internet connection, is plugged in to a power source, and has a locked screen; backed-up data can include camera roll images, documents, audio, and settings, depending on the options you choose.
Android users can manage incremental backups for apps and device settings by signing into the associated Google Account from their smartphones or tablets. The Android Auto Sync feature routinely syncs in the background; how and what it syncs partly depends on the options you choose, but by default the feature backs up OS data, contact information, documents, and select app data (such as Facebook and Twitter).
If you have a device running one of the latest versions of Windows Phone, you can sync documents stored on your device with Microsoft's OneDrive cloud storage solution; you can also retrieve documents from OneDrive that were uploaded from a different source. To sync all of the photos, audio files, and videos stored on your Windows Phone device, you must install Microsoft's Zune software on your computer and connect the mobile device to the computer via USB.

IF YOU'RE RELUCTANT TO SYNC KEY DATA TO A CLOUD BACKUP OR STORAGE SERVICE ON A REGULAR BASIS, CONSIDER USING AN ALTERNATIVE CLOUD SOLUTION . . .

Don't Forget Your App Data
App data encompasses a broad range of digital information, but in our context it means third-party apps and the content you create using those apps. Consider, for instance, note-taking services that exist as both cloud services (where all of the information associated with those services is stored in the cloud) and applications (where your app-related information is stored locally). As you take notes with the app, it stores those notes locally and in the cloud simultaneously and in real time. Such an app-service combination is different from a note-taking app that does not have an associated cloud service; with this type of app, everything you add is stored only on the device and is therefore vulnerable to loss. Make sure you know how your apps work so you don't get caught unawares.
Also keep in mind that some apps are more flexible than others. Apple's Notes app in iOS, for example, can keep your notes on the device only or on both the device and in the cloud, depending on how you set it up.

Be Careful When Traveling
If you travel frequently, you probably have quite a few travel-related routines. When it comes to keeping all of your data intact, though, it's important to remember that travel disrupts the routines you've established at the office. For example, if you regularly sync your tablet and smartphone with your computer but typically leave the computer behind when traveling, the backup that otherwise occurs with every physical sync won't take place during your travels. If you keep that sort of thing in mind while traveling, you will remain aware of what data resides in the danger zone (i.e., stored on your device but not backed up anywhere else) in the event your device gets lost or stolen.

Use Cloud Services, At Least Temporarily
If you're reluctant to sync key data to a cloud backup or storage service on a regular basis, consider using an alternative cloud solution, at least temporarily, to meet specific requirements while traveling. For example, you could set up an account with a major online storage provider to use with only a handful of files that are necessary for a specific trip. Providers offering this type of service typically also offer a mobile app that makes the service more useful on your mobile device. And some major storage services also sync with productivity apps you might already have installed on your devices.
Another stop-gap alternative is to use a Web-based email service to email documents to and from a corporate account. Doing this ensures that a copy of the document is maintained on the corporate network even after you delete the associated email from the Web email account.

Physical Backup
Finally, you can't sync a certain amount of valuable device data to the cloud (or to your main computer via the cloud), so be sure to back up that data as often as possible to a second device (such as a laptop) or storage solution (such as a microSD card or portable hard drive).

To locate the Storage & Backup menu on your iOS 7 device, tap Settings, iCloud, and Storage & Backup (choose Backup in iOS 8). From this screen you can view available storage and switch iCloud Backup off and on.

You can customize which apps are backed up in iCloud by toggling the ON/OFF button next to each app. Be sure to activate the Find My iPad feature in case you need to locate a lost iOS device.
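The "second device" advice above can be automated. Here is a minimal sketch (the source and destination paths and the last-run timestamp are placeholders you would supply) that copies anything modified since the previous backup to a second drive:

```python
# Sketch of a "physical backup" pass: copy any file modified since the last
# run from a device folder to a second drive, preserving folder structure.
import os
import shutil

def backup_new_files(src, dst, since):
    """Copy files under src modified after `since` (a Unix timestamp) to dst."""
    copied = []
    for root, _dirs, files in os.walk(src):
        for name in files:
            path = os.path.join(root, name)
            if os.path.getmtime(path) > since:
                rel = os.path.relpath(path, src)
                target = os.path.join(dst, rel)
                os.makedirs(os.path.dirname(target), exist_ok=True)
                shutil.copy2(path, target)  # copy2 preserves timestamps
                copied.append(rel)
    return copied
```

Recording the time of each run and passing it as `since` on the next one keeps the copy incremental rather than full.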


Quick Cloud Collaboration

KEEPS PROJECTS IN SYNC

AS THE NUMBER OF EMPLOYEES doing business outside the walls of the traditional office environment increases, companies of all sizes are adopting new ways of getting work done. Namely, they're moving toward more flexible, efficient cloud-based services. Although the purposes of online SaaS (software as a service) options vary, users are taking advantage of seamless conferencing, file sharing, idea generating, and much more. Read on to find a service that suits your collaborative needs.

Take Documents Offline

It seems inevitable that wireless Internet availability determines when and where you edit online documents while you are on the road. But with the help of the right device-specific offline app, you don't have to postpone work until you are within range of a Wi-Fi hotspot. Some basic apps primarily let you read docs offline, whereas more feature-packed options let you edit and save changes to collaborative documents, spreadsheets, and presentations. Microsoft, for instance, provides a solution for offline workers via Office 365's (products.office.com/en-US/business) SharePoint Online. Using the program's MySite tool, you can create copies of documents on your PC and work on them when you are offline. Then, when you connect to the cloud again, SharePoint automatically syncs your work.

Don't Forget Your Webcam
Collaboration is accomplished on an international level these days, which means that face-to-face conversations with globetrotting team members are commonly conducted via LCD touchscreens. Whether you're working on a smartphone, tablet, laptop, or PC, using your webcam as a collaboration tool connects you to colleagues and clients more intimately than the routine conference call. We suggest using a videoconferencing app or software that supports multiuser conversations. Some options let you incorporate shared whiteboards and simultaneous document editing.

. . . USING YOUR WEBCAM AS A COLLABORATION TOOL CONNECTS YOU TO COLLEAGUES AND CLIENTS MORE INTIMATELY THAN THE ROUTINE CONFERENCE CALL.

Consider Using File-Sharing Tools
If you need to share documents that don't contain particularly sensitive data, you can do so using a file-sharing service. Most file-sharing services let you securely upload and store a limited number of gigabytes (2 to 5GB is common) of data. Some services also give you the tools to organize your files. Sharing from your mobile device makes on-the-go collaboration convenient, so it's beneficial to check out file-sharing apps appropriate for your device.

Consider Online Productivity Tools
A plethora of Web apps fall under the umbrella of productivity, but in no way is that a bad thing because there is an app for practically every task, priority, project, and goal. For instance, you can use project management tools to juggle deadlines, manage to-do lists, track workflows, and more. Adding to these capabilities, Microsoft Office 365 gives team members shared access to master documents via user-created intranet sites, so they can edit in real time and manage file access among customers and partners.

Accomplish More With Web Apps That Combine Different Capabilities
Multitaskers take note: Not only can you collaborate with more team members in the cloud than ever before, but you can also complete more tasks within the same service. Want to walk your team through a live slideshow from a presentation sharing service? No problem. Need to create flow diagrams and share relevant images with your colleagues online? There's a service for that. And, if your team and a third-party developer are working on a website, for example, you can work together in a virtual space where anyone can add comments, crop images, and more.

With a cloud service such as Microsoft Office 365, you can co-author Word documents, Excel sheets, and other files with colleagues. Unlike traditional Office products, you don't have to save a separate version for yourself or wait until another person closes the file.

Use Whiteboards
When you can't meet in person, members of your virtual team can interact and brainstorm on full-featured online whiteboards. Browser-based whiteboards typically let you invite meeting participants to create and sketch on the same board. A number of whiteboard apps also support real-time collaboration in which everyone in the session is an equal participant. This is a good tool for tablet users who want to share ideas on the go but need input from others.

Manage Time & Tasks
Organizing schedules and all the associated meetings, deadlines, projects, and so forth can become a daunting task. Among the available cloud-based sites and mobile device apps, you can find apps and services that will help you manage your work life. Consider utilizing event-based planners, group-oriented reminder apps, services for meeting coordination, and visual to-do lists to keep your busy life on track.

If you're a Windows Phone user, you can easily access Office 365 apps from your device. Specifically, you can start a new OneNote page, create a new Office document, or edit files saved in SharePoint.

Print Documents
When you need to print content from your mobile device, you can use one of many available apps to print documents to supported printers anywhere in the world. For example, if you are working on a presentation on your tablet while traveling and need to distribute copies to colleagues, you can print the presentation to a printer in your main office. Some mobile printing apps let you search a directory for nearby printers (such as those in hotels or airports) or locate a printer via GPS, so if you need to print a boarding pass or other content from your device while traveling, you can do that, too. Some cloud-based printing apps and services also provide the option to print by sending an email attachment to a supported printer, or to print documents saved in an online storage service.


PowerPoint Tips
ADD CHARTS & GRAPHS TO YOUR PRESENTATION

THE SCENARIO IS FAMILIAR to traveling professionals: your PowerPoint presentation is all set when new and relevant information comes to light and must be added. If you're on the road or in the sky and find yourself having to add charts or graphs to a PowerPoint 2013 presentation, this article will help. We include tips designed for PowerPoint novices and adept PowerPoint users seeking specific chart-making advice.

Create A Basic Chart Or Graph

To insert a colorful chart or graph illustration into your PowerPoint presentation, locate the Insert tab and select Chart. Next, look through the available chart types, select the design that best represents the information you want to share, and click OK. A Microsoft Excel chart will open with placeholder text and figures you can replace with relevant data. When you finish entering your data, close the spreadsheet to see the completed chart in the slide.


Save & Manage Chart Templates
If you want to adjust the look of an existing chart, click the chart in the PowerPoint slide and the Chart Tools contextual tab appears. Keep in mind that Chart Tools will only appear when you select a chart. Open the Design tab and you can manipulate the overall layout and style of your chart via the options in the Chart Layouts and Chart Styles panes. When you've fashioned a chart you'd like to reuse, click File, Save As, and choose the location in which you want to save the slide. In the Save As dialog box, type a name for the template in the File Name field, select PowerPoint Template from the Save As Type drop-down menu, and click Save.

Highlight Important Data In A Chart
Whether you're presenting numerous charts or need to add emphasis to specific data within a chart, sometimes it's beneficial to call out key points. Locate the Drawing pane in the Home tab and expand the Shapes menu. Select a shape that is appropriate for emphasizing information in your chart, and then click anywhere in the chart to place the shape. To customize the shape, select it and click the Shape Fill, Shape Outline, Shape Effects, and Quick Styles options in the Drawing pane.

Insert A Chart Linked To Excel
If you're used to working in Excel and prefer to construct the skeleton of your chart first, you can use Excel to compile data and create a chart for use in PowerPoint. Start by entering values in an Excel workbook. Highlight all necessary data cells, click Insert, and apply a preferred chart style in the Charts pane. Next, select the newly created chart and click Copy in the Home tab. Open a current or new PowerPoint slide and find the Clipboard pane on the Home tab. Click the Paste drop-down arrow and choose Keep Source Formatting & Link Data (to maintain the appearance of the Excel file) or Use Destination Theme & Link Data (to match the chart appearance with the presentation).

Microsoft PowerPoint's Design tab in the Chart Tools contextual tab lets you modify the layout of your chart and adjust its style. These settings help you create one-of-a-kind charts and graphs that illuminate important statistics or values.

Edit & Add Labels

A chart that includes a lot of numbers or a detailed legend may require some editing, especially because you want it to look polished for presentation purposes. These fine-tuning tools are located in the Add Chart Element drop-down menu in the Chart Layouts pane of the Design tab on the Chart Tools contextual tab. If you notice that your chart is missing a title, you can add one by selecting Chart Title and clicking Centered Overlay Title or Above Chart; this displays a title at the top of the chart. You can browse the remaining label options to add axis titles, insert legend variations, and manipulate data.

Adjust Style & Text
To put the finishing touches of color and contrast on a chart, start by clicking the Format tab in the Chart Tools contextual tab. You can enhance backgrounds, category shapes, and 3D chart elements when you use options on the Shape Styles pane for each feature. Options on the WordArt Styles pane let you apply fill colors, outlines, and effects to chart text. To view every part of your chart (such as depth, floor, horizontal axis, side wall, and so on), click the drop-down arrow at the top of the Current Selection pane.

You can outline a graphical element, change its color, and add unique effects to a chart or graph all within PowerPoint. In addition, applying WordArt Styles will change the fill color, shade, and outline of selected text.

Modify Data In An Existing Chart
Regardless of whether you created your chart in Excel or PowerPoint, you should be able to modify data without much hassle. In PowerPoint, click the chart you intend to change and select the Design tab in the Chart Tools contextual tab. Next, click Edit Data in the Data pane. Excel opens the data sheet in a new window, and from here you can click and edit individual cells. Simply closing the Excel file will refresh and save the new content.

Add Animation
If you want to emphasize a particular data group, you can add animations to a graph or chart. Under the Animations tab, the Animation pane has a variety of animations you can apply to a chart. Explore extra effects by expanding the pane and clicking More Entrance Effects, More Emphasis Effects, or More Exit Effects at the bottom of the menu. To stagger the animation of individual objects, click Effect Options in the Animations pane and select one of the following functions: As One Object, By Series, By Category, By Element In Series, or By Element In Category.


