
NOVEMBER 2014 VOL. 12 ISS. 11
CYBERTREND.COM

PC Today is Now CyberTrend


Upgrading to give you a better HIGH-TECH EXPERIENCE.

CUTTING EDGE SECURITY
NEW TOOLS TO PROTECT YOUR COMPANY'S DATA


Volume 12 : Issue 11 : November 2014

8 COVER STORY
This month we cover security in the cloud,
firewalls today, and risk management ROI

36 COMMUNICATIONS
Types of hosted communications offerings
commonly available today

52 NETWORKING
How mobile devices affect the corporate
network

16 BUSINESS
This month's focus is CA Technologies and its
cloud, mainframe, and mobility offerings

40 DATA
A "big data" executive overview and a look
at data lakes: what they are and why they
matter

56 ELECTRONICS
The latest premium consumer electronics and
a look at trends in artificial intelligence

24 CLOUD
Introduction to managed cloud services
27 MOBILITY
Mobile collaboration, what constitutes a
tablet, storage for new laptops, and mobile
Web app pros and cons

CONTACT US
P.O. Box 82545
Lincoln, NE 68501
or
120 W. Harvest Drive
Lincoln, NE 68521

44 ENERGY
The latest news and research into energy-conscious tech

60 TIPS
Smartphone and business travel tips

46 IT
CyrusOne's high-speed fiber network for
colocation, and how better servers improve
business

Advertising: (800) 247-4880


Fax: (402) 479-2104
Circulation: (800) 334-7458
Fax: (402) 479-2123
www.cybertrend.com
email: feedback@cybertrend.com

Copyright 2014 by Sandhills Publishing Company. CyberTrend™ is a trademark of Sandhills Publishing Company. All rights reserved.
Reproduction of material appearing in CyberTrend™ is strictly prohibited without written permission.

Organizations Becoming More Aware Of Threats
Mobile, cloud, social, and information will spur security spending through at least next year, according to new research from Gartner. "This Nexus of Forces is impacting security in terms of new vulnerabilities," says Gartner research director Lawrence Pingree. "It is also creating new opportunities to improve effectiveness, particularly as a result of better understanding security threats by using contextual information and other security intelligence."
Gartner research shows that worldwide spending on information security will top $71.1 billion this year, up 7.9% from last year; Gartner expects another 8.2% growth in spending next year. Companies are investing most in data loss prevention, which will see spending increase 18.9% this year, according to Gartner's forecast.

Does Your Organization Stand Behind BYOD?

With corporate data residing on more and more devices, including those purchased and used by employees through a BYOD (bring your own device) program, those devices aren't always being included in backup policies and procedures. A recent survey by the Enterprise Strategy Group found that 54% of organizations with formal policies for endpoint backup either don't include employee-owned devices or aren't sure if those devices are protected. Why? Here are some of the most common reasons, or what ESG analyst Jason Buffington refers to as "excuses."

45% - Not A Strategic Priority At This Time
38% - No Concerns About Data Loss
31% - Budget Constraints
27% - Too Much Data To Protect
26% - Burden On IT Of Data Retrieval/Recovery Process
22% - No Compliance Requirements

Dell'Oro Group: SDN Exiting The Hype Phase

The market for SDN (software-defined networking) equipment will grow more than 65% this year, according to Dell'Oro Group. Most of that will be in network security appliances and Ethernet switches. "By 2020, data centers will look significantly different," says Alan Weckel, vice president at Dell'Oro Group. "How users access data centers will be forever changed, as will the way enterprises deploy and manage them and the vendors that operate in this market." Weckel says the evolution toward 10GbE in enterprise and 25GbE in cloud is an opportunity for data center vendors and providers to garner increased IT spending.

Growth To Be Stunted By Data Center Skill Insufficiencies

Organizations may start to see a lag in growth in the next two years due to a lack of availability of data center skills, according to recent Gartner research. What's missing in this sector are capacity planning and performance management skills within IT infrastructure and operations departments. Among the specific actions Gartner recommends for I&O (infrastructure and operations) professionals: develop demand-shaping techniques to prevent performance overloads. Additionally, I&O workers can brush up on their knowledge of operational analytics tools and prepare for the possibilities of big data.

BYOD? Make Sure Secure Methods Are In Place

When it comes to BYOD (bring your own device), it's not just the devices that can pose security risks. New data from Gartner shows that through next year, more than 75% of mobile applications will fail basic security tests. To a large degree, the applications employees download from app stores have little to no security assurances, exposing them to attacks and violating security policies. The solution, according to Dionisio Zumerle, principal research analyst at Gartner, includes adopting methods and technologies for mobile application security testing and risk assurance.

November 2014 / www.cybertrend.com

Big Data Myths Skew Enterprise Infrastructure Strategies

IT decision makers are operating under some common illusions when it comes to understanding the potential of big data. According to Gartner, there are five primary myths that are obfuscating the facts about big data. "Big data offers big opportunities, but poses even bigger challenges. Its sheer volume doesn't solve the problems inherent in all data," says Alexander Linden, research director at Gartner. "IT leaders need to cut through the hype and confusion, and base their actions on known facts and business-driven outcomes."

MYTH 1 - EVERYONE IS AHEAD OF US IN ADOPTING BIG DATA
Although approximately 13% of respondents in a big data adoption survey indicated they are deploying big data this year, almost 25% of those surveyed say they have no plans to invest at this time.

MYTH 2 - WE HAVE SO MUCH DATA, WE DON'T NEED TO WORRY ABOUT EVERY LITTLE DATA FLAW
IT decision makers shouldn't assume that minor data flaws don't impact the large mass of data in their organization.

MYTH 3 - BIG DATA TECHNOLOGY WILL ELIMINATE THE NEED FOR DATA INTEGRATION
Gartner says many people believe big data lets organizations use a "schema on read" on-demand approach to information processing, but in reality, users will rely on a "schema on write" approach that allows them to agree "about the integrity of data and how it relates to the [data] scenarios."

MYTH 4 - IT'S POINTLESS USING A DATA WAREHOUSE FOR ADVANCED ANALYTICS
Data refining will become an essential part of analysis as information managers realize that analytics projects will require the use of a data warehouse during the analysis, according to Gartner.

MYTH 5 - DATA LAKES WILL REPLACE THE DATA WAREHOUSE
Data warehouses already have the capabilities to support a broad variety of users throughout an organization. "IM [information management] leaders don't have to wait for data lakes to catch up," says Nick Heudecker, research director at Gartner.
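The schema-on-read vs. schema-on-write distinction in Myth 3 can be sketched in a few lines of Python (a hypothetical illustration; the field names, types, and record format are made up, not any particular platform's API):

```python
import json

# Schema-on-write: validate records against an agreed schema before storing.
SCHEMA = {"user_id": int, "amount": float}

def write_with_schema(record, store):
    """Reject records that do not match the agreed schema (schema on write)."""
    for field, ftype in SCHEMA.items():
        if field not in record or not isinstance(record[field], ftype):
            raise ValueError(f"record violates schema: {field}")
    store.append(record)

# Schema-on-read: store raw records as-is; impose structure only at query time.
def read_with_schema(raw_store):
    """Interpret raw JSON lines at query time (schema on read)."""
    for line in raw_store:
        rec = json.loads(line)
        # Structure is imposed here; malformed records surface at read time.
        yield {"user_id": int(rec.get("user_id", -1)),
               "amount": float(rec.get("amount", 0.0))}

store = []
write_with_schema({"user_id": 1, "amount": 9.5}, store)  # conforms, so accepted

raw = ['{"user_id": "2", "amount": "3.25"}']  # stored untyped, parsed on read
print(list(read_with_schema(raw)))  # → [{'user_id': 2, 'amount': 3.25}]
```

The tradeoff Gartner describes shows up directly: schema-on-write catches bad data at ingest, while schema-on-read defers that cost (and any disagreements about data integrity) to every consumer of the data.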

The Internet Of Things Will Change Security Plans

Business use cases for IoT (Internet of Things) devices already exist, according to Gartner, and enterprises will be forced to secure them. "The power of an Internet of Things device to change the state of environments in and of itself will cause chief information security officers to redefine the scope of their security efforts beyond present responsibilities," says Earl Perkins, research vice president at Gartner. CISOs will need to prioritize initial IoT implementations by tactical risk and secure IoT using a blend of approaches combining mobile and cloud architectures along with industrial control, automation, and physical security, Perkins says.

Demand For External Storage Weakens, Slowing Market

The storage market has been relatively strong for years as enterprises sought ways to keep pace with increasing amounts of data. But that growth is showing signs of slowing, according to new information from IDC. Worldwide external disk storage systems factory revenues were down 1.4% between Q2 2013 and Q2 2014. The total market for disk storage systems (both internal and external) showed just 0.3% year-over-year growth, according to IDC. Although capacity shipped reached 11.5 exabytes during the second quarter of this year for year-over-year growth of 23.9%, IDC reports that growth is low by historic comparison.

Semiconductor Sales Reach All-Time High, Growth Continues

Semiconductor sales may go through the roof by year's end, new research from Gartner suggests. Jon Erensen, research director with Gartner, says that semiconductor revenue set an all-time record in the third quarter of 2014, fueled by a strong electronics build for the holiday season. By the end of 2014, worldwide semiconductor revenues are expected to peak at $338 billion, a 7.2% boost over 2013. Erensen says to prepare for a flood of new product introductions for the holidays, including tablets, ultramobiles, and smartphones. Gartner projects that the semiconductor market will see 5.8% growth in 2015 as well.

CyberTrend / November 2014

Increased Focus On Dedicated Network Security Groups

In the past, when enterprises wanted to buy network security equipment, a security team would define what was required and a networking team would purchase and operate the equipment, according to a recent blog post by Jon Oltsik, senior principal analyst at ESG. But that picture is increasingly changing. New research from ESG shows that 47% of enterprises have a dedicated group in charge of all aspects of network security. Of the remaining organizations, 26% say network security is a cooperative arrangement between networking and security teams right now but that they're in the process of creating a dedicated network security group, Oltsik reports. The ESG data suggests that network security is moving away from the gear that transports bits and closer to the technologies that protect the bits.

How Often Do You Unplug?

That's the gist of the question CivicScience posed in a survey of 8,718 consumers earlier this year: "How often do you unplug from all personal technology (mobile phones, tablets/computers, e-readers, TV, audio players, etc.)?" Of those responding, 64% use a smartphone, 43% own a tablet computer, 28% own an e-reader, and 47% watch some sort of second screen while they're also watching television. As for the question at hand, here's how consumers replied:

[Pie chart: response options Daily, Never, A Few Times Per Week, Once Per Week, Once Per Month, and A Few Times Per Year; reported shares of 43%, 20%, 17%, 10%, 6%, and 4%]

File Sync & Sharing Market Growing Fast

A new study from IDC indicates the FSS (file synchronization and sharing) market will undergo 23.1% growth to become a $2.3 billion market by 2018. In 2013 alone, the market grew 114% to $805 million, according to the research firm. "File sync and share is being approached from a number of different market polarities to meet a wide variety of business challenges," says Vanessa Thompson, research director with IDC. The research firm cites consumer adoption of these services, and the fact that many consumer versions are free, as major drivers for the FSS market's growth.

U.S. Internet Users Opting For BYOID

Social authentication, or what's known as BYOID (bring your own identity), lets users sign into sites or apps using their social identity information. This is a practice with which more consumers are growing comfortable. A July survey from OnePoll and Gigya says 77% of domestic Internet users between the ages of 18 and 55 have used their social identity to log in to either a website or app. Other reasons Internet users sign in with social logins include avoiding registration forms; not creating more usernames and passwords; sharing articles with social network friends; and better protecting personal data.

Personal 3D Printing Set For Booming Growth

With patents expiring for 3D printing, we'll soon see substantial growth in the personal 3D printing market, says Chris Connery, an NPD Group analyst. Connery says that the goal is for 3D printing technology to move from industry to businesses and eventually to homes. This shift takes time, he says, but recent numbers from NPD Group show that sales of personal 3D printers increased 250% between the first and second quarters of this year. Although the general consensus is that in-home use of 3D printers hasn't yet caught on, Connery says, there was a seasonal spike in December 2013.


STARTUPS

UK Startup Offers Relocation Via Cloud

Why hire a third-party service to relocate employees when you could use a simple cloud-based service instead? That's the question UK-based startup Move Guides hopes a growing number of human resources departments will ask, as Move Guides offers that precise alternative. "Talent mobility is a huge pain point for HR departments," says Brynne Herbert, founder of Move Guides, "and our market-leading Talent Mobility Cloud is their answer." Move Guides recently closed an $8.2 million Series A funding round led by New Enterprise Associates. With this additional funding, Move Guides will continue to expand. It currently has offices in Hong Kong, London, New York, and San Francisco.

SolidFire Gets $82M For Flash

SolidFire, a company based in Boulder, Colo., that builds all-flash storage systems for next-generation data centers, recently closed an $82 million Series D funding round led by Greenspring Associates. This brings the total funding four-year-old SolidFire has received to $150 million. SolidFire also announced expansion of its SF Series flash storage product line, which, according to SolidFire, includes guaranteed Quality of Service, complete system automation, and scale-out storage design. About the influx of funds, Dave Wright, founder and CEO, says, "Additional funding allows us to continue to extend SolidFire's technical advantages over the competition and will deepen our sales, marketing, and channel enablement to meet the growing global demand for SolidFire's leading all-flash storage architecture."

[Photo caption: Flash storage offers excellent capacity and speed, and with costs coming down flash is providing better return on investment. SolidFire's new SF4805 flash storage node is designed to capitalize on the increasing business for automated and scalable flash storage.]

Intuitive Analytics, No Coding Required

For companies that have limited programming resources but are interested in analytics tools, there are the services of Alteryx. The Irvine, Calif.-based startup offers a solution that combines information from a variety of sources (internal data, third-party data, and cloud-based data) into dashboards for sales, marketing, and other business departments. The company recently completed a $60 million investment round led by Insight Venture Partners. According to Alteryx, part of the funding will go toward increasing go-to-market capacity and further expanding channel partners.

More Productive Email Conversations

There is a highly valuable simplicity in the straightforward back-and-forth flow of email conversations, but most in the business world know how conversations and the use (and misuse) of CC and BCC can throw productivity out the window. Front, a startup based in France, offers a solution that enables better control over email flows, including means to deal with mass emails, ways to collaborate in line with email messages, and features to track correspondence, at least for organizations using Gmail. Front recently closed a $3.1 million seed round of funding.

GoodData Gets $25.7M For Cloud Analytics

Intel recently led a $25.7 million equity financing round for GoodData, a San Francisco-based startup devoted to providing end-to-end cloud-based analytics capabilities to companies of all sizes, particularly enterprises. "The business intelligence market has reached a new point of maturity, as organizations increasingly analyze data at scale in the cloud," says Roman Stanek, CEO of GoodData. "We have always recognized a need in the market for a complete, cloud-based big data platform," he adds, attributing the new funding to recognition of GoodData's value in this market.


Security In The Cloud
THE PROMISE OF SECURITY AS A SERVICE

KEY POINTS
Among SecaaS offerings, organizations have most widely adopted email and Web protection services, but offerings run the gamut.
Reducing internal security infrastructure and administration burdens while acquiring cutting-edge security are top reasons organizations adopt SecaaS.
SecaaS can translate into cost savings for organizations, but that isn't guaranteed and shouldn't be the driving factor for adoption.
A potential drawback to using SecaaS offerings for some organizations can be a provider's inability to meet various customization requirements for security that an organization may have.


WHAT TYPE OF ORGANIZATION is a good candidate for using cloud-based security services today? Virtually any, says Kevin Fielder, co-chair of the Cloud Security Alliance Working Group. Further, nearly every aspect of security is available from SecaaS (security as a service) providers now, and the benefits of using them can be numerous and significant.
Research firm Gartner predicts that by 2015 the cloud-based security service market will reach $3.1 billion, up from $2.1 billion in 2013. Globally, companies large and small are using the cloud in general to access services of all types, including security, says Jonas Hellgren, member of CSA's SecaaS Working Group. "The model has evolved to a point where any service you can think of can be delivered via the cloud, and adoption rates back that up. Any security service traditionally delivered on-premises can now be delivered via the cloud," he says.

For organizations considering using cloud-based security, the following details the services and features available, the positives of using them, and considerations to weigh before signing on.

What's Available
To date, email and Web security services that help weed out bad URLs and Web content, provide secure (typically encrypted) email, reduce spam, perform virus scans, and combat phishing attacks, among other capabilities, have seen the greatest adoption among organizations, says Lawrence Pingree, Gartner research director.
SecaaS offerings, however, pretty much cover the entire security spectrum. Service types available include IAM (identity and access management); website protection (anti-fraud, anti-DoS [denial of service], anti-DDoS [distributed DoS]); application security testing; vulnerability scanning and assessment; penetration testing; security intelligence engines; tokenization/encryption; and SIEM (security information and event management; log collection, correlation, and alerting). Often, Fielder says, organizations use SIEM in conjunction with a broader cloud-based MSSP-SOC (managed security services provider security operations center). More advanced SecaaS offerings include Web application firewalls, Fielder says.
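The log collection, correlation, and alerting that a SIEM service performs can be illustrated with a toy correlation rule (a hypothetical sketch; the rule, threshold, window, and event format are illustrative, not any vendor's API):

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical mini-correlation rule: alert when one source IP produces
# several failed logins within a short window, the kind of correlation a
# cloud-based SIEM service runs continuously and at much larger scale.
WINDOW = timedelta(minutes=5)
THRESHOLD = 3

def correlate_failed_logins(events):
    """events: iterable of (timestamp, source_ip, outcome) tuples."""
    recent = defaultdict(list)  # source_ip -> timestamps of recent failures
    alerts = []
    for ts, ip, outcome in sorted(events):
        if outcome != "FAIL":
            continue
        # Keep only failures inside the sliding window, then add this one.
        recent[ip] = [t for t in recent[ip] if ts - t <= WINDOW] + [ts]
        if len(recent[ip]) >= THRESHOLD:
            alerts.append((ip, ts))
    return alerts

t0 = datetime(2014, 11, 1, 12, 0)
events = [(t0 + timedelta(minutes=i), "10.0.0.7", "FAIL") for i in range(3)]
events.append((t0, "10.0.0.8", "OK"))
print(correlate_failed_logins(events))  # one alert for 10.0.0.7
```

The point of outsourcing this to a SecaaS provider is that the provider maintains thousands of such rules, tunes the thresholds, and staffs the 24/7 alert triage that follows.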
Bob Tarzey, Quocirca director and analyst, says recent research his firm compiled based on European figures indicated that among organizations surveyed, advanced threat intelligence was a top service (about 40%). "This covers a broad sweep, from vulnerability scanning and intelligence to signatures for known bad stuff (files, links, etc.)," he says. "To some extent, this is an extension of old-style virus definitions that were some of the first things to be provided as cloud updates."
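The signature approach Tarzey describes can be sketched as a simple digest lookup (a hypothetical illustration; the feed entry and payloads are made up, and real threat-intelligence feeds carry far richer indicators than bare hashes):

```python
import hashlib

# Hypothetical threat-intelligence feed: SHA-256 digests of known-bad files.
KNOWN_BAD = {
    # assumed feed entry: digest of the sample payload used below
    hashlib.sha256(b"malicious-sample-payload").hexdigest(),
}

def is_known_bad(data: bytes) -> bool:
    """Check a file's digest against the signature feed."""
    return hashlib.sha256(data).hexdigest() in KNOWN_BAD

print(is_known_bad(b"malicious-sample-payload"))  # True
print(is_known_bad(b"harmless file contents"))    # False
```

Delivering the feed from the cloud is what makes this an "extension of old-style virus definitions": the provider updates `KNOWN_BAD` continuously, and every customer benefits immediately.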
Another highly used service was emergency DDoS protection (also about 40%), Quocirca found. Here, organizations with a DDoS problem divert network traffic to a service provider that cleans up the traffic. Elsewhere, Quocirca's research found SIEM (about 35%) and DNS (domain name system) protection (about 35%) to be often-used services.
Pingree says that for any service provided externally, ease of use and customer service are the most common adoption criteria for organizations and crucial aspects overall. When the organization doesn't own a security component, he says, it relies on the external provider to be focused on ensuring stability and availability of the service the customer is using.

The Benefits
Among the more notable enticements for organizations to adopt SecaaS are the ability to bolster current security measures, better positioning in terms of compliance (government and industry regulations, industry standards and best practices, audits, etc.), better reliability, and potential cost savings. Some companies also look to SecaaS to help avoid the negative publicity and business disruptions that can stem from security incidents.
As with other cloud computing services, adopting SecaaS offerings enables organizations to reduce their on-premises technology infrastructures, streamline processes, and achieve economies of scale previously not possible. Specifically, Fielder cites speed of implementation, scale and elasticity of a service, the depth of expertise available from a security provider vs. what's typically available internally to most companies, the flexible and global nature of a service, and reliability and resilience as primary SecaaS benefits.
For Pingree, the greatest benefit of adopting SecaaS is its ability to reduce administrative burdens. "Most larger enterprises want their IT security teams focused on reducing threats and risks, as well as the incident-response process," he says. "The biggest weakness most organizations have is that they don't properly resource their incident-response processes." So redirecting resources is essential, a factor that SecaaS offerings help deliver.
In short, some organizations couldn't or wouldn't deploy security architectures and functionality without the benefit of SecaaS, Fielder says. Any company, however, can deploy leading-edge security solutions via the cloud and take advantage of providers' expertise with SecaaS, he says. Additionally, SecaaS offerings provide the ability to easily integrate multiple sites into the service regardless of location, as well as the flexibility to rapidly scale a service up and down to pay only for those services the organization actually consumes. DDoS protection is a good example of the latter, Tarzey says. "Why pay if you aren't under attack?" he says.
Yet other SecaaS benefits include obtaining security that's more up-to-date and part of a large-scale security operation. The latter means organizations benefit from protection the provider has developed based on issues other customers have experienced. Broadly, Tarzey says, SecaaS gives organizations the freedom to focus on delivering value to their business and leave the security to experts.
From a budgetary standpoint, SecaaS offerings are available in a wide array of service contract options, including monthly, annual, and biennial contracts that make budgeting easier, Pingree says. While there's a notion among some that using cloud-based services automatically translates into cost savings, this isn't necessarily true. Pingree says over the long haul, using SecaaS offerings may cost more than covering security internally. The tradeoff, however, is that SecaaS offerings are much easier to adopt and digest and can lead to other improvements from a security perspective, he says. For example, cross-customer intelligence sharing and threat-detection analytics can be enhanced, Pingree says.
Similarly, Fielder says that although cost savings may be possible in terms of not having to purchase infrastructure and paying only for what's used vs. permanently paying for capacity to cope with peak requirements, the impetus for using SecaaS shouldn't be to save money. Doing so is entirely the wrong viewpoint, he says. "Instead, look at the increase in quality, depth of knowledge provided, the likely availability of more features, and business benefits. If there are cost savings, they should be seen as an added benefit."

The Drawbacks
Although the list of benefits associated with using SecaaS is long and attractive, there are potential risks and drawbacks to consider before adopting such services, including those related to a provider's reliability, policies and control, quality and quantity of customer support, and ability to protect data moving between the provider and the organization. Additionally, while many providers offer geographic hosting, organizations with requirements about where data is hosted should verify the provider can meet them.
For some organizations, the gap between the level of customization required and the level of customization the provider can deliver can be a drawback. Some providers, for example, focus on serving the needs of many by delivering common capabilities as opposed to providing highly customized services. "Some services may be more commoditized in order to create economies of scale and provide a very standardized service, which is likely to be very good but may not be capable of much tailoring to meet specific needs or working practices," Fielder says. Thus, organizations may need to flex slightly to meet the provider's way of working, he says.
Elsewhere, because SecaaS providers store, access, and administer key security technologies, they are targets for hackers and a centralized risk across a multitude of customers, Pingree says. Tarzey, meanwhile, says vendor viability is the main issue for organizations, though this is true of any purchase. Although it's less of an issue with security providers, another concern is what happens if a SecaaS provider goes bust and how the organization will access its data and transfer services afterward, Tarzey says.
In general, Fielder recommends organizations give consideration to the contractual and SLA side of using SecaaS offerings. Moving to any cloud-based service means less infrastructure and direct support costs, but it also means a very strong reliance on contractual and SLA components to ensure the service meets requirements, he says. He also recommends paying attention to integration, and notes that the service will likely rely on Internet connectivity, meaning the possibility of added costs related to increased bandwidth and ensuring the resilience of links.

Viable Candidates
In terms of using SecaaS and cloud-based services in general, Tarzey says no organization in the 21st century should not be looking at on-demand service as an alternative to on-premises deployment. Increasingly, he says, old-style IT and vendors that fail to adapt are dying.

In terms of SecaaS adoption specifically, Pingree says that, surprisingly, it's the large and low-end large enterprise segments where Gartner sees the most adoption, despite the fact that the mid-market is the primary target for SecaaS. In general, government, health care, and banking verticals spend the most on security, he says.
Although large organizations are most likely to have extensive security services internally, especially if they fall within government and heavily regulated environments, even these organizations can use cloud services to fill gaps in their expertise or technical stack, Fielder says. Large organizations may also look to cloud-based security offerings when they have particular needs globally and deploying security themselves would be too complex or cost prohibitive.
Small and midsize enterprises often appreciate how using SecaaS reduces the need to maintain large and expensive security teams and infrastructure. That said, Fielder still recommends that organizations maintain some internal security expertise where possible, even if outsourcing most services.
Also enticing to small and midsize organizations is the ability to leverage cutting-edge services and expertise and deploy solutions that might otherwise not be possible. For example, installing and maintaining a SIEM internally and running a security operations center is complex and expensive, Fielder says, to the point where these capabilities are often completely overlooked. Taking advantage of a cloud-based SIEM and SOC, however, lets small and midsize enterprises benefit from 24/7 log correlation and alert features that are backed by a significant degree of security expertise.

Risk Management ROI
HOW A SOLID STRATEGY FOR MANAGING DIGITAL RISKS PAYS OFF

AS ENTERPRISES INCREASINGLY look to digital assets to fuel growth, managing digital risks is also becoming important. Gartner recently projected that by 2020, 60% of digital businesses will experience a major service failure because of IT security teams' inability to manage digital risk in new technology and use cases. Interdependencies among IT, operational technology, the Internet of Things, and physical security technologies will necessitate risk-based approaches to governance and management, Gartner states.
Lacking a solid DRM (digital risk management) strategy, a business can truly go under, and the speed at which it goes under depends on the business type, says Bob Tarzey, Quocirca director and analyst. A bank that neglects compliance, for example, won't last long. "Smaller businesses may get away with it, but even they need to be aware of risk and mitigate," he says. Conversely, possessing a solid strategy can offer peace of mind. "For compliance risk, for example, mitigation costs are offset by potential fines, but all risk mitigation is offset by hard figures relating to lost business and soft measures such as reputational damage," he says.
While developing an effective DRM strategy is important, so is knowing how to go about it and what mistakes to avoid. This article explores those topics.

DRM Exposed
Sean Pike, program director with IDC,
says the emergence of the term digital
risk management is really a recognition
of business transformation that has taken
place over the last 20-plus years, during
which time corporations have shifted to
large-scale digital architectures with complex IT systems requiring specific and
unique risk management processes.
Still, Pike is wary of the term. While
readily acknowledging corporations have
transformed on expanding technology platforms, he says, the idea that digital risk
management is a new discipline discounts

any transformation that has occurred within


the corporate risk office.
Overall, Pike says, managing digital risk is vital to any organization. While most organizations are likely already doing it in some way, he says, it's important to recognize that risk management strategies must evolve as quickly as technology changes. Some changes, such as adopting third-platform technologies, can significantly modify how an organization should view risk, he says. Putting data in a cloud architecture, for example, might reduce the risk of data loss but also create risk related to cross-border data flow.
For Tarzey, DRM encompasses many aspects, including ones related to data and to performance and systems. Data-related aspects include the risk of exposure of regulated data (and the consequences of such) and the risk of theft of IP (and the competitive impact of that). Performance/systems-related aspects include the impact of poor systems performance on a business's ability to function and the inability to get business processes running again after a disaster. Mobile devices, using or not using the cloud, hacking, insider threats, and the like introduce numerous other risks, although these nearly always link back to data or performance/systems DRM aspects, Tarzey says.

CyberTrend / November 2014
Michela Menting, ABI Research practice director, says intrinsic to DRM are the concept of information security, an understanding of what information needs to be protected, and the management of the risks of not properly protecting information and systems. "The key is to be able to protect information at rest and while in transit anywhere, as it flows through the corporate network and through third-party service and application providers," she says. "Information security is therefore a non-negligible aspect of digital risk management, as it involves the formulation of policies as much as the configuration of hardware and software, such as patch management, network scans, logging and monitoring activity, etc."
Menting says that within the DRM context, enterprises must consider such things as data loss prevention, encryption and key management, and identity and access management, because each enables assessing and managing risks appropriately. "Information governance is the sort of overarching field of application for risk management," she says. "It's as much about management as it is about assessing risks and determining the proper level of security that will not too adversely impact business productivity."

Developing A DRM Strategy

When formulating a DRM strategy, Pike says, odds are that digital risks are already being tracked and accounted for somewhere within the organization, possibly in a formal risk management office or within an IT or business continuity and contingency function. The challenge is coordinating efforts to gain better visibility and reduce overlap. Pike suggests organizations start by gaining a clear understanding of their business functions, individual processes, and the technology elements that drive those functions and processes. Just understanding where data eventually lands for long-term storage isn't enough. "Unforeseen risk is scattered throughout business processes, including data acquisition/creation, short-term storage, and data processing," he says.
After identifying employees already tracking digital risk, Pike recommends creating a working committee, appointing a strong leader, and initially mapping one business process end-to-end. During the process, seek business leaders' input regarding risk factors and in-place risk mitigation strategies. "Involving risk management committees at the beginning of new business processes and corporate initiatives will help transform the organization's culture from reactive to risk-assessing," he says.

Something to watch for is the dedication of the individuals involved in tracking processes and quantifying risk. Too often, teams fail to tackle the more difficult and mundane challenges, Pike says. "If risk officers don't understand a technology or can't get support from a given IT group or function, it's very easy to find other work," he says. Thus, difficult tasks may never get addressed.
Elsewhere, an organization's size, regulatory requirements, methods of operation, budget, and available resources will directly influence how risk management is implemented. Enterprises should implement policies and management, prevention, pre-emptive, and response strategies, but balance them against cost and requirements, Menting says.

Menting says that an organization's most significant considerations arguably include information risk and security processes; account and access management processes; protection against attacks both from within and without; early warning and watch notifications; incident response and disaster recovery plans; compliance information management; secure configuration for all ICT (information and communications technology) devices; monitoring and testing of security controls; internal and external incident reporting; activity monitoring on ICT systems and networks; incident forensic capability; mobile and remote working policy; records and retention management; controlled access to and analysis of activity, network, and audit logs; data loss and leak prevention; business continuity, backup, and disaster recovery; electronic discovery; and information life cycle management.
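Considerations like these are commonly tracked in a risk register scored by likelihood and impact. The sketch below is a generic illustration of that practice; the field names, scoring scale, and threshold are assumptions, not a reference to any specific product or standard:

```python
from dataclasses import dataclass

@dataclass
class RiskItem:
    """One entry in a simple risk register (all fields hypothetical)."""
    area: str          # e.g. "data loss prevention", "incident response"
    likelihood: int    # 1 (rare) .. 5 (almost certain)
    impact: int        # 1 (negligible) .. 5 (severe)
    owner: str
    mitigation: str

    @property
    def score(self) -> int:
        # Classic likelihood x impact scoring
        return self.likelihood * self.impact

def top_risks(register, threshold=12):
    """Return high-scoring items, worst first."""
    flagged = [r for r in register if r.score >= threshold]
    return sorted(flagged, key=lambda r: r.score, reverse=True)

register = [
    RiskItem("mobile & remote working", 4, 4, "IT ops", "MDM + device encryption"),
    RiskItem("records retention", 2, 3, "legal", "retention schedule"),
    RiskItem("incident response", 3, 5, "CISO", "tested IR plan"),
]
```

A register like this gives the working committee Pike describes a single place to see which areas demand attention first.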

The Payback

Pike says there's plenty to gain from possessing a strong, effective DRM strategy. Arguably the biggest benefit is transformation of the organization's culture. Organizations with a risk-assessing culture tend to be thoughtful about how new lines of business or technologies will affect the rest of the organization, he says. That level of thoughtfulness and cooperation "can reduce spending, reduce duplication of effort, and create collegial environments that encourage innovation," he says.

Similarly, Menting says the advantage of having a well-planned IT structure is that it's conducive to enhancing business models and modes of operation. The current progression and expansion of cybercrime means that most organizations with a digital presence will eventually have to deal with an incident, she says. Legitimate organizations have the most to lose. They'll need to deal not only with cyber threats but also with the consequences of deficient security. Liability can be avoided, however, with a well-implemented information governance framework.

Firewalls Today
WHAT YOU NEED TO KNOW

ONE OF THE BIGGEST problems we face when it comes to securing networks is the simple fact that the PCs, laptops, and mobile devices that help make up the network are kind of naïve. When connected, after little more than a handshake, computers readily share information and communicate freely with one another.

This is good, vital even. But most other devices and networks cannot be trusted, and computers are not built to treat any and all incoming transmissions like the wretched hive of scum and villainy that they potentially are. Instead of throwing up roadblocks to the computers' ability to exchange information, however, we can use software and purpose-built devices that act as gatekeepers: this is the firewall's job.

In this article we'll take a closer look at the firewall's purpose, the types available, and how firewall technology is moving forward to tackle modern threats.

A Little Background

The earliest firewalls appeared in 1988 and were referred to as packet filtering firewalls. According to Eric Maiwald, managing vice president at Gartner, these firewalls only took a peek at the packet headers of common protocols (IP and TCP/UDP) to determine if the data was legitimate or adhered to the firewall's rules.

Modern firewalls still work the same way, and a firewall administrator can enact custom rules to filter out any traffic deemed unnecessary. Most firewalls also include a default rule set that requires little if any intervention to work properly.

Firewalls also come in hardware and software forms; routers stood in as the first firewalls. Maiwald explains that hardware firewalls are typically standalone devices that include both the firewall software and the hardware it runs on. Hardware firewalls are often called appliances and are physically connected to the network by network cables. He describes a software firewall as one that runs inside a virtual machine, which can be part of an enterprise's internal virtual environment or running in a public cloud.
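The header-based filtering described above can be pictured as an ordered rule list evaluated top to bottom, with a catch-all default at the end. This is a simplified sketch of the idea, not any vendor's actual rule syntax:

```python
# A packet-filtering rule set as an ordered list of (match, action)
# pairs evaluated top to bottom; the final empty match is the default.
RULES = [
    ({"proto": "tcp", "dst_port": 22},  "deny"),   # block inbound SSH
    ({"proto": "tcp", "dst_port": 443}, "allow"),  # allow HTTPS
    ({"proto": "udp", "dst_port": 53},  "allow"),  # allow DNS
    ({}, "deny"),                                  # default deny
]

def filter_packet(packet, rules=RULES):
    """Return the action of the first rule whose fields all match."""
    for match, action in rules:
        if all(packet.get(k) == v for k, v in match.items()):
            return action
    return "deny"  # unreachable with a catch-all rule; kept for safety
```

The default-deny rule at the bottom reflects the article's point: incoming traffic is treated as untrusted unless a rule explicitly permits it.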

Stateful vs. Stateless

Generally speaking, there are two types of packet filtering firewalls: stateful and stateless. The former takes the session context into account when judging whether a given packet should be allowed or denied entry to the network or device. These firewalls tend to require more memory and can add some latency to the network. The context a stateful firewall can examine includes UDP or TCP ports, source IP address, destination IP address, session initiation timestamp, handshake timestamp, data transfer timestamp, and transfer completion timestamp. As you can imagine, a stateful firewall is better suited to networking environments that require more careful management.

Stateless firewalls are simpler, consume less system memory, and can allow or deny incoming packets more quickly. Stateless firewalls also make a lot of sense when processing traffic running over a stateless network protocol. The drawback is that stateless firewalls are less customizable.
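The distinction can be sketched in a few lines: a stateless check judges each packet in isolation, while a stateful firewall consults a session table keyed by the addresses and ports mentioned above and tracks timestamps. The timeout value and the handshake-flag check below are illustrative assumptions:

```python
import time

def stateless_allow(packet):
    """Stateless: judge each packet purely on its own headers."""
    return packet["dst_port"] in (80, 443)

class StatefulFirewall:
    """Stateful: remember established sessions and when they were last seen."""

    def __init__(self, timeout=300):
        self.sessions = {}   # (src, dst, src_port, dst_port) -> last-seen time
        self.timeout = timeout

    def _key(self, p):
        return (p["src"], p["dst"], p["src_port"], p["dst_port"])

    def allow(self, packet, now=None):
        now = time.time() if now is None else now
        key = self._key(packet)
        if key in self.sessions and now - self.sessions[key] < self.timeout:
            self.sessions[key] = now      # refresh a live session
            return True
        if packet.get("syn"):             # a new session must start with a handshake
            self.sessions[key] = now
            return True
        return False                      # mid-stream packet with no known session
```

The session table is exactly why stateful firewalls consume more memory: every live connection costs an entry, in exchange for the ability to reject packets that belong to no established conversation.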
But, says Maiwald, as firewalls matured, policy enforcement became able to take packet contents into consideration as well. This is the key capability of application layer firewalls, and more and more organizations are turning to them to secure their networks.

Application Layer Firewalls

An application layer firewall is capable of inspecting what's called the application level of the TCP/IP stack, which includes browser traffic, Telnet transmissions, FTP activity, and other traffic generated by applications. One of the primary benefits here is the ability to block malware at the source, but there is a cost in added latency.

Application firewalls are most commonly used together with a packet filtering firewall. One of the biggest drawbacks of application layer firewalls is that the most complex (and most capable) filtering rules apply to only a handful of application associations, leaving the majority of those associations to pass through the more basic filter. Application layer firewalls also have no way of detecting corrupt applications that have been altered, for instance using memory-based exploits and attacks.
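To see how payload inspection differs from header filtering, here is a toy illustration: one check screens FTP commands, the other scans an HTTP body for a byte signature. The blocked commands and the signature are made up for the example; real application layer firewalls use far richer protocol decoding:

```python
# Toy application-layer checks: unlike a packet filter, these read the
# payload itself. The rules and signatures here are illustrative only.
BLOCKED_FTP_COMMANDS = {"DELE", "SITE EXEC"}
MALWARE_SIGNATURES = [b"\x4d\x5a\x90\x00"]  # e.g. an executable header in a download

def inspect_ftp(command_line: str) -> bool:
    """Return True if the FTP command may pass."""
    verb = command_line.strip().upper()
    return not any(verb.startswith(bad) for bad in BLOCKED_FTP_COMMANDS)

def inspect_http_body(body: bytes) -> bool:
    """Return True if no known signature appears in the HTTP payload."""
    return not any(sig in body for sig in MALWARE_SIGNATURES)
```

Inspecting every payload like this is also where the added latency comes from: each byte of traffic must be read, not just the headers.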

The Rise Of The Next-Generation Firewall

To address security vulnerabilities such as these, the NGFW (next-generation firewall) emerged, using numerous techniques to accomplish the firewall's ultimate goal of blocking bad traffic and giving the green light to everything else. As Maiwald describes, "An NGFW is defined as a firewall that includes firewall, IPS (Intrusion Prevention System), URL filtering, and application control features." According to Maiwald, the last of these features, the ability to create specific and complex rules regarding what is and isn't allowed for websites and applications on the network, is the most compelling to organizations large and small. These capabilities make it easy to filter out malicious software regardless of origin or exploit.
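Application control rules of the kind Maiwald describes can be pictured as a policy keyed by application and even by feature within an application, layered on top of URL category filtering. The applications, categories, and default-allow behavior below are illustrative assumptions, not any NGFW product's configuration:

```python
# Hypothetical NGFW-style policy: traffic is identified by application
# (not just port), then per-app and URL-category rules are applied.
APP_RULES = {
    ("social_media", "post"): "deny",   # viewing allowed, posting blocked
    ("social_media", "view"): "allow",
    ("file_sharing", None): "deny",     # whole application blocked
}
BLOCKED_URL_CATEGORIES = {"gambling", "malware"}

def evaluate(app, feature=None, url_category=None):
    """Apply URL filtering first, then application control; default allow."""
    if url_category in BLOCKED_URL_CATEGORIES:
        return "deny"
    # A feature-specific rule wins over an application-wide rule.
    for key in ((app, feature), (app, None)):
        if key in APP_RULES:
            return APP_RULES[key]
    return "allow"
```

Rules at this granularity (allow an application but deny one feature of it) are precisely what a port-based packet filter cannot express.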

A Firewall For One & All

There's no such thing as an organization too small to be a victim of cybercrime. As such, a firewall is necessary no matter how many people are on the network. But as we've already shown, not all firewalls are created equal. Maiwald suggests that there are three general use cases for firewalls.

One popular option he describes is to use a firewall to control traffic that can reach Internet-facing systems, usually as part of the enterprise's DMZ (demilitarized zone, or perimeter network). It might also be used to provide an intrusion prevention function in cases of low bandwidth utilization; for higher bandwidth utilization, a separate IPS device is used. Note that the security controls in this use case may also include separate load balancing, SSL decryption, and Web application firewall functions.


Firewalls have also proven effective when used on network-connected client systems. In this case, Maiwald says, the firewall is deployed to control traffic that can reach internal systems and also to control how the client systems use the Internet. An IPS device may be used, or the IPS features of the firewall may be used, depending on bandwidth requirements. The firewall may provide application control and URL filtering to enforce policy on how users access the Internet.

For organizations with a small number of users, or for small branch offices, Maiwald suggests using a UTM (unified threat management) device. "A UTM is a specific category of firewall targeted at smaller enterprises," he says. "It includes the firewall, IPS, URL filtering, and [antivirus] functionality."

Data centers and organizations with high bandwidth requirements should investigate deploying a firewall that controls network traffic as it passes between internal network security zones. "The primary purpose is to provide separation between the systems in the zones," says Maiwald. Today, this use case is often fulfilled with a software firewall deployed within the virtual environment.

Time To Get Serious About Security

There are a multitude of security technologies and services currently available that claim to prevent breaches, infections, and data loss, but not all of them are compulsory. Firewalls, however, do not fall into the optional category, and not all firewalls are the same. Get them, make them work for you, and enjoy some peace of mind.

CA, From Mainframes To Mobility


DESPITE SHIFTS IN MARKETS, CA TECHNOLOGIES REMAINS DEDICATED TO THE CUSTOMER

IT'S COMMON FOR companies to evolve over time as the demands of the market change, and CA Technologies (www.ca.com) is an example of one company that made a market shift deep into its timeline. Originally named Computer Associates, the company has gone through a few name changes as it expanded its market, supporting IT management and security for mainframe and distributed platforms as well as cloud and mobile solutions.

In order to fully understand where CA Technologies is positioned today, it's useful to look at the company's past, as its current products have built upon the company's years of experience. With an ongoing focus on a B2B (business-to-business) technology model, CA Technologies continues to make customer service its first priority, and to examine the experiences of its customers as a basis for adding new features and functionality that benefit those customers.

A Bit Of History

Established in 1976, CA made a name for itself almost immediately by introducing its CA-SORT product for mainframe systems, bringing in $5 million in revenue by the following year. Using this impressive momentum, CA continued to add more and more products over the next decade, including Dynam/T, an enterprise storage management solution, in 1978, and the first iteration of its eTrust enterprise security management product line in 1983.

In 1989, CA Technologies introduced CA90s, a multiplatform development solution for enterprises. With such a strong portfolio, CA became the first software company ever to hit $1 billion in revenue, a milestone it reached in 1989, a little more than a decade after the company was founded.
Throughout the '90s, CA continued its innovation streak. In 1992 and 1993, the company introduced a security and storage management solution with scheduling for the Unix platform called Unicenter for Unix. This product would continue to evolve over the next few years and become a cross-platform solution that worked with a wide variety of operating systems and Web environments. In 1995, CA Technologies expanded into Latin America, Africa, and the Far East, and introduced a wide range of products for Windows NT. And in 1996, CA acquired Cheyenne Software in order to offer more enterprise storage options.
It was in the early to mid-2000s that CA embarked on its most significant changes, starting in 2001 when the company introduced its FlexSelect licensing program, a subscription model for its software that helped customers take advantage of new technologies with less of an upfront investment. In the early 2000s, CA also created brand units to make it easier for customers to find the products that best matched their needs, started making its expertise more available to customers, and put strategies in place to help customers run their IT operations more efficiently.

CA then introduced the EITM (Enterprise IT Management) concept, releasing 26 versions of existing products and developing 85 new products following the EITM framework. And in 2010, the company changed its name to CA Technologies.

Building On Its Foundation


In 2007, CA Technologies officially created a Mainframe Business
unit, specifically designed to develop
and deliver products for customers
still taking advantage of mainframe
systems. CA also developed a large IT
solution portfolio designed to cover
nearly every aspect of IT operations,
including infrastructure management,
data center automation, security management, and IT governance.
Over the next few years, CA would make key acquisitions to further bolster its portfolio. The company acquired NetQoS in 2009 to better offer services for physical environments and help companies take advantage of virtualization and the cloud. In 2010, the company acquired Hyperformix, 4Base Technology, Nimsoft, and 3Tera, all of which helped the company advance more effectively in the cloud computing market.

Mainframe, Cloud & Mobile


Every product release and acquisition has led CA Technologies to where it is today. The company helps businesses offer better services to their customers. And when businesses decide to take advantage of CA products, they also get the added benefit of tapping into the company's nearly 40 years of experience in multiple areas, including mainframe, cloud, and mobile.

CA's dedication to advancing mainframe systems continues today. New technology releases are designed to help customers take advantage of the unmatched capabilities of the mainframe while enabling them to get the most out of their hybrid IT environments: mainframe, distributed, cloud, and mobile. CA also offers implementation services to make sure that companies can get the most out of their CA product deployments and make a smooth transition.

This same idea applies to the company's cloud offerings. Not only can CA offer companies cloud-based solutions, but it will also help its customers take advantage of the cloud in new ways. CA Technologies offers a wide variety of implementation services to help businesses get cloud environments up and running as well as integrate them with existing physical and virtual environments. Even if a company has never moved data or workloads to the cloud in the past, CA will make this process as painless and seamless as possible, so any company can start setting up cloud environments.

Perhaps the most recent addition to the CA Technologies portfolio is the company's line of mobile products. The star of the show here is CA's Enterprise Mobility Management Suite, which was recently released with day-one support for the new iOS. CA EMM includes CA's Mobile Device Management solution, which is designed to give companies more control over company-issued devices as well as make it easy to implement BYOD (bring your own device) policies and allow consumer-owned devices in the workplace. And as with all of the company's other products, CA also offers implementation services to make sure you get your MDM system up and running quickly.

[Photo caption: Earlier this year, CA Technologies opened a technology center in Santa Clara, Calif. Among the center's areas of focus are mobility, big data, digital payment technology, and security.]

A Well-Rounded Approach

CA Technologies isn't content to offer solutions for only one market or one specific use case. Instead of counting on you to buy solutions from multiple vendors, which can create fragmentation and cause major integration issues down the road, CA offers a wide variety of solutions to help its customers cover all bases. And perhaps the biggest benefit of using CA products is that regardless of whether you opt for a physical or virtual onsite deployment or go with the cloud, the solutions can integrate and you'll have CA's wealth of experience behind you every step of the way.


CA Technologies Mobile Services


GET MORE CONTROL OVER COMPANY-ISSUED & EMPLOYEE-OWNED DEVICES

MOBILE DEVICES CLEARLY have had a major role in the business world for many years now, but they also have a large presence in the personal lives of employees. Now that employees, as consumers, have gained dramatically more familiarity with advanced devices, there is increased interest in using personal devices for business purposes. This phenomenon gave rise to the concept of BYOD (bring your own device) programs, where employees could bring in their personal smartphones or tablets and start using them in the office or to work from home. But BYOD (also sometimes referred to as consumerization) also introduces security concerns for organizations that aren't used to allowing outside devices into the workplace.

CA Technologies (www.ca.com) offers MDM (mobile device management) solutions that not only give enterprises more control over company-issued devices, but also offer control over employee-owned devices that may start infiltrating the workplace. To help companies keep up with this trend and give consumers the freedom they want without compromising security, CA Technologies offers a combined solution that includes both an MDM product and knowledgeable support to help customers use MDM effectively.

CA Mobile Device Management


CA MDM is an extensive mobile management solution designed to give you more control not only over the mobile devices themselves, but also over the applications, messages, and other data stored on each device. You have complete control over what types of applications can be downloaded to a device as well as what company information that device has access to. CA MDM also includes a self-service portal, so employees can be more involved in the enrollment process. This helps minimize the impact on your IT team and adds automation so the team can focus on tasks that are more important to the business.

One of the biggest and most helpful features for companies that support BYOD is Smart Containerization. Basic containerization makes it possible for companies to essentially partition off parts of a mobile device and separate them from each other. You can then put business-only applications in one container while leaving the rest of the device open for the employee's personal use. CA's Smart Containerization differs from similar approaches in that instead of using one application as a container, providing remote access to business apps, or making the employee set up two separate user accounts on the device, you can make those decisions on a file-by-file basis.
For example, instead of putting every single application, business or otherwise, into a specific container, you could make it so that one specific application isn't allowed to gather location information as it

normally would. Or, as CA Technologies suggests, you could set it up so that a document can't be stored on a device's hard drive and can only be accessed from a company-approved computer or device.

Companies have quite a bit of control over CA MDM and can make sure every application or piece of data falls in line with the company's policy, but there are other cost-related benefits as well. Employees get the freedom to use their own devices while the company gets to maintain control over business-related parts of the device. And at the same time, the company doesn't need to go through the process of buying a device, adding a voice and data plan, and then deploying it. This could dramatically cut costs for companies that would otherwise have to pay for every individual device that their employees use for work.

[Photo caption: CA MDM is an extensive mobile management solution designed to give you more control not only over the mobile devices themselves, but also over the applications, messages, and other data stored on the device.]

CA MDM Implementation Services

CA MDM can be implemented as a physical onsite product, a virtualized appliance, or a cloud-based SaaS (software as a service) solution, but you won't be alone in the deployment process if you go with CA's Mobile Device Management Implementation Services. With these services, CA Technologies will help you set up your MDM solution in whatever primary environment you choose and will help you not only configure the basic features, but also dig a bit deeper into available tools.

For example, CA MDM has multiple security features designed to protect the company and the employee. You can set up data partitioning, which will separate personal and business information on each device and ensure the safety of potentially sensitive company data. You can put settings in place so that certain applications and features are turned on or off. And you can even make it so that if a device is lost or stolen, you can remotely lock it or wipe it to prevent company data from leaking. CA Technologies provides its expertise on how to set up these specific features and will help you make sure all of your bases are covered.

In addition to security and containerization, CA's MDM Implementation Services can also help you better manage the device throughout its life cycle. The aforementioned self-service portal included in CA MDM gives employees access to certain features and also gives them the ability to submit tickets if they want certain functionality added to their devices. And for companies that still choose to issue their own devices, there are performance monitoring tools to help make sure the mobile devices are still working properly, and a refresh plan can be put in place if necessary.

CA MDM Limited Implementation Services

In addition to the base features included in the CA MDM Implementation Services tier, CA also includes some additional help with its Limited Implementation Services. For example, rather than simply helping you through the implementation process using knowledge and experience, CA will send experts out for onsite implementation and configuration to make sure the system is deployed to your organization's exact specifications. And as long as you have the necessary support infrastructure in place, which CA outlines ahead of time, the process will take about a week. This illustrates CA's dedication to ensuring successful implementations.

CA Technologies also goes beyond its core implementation service offerings to include installation and configuration of one CA MDM instance in a sandbox or non-production environment. CA will also install CA MDM on one server and fully configure five devices or device types. For companies that aren't as familiar with MDM technology or just want to have solid working examples of how to deploy the technology, CA's Limited Implementation Services offer a great start. Instead of just deploying the solution into your production environment and learning along the way, you have access to a test environment and devices with CA MDM already installed to get your footing before opting for a company-wide implementation.
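The file-by-file Smart Containerization decisions described earlier (whether an app may gather location data, whether a document may be stored locally) can be pictured as a per-resource rule table. The sketch below is purely illustrative of that idea and is not CA MDM's actual interface or policy format:

```python
# Hypothetical per-resource container policy in the spirit of the
# file-by-file controls described in this article; not CA MDM's API.
POLICY = {
    ("crm_app", "location"): False,            # this app may not read location
    ("finance.xlsx", "local_storage"): False,  # document never stored on device
    ("finance.xlsx", "open"): "approved_devices_only",
}

def permitted(resource, action, device_approved=False):
    """Look up the rule for (resource, action); default is allow."""
    rule = POLICY.get((resource, action), True)
    if rule == "approved_devices_only":
        return device_approved
    return bool(rule)
```

Expressing policy per resource and per action, rather than per container, is what lets one app keep its personal-use features while losing only the specific capabilities the company restricts.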


CA Technologies Mainframe Services


THE TOOLS YOU NEED TO ASSESS, IMPLEMENT & OPTIMIZE YOUR MAINFRAME SOLUTIONS

EVEN THOUGH MAINFRAME systems aren't as common as they were a few decades ago, mainframe computers remain some of the most reliable, efficient, and secure systems available today. Over the past 50 years or so, mainframe manufacturers have worked to add new capabilities and greater scalability, which has kept the mainframe market relatively fresh and preserved mainframe systems as the go-to infrastructure for government organizations and large corporations.

However, there are some downsides to mainframes that organizations have dealt with for years, some of which have forced many enterprises to move away from mainframes entirely. Because mainframes are rather complex and take up quite a bit of space in the data center, it takes effort to maintain them properly and make sure you get the most out of them.

Fortunately for those companies that still find mainframe systems useful and rely on them for a significant portion of their computing tasks, CA Technologies (www.ca.com) offers solutions that help you better understand the equipment you're using and formulate a plan for how to continue leveraging those investments for years to come. And as it does with all of its solutions, CA will help you through the implementation process to make sure your mainframe systems are operating at peak performance well into the future.

CA Assessment Services

The first step in improving the performance of your mainframe systems is to acquire a baseline reading of how your current infrastructure is running. At the outset, CA Technologies will help you evaluate your mainframes, find areas that could use improvement, and point out specific changes you can make, using CA solutions, to get more out of those systems.

Technicians from CA Technologies will perform health checks on your physical equipment as well as on your software solutions to determine how those products are being used and how you can use CA solutions to improve them. CA Technologies will also give you product usage reviews, which are essentially grades for how well you are currently taking advantage of your mainframe solutions and how well those use cases line up with the long-term goals of your organization.

At the conclusion of this review process, you'll receive a report that provides specific recommendations on how best to implement and deploy CA solutions to get the most out of your mainframe systems.

CA Implementation Services
Once you have a solid understanding of the current state of your mainframe system, you can move on to the implementation and deployment process, which is another area in which CA Technologies offers experienced experts for assistance. In fact, CA mainframe professionals have aided in thousands of onsite implementations and will bring all of that expertise to your deployment to make sure it's as successful and seamless as possible.
CA focuses not only on making sure the implementation is done right, but also that it is finished within a reasonable amount of time. Experts will work closely with your mainframe IT team to quickly deploy mainframe solutions and at the same time cut costs.
CA also offers solution implementation services designed to help deliver mainframe solutions over a longer period of time in phases. This ensures that your infrastructure can keep up with the changes you make, and accommodates testing along the way to ensure that everything is working properly.
CA offers numerous resources, including Deployment Playbooks and Solution Run Books from CA Services, which are designed to help you better understand the operational requirements of your business and provide guidance for installation, security controls, and more. Combined with the help you'll receive from CA experts, these knowledge resources are invaluable as you move through the initial implementation and on through the optimization process.

CA Optimization Services
Because companies with mainframes often use them as the computing backbone of their business, it's important to make sure that once everything is installed and implemented, everything is optimized to the highest degree possible. CA's optimization services are designed to help you make small or large changes to your mainframe solutions in order to not only improve overall performance, but also reduce CPU consumption to improve efficiency and resource utilization.
At the beginning of the optimization process, CA will help you spot the low-hanging fruit that your IT team might not have noticed while focusing attention on other aspects of your infrastructure. Even the smallest tweaks can significantly impact how your mainframe system runs over the long run, so these changes are essential to making true efficiency gains.
From this point, you'll once again continue to evaluate how you use your mainframe equipment and identify potential future improvements. CA will help you integrate best practices into your overall mainframe approach and find the best possible use cases for your existing systems. Even after your mainframes are fully implemented and optimized, CA will perform additional health checks from time to time to maintain past adjustments and find even more ways to improve performance.

CA Mainframe Solutions In Action
Blue Hill Data Services is a New York-based data center hosting provider that focuses on application support, disaster recovery and business continuity, and mainframe systems, which makes it a prime candidate for CA's mainframe solutions. Blue Hill's initial challenge was, of course, that it wanted to provide the best possible service, but the company also wanted to make sure it offered service at a competitive price point. This complex goal was compounded by the fact that the company's software strategy consisted of multiple disparate solutions from different vendors, which prevented Blue Hill from having one integrated system for its own internal infrastructure, as well as infrastructure hosted for clients.
The solution to Blue Hill Data Services' problem was relatively straightforward and involved taking advantage of 18 CA Technologies solutions in order to create a well-integrated system for better managing and maintaining mainframe software. From performance and automation to application quality and testing, Blue Hill, with the help of CA experts, implemented numerous solutions that worked in concert to improve the company's mainframe performance and establish a better foundation for service and support so Blue Hill could provide a high level of service for its customers at a lower price. And as an added bonus, the company was able to improve the productivity of its staff by giving them easier-to-use tools and adding more automation.
CyberTrend / November 2014

21

CA Technologies Cloud Services


EASE THE CLOUD ADOPTION PROCESS & LET CA HELP YOU PROPERLY IMPLEMENT SOLUTIONS

WHEN BUSINESSES TRY to implement cloud computing, they often find that the process involves much more than simply finding a reliable provider, signing an SLA (service-level agreement), and moving data and workloads over to the cloud. The key to a successful cloud implementation is to find a balance between cost and capacity. You want to gain the flexibility and scalability benefits of a cloud environment, but you have to be careful you don't end up overspending by putting too much in the cloud and maintaining a high capacity, and therefore higher cost, over an extended period of time.
CA Technologies (www.ca.com) helps remove the guesswork from these specific situations and helps companies embrace the cloud in the most effective way possible and without exceeding their budgets. CA doesn't just give you physical, virtual, or cloud-based solutions and then leave you to figure them out on your own. It also offers specific services designed to help throughout the implementation process and make sure you get the most out of your solution investments.

Cloud Capability Assessment & Strategy Services
When companies get to the point where they've virtualized as many workloads as possible and don't see a path to further efficiency gains, they end up in a state CA Technologies refers to as "virtual stall." This is a gray area where the company may think it has maximized its resource utilization, but in all actuality, there are gains to be made either through further optimization or by moving some workloads to the cloud. CA Technologies offers its CCAS (Cloud Capability Assessment & Strategy) Service to help companies overcome these issues.
The CCAS Service gives you a one-two punch of assessment and strategy, which means you'll be able to improve your existing setup, but also find ways to improve your situation well into the future. The assessment portion of CCAS revolves around making sure the people, process, technology, and architecture in your organization are all working together efficiently. CCAS will then help you find ways to fix issues in all of these areas and make your virtualized environments function more effectively.
The other component of CCAS, strategy, is where CA helps you look forward to where you can go in the future. This transcends simply fixing problems that already exist and focuses on what issues may arise in the future as well as what opportunities are ahead. The goal of CA Technologies with CCAS is to reduce the risk of your investments and make sure you get the best possible performance at the lowest cost, whether it's in an onsite virtualized environment or in the cloud.

CA Business Service Insight Implementation Services
Another major issue to think about when moving to a cloud environment is the SLA you have with your provider. This contract states that the provider will offer a certain level of availability and reliability as well as a pricing structure. What ends up in the SLA is ultimately up to the provider and the customer, but companies shouldn't stop caring about it after they sign on the dotted line. That's where CA Business Service Insight and CA BSI Implementation Services come into play.
Instead of performing manual reporting to make sure a provider is meeting your SLA requirements, CA BSI Implementation Services will help you install an IT service management solution that tracks key metrics and ensures you're getting the performance you were promised at the right price. CA also offers this same service to cloud providers, so they can track metrics from a service delivery agreement point of view. This can be particularly beneficial (and eliminate confusion) if both parties use the same solution to monitor the services and ensure everything is performing at the highest possible level.
For cloud customers, CA BSI Implementation Services are beneficial because you can set up a centralized SLM (service-level management) solution that helps you monitor the environment and make sure that it is meeting your company's needs. It's also a nice reporting tool that can help in cases of regulatory compliance because you have a record of everything moving in and out of the cloud environment. And cloud providers can improve customer service by proving that they are meeting the agreed-upon requirements in the SLA and vowing to make necessary changes to the service if it isn't up to par.
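At its core, the SLA tracking described above comes down to a simple comparison: measured availability versus the contracted target. The following Python sketch illustrates that calculation generically; it is not CA BSI, and the 99.9% target and function names are illustrative assumptions.

```python
from datetime import timedelta

# Hypothetical SLA target: 99.9% monthly availability ("three nines").
SLA_TARGET = 99.9

def availability(total: timedelta, downtime: timedelta) -> float:
    """Percentage of the period the service was actually up."""
    up = total - downtime
    return 100.0 * (up / total)

def check_sla(total: timedelta, downtime: timedelta) -> str:
    """Compare measured availability against the contracted target."""
    measured = availability(total, downtime)
    status = "MET" if measured >= SLA_TARGET else "BREACHED"
    return f"availability {measured:.3f}% vs target {SLA_TARGET}% -> {status}"

# A 30-day month at 99.9% allows only about 43.2 minutes of downtime,
# so 50 minutes breaches the target while 20 minutes meets it.
month = timedelta(days=30)
print(check_sla(month, timedelta(minutes=50)))   # BREACHED
print(check_sla(month, timedelta(minutes=20)))   # MET
```

A real SLM tool would feed this comparison from continuously collected monitoring data rather than hand-entered downtime figures, but the report it produces answers the same question.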

CA Secure Cloud
Another way in which CA Technologies uses the cloud to address core business needs is with its CA Secure Cloud product (formerly called CA CloudMinder). CA Secure Cloud is a Web-based IAM (identity and access management) solution that enables organizations to stay on top of identity management and governance across all employee devices, whether they be laptops, desktop computers, or mobile devices, and all applications, including enterprise and partner applications as well as SaaS (software as a service) products.
One chief benefit of CA Secure Cloud is that it provides federated SSO (single sign-on), which pairs each employee's individual authentication information with multiple applications and systems. This simplifies matters both for employees and IT, and contributes to CA Secure Cloud's overall aim of simultaneously boosting productivity and lessening the opportunity for data loss.
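Federated SSO rests on one idea: the user authenticates once to an identity provider, which issues a signed assertion that every federated application trusts, so no individual application has to keep its own password for that user. The sketch below illustrates that trust relationship only; the shared secret, token format, and application names are assumptions for illustration, not how CA Secure Cloud is implemented (real identity providers typically use asymmetric keys and standards such as SAML or OAuth).

```python
import hashlib
import hmac
from typing import Optional

SHARED_SECRET = b"idp-signing-key"  # illustrative; real IdPs use asymmetric keys

def issue_token(user: str) -> str:
    """Identity provider side: authenticate once, then sign an assertion."""
    sig = hmac.new(SHARED_SECRET, user.encode(), hashlib.sha256).hexdigest()
    return f"{user}:{sig}"

def verify_token(token: str) -> Optional[str]:
    """Application side: trust the IdP's signature, keep no local password."""
    user, _, sig = token.partition(":")
    expected = hmac.new(SHARED_SECRET, user.encode(), hashlib.sha256).hexdigest()
    return user if hmac.compare_digest(sig, expected) else None

# One sign-on, many applications: each app runs the same verification.
token = issue_token("alice")
for app in ("mail", "crm", "partner-portal"):
    assert verify_token(token) == "alice"
assert verify_token("alice:forged-signature") is None
```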

CA Clarity PPM
CA Clarity PPM (Project & Portfolio Manager) helps you better track projects from the idea stage through to deployment and ongoing support. Additionally, Clarity PPM enables you to manage your entire business portfolio in order to align your resources with your overall business objectives. CA Clarity PPM offers a centralized management console for tracking every resource involved in a project as well as every aspect of your portfolio. The key benefit of Clarity PPM is that you get an in-depth view every step of the way so you can make more informed decisions.
As it does with its other solutions, CA Technologies offers Implementation Services for Clarity PPM to help companies deploy and use the product in the most effective way possible. If companies were to set up the solution on their own, it could take years before it reached its potential simply because the business may not be familiar with the product. But CA Technologies helps companies input all relevant information into Clarity PPM so that they can go beyond the deployment phase much more quickly and instead focus all of their resources on monitoring and managing their projects and portfolios. This idea is incorporated in CA Clarity PPM's Foundation and Acceleration Services, which help you build out your implementation and then optimize it in a timely manner.

CA Clarity PPM In Action
Home Hardware is a home improvement retailer based out of Canada with four distribution centers. Because all of its stores are independently owned, Home Hardware has to use IT solutions to make sure those locations always have the right products in stock to drive up sales and generate more revenue. As one might imagine, this issue is compounded by the fact that not every store needs the exact same inventory of products, based on their specific customer needs.
To help overcome this major technology-related problem, Home Hardware uses CA Clarity PPM to monitor and manage its supply chain and make sure that the headquarters, as well as each individual store, always has access to the information it needs. Using CA Clarity PPM, Home Hardware can make sure the right products are on the shelves, which not only raises sales but also improves the overall customer experience, and Home Hardware can do this at the organization and at the individual store level. This use case is just one example of how CA Technologies products offer benefits not just internally for the business, but also externally for customers.


Managed Cloud Services


THE UPSIDE OF HANDING CLOUD OPERATIONS TO A THIRD PARTY

KEY POINTS
A managed cloud service usually
entails an organization contracting
with a third party to provision, configure and operate a cloud service
on behalf of the organization.
Off-loading in-house IT management duties to increase focus on
other business objectives is one
benefit of managed cloud services.
Companies running out of data
center space and lacking staff,
skills, or resources often consider
managed cloud services.
Ensuring the provider offers SLAs
(service-level agreements) that
meet business needs, and service
that meets or exceeds statutory and
regulatory needs, is key.


AS WITH OTHER CLOUD computing models, managed cloud providers are piquing the interest of organizations with the promise that adopting managed cloud services, be they for app development, load balancing, security, testing, migration, scalable server capabilities, storage, or one of many other services, can free up time so that customers can spend more energy focusing on business initiatives and less on IT management chores.
Beyond reducing administrative complexity and capital expenditure budgets, says Karyn Price, Frost & Sullivan industry analyst, companies are eying managed cloud services to stay more current with new technologies and address challenges they face in managing data centers. Although the cloud promises to offer previously unachievable results at attractive price points, Price says, "the complexity that exists in the cloud can confuse resource-strapped IT departments and can drive them to seek outside assistance."


This is where managed cloud providers come in, offering the promise of handling administrative tasks while guaranteeing a specific level of service. Furthermore, providers typically upgrade their hardware, platforms, and software more frequently than businesses can, while coupling offerings with the knowledge and expertise needed to deploy solutions and maintain optimal operating environments.
William Martorelli, Forrester Research principal analyst, says one way to envision managed cloud services is as something emerging from the evolution of managed hosting and the proliferation of multiple cloud models. For many suppliers, managed hosting is still a growth business, he says. The managed cloud model is expected to contribute to this growth, if not match the explosive growth of the public cloud, he says.

A Working Definition
Simply put, managed cloud providers help consumers move to the cloud and, once moved, operate cloud-related services for them. Beyond providing infrastructure, Price says, managed cloud vendors offer services available within the cloud stack and incorporate automation, orchestration, and proactive manual monitoring to manage cloud applications or infrastructure for customers.
John Howie, principal at Howie Consulting, says that increasingly, cloud brokers (companies that match cloud consumers' needs with cloud providers) are recommending managed cloud providers. Furthermore, Howie says, it's highly likely some major public cloud providers will begin offering forms of managed services.
For background on the matter, Howie points to "The NIST Definition of Cloud Computing," which describes cloud service models, cloud deployment models, and essential characteristics, one of which is on-demand self-service. This last trait means a cloud consumer can buy a cloud service from a cloud provider and provision it without requiring human interaction. A managed cloud service is one in which the consumer contracts with a company to provision cloud services on its behalf, Howie says. A good example of a managed cloud service is a private cloud that a third party hosts, he says. The third party works with the consumer to understand its requirements and then build, tailor, and configure cloud services on its behalf, and operate it.
Martorelli says managed cloud models can be considered forms of managed hosting that use cloud infrastructure. Hosted private clouds and hosted virtual private clouds are typical examples; a private cloud is physically isolated and a virtual private cloud is virtually isolated. Both are conceptually close to virtual managed hosting, Martorelli says. Some providers are also pursuing models in which they manage public cloud services adjacent to their own services, whether they be managed hosting, disaster recovery, or other cloud model variants, he says. "While providers can't actually manage the underlying public cloud infrastructure, they can surround it with additional managed services, SLAs [service-level agreements], and more favorable contractual terms and conditions," Martorelli says.
Ed Anderson, research vice president with Gartner, says that in the case of private clouds, various managed cloud scenarios exist. For example, a company could run a private cloud service in its own data center but have a third party manage it, or run it in an external data center. With private cloud services specifically, Anderson sees three dimensions: ownership (Who owns the assets?), location (Where is it running?), and management (Who is managing it?). "Any time you have a third party managing the system, that's what we would call a managed cloud service," he says.
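Anderson's three dimensions can be restated as a tiny decision table: whatever the ownership and location, third-party management is what makes a scenario a managed cloud service. The sketch below merely encodes that taxonomy; the field names and example scenarios are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class CloudScenario:
    owner: str     # who owns the assets
    location: str  # where the cloud runs
    manager: str   # who operates it day to day

    def is_managed_cloud(self) -> bool:
        # Per the taxonomy above, any third-party-managed system
        # counts as a managed cloud service, wherever it runs.
        return self.manager == "third-party"

# Private cloud in the company's own data center, operated by a provider:
onsite = CloudScenario(owner="company", location="company-dc", manager="third-party")
# The same cloud, self-managed:
selfrun = CloudScenario(owner="company", location="company-dc", manager="company")

print(onsite.is_managed_cloud())   # True
print(selfrun.is_managed_cloud())  # False
```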

"To some, the idea of a managed cloud itself is a contradiction in terms, but there's no inherent reason why managed hosting customers cannot enjoy at least some of the benefits of cloud computing."
WILLIAM MARTORELLI
Principal Analyst : Forrester Research

The Motivators
A primary benefit of managed cloud services is the ability to get to the cloud faster. For example, says Dave Bartoletti, Forrester Research principal analyst, managed cloud services can help an organization enable cloud platform access for developers quicker than on its own. Another benefit often cited is the potential to off-load IT management functions to gain more time, agility, and flexibility in terms of focusing on innovation. As Howie says, consumers don't need to train up their IT staff or acquire new staff with the requisite experience and skills.
Agility, Anderson says, ranked as the top reason, by a wide margin, why organizations consider using a cloud service, according to Gartner data. "I think that says volumes. They believe public cloud providers, including, in the parlance, managed cloud providers, can deliver services much faster than they can internally," he says.
Another scenario under which companies consider managed cloud environments is when they're facing a need to replace depreciated hardware and software but don't possess the required skills or staff to do so. Additionally, Howie says, a company that is considering a move to the cloud but believes the risk of public cloud computing too great might instead adopt a private cloud model through a managed cloud provider.
Martorelli says that compared to a pure public model, companies may gain more SLA coverage, additional terms and conditions, and comfort levels regarding regulatory requirements via a managed cloud provider. Desire for support at the applications layer would be another motivation, he says.


Bartoletti recommends that companies that are running out of data center space, or that find it too expensive to retain skilled staff with knowledge of the company's applications, consider managed cloud services. While many cloud service providers are cloud pure-plays, the majority are not, Bartoletti says. "Most providers are taking a hybrid approach with on-premises, traditional hosting and outsourcing, as well as cloud solutions. Nearly all now acknowledge and are embracing a mixed-mode use case."
Price says providers that embrace hybrid cloud configurations and that can offer management tools that traverse cloud environments are filling a critical gap as businesses increasingly embrace a hybrid strategy. Likewise, providers that can aggregate third-party services, or become a cloud service broker offering managed infrastructure supported by a suite of vendor-agnostic cloud-based applications and services, "are poised for success in the increasingly complex service environment," she says.

The Need To Know
Before forging a relationship with a managed cloud services provider, due diligence is vital. Organizations, for example, should fully understand what their desired business outcomes are first. "Without knowing what the business needs and wants, the experience will rarely be a good one, and the goals won't be achieved," Howie says. Companies must also understand that abilities, services, quality, and long-term positioning will differ among providers. For example, knowing the provider's financial well-being is key, Howie says.
Beyond ensuring a vendor can deliver on the functional and nonfunctional requirements the organization requires today, Bartoletti says, organizations must consider scalability, availability, security, and other operational requirements when evaluating providers.
Bartoletti provides another scenario: If, for example, an organization selects a global provider for a given service but the provider's implementation can't meet the company's needs in, say, Latin America, another consideration is to seek a suitable alternative that can. "You may not need to manage this additional vendor (or the integration and complexities that come with onboarding), as many Tier 1 cloud service providers will subcontract to these smaller players and manage this relationship on your behalf," he says.
Elsewhere, ensure a provider can offer the SLAs the business requires and service meeting or exceeding the requirements of statutory and regulatory obligations. Negotiating a win-win contract is also important. "While you can certainly buy cloud services on a credit card and off providers' published price lists, if they're strategic to your business, you probably want a better relationship," Bartoletti says. "This is where going in with a quid pro quo becomes your strongest asset. The more you know about their strategic direction, the better you can map it to your own and identify areas where you both want to put skin in the game."
It's also key to have an exit strategy in the event that a project fails or a provider relationship goes sour. This means having a strategy for self-managing the company's cloud presence and passing control to in-house teams, Howie says.
Anderson, meanwhile, says it's important to understand the trade-off organizations make in handing off control to gain the ability to focus on matters more valuable to the organization, such as innovation. "Cloud services are really nothing more than a new IT hosting and consumption model," he says. "It's really nothing more than that."
There are many cases where this fits in well, but others where it doesn't, he adds. If companies are thorough about understanding this up front, it's more likely they'll recognize the success of their initiatives.
Overall, Price notes, as cloud computing continues to evolve, the level of manual intervention required of providers to offer a managed cloud service is dwindling. "Managed services formerly offered a strong human component, where an actual person proactively monitored the service and suggested or made tweaks to ensure that SLAs were met and customer satisfaction was high," she says.
As cloud orchestration and automation platforms take on many routine management functions and make proactive changes based on real-time operational metrics, however, the need for human intervention becomes increasingly less. "Eventually, it's plausible that managed cloud services will simply be cloud services, based on the nature of cloud management tools available in the market," she says.



Mobile Collaboration: Take Back Control


TOO MANY COLLABORATION TOOLS CAN BE COUNTERPRODUCTIVE

KEY POINTS
Security, privacy, and infrastructure issues may arise when employees are allowed to download any app they want to their mobile devices, so consider policy issues carefully.
Consider involving employees in the selection, testing, and evaluation of mobile collaboration tools.
Focus on selecting mobile collaboration solutions that provide cross-platform support and that will work on different device form factors (smartphones, tablets, etc.) to avoid compatibility issues.
Consider implementing policies and rules regarding mobile collaboration tool usage to ward off problems related to security and data leakage.

INCREASINGLY, WORKERS don't just desire mobile collaboration abilities, they expect them. It's no wonder, with mobile devices now omnipresent in the workplace and enterprise collaboration tools now available via cloud services and mobile apps. Benefits include better productivity, greater flexibility, and time and cost savings.
That said, not everything concerning mobile collaboration tools is rosy. With so many options easily acquirable, including consumer tools that workers use outside of the company's watch, confusion can easily surface as to which tools employees should use to collaborate. Further, frustration can arise when the different apps and services employees are using won't play nice together and actually hinder collaboration. Meanwhile, business owners and executives are left to make sense of the mess.

Messy Business
It's understandable that with so many mobile collaboration options available, users differ on what they find most useful and appealing. It's when one employee adopts, say, one file-sharing service while another adopts a different service that collaboration becomes more complicated. The same holds true with other mobile collaboration tools (video conferencing, document editing, calendaring, etc.).
Jeanine Sterling, Frost & Sullivan principal analyst, says when companies and individual workers have a slew of mobile collaboration options to choose from, things can become "a little messy." For example, too many different solutions can result in workers using non-compatible software. One employee may focus on choosing collaboration tools that work on her particular mobile device, for example, but those may not be designed to work with a mobile OS or a certain device form factor that another team member uses.
"Another issue can be lack of a consistent interface with, or lack of any type of integration between, the worker's mobile device and his/her desktop computer," Sterling says. "Files and documents ideally should be accessible and editable across smartphone, tablet, laptop, and desktop." Although workarounds may be available to overcome such issues, these take time, which nullifies much of the initial mobile collaboration advantage, she says. Employees can become confused and frustrated, each championing their own favorite collaboration tool; again, the antithesis of what collaboration is all about.
Similarly, Jim Rapoza, Aberdeen Group senior research analyst, says having too many tools "really defeats the whole purpose of collaboration tools, which is to make it easier for people to seamlessly work together wherever they are." The biggest problem Rapoza sees with these tools, however, is that they don't integrate with how people really work. "If, to collaborate, you have to go to some app that isn't integrated with your regular workspace and tools, you probably aren't going to use that collaboration app much," he says.

Security & Privacy
Other issues with these tools aren't exactly specific to mobile collaboration but still pose problems. When workers decide they can download any app they like to their mobile devices, Sterling says, there's potential for issues. "IT then tears their hair out worrying about security and privacy vulnerabilities and proper infrastructure support," she says. While Rapoza doesn't believe mobile collaboration tools create issues or problems that don't already exist with other collaboration tools, the existing issues can be significant, especially when it comes to sensitive data being improperly sent or stored on these tools.
While mobile collaboration tools likely make things easier for workers, says Rob Bamforth, principal analyst with Quocirca, if employees don't share common approaches to information sensitivity and risk, and aren't suitably cautious, there may be problems for the organization. For example, company secrets and customer-related data should be kept private, he says. The former may have business consequences, but the latter could result in huge reputational damage and fines. The problem with mobile collaboration tools, Bamforth says, is that indiscretion is only one click away. Bamforth thus advises implementing guidelines or rules concerning the use of these tools (dependent on the industry and data sensitivity in question) that should be well-communicated and understood. "These rules must have teeth and consequences and ideally be automated by tools that enforce the agreed policies," he says.
Elsewhere, when employees use whatever app or service they want, says Jason McNicol, Ph.D., ABI Research senior analyst, IT loses the ability to track content (who accessed the system, from where did they gain access, when did they gain access, what might have been downloaded, etc.). Another issue experienced by businesses (albeit more frequently in the past) involves employees finding that some file-sharing services don't make clear what the latest iteration of a file is and who made changes along the way. Many providers, however, now offer versioning control as a standard feature, McNicol says.
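The versioning control McNicol mentions answers exactly the two questions in that example: what is the latest iteration of a file, and who changed it along the way. A minimal sketch of such a version history follows; the class and field names are illustrative, not any particular vendor's data model.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Version:
    number: int   # sequential iteration of the file
    author: str   # who made the change
    note: str     # what changed

@dataclass
class SharedFile:
    name: str
    versions: List[Version] = field(default_factory=list)

    def save(self, author: str, note: str) -> Version:
        """Record each change with its author, in order."""
        v = Version(len(self.versions) + 1, author, note)
        self.versions.append(v)
        return v

    def latest(self) -> Version:
        """The unambiguous 'latest iteration' of the file."""
        return self.versions[-1]

doc = SharedFile("q3-forecast.xlsx")
doc.save("alice", "initial draft")
doc.save("bob", "updated revenue tab")
print(doc.latest().number, doc.latest().author)  # 2 bob
print([v.author for v in doc.versions])          # ['alice', 'bob']
```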

Regain Control
What employees and business owners/executives can do to take control and agree upon a set of compatible solutions depends on the circumstances. McNicol says this question is coming to the forefront more often. Collaborate, evaluate, and communicate are the three criteria he points to for finding a solution that works best for everyone. First, collaborate with employees. Get them involved to find out which solutions they're using and why. Then the organization can evaluate the solutions and communicate the preferred choice, he says. Collaborating with employees on the decision-making process will increase the odds they comply with company policy, he says. "All this does assume the company already has a mobile strategy in place to help guide the process," he says.
Rapoza offers similar advice. Figure
out how people really work and collaborate. Talk to the different groups
and stakeholders and find out what
they really need, he says. If a tool really works and there arent significant
downsides, consider embracing solutions employees are already using, he
says. Bamforth points out that getting employees to switch from a preferred solution may be difficult without

considerable up-front consultations


and follow-up with policy-driven security tools that enforce or at least increase awareness of data leakage.
Additionally, employees who lack knowledge about or experience using mobile collaboration tools may require training. Sterling says businesses should train employees on mobile policies and practices so they're aware of the risks of downloading their own apps unbeknownst to the IT organization. "The real onus, however, is on management to research, vet, and standardize a set of tools in a timely manner that actually meets employees' needs," she says. "They have to get in front of the herd, as difficult as that may be," Sterling says.

Making Decisions
When filtering mobile collaboration tools, Rapoza emphasizes that collaboration is a feature, not a standalone application. Thus, focus on how a tool matches how people actually work. Also, consult employees to work out exactly what types of data might be shared via the tools. Based on this data, Bamforth advises setting simple rules, such as "never share this" or "if you must share this, use only X," and "use only Y for personal information."
Sterling says the tool-selection process will take time and a modicum of expertise. Smaller businesses in particular may lack sufficient resources to do everything alone. While peer referrals can be helpful, Sterling says Frost & Sullivan research indicates wireless carriers are SMBs' favorite partners for selecting, buying, and deploying mobile apps. One reason is that, with a monthly invoice, carriers maintain a high-touch relationship with smaller businesses. "Many carriers have taken on the responsibility of vetting a still-fragmented set of vendor alternatives in a number of application categories," she says. Further, carriers focus on including affordable, easily managed cloud-based solutions in their mobile application portfolios, she says. Once companies gather recommended

"If, to collaborate, you have to go to some app that isn't integrated with your regular workspace and tools, you probably aren't going to use that collaboration app much."
JIM RAPOZA
Senior Research Analyst : Aberdeen Group

"Set simple rules, such as 'never share this' or 'if you must share this, use only X,' and 'use only Y for personal information.'"
ROB BAMFORTH
Principal Analyst : Quocirca

products, employees should be closely involved in trials and evaluations, Sterling says. "This can't be a top-down dictation," she says. Companies should also follow up with proper training.
Often, McNicol says, sales and marketing employees are mobilized first because returns are easily quantifiable and quick, so for small businesses, mobilizing sales and marketing comes down to price. Before buying anything, however, a mobile strategy should be formulated, he says. Who or what are you trying to mobilize, and why? Implementing a mobile solution is a lot more difficult and can take up to a year in some cases.
In terms of features, abilities, and types of solutions that could help mitigate or prevent cross-device/platform compatibility issues, Sterling advises choosing compatible software so that a tool can work for desktop and mobile workers across tablets, smartphones, and laptops. "In our BYOD [bring your own device] world, it should also work across iOS and Android," she says. "There are many, many good solutions out there. It's up to the business to take the initiative, listen to its workers, and standardize on a set of solutions that really enable and optimize its employees' performance."
Rapoza also advises against using single-platform options. "If your employees are on a variety of platforms, don't go with an iOS option only," he says. Someday, an HTML5 option may be realistic, though for now most of those are lacking in real-time collaboration capabilities.


Most tools now offer cross-platform support for iOS, Android, and Windows Phone, but with a catch, McNicol says. "If we're talking about MDM [mobile device management] solutions, there are greater features for iOS and the most recent versions of Android, but there are fewer MDM features for Windows Phone management," he says. "If you look at app development tools, businesses can actually develop the same capabilities across multiple platforms due to app libraries." Content-management solutions focus on iOS and Android, but not all support Windows Phone, he says. With email management, most services are based on Microsoft Exchange Server, so this is uniform across platforms, McNicol says.
If eyeing cloud and Web-based solutions, Bamforth cautions that there are security challenges to consider before deploying these types of solutions. Pick vendors with a solid reputation for taking fast action and being security-oriented, he says. "It's also better to ensure that the security, whilst strong, is as unobtrusive as possible to the user." Overall, if companies make solutions hard to use, employees will simply use their favorite consumer tool and thus defeat the object of imposing security, he says. Never underestimate the ability of employees to keep their working life simple, he says.


Anatomy Of A Tablet
WHAT YOU SHOULD LOOK FOR IN A TABLET

WHEN IT COMES TO mobile productivity, you can't beat a tablet. The slim devices can travel nearly anywhere, and you'll enjoy more screen real estate than what's available on a smartphone, even on today's pocket-filling models. Another benefit of the bigger size, compared with a smartphone, is that tablet manufacturers can install more powerful processors for a speedier experience when using email and other productivity tools, so you might not need to bring along that bulky laptop. We know that the tablet shopping experience is fraught with technical specs and features, and we're here to examine the key things you should look for.
For easy comparison, we separated tablets into three categories: small (7 to 8.3
inches), medium (8.4 to 10.1 inches), and

large (bigger than 10.1 inches). This way,


you can compare features within the size
range you prefer, and the feature differences may help you decide whether to
supersize your tablet.

The OS
Before focusing on the specs and features that are desirable in a tablet, you'll want to explore your OS (operating system) options, because the OS shapes how quickly you can accomplish tasks with the tablet. We're not here to say one mobile OS is more functional than another, but it does help if you're familiar with the interface. For example, you might have experience with the mobile OS on your smartphone or a previous tablet. Or if you have a PC that runs Windows 8.1, you might want to look for a Windows 8.1-based tablet.



If you're already familiar with how a certain OS works, choosing a tablet with the same OS will make the transition easier. Another benefit is that you've likely already set up accounts for the OS's built-in features.

Small Tablets
Tablets in this size range are generally
less powerful and more affordable than
the larger models. Obviously, the bonus
of the smaller screen size is that the tablet
will fit easily inside a travel bag, purse, or
other carry-along. Generally, small tablets are comfortable to hold in one hand,
which makes them a good fit for users
who want tablets that double as e-readers.
When it comes to processing power, many models in this category feature a multi-core (typically quad-core) processor that runs at approximately 1.5GHz. For example, the Asus VivoTab Note 8 (www.asus.com) boasts a 1.86GHz quad-core processor. Budget tablets might
offer dual-core processors or models that

operate at slightly slower speeds, such as 1GHz. The difference in processing power is often noticeable with productivity apps and other demanding tasks, such as when you're multitasking among apps for weather, social media, and the Web. Typically, you'll get what you pay for in terms of tablet speed.
Storage capacity is another area where small tablets might fall short. Many small tablets are limited to 64GB, 32GB, or 16GB of internal storage, while tablets larger than 8.3 inches often support up to 128GB. If you need access to local copies of multimedia collections or large work files, you'll want to look for a small tablet that maximizes its storage with support for an external memory card. Note that some small tablets may not offer a video output, such as HDMI or DisplayPort, for easily displaying tablet content on the big screen.

Medium Tablets
Tablets that fall into the medium-sized
category tend to be the most popular.
Compared with models in the small tablet
category, the extra inches make watching
movies and TV shows or playing video
games more enjoyable. You'll be able to
use productivity apps more effectively, as
well. Medium-sized tablets also offer better
portability than laptops or larger tablets,
because they fit into most travel bags.
Improvements to tablet screens have
allowed manufacturers to deliver some incredible quality on medium-sized tablets.

The Asus Transformer Pad (TF701T) features a 1.9GHz quad-core CPU and a 72-core graphics processor. That means smooth HD video and snappy multitasking and browsing.

For example, we found several medium-sized tablets that produce resolutions greater than 1080p, such as 2,560 x 1,600 and 2,160 x 1,440. With so many pixels on the screen, content will look crystal clear and astoundingly detailed, even when your face is only a foot (or less) away from the display. Combining big screen resolutions with the medium-sized tablet also brings new life to newspapers, magazines, and other graphics-oriented content.
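To put those resolution numbers in perspective, pixel density (pixels per inch, or PPI) is simply the diagonal pixel count divided by the screen's diagonal size in inches. A quick sketch of the arithmetic (the 10.1-inch, 2,560 x 1,600 panel mirrors the example resolutions above; the 1080p figure is our own comparison):

```typescript
// Pixel density (PPI): diagonal pixel count divided by diagonal size in inches.
function ppi(widthPx: number, heightPx: number, diagonalInches: number): number {
  return Math.hypot(widthPx, heightPx) / diagonalInches;
}

// A 10.1-inch, 2,560 x 1,600 display works out to roughly 299 PPI,
// noticeably denser than a 1080p panel of the same size (~218 PPI).
console.log(Math.round(ppi(2560, 1600, 10.1))); // 299
console.log(Math.round(ppi(1920, 1080, 10.1))); // 218
```

The same function lets you compare any two tablets directly from their spec sheets.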
To go along with the bigger screen size, tablets in this size range also typically boast improved processors. For example, the Asus Transformer Pad (TF701T) features a 1.9GHz quad-core CPU and a 72-core graphics processor. This kind of power lets the Asus Transformer Pad deliver smooth 1080p HD video, and it'll feel snappy when multitasking and browsing the Web. The Transformer Pad's 10.1-inch display delivers a 2,560 x 1,600 resolution, and the micro HDMI port lets you share photos and videos on a television or monitor. You can also combine the Transformer Pad with an optional mobile keyboard dock that adds a USB 3.0 port and SD card reader.

Large Tablets
The tablets that are larger than 10.1 inches are often referred to as laptop replacements, because they typically feature a hybrid design with a detachable keyboard (or mobile keyboard dock) that lets the user switch between tablet and laptop form factors. The newest tablets in this category are often equipped with the fastest mobile processors, capable of handling multimedia creation and productivity tasks, as well as Windows 8.1 to support the full versions of applications you use on a traditional PC.
The Surface Pro 3 from Microsoft (www.microsoft.com) is a good example of a tablet that's designed to function as a laptop replacement. You can select from a variety of 4th Generation Intel Core processors, and the 12-inch tablet comes with a detachable keyboard. The Surface Pro 3 also offers memory and storage options similar to what you'd find on a laptop: a maximum of 8GB of memory and up to 512GB of storage.

The VivoTab from Asus is an 11.6-inch tablet that works with a Wacom digitizer stylus for precise input when drawing and writing notes.

You'll be able to connect peripherals, too, thanks to a full-sized USB 3.0 port and a microSD card reader.
Tablets bigger than 10.1 inches also have space for some amenities that aren't standard on tablets in the medium and small categories. For instance, the extra room allows manufacturers to include bigger batteries or, in some models, an additional battery in a keyboard dock to further extend use. For example, the Asus VivoTab can deliver 10.5 hours of battery life by itself, and when connected to the optional mobile dock, you can enjoy up to 19 hours of use. The Asus VivoTab also supports a Wacom digitizer stylus, so you can draw and write on the tablet's display, an ideal feature for taking notes in a meeting.

Size Matters
Performance and screen quality often scale with a tablet's size. For basic tasks, such as Web browsing and e-reader duties, small tablets will do the job. Tablets between 8.4 and 10.1 inches hit the sweet spot for people who want a portable device with a screen that's large enough for watching movies and viewing emails and documents but can still conveniently fit into an everyday bag. If you want the ability to use powerful software and easily create emails and documents, you'll likely want a large tablet or a convertible tablet/laptop that delivers greater processing power and screen real estate.


Mobile Web App Pros & Cons


HOW THEY CONTRAST & COMPARE TO STANDARD APPS & MOBILE WEBSITES

NEARLY EVERY mobile device user is familiar with mobile websites and standard
(or native) mobile apps, the former accessible via a Web browser and the latter
downloaded from app stores. Arguably
fewer users are as experienced with mobile Web apps. In terms of reaching customers, clients, and employees via mobile
means, each of these avenues has distinct
pros and cons. The following focuses on
what mobile Web apps entail in relation
to the other two options and why businesses might consider developing them.

Likes & Dislikes


Although there are similarities, several
traits distinguish mobile Web apps from
mobile websites and native mobile apps.
Native apps, says Ray Valdes, Gartner
research vice president, can tap into the
abilities of the underlying mobile device
platform theyre installed on to provide
the highest performance levels, richest
user experience, and highest degree
of functionality. Only a native app can


access some functions in a device platform, he says. A December 2013 report from Forrester Research cites such abilities as performing offline tasks, multisensor interactions, push notifications, and application integration as mobile app strengths.
Mobile websites, meanwhile, are those that are viewable and usable on a mobile device. At one time, they were built and operated as separate sites in parallel with classic desktop-oriented websites, Valdes says. Today, responsive Web design enables creating an all-in-one digital channel, meaning a mobile website is viewable on different devices and adapts to a device's unique form factor, Valdes says. A mobile Web app is similar to a mobile website but adopts an interaction model more like a desktop application, Valdes says. Thus, instead of the page-by-page, hypertext-driven interaction model of content-centric websites, the Web app may have a single-page structure for the user experience, he says.

Depending on how a business implements each option, slow performance, a limited user experience, and a narrower scope of functionality can result, Valdes says. Generally, though, a native app approach offers a faster, richer experience than a mobile Web app, which offers similar advantages over a mobile website. A mobile website is often cheaper to build and easier to update than a mobile Web app, which has similar benefits over a native app. Overall, these choices don't have to be mutually exclusive, Valdes says. Many large, successful companies implement all three to cover a full range of requirements.
Altaz Valani, Info-Tech Research Group senior consulting analyst, says it's possible to distinguish the three options by viewing them as layers, each offering a different perspective. At a capability level, for example, apps have access to device functionality (GPS, notifications, etc.), while mobile websites are more limited. From a connectivity perspective, apps can operate in offline

"Today, responsive Web design enables creating an all-in-one digital channel, meaning a mobile website is viewable on different devices and adapts to a device's unique form factor."
RAY VALDES
Research Vice President : Gartner

"Architecturally, a mobile Web app has a layer that sits between the Web application and native APIs, while a native app has direct access to native APIs."
ALTAZ VALANI
Senior Consulting Analyst : Info-Tech Research

mode, while websites are mostly always on. "Architecturally, a mobile Web app has a layer that sits between the Web application and native APIs, while a native app has direct access to native APIs," he says.
From a development perspective, Valani says, things get muddier. Typically, developers use HTML/CSS/JavaScript to create mobile Web apps and mobile websites but can use native languages (Objective-C, Swift, and Java) or cross-compile using C++, C#, or JavaScript as the source to create standard native mobile apps, he says. Deployment-wise, apps are typically distributed via public or private app stores or side-loaded, while mobile websites don't require installation. From a release-management perspective, apps require thought around versioning and upgrades, while mobile websites typically have the latest version, Valani says.

Reaching Customers
For businesses seeking the best way to reach customers, clients, employees, and others via mobile, there are various considerations to weigh regarding using mobile Web apps and websites vs. devoting resources to developing standard mobile apps. In terms of development issues, for example, mobile Web apps are based on a generic translation engine, whereas native mobile app code can be highly optimized to take advantage of specific hardware and OSes, Valani says. Additionally, mobile Web app frameworks can lag behind native mobile app frameworks, he says.
Valdes says that depending on how mobile Web apps are implemented, they'll have a more limited user experience and functionality vs. native apps. Conversely, depending on execution, native apps will have higher development and maintenance costs, and they'll be more difficult to update. Similarly, the Forrester report states that mobile apps shine at interactive experiences on targeted devices and can offer customers deeply engaging experiences. Mobile Web, meanwhile, delivers a consistent experience on every device.
In a 2013 article, Al Hilwa, IDC program director of software development research, describes an "enlightened coexistence" between Web and native device application platforms that could prevail. Although native apps will remain dominant, Hilwa states, the Web platform will make significant inroads, to the point that Web technologies may one day infuse the majority of mobile apps. Initially, various development tools that generate installable native apps will support Web technologies, he writes. These hybrid apps may be written primarily in HTML and JavaScript and take advantage of browser components available on a native device platform, he states.
Forrester describes hybrid apps as one part app and one part Web. Similar to mobile apps, hybrid apps have a presence in app stores and on touchscreens, load like apps, and have local storage and security abilities. Like mobile Web, hybrid apps display content in an embedded browser. In short, Forrester states, hybrid apps have a "best of both worlds" sensibility, although they do carry burdens associated with mobile apps and the mobile Web.

Questions To Ask
For Valani, deciding whether a mobile Web app, standard mobile app, or mobile website approach is best is essentially an apples-and-oranges scenario. "Trying to correlate reach and end users would imply that customers prefer one type of app container over the other," he says. Numerous other external factors are involved, however, including social ranking, trustworthiness, update frequency, integrated ecosystem, and personalization. "Bottom line is the business shouldn't focus on the technology but on the value," he says.
Specifically, Valani believes the container should be a secondary issue. In general, businesses should focus more broadly on mobile innovation, including by asking how it will make workers more productive, provide the business a competitive edge, provide other unique or enhanced services, and enable the organization to leverage existing investments. Questions business administrators, architects, and developers should consider include: Is native hardware access needed? Does a large pool of Web developers who understand HTML, CSS, and JavaScript already exist? Do you need to deploy an app on multiple devices? Is the fact that code may not perform equally across devices acceptable? Is there a release-management strategy for mobile apps?
When determining an approach, Valdes stresses the need for businesses to know their users, their needs regarding the business's digital channels, and the high-value, high-priority use cases for specific channels. Organizations will likely need a full-featured, content-rich website accessible by desktop PCs, tablets, and a variety of mobile phones, he says. Further, there might be a subset of functions that can be packaged into a mobile Web app with an easy-to-use, touch-centric interface. Lastly, there might be a small handful of scenarios for which a dedicated, high-performance native app makes sense, he says.


Storage For New Laptops


DONT SKIMP WHEN IT COMES TO YOUR FILES, DOCUMENTS & APPLICATIONS

AS ANY ORGANIZATION begins to look into replacing aging hardware, there's always going to be a trade-off between how much new technology to purchase and how much money is available to spend. When it comes to something as complex as new laptops, the per-unit cost can vary from a couple hundred dollars to several thousand, which makes the decisions about what parts go into the laptops all the more important. We'll help you cut through the specs, abbreviations, and tech speak to determine the best form of storage for new laptops.

Key Considerations
Although the technology is the same, there's a big difference between what you need to consider when purchasing storage for laptops vs. desktops. Jonathan Weech, Solid State Storage Marketing Manager at Micron (www.micron.com), says there will be some compromises that need to be made due to the physical space and power limitations of laptops. "The best place to start is to understand the use case required (i.e., mobile workstation, data entry, software development, content creation, mobile sales force, etc.), and then from that determine the correct mix of performance, power management, capacity, and form factor."
Any application that is limited by storage speed should favor SSDs (solid state drives). As the name implies, SSDs have no moving parts, relying on a controller chip and flash memory. SSDs are the newer storage option and generally cost more per gigabyte than hard drives, but they are capable of achieving significantly faster read and write speeds. A fast storage subsystem can benefit your laptop's boot-up process, content creation, and any application that requires the viewing and manipulation of large files, among other things. According to Weech, SSDs are an obvious choice in many of these applications, because they "are much faster than hard drives, [are] more energy efficient, come in capacities up to 1TB, and are available in laptop-friendly form factors including 2.5-inch, mSATA, and M.2." Arguably, most professional and workstation applications will see some tangible benefit from using an SSD as compared with an HDD (hard disk drive).
The traditional hard drive is generally
a 3.5-inch enclosure that houses one or
more discs, called platters, and a metal
arm, called an actuator arm, that reads
and writes data as it moves over the surface of the spinning platters. Applications
that are more capacity constrained, such
as raw storage for audio and video files
for playback only, may benefit from less
expensive HDD storage. Again, an attempt to keep the storage subsystem's price as low as possible is the best reason to go with HDDs.

Speed Rules
Modern laptops generally support the SATA III (6Gbps) interface, a figure that refers to the connection's maximum theoretical throughput. Both HDDs and SSDs support the interface. When it comes to actual performance, however, an SSD can achieve speeds above 500MBps (megabytes per second), whereas a 2.5-inch laptop HDD performs the same operations at roughly 120MBps.
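Those throughput numbers translate directly into waiting time for large sequential transfers. A rough sketch (the 4GB file is our own example; the 500MBps and 120MBps figures are the ones cited above, and real-world copies involve additional overhead):

```typescript
// Approximate sequential-transfer time: size divided by sustained throughput.
function transferSeconds(sizeMB: number, throughputMBps: number): number {
  return sizeMB / throughputMBps;
}

const fileMB = 4 * 1024; // a 4GB video file
console.log(transferSeconds(fileMB, 500).toFixed(1)); // SSD:  ~8.2 seconds
console.log(transferSeconds(fileMB, 120).toFixed(1)); // HDD: ~34.1 seconds
```

The gap widens further for random access, where the HDD's mechanical delays dominate.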
SSDs are often advertised using peak read and write performance numbers, which tend to hover above or below that 500MBps mark. HDDs, on the other hand, read and write data as the platters spin, and it's this spin rate, measured in RPM (revolutions per minute), that distinguishes the slower HDDs from the faster ones. The most common speed for HDDs is currently 7,200 RPM, though drives that spin at 5,400 RPM are also very common in laptops. Hard drives that target the enterprise market can spin at 10,000 RPM or 15,000 RPM, but they tend to be significantly more expensive, and the cost advantage they have over SSDs vanishes quickly.
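Spin rate matters because, on a random access, the platter must on average rotate half a revolution before the requested sector reaches the read/write head. That relationship is easy to compute (seek time, a separate mechanical delay, is ignored in this sketch):

```typescript
// Average rotational latency in milliseconds: half the time of one revolution.
function avgRotationalLatencyMs(rpm: number): number {
  const msPerRevolution = 60_000 / rpm; // 60,000 ms per minute
  return msPerRevolution / 2;
}

console.log(avgRotationalLatencyMs(5400).toFixed(2));  // ~5.56 ms
console.log(avgRotationalLatencyMs(7200).toFixed(2));  // ~4.17 ms
console.log(avgRotationalLatencyMs(15000).toFixed(2)); // 2.00 ms
```

An SSD has no rotational latency at all, which is a large part of why it feels faster even when the throughput figures look comparable.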
There is another speed boost on the horizon called SATA Express (with speeds up to 1 gigabyte per second); however, no products supported the technology as we went to press.

The Crucial M500 M.2 SSD by Micron (www.micron.com) is small enough to fit in the thinnest laptops and even tablets.

Form Factor
Hard drives and SSDs are readily available as options for laptops, and as such they both support the same form factors, which are typically 2.5 inches deep. Some compact laptops may use 1.8-inch drive bays; however, new standards call for even smaller form factors. mSATA SSDs and M.2 (pronounced "em dot two"; the newest form factor) drives can typically deliver storage capacities of as much as 512GB. Like many tech products, as the size shrinks, the price for comparable performance goes up.

Capacity
As we mentioned previously, SSDs dominate when it comes to speed, but the capacity crown goes to HDDs. If you need 1TB of storage capacity or more in a laptop, your least expensive option would be to go with an HDD. Currently, 1TB SSDs cost upwards of $500, while 1TB HDDs are readily available in laptop form factors (2.5-inch) for less than $100.

Laptop HDDs (hard disk drives) will generally offer higher capacities at a lower cost than SSDs (solid state drives).

RAID (Redundant Array of Independent Disks) is a storage technology that lets you use two or more SSDs or HDDs to achieve higher capacities, faster speeds, or protection against drive failure (or any combination thereof). A RAID 0 array, for instance, lets you combine two identical drives so that your laptop sees them as a single drive with double the capacity, and the system can access both drives simultaneously for faster read and write performance. A RAID 1 array essentially makes a copy of everything on the second drive, which is good for data backup purposes. There are other types of RAID that combine these features and use more than two drives, but most desktop replacement laptops are typically limited to two drives, with smaller notebooks such as Ultrabooks being limited to just one.
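The two layouts described above can be sketched in a few lines of code. This is a toy illustration of data placement only; real RAID stripes fixed-size blocks in the controller or operating system, not strings in application code:

```typescript
// RAID 0 (striping): alternate chunks across two drives; capacity doubles
// and both drives can be read simultaneously, but either failure loses data.
function raid0(chunks: string[]): { driveA: string[]; driveB: string[] } {
  const driveA: string[] = [];
  const driveB: string[] = [];
  chunks.forEach((chunk, i) => (i % 2 === 0 ? driveA : driveB).push(chunk));
  return { driveA, driveB };
}

// RAID 1 (mirroring): every chunk is written to both drives, so one drive
// can fail without data loss, at the cost of half the raw capacity.
function raid1(chunks: string[]): { driveA: string[]; driveB: string[] } {
  return { driveA: [...chunks], driveB: [...chunks] };
}

const chunks = ["c0", "c1", "c2", "c3"];
console.log(raid0(chunks)); // driveA holds c0 and c2; driveB holds c1 and c3
console.log(raid1(chunks)); // both drives hold all four chunks
```

The trade-off is visible in the shapes of the results: striping spreads the data, mirroring duplicates it.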

The Best Of Both Worlds
If you're still on the fence about what type of storage system to purchase, some laptop makers offer models that feature both an HDD and an SSD. The end result is a laptop with plenty of capacity and a section of storage set aside for bandwidth-demanding applications. "If a user needs to boot very quickly or often needs to manipulate very large files, [he] may want to consider using an SSD as [his] primary storage device," says Weech. Due to the slower performance and higher capacity of an HDD, it makes the most sense to use it as a secondary repository for photo, music, and video files, largely because those files don't require massive throughput to display or play back.
Laptop manufacturers currently offer models with a wide range of storage systems, so be sure to consider your budget and storage needs when making a purchase.


Hosted Communications
THE APPEAL OF CLOUD-BASED VOIP & UNIFIED COMMUNICATIONS

AS MOBILITY, REMOTE working, and collaboration become more important by the day, businesses can't afford to make too few communications options available for employees, clients, and customers. Today, there's an impressive list of options, including those provided in enterprise-level hosted UC (unified communications) and VoIP (Voice over IP) solutions. The list of features within these solutions is also impressive and growing in number.

Talk The Talk
The breadth of providers offering enterprise-level hosted communications services that can meet business-specific needs is fairly vast. North American businesses, says Dan O'Connell, Gartner research director, have 100-plus cloud-based VoIP and UC providers to choose from. There are so many choices, he says, that it confuses business IT procurement managers. Options include services from telecoms, cable operators, systems integrators, and technology providers.


Combined, application specialists own the largest market share, he says.
Numerous enterprises already use a dedicated hosted or SaaS (software as a service)-based UC service, says Art Schoeller, Forrester Research vice president and principal analyst. Forrester data indicates 20% of enterprises surveyed use a dedicated hosted solution, while 10% use a SaaS-based one. In the future, however, the dedicated hosted share will hold steady while the SaaS share rises to 20%. Positively, services are business-class, not consumer-oriented, and robust enough for SMBs (small and midsize businesses) as well as midsize and larger enterprises, he says.
Until recently, only fairly large companies could access UC as on-premises solutions, which required significant up-front investments in infrastructure, says Tim Banting, principal analyst with Current Analysis. Today, UC is widely accepted as a way to enhance employee productivity, reduce decision-making time,

and provide better engagement across the business. Presence, for example, enables workers to quickly see a colleague's availability and select the most appropriate communication channel, whether voice, video, IM (instant messaging), or another, to get needed information.
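That presence-driven choice can be as simple as a lookup on the colleague's current state. The states and rules below are illustrative assumptions, not drawn from any particular UC product:

```typescript
// Map a colleague's presence state to a sensible first-contact channel.
type Presence = "available" | "busy" | "in-a-meeting" | "offline";
type Channel = "voice" | "im" | "email";

function pickChannel(presence: Presence): Channel {
  switch (presence) {
    case "available":    return "voice"; // a real-time call is welcome
    case "busy":         return "im";    // interrupt as lightly as possible
    case "in-a-meeting": return "im";
    case "offline":      return "email"; // leave an asynchronous message
  }
}

console.log(pickChannel("available")); // "voice"
console.log(pickChannel("offline"));   // "email"
```

Real UC clients layer richer signals (calendar data, device in use, do-not-disturb) on top of the same basic idea.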
Obtaining UCaaS (UC as a service) typically means getting a wide swath of communications services while avoiding the hardware/software acquisition, management, maintenance, and upgrade costs associated with on-premises solutions. Cloud-based services are generally available in an on-demand, pay-as-you-grow model, allowing the flexibility to expand services as business requirements change without the up-front investments.
Commonly, hosted services include VoIP, with IP phones for office workers and soft clients downloadable to desktops, laptops, tablets, and smartphones for mobile workers. Also common are IM, presence, conferencing (audio, Web, and video), contact center capability, email, calendaring, enterprise social networking, file sync and share, and UM (unified messaging, which combines email, voice, fax, etc. into one interface).

"This is where UCaaS makes a lot of sense: for companies geographically dispersed with multiple locations and/or home workers."
TIM BANTING
Principal Analyst : Current Analysis

Enterprise-Level Features
O'Connell says most enterprise-level hosted UC and VoIP deployments include a high uptake of VoIP, mobility, UM, and audio conferencing. The business case is often replacing the premises-based PBX [private branch exchange; a private phone network used within a company], as well as integrating employee smartphones to enable enterprise mobility, he says.
Basically, Schoeller says, hosted UC/VoIP solutions include the same set of capabilities as on-premises UC solutions. These features are used for enterprise collaboration, and many look to connect external users; in other cases, they use federation services to do that. Federation essentially enables UC-type communications with others outside the organization.
In terms of telephony, some organizations require an exhaustive list of features, including those in operator/attendant consoles, Banting says. Additionally, with remote working on the rise, mobility is becoming key. Here, functionality can range from simple call-forwarding from a desk phone to a mobile device on up to fully featured soft-client apps that enable making voice and video calls over Wi-Fi, holding conference calls, and IM.
Banting's own firm's use of UCaaS enables its analysts to operate globally. U.S.-based clients, for example, use a U.S. number to contact Banting, with the call routed via the public Internet to an office-based IP phone near London. "This is where UCaaS makes a lot of sense: for companies geographically dispersed with multiple locations and/or home workers," he says.
Elsewhere, audio, video, and Web conferencing are becoming more common for some organizations. Conferencing enables team members to meet online in a scheduled or ad-hoc manner to talk, add video, and share content. While some solutions may include whiteboarding functionality, Banting says he hasn't seen this used extensively due to the difficulty of using a mouse to draw effectively.
Also more common are contact center abilities, such as automatic routing of voice calls to agent groups, historical and real-time call reports, IVR (interactive voice response) for self-service, and supervisor dashboards and tools. "Some providers also offer universal queuing functionality that adds other communication channels to the voice mix, such as Web chat, email, and social media (tweets, for example)," Banting says.
Features some may be less familiar with include FMC (fixed mobile convergence), something Gartner defines as device and infrastructure technology that enables the transparent use of voice and data applications across fixed and mobile access points. Essentially, FMC is something some telecoms provide that enables mobile workers to seamlessly switch between an organization's fixed wired network and external cellular networks and access desktop-based voice, data, and video functionality regardless of the connection. Desktop sharing, meanwhile, is a facet of conferencing that enables collaborating remotely online in real time through sharing files, data, etc., in some cases with texting, phone, and chat.
There have also been moves to embed UC functionality into everyday business applications, Banting says. "Typically, software can recognize a telephone number in a field within an application and pop up a dialer to allow you to click to dial," he says. Additionally, contact cards appearing in email or a document program can provide email, IM, or voice or video call choices. Schoeller, meanwhile, cites federation as a less familiar feature that Forrester sees providing value in terms of enabling a richer, broader means of connecting to external parties.
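The click-to-dial embedding Banting describes comes down to recognizing a phone number in text and rewriting it as a link the softphone can act on. The following is a minimal illustrative sketch, not any vendor's actual implementation; the loose NANP-style pattern is an assumption, and production UC clients use far more robust number parsing.

```python
import re

# Loose North American-style pattern, e.g. "(402) 555-0100" or "402-555-0100".
PHONE_RE = re.compile(r"\(?\d{3}\)?[-. ]?\d{3}[-. ]?\d{4}")

def linkify_numbers(text: str) -> str:
    """Wrap each detected phone number in an HTML tel: link so that
    clicking it hands the call off to the default softphone client."""
    def to_link(match: re.Match) -> str:
        raw = match.group(0)
        digits = re.sub(r"\D", "", raw)  # tel: URIs want bare digits
        return f'<a href="tel:{digits}">{raw}</a>'
    return PHONE_RE.sub(to_link, text)
```

For example, `linkify_numbers("Call (402) 555-0100.")` yields the original text with the number wrapped in a `tel:` anchor that a desktop dialer can intercept.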

Better Utilization
Currently, many cloud-based VoIP/UC providers possess limited video and Web conferencing capabilities, O'Connell says, although some are enhancing their abilities and making them easier to use. "We can expect these integrated solutions to start replacing the Web conferencing and videoconferencing point solutions," he says.
Forrester, meanwhile, expects good growth for desktop video, although, Schoeller says, video conferencing doesn't seem a fit for some enterprises' cultures. "Some organizations will eventually grow into video as more consumers using it personally also want it for business," he says.
Moving forward, Banting says it's worth noting that what was once a complex mix of distinctly different business tools and technologies is slowly coming together in the cloud. "UCaaS providers recognize that multiple tools and multiple communication channels cause unnecessary repetition, delay, and frustration for organizations," he says. Likely, he adds, we'll see a rapid convergence of real-time communications (voice, video, conferencing, IM, presence, and desktop sharing) and asynchronous collaboration (email, intranet sites, voice mail, enterprise file sync/share, and enterprise social networking).
Currently, the collaboration and communications market is fragmented into numerous on-premises and cloud-based offerings, Banting says. As such, an enterprise will likely compromise some features and functionality if choosing either an all-cloud or all-on-premises solution. Many vendors, Banting adds, do support hybrid models that enable integrating both deployment choices.

CyberTrend / November 2014

37

Cellular Insecurity
HOW DATA ABOUT YOU & YOUR MOBILE DEVICE MAKES THE ROUNDS

SMARTPHONE OWNERS know an incredible amount about how they can use
their phones. What they tend to know
less about are the finer details concerning the voice and data traffic tied
to their phones, including how wireless
carriers and app providers route, access,
store, and use that data.

Traffic Patterns
Cell phones are often described as essentially being radios, not too unlike walkie-talkies that transmit and receive radio signals. Unlike a walkie-talkie, however, cell phones can operate over long distances and support multiple frequencies, meaning two people can speak simultaneously. In short, a cell phone transmits a signal to a cell phone tower, which receives, boosts, and transmits that signal to other towers until the call reaches one near the call's recipient. When a caller moves outside the grid of towers his wireless carrier operates, the signal drops or the phone "roams" to
find another carrier's tower to maintain the signal.
Generally, says Daryl Schoolar, Ovum principal analyst, data is backhauled to a wireless carrier's location within a metro area, where traffic is aggregated. Packet core networks (the "brains" of the network) typically break traffic out to the Internet but may also gather data concerning the call. To access, say, a Google Gmail server from a phone, the Internet traffic would be delivered through a gateway, though the specific network elements involved depend on the network technology in use (2G, 3G, or 4G), says Dionisio Zumerle, principal research analyst with Gartner. Overall, Zumerle says, end users generally don't need knowledge of the underlying cellular networks and technologies, because public mobile networks are reliable for everyday communications (legacy 2G networks, which have shown vulnerabilities, being an exception). Private communication of sensitive enterprise matters over public mobile networks, however, shouldn't be trusted, he says. Here, mobile voice encryption technology can offer a solution.

E911 & Privacy


A topic drawing increasing publicity is location determination, which carriers and third parties can perform using GPS or other measurement-based techniques. Currently, much activity is occurring around Wi-Fi positioning and assisted-GPS approaches to pinpoint user location, says Brent Iadarola, Frost & Sullivan global program director. Thus, if GPS fails, other network-assisted location technologies can pinpoint a location, including cell phone triangulation. "That would be sort of the first piece of the network-based options," he says. "The second piece would be breaking that down to small cell positioning."
Carriers use small cell solutions to address coverage gaps, but they can also help pinpoint location, he says. "Obviously, GPS doesn't do as well indoors, and one of the things we're seeing is a much greater need for indoor location capabilities," Iadarola says. "And that's where small cell solutions and things like Wi-Fi location positioning can be extremely helpful."
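The triangulation Iadarola mentions reduces to geometry: given a handset's estimated distances from three towers at known coordinates, its position is where the three circles intersect. A simplified flat-plane sketch follows; real systems work on the Earth's surface with noisy timing measurements, so this is an illustration of the math, not a carrier's implementation.

```python
def trilaterate(towers, distances):
    """Estimate a handset's (x, y) from three tower positions and the
    measured distances to each, by linearizing the circle equations."""
    (x1, y1), (x2, y2), (x3, y3) = towers
    r1, r2, r3 = distances
    # Subtracting circle 1 from circles 2 and 3 leaves two linear equations.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # zero when the towers are collinear
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y
```

With towers at (0, 0), (4, 0), and (0, 4) and exact distances measured from the point (1, 1), the function recovers (1, 1); with noisy real-world measurements, carriers instead solve the same system in a least-squares sense.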
Every phone shipped today has location-determination abilities embedded, Iadarola says. This stems from an FCC E911 mandate in 1996 that required carriers be able to pinpoint a user's location in cases of emergency calls placed from cell phones. Eventually, wireless carriers and other parties sought out commercial uses for location-based information.
As Cornell University Professor Stephen B. Wicker writes of the mandate in an August 2012 paper entitled "The Loss Of Location Privacy In The Cellular Age," the technological and sociological impact has far outstripped this intuition over the past 16-plus years. Wicker also writes that "cellular telephony has always been a surveillance technology." Cellular networks are designed to track a cell phone's location so as to route a signal to the most appropriate tower. Today, though, a much more fine-grained location resolution is possible, including "estimates with address-level precision," Wicker writes. For many mobile phone users, such abilities pose data and privacy worries.
Zumerle says what user data carriers can retain, and for how long, depends on regional and national regulations. "In the European Union, for example, the Data Retention Directive requires providers to retain data pertaining to public communications for at least six to a maximum of 24 months," he says. Typically, carriers must only store metadata, or call-related data (caller ID, time, length, etc.). Other countries generally apply similar measures, Zumerle says. Under what circumstances law enforcement can obtain data from carriers also depends on
the country, he says.

"In the Western world, accessing data has typically required a court order, although we're observing an increasing tendency to simplify the access to data."
DIONISIO ZUMERLE
Principal Research Analyst : Gartner

In the Western world, Zumerle says, accessing data has typically required a court order, although, he adds, "we're observing an increasing tendency to simplify the access to data." (In January, the U.S. Supreme Court stated it would hear arguments in April regarding the legality of law officers searching a suspect's cell phone upon arrest.)
Iadarola says data-retention issues can get a bit complex depending on whether a carrier or app provider is involved. Generally, carriers have more standardized policies implemented. Text messaging data, for example, is typically kept 60 to 90 days depending on the carrier. Carriers generally don't openly discuss their policies, Iadarola says, because federal or law enforcement factors can influence some behind-the-scenes scenarios where some data may be maintained longer. Overall, while there has been much press regarding mobile privacy and security, Iadarola believes users inherently trust carriers more than is perhaps being reported. Carriers, he says, are better positioned for privacy management and security than app providers, and they're already storing various user data.
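Mechanically, the retention windows quoted above are simple date arithmetic: a record becomes eligible for purging once its age exceeds the policy for its type. A minimal sketch, in which the policy table merely mirrors the figures in the text (90 days for text-message records, 24 months for call metadata under the EU directive); actual carrier policies vary and are rarely published.

```python
from datetime import datetime, timedelta

# Illustrative policies drawn from the figures above, not any carrier's actual rules.
RETENTION_DAYS = {"sms_metadata": 90, "call_metadata": 730}

def purge_eligible(record_type, stored_on, today):
    """Return True once a record's age exceeds its retention window."""
    limit = timedelta(days=RETENTION_DAYS[record_type])
    return today - stored_on > limit
```

A text-message record stored on January 1 would be purge-eligible by November of the same year, while a call-metadata record under a 24-month policy would not.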

The Ad Influence
Data-retention practices among application providers can vary greatly, Iadarola says. Though users typically must opt in to an agreement concerning data and location information, providers' data requirements are more flexible and not as standardized, he says. Iadarola thus strongly encourages reading any language concerning an app provider's formalized policies, because some providers keep data a long time and sell data to marketers or other parties. Most users don't read such language, however, and end up seeing marketing or advertising related to data they didn't necessarily know was publicly available.
To pinpoint user location, app providers have several options, including carrier network-based abilities, GPS, and Wi-Fi positioning. The latter addresses various indoor location-determination requirements, Iadarola says. "One thing we're seeing a lot more of with these check-in, social networking types of services is they're often occurring in indoor environments," he says, often deep indoors. This makes position determination critical to businesses and retailers willing to pay to target specific groups of users.
The deeper app providers can get into the demographic data they're collecting, Iadarola says, the more valuable it is for marketing and advertising purposes. "Combine demographics with location and the data becomes extremely valuable," he says. When targeted in a mobile environment, Iadarola says, there's a much higher propensity to purchase a service or product. Iadarola dubs this activity "hyper-targeting."
Although demographics-location data is valuable, it also raises privacy issues among those being targeted. For this reason, many mobile app providers, and definitely the carriers, realize it's important they provide users an opt-in model for such marketing and advertising services. "For a lot of those services to succeed in the long term, it's important that users feel comfortable with the security and privacy around the information they're providing to an application provider," he says.


What Are Data Lakes?


BETTER INCORPORATE UNSTRUCTURED DATA INTO YOUR OVERALL ANALYSIS APPROACH

IF YOUR COMPANY currently uses or is looking to use big data and analysis technologies, then there's a good chance you have heard the term "data lake" in recent months. The technology is so new, however, that analysts and experts on each side of the data lake debate are trying to determine whether the technology is a viable option for enterprises. Although some analysts say data lakes aren't as beneficial as vendors and stakeholders in the market say they are, there is a chance that, with proper implementation, many companies could use data lakes for storing big data.

Filling In The Gaps
At its very core, a data lake is a data repository like any other, which means you can compare it to your other everyday databases and enterprise data warehouses and get a general understanding of how it works.
However, data lakes are primarily designed for storing unstructured and semi-structured data vs. enterprise
data warehouses, which are for storing
structured data, says Daniel Ko, senior consulting analyst at Info-Tech
Research Group.
Data lakes take advantage of low-cost Hadoop cluster storage, and for that reason, companies can use them as repositories for data they aren't using presently but may need to access in the future. "It's almost like an insurance policy . . . and in the case of a data lake, you are storing some unused data and hoping that one day you will be leveraging it in the future," Ko says. This allows you to essentially save data for later that you don't necessarily want to store in your main storage arrays but may want to access in the future.
Ko says there is a gap in the current way companies use data warehouses and other repositories, because they deal only with the structured data that most businesses are familiar with. Data lakes help fill those gaps by gathering and storing unstructured data from things such as Web logs, data center equipment messages, and other streams. "There could be a lot of hidden messages in those data sources, but right now you don't have a way to analyze them systemically," says Ko. "A data lake fills that gap by providing a repository where you can store those data sets. And then once you have a data lake program up and running, you can find some specific people to analyze that data and give you some insights that are not available in your enterprise data warehouse."
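The "repository for data you aren't using yet" idea is concrete enough to sketch. A data lake landing zone typically appends raw events untouched, partitioned by source and arrival date so they can be located and analyzed later. The following file-based sketch is illustrative only; a real deployment would land data in HDFS or an object store, and the directory layout here is an assumption, not a standard.

```python
import json
import os
from datetime import date

def land_event(root, source, event, day=None):
    """Append one raw event, unmodified, to a date-partitioned
    landing area: <root>/<source>/<YYYY-MM-DD>/events.jsonl."""
    day = day or date.today().isoformat()
    path = os.path.join(root, source, day)
    os.makedirs(path, exist_ok=True)
    with open(os.path.join(path, "events.jsonl"), "a") as f:
        f.write(json.dumps(event) + "\n")

def read_partition(root, source, day):
    """Schema-on-read: the raw lines are parsed only when analyzed."""
    with open(os.path.join(root, source, day, "events.jsonl")) as f:
        return [json.loads(line) for line in f]
```

Note the design choice: nothing is transformed on the way in, which is what keeps ingestion cheap; all interpretation is deferred to analysts reading a partition later.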

From Marketing Jargon To Real-World Use
There is an ongoing debate at this time among analysts as to whether the term "data lake" is simply marketing jargon or if it is truly a beneficial technology enterprises should explore. For example, the research firm Gartner has released reports about data lakes and how they don't necessarily live up to their billing. Ko, however, says companies simply need to look at data lakes just as they would any other immature technology. They need to put the programs through their paces and make sure the technology fits well into their overall big data analysis approach. Before doing that, though, they need to understand what data can go into a data lake.
"A lot of people are using data lakes as data landfills, where it's just a mashup of a bunch of data," says Ko. "People are putting too much stuff in the data lake and turning it into a data landfill or wasteland." He suggests that companies start by implementing a "data pond" instead of taking on a full data lake, picking one or two departments to start with.

Data Ponds vs. Data Lakes
The concept of a data pond is more manageable than a data lake, thus it's the term you will want to use during the pilot phase. The key to having a successful data pond pilot is to focus on what you want it to achieve. For instance, Ko says, you can use improving revenue as a general theme for your data pond. "The IT team can partner with the sales and marketing departments, because they are on the front line interacting with customers and generating revenue," he says.
From here, you decide what types of data will fit best into your overall data pond theme. Ko recommends looking at high-impact and high-visibility data first. This means gathering information from CRM (customer relationship management) systems, email exchanges, recorded calls, or other records of customer interactions and placing them in the data pond. By placing all of this unstructured and semi-structured data in one location, you can access the information when needed without having to convert it into structured data that your other systems can handle.

"I'm advocating the term 'data pond' because I don't want people to get lost in the data lake. At this stage, because the technology is not too mature and you have a bunch of data in your organization, let's do a data pond before you do a data lake."
DANIEL KO
Senior Consulting Analyst : Info-Tech Research Group
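Accessing mixed customer interactions "when needed without converting" them is the schema-on-read pattern: the pond's records share no rigid schema, and structure is imposed only at query time. A toy illustration follows; the field names (`source`, `text`) are invented for the example, not part of any product.

```python
from collections import Counter

def term_hits_by_source(records, term):
    """Count records mentioning a term across mixed customer-interaction
    data (CRM notes, emails, call transcripts), grouped by origin system.
    Records need no shared schema beyond 'source' and a free-text 'text'."""
    term = term.lower()
    hits = Counter()
    for rec in records:
        if term in str(rec.get("text", "")).lower():
            hits[rec.get("source", "unknown")] += 1
    return dict(hits)
```

Given a pond holding email, call-transcript, and CRM records, a query like `term_hits_by_source(records, "refund")` reports which channels customers raise refunds in, without any of the records having been normalized in advance.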
Ko thinks that taking this approach will make data lakes a reality and allow people to move past the jargon identifier attached to the concept. He is currently working with clients that are starting their data lake journeys. He explains that these clients are setting up some kind of Hadoop cluster first, and then they plan to put some semi-structured and unstructured data in before running analytics.
"It's a starting point right now," says Ko, "but it's turning from marketing jargon to reality. And I can see some big potential once people are having some successes, because they will be using those big data insights in their operations sooner or later. The actual implementation will be sometime next year or the year after, and then they should be getting some successes within two years."

Put The Right People In Charge
Starting with a data pond and eventually moving to a data lake doesn't mean much unless you have the right personnel in place to run the programs and sift the usable information out of the mounds of unstructured data. For that reason, Ko says, you want to build a dedicated team of data scientists and data analysts to analyze your data before you actually hand that data over to your everyday business users.
There are a few reasons for this, but the major ones are possible security or privacy issues related to the data pond or lake, especially if it contains sensitive customer information. "You want to have dedicated people, and they should sign an advanced nondisclosure agreement so they are safe to export that data and, in the meantime, the data is being protected by the NDA," says Ko. "At the end, they will have a bunch of data insights and those insights will be actionable at the operating level." In essence, your data analysis experts will sift through the data lake and hand information over to the business in a form it can use.
Another important step in setting up and maintaining a successful data lake strategy, according to Ko, is to make sure you socialize the concept of a data lake from your IT management up to your senior management and board of directors. You need to make sure that everyone throughout the organization understands the value and benefit of a data lake as a big data insurance policy that could bear fruit in the future. And if you can establish that foundation, there's a much better chance that a data lake will prove beneficial to your company as the technology matures.



Big Data Executive Overview


LEARN HOW BIG DATA & ANALYTICS CAN IMPACT YOUR BUSINESS

BIG DATA ISN'T a new term by any means, but it continues to grow as a popular topic for companies considering new technologies and future strategies. Although there are numerous explanations for how big data works, we still have a lot to learn about the subject.
Gartner defines big data as "high volume, variety, or velocity information that needs new ways of processing, either because of cost or because of what you want to do with the data," says Nick Heudecker, research director at Gartner. "The goal is to take advantage of more data sources to go beyond simple analytics techniques." The definition is a good baseline for customers trying to understand big data, but you can also take it further.
For example, Brian Hopkins, Forrester Research vice president and principal analyst, recommends redefining big data for the market as practices and technologies that close the gap between the amount of data that's available and the business's ability to use that data for business insight. He stresses the idea of closing the gap because it is an action and helps position big data as a journey and not a destination or technology. He calls big data a complete transformation of the business to be more data driven and says it gives companies the ability to use more diverse kinds of data, ask the right questions, and have a technology strategy that supports using different solutions as they're appropriate.

How Companies Use Big Data
The possibilities for using big data are essentially limitless. Heudecker says he's aware of companies that use big data when choosing where to place products on shelves, and others use it to determine the ideal location for planting specific crops or testing new seeds. The key to having a successful big data implementation is to make sure you have clear goals for how you want to use the technology, and then figure out how to integrate it into everything you do.
"[Big data] is pretty iterative," says Heudecker. "Ideally, companies will start with some kind of business objective. Let's say they want to improve or reduce the amount of time a customer spends talking to a customer service representative. They might look at data for reps that have very quick turnaround times, social media data (to predict how or when they might get a spike in calls), or stats for training."
The idea is to combine a variety of different data sources and take them beyond the typical call logs or other recorded data. Where things get complicated is when companies implement a specific technology and integrate it into their existing infrastructure. It's difficult for companies to integrate multiple data sources just from a technological perspective, but it's even harder when you factor in security and governance.
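Heudecker's customer-service example ultimately comes down to joining data sets that live in different systems. A minimal sketch of that integration step, joining per-representative call logs against training records; the field names (`rep_id`, `seconds`, `completed`) are invented for illustration, not drawn from any particular system.

```python
from collections import defaultdict

def avg_handle_time_by_training(calls, training):
    """Join call records to training records on rep_id, then average
    call duration for trained vs. untrained representatives."""
    trained = {t["rep_id"] for t in training if t["completed"]}
    totals = defaultdict(lambda: [0, 0])  # group -> [sum_seconds, count]
    for call in calls:
        group = "trained" if call["rep_id"] in trained else "untrained"
        totals[group][0] += call["seconds"]
        totals[group][1] += 1
    return {g: s / n for g, (s, n) in totals.items()}
```

Even this toy join hints at why the integration is hard in practice: the two record sets come from separate systems, so key reconciliation, access control, and governance all have to be settled before the arithmetic is meaningful.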
Companies also need to understand that technology is the last piece of the puzzle and isn't the be-all and end-all of their big data strategies. "There are all kinds of opportunities, but where these things start is with a business problem," says Heudecker. "The failures that we hear about start out as technology problems. I want to play with this new technology, or I think I need X or Y to compete, but I don't have a plan for what I want to do with this. It's increasingly challenging to rein in these people who just want to treat it as a technology boondoggle. It's not a spending opportunity. It's a competitive opportunity."

Is Big Data A Fit For Every Company?
In keeping with Heudecker's idea of not making the technology itself a priority, Hopkins stresses that companies should ask themselves whether there are problems big data can't handle. He says companies need to stop looking at big data as a solution to a problem and start looking at big data as an opportunity to overhaul how they do business and put themselves in a more competitive position in the future.
"If you're thinking about big data in terms of 'I can do big data on this, but not that,' then I think you're missing the opportunity of big data, which is to close the gap," says Hopkins. "When will closing the gap not deliver what you need? Never." The point is that you need to put more insight into the hands of your business to get a bigger competitive advantage. And there isn't a business in the world that doesn't want to do that.

"If you're thinking about big data in terms of 'I can do big data on this, but not that,' then I think you're missing the opportunity of big data, which is to close the gap. When will closing the gap not deliver what you need? Never."
BRIAN HOPKINS
Vice President and Principal Analyst : Forrester Research

Big Data Is Much More Than Business Intelligence
Another common misconception about big data is that it's an extension of BI (business intelligence), which is not the case. Hopkins says Forrester recently conducted a study in which it asked 1,000 IT and business professionals what they thought about big data. Interestingly, IT professionals were 14% more likely than their business counterparts to say that big data is simply an extension of BI. However, Hopkins stresses that big data is about so much more than BI. It's about understanding the customer.
"I want to frame this in the context of a larger trend that Forrester calls the age of the customer," says Hopkins. "What that means is that we think over the next 15 years, the only sustainable competitive advantage that firms have is their ability to understand customers so they can win, serve, and retain them. Everywhere we look, that's what firms are focusing on, not from a technology perspective but from an overall business perspective."
Heudecker agrees that one of the biggest functions of big data is to enhance the customer experience through information gathering and analytics. He agrees that big data is more than BI, even going as far as to say that the people who are doing BI in organizations today are not the people who are going to be getting value from big data initiatives. Heudecker says that big data requires a completely different skill set from BI and will require companies to bring data scientists and analysts into the fold.
"You're going to make additional investments in your analytics infrastructure, whether it's through tools or through people, skills, and training," Heudecker says. "Hiring is one option, but my advice, typically, is to look internally first and see if there is someone who has an interest in data science that can be trained. It's easy to teach someone a programming language and how to use these techniques, but it's very difficult to teach someone your business."

Cloud Considerations
One more thing to consider is how the cloud could affect your big data effort. Heudecker says that although it's true the cloud is good for the intermittent processing jobs that many big data projects rely on, and it gives you a variety of options for trying out new things with very little commitment to investment, there are some potential disadvantages as well.
"The downsides are pervasive security and privacy concerns," he says. "The other is you might not be able to optimize the environment for your type of processing as much as you might like or need. Finally, if you need to move a massive amount of data to or from the cloud to an on-premises environment, it can be very challenging and is still not a well-told story. You may often find you are doing some kind of analytical processing in the cloud, moving that result or sample to an on-premises environment, and then doing additional work there."


Greenovations
ENERGY-CONSCIOUS TECH

The technologies that make our lives easier also produce some unwanted side effects on the environment. Fortunately, many researchers, manufacturers, and businesses are working to create solutions that will keep us productive while reducing energy demands to lessen our impact on the environment. Here, we examine some of the newest green initiatives.

The Quant is an electric concept car powered by flow cell technology developed by NanoFlowcell AG. The company
hopes to eventually bring the car and its technology to market.

Futuristic Electric Car That Uses Saltwater As Fuel Approved For Road Testing In Germany
The Quant e-Sportlimousine, a luxury concept car that made its world debut earlier this year at the Geneva Motor Show, has now been approved for testing on public roads in Germany. The car is being developed by nanoFlowcell AG, a young energy R&D company based in Liechtenstein, which is using the Quant as a proof of concept for its flow cell technology. Flow cells are rechargeable fuel cells that generate electricity when liquids separated by a membrane exchange ions. NanoFlowcell says when the electrolytes in the Quant are exhausted, the flow cells can be recharged immediately by refueling the tanks in the car with saltwater. The Quant has four electric motors, one for each wheel, which the company claims can power the car from 0 to 62 mph in less than 2.8 seconds and to a top speed of more than 236 mph. NanoFlowcell hopes to eventually bring its flow cell technology to market in many industries.

Smart Meters Now In 50 Million U.S. Homes
A report from the Edison Foundation IEI (Institute for Electric Innovation) on the state of smart meter adoption in the U.S. says that more than 43% of U.S. households have now installed the energy-saving meters. So-called smart meters are devices that provide real-time monitoring of electrical use and other information that customers can use to reduce waste. The report says that 4 million more homes installed smart meters during the past year, bringing the total number of installed meters to over 50 million. IEI surveyed utility companies for the report and found that 30 of the largest utility providers in the U.S. have already finished deploying smart meters in the field.

44

November 2014 / www.cybertrend.com

Data Centers & Energy
In a recent report, the NRDC (Natural Resources Defense Council) says data centers continue to consume large amounts of energy in the U.S., devouring an amount of power last year equivalent to the output of 34 coal-fired power plants of 500 megawatts each. By the year 2020, data centers are projected to consume as much power as the output from 50 similarly sized power plants.
While the largest data centers have implemented measures to maximize energy efficiency, the report says smaller data centers actually account for most of the energy consumed. The NRDC encourages companies to save energy where possible, and says that if companies adopted even half of the energy-saving practices it has identified in previous studies, they could save billions of dollars annually.

General Electric is giving everyone the chance to take control of their appliances with the GreenBean, a circuit board that lets users create programs that work with compatible washers, dryers, ranges, and refrigerators.

GE Releases Maker Kit That Lets Consumers Communicate With Their Appliances
General Electric's GreenBean maker module is a circuit board that gives you access to the computer circuitry inside compatible GE appliances. Connect one side of the module to your appliance and the other side to a computer and you can use the SDK (software development kit) that GE provides to create your own appliance apps. You will need some programming skills, of course. The SDK is Java-based and works with PC/Mac/Linux, so you can build apps using just about any type of computing product. GE gives some examples of useful ideas, such as programming your dryer to send you a text message when a load is done, or programming your oven so you can adjust the temperature setting by using your phone. You also can use apps that other users have created and shared online. You can purchase the GreenBean maker module from FirstBuild (www.firstbuild.com; click the Market tab) for $19.95. There's also a link on the FirstBuild site to where the free SDK for the maker module can be downloaded from GitHub (www.github.com).

The NRDC has forecast that companies in the U.S. will spend $13 billion or more on energy for data centers in the year 2020 unless they adopt energy-saving practices.

Kyocera Begins Construction Of The World's Largest Floating Solar Power Plant
SCE Opens Grid Facility
Southern California Edison has opened a $50 million facility, funded in part by the U.S. Dept. of Energy, that will test the feasibility of storing renewable energy on the power grid. The Tehachapi Energy Storage Project, located in Tehachapi, Calif., stores up to 32 megawatt-hours of energy in giant banks of lithium-ion batteries. The facility is located in an area slated for major wind energy production by 2016 and will be a testbed for SCE.

Land is a precious commodity, especially in a heavily populated place like Japan that's surrounded by water. Unfortunately, it takes a lot of space to hold all the conversion panels used in larger solar power installations. Now Kyocera
has announced it is partnering with Century Tokyo Leasing Corporation and
Ciel et Terre to build two new solar power installations in Kato City, Japan, that
will avoid the land problem because they will float on the water. The first of the
installations will be built on Nishihira Pond and is projected to generate up to
1.7 megawatts of energy. The second installation will be built at Higashihira
Pond and will generate up to 1.2MW. Kyocera will provide the solar panels and
oversee construction for the project, while France-based Ciel et Terre will provide the floating platforms and Century Tokyo will arrange financing. When
completed, the Nishihira Pond installation will be the world's largest floating
solar power plant in terms of power output. Kyocera says it intends to develop
other floating solar power facilities in Japan during the coming year that combined will generate up to 60MW of energy.

CyberTrend / November 2014

45

VPN Benefits & Setup


REMOTE CONNECTIONS, DONE RIGHT

YOUR ORGANIZATION'S internal network, or intranet, exists to provide


authorized employees with access to
the raw data, sensitive and otherwise,
that they need to perform their jobs.
Ideally, this network
is sandboxed and hidden behind layers
of security, including firewalls, encryption, and limited access control. The
problem organizations large and small
alike encounter with this basic structure
is that sometimes, perhaps many times,
not all authorized employees are located
behind the physical walls of the establishment, yet they still need access to this data
to perform their functions. Workers at
satellite offices and branches also need a
simple yet secure way to access the data
stored on servers at headquarters, and
vice versa. For these situations, organizations can create a secure link that
lets these remote workers access private
data from over the Internet, without perpetrating the networking equivalent of
hiding a key under the Welcome mat.


Setting up a VPN (virtual private


network) affords a variety of advantages, and depending on your organization's needs, the process may be
easier than you think. In the pages that
follow, we'll explain what a VPN is,
list a few of the upsides, and detail the
options for setting up a VPN on your
own or with the help of a third party.

Nothing Virtual About The Benefits
In its simplest form, a VPN is a private network a user can access from the Internet. As such, securing those transactions is a VPN's primary job. The key to a successful VPN in all instances is maintaining the integrity of the network; if you violate the intranet's isolation or disregard existing access management policies, even temporarily, then you're setting yourself up for a major security breach.
Although no network can be 100% secure, a good VPN is one that encrypts all incoming and outgoing data so that anything that gets intercepted is unreadable. Implementing a limited time frame during which users can access a VPN, for instance between 8 a.m. and 5 p.m. in a given time zone, is another way to bolster a VPN's security.
Another facet of a solid VPN is ease
of use. Connecting to the network
needs to be easy enough that nontechnical employees can do it without
needing a technician to walk them
through the process. Its also important
that the VPN connection remain intact
for as long as is necessary, however,
a built-in timeout period is essential
to prevent unauthorized users from
gaining access.
Organizations should also make sure
there are enough networking resources
dedicated to the VPN to allow multiple simultaneous outside connections. And choosing a type of VPN
that can scale to your evolving needs

Microsoft Windows is set up to let you connect via VPN (virtual private network).

can save you time and


money in the long run.
VPNs are also prized
for their flexibility.
Workers who have the
freedom to accomplish tasks while away
from their desks can be
more productive than
those who can't. Employees who would otherwise need to endure a long commute or relocate to accept a position can instead use a VPN and work from home, saving travel and moving costs for both the company and the employee.

Tunnel Vision
A VPN can be set up to use encrypted dedicated point-to-point connections or a site-to-site protocol.
When one of these methods is in place,
the remote user can initiate the secure
connection and access the intranet
right from a browser on a variety of
devices, including PCs, laptops, tablets,
and smartphones.
Larger organizations that also have
branch offices can benefit from setting
up a site-to-site type of VPN, which
is commonly used to establish a link
between the intranet and predefined
locations.
The remote-access VPN variety, on
the other hand, lets a user tap into files
and folders stored on the company's servers using an Internet-connected computer (or other device) located anywhere, whether at home or in a hotel room on the other side of the planet, as if he were on a local machine. This type of setup is ideal for
workers who travel frequently, work
from home, or who occasionally need
to access files from an offsite location.

This type of system starts with a network access server, media gateway, or remote-access server (this can also simply be a computer) located within the organization's main office. This device can run the software that performs
the user authorization, or it can rely
on a standalone authentication server
running on the network. The end-user
portion of a VPN consists of simple
VPN client software, which lets the
user input his credentials to initiate the
link, maintain the communication, and
ensure transmitted data is encrypted.
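The server-side credential check described above can be illustrated with a minimal sketch. Real VPNs use standardized protocols such as MS-CHAP v2 or EAP-TLS rather than this simplified salted-hash scheme, and all names here are hypothetical:

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes) -> bytes:
    # Derive a slow, salted hash so stored credentials are never plaintext.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

def make_record(password: str) -> tuple[bytes, bytes]:
    # Create the (salt, hash) pair an authentication server would store.
    salt = os.urandom(16)
    return salt, hash_password(password, salt)

def verify(password: str, salt: bytes, stored: bytes) -> bool:
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(hash_password(password, salt), stored)
```

Whether this logic runs on the access server itself or on a standalone authentication server is exactly the architectural choice the article describes.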
Much of the commonly available
VPN software relies on static IP addresses, which lets the client and server
know exactly where on the Web each
is located. For organizations that have
static IP addresses (a common business-class service ISPs offer), this isn't
a problem, but you may need to pay an
additional monthly fee to add this feature to your Internet service if it is not
already supported. You can also turn
to third-party software that is capable
of essentially turning your dynamic IPs
into static ones.
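As a rough illustration of that workaround, a client can resolve a dynamic-DNS hostname at connect time instead of storing a fixed address. The hostname below is a placeholder, and Python's standard library is used purely for demonstration:

```python
import socket

def resolve_vpn_server(hostname: str) -> str:
    """Return the IPv4 address the hostname currently points to.

    With a dynamic-DNS service keeping the name updated, the client
    always finds the server even if its underlying IP has changed.
    """
    return socket.gethostbyname(hostname)

# Example (placeholder hostname): resolve_vpn_server("vpn.example.com")
```

This is the essence of what the third-party dynamic-to-static tools mentioned above do: they keep a stable name pointed at a changing address.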

There are also a variety of third-party ESPs (enterprise service providers) that offer VPN services capable of taking the fuss out of setting up a remote-access VPN on the server and the client side. If you're running a Windows Server OS, then you can build your VPN from scratch: select a VPN protocol (PPTP, the Point-to-Point Tunneling Protocol, or L2TP/IPSec), choose an authentication protocol (MS-CHAP, MS-CHAP v2, EAP-TLS), determine your encryption needs, institute a certificate/client
authentication system, and optionally
set up Network Access Quarantine
Control and remote access account
lockout procedures.

Let Your Data Travel Safely


As your organization grows and
evolves, chances are good that workers
will increasingly need to be away from
their desks to get work done. Setting
up a VPN solves the security problems that would otherwise hinder that
growth and development. Given the
available options, there's bound to be one that suits your needs.

WORKERS AT SATELLITE OFFICES AND BRANCHES ALSO NEED A SIMPLE


YET SECURE WAY TO ACCESS THE DATA STORED ON SERVERS AT
HEADQUARTERS, AND VICE VERSA.


Better Servers, Better Business


UPGRADE YOUR INFRASTRUCTURE TO BETTER SERVE CUSTOMERS & ENCOURAGE GROWTH

IF YOU DON'T REGULARLY visit your

KEY POINTS
The data center is the company's backbone. Upgrade servers regularly to achieve top performance for employees and customers.
Efficiency gains in just 3-5 years
can be massive. You can save
money and improve performance
using the same power and cooling.
Many servers can now diagnose
their own hardware or software issues and alert the IT administrator
so they can respond quickly.
Combined with cloud-based
solutions, DCIM (data center infrastructure management) tools help
better maintain and track servers.
Having newer servers also makes it
easier to implement these tools.


company's data center or have any direct interaction with it, then it's easy to see why you might take it for granted or at least not fully understand its importance to the company. In fact, the data center is the technological foundation of the entire business. It's the
reason why employees have access to
the applications that let them do their
jobs and why customers can interact
with the business quickly and easily.
The customer part of the equation has
expanded in recent years with the advent of mobile and social technologies,
which means that if your data center
isn't operating at its peak level, you
could be missing out on potential sales.
If the data center is the foundation
of the business, then servers are the
backbone of the data center. These systems are more than just hunks of metal
that take up space and require large
amounts of power and cooling. They
provide all of the computing resources

you need to run your business efficiently and effectively, as long as you
keep them up-to-date and upgrade on
a regular basis. If you repeatedly ask
why you need to frequently invest in
new servers, the answer is relatively
simple. "The resources that you put into your data center is really to keep your customers happy," says Jennifer Koppy, IDC research director.

Why Server Upgrades


Are Important
Building on the idea that servers
are crucial for a business and its
data center, Greg Schulz, founder of StorageIO, suggests looking at your data center as an "information factory," and like any factory, you have
tools and technologies to put inside of
it. One of those key tools is a server,
which lets you set up certain business
processes. How you do this depends
on how well you use the available data
center tools to deliver "a good quality product that meets or exceeds your customers' expectations with the minimum amount of waste," Schulz says.
The newer your servers are, the
more efficient they will be. Through upgrades alone you can get better energy efficiency, improve the effectiveness of your cooling, free up some extra floor space, and more, but this is just a starting point. When you really get to the bottom of it, you'll
realize that the biggest benefit of implementing new servers is to improve
productivity.
"You tie in the efficiencies, effectiveness, and all those related things, but let's pull it back to productivity," says Schulz. "You can take the costs out of doing things while also boosting productivity, getting work done quicker in the same amount of physical space or smaller, and getting work done quicker with the same or less amount of power to run it or cool it. You're taking the cost out of doing things like that, but the other side is that you can start to do more business. You've taken the cost out of doing each function and can do it quicker, so now you can grow the business while addressing both the top and bottom lines."
This potential for growth extends to
even the smallest companies. Schulz
points out that server innovation has gotten to the point where many new systems are so quiet that you could put a super server underneath your desk and not even know it's running. "Smaller offices can benefit from having a high-speed, high-performance server without the need for a traditional data center," Schulz says. "And for larger organizations, it simply means that the footprints are continuing to get smaller, so you can do more with less."
Koppy agrees and says that server
innovation runs in line with Moore's

Server innovation has improved to the degree that many


new systems are quiet to the point where you could put
a super server underneath your desk and not even know
it's running.
GREG SCHULZ
Founder : StorageIO

Law of technology where you get


better performance and much lower
energy consumption in a smaller form
factor. She adds that building a new data center is a long, painful, and expensive process, so if you can expand your capacity by pulling in servers that are much more dense, whether it's blades or just servers with very high core counts that draw a lot less energy, you can get that performance boost without having to invest in a brand new facility. "If you can get a server that consumes less energy or puts out less heat, you can get a lot more life out of your existing data center," Koppy says. "That's a huge reason to upgrade."

Efficiency Gains & Cost Savings


When you consider the actual cost
savings of upgrading your servers, look
no further than maintenance. Koppy says that once a system comes off its maintenance contract after five years, maintenance costs increase substantially, and you're paying so much for that and for the labor for someone to keep the system running.
If your company waits to refresh for
five years or even longer, imagine how
those maintenance costs continue to
rise year after year.
"You can save so much money just by having a regular refresh cycle," Koppy says. "What's difficult for a lot of people is that when they purchase or lease servers, they have this idea that yes they want to replace them after three years, but when it comes down to it, they don't have the agility or asset management and tracking tools." That's why it's so important to have an
upgrade path in place and give yourself
the opportunity to take advantage of
new server technologies as they become available.
Speaking of innovation, Schulz says
that if you haven't upgraded in a year and a half or even in the past three years, you will already experience big, visible gains with new servers, but if you've waited three to five years, you
may be surprised that newer servers
are much more efficient overall. For
starters, manufacturers continue to
find ways to make servers smaller, so
they not only take up less space, they
can aid in consolidation with others.
A lot of servers can give you twice the
performance or more while requiring
the same amount of power and cooling
as your existing systems provide.
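The arithmetic behind that consolidation claim is easy to sketch. The performance ratio and per-server wattage below are hypothetical round numbers for illustration, not figures from the article:

```python
import math

def consolidation_estimate(old_servers: int, perf_ratio: float = 2.0,
                           watts_per_server: float = 500.0):
    """Estimate how many new servers replace an old fleet, and the power delta.

    Assumes each new server delivers perf_ratio times the work of an old
    one at roughly the same power draw (hypothetical planning numbers).
    """
    new_servers = math.ceil(old_servers / perf_ratio)
    old_power = old_servers * watts_per_server
    new_power = new_servers * watts_per_server
    return new_servers, old_power, new_power

# With these assumptions, 20 old servers collapse to 10 new ones,
# halving the power draw for the same amount of work.
```

The same formula also captures the second framing: hold the server count (and power) steady and perf_ratio becomes extra work done in the same footprint.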
"There are two sides to this," says Schulz, speaking to the different approaches possible when it comes to calculating the benefits of new servers. "If your objective or mandate is to remove or reduce costs, you can get into the new technology to do the same amount of work or similar work and lower your costs. On the other hand, if your objective is to stay within your cost footprint, but you need to double the amount of work being done and you need to get it done in a shorter amount of time, that's the other benefit."

THESE SYSTEMS ARE MORE THAN JUST HUNKS OF METAL THAT TAKE UP
SPACE AND REQUIRE LARGE AMOUNTS OF POWER AND COOLING. THEY
PROVIDE ALL OF THE COMPUTING RESOURCES YOU NEED TO RUN YOUR
BUSINESS EFFICIENTLY AND EFFECTIVELY . . .


Built-In Functionality
& More Intelligent Servers
In addition to making servers more
powerful, efficient, and relatively
small, manufacturers have made great
strides in integrating more intelligent
features into newer servers. For instance, many current server models
come with onboard diagnostic capabilities. Whereas IT used to have to
manually find out why a server failed, the intelligent server of today will alert IT personnel to potential issues on the horizon and help them pinpoint why a failure might have occurred in real time. In other words, there is
less looking under the hood and more
paying attention to the dashboard.
Servers can even be set up to deliver alerts and notifications so your IT
team can respond to and fix issues remotely if necessary. But the great news
for smaller organizations is that "the functionality that you used to have for diagnostics, troubleshooting, maintenance management, and alerts and warnings that used to be only on the high-end, enterprise-class, big mainframe servers, are now common today all the way down to SMB [small to midsize business]-based servers," says Schulz.
Another major server innovation designed to improve system intelligence
and make the IT administrator's job easier is the inclusion of such things as virtualization technology. Such features are built right into the server "to help accelerate and offload some of the common processing tasks," Schulz says. All of this goes back to the concept of being able to do more with less. It's often taken for granted, for example, that new servers can run multiple operating systems. They can also
now handle more computing tasks that
were once handled by separate cards
and components that would need to be
installed in the server. In other words,
new servers require much less physical

If you can get a server that consumes less energy or


puts out less heat, you can get a lot more life out of your
existing data center. That's a huge reason to upgrade.
JENNIFER KOPPY
Research Director : IDC

tinkering and manual attention than


they did in the past, which frees up
the IT teams time to focus on more
pressing projects.
"If you're [operating in] a small environment, it means that your staff can be more productive," Schulz says. "It just allows them to troubleshoot and address problems quicker. On the other hand, if you have a larger environment where you have many servers, it allows your staff to leverage that automation and let the servers handle more things so those people can spend more time on thought work and addressing other issues."

Better Management Tools


Improve Performance & Agility
Koppy says that new server management tools may actually be the game-changing features that separate new
servers from older ones. You can now
take advantage of DCIM (data center
infrastructure management) solutions,
IT service management solutions, and
cloud orchestration tools. "There are so many more management tools available to IT teams nowadays, and they provide you with that single pane of glass and that single version of the truth in order to be that agile," Koppy says. However, she warns, the process of implementing and using all of these wonderful new features starts with a great deal of up-front work and a commitment to tracking all systems and keeping logs of all the workflows, particularly when the tools are being applied to existing systems. It's a slightly different story with new servers.

"A lot of times companies have a hard time going in and applying that methodology and discipline to an existing data center, but if they upgrade, they can do that with a new system and get closer to the whole idea of that software-defined data center, which is the utopia where people want to be," says Koppy. "That's getting toward that agility and being able to move your workloads around and use your data center like a single server basically. That all requires a very strong base level of management." She adds that companies
need to look at management tools as
being just as important as the hardware,
which will be reflected in the server
management market as vendors start to
release new tools that make it easier to
manage and maintain your systems.

Internal Cooperation
Truly understanding the importance of server upgrades means that
you have to start looking at servers
differently. In the same way that you
upgrade smartphones every couple of
years, you may need to start upgrading
servers just as frequently, depending
on the needs of your company. It's important for the business and IT sides of
the company to communicate on these
issues and for the business side, executives included, to remember that the
data center is the information factory
for employees and customers alike, and
that upgrading servers to improve performance, efficiency, and productivity
will have long-reaching effects and ensure the future growth and success of
your company.

. . . NEW SERVER MANAGEMENT TOOLS MAY ACTUALLY BE THE GAME-CHANGING FEATURES THAT SEPARATE NEW SERVERS FROM OLDER ONES.


A New Level Of Colocation Speed


CYRUSONE NATIONAL IX DELIVERS FASTER, MORE RELIABLE CONNECTIONS

COMPANIES USING COLOCATION services understand how important it is


to have a fast and reliable Internet connection between the company's main
office and its colocation facility. This is
particularly true for enterprises taking
advantage of multi-facility colocation
arrangements for business continuity
and disaster recovery purposes. To address these business demands, CyrusOne
(www.cyrusone.com) built its own
high-speed, high-availability fiber network called the CyrusOne National IX
(Internet Exchange), which adds reliable
high-bandwidth connections between its
facilities across the United States.

Changing Demands
In recent years, CyrusOne noticed
that most of its customers started at
one location and then expanded their
deployments to multiple data centers,
either for active-active or active DR
(disaster recovery) solutions. In fact,
according to Gary Wojtaszek, president and CEO of CyrusOne, roughly
60% of the company's revenue is generated by customers with applications
deployed across multiple data centers.
The problem, as CyrusOne quickly
found out, was that each of these customers was also building its own individual network to connect applications
and equipment across multiple locations. Wojtaszek says that if they have 50 customers, for example, "it would be 50 customers re-creating 50 separate networks to tie together our data centers." The solution to this problem

was for CyrusOne to build the National


IX network as an alternative for its
customers. Now those same customers
have access to a high-speed network
that can connect nearly every colocation facility without having to build
their own unique network.

Cost Savings & Reliability


The National IX has fundamentally
changed the way CyrusOne offers colocation services to its customers and has
also resulted in a sizable cost reduction. Instead of paying for a network
build-out, the costs associated with
National IX are simply included in the
regular colocation agreement.
National IX is also much faster
and more reliable than a network a
CyrusOne customer might have built
in the past. "We have designed our fiber routes so that they are completely diverse between data centers, and we've tracked all of those routes and know that there's absolutely no shared broadband infrastructure anywhere along the paths," says Wojtaszek. "Because of the diversity, we provide a very high degree of availability, such that we could sustain a disruption in one of the routes and still have all data transmissions up and running."

A Fit For All Companies


Gary Wojtaszek, CyrusOne President and CEO

According to Wojtaszek, "All we're doing is replacing the existing network infrastructure that [enterprises have had] in place for decades with the shared network that enables them to connect reliably and securely across our data centers." Therefore, companies that use more data-intensive
applications (say, for data mining or
digital content delivery) benefit as
much as companies using colocation
primarily for disaster recovery and
comparable reasons.
"It's just another chapter of the whole outsourcing trend," says Wojtaszek. "As companies are trying to reduce their IT expense footprint and take advantage of new technologies, this is one of the ways they can do it. It enables them to focus on creating apps or other IT applications that are really going to give them a competitive advantage and not really have to focus on spending a lot of money on infrastructure."

CyrusOne | 855.564.3198 | www.cyrusone.com


Mobility & The Network


HOW MOBILITY IMPACTS CORPORATE WLAN PERFORMANCE

KEY POINTS
To combat the strain that more mobile devices coming into the workplace
are putting on corporate networks,
simply add more access points.
While older devices on a corporate
network can slow down other devices,
the situation isn't especially common
and is addressable by upgrading the
legacy devices.
When upgrading a corporate network's capacity, consider that users
will likely connect more devices to the
network in coming years.
Upgrading to 802.11ac technology
will add extra capacity and other
improvements that should alleviate
some issues an organization may be
currently experiencing.


ENTERPRISE NETWORKS are under duress. Virtualization, cloud computing, applications, streaming video, real-time communications, and other demands are certainly contributing to the strain corporate networks are feeling, but where corporate WLANs (wireless local-area networks) are concerned, mobility and the influx of tablets, smartphones, and other wireless devices in the workplace are producing a slew of headaches for the network administrators charged with overseeing performance. In short, says Andre Kindness, Forrester Research principal analyst, a WLAN refresh is a common undertaking for organizations supporting BYOD (bring your own device).

In addition to detailing the impact more mobile devices are having on enterprise wireless network performance, the following covers the problems that can result, how network upgrades can alleviate the pain, mistakes to avoid, and more.

Devices = Problems

Increasingly, WLANs are becoming


mission-critical infrastructures to organizations, "to the point that wireless networks are becoming the primary access layer in many environments," says Mark Tauschek, Info-Tech Research Group research director. In other words, "the vast majority of users are relying on wireless connectivity and not plugging blue cables in," he says. For many enterprises this means simply installing an AP (access point) in a boardroom for Wi-Fi access during meetings isn't enough anymore. Workers need and
expect throughput in all locations.
Not long ago, network administrators could roughly plan for one wireless
device per user. With widespread tablet
and smartphone adoption, Tauschek
says to assume each user will now connect up to three devices. As wearables,
such as smart watches, catch on and
begin connecting via Wi-Fi rather than
predominantly via Bluetooth, he says,

Encourage workers to not accept that the network is


slow. If poor performance is making it hard to do your
job, IT and management need to know that.
JIM RAPOZA
Senior Research Analyst : Aberdeen Group

networks will take on even more of


a burden. In coming years, one user
could be connecting a tablet, smartphone, one or two wearables, and
laptop, Tauschek adds.
Today, even small companies are upgrading their WLANs to multiple APs to handle more devices. Chris DePuy, Dell'Oro Group vice president, says in terms of capacity, Dell'Oro has observed 50 to 100 connections on high-capacity APs. Once the number of high-bandwidth-consuming users tops 10 or so, however, some APs run out of capacity, he says, although this can vary widely depending on the AP.
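Those figures can be turned into a rough sizing rule of thumb. The defaults below simply echo the numbers quoted here (three devices per user, roughly 50 connections per AP) and should be validated against a real site survey:

```python
import math

def access_points_needed(users: int, devices_per_user: int = 3,
                         connections_per_ap: int = 50) -> int:
    """Rough estimate of APs required for a given user population.

    devices_per_user and connections_per_ap are planning assumptions
    drawn from the figures quoted in this article; real AP capacity
    varies widely by model and traffic mix.
    """
    total_devices = users * devices_per_user
    return math.ceil(total_devices / connections_per_ap)
```

For example, an office of 200 users planning for three devices each would budget for roughly a dozen high-capacity APs before accounting for coverage, building layout, or interference.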
Although mobile devices don't typically use an enormous amount of data, Tauschek says there can be significant chattiness if each user has multiple devices connected, and that will impact performance. Applications can also impact network performance because, he says, they are increasingly media-rich. Tauschek adds that developers typically don't consider bandwidth implications because there's a perception that bandwidth is not a limitation.

Problems At Hand
Among the wireless network-related
problems that organizations can experience: older devices on the network
slowing down others. Lee Badman,
Syracuse University adjunct and network
architect, says that while less powerful Wi-Fi radios don't equate to slow, especially in a well-designed WLAN environment, slowness can stem from dated radio technology, such as 802.11b/g, operating in a world that's moving to 802.11ac technology. Dated technology can be a real pain, Badman says, especially where legacy devices demand data rates and security protocols that are no longer supported.
Organizations can address this scenario either by accommodating the old devices, with the penalty being decreased performance for newer client devices and associated network complications, or they can disallow the legacy stuff, Badman says. The latter will hopefully lead to better gear for the people you're telling, "We will not support that," he says.
For most organizations, this situation isn't much of an issue, Tauschek says. In fact, most organizations wouldn't even know that the network was adjusting itself to accommodate older/slower clients. "The only thing you can do to mitigate if it is an issue is to upgrade older clients," he says. Suffice it to say that 802.11b clients will slow down other clients and the network in general, while 802.11g clients will impact overall network speed but not other clients, because they contend for time slots at a lower transfer rate.
In general, it's often users that will tip
organizations off that wireless network
performance is lagging. Jim Rapoza,
Aberdeen Group senior research analyst, encourages workers to not accept
that the network is slow. If poor performance is making it hard to do your job,
IT and management need to know that.
Most organizations have an NMS
(network management system) in place

that is capable of monitoring network


traffic and performance and viewing
when bandwidth consumption is peaking
and where bottlenecks and chokepoints exist. If the NMS isn't utilized fully or is missing, however, Tauschek says users who complain loudly and frequently will be the best indicator that the network isn't performing as it should.
In terms of scale, larger organizations
typically face the greatest hurdles with
wireless networks, including having to
scale to the most users and devices and
accommodate the largest areas and most
remote locations. That said, wireless coverage can be a complex issue for an organization of any size in terms of an AP's location, building layout, and building materials impacting wireless performance. Often, poor floor plans, multiple
floors, wall thickness, machinery, and
other WLANs located nearby can impact
wireless performance.
Mike Fratto, Current Analysis principal analyst, advises monitoring airspace for interference from unauthorized
APs on the network, other radios in the
2.4GHz or 5GHz range, and any other
sources of interference. "Work with neighbors to make sure there are no APs using overlapping channels," he says. "APs that overlap can degrade performance because the overlapped channel is treated like interference and causes errors over the air. Letting 802.11 radios work out fair access will improve performance for everyone."
Microwave ovens, says Tauschek, are one example of a non-Wi-Fi device that can cause interference. Numerous devices
operating in the 2.4GHz range, he says,
can potentially cause some interference.
Having visibility, from a troubleshooting
perspective, into the spectrum via handheld tool or the AP will put the organization on a better path to identify sources of
RF (radio frequency) interference, he says.

In terms of capacity, Dell'Oro has observed 50 to 100 connections on high-capacity APs. Once the number of high-bandwidth-consuming users tops 10 or so, however, some APs run out of capacity.

CyberTrend / November 2014


Avoid Mistakes
For organizations considering a wireless network refresh or upgrade, performing a thorough site audit up front, one that covers device density, network traffic analysis (applications in use, where people use devices, etc.), WLAN security, and other pertinent issues, can help them avoid many problems later.
Matthew Ball, Canalys principal analyst, meanwhile, says some organizations fail to recognize that there's a difference between treating a wireless LAN as an overlay network vs. deploying a converged fixed/wireless infrastructure with single-pane-of-glass management. He explains that overlay networks are typically treated as secondary to a primary fixed network, meaning they lack adequate capacity and coverage. This approach also means two points of management, two security solutions, and so on are required. The converged approach addresses these areas, Ball says. Organizations could also treat both the WLAN and the fixed network as primary, providing both with adequate resources, he adds.
Today, largely due to BYOD and mobility initiatives, it's recommended that organizations build out wireless networks with a focus on adding capacity via more APs rather than stressing coverage, as was typical in the past. "Inevitably, if you build for coverage your network is very likely to fail," says Mark Tauschek, research director at Info-Tech Research Group. "Rather than just meet immediate capacity needs, however, anticipate in your wildest dreams what capacity will be years out to reduce the odds of facing issues related to coverage caps, interference, channel allocations, power levels, and more," he says.

For organizations that haven't already upgraded, a move to the 802.11ac standard is likely in the immediate future. Compatible with the previous 802.11n and 802.11a standards, 802.11ac adds extra capacity via support for roughly gigabit Wi-Fi speeds and includes added beamforming abilities. (Essentially, beamforming targets signals at clients vs. sending them out across a wide area.) Daryl Schoolar, Ovum principal analyst, says that while 802.11ac increases network throughput and handles more users, because it operates in the 5GHz spectrum its radio range isn't as great as that of 2.4GHz radios with 802.11n. Fratto adds that as existing APs are replaced, the 5GHz range will get congested, as well.

Tauschek says that with the use of MIMO (multiple input, multiple output) and beamforming, many of the issues that once occurred with latency, delay, and multiple paths related to bouncing signals don't exist today. "You don't see with 802.11n, and certainly with 802.11ac, the same types of issues we used to have with g and a/b before that," he explains. Overall, Tauschek says that when he identifies a trouble spot, such as an organization having difficulty getting coverage to a particular location, "my guidance is to add another access point. The best way to mitigate trouble spots is to add capacity and add an access point. Frankly, that's probably the cheapest way, too."

Elsewhere, Fratto says that for organizations using a controller-based WLAN, distributing controllers closer to APs can take traffic off the network closer to where it originates. Some vendors also split off controller functionality, he says, which maintains the benefits of centralized control of all APs while terminating WLAN traffic on the AP and putting it directly onto the wired network. "This model also maintains survivability because if the AP loses touch with the controller, it will continue to operate with the last good configuration," he says.
Badman, meanwhile, says the best thing organizations can do to improve their WLANs is simply to ensure that all engineering, installation, and support staff know what they are doing. "From design to daily operation, skilled expertise is the difference between constant headaches and user complaints and a system that just works," he says. Beyond this, developing a WLAN that enforces and enables the organization's business operational goals is key, he says. Afterward, he cautions against doing "weird stuff" with the WLAN, such as trying to make "all sorts of consumer-grade junk" work on it at the cost of reliability.
As a starting point in upgrading WLANs, most experts recommend organizations work with their wireless vendors, which have tools that can account for building layout, materials, and the like, and that can model environments and demonstrate coverage to pinpoint what's needed to increase coverage areas, create seamless integration among areas on a campus, and more.

THE LATEST PREMIUM ELECTRONICS

This Watch Is Smarter Than It Looks


WWW.ASUS.COM
Known for its Zenbook line of laptops, Asus recently announced its first wearable device: the appropriately named ZenWatch. The watch has a classic silver-and-gold design and a leather band, and the watch face can display a classic look as well, or you can customize it with one of more than 100 face options. You can also, of course, use the ZenWatch as more than a timepiece. When connected wirelessly (via Bluetooth 4.0) to an Android phone, you can use the ZenWatch to place calls, mute incoming calls, and even take photos. In the boardroom, you can use the device as a presentation remote control. You can also use the water-resistant ZenWatch as a fitness accessory to track calories burned, your heart rate, and many other statistics. The ZenWatch may be available by the time you read this; at press time it was not in full release and pricing had not been set.


Battery Running Low? Here's A Solution
WWW.ADATA.COM
Staying connected means staying charged, but mobile devices have a nasty tendency to run low on battery life just when you need them most. With the ADATA PV110 Power Bank ($29.99) in your pocket or bag, however, you won't be stuck without a charge. The PV110 provides 3.1 amps of power, although not all to one device; it features a 2.1-amp USB port to charge one device and a 1-amp USB port for a second device, so you can charge, say, a tablet and a smartphone at the same time. The fully charged PV110 can power up a smartphone roughly five times. The device uses Micro USB for its own charging, includes a set of lights to indicate remaining charge, and is available in four colors: gold, silver, blue, and pink.

Roman Design, Intelligent Cooling
WWW.DEEPCOOL.COM
The folks at Deepcool have compared the shape of the U Pal ($19.99) to the Arch of Constantine in Rome. Style points aside, the design offers substantial technical benefits. With two 140mm fans, one on each side of the arch, the U Pal is able to cool laptops precisely where they produce the most heat. And with the arch in between, there is more opportunity for the heat expelled by the fans to disperse. The U Pal offers quiet operation and a USB 3.0 pass-through. The device also has rubber feet to keep the laptop steady and prevent slipping, and it is adjustable so that your laptop can rest parallel to a flat surface or tilt at up to a 45-degree angle.


Trends In Artificial Intelligence


ADVANCEMENTS THAT MAY IMPACT YOUR BUSINESS

AI (ARTIFICIAL INTELLIGENCE) has often been a popular topic in science fiction, but there's real reason to believe that AI will be a reality for business services in the not-too-distant future. Apple's Siri and Microsoft's Cortana are recent examples of basic AI tools that can answer simple questions, but newer forms of AI could be much more powerful and could take advantage of big data resources, advanced analytics capabilities, natural language processing, and adaptive reasoning. The combination of technologies will allow AI to manipulate diverse information, so data doesn't have to reside in a structured database, and the AI can use reasoning (based on such things as trend analysis or link analysis) to reveal patterns and discover relationships that would be difficult for a person to unearth.
In this article, we look at some of
the many ways AI might be used in
the future.


AI Trends
One of the biggest reasons to be excited about AI is that there are several well-funded programs throughout the United States focusing on developing AI technology and software. "A number of companies . . . are actively working on neurocomputational hardware technologies," says Dan Kara, practice director, robotics, at ABI Research.

Neurocomputing is a field of study that uses computers to simulate the human brain and perform specific tasks, such as improving operational efficiency or automating repetitive activities. "There is a sizable effort dedicated to developing commercial neuromorphic hardware (neuromorphic computing), and the ability [to] perform massively parallel processing on low-power, embedded processors makes possible a wide range of new, dramatically more powerful devices and applications," says Kara.

Neurocomputing isn't the only path to AI. There are also AI tools that use machine learning techniques, in which computers can perform tasks that are not explicitly programmed. Machine learning takes advantage of a special set of algorithms that can discover useful patterns in a given set of data, such as images, sound, or text. Michele Goetz, principal analyst at Forrester Research, says, "The first wave [of AI] seems to be based around early commercialized cognitive solutions that utilize a combination of deep learning and decision forest algorithms." Deep learning is a form of machine learning that can work with neurocomputing to let the computer think without as many input examples as you'd see with traditional machine learning.
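The idea that a computer can pick up a pattern without being explicitly programmed for it can be shown with a toy example. The sketch below is purely illustrative (real machine learning systems are vastly larger, and the function names here are our own): it trains a single perceptron to learn the logical OR function from labeled examples alone, with no OR rule written anywhere in the code.

```python
# A perceptron learns weights from labeled examples; the "rule"
# emerges from the data rather than being written by a programmer.

def train_perceptron(samples, epochs=20, lr=0.1):
    """samples: list of ((x1, x2), label) pairs, labels 0 or 1."""
    w1 = w2 = bias = 0.0
    for _ in range(epochs):
        for (x1, x2), label in samples:
            out = 1 if (w1 * x1 + w2 * x2 + bias) > 0 else 0
            err = label - out          # adjust weights only when wrong
            w1 += lr * err * x1
            w2 += lr * err * x2
            bias += lr * err
    return lambda x1, x2: 1 if (w1 * x1 + w2 * x2 + bias) > 0 else 0

# Labeled examples of logical OR -- the only "knowledge" supplied.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
predict = train_perceptron(data)
print([predict(*inputs) for inputs, _ in data])  # [0, 1, 1, 1]
```

Deep learning stacks many layers of units like this one, which is what lets it pick out far subtler patterns in images, sound, and text.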
What types of practical things can AI accomplish with deep learning? Goetz cites an example of a company that uses this approach to create applications that analyze profiles and transactions to not only expose behavior and transactional patterns, but also provide transparency into the factors contributing to these insights in areas such as technical support, purchasing, and financial risk. This way, the AI can help you understand the risk level of a given borrower, or how likely it is that a person will repay a loan on time, and the AI can list the unique features of the individual in question, which can be used to help mitigate risk.
Another factor that's helped to break down AI barriers is that computers now have access to digital infrastructures and general knowledge bases that weren't easily available in the past. When you combine big data with smart algorithms and fast computers, you have the technology in place to make AI impactful. Mike Battista, senior consulting analyst at Info-Tech Research Group, says, "With screens everywhere, it's easier to imagine AI having a face. With sensors everywhere, it's easier to imagine AI having eyes, ears, and other senses. We're at a point where a lot of pieces that could make advanced AI more viable, and useful, are part of our everyday lives already."

AI Use Cases
The advancements in AI have started to move into the realm of the business world and commercial systems. "As a whole, the AI community is more focused on providing working, practical solutions, and solving problems as opposed [to] emphasizing theory," says Kara. "AI techniques are being used successfully every day for manufacturing, healthcare, [and] communications, among other things."
Forrester's recent "Cognitive Engagement: A New Force Of Creative Destruction" report identifies three ways in which cognitive systems (those that can listen, learn, and respond using natural language and symbolic reasoning) can benefit companies.
In one instance, cognitive engagement can augment people's capabilities, such as with digital assistant tools. A good example of this is Google Now, which scans information on your Google account (such as calendar entries and your current location) to recommend events or activities. The system could also send a text message informing you to leave the office a bit early to beat traffic and make an upcoming appointment.
Another benefit, according to Forrester's report, is the potential availability of tools that can improve business operations by scaling knowledge capacity. This involves human-like uses of AI that can provide advice for a given situation. For instance, AI could help a customer service agent quickly resolve complex problems, such as which insurance plan is right for the client. "Where big data mostly resonated and made sense to IT and analytic professionals, artificial intelligence is demonstrating possibilities that non-technical individuals and business leaders understand," says Goetz.
Lastly, AI technology could lead to automation of a repeatable activity. Forrester reports that when a cognitive engagement system can process input, make decisions, and act autonomously in milliseconds, new categories of capability open up. Just imagine all of the Internet-connected devices that could become automated with the use of AI. Your house could start cooling down or warming up as soon as you leave work. More powerful AI could further automate manufacturing lines, warehouses, and distribution centers, among other things.

What's Next
"Great strides are already being made in many areas such as perception and language understanding," says Kara. "For example, machines are now nearly as good as humans at recognizing images. Imagine when these same machines will have global, pervasive access to distributed, indexed repositories of the world's images [and] databases, which, by the way, will continue to grow exponentially." Extra help may come from big companies that are researching AI technology. Battista says that these large companies have access to "a whole lot of raw digital data about people, and about the world, that may be necessary for AI to flourish." He also notes that "Perhaps even more importantly, they have the resources to try an expensive idea, fail miserably, [and] then try another."
In terms of effects on the business market, AI experts are working to help people make better decisions. Goetz says, "The next wave will be to move expert systems to interactive problem solving platforms." They are, Goetz explains, "working on how to connect today's corporate silos of analytic competency centers to operational competency centers and reduce the bottleneck to corporate knowledge and fast time to scalable business value." Kara echoes those thoughts, saying, "In the near term, the greatest benefit will be to industries that can benefit from drawing understanding from massive amounts of data. Healthcare, retail, and logistics provide examples." All in all, it looks like a bright future for AI technology.



Smartphone Tips
IMPROVE YOUR MOBILE EXPERIENCE

WINDOWS PHONE
Pin, Unpin & Resize Start Screen Tiles
When you power up a brand new Windows Phone smartphone or download new apps, you'll see that app tiles simply show up on-screen. You don't have to resign yourself to the size and placement of those tiles. To pin an app to the Start screen, swipe left to access the Apps menu, tap and hold the tile, and tap Pin To Start. To unpin a tile, tap and hold the tile, and tap the Unpin icon. You can also tap and hold a tile and then tap the arrow icon to make the tile smaller or larger. The larger it is, the more information it will display.


Take Advantage Of Auto-Correct


Windows Phone offers numerous automatic text correction features that are designed to speed up typing. When you reach the end of a sentence, for example, tap Space twice to automatically place a period at the end of the sentence and capitalize the next word if you should decide to type another sentence. Windows Phone also adds accents and apostrophes where it seems they might be needed (for example, it will change werent to weren't, and will even try to determine whether the word well is meant to be left as it is or changed to we'll). As you type words that Windows Phone doesn't recognize, it might try to change them. If you should ever need to reset the dictionary, access Settings and tap Keyboard, Advanced, and Reset Suggestions.

Combine Duplicate Contacts


Windows Phone 8 does a great job of collecting all of your contacts into one place so they're easy to search for. If you find you have multiple instances of the same contact, however, drawn in from various email and social media accounts, you can use the Link feature to combine them on your smartphone. Find one of the contact instances, tap the Link icon, and then either select another contact instance to connect it to or tap Choose A Contact to find the correct contact to link.

ANDROID
Can't Log Back In To A Hotel Wi-Fi Hotspot?
Whether you're in a hotel room, restaurant, or any place where there is a secured Wi-Fi hotspot available, the following scenario might be familiar. At one point you asked for the SSID (and password) for accessing the hotspot, entered them on your smartphone, accepted the hotspot provider's terms and conditions, and made use of the wireless Internet access, only to return at a later time to find you're unable to access the same hotspot. You might see a wireless symbol, but it has a diagonal line through it, indicating that your Android smartphone recognizes you've logged in to that hotspot before but for some reason isn't permitting you to access it now.
To solve this, start by swiping down from the top of the smartphone screen and tapping the wireless icon. In the list of wireless hotspots, find the hotspot you're trying to connect to, press and hold it, and tap Forget. Wait for a moment to see if the Wi-Fi network shows up again in the list. If it does, tap it and log in as you did previously. If it doesn't show up right away, power your smartphone off and then back on, and try accessing the Wi-Fi network again. Taking these steps should allow you to once again enter the SSID and accept the hotspot provider's terms and conditions.

Delete Browsing History


Deleting the mobile Chrome browser
history on your Android smartphone
eliminates some potential privacy issues and helps keep your device
clutter-free and running as quickly
as possible. In Chrome, tap the menu
icon, Settings, (Advanced) Privacy,
and Clear Browsing Data to remove
browsing history, site data, and related
information.

Get Wireless Internet For Your PC In A Pinch


You can use your Android smartphone as a wireless modem to connect a PC to the Internet if you have three things: a smartphone model that supports it, the appropriate USB data cable (almost always sold separately from phones), and login information (ID and password) from your wireless service provider. Keep in mind that the connection will be slower than you're used to on the PC and that using your smartphone as a modem will drain the phone's battery life more rapidly than ordinary use, but this procedure can come in handy when you need it most.
On the phone, access applications and tap Settings, About Phone, USB Settings, and PC Internet. Press the Home key, use the cable to connect the smartphone to the PC, and wait for the installation instructions to appear. Follow the instructions provided to establish an Internet connection for the PC (this will vary depending on the smartphone and the PC's operating system). Enter the required ID and password and wait for the connection to complete. Simply unplug the phone and return its settings to normal when you're done.

Download Maps For Offline Use
Sometimes you have data access, and sometimes you don't. Because it can be a pain to suddenly lose access to live maps on your Android smartphone while traveling, it's wise to search for the maps you might need ahead of time and download them for offline use later on. That way, whether you encounter a cellular dead zone, have no access to a Wi-Fi network, or both, you'll still have access to the maps you need. To download a map, open the Google Maps app, find the area you're looking for, tap Menu, and tap Make Available Offline. You can then view the estimated size of the available map section and pinch and zoom to select the map area you prefer. Keep in mind that there are some limitations: you can download up to six maps (if you try to download a seventh, you'll have to delete a previously downloaded map) and there must be enough storage capacity available on your device to accommodate the maps.

Add Voice Privacy


All major carriers will assure you that your voice calls are secure, but there is
a way to add another layer of encryption to voice calls, which makes them more
secure. To do this using an Android smartphone, access Settings, tap My Device,
tap Call, and tap to place a check mark in the Voice Privacy check box.


BLACKBERRY
Merge Contacts & Calendar Items
If you find yourself with multiple contact and calendar entries on your BlackBerry 10 smartphone, it could be the result of BlackBerry Link synchronizing items between your computer and your device but leaving behind items that you created on your smartphone. If that's the case, you can rectify the situation by deleting the smartphone-only entries. To do this, access Settings on your BlackBerry, select Accounts, select the More (three vertical dots) icon, and then select Clear Local Contacts and/or Clear Local Calendar. Keep in mind that this action permanently deletes the entries that exist solely on the device.

Zoom, Even When The Screen Doesn't Allow It
Every touchscreen user probably knows by now what it means to pinch to zoom, in which two fingers are used to zoom in or zoom out on a screen's text and images. As you've likely noticed, however, there are many apps and browser pages on which this is possible, and many others on which it isn't. Don't let that stop you. If you have a touchscreen BlackBerry 10 smartphone, access Settings, tap Accessibility, and switch on the Magnify Mode feature. Doing this magnifies the screen a little bit right away. You can adjust the level of magnification by sliding two fingers apart on the screen (to zoom in) or by pinching two fingers together (to zoom out). To toggle Magnify Mode on and off without having to go into settings, use two fingers to swipe down from the top of the screen.


Watch A Video Without Sound


For those times when you need to watch a video file on your BlackBerry but you're not alone and don't have a pair of headphones, you have a few options. Two options, watching without sound or turning the sound down low and holding your smartphone to your ear, usually aren't all that helpful. There is a third option, however. Access Settings, select Accessibility, and flip the Closed Captions switch to the On position. Now you'll be able to turn the volume all the way down, watch the video, and read the text captions.

Forget Auto-Correct, Create Your Own Text Shortcuts
All right, don't actually forget auto-correct. Despite its often-documented failings, auto-correct (known as word substitution in the BlackBerry universe) is arguably more helpful than not when it comes to typing on a smartphone touchscreen. However, if you're using a BlackBerry 10 smartphone, you can create your own text shortcuts to speed up your typing. If, for example, there's a certain unusual word, or even an entire phrase, that you use fairly often, you can establish a shortcut using an abbreviation or a nonsense word that, when you type it, uses word substitution to replace what you typed with the full word or phrase. To do this, access Settings, and then tap Language And Input, Prediction And Correction, Word Substitution, and the Add (plus sign) icon, then enter the abbreviated and full text when prompted.


Use The Smart Tag App


BlackBerry 10 smartphones that support the short-range wireless standard called NFC (Near Field Communication) can store scanned NFC tags as well as QR codes as smart tags. In the case of QR codes, you can think of a smart tag as somewhat akin to a Web browser favorite, storing the essential information (such as a URL) provided by the QR code. With an NFC tag, you might, for example, use smart tags to save the information from someone's business card after bumping phones to exchange cards, or to save the details about an event after touching your smartphone to an NFC-enabled kiosk. To use smart tags, open your BlackBerry's Smart Tags app before scanning a QR code or NFC tag. Then, after capturing a QR code or tapping your phone against an NFC tag, you'll have the option to save the smart tag.

iOS
Stream Music From Computer To Phone
Your iPhone is a great communication tool for you and your business, but that doesn't mean it's all work and no play. In your downtime, an iPhone makes a great iPod, letting you listen to music and podcasts, and watch videos. The downside is using the valuable and limited storage space on your iPhone to hold your media library. But there is another way, at least for when you're at home. If you're using iTunes 10.2 or later, and your iPhone is using iOS 4.3 or later, you can use Home Sharing to stream all of your iTunes content to your iPhone (and other iOS devices) using your home's Wi-Fi connection. Because you're streaming the media, you're not using up precious storage space on your iPhone. This only works, however, as long as your computer and iOS device are both using the same wireless network.
To turn on Home Sharing, launch iTunes on your Mac or PC. From the File menu, select Home Sharing and then Turn On Home Sharing. Enter your Apple ID when asked, and then click the Create Home Share button.
On your iOS 8 phone, access Settings and select Music, and then log in to Home Sharing using the same Apple ID you used to enable Home Sharing in iTunes. Launch the Music app, tap More, select Shared, and then choose the shared library you wish to access. (In earlier iOS versions, open Settings and select iPod to find the Home Sharing section. Launch the iPod app, tap More Options, select Shared, and then choose the appropriate library.) From there, use the app the way you normally would.

Cut Signals To Save Battery Life
Whether you're using your iPhone or not, the Wi-Fi radio continually scans and therefore slowly drains your battery. To turn off Wi-Fi, tap Settings, tap Wi-Fi, and slide the switch to the Off position. Turning off cellular data is another great way to conserve battery life. To do this, tap Settings, tap Cellular, and set Cellular Data to the Off position.
Although disabling Wi-Fi and cellular data does a lot for your battery life, it can be murder on your mobile productivity and social life. If you're looking for a better way to reclaim lost processing cycles, consider disabling push notifications. To do this, tap Settings, Notification Center, and scroll through the list of applications that support notifications to enable or disable Sounds, Alerts, and Badges. To turn off notifications for individual apps, tap the app and slide the toggle switches for Show In Notification Center and Show On Lock Screen to the off position, and then tap Back.
If you want to turn off notifications entirely, you can do so by using the Do Not Disturb feature. Tap Settings, tap Do Not Disturb, and slide the Manual switch to the On position or set up a schedule for making Do Not Disturb active. Using this feature turns off alerts and calls as well. When Do Not Disturb is in use, a moon icon displays in the status bar.
If you want to kill every wireless signal your iPhone emits, you could be looking at a substantial battery-saving step, and it's easy to do: just enable Airplane Mode. To do this, tap Settings, and then slide the switch adjacent to Airplane Mode to the On position.
Other battery-conservation tips include disabling Bluetooth (tap Settings, tap
Bluetooth, and slide Bluetooth to the Off position) and turning off location services (tap Settings, Privacy, and Location Services; slide Location Services to the
Off position).

Block Calls, Texts & FaceTime


Accessing the block list on iOS 7 or iOS 8 requires only a few simple steps. Access Settings, scroll down, and tap Phone, Messages, or FaceTime. (It doesn't matter which option you select; you won't receive calls, messages, or FaceTime connections from any contact you add to the Blocked list.) Next, tap Blocked in any of the three categories and choose Add New to select who to add to your Blocked list.


Isolate Malware
HOW TO COMBAT ATTACKS

AN UNFORTUNATE FACT about using an Internet-connected computer these days, whether it is a personal or company-issued notebook, is the constant threat of malware infection. Even when taking preemptive action to combat malware attacks, there's a fair chance one will eventually hit your notebook anyway, if for no other reason than the sheer volume of malware that attackers introduce daily. Frighteningly, research from a leading security software maker revealed that more than 15 million new malware samples were developed between January and March 2014 alone. Of this number, Trojan horses accounted for 71.85% of all malware and were responsible for 79.90% of all global computer infections. What's startling is that these attacks included zero-day threats in which, as the name suggests, zero days elapse between when a given vulnerability is discovered and when attackers release malware targeting the vulnerability.


With malware being so prevalent and persistent, a large part of combatting it is being able to recognize signs that a system may be infected and then knowing how to troubleshoot the problem. Also important is knowing what security tools are available to detect, protect against, and remove malware. The following details these issues and others for notebook business users.

The Warning Signs

Although new malware variants are constantly being developed and released,
malware is generally categorized into several common groups, including viruses,
worms, rootkits, spyware, Trojans, keyloggers, adware, and ransomware. What
these groups have in common is an aim to infect a user's notebook to steal
personal or company information, hijack the system outright, or cause other
types of damage. Malware infections can transpire in numerous ways, including
when you visit an infected website, install software or an app with malware
hiding inside, click links or open attachments in email, or insert an infected
USB thumb drive.
Though warning signs that malware
may be present can differ depending
on the malware type, there are some
primary indicators to look for. Michela
Menting, ABI Research practice director, says the most common include
applications and programs running noticeably more slowly, slower Internet
performance, and data or files that
are unexpectedly deleted or altered.
A notebook running more slowly, for
example, could indicate malware is
stealing computing resources to fuel
whatever activity the malware was designed to execute, such as hijacking
the system to help generate and spread
spam to other systems.
Some specific examples of changes in notebook performance to watch out for
include programs, files, and folders that take longer to open or that don't
open at all, and the notebook taking exceedingly long to shut down or not
shutting down at all. Menting says an easy way to check for system performance
issues on Windows notebooks is to look at the processes running in the Task
Manager and pay particular attention to memory or CPU resources. "If users
regularly check the Task Manager, they may be able to more easily spot when
something looks different from normal," she says.
Other odd or strange system-related occurrences that can signal possible
malware activity include the notebook's battery draining more quickly than
normal, beeps or alarms sounding unexpectedly, and internal fans speeding up
for no obvious reason. Elsewhere, the sudden and constant appearance of error
messages can be a clue that malware is present, as can a Web browser's home
page changing or new toolbars appearing in the
browser without the user's involvement. Additionally, an inability to access
various system tools; messages that report that administrator rights have been
denied; and a sudden disappearance or appearance of unfamiliar icons,
shortcuts, folders, photos, and file types are all other possible malware
warning signs.

Pop-up messages, including those that appear out of the blue when a Web
browser isn't even open, are another indication that malware (particularly
adware and Trojans) may be present. An especially cruel type of
malware-related pop-up is one that warns a user of security vulnerabilities on
his notebook and recommends that he download or buy the suggested security
software (which happens to be fake). Another indicator to watch for is phony
social network posts that the user appears to initiate and share with his
contacts.

Immediate Response

When you suspect malware has infected your notebook, Menting advises turning
off its Internet connection immediately. "Most malware will use the Internet
connection to send information back or infect other computers on a network,"
she says. "Isolate the laptop and then run an antivirus scan." Additionally,
ensure that antivirus software on the notebook is up-to-date with the latest
malware signatures. "If not, then copy a free AV program onto a USB thumb
drive and use it to install [the software] on the disconnected, infected PC,"
she says. More sophisticated malware, Menting says, may be able to obfuscate
its presence, and others, such as zero-days, have "simply not yet been
uncovered by security firms and, therefore, an antivirus [program] will not
help." In such cases, Menting says the best option may be to wipe the hard
drive clean and reinstall the operating system.

Means Of Prevention
As a means of prevention, Menting says, at the least you should ensure that a
firewall is running and working properly. Generally, she says, most operating
systems have built-in security features that users should activate.
Additionally, numerous programs (including PDF and document-creation programs)
provide options to password-protect files. "These are really useful for
protecting sensitive documents," she says. On browsers, a number of security
features can also be activated or increased.

Malware Removal Tools


Beyond built-in tools, numerous malware-removal tools are free to download and
use, as are numerous useful and easy-to-use program-based, on-the-fly
encryption tools and antitheft products. Menting says, "Users should
definitely consider protecting their data as well as their devices." She says
specific features and abilities to seek out in such tools include antivirus,
antispam, antiphishing, and antispyware; firewall and intrusion prevention
systems; email, browser, chat/instant messaging, and application protection;
privacy, ID, and online transaction protection; encryption and password
management; antitheft and remote locate/lock/wipe; and cloud-based services
and backup platforms.

Usage-wise, routinely run antivirus scans, and avoid opening email and
attachments or clicking links within messages from senders you don't
recognize; don't reply to suspicious email; avoid visiting suspicious or
unknown websites; don't click pop-ups that appear suspicious, and consider
using a pop-up blocker; and don't download and install software from suspect
sources. Additionally, keep software, including Web browsers and security
programs, updated; back up data regularly; and report suspicious activity to
your company's IT department.


Ease Travel Headaches


THESE DEVICES CAN HELP

IF YOU DO ENOUGH traveling, you quickly


learn tricks to eliminate some of the stress
and headaches that come with being on the
road. Technology can often help, as the following travel-friendly devices prove.

Bluetooth Finder
$20+
Ever set down the keys to your rental
car or your own vehicle on a restaurant
table only to get up, walk away, and leave
them behind? Many of us have done
this. One way to prevent the aggravation
that typically follows is to attach a small
Bluetooth dongle-like device to the key
ring. These devices communicate with
a smartphone, warning you when the
phone moves out of range of the dongle.
Typically, such devices cost about $20 or
more and have a smartphone app that
sounds an alarm, emits a beep, or even
provides a visual indicator that lets you
see when you get closer or move farther
away from the key ring or other object to
which you attached the device.


Bluetooth Keyboard
$60-$100+

No matter how adept or comfortable you are typing on a mobile device's
touch-enabled keyboard, there are times when facing a hefty amount of text or
numbers to enter into a document or spreadsheet would make anyone conclude
that a full-fledged keyboard is a better choice. While some tablets bundle a
Bluetooth keyboard, others don't. Fortunately, plenty of third-party options
are available. Nearly all Bluetooth keyboards are thin and light enough to
make an ideal travel companion, and some double as a protective tablet cover
or conveniently fold into a stand to present a laptop-like form factor.
Depending on the model, a particular Bluetooth keyboard may include
backlighting and specialty keys, integrate a rechargeable battery, and come
with a carrying case. Pricing generally ranges from $60 to $100 or more.

Device Charger
$40-$100+

If there's one certainty about mobile devices, it's that they don't hold their
power forever. Further, an outlet isn't always available for recharging. Thus,
it's a good idea to pack your own portable power supply so you can keep using
that smartphone, tablet, notebook, or other device. Though price, size, and
weight can vary widely, general portable charger options include battery packs
that are solar powered and/or integrate a rechargeable battery. Typically,
models bundle multiple device adapters, and some include multiple connectors
to charge several devices simultaneously. Some packs also function as a cover
for a smartphone. Other power-related products worth considering include USB
power adapters that plug into power outlets to provide a USB port for
charging, solar backpacks, power adapters for overseas travel, and mini surge
protectors.

Luggage Tracker
$50-$100

Some people would pay a small fortune to know where their luggage is at all
times. Anyone who has experienced lost luggage understands why. The good news
is you don't have to pay a fortune to pinpoint the exact location of your
baggage. For somewhere in the neighborhood of $50 to $100, you can acquire a
device (perhaps even an FCC-certified and FAA-compliant one) that uses GPS
(Global Positioning System), GSM (Global System For Mobile Communications),
and other technologies to track your luggage. These devices are small enough
to easily fit in a bag and can do such things as email or text you a
verification that your luggage arrived at its destination. If your luggage is
completely lost, the devices let you pinpoint the bag's location on a
Web-based map.

Multi-Tool Knife
$25-$100+

While you still can't carry a pocketknife onto a plane, you can take one with
you when traveling by land, and having a multi-tool knife on hand can get you
out of a fair number of jams, including in a storage sense. That's because
beyond packing an actual blade (as well as possibly a screwdriver, scissors,
wine opener, pliers, etc.), some multi-tool knives integrate a USB memory
stick for storing and transferring several gigabytes' worth of document,
image, video, audio, and other files. Thus, even if you lose your notebook or
tablet, you'll still have important files handy. Depending on the storage
amount and model, prices of multi-tool knives can span from about $25 to $100
or more.

Noise-Canceling Headphones
$50-$100s

Whether in a plane, shuttle bus, taxi, or room in a not-so-quiet hotel, noise
seemingly always surrounds a traveler. Noise-canceling headphones provide an
escape from the din by using battery-powered technology to essentially match
and cancel the external sound swirling around. Alternatively, less bulky,
non-battery-powered sound-isolating earbuds offer comparable results by
fitting into a user's ear canal to isolate sound rather than cancel it.
Depending on the mechanics, engineering, sound quality, and materials in use,
headphone and earbud prices can stretch from $50 for an acceptable pair into
the hundreds of dollars.

Portable Hotspot
$50+

Staying Internet-connected while traveling is an absolute necessity for
business travelers, but doing so via Wi-Fi isn't always easy. Although public
Wi-Fi hotspots are available, they're not always free, and security can be an
issue. An alternative is to use a portable hotspot device (costing about $50
or more) that uses a mobile broadband connection (a 4G LTE network, for
example) to form a wireless hotspot for multiple devices. Another option is to
use a portable router. Plug an Ethernet cable into, say, a hotel room's wired
Internet port, plug the cable's other end into the router, and you've created
a wireless network with multiple-device support. Elsewhere, various companies
provide private Wi-Fi services for a monthly or annual charge that add a
welcome layer of security when connecting to a public Wi-Fi hotspot.

Portable Monitor
$75-$200+

In some office environments today, you may see two (or more) monitors on an
employee's desk. This setup allows users to easily switch among open programs,
keep tabs on continually updating information in a Web browser, and perform
other multitasking chores. Travelers can get this same second-screen ability
away from the office by bringing along a USB-based portable monitor. Such
screens (generally $75 to $200 and more) connect to and draw power from a
notebook via a USB cable. Typically, the screens support landscape and
portrait modes, bundle a stand to prop them up, and are light enough that you
likely won't notice when you're toting one around. Depending on the model,
touch abilities might also be included.

Storage Device
$10+

One thing you can't have too much of is storage, including for reasons tied to
accessing and backing up data. Packing an extra memory card, USB thumb drive,
or external drive on which you can grab or offload photos, videos, and other
files is smart. (Just make sure not to lose the storage device on which that
data resides.) If you're expecting rugged conditions where you're going, some
external drives and USB memory sticks offer water- and shock-resistant
features to provide protection against the elements and drops. Overall,
storage continues to be an excellent value. Memory sticks, for example, can
start at around $10 depending on the storage capacity.

Video Streamer
$35-$100+

It's possible you already have a set-top box-like device attached to your HDTV
at home to stream on-demand TV programs, movies from subscription services,
and audio from Internet radio stations. A nifty aspect of these devices is
that travel-friendly sizes (some are essentially a glorified USB stick) mean
they can easily stow away in a suitcase. Unpack one in your hotel room;
connect it to the TV's HDMI, USB, or AV port; configure the Wi-Fi settings;
and all the content you have at home is now available for watching in your
hotel room. Should you forget to pack the remote control the device bundled,
many streamers have compatible apps that enable a smartphone or tablet to
function as a remote control. With prices ranging from about $35 to $100 or
more, these devices are an entertainment steal.


PC Problems On The Road?


HERE ARE SOME QUICK FIXES

IF YOU HAVE USED a computer for any amount of time, then you know that PC
problems can often occur with little warning. Maybe you are having trouble
connecting to a Wi-Fi hotspot, or you can't get your mouse to work. We explore
how to troubleshoot these and other common PC problems so you can get back to
work quickly.

Hotspot Troubleshooting

Ordinarily, when you carry your laptop into an airline lounge, it will
automatically connect to the available Wi-Fi hotspot. But what if that doesn't
happen? First, check that your notebook's Wi-Fi adapter is turned on. Often,
you'll see a backlit Wi-Fi icon near the keyboard. If the icon isn't
illuminated, look for a physical switch that you can flip to enable the
adapter. Sometimes, the state of your network connection is easily determined
by an icon in the notification area of the Taskbar. For instance, a red X on the


network icon indicates the adapter is disabled, while an asterisk means the
adapter is in the process of detecting the available networks. You can
right-click the network icon in Windows 7 or Windows 8 and select Troubleshoot
Problems. When the Windows Network Diagnostics utility opens, it will reset
your connection, disable the wireless adapter, and then enable the adapter
again.

The utility will display descriptions of the problems it detects along with
some recommended solutions. In most instances the utility will repair the
connection and report the issue as Fixed. To enable a disabled adapter,
right-click the network icon, click Open Network And Sharing Center, select
Change Adapter Settings, and then right-click the name of the wireless
adapter. In the resulting menu, you can choose to disable or enable the
adapter, connect to or disconnect from a network, and diagnose problems, among
other options. Click Properties to access detailed options that may help you
troubleshoot the problem.

When your adapter is working properly, Windows may display a message
indicating there are several available wireless networks. Select the message
and choose a network SSID (service set identifier, or name) from the list.
(You may need to input a security password.) To display a list of available
networks in Win8, go to the Settings

option in the charm bar and click the Available Networks icon. If the adapter
is working and your system appears to be connected, but you still can't access
the Internet, check for a browser-based splash screen and/or a Terms Of Use
statement to agree to. Launch a fresh browser session and click the Home icon
to redirect.

Fix Broken Outlook PST & OST Files

The PST (personal storage table) file and the OST (offline Outlook Data File)
are where Outlook stores messages, calendar events, and notes specific to your
email account. If one of these files becomes corrupted, you may find yourself
ousted from Outlook. There are a few things, however, that you can do to get a
foot in the door.

Microsoft's Inbox Repair tool, Scanpst.exe (Outlook 97-2003, 2007, 2010, and
2013), lets you solve busted PST/OST problems quickly. It might not look
pretty, but if you're suddenly unable to access your inbox due to a corrupt
file, it's a welcome fix. To access the tool, close Outlook and navigate to
C:\Program Files\Microsoft Office\OFFICE12. (This last folder may have a
different number; for instance, our version of Office 2013 stores the utility
in the \OFFICE15 folder.) Double-click Scanpst.exe. By default, the address
for our OST file was already listed, but if the field is blank, look in the
C:\Users\USERNAME\AppData\Local\Microsoft\Outlook\ folder. Click the Options
button to access the Replace, Append, or No Log functions and click OK. Click
Start to begin the scanning process. Windows will inform you of any errors and
prompt you to perform a repair when the scan is complete. Before clicking the
Repair button, make note of the scanned file's backup location. Click Repair,
and click OK when you see the Repair Complete message. Launch Outlook to see
if this fixes the problem.

If the file structure was corrupted beyond repair, Scanpst.exe resets your
file structure and rebuilds the headers. The Recovered Personal Folders item
in your Outlook folders list, if it appears, will contain all the data that is
recovered. You can then drag the data to your new PST file and delete the
Recovered Personal Folders item from Outlook.

A Touchy Touchpad

If you use your laptop on a dock (and use an external mouse and keyboard), you
can go weeks or months with a deactivated touchpad and never realize it until
you hit the road. If you find yourself in this situation, you can activate the
touchpad by pressing the Fn (function) key simultaneously with the F number
key associated with the laptop's touchpad (often labeled with an image of a
touchpad). Using this key combination will either automatically activate the
touchpad or display a device settings dialog box that gives you the option to
enable your touchpad. Alternatively, you can check the notification area in
the lower-right corner of the screen for a touchpad icon. Click the icon and
the touchpad control panel appears, where you can enable or disable an input
device.

An Unresponsive Keyboard Or Mouse

If your programs and applications don't respond to keyboard commands, use
your mouse to shut down the computer by clicking Start, then Shut Down (in
Win7), or tap the Power button and tap Shut Down (in Win8). Unplug the
keyboard from your PC and then reconnect it. Restart your PC to determine
whether this process corrected the problem. (If both input devices are
unresponsive, you can press and hold the Power button on the tower to manually
shut down your system.)

If your mouse isn't responding, but your keyboard is, press the Windows key in
Win7 to open the Start menu, use the Right-Arrow key to select Shut Down, and
then press ENTER to shut down the computer. In Win8, press CTRL-ALT-DELETE,
press the Tab key until the power icon is highlighted, and then press ENTER.
Unplug your mouse and then reconnect it. (If necessary, you can press and hold
the Power button to shut down the PC.) Then restart your computer to see if
these instructions fix your problem.

If you're using a wireless keyboard and mouse, ensure that the peripherals are
synced and in range of the wireless receiver. You may also need to install new
batteries. If these steps don't enable peripheral communication with the PC,
try reinstalling device drivers. You can often download these from the mouse
and keyboard manufacturers' websites.


Excel Formulas
MAKE THEM WORK FOR YOU

EXCEL SPREADSHEETS are useful for


tracking finances, storing important
figures, or even creating databases
of information. But the only way to
take full advantage of Excel is to use
functions and formulas. Whether you
simply want to find the sum total of a
column of numbers or calculate compound interest, formulas are the best
way to transform your data. Here are
examples of formulas that might save
you time.

Calculate Compound Interest


Because Excel doesn't have a built-in function for calculating compound
interest, Microsoft provides a formula that will get you the results you need
using present value (PV),


interest rate (R), and the number of investment periods (N). So, if you make
an investment of $100 and want to see how much money you'll have in 10 years
with a 4% interest rate, you can plug those numbers into the =PV*(1+R)^N
formula. In our example, your formula would be =100*(1+0.04)^10. Note that you
need to change the 4% figure into a decimal number; otherwise, you might
expect a larger-than-life return on your investment. Calculate the formula and
you'll see that over 10 years your initial $100 investment will grow to
$148.02.

Calculate Percentages

You can calculate percentages in a variety of ways using Excel, depending on
the information you already know. For instance, you can use a simple division
formula to find a comparison between two numbers. If you shipped 25 products
and only one of them was returned, you can simply enter =24/25 (or use cell
coordinates) to get a figure of 0.96, or 96%. If you want to calculate the
change between two numbers (200 to 250, for example), you can use the formula
=(250-200)/ABS(200) to get a growth rate of 0.25, or 25%.
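The same percentage math is easy to sanity-check outside Excel; here is a small Python sketch using the article's numbers:

```python
# The two percentage calculations from the text, in plain Python.
shipped, returned = 25, 1
kept = (shipped - returned) / shipped   # like Excel's =24/25
growth = (250 - 200) / abs(200)         # like =(250-200)/ABS(200)
print(kept, growth)  # 0.96 0.25
```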

Sum Of Totals Across Multiple Worksheets

Let's say you keep track of sales figures over the years using the same Excel
document. Not only do you want a record of your current year's sales, but you
also want your sales figure from the previous year at the top of each sheet.
This will require the use of the SUM function as well as some cross-sheet
calculation. Using the SUM function, =SUM(Sheet1!A1:A6) for instance, you can
take numbers from the first sheet, add them together, and display them in a
cell on the second sheet.
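The cross-sheet total can be sanity-checked with plain Python by modeling the workbook as a simple dict; the sales numbers below are invented for illustration:

```python
# A plain-Python stand-in for =SUM(Sheet1!A1:A6): total one sheet's
# column and surface it on another sheet (sheets modeled as a dict).
sheets = {"Sheet1": [120, 95, 143, 88, 210, 167], "Sheet2": []}
sheets["Sheet2"].append(sum(sheets["Sheet1"]))
print(sheets["Sheet2"][0])  # 823
```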

MATCH Function

Excel's MATCH function makes it easier to find the location of a specific
figure relative to its order in a column. For instance, if you are searching
for the number 780 in a column of 30 cells, you can type the formula
=MATCH(780,B1:B30,0) to find your exact match. If the information is located
in the 15th cell, you'll receive the result 15 from the formula. You can also
use 1 or -1 in place of the 0: with 1, MATCH finds the largest value that is
less than or equal to your figure (the column must be sorted in ascending
order); with -1, it finds the smallest value that is greater than or equal to
it (sorted descending).
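For the curious, MATCH's exact-match behavior can be mimicked in a few lines of Python (a rough stand-in only; Excel's version also supports the approximate-match modes, and the sample column is made up):

```python
# A rough Python equivalent of =MATCH(780,B1:B30,0): exact match,
# returning a 1-based position as Excel does.
def match_exact(value, cells):
    return cells.index(value) + 1

column = [512, 780, 204, 991]
print(match_exact(780, column))  # 2
```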

Round Up Or Down

If you work with figures that have multiple decimal places and need to round
up or down to a specific decimal place, Excel has two easy functions you can
use to get the job done: ROUNDUP and ROUNDDOWN. For example, take a number you
want to round up, such as 12345.678, and decide what decimal place you want to
round to. Then, the function =ROUNDUP(12345.678, 2) will automatically round
it up to 12345.68.
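Outside Excel, the same rounding can be sketched in Python; this simplified version assumes non-negative numbers (Excel's ROUNDUP actually rounds away from zero for negatives as well):

```python
import math

# Simplified ROUNDUP/ROUNDDOWN sketch for x >= 0.
def roundup(x, digits):
    factor = 10 ** digits
    return math.ceil(x * factor) / factor

def rounddown(x, digits):
    factor = 10 ** digits
    return math.floor(x * factor) / factor

print(roundup(12345.678, 2))    # 12345.68
print(rounddown(12345.678, 2))  # 12345.67
```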

The MATCH function is helpful if you want to find a specific figure in a long
column of numbers. It shows you where your query is located in relation to the
array you provide in the formula.

WORKDAY Function

WORKDAY lets you take a start date and a number of days and determine what
your end date will be, with weekends and holidays taken into account. For
example, if you enter the DATE formula =DATE(2014,3,1) into cell A1 and a
specific number of days, say 18, into cell A2, you can use the formula
=WORKDAY(A1, A2) to find your end date, which in this case is March 26, 2014.
You can also add holidays to the formula by entering the dates into cells and
adding them to the end of the formula, =WORKDAY(A1, A2, A3:A9), which will
change the end date.

Display Current Date & Time

Excel's NOW function is a quick and easy way to display the current date and
time in your spreadsheet. Type =NOW() into a cell and the date and time will
appear. This information doesn't update automatically, but rather every time
you make a calculation within the spreadsheet as well as every time you open
that particular Excel document.
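For readers who want to verify the WORKDAY example above, here is a rough Python approximation of its weekend-and-holiday skipping (a sketch of the idea, not Excel's exact implementation):

```python
from datetime import date, timedelta

# Approximation of Excel's WORKDAY: step forward (or backward) one day
# at a time, counting only weekdays that aren't in the holiday list.
def workday(start, days, holidays=()):
    step = 1 if days >= 0 else -1
    current, remaining = start, abs(days)
    while remaining:
        current += timedelta(days=step)
        if current.weekday() < 5 and current not in holidays:
            remaining -= 1
    return current

print(workday(date(2014, 3, 1), 18))  # 2014-03-26
```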

Cross-sheet calculation makes it possible to link formulas across multiple
sheets in the same workbook, so you don't have to copy and paste information
or calculate figures outside of Excel.

REPT Function

Typing the same thing over and over can quickly get repetitive, especially if
you need 32,767 instances of the same information. If you think that number is
oddly specific, you're right: it's the maximum number of repetitions the REPT
function supports, according to Microsoft. To use the REPT function, simply
take a word, number, or other entry ("Repeat", in this instance) and tell
Excel how many times you want it repeated by typing =REPT("Repeat",5) into a
cell. You can also use this function to better visualize data. For instance,
you can use symbols to represent sales figures or your number of customers and
watch your growth over time.
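The REPT idea translates directly to Python's string repetition; the sales figures below are made up purely to illustrate the text-based bar chart trick the article suggests:

```python
# String repetition behaves like Excel's =REPT("Repeat",5).
print("Repeat" * 5)  # RepeatRepeatRepeatRepeatRepeat

# Quick in-cell-style bar chart: one symbol per 100 units of sales.
for sales in (300, 500, 800):
    print("#" * (sales // 100))
```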

