
WHITE PAPER

The Role of Linux in Datacenter Modernization


Sponsored by: Red Hat
Al Gillen
August 2013

Global Headquarters: 5 Speen Street Framingham, MA 01701 USA

P.508.872.8200

F.508.935.4015

www.idc.com

IDC OPINION
Datacenters are, and always have been, investments that evolve as the industry
changes, new technologies emerge, and application needs change. However, today,
datacenter managers are facing a daunting future that may not include any
meaningful expansion of their datacenters in terms of floor space, yet their existing
datacenter assets may need dramatic modernization in preparation for tomorrow's
computing models. In detail:
The same evolving technologies that are at the forefront of today's conversation
will potentially bring dramatic change to today's datacenters. Just a few of these
technologies include the broader concept of cloud computing hosted by third-party
service providers, the eventual movement to a platform-as-a-service
(PaaS) compute model, and the need for a fully virtualized dynamic datacenter
that can federate with external resources on an as-needed basis, creating new
challenges for network and storage infrastructure and having a related impact on
the system infrastructure software layers.
Today's datacenters continue to have a diverse infrastructure in use. It is not
uncommon to have at least three, and potentially four (or more), architectures in
use. Those architectures would include Windows on x86, Linux on x86, and Unix
on RISC or EPIC. It is not so unusual to also find mainframe-class systems or
older-generation distributed systems like IBM i, HP OpenVMS, or Unisys
ClearPath systems in use. Linux may be in use aboard architectures other than
x86, including POWER and IBM z.
For most datacenters, the path toward tomorrow's compute paradigm mandates
some investment, and frequently significant investment, in standardization
and consolidation as well as a more robust adoption of enterprise virtualization
software, along with cloud system software to extend that virtualized
infrastructure into a true private cloud environment. Realistically, before an
organization can truly utilize a private cloud, it must standardize and
minimize the variability in its existing environment.
Linux has emerged as one of the key elements to a modernization program for a
datacenter. The role Linux plays is one of cross-architecture standardization to a
single operating system (OS) as well as a target platform for migrated workloads
from Unix and other operating systems.

IN THIS WHITE PAPER


This IDC white paper looks at the transition that today's datacenters must make
to adapt to and leverage the changing technologies that will be the
underpinnings of tomorrow's compute paradigm. This transition includes the
standardization of infrastructure software, the commoditization and standardization of
hardware layers, and the changing programming model that customers will utilize for
next-generation applications. The paper also examines Linux as an enabler that can
help organizations respond to and leverage these changing paradigms.

SITUATION OVERVIEW
The computer industry has a history of reinventing itself, and it's not done yet.
One of the trends that we have seen time and again in the computer industry
(one that is not surprising to anyone who has read Clayton Christensen's landmark
book The Innovator's Dilemma) is that new technologies come along and present
disruptive change to the incumbent players and products. New products typically win
on the basis of low price at a vastly lower level of functionality when competing
against the previously dominant products. Over time, these new low-cost solutions
move upmarket, driving incumbent players even further upmarket.
History also shows that every 10 to 20 years, a fundamental architectural shift takes
place for compute infrastructure. These shifts have often manifested themselves as a
competitive solution that at first appears to be vastly inferior across multiple
metrics, but that initial inferiority comes with desirable attributes that the market may
find to be unavailable, or unaffordable, from the incumbent platforms.
Over time, the competitive platforms improve and evolve, although not necessarily
into an exact replacement for the previous solution. Indeed, these platforms end up
inventing new paradigms, and they can be quick to embrace new ways to solve old
problems, which over time allows these new platforms to differentiate themselves in
multiple dimensions and eventually brings far greater functionality than the solution
they followed to market.
At the same time, in the computer industry, it is rare for new solutions to fully
eradicate the need for previous solutions. Indeed, if new entrants were to
methodically chase down incumbents, it would reduce the agility and innovation in the
new entrants and sentence them to early obsolescence. Instead, each new
technology tends to both supplant and supplement existing solutions. There are
parallels for this type of replacement/improvement cycle: it is not unlike how radio
supplanted and competed with newspapers and how television subsequently
supplanted and over time competed more and more directly with radio.
IDC segments modern computing into three separate paradigms:
1st Platform: The 1st Platform is mainframe- and host-centric computing; user
access is through fixed-function terminals. The 1st Platform dates back to the
dawn of commercial information technology systems, emerging in the 1960s. As
will always be the case, the emergence of subsequent platforms has not totally
disenfranchised the 1st Platform, and this solution continues to be a mainstream
compute solution and is expected to continue to serve a critical role in the future.
Today, it is possible to access 1st Platform solutions with modern devices
including smart mobile devices, and in many cases, the application functionality
still housed on 1st Platform solutions has been (or is being) wrapped in
virtualized or cloud computing interfaces. Classic 1st Platforms included early
mainframe-class systems and minicomputer-class systems designed for
terminal-connected multiuser environments. Examples of classic 1st Platforms
would also include IBM's System/36, System/38, and AS/400; DEC VAX/VMS;
and even early Unix systems.
2nd Platform: The 2nd Platform involves distributed computing systems, based
on RISC, x86, and EPIC (Itanium-based) architectures, typically accessed by
PCs. The 2nd Platform initially came to market as an extension to client-side
technology but also leveraged many late-generation 1st Platform solutions
including IBM z/OS, IBM i, and HP OpenVMS. But the 2nd Platform is more
commonly associated with more recent platforms such as Unix servers, Windows
Servers, and NetWare servers, all of which were usually paired with x86 PCs for
primary access over local area networks built on Ethernet. The most recent
entrant into the 2nd Platform world was Linux, which matured very late in the life
cycle of that technology wave. Particularly in the case of Windows and Linux,
these technologies followed a classic "innovator's dilemma" and came to market
lacking the functionality, scale, and reliability/performance attributes that
competitive environments offered. Yet by delivering good-enough functionality in
a low-cost, good price/performance package, both products evolved into full-blown
server platforms, first through scale-out deployments and subsequently through
a mixture of scale-out and scale-up solutions that are deployed based on the
application workload's scale requirements, the desired availability attributes, and
the language that was used to construct the application itself. The access device
of choice for 2nd Platform solutions is a personal computer, with access today
quickly being extended to support smart mobile devices as well.
3rd Platform: The fast-growing smart mobile device market, best illustrated by
smartphones and highly mobile, always-connected tablet devices, represents
the arrowhead of the 3rd Platform. But the 3rd Platform is not only about
end-user access devices; it is empowered by a bidirectional value-add that these
devices bring to the market. Typically, these devices are data consumers,
frequently accessing resources hosted within cloud compute environments
housed aboard server farms usually (but not always) located in some faraway
locale, but they also serve as the hub for social networking. The cloud computing
world also has to replicate services across multiple sites (and regions) to prevent
slowdowns in performance that would otherwise impact the end-user experience.
These same smart devices then capture data, or have their activities captured,
and that data is harvested and built into big data solutions that seek to predict
consumer and business user behavior. These big data stores are mined through
next-generation analytic tools and, in turn, rendered back to commercial and
consumer customers through cloud services. The very nature of application
development and deployment is heavily influenced by these new data sets and
the application delivery paradigms that will dominate cloud solutions.


Figure 1 provides a graphical depiction of the 1st, 2nd, and 3rd Platform waves.

FIGURE 1
The 1st, 2nd, and 3rd Platforms

[Figure: three stacked platform waves. 1st Platform: mainframe and terminal; millions of users, thousands of apps. 2nd Platform: LAN/Internet, client/server, PC; hundreds of millions of users, tens of thousands of apps. 3rd Platform: mobile broadband, big data/analytics, social business, and cloud services on mobile devices and apps, delivering services, information, content, and experiences; billions of users, millions of apps and intelligent industry solutions, trillions of things; audiences span CIOs, LOBs, enterprises, SMBs, service providers, consumers, and emerging markets.]
Source: IDC, 2013

As noted previously, for an emerging platform to be successful when compared with
an incumbent platform, the new entrant must bring new thinking and different,
and usually less expensive, ways of solving both old and new problems.


The Impact of the 3rd Platform


The 3rd Platform is not specific to any size class of customers. Indeed, 3rd Platform
deployment affects small and midsize businesses just as it affects large businesses and
the datacenters those large businesses operate. What is different among the size
classes is that small customers are likely to respond to the disruption caused by the
3rd Platform by migrating to software as a service (SaaS). Midsize organizations are
likely to migrate some applications directly to a SaaS solution and maintain other
solutions for a long time into the future.
Large organizations, which commonly have a far more substantial IT investment,
including data and applications that are critical to the business itself, are usually left
with fewer options for a short-term adoption of the 3rd Platform. Indeed, many large
organizations are considerably more likely to make an evolutionary movement to
3rd Platform architectures for some applications while opting to modernize other
workloads so they are compatible with an evolution to a virtualized, private cloud and,
ultimately, infrastructure as a service (IaaS) at some point in the future. Either way,
large organizations are facing a wave of modernization that builds on a standardized
software stack that is virtualized and increasingly modular in nature.

Datacenter Modernization
While datacenter modernization is a key step in the move to embrace the 3rd Platform, the
reality is that datacenter modernization is an activity that has been going on for almost as
long as datacenters have been in use. The precise activities that are being completed as
part of a datacenter modernization project have changed over the years, and today's
activities are heavily focused around consolidation and standardization. Datacenter
modernization may include some or all of the following activities:
Hardware standardization. In recent years, we have seen many organizations
working toward standardizing their hardware on x86 servers in either rack-optimized
or blade configurations. The belief among many organizations is that x86 offers
good price/performance attributes. Availability and scalability of x86 servers trailed
competitive offerings in the past, but today, Intel, AMD, and server OEMs are
making good progress in closing the gap with competitive platforms. While we do
not expect that x86-based solutions will ever fully match RISC- and EPIC-based
systems in terms of scale and availability, customers are finding that with the right
hardware architectural design and the right system software stack, they can achieve
suitable levels of reliability and availability to meet the majority of their needs.
Software standardization. Customers are realizing that software complexity
leads to higher costs. The best way to reduce complexity is to reduce the
variability of software installations. It is difficult, if not impossible, to reduce
the number of software stacks to just two or three combinations, but for many
organizations, where there may be dozens, if not hundreds, of combinations of
infrastructure software stacks in use, even reducing that matrix down to 10 or 15
combinations represents a massive achievement. Organizations that are able to
further standardize at least some layers of their software stack are likely to see
incremental benefits that pay back in meaningful ways. (A brief sketch of a stack
audit that measures this variability follows this list.)

2013 IDC

#242485

Virtualization standardization. Virtualization software has had a dramatic
impact on modern datacenters. Today, most organizations have standardized on
one or two hypervisors as the foundation layer for their deployments. The
movement to virtualization software has the side effect of "standardizing" the
underlying hardware since the hypervisor layer allows each server to appear
identical to the operating system, assuming all the servers are uniformly either
Intel or AMD based. The virtualization layer has also made it possible for
organizations to standardize, physically or virtually, at other layers, including
management as well as peripheral devices such as network and storage.
Operating system and infrastructure software standardization. An analog
to virtualization software standardization is the standardization of
related infrastructure software layers, including the base operating system.
Using a minimal number of operating system products (and vendors), and, even
within a given vendor, a limited number of release versions, reduces the support
matrix, lowers operational costs, simplifies life-cycle management, and, in turn,
standardizes layered software product selections.
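The payoff from this kind of standardization is easier to manage when it can be measured. What follows is a minimal sketch, in Python, of a fleet audit that counts distinct stack combinations; the hostnames, stack fields, and inline inventory are hypothetical stand-ins for data that would normally come from a CMDB or configuration management tooling.

# A minimal sketch of a stack-variability audit. All records below are
# hypothetical; a real audit would pull them from a CMDB or configuration
# management database rather than hardcoding them.
from collections import Counter

# Each record: (hostname, os_name, os_release, hypervisor, middleware)
inventory = [
    ("app01", "RHEL", "6.4", "KVM", "JBoss EAP 6"),
    ("app02", "RHEL", "5.9", "KVM", "JBoss EAP 5"),
    ("web01", "Windows Server", "2008 R2", "Hyper-V", "IIS 7.5"),
    ("db01", "RHEL", "6.4", "KVM", "Oracle 11g"),
    ("db02", "RHEL", "6.4", "KVM", "Oracle 11g"),
]

# Count each distinct stack combination (all fields except the hostname).
stacks = Counter(record[1:] for record in inventory)

print(f"{len(stacks)} distinct stack combinations across {len(inventory)} hosts")
for stack, count in stacks.most_common():
    print(f"  {count} x {' / '.join(stack)}")

Tracked over time, the count of distinct combinations gives a standardization program a concrete progress metric as it falls toward the 10 or 15 combinations cited above.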

FUTURE OUTLOOK
One of the key trends over the past decade has been the continued growth of Linux
server operating environment (SOE) subscriptions and deployments. Linux emerged
in the 1990s and was initially avoided by many datacenter managers, but today, Linux
has already become a key technology deployed in enterprise datacenters as well as
in service provider datacenters. Customers are finding measurable business value
from deploying Linux as one of their primary SOEs. IDC's primary research on
operating environments finds that Linux has become one of two operating systems
that will serve as the basis for most of the 3rd Platform deployments in the future.

Customer Use of Linux


IDC interviewed the IT manager responsible for server and storage technology at a
large United States-based manufacturing company that operates 100 sites across
68 countries and generates $30 billion per year in revenue. As a
manufacturer that specializes its products for various customer segments and
markets, it sells a diverse collection of products that collectively make up 65,000
unique SKUs. This organization typifies the modernization activities under way
today at many large IT organizations, and the IT manager is responsible for the
operational aspects of the storage and server architecture used on the company's
open systems.
The company's experience with Linux dates back about six years, when the IT team
decided to migrate to Red Hat to simplify application certification. One key vendor,
JD Edwards, helped precipitate that movement. Says the IT manager, "We tend to
run pretty lean. If you look at a lot of companies our size, they tend to have a much
bigger IT organization. We try and look at how cost effective we are, and a lot of
times, we are told we are one of the most cost-effective teams they have seen."


Globalization of Corporate IT
But there is another driver, the IT manager adds. "Three or four years ago, we got a
new CIO. At the time, the way IT operated was not global. We might have plant IT
people doing their own thing, and we have 300 plants around the world. We might
have salespeople and different business units doing their own thing. There has been
a big focus over the past three to five years to globalize IT."
Another driver was the plan to do a global SAP implementation. The company has
two main datacenters in the Midwest. Those datacenters are about a mile apart and
are run as active-active configurations, synchronized over dedicated fiber that the
company ran between the sites. The company standardized on blade servers four
years ago and has been rolling out new deployments on that architecture. The two
datacenters are used for mirroring, clustering, and replication and are currently
provisioned to run at about 50% utilization. The datacenter includes HP hardware,
NetApp and EMC storage, and Cisco switch technology. Today, there are about
2,000 employees around the world using the SAP system.
In parallel with the move to the blade architecture, the company implemented a
virtualization strategy, and now, it is about 70% virtualized, accounting for 300 physical
servers of the company's overall inventory. Today, the company has three primary
operating systems in use, with the most heavily used products being Red Hat Enterprise
Linux and Windows Server 2008.
The company's modernization efforts have heavily landed on Red Hat Enterprise
Linux. Where Linux accounted for about 50 servers four years ago, today, the
company is at 1,400 Red Hat Enterprise Linux servers, including both virtual and
physical instances, thanks in part to the SAP rollout. In the corporate datacenter,
the company has just under 5,000 virtual and physical servers in use, while globally,
the number expands to a total of 8,000 servers. "In general, we try and stay fairly
current operating system-wise," says the IT manager.

Linux Forecast: Growing Strong


Both Linux and Windows installations will grow, but Red Hat Enterprise Linux growth is
on a faster trajectory thanks to the SAP rollout. The company expects to add 1,000
Red Hat Enterprise Linux servers in the next year, primarily intended to support the SAP
workloads. Longer term, Linux growth will scale back to a more sustainable rate.
While the company never forgets that it is a conservative manufacturing company,
its IT adoption does not lag behind. Says the IT manager, "My philosophy is that you
don't want to be on the latest version, but we do want to stay current. We do have
legacy, but it's because we can't get the applications to move. Part of our job is to
look at new technology." He adds, "Being on more current releases saves money
over the long term. It costs us a lot more when we have old hardware."
According to the IT manager, the company has a long-term vision, but the length of
the timeline is "probably not as far out as a lot of people would want to think. For real
detailed planning, it's about a year at a time, because of our internal budget planning.
We do speculate on where we will be in three to five years. We were just asked how
much we were going to spend out to 2022; that is pretty difficult when it is 2013.
Anything beyond two years is pure speculation." Storage is a little more predictable, if
not robust, with growth running at about 50% to 60% year over year.

Cloud Use Is Clear


Today, the modernization efforts are heavily focused on reducing the application
sprawl and on consolidating on two major operating systems running on a virtualized
infrastructure. Some business units are consuming SaaS offerings from Siebel CRM
Systems, although those units will eventually be consolidated on the SAP
implementation. The overall goal is to standardize on SAP and Red Hat Enterprise
Linux as broadly as possible.
In addition, there are divisions that develop software products, and those software
products are delivered as software as a service.
In a third dimension of cloud adoption, the company has built a private cloud offering
for internal customers. The cloud offers two operating system options: Red Hat
Enterprise Linux 6 and Windows Server 2008. Customers that request Windows
Server 2003 or Red Hat Enterprise Linux 5 are handled individually, so IT can
determine why an earlier-generation operating system is being requested.
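This kind of exception handling can be enforced directly in the provisioning layer. Below is a minimal sketch of such a gate; the template names mirror the options described above, but the function and its review queue are hypothetical illustrations, not the company's actual tooling.

# A sketch of an exception gate for a private cloud catalog. Current
# templates provision automatically; earlier-generation requests are held
# so IT can ask why the older OS is needed. Names are illustrative.
CURRENT_TEMPLATES = {"Red Hat Enterprise Linux 6", "Windows Server 2008"}
LEGACY_TEMPLATES = {"Red Hat Enterprise Linux 5", "Windows Server 2003"}

def handle_request(requester: str, template: str) -> str:
    if template in CURRENT_TEMPLATES:
        return f"auto-provisioned: {template} for {requester}"
    if template in LEGACY_TEMPLATES:
        # Not an outright rejection: the request is held so IT can
        # determine why an earlier-generation OS is being requested.
        return f"held for review: {template} requested by {requester}"
    raise ValueError(f"unknown template: {template}")

print(handle_request("finance", "Red Hat Enterprise Linux 6"))
print(handle_request("plant-ops", "Windows Server 2003"))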
The IT department recognizes that it has to be competitive to win next-generation
deployments, and it views Amazon and Windows Azure as competitors against which
it needs to measure itself. "We are trying to show where the costs are going. We are
competing with Windows Azure and Amazon." The IT manager adds, "Typically we
are lower cost than them."

CHALLENGES/OPPORTUNITIES
Datacenter modernization is a logical goal to pursue, but logic and practical realities
are not always one and the same. Challenges and opportunities associated with
datacenter modernization include:
Standardization requires change. The end game, a standardized software
stack running on standardized virtual machines, offers lower capex and opex.
However, there is a cost associated with migrating to a standardized
infrastructure. The opportunity for customers is that this investment in
infrastructure improvement and modernization comes with a payback.
A modernized infrastructure is more compatible with public cloud. When
going to a service provider for an IaaS or a PaaS solution, the more modern and
standardized a company's IT is, the easier (and less expensive) it will be to move
to a public offering. While not every organization is in a rush to move to public
cloud infrastructure, having the technology in place to do so when the time
comes can be an advantage.
It is about more than the OS. Standardization and modernization apply to the
hardware, virtualization infrastructure, and operating system. But modernization
can go well beyond those basic infrastructure layers. Modernization means
deploying modern application frameworks, management tools, and cloud system
software for cloud orchestration and management. Today, the industry is
increasingly moving toward open source-based application frameworks and
open source cloud system software layers such as OpenStack and KVM (see the
sketch after this list).
Modernization requires discipline. As the customer case study presented in
this paper indicates, it is not enough to just offer automated provisioning in a
private cloud infrastructure. It is important to encourage internal customers to
move to the most current operating systems, and if internal customers make a
request for an aging solution, it is important to maintain the discipline to move
them to the right solution.
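As a concrete illustration of working with the open source cloud layers named above, the following minimal sketch enumerates the instances running in an OpenStack private cloud and the images they were built from, a natural first step for the kind of discipline just described. It assumes the openstacksdk Python package and a clouds.yaml entry named "private"; both assumptions are illustrative rather than drawn from this paper.

# A sketch of a modernization-discipline check against an OpenStack
# private cloud: list each instance and the image it was built from so
# that instances on retired images can be flagged for follow-up.
# Assumes openstacksdk is installed and clouds.yaml defines a cloud
# named "private" (both illustrative assumptions).
import openstack

conn = openstack.connect(cloud="private")

for server in conn.compute.servers():
    # server.image typically carries the image ID; it is empty when an
    # instance boots from a volume instead of an image.
    image = server.image.get("id") if server.image else "boot-from-volume"
    print(f"{server.name}: image {image}")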

CONCLUSION
IT modernization extends the capabilities and boundaries that a platform can support.
But it also extends the flexibility and applicability of a platform's future use scenarios.
IDC believes that as the industry evolves, progressive customers will modernize and,
in the process, standardize their IT infrastructures.
A modern IT infrastructure is better aligned with cloud computing and is more able to
move to a PaaS or an IaaS deployment scenario, when and if an organization wants
to move there. Modern IT infrastructure is more likely to support key technologies
such as datacenter-to-cloud VPN connectivity, directory federation, and storage
migration.
For most datacenters, the path toward tomorrow's compute paradigm mandates some
investment, and frequently significant investment, in standardization and
consolidation as well as a more robust adoption of enterprise virtualization software,
along with cloud system software to extend that virtualized infrastructure into a true
private cloud environment.
Linux has emerged as one of the key elements to a modernization program for a
datacenter. The role Linux plays is one of cross-architecture standardization to a
single operating system as well as a target platform for migrated workloads from Unix
and other operating systems. Organizations with aging infrastructures need to
consider the necessity and urgency of IT modernization.

Copyright Notice
External Publication of IDC Information and Data: Any IDC information that is to be
used in advertising, press releases, or promotional materials requires prior written
approval from the appropriate IDC Vice President or Country Manager. A draft of the
proposed document should accompany any such request. IDC reserves the right to
deny approval of external usage for any reason.
Copyright 2013 IDC. Reproduction without written permission is completely forbidden.
