Getting Started with Virtualization
An Internet.com Networking eBook
This content was adapted from Internet.com's ServerWatch, Enterprise Networking Planet, IT Career Planet, CIO Update, and Enterprise IT Planet Web sites. Contributors: Ryan Bass, Lynn Haber, Brian Gardner, Richard Adhikari, and Rafael Hernandez.
Getting started with Virtualization, An Internet.com Networking eBook. 2009, Jupitermedia Corp.
The reality is that not all tasks are well suited to run under a virtual OS. The usual culprits are software that's highly resource intensive and I/O-bound applications. Similarly, there's software that just doesn't want to play nice for whatever reason and is best left to run on its own machine, where it can happily run its course. Software response time under virtualization is also a key point to consider: if you start loading up a number of system images, each running a few tasks, things can get a little hairy. Some software isn't necessarily resource intensive but requires snappy system response to perform at its best, so when users start leaning on the application you may find it performing worse than it would on minimally specced hardware.
The fact of the matter is that there is a performance penalty with any action carried out on a virtualized system. An action that's inconsequential on a low-power machine can suddenly become burdensome on even the most powerful of servers. The prime candidates for consolidation are generally low-usage, low-resource-intensity applications.
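To make that selection concrete, here is a minimal sketch, in Python, of flagging low-usage servers as consolidation candidates. The server names, utilization figures, and thresholds are all made-up illustrations, not data from any real environment:

```python
# Hypothetical utilization data: average CPU % and peak memory (GB)
# collected for each physical server over a monitoring window.
SERVERS = {
    "web01":  {"avg_cpu": 6.0,  "peak_mem_gb": 1.5},
    "db01":   {"avg_cpu": 72.0, "peak_mem_gb": 28.0},
    "file01": {"avg_cpu": 4.0,  "peak_mem_gb": 2.0},
}

# Thresholds are illustrative; tune them to your own environment.
CPU_THRESHOLD = 20.0      # average CPU % below this counts as "low usage"
MEM_THRESHOLD_GB = 4.0    # peak memory below this counts as "low footprint"

def consolidation_candidates(servers):
    """Return the names of servers that look safe to consolidate."""
    return sorted(
        name for name, stats in servers.items()
        if stats["avg_cpu"] < CPU_THRESHOLD
        and stats["peak_mem_gb"] < MEM_THRESHOLD_GB
    )

print(consolidation_candidates(SERVERS))  # ['file01', 'web01']
```

A real pass would also weigh the I/O and response-time concerns described above; raw CPU and memory numbers are only the first filter.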
You're more likely than not to go back to the drawing board with quite a few consolidation plans once things aren't going as smoothly as you had hoped. Virtualization overhead can get ugly, and it's not going to be cleared up anytime soon, but you can bet the large software and hardware vendors are working on this very problem.
AMD and Intel are both addressing the performance problems that come with virtualization. Their first and most logical step is to reduce the effect the hypervisor has on a system's performance. AMD's SVM and Intel's VT-x have improved how the hypervisor behaves when accessing system resources. Both companies also have another trick up their sleeves with regard to the CPU performance penalties a system can encounter when managing the memory requirements of multiple virtual machines. AMD's Nested Page Tables, found on quad-core Opterons, and the somewhat similar Intel EPT, soon to be found on the upcoming "Nehalem" CPUs, dramatically limit the performance hit of memory page table access. These hardware additions, along with inevitable software tweaking over time, will lead to performance increases (or less of an overall system performance loss) when running virtual machines. The performance penalties currently incurred by systems running multiple virtual machines can cause quite a bit of frustration when you're trying to balance cost-cutting against any instabilities that might arise from loading up a server full of necessary applications. Thankfully, companies are improving their products in leaps and bounds. In a few years, close to native performance may even become the norm.
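To see why nested paging hardware matters, consider the worst-case cost of a TLB miss when both guest and host use multi-level page tables: every guest page-table entry, plus the final guest-physical address, must itself be translated through the host's tables. The commonly cited worst case for x86-64's four-level tables is 24 memory accesses per miss. A back-of-the-envelope sketch (the formula is the standard worst-case count, not something from this article):

```python
def nested_walk_accesses(guest_levels: int, host_levels: int) -> int:
    """Worst-case memory accesses to resolve one TLB miss under nested
    paging. Each of the n guest page-table entries, plus the final
    guest-physical address, needs an m-level host walk, giving
    n*m + n + m total accesses."""
    n, m = guest_levels, host_levels
    return n * m + n + m

# Four-level tables on both sides (x86-64):
print(nested_walk_accesses(4, 4))   # 24
# Shallower two-level tables, for comparison:
print(nested_walk_accesses(2, 2))   # 8
```

Hardware page-walk caches in NPT and EPT exist precisely to keep that 24-access worst case from being paid on every miss.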
As enterprises virtualize their data centers to cut costs and consolidate their servers, they may be setting themselves up for big trouble.
According to a disaster recovery research report from Symantec based on surveys of 1,000 IT managers in large organizations worldwide, 35 percent of an organization's virtual servers are not included in its disaster recovery (DR) plans. Worse yet, not all virtual servers included in an organization's DR plan will be backed up. Only 37 percent of respondents to the survey said they back up more than 90 percent of their virtual systems. When companies virtualize, they need to overhaul their backup and DR plans; the survey found that 64 percent of organizations are doing so. "That's no surprise, because virtualization has had a huge impact on the way enterprises do disaster recovery," Symantec senior product marketing manager for high availability and disaster recovery Dan Lamorena told InternetNews.com.

So why are virtual servers being left out of DR plans? Or, if they're included, why aren't they being backed up? It's because enterprise IT just does not have the right tools to back up virtual servers. The biggest problem for 44 percent of North American respondents was the plethora of different tools for physical and virtual environments. There are so many that IT doesn't know what to use and when. Another 41 percent complained about the lack of automated recovery tools. Much of the disaster recovery process is manual, although VMware has a tool to automate the run book. Another 39 percent of respondents said the backup tools available are inadequate.
Hewlett-Packard, IBM, CA, and smaller vendors such as ManageIQ, Avocent, and Apani offer tools to manage both the virtual and physical environments. And companies like Hyperic are bringing out new tools. However, virtual server management tools, being relatively new, are not as sophisticated as their counterparts for the physical environment. They also have not been around long enough for users to be familiar with them. Provisioning, or setting up, virtual machines from physical ones and vice versa can also be a problem, and tools for this have only recently emerged. "Virtualization makes some aspects of backup and disaster recovery more difficult," Symantec senior product marketing manager for NetBackup Eric Schou told InternetNews.com. "IT shops are still struggling with the steep learning curve." Porting over solutions from the physical environment won't work, Schou said. "IT shops need to get solutions that are finely tuned for virtualization," he added.
Many disaster recovery tests fail simply because "people didn't do what they were supposed to do," Lamorena said. This means that much of recovery is still a manual process, and companies must begin looking at automation, he added. Another cause is that tests are not run frequently enough. That's because "when you run a test, it disrupts employees and customers," Lamorena said. He added that 20 percent of the respondents said their revenue is impacted by DR tests, so "the tests cause the same pain to their customers as if they had a real disaster." Finally, the survey found that top-level executive involvement in DR planning has fallen. "Last year, the C-level involvement on disaster recovery committees was 55 percent; this year, it's 33 percent," Lamorena said. (C-level executives are CIOs, CTOs, and CEOs.) Lamorena finds the reduction in top-level involvement disturbing because it could lead to more problems with DR. "That's a huge drop, and we've been thinking about this day and night," he said. "What's alarming is, companies may be getting a little lax and don't think they'll be affected by a disaster."
As an IT manager, you've read about all the advantages that come with virtualization. The next step is to feel comfortable tackling a migration to a virtual infrastructure and making sure it is protected. To begin, it is important to understand what you need to do to plan a virtual infrastructure and choose the appropriate data protection for it. Identifying the capabilities and limitations of data protection within your virtual infrastructure is one of the most critical tasks. For simplicity, this article limits the virtualization platform example to VMware ESX. The process is the same for Microsoft Hyper-V, Virtual Iron, and others until you get to the end and have to determine the right implementation.
Most applications can be virtualized. You just have to decide on a reasonable set of applications and then compile the following information:

1. Identify characteristics of selected apps under load

It is absolutely critical that you characterize these applications under their heaviest expected load, or you'll start running out of resources unexpectedly when you implement your virtual infrastructure.

Total memory footprint: How much memory does the application use at peak load? If the application "leaks" memory (its memory footprint grows even under constant load), you'll need to allow room for that as well.

Total CPU utilization: How many CPUs, and at what percentage used, at peak load? Don't forget to note the type of CPU you used when you did your measurements.
Network bandwidth utilization: Network bandwidth used by this application at peak load. Remember to account for both directions of network traffic.

Storage network throughput (SCSI, FC, iSCSI, NAS): The same thing you just did for your messaging network, measured as both input and output.

Disk reads and writes: The disk activity that this application requires at load. There are other disk load parameters that may need to be characterized as well, depending on the application.

Memory bus utilization estimate: Memory bus available bandwidth minus four times the total I/O. Years of empirical data have upheld this useful rule of thumb. This can be somewhat difficult to get, since it is not always easy to identify the memory bus speed of a particular system.

2. Identify load patterns and recovery requirements for virtualized applications

Is there a window during the day or night when the applications could reasonably be shut down and backed up? Is there a window during the day or night when the total load on the ESX physical server is low enough that backups can be performed without negatively impacting the running apps? If there is no such window for both the application and the ESX server, you will need to select a proxy backup method. Do you need to be able to recover individual files on a regular basis? If so, you will most likely need to run a backup agent directly within a virtual machine.

If you've designed and implemented a few data protection architectures, the requirements-gathering process was probably quite familiar to you. It doesn't change much for virtual infrastructures. Once you understand your application and data protection requirements, there are some simple decisions to make:

Agents in each virtual machine
This is the simplest decision, since it mirrors what you are already doing with your physical infrastructure. The strengths of this approach:

- Low disruption to existing workflows
- Easy application backup and recovery
- File-level recovery

There are two significant weaknesses to this approach:

- Total cost of backup software agents
- Need to manage load on the ESX server when running backups

Agent in Hypervisor Service Console

This is pretty simple as well. It only requires a single Red Hat Linux agent for each ESX server.

Strengths:
- Low agent cost
- High-performance image backup and recovery (only working with vmdk files)

Weaknesses:
- Need for some scripting
- Lack of file-level recovery
- Lack of application awareness

Proxy backup

1) VMware Consolidated Backup (VCB): VCB gives you the ability to use a Windows proxy host to back up Windows virtual machines.

Strengths:
- Almost entirely eliminates load on virtual machines and the ESX server during backup
- Enables hot virtual machine backup

Weaknesses:
- Lack of non-Windows platform support
- Some recovery limitations
- VCB license cost

2) Storage server snapshots: This approach is quite simple to manage once it is implemented, if you have storage that provides the functionality. You can connect another host to the storage to manage the snapshots for backup and recovery.
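The decision process above can be sketched in a few lines of Python. The requirement names and the fallback ordering here are illustrative assumptions drawn from the trade-offs just described, not part of any vendor tool:

```python
def choose_backup_method(needs_file_recovery: bool,
                         has_backup_window: bool,
                         windows_guests_only: bool,
                         has_snapshot_storage: bool) -> str:
    """Rough sketch of the agent-vs-proxy decision described above."""
    if needs_file_recovery:
        # Regular file-level restores push you toward in-guest agents.
        return "agent in each virtual machine"
    if not has_backup_window:
        # No quiet window on the app or the ESX host: offload to a proxy.
        if windows_guests_only:
            return "proxy backup via VMware Consolidated Backup (VCB)"
        if has_snapshot_storage:
            return "proxy backup via storage server snapshots"
        return "agent in each virtual machine"  # fallback: accept the load
    # A usable window makes the simple service-console image backup viable.
    return "agent in hypervisor service console"

print(choose_backup_method(False, False, True, False))
# → proxy backup via VMware Consolidated Backup (VCB)
```

In practice you would run this reasoning per application, since one ESX host often ends up with a mix of methods.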
Consolidation projects deliver initial benefits quickly, but as you move along into production and heavy database apps, the ROI is not as clear or as quickly forthcoming. Enterprises, therefore, must take a long-term view. Ritter recommends finding a key metric to measure early in the process that takes flexibility and agility into account. This way, such metrics become standard trackers and in time can be the basis of a business case. Ritter was emphatic about this, noting, "If you don't put the metric in place early to measure the return, it's going to bite you early." There are multiple ways to approach this measurement. High availability and disaster recovery, for example, are critical issues, and in some cases virtualization makes it financially feasible for organizations to set up a failover site if they couldn't before. Benefits such as these should be quantified and taken into account. Cost reduction is another way to go; one company used cost prevention as justification for its initial investment. Other things to bring up include:

- Shorter maintenance windows
- Some servers, such as Exchange, run better when virtualized
- Extending the life cycle of hardware as a virtual machine (this is pretty much a no-brainer, as it takes advantage of hardware already in play, and most likely already depreciated)
- The ability to start and stop hardware

Ritter provided one big caveat: process and procedure must keep up with virtualization. Oftentimes, he explained, it's not the hardware holding things up but the human processes around it. At first blush, provisioning goes down from weeks to hours. However, most of the provisioning time organizations face is outside of the actual virtualization process (e.g., getting the purchase order in and the time spent getting the server in the rack). The process that comes before the actual virtualizing must be fixed, or the true impact of virtualization will not be felt.
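As a back-of-the-envelope illustration of quantifying such a metric, one might compare provisioning lead times before and after virtualization. Every figure below is hypothetical:

```python
# Hypothetical provisioning timeline, in days, for one new server.
before = {"purchase_order": 10, "shipping": 14, "racking": 3, "os_install": 2}
after  = {"purchase_order": 10, "shipping": 0, "racking": 0, "vm_provision": 0.2}

def total_days(steps):
    """Total lead time across all provisioning steps."""
    return sum(steps.values())

saved = total_days(before) - total_days(after)
print(f"lead time: {total_days(before)} -> {total_days(after)} days "
      f"({saved:.1f} days saved)")
# Note how the purchase-order step still dominates after virtualization --
# exactly the human-process bottleneck described above.
```

Tracking a number like this from day one gives the business case a concrete, repeatable metric rather than an after-the-fact estimate.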
Strengths:
- Simplicity
- Low application server and ESX server overhead

Weaknesses:
- Cost of snapshot-enabled storage
- Complexity of initial deployment (varies widely depending on implementation)

What does implementing the right protection solution in a virtual environment do for you? With virtualization you can do things like physical-to-virtual machine conversion, and, in some cases, you can take advantage of your existing backup images to migrate to a virtual infrastructure. If you plan your data protection, you will never have to do a bare-metal disaster recovery again, since virtual storage file systems are simple, single files. Recovering an entire system can be as simple as recovering a single file. Site disaster recovery can be greatly simplified, since you can bring a site up quickly on lower-end physical systems and add capabilities as needed without interrupting operations. You will still need to develop a site disaster recovery plan, but there are many available resources to help you do so. Clustering virtual machines with VMware Virtual Infrastructure is much easier and less expensive than with physical clusters. Virtual appliances can make purchasing, installing, configuring, and updating applications much simpler. In some cases they can also help simplify site disaster recovery.
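Because a virtual machine's disk is a single file, a whole-system restore can be sketched as an ordinary file copy. The directory layout and the .vmdk name below are hypothetical, not a VMware convention this article prescribes:

```python
import shutil
from pathlib import Path

def restore_vm_disk(backup_dir: str, datastore_dir: str, disk_name: str) -> Path:
    """Restore a VM by copying its virtual disk file back into the
    datastore -- the "recovering a single file" case described above."""
    src = Path(backup_dir) / disk_name
    dst = Path(datastore_dir) / disk_name
    dst.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(src, dst)          # copy2 preserves timestamps as well
    return dst

# Hypothetical usage:
# restore_vm_disk("/backups/nightly", "/vmfs/volumes/datastore1/web01",
#                 "web01.vmdk")
```

A real restore would also re-register the VM with the hypervisor, but the data-recovery step really is just this copy.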
The release to manufacturing (RTM) version of Hyper-V made its debut in June 2008. This final edition of Hyper-V includes security, stability, performance, and user experience improvements. With such a late start, Microsoft is going to have a tough time capturing a sizable portion of the enterprise virtualization market, but small- to medium-size organizations are sure to jump on board the Hyper-V train as they slowly begin to migrate from Server 2003 to Server 2008.
We're going to take a look at what to do if you've already got virtual machines (VMs) created in the Hyper-V beta or release candidate environments, and how to get started with Hyper-V if you're a beginner. If you've already been tinkering with the release candidate or beta editions of Hyper-V, here's what you need to know, version by version, if you want to continue using those VMs:
If you are running a VM containing a pre-release version of Windows Server 2008 created with a beta version of Hyper-V, you are out of luck and will need to re-create the virtual hard disk file from scratch. If you created a VM containing a final release version of Windows Server 2008, then follow the steps here to get it working in the RTM version of Hyper-V. If you created VMs with RC0, all you have to do is shut down the guest OS and merge any snapshot files. If you've got VMs created with RC1, then you don't have to do anything special.

We've all been barraged with the benefits of virtualization for several years now, but in case you forgot, here are three good reasons to go virtual: server consolidation, business continuity/disaster recovery, and testing/development. Hyper-V makes it so easy there is really no reason to hold back. Even if you run Hyper-V
solely for testing and development, it is well worth it. The biggest barrier to getting started with Hyper-V is hardware. Unfortunately, you won't be able to use older equipment because Hyper-V requires a 64-bit processor with hardware-assisted virtualization and hardware data execution protection.
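Whatever hypervisor you choose, it's worth verifying that a processor actually offers hardware-assisted virtualization before buying or repurposing it. On Linux, for instance, the vmx (Intel VT-x) and svm (AMD-V) flags in /proc/cpuinfo reveal it; the parsing helper below is our own sketch, not a standard tool:

```python
from pathlib import Path
from typing import Optional

def virt_extension(cpuinfo_text: str) -> Optional[str]:
    """Return 'vmx' (Intel VT-x), 'svm' (AMD-V), or None if the CPU
    advertises no hardware virtualization support in its flags line."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            flags = line.split(":", 1)[1].split()
            if "vmx" in flags:
                return "vmx"
            if "svm" in flags:
                return "svm"
    return None

if __name__ == "__main__":
    info = Path("/proc/cpuinfo")
    if info.exists():  # Linux only; on Windows use a tool like coreinfo
        print(virt_extension(info.read_text()) or "no hardware virtualization")
```

Remember that the feature can also be disabled in the BIOS, so a present flag is necessary but not always sufficient.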
Installing Hyper-V

If you've got the right hardware, follow these steps to get Hyper-V installed:

1. Set up a Windows Server 2008 x64 server.
2. If the server software didn't already come with the RTM version of Hyper-V, download and install it.
3. Open Server Manager.
4. Click on Roles > Add Roles > Next > Select Hyper-V > Next > Next.
5. Select an Ethernet adapter to be available for VMs > Next > Install.

To open the Hyper-V Manager, click on Start > All Programs > Administrative Tools > Hyper-V Manager. To create a new VM, click on New from the Actions sidebar and select Virtual Machine. Follow the instructions in the wizard to create a new VM. The easiest and fastest way to install a new VM is to use an ISO file containing the operating system you want to install. This option is available on the Install Options page of the New Virtual Machine Wizard. Once you've got your first VM set up, you may want to make a copy of the virtual hard disk file. This will allow you to set up new VMs in a matter of minutes. Of course, before you make a copy of the virtual hard disk file you should run sysprep or another utility on the VM to roll the SID on the server. The SID is a unique identifier that the server assigns itself when it is first created. Duplicate SIDs will end up biting you in subtle ways, and it may not be obvious that a duplicate SID is the cause.

VMs In Production

If you're going to be running VMs in production, then you will definitely want to take a gander at the different settings available for your VM. Some of the more important options include memory, processor, network adapter, and automatic start/stop actions. Be sure to give your virtual machine enough memory, because you don't want it to hit the page file on your virtual disk. Processor settings are important because you don't want a test box or runaway app to hog all the processing power away from other production VMs. Depending on the applications you are running, you may want to install additional physical network adapters in the host server and distribute the networking load among more than one adapter. Finally, it's important to tell Hyper-V what to do when the host operating system shuts down or starts up.

One final note: Beware of virtual server sprawl. With Hyper-V (and other server virtualization technologies) it becomes almost too easy to create new "servers." Remember that there is overhead associated with each additional VM that is created. It may need an anti-virus client, a backup client, and any other clients/agents that you install on your servers. It will need to be patched each month, and don't forget about that pesky OS licensing issue. Depending on your version of Windows Server, you may need to purchase additional OS licenses. There is, however, a Microsoft tool that will help determine how many OS licenses need to be purchased. With Server 2008 Datacenter Edition you can have as many VMs as you want, Enterprise Edition comes with the ability to run four VMs, and Standard Edition requires a license for each VM.
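"Give your virtual machine enough memory" is easy to get wrong once several VMs share one host. A rough capacity check, with entirely hypothetical guest sizes and a guessed reserve for the parent partition:

```python
HOST_RAM_GB = 32
PARENT_RESERVE_GB = 4          # rough allowance for the host OS / hypervisor
PLANNED_VMS = {                # hypothetical guests and their assigned RAM
    "dc01": 2, "web01": 4, "sql01": 16, "test01": 2,
}

def remaining_ram_gb(host_gb, reserve_gb, vms):
    """RAM left on the host after the parent reserve and all guest grants."""
    return host_gb - reserve_gb - sum(vms.values())

left = remaining_ram_gb(HOST_RAM_GB, PARENT_RESERVE_GB, PLANNED_VMS)
print(f"{left} GB of headroom")   # 4 GB of headroom
if left < 0:
    print("over-committed: guests may start paging inside their virtual disks")
```

Running a check like this before adding each new "server" is also a cheap guard against the sprawl problem described above.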
Any IT professional who's missed the buzz about virtualization might as well keep his head in the sand.
For the rest of the IT community, it's clear that talk about enterprise server virtualization adoption isn't a matter of "if" but "when." So the question is whether certification in virtualization technology is a must-have. With vendors like VMware, Citrix, and now Microsoft in the virtualization certification game, and the job market for IT professionals with virtualization skills sizzling, it would appear that many individuals would stand to benefit from sinking time and money into this specialized training. Red Hat, for example, offers Enterprise Linux Virtualization training for Red Hat Certified Technicians (RHCT) or individuals with equivalent knowledge. What's clear is that getting certified in virtualization technology matters. "It just matters to some, not to everyone," says Cushing Anderson, program vice president at IDC. He says that today many IT professionals get on-the-job virtualization training. "Organizations aren't training in advance of virtualization initiatives," Anderson says.
Fast-Rising Market
But where virtualization is relevant to an IT professional's career path, such as in storage, server management, and PC management, certification can put them ahead of the curve. IDC projects that by 2011 the market for virtualization services will reach about $12 billion. Today, Tom Silver, senior vice president at Dice, reports that about 1,500 of the approximately 8,500 open job postings on the company's IT job site reference virtualization skills: a small percentage, but a fast-growing job area nevertheless, he says.
Silver is on the same page as Anderson when considering a certification in virtualization, noting that it depends on an individual's career path and where they are on it. "If you're looking to get a job or move into a new area,
certification can help. But certifications can be a mixed bag, because once you're in the door, employers aren't as interested in certification as in whether you can do the job," says Silver. Jason Martin, vice president of services at VMware, says that people who take the VMware Certified Professional (VCP) training should have some hands-on experience with virtualization already. The vendor reports that it's seeing a shift in demand for its VCP on VMware Infrastructure 3 certification from the channel community to large enterprises. "It's becoming requisite training for IT staff who will install and manage VMware," Martin says. In fact, he expects that by year-end more corporate IT professionals than channel partners will pursue VCP education. The VCP allows IT professionals to demonstrate their virtual infrastructure expertise, according to Martin.
Most recently upping the ante for virtualization experts is Microsoft, with the launch of its new virtualization products. The vendor also announced a roadmap for certified technical specialists in virtualization. It will offer four Microsoft Certified Technology Specialist (MCTS) certifications on virtualization, two of which are available now: Microsoft Desktop Optimization Pack, Configuring; and Windows Server 2008 Applications Infrastructure, Configuring. Available later this year will be: Windows Server 2008 Virtualization, Configuring; and System Center Virtual Machine Manager, Configuring. The four certifications are designed to validate skills on the features and functionality of key Microsoft technology areas such as Windows Server 2008 Hyper-V, System Center Virtual Machine Manager, Terminal Services Virtualization, and Application Virtualization, according to the company. Industry experts warn that rather than getting caught up in the virtualization buzz, individuals should only consider undertaking a certification track if they're interested in managing complex architectures. "Virtualization is very technical. So while the technology may be hot, only pursue it if it's your bliss," says Anderson. "Otherwise, you'll be a dull employee."