Tuesday, January 26, 2010


Managing Virtual Machines in Windows Server 2008 | Experts Talk About Windows Server Virtualization

The MMC snap-in for managing virtual machines that is provided with Windows Server virtualization is evolving, and here I wanted to give you a quick preview. The figure below shows the Windows Virtualization Management console of Windows Server 2008. The console tree on the left displays the name of the server, while the Details pane in the middle shows a number of virtual machines, most of them in an Off state and two in a Saved state. The Actions pane on the right lets you manage virtualization settings, import virtual machines, connect to a virtual machine, and perform other tasks.

Figure: Windows Virtualization Management console

So that’s a very brief preview of what’s in store for virtualization in Windows Server 2008 in terms of managing virtual machines. Fortunately, we also have some experts on the product team at Microsoft who can provide us with more information about this feature, especially the planning issues surrounding implementing Windows Server virtualization in your environment.

First, here’s one of our experts talking about using Windows Server virtualization in conjunction with the Windows server core installation option of Windows Server 2008:

From the Experts: Windows Server Virtualization and a Windows Server Core Installation
The Windows server core installation option of Windows Server 2008 and Windows Server virtualization are two new features of Windows Server 2008 that go hand in hand. The Windows server core installation option is a new minimal, GUI-shell-less installation option for Windows Server 2008 Standard, Enterprise, and Datacenter Editions that reduces the management and maintenance an administrator must perform. The Windows server core installation option provides key advantages over a full installation of Windows Server 2008 and is the perfect complement to Windows Server virtualization. Here are a couple of reasons why.
  • Reduced attack surface A Windows server core installation provides a greatly reduced attack surface because it is tailored to provide only what a role requires. A minimal parent partition reduces the need to patch the parent partition. In the past, with one workload running per server, if you needed to reboot the server for a patch it wasn’t ideal, but generally only one workload was affected. With Windows Server virtualization, you’re not just running a single workload. You could be running dozens (even hundreds) of workloads in their own virtual machines. If the virtualization server requires a reboot for a patch (and you don’t have a high availability solution in place), the result could be significant downtime.
  • Reduced resource consumption Because the parent partition running a Windows server core installation requires only a fraction of the memory a full installation of Windows Server 2008 would, you can use that memory to run more virtual machines.
In short, it is highly recommended that you use Windows Server virtualization in conjunction with a Windows server core installation.

-Jeffrey Woolsey [ Lead Program Manager, Windows Virtualization ]

Next, let’s hear another of our experts on the virtualization team at Microsoft share how to identify what should be virtualized in your environment and what perhaps shouldn’t:

From the Experts: Virtualization Sizing
It is very important to understand how to roll out virtualization in your organization and what makes the most sense for your environment and business conditions. All too often, enthusiastic users and organizations start by attempting to virtualize everything, or begin with their most complex middleware environments. There are no right or wrong first candidates for virtualization, but you need to ensure that you have fully thought through the impact of using virtualization in your environment and for the workloads in question.

As you think about what to virtualize and how to go about picking the right workloads, the order of deployment, and what hardware capabilities you need, find a model or a set of models that help you conceptualize the end solution. The System Center family of products provides you a set of tools that help simplify some of these issues, and other solutions from vendors like HP provide you tools to help size the deployment environment once you have figured out the candidates and the rollout process.

The next few points help identify some of the best practices in sizing your virtualization environment. Think of the following as a set of steps that will help you identify what workloads to virtualize and what the deployment schedule should look like.

1. Assessment As with any project, the first step is to fully understand where you are today and what capabilities you already have in your environment. The last thing you want to do is reinvent the wheel and invest in things you already have. As you think about assessment, consider all the components in your infrastructure, the types of workloads, and the interdependencies among the various workloads. Also evaluate all the management assets you already have and identify the functions they perform, such as monitoring, deployment, data protection, security, and so on. These are the easier items to assess; the more critical one is the overall process discipline that exists in your organization and how you deal with change today. While this factor is hard to quantify, it is critical in evaluating your capacity to deploy virtualization. To help you make this assessment from a holistic perspective, you can use tools such as Microsoft’s Infrastructure Optimization Model or Gartner’s IT Maturity Model. There is one thing a customer once told me that I will never forget: “If someone tells you they have a solution for your problems when you have not identified or told them what your problems are, most likely they are giving you something you already have in a different package–that is, if you are lucky.”

2. Solution Target Once you have identified and assessed your current environment, find out where you can use virtualization today. All server virtualization solutions today provide these usage scenarios:
  • Production Server Consolidation, which encompasses all forms of consolidation of systems in existing or new environments.
  • Test and Development Environments, which addresses the use of virtualization to optimize test and dev cycles, not only letting you capture cost savings on hardware but also enabling easy creation and modification of environments.
  • Business Continuance, where your primary motivator is to leverage the fact that virtualization transforms your IT infrastructure to files (in Microsoft’s case a VHD file) to enable new and interesting continuance and disaster recovery solutions.
  • Dynamic Datacenter, a new set of capabilities unleashed by virtualization that lets you not only create and manage your environment more efficiently, but also dynamically modify the characteristics of workloads’ environments based on usage. This dynamic resource manipulation lets you take the consolidation benefits and translate them into a more agile IT environment.
  • Branch Office, which, while not a core solution, is one usage scenario where virtualization helps change how IT systems are deployed, monitored, and managed, and helps extend the capabilities of the branch environment to bring legacy and new application environments under one common infrastructure umbrella.

As you decide which solution area or areas to target for your virtualization solution, keep in mind the level of complexity of each solution and the need for increasing levels of management tools and process discipline. Test and dev environments are the easiest to virtualize and usually can tolerate some downtime in case of hiccups, so they are a natural start for everyone. Server consolidation is another area where you can start using virtualization today. The initial cost savings come from hardware consolidation, but the true value of consolidation appears only once you have figured out how to use a unified management infrastructure. Business continuance and branch scenarios require a management infrastructure to orchestrate the solutions, and to see their true value you will need a certain level of processes in place. The dynamic datacenter is a complex solution for most customers to fully deploy and usually applies to only a subset of the organization’s infrastructure; select the workloads that need this type of solution carefully, since taking on the SLAs required to maintain such a solution should mean the workload is truly critical to the organization.

3. Consolidation Candidates Most users today are deploying virtualization to consolidate workloads and bring legacy systems under a unified management umbrella. In this light, it becomes important to identify which workloads are the most logical to consolidate today and which make sense in the future. Some workloads sound attractive for virtualization but are not ideal by any stretch, either because of certain I/O characteristics or simply because they are so big and critical that they easily scale up to or beyond the capabilities of the hardware thrown at them. Operations Manager and Virtual Machine Manager can generate a virtualization candidates report that scans your IT organization and tells you which workloads are good candidates for virtualization based on a number of thresholds such as CPU utilization, I/O intensity, network usage, workload size, and so on. Based on this report and on the interdependencies identified during the assessment phase, you can make intelligent decisions about what workloads to virtualize and when.
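To make the idea of threshold-based candidate selection concrete, here is a minimal sketch of that kind of filtering. The thresholds, field names, and sample workloads are illustrative assumptions only; they are not the actual criteria or data model used by the Virtual Machine Manager report.

```python
# Hypothetical sketch of threshold-based consolidation candidate selection.
# All thresholds below are illustrative assumptions, not the real criteria
# used by the Virtual Machine Manager "virtualization candidates" report.

def is_candidate(workload,
                 max_avg_cpu=0.40,    # sustained CPU utilization (fraction)
                 max_disk_iops=500,   # average disk I/O operations per second
                 max_net_mbps=100,    # average network throughput
                 max_mem_gb=8):       # working-set memory
    """Return True if a workload looks like a good consolidation candidate."""
    return (workload["avg_cpu"] <= max_avg_cpu
            and workload["disk_iops"] <= max_disk_iops
            and workload["net_mbps"] <= max_net_mbps
            and workload["mem_gb"] <= max_mem_gb)

workloads = [
    {"name": "intranet-web", "avg_cpu": 0.10, "disk_iops": 120,
     "net_mbps": 20, "mem_gb": 2},
    {"name": "oltp-db", "avg_cpu": 0.75, "disk_iops": 4000,
     "net_mbps": 300, "mem_gb": 64},
]

# The lightly loaded web server passes every threshold; the large,
# I/O-heavy database server is filtered out.
candidates = [w["name"] for w in workloads if is_candidate(w)]
```

The point of the sketch is simply that a heavily loaded, I/O-intensive workload fails the filter even if it is otherwise an appealing target, which matches the caution above about big, critical workloads.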

4. Infrastructure Planning This is where the rubber meets the road, so to speak. Once you have identified the candidates to virtualize, you need a place to host the virtualized workloads. Tools from companies such as HP (the HP Virtualization Sizing Guide) help you identify the type of servers you will need to host the virtualization solution you identified in the previous step. There is one fundamental rule to consider as you select infrastructure for virtualization: the two biggest limiting factors are memory and I/O throughput. Always select an x64 platform for your hardware to allow large amounts of memory, and always try to get the best disk subsystem you can, whether direct-attached storage (DAS) in the system or good SAN devices.
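Since memory is the first limiting factor, a back-of-the-envelope host sizing calculation can be sketched as follows. The parent-partition reserve and per-VM overhead figures are assumptions for illustration, not vendor-published numbers; a sizing guide such as HP's would use measured values.

```python
# Rough host-memory sizing sketch. The 2 GB parent-partition reserve and
# 0.1 GB per-VM overhead are illustrative assumptions, not published figures.

def host_memory_needed_gb(vm_memory_gb,
                          parent_reserve_gb=2.0,
                          per_vm_overhead_gb=0.1):
    """Estimate the physical RAM needed to host a set of virtual machines.

    vm_memory_gb: list of memory assignments (GB), one entry per VM.
    """
    return parent_reserve_gb + sum(m + per_vm_overhead_gb
                                   for m in vm_memory_gb)

# Four candidate workloads with 4, 4, 2, and 8 GB assigned.
vms = [4, 4, 2, 8]
needed = host_memory_needed_gb(vms)
```

With these assumed figures the four VMs need roughly 20.4 GB of host RAM, which is the kind of number that pushes you toward an x64 platform rather than a 32-bit one.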

5. Placement This is not so much an area that affects the initial sizing of your environment, but it can impact your sizing decisions in the long run. Here we are referring to the act of taking one of the virtualization candidates and actually deploying it to one of the selected virtualization host systems. Knowledge of the interdependencies among the various workloads shapes some of how placement occurs, but at a high level this is about optimizing placement for a few selected variables. Virtual Machine Manager has an intelligent placement tool that lets you optimize either for a load balancing algorithm or for a maximizing utilization algorithm. You can also tweak individual parameters to optimize your environment based on the business weights you assign to the different parameters.
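The difference between the two optimization goals can be sketched with a toy placement function over a single resource. This is an illustrative simplification, not Virtual Machine Manager's actual intelligent placement logic, which weighs several resources at once.

```python
# Toy sketch contrasting "load balancing" with "maximize utilization"
# placement over one resource (memory). Not VMM's actual algorithm.

def place(hosts, vm_gb, strategy="balance"):
    """Pick a host for a VM that needs vm_gb of memory, or None if none fits.

    hosts: dict mapping host name -> (used_gb, capacity_gb)
    """
    fitting = {name: (used, cap) for name, (used, cap) in hosts.items()
               if cap - used >= vm_gb}
    if not fitting:
        return None  # no host has enough free memory
    if strategy == "balance":
        # Load balancing: spread load by choosing the emptiest host.
        return max(fitting, key=lambda n: fitting[n][1] - fitting[n][0])
    # Maximize utilization: pack VMs onto the fullest host that still fits,
    # leaving other hosts free for large workloads.
    return min(fitting, key=lambda n: fitting[n][1] - fitting[n][0])

hosts = {"hostA": (10, 32), "hostB": (28, 32), "hostC": (20, 32)}
```

For a 4 GB VM, the balancing strategy picks hostA (22 GB free) while the packing strategy picks hostB (exactly 4 GB free), which shows how the same inventory yields different placements depending on the goal.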

As you size your virtualization environment, also keep in mind the overall manageability factor and how you can scale your management applications to cover the new environment. Now that you have seen how to size your virtualization environments, keep two things in mind. First, virtualization is a great technology that can help at multiple levels and in many scenarios, but it is still not a panacea for all problems, so take the time to identify your true problems. Second, remember that you will be deploying and managing virtualized environments over a long period of time, so think of virtualization as at least a 3-year solution.

Virtualization is primarily a consolidation technology that abstracts resources and aids aggregation of workloads, so think carefully about how this affects your environment and what steps you need to have in place to avoid disasters and plan for them early.

-Rajiv Arunkundram [ Senior Product Manager, Server Virtualization ]

Finally, an important planning item for any software deployment is licensing. Here’s one of our experts explaining the current licensing plan for Windows virtualization:

From the Experts: Virtualization Licensing
One of the most talked about and often most confusing areas of virtualization is licensing. Some of this confusion is caused by the lack of a single industry-standard way of dealing with licensing; the rest stems from the fact that virtualization is a disruptive technology in how companies operate, so it is not always clear to customers what the various policies mean in this new world.

Microsoft’s licensing goals are to provide customers and partners cost-effective, flexible, and simplified licensing for our products that is applicable across all server virtualization products, regardless of vendor. To that end, several changes were put in place in late 2005 to help accelerate virtualization deployments across vendors:
  • Windows server licensing was changed from installation-based licensing to instance-based licensing for server products.
  • Microsoft changed licensing to allow customers to run up to 1 physical and 4 virtual instances with a single license of Windows Server 2003 Enterprise Edition on the licensed device; and 1 physical and unlimited virtual instances with Windows Server 2003 Datacenter Edition on the licensed device.
  • With the release of SQL Server 2005 SP2, Microsoft announced expanded virtualization use rights to allow unlimited virtual instances on servers that are fully licensed for SQL Server 2005 Enterprise Edition.
With all these changes, you can now acquire and license Windows Server and other technologies much more efficiently. Virtualization also adds another level of licensing complexity through the ability to easily move images or instances between machines. This is where licensing from the old era gets tricky. The simple way to remember it and ensure that you are fully licensed is to treat the host systems as the primary license holders, with the instances as the deployment front. So if you want to move a workload to a system that is running Windows Server Enterprise Edition and already has 4 virtual instances running, you will need an additional license; if fewer than 4 are running, you will not need an additional license to make the move.
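The counting rule above can be sketched as a small check against the target host. This is an illustration of the 1 physical + 4 virtual instances entitlement of Windows Server 2003 Enterprise Edition described earlier, not a substitute for the actual license terms.

```python
# Sketch of the "count virtual instances on the target host" licensing rule
# for Windows Server 2003 Enterprise Edition (1 physical + 4 virtual
# instances per license). Illustrative only; confirm against license terms.

ENTERPRISE_VIRTUAL_INSTANCES_PER_LICENSE = 4

def additional_licenses_needed(running_virtual_instances,
                               licenses_assigned=1):
    """Return how many extra Enterprise licenses the target host needs
    before one more virtual instance can be moved onto it."""
    capacity = licenses_assigned * ENTERPRISE_VIRTUAL_INSTANCES_PER_LICENSE
    # If the host is already at capacity, one more license is required
    # before the move; otherwise the existing entitlement covers it.
    return 0 if running_virtual_instances < capacity else 1
```

So a host with 3 virtual instances can accept the move under its existing license, while a host already running 4 needs one more license first, matching the rule stated above.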

Do note that these licensing policies apply in the same manner across all server virtualization platforms, regardless of vendor.


About bench3 -

Haja Peer Mohamed H, Software Engineer by profession, Author, Founder and CEO of "bench3". You can connect with me on Twitter, Facebook, and also on Google+.
