The Problem Defined
Managing desktops in large organizations has always been one of the harder tasks for an IT department. People tend to take the phrase ‘personal’ computer rather personally: they think of a PC as a personal item to be handled as they wish, with software and hardware added and removed at will. This leads to system crashes, incompatibilities, viruses and security issues, and adds unnecessary cost to managing the IT setup of an enterprise.
A second problem with managing PCs relates to lifecycle management. Since PCs are purchased incrementally and have a defined lifespan before replacement, you will always have a mix of new and old PCs in your environment, which means different operating systems will be in use. It is not unusual to find Windows XP, Vista and Windows 7 coexisting in an organization.
Application programs present a similar scenario. While most users will only ever use a basic set of applications such as organizers, email clients and office programs, there will always be some users with more specialized needs. For example, a group of web designers would need a set of specialized tools in addition to those mentioned earlier, and there could easily be compatibility issues between some of these applications.
Creating a Standard System Stack
The problems defined above can be addressed by using a standard system stack. This ensures that applications are categorized and grouped into families that are tested to work together and that provide the functionality various users need. A popular method for doing this is known as the Point of Access for Secure Services (PASS).
The PASS system stack consists of three layers.
The first layer is called the “kernel” layer and is designed to meet all the requirements of the average user in an organization. Think of the needs of an office worker: the PASS kernel will include the core OS (say, Windows Vista), networking support, storage, security and communication software, antivirus programs and office productivity tools. Since all employees in the organization require this capability, this set is part of every desktop, and quite possibly the bulk of the employees in an organization will get by with just this set installed on their PCs.
The second layer comprises role-based applications: software required by particular groups of employees in your organization. As mentioned earlier, the web designers group needs specific software, which is installed as a layer over the basic PASS kernel. Once again this is built as a standardized deployment installed for all members of the group.
The third layer comprises ad hoc software used on an individual basis. An example could be a statistical analysis package used by just one or two statisticians in the organization. This software is installed on individual user PCs as required.
Finally, as the desktop is used, the user creates data in the form of files, presentations, query outputs and the end products of whatever specialized software she may be using. If these key components of a desktop (the standard stack, the role-based and ad hoc applications, and the user’s data) can be abstracted from one another, then the task of desktop management becomes much easier. This is what desktop virtualization sets out to do.
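The layering described above can be sketched in a few lines of code. This is a minimal illustration only: the layer contents, role names and the `build_desktop` helper are hypothetical, not part of any real PASS product.

```python
# Hypothetical sketch of composing a PASS-style desktop image from its layers.

# Layer 1: the "kernel" layer, common to every desktop in the organization.
KERNEL_LAYER = ["core-os", "networking", "antivirus", "office-suite"]

# Layer 2: role-based layers, one standardized set per group of employees.
ROLE_LAYERS = {
    "web-designer": ["image-editor", "html-authoring-tool"],
    "office-worker": [],  # most staff get by with the kernel layer alone
}

def build_desktop(role, ad_hoc=()):
    """Compose a desktop: kernel layer + role layer + ad hoc software.
    Note that user data is deliberately NOT part of the image; it is
    kept separate, which is what makes the abstraction possible."""
    return KERNEL_LAYER + ROLE_LAYERS.get(role, []) + list(ad_hoc)

# Layer 3: ad hoc software added for specific individuals.
desktop = build_desktop("web-designer", ad_hoc=["stats-package"])
```

Because each layer is defined once and composed on demand, updating the kernel layer (for example, a new antivirus version) automatically flows into every desktop built from it.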
Creating the Virtual Desktop
The virtual desktop can be created in two different models: the client-side model and the server-hosted model.
In client-side desktop virtualization, the resources of the user’s own PC are used to run the virtual desktop. Virtualization software is installed on the physical desktop, and the corporate virtual desktop is provided either on removable media or over a network. The virtual desktop then runs on the hardware of the user’s physical machine.
In the server-hosted model, the virtual desktop runs over a hypervisor – a software layer on the server that coordinates the running of the virtual desktops. The user’s device in this case could be a PC or a thin client.
Models of Desktop Virtualization
Four models of desktop virtualization exist.
In the first, the virtual machine is prepared centrally and then written in encrypted form onto memory sticks, DVDs or other removable media. The user connects the device to his desktop, provides the encryption passphrase and launches the VM. This creates a secure, locked-down VM that is fully compliant with corporate policies.
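The passphrase step in this model typically relies on deriving an encryption key from what the user types. As a hedged illustration only (real products use full disk-encryption stacks, and the passphrase and iteration count here are placeholders), key derivation might look like this using PBKDF2 from Python's standard library:

```python
import hashlib
import os

def derive_key(passphrase: str, salt: bytes, iterations: int = 200_000) -> bytes:
    """Derive a 256-bit key from a passphrase with PBKDF2-HMAC-SHA256.
    The salt is stored alongside the encrypted VM image on the media."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, iterations, dklen=32)

salt = os.urandom(16)                       # generated when the image is written
key = derive_key("example-passphrase", salt)  # placeholder passphrase
```

The derived key, not the passphrase itself, decrypts the VM image; without the correct passphrase the media is unreadable, which is what makes distributing desktops on removable media tolerable from a security standpoint.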
The second solution is to use the desktop’s own processing capability to run VMs. The user could use tools from Microsoft or VMware and run the virtual machines on the desktop itself. This introduces a large number of security issues because the solution is not centrally managed. For this reason it is not encouraged: it is best used for experimentation and learning, not as a production system.
The third model uses ‘stateful’ virtual desktops, which are centrally managed. A server-side image is created and maintained for each user, who connects to his own dedicated virtual machine on the server. Since a separate instance exists for every user, adequate server storage must be provisioned.
The last model is a stateless model, where the virtual machine is created for the user from a master image on the server when the user logs in. All user preferences and rights are applied at the moment of login, when the user connects to work on the virtual desktop. The desktop is volatile and is destroyed when the user shuts down; data created by the user is saved separately in a user directory outside the virtual desktop. All the activity occurs on the server and the user only sees a remote desktop, so response times are quite satisfactory. This is the most popular model of desktop virtualization: it is easy to manage, saves disk space and can be updated with ease.
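The stateless lifecycle can be sketched as follows. Everything here is a simplified model for illustration: the master image, profile store and `login`/`logout` helpers are hypothetical stand-ins for what a real connection broker does.

```python
import copy

# The single master image maintained centrally (hypothetical contents).
GOLDEN_IMAGE = {"os": "corporate-os", "apps": ["email", "office-suite"]}

# Per-user preferences and rights, applied at the moment of login.
USER_PROFILES = {"alice": {"wallpaper": "blue", "rights": ["print"]}}

# User data persists OUTSIDE the virtual desktop and survives logout.
USER_DATA = {}

def login(user):
    """Create a fresh, volatile desktop from the master image."""
    vm = copy.deepcopy(GOLDEN_IMAGE)            # new clone for every session
    vm["profile"] = USER_PROFILES.get(user, {})  # preferences applied at login
    vm["home"] = USER_DATA.setdefault(user, {})  # external user directory
    return vm

def logout(vm, user):
    """Persist only the user data; the desktop itself is destroyed."""
    USER_DATA[user] = vm["home"]
    return None

vm = login("alice")
vm["home"]["report.txt"] = "draft"   # user data lands outside the image
logout(vm, "alice")
second = login("alice")              # a brand-new desktop, same user data
```

Note how upgrading the environment only requires changing `GOLDEN_IMAGE`: every subsequent login gets the new desktop automatically, which is why this model is the easiest to manage.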
Advantages of Desktop Virtualization
There are a number of advantages of desktop virtualization. These include the following:
- Centrally managed desktops can be deployed on any endpoint device – these could be PCs, thin clients, web clients and other devices
- The desktop is locked down by default. You create a single desktop image and copy it to as many endpoints as needed.
- Management of endpoint devices becomes far easier since they are used only as remote connection devices to the virtual desktop.
- Users can continue to be administrators on their endpoints, but when they log in to the virtual desktop they get a centrally administered, locked-down virtual desktop.
- Overall management is far easier since only a few virtual desktop configurations are being managed.
- Fine control over encryption, security and usage parameters is available.
- Company intellectual property can be controlled more tightly since the virtual desktop is actually deployed in the data center. The virtual desktop can be configured to ensure there is no access to external devices such as USB memory sticks or CD writers even if such devices are physically part of the endpoint device.
- Applications that have compatibility problems can be loaded into separate virtual machines, ensuring that such an application does not need to coexist with any others.
- Upgrading operating systems is easy since only the desktop image has to be changed.
Desktop virtualization is a popular method of handling complexity and providing a consistent, safe, stable and secure computing environment in an enterprise. Its benefits grow with the number of nodes in the network. The stability provided by desktop virtualization frees IT staff to perform more value-added work rather than struggling to keep desktops functional.