Virtualization offers companies a practical way around the cost and inefficiency of dedicating a separate, underused server to every application.
Most of us spend our lives working with a personal or shared computer that we use for basic tasks: checking email, writing documents, creating spreadsheets, and accessing the Internet. Unless something goes wrong or we are using an old, tired computer, we typically use less than 25% of its processing power—with occasional bursts to 100%.
Paul Bleicher
The prevailing model is "one person, one computer," and our ability to work is rarely limited by access to a computer, although it is still occasionally limited by lack of access to the Internet. Today, computers are an essential part of our work, wherever we are.
There is, however, another world of computers that differs dramatically from the one we all know best. That is the world of servers.
These workhorse computers live in heavily air-conditioned facilities, stacked one on top of another in "racks," and provide the underlying infrastructure of corporate networks. They also house the central store of corporate email, documents, and other essential data, along with the applications that deliver the data to users. Nowadays, whenever a new application (from a "collaboration space" to a new EDC application) is offered to users in a company or hospital, you can be sure that a server (or many servers) is involved in managing it.
Fortunately, the world of servers is one that few non-IT professionals need to understand or work with. Nevertheless, there is a quiet revolution taking place, one that has been growing for several years now. The concept of virtualization, or virtual machines, is a very important trend that may eventually spill over into our world of personal computers. It's worth understanding, even if you don't plan on a career in IT.
For the uninitiated, the server sounds like a very sophisticated and forbidding piece of specialty IT hardware. Although this may sometimes be true, it is best to think of a server as a high-end version of your personal computer.
Although any computer, including your laptop, can act as a server, the typical corporate server is a specialized hardware box that can contain several central processors, slots for rapidly plugging and unplugging large hard drives, and other modifications. Servers are found on networks both large and small, where they "serve" data and applications upon request from other computers on the network.
When someone types a URL into a Web browser, they are actually making a request of a Web server to display a page. In fact, that page may be assembled, behind the scenes, by cascading requests to several other servers. When someone accesses email on a network, they are again making a request of an email server, which provides the email to the client on the PC or workstation, or to a Web server for display in a Web email application.
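That request-and-response exchange can be sketched in a few lines of code. The example below simply fetches a page from a placeholder address and is an illustration only; the URL is not one from this article.

```python
# A minimal sketch of a client asking a Web server for a page, using only the
# Python standard library. The URL is a placeholder, not one from the article.
from urllib.request import urlopen

with urlopen("https://www.example.com/") as response:
    status = response.status      # 200 means the server fulfilled the request
    page = response.read()        # the bytes of the page the server "served"

print(f"Server responded with status {status} and {len(page)} bytes of HTML")
```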
Servers are not necessarily expensive hardware devices, but they can be quite inefficient and costly for companies. The additional operating system, software, management, and maintenance fees can add up to a substantial total cost of ownership.
It is also very common for each server to house a single application. The major reason for this is to avoid the conflicts that sometimes arise between two applications and lead to server crashes and downtime. However, most applications don't use anywhere near the full capacity of a single server; some use 5% to 15% or less.
Since modern businesses often run many different applications, the server room can house dozens of servers, each running at very low capacity. IT staff are very reluctant (as they should be) to upgrade software or apply patches on the production server, because they want to avoid taking critical systems down for hours at a time. They prefer to test updates on an identical implementation of the software to confirm that they work and do not crash the server. This means that every major system has a parallel "practice" system for testing, validation, etc.
Finally, server hosting is limited by two practical considerations: the cost and availability of hosting space, and of electricity.
Fortunately, there is a very practical solution to many of these problems, known as virtualization, which uses "virtual machines" (VMs).
In a traditional server (or most computers, for that matter) the hardware includes a central processing unit (CPU) and a variety of devices for inputting, saving, transmitting, and viewing data, along with a very rudimentary built-in system for driving the basic operations of those devices.
In almost every computer, the manufacturer (or occasionally the owner) installs an operating system (OS) that knows how to work with the various hardware devices to do the basics of computing: reading data, writing to a hard disk, displaying on a screen, and so on. The user interface to the operating system is, nowadays, a graphical environment that gives visual feedback on these system tasks.
An example of an OS you know well is Windows (and now, Vista). Most applications are designed to be installed "on top of" the OS, and will make extensive use of OS functionality in performing operations that interact with the hardware of the system.
Virtual machines come in several different types, but the simplest to understand is the VM environment that installs on top of the OS, like any other software program. Examples include VMware from EMC and Microsoft Virtual Server (available for free from Microsoft's Web site).
These programs allow the user to create a virtual computer within the software that replicates the environment of the server hardware. The virtual computers can have virtual devices such as hard disks, or they can interact with the actual physical devices of the computer on which they reside. In fact, the user can create as many virtual computers as the available memory will allow.
Each of these virtual computers can be loaded with any OS and any other desired software. A VM environment can house 10 different computers, each with its own application and OS. Each can be given its own IP address on the network and, to an outside computer, will look like a standalone machine.
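To make "creating a virtual computer" concrete, here is a minimal sketch using the open-source libvirt Python bindings, a tool not mentioned in the article; the VM name, memory size, and CPU count are hypothetical.

```python
# A hedged sketch of defining and starting a virtual machine with the
# open-source libvirt Python bindings (pip install libvirt-python). libvirt is
# an assumption here; the article itself names VMware and Microsoft Virtual
# Server. The VM name and sizes are hypothetical examples.
import libvirt

DOMAIN_XML = """
<domain type='qemu'>
  <name>demo-vm</name>
  <memory unit='MiB'>1024</memory>
  <vcpu>1</vcpu>
  <os><type arch='x86_64'>hvm</type></os>
</domain>
"""

conn = libvirt.open("qemu:///system")   # connect to the local hypervisor
dom = conn.defineXML(DOMAIN_XML)        # register the new virtual computer
dom.create()                            # power it on
print(dom.name(), "running:", bool(dom.isActive()))
conn.close()
```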
One of the magical elements of VM computers is that the VM, operating system, and software can be saved to a hard disk and "shut down" so that they are no longer operating on the system, leaving them more or less in suspended animation. At any moment they can be started up again in the exact state in which they were closed. This means that a program (say, an Oracle database) could be loaded into a VM, tweaked for optimal performance, tested, validated, and saved to disk. Another computer with a VM environment could then open the VM file and immediately have the entire VM, with its validated database, up and running in moments.
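The save-and-restore cycle described above looks roughly like the sketch below, again using the open-source libvirt Python bindings rather than the products named in the article; the VM name and file path are hypothetical.

```python
# A hedged sketch of "suspended animation": save a VM's entire state to disk,
# then restore it later in exactly the state it was in. Uses the open-source
# libvirt Python bindings; the VM name and save-file path are hypothetical.
import libvirt

SAVE_FILE = "/var/lib/libvirt/save/oracle-db-vm.sav"

conn = libvirt.open("qemu:///system")
dom = conn.lookupByName("oracle-db-vm")   # the running, already-validated VM

dom.save(SAVE_FILE)        # write memory and state to disk and stop the VM

# Later, on this host (or another host that can reach the same files), the VM
# comes back up in moments, exactly where it left off.
conn.restore(SAVE_FILE)

conn.close()
```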
The applications for VMs are practically limitless. A single server could operate five or 10 different applications, replacing the same number of machines that would normally be used. Or the server could operate 10 different versions of the same application, each serving a different group of users.
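Some back-of-the-envelope arithmetic makes the consolidation point concrete; the per-application utilization figures below are assumptions for illustration, not measurements from the article.

```python
# Illustrative arithmetic only: the utilization figures are assumed.
app_utilization = [0.05, 0.08, 0.10, 0.12, 0.15, 0.07, 0.06, 0.09, 0.11, 0.05]

combined = sum(app_utilization)
print(f"{len(app_utilization)} applications, each on its own server, would "
      f"together use about {combined:.0%} of a single server's capacity")
# -> 10 applications, each on its own server, would together use about 88%
#    of a single server's capacity
```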
The testing and validation version of an application could be located on a VM on the same machine as the production application, reducing the need for and cost of additional computers. Someone considering buying software could install the software on a VM, test it, and delete it when done without changing the software or OS on their computer.
The value of the VM has become very apparent to most IT organizations over the past 10 years, and VMs are now an accepted, and often preferred, part of almost every company's server strategy.
VMs do, however, have their downside. The most obvious is the amount of memory that a VM uses. Each VM has its own operating system and application software, which requires a fair amount of memory to operate, limiting the number of VMs that can be deployed on a single server. More advanced VM software can reduce these memory requirements by bypassing the underlying OS or sharing the hardware resources, but that is beyond the scope of this article.
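To see why memory is usually the first ceiling, consider a rough calculation; the host RAM, host overhead, and per-VM footprint below are all assumed figures.

```python
# Illustrative only: every memory figure below is an assumption, not a number
# from the article.
host_ram_gb = 32          # physical RAM in the server
host_overhead_gb = 2      # reserved for the host OS and the VM software itself
ram_per_vm_gb = 2.5       # guest OS plus application inside each VM

max_vms = int((host_ram_gb - host_overhead_gb) // ram_per_vm_gb)
print(f"A {host_ram_gb} GB host can hold roughly {max_vms} such VMs")
# -> A 32 GB host can hold roughly 12 such VMs
```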
Another downside of the VM is the additional cost of software. A virtual installation of an OS or software does not necessarily mean that the software license fee is virtual—it all depends on the software license. The operating system itself might be replicated a dozen times on a single machine, with licensing costs adding up.
Finally, the VMs on a computer share some of the same resources with the other VMs and with the host operating system. Depending on the applications being run, this can lead to slower performance.
As a network user (as most of us are), you can assume that some, if not all, of the network software you use is running on a VM. Aside from the aforementioned cost and convenience advantages, virtualization also means less downtime when a server crashes: VMs of many applications can be stored on hard disks and reloaded at a moment's notice, regardless of which particular server is available.
If you are allowed to tinker with your computer, download the free Microsoft Virtual Server and install an OS inside it (Linux is a free and legal choice). You can run Linux programs inside this VM whenever you want and put it to sleep when you don't need it.
Once you have worked with a VM, you may be surprised to find out that it's useful in ways that you didn't anticipate. If nothing else, you will understand the new world of servers just a little bit better.
Paul Bleicher, MD, PhD, is the founder and chairman of Phase Forward, 880 Winter Street, Waltham, MA 02451, (888) 703-1122, paul.bleicher@phaseforward.com, www.phaseforward.com.
He is a member of the Applied Clinical Trials Editorial Advisory Board.