Virtualization

As the number of tasks in the IT business grows, the question of allocating resources competently arises. Virtualization helps to solve this problem in part. The term refers to creating an infrastructure that is independent of the underlying hardware. The technology has many advantages. First of all, it reduces costs: fewer hardware servers are required, they consume less electricity, and they take up less space.


In small and medium-sized businesses there is a widespread belief that this technology is relevant only for large companies. That is just one of many myths about virtualization, myths that prevent managers and specialists from taking full advantage of the solution.

Virtualization is the same as cloud computing.

These two concepts are different. Cloud computing is something that virtualization makes possible: the term refers to accessing shared computing resources, whether data or programs, over the Internet. Server virtualization can be used without any cloud technologies at all; cloud services can then be added later to expand the platform's capabilities.

Virtualization is interesting only for large companies.

According to this myth, virtualization does not pay off for small and medium-sized firms, which would have to deploy a complex and costly solution. In fact, virtualization is profitable regardless of the size of the business. In a small company it is quite feasible to place all services on virtual machines running on a single hardware platform, avoiding the purchase of additional servers, whose cost can be significant for a small firm. Even an environment of just two servers is already a candidate for virtualization.

Virtualization dramatically reduces overall system performance.

In practice, modern processors rarely use their full hardware capacity. Most of the time the hardware runs at idle, half asleep. This is especially true of domain controllers, DNS services, and antivirus management servers: dedicating a separate physical server to each such service is simply irrational. It is therefore quite possible to move several undemanding services onto virtual machines gathered on a single host system, and performance will not suffer. Still, the decision should not be taken lightly. Any system has its speed limits, which must be taken into account when virtualizing. It is worth running performance tests on a virtual machine before deploying a new service on a given host. Keep in mind that each virtual machine needs up to 20% of additional resources, on top of its own requirements, for its upkeep, and the host system itself needs free capacity as well.
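To make that rule of thumb concrete, here is a minimal back-of-the-envelope sketch in Python. The 20% overhead figure comes from the paragraph above; the sample services and memory numbers are purely illustrative:

```python
# Rough consolidation check: do these services fit on one host?
# Assumes ~20% virtualization overhead per VM (figure from the article)
# plus a reserve for the host system itself; all numbers are illustrative.

VM_OVERHEAD = 0.20      # extra resources each VM needs for its own upkeep
HOST_RESERVE_GB = 4     # RAM kept free for the hypervisor/host system

# (service, RAM the service itself needs, in GB)
workloads = [
    ("domain-controller", 4),
    ("dns",               2),
    ("antivirus-center",  6),
]

host_ram_gb = 32

needed = sum(ram * (1 + VM_OVERHEAD) for _, ram in workloads)
available = host_ram_gb - HOST_RESERVE_GB

print(f"VMs need {needed:.1f} GB, host offers {available} GB "
      f"-> {'fits' if needed <= available else 'does not fit'}")
```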

Virtualization requires special equipment.

This myth is reinforced by talk, frightening to an uninformed specialist, of blade systems, specialized servers, and the like. It owes its existence to the presentations and conferences held by manufacturers of expensive specialized equipment, such as HP or IBM, where hardware for building virtual solutions, those same blade systems, is demonstrated. The underlying reasoning, however, is flawed. Expensive, proven systems designed specifically for virtualization are certainly convenient, but virtual services can also be deployed on ordinary hardware, as long as it is powerful enough for the task. There are some limitations: modern hypervisors may not support certain hardware, so self-assembled servers are not always an option, and problems can arise with non-standard RAID controllers and network cards. Even then there are workarounds. For example, RAID can be built in software, or a network card the hypervisor supports can be added. Even an old HP G4 server can, without extra effort, become the home for a couple of undemanding virtual machines, saving rack space and the cost of a new server.
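As an illustration of the software RAID workaround, here is a minimal sketch that builds a RAID-1 mirror with the standard Linux mdadm tool. The device names are placeholders, and this assumes a Linux-based host whose RAID controller the hypervisor cannot use:

```python
# Minimal sketch: build a software RAID-1 mirror with Linux mdadm
# instead of relying on a RAID controller the hypervisor cannot see.
# Requires root; device names are placeholders -- run only on disks
# you are prepared to wipe.
import subprocess

DEVICES = ["/dev/sdb", "/dev/sdc"]   # placeholder member disks
ARRAY = "/dev/md0"

subprocess.run(
    ["mdadm", "--create", ARRAY,
     "--level=1",                       # RAID-1 (mirror)
     f"--raid-devices={len(DEVICES)}",
     *DEVICES],
    check=True,
)

# Show the resulting array so the outcome can be verified.
subprocess.run(["mdadm", "--detail", ARRAY], check=True)
```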

All high-quality virtualization software is expensive.

They say, not without reason, that free cheese is found only in a mousetrap. But does that apply to hypervisors? Here the picture is actually optimistic. There are several free products on the market, such as VMware ESXi, Citrix XenServer, and Hyper-V in Windows Server 2008 Standard (64-bit), and they handle the required tasks. These are the junior versions of powerful commercial solutions, but the engine is exactly the same as in the older paid counterparts, with the same working ideology and the same virtual machine format. The developers expect that as a company and its infrastructure grow, it will move to the paid editions, which extend the platform's functionality, and this can be done without reinstalling the virtual machines. Comparing the paid and free versions shows that the basic functions are freely available: the hypervisor itself, conversion, work with different types of storage, and moving virtual machines between host servers without interrupting their work.

Virtualization systems are difficult to maintain.

Today, virtually all modern virtualization systems are managed through a graphical application; fans of fine-tuning can still work with the command line. To increase a machine's RAM or disk space, or to add a processor, the administrator no longer needs to walk over to the server: all of this can be done from the workplace, managing the production server's virtual environment from a console.
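As a sketch of what such console management can look like, here is a minimal example using the libvirt Python bindings. The connection URI and the VM name vm1 are assumptions for illustration; VMware, Citrix, and Microsoft each ship their own equivalent tooling:

```python
# Minimal sketch: resize a VM's memory and vCPUs from your workstation
# via libvirt (pip install libvirt-python). The URI and VM name are
# placeholders; SSH access to the host is assumed.
import libvirt

conn = libvirt.open("qemu+ssh://admin@prod-host/system")  # remote host
dom = conn.lookupByName("vm1")

# Set the VM's memory to 8 GiB in its persistent configuration
# (takes effect on next boot; assumes the VM's maximum memory allows it).
dom.setMemoryFlags(8 * 1024 * 1024, libvirt.VIR_DOMAIN_AFFECT_CONFIG)

# Give the VM 4 vCPUs in its persistent configuration.
dom.setVcpusFlags(4, libvirt.VIR_DOMAIN_AFFECT_CONFIG)

print(dom.info())   # [state, maxMem KiB, mem KiB, vCPUs, CPU time]
conn.close()
```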


Virtualization is unreliable.

This claim rests on the assumption that a failure of the host system will take down all the virtual machines running on it. That risk, however, is offset by the speed of recovery, provided a backup of the virtual machine exists. On average the system is restored in about twenty minutes: recovery amounts to moving the virtual machine's files to another server. Large industrial solutions go further and allow replication on the fly, so that even the failure of an entire hardware server will not stop the services involved.
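Since recovery amounts to moving files and re-registering the machine, here is a minimal sketch of the idea on a libvirt-based host. The paths, host names, and use of rsync and virsh are illustrative assumptions, not a vendor's procedure:

```python
# Minimal sketch: "restore" a VM on a spare host by copying its backed-up
# files over and re-registering it. Paths and hosts are placeholders;
# real platforms (VMware, XenServer, Hyper-V) ship their own tooling.
import subprocess

SPARE_HOST = "admin@spare-host"
DISK = "/backups/vm1/vm1.qcow2"        # backed-up disk image
XML = "/backups/vm1/vm1.xml"           # backed-up VM definition

# Copy the disk image and definition to the spare host.
subprocess.run(["rsync", "-a", DISK, XML,
                f"{SPARE_HOST}:/var/lib/libvirt/images/"], check=True)

# Register and start the VM there (libvirt-based host assumed).
subprocess.run(["ssh", SPARE_HOST,
                "virsh define /var/lib/libvirt/images/vm1.xml && "
                "virsh start vm1"], check=True)
```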

Modern virtualization systems such as Citrix XenServer and VMware ESX follow the bare-metal principle: they are installed directly on the hardware.

At the core of such a system is a Unix-based OS, extremely reliable and well protected against viruses. The system is lean and optimized in its code, stripped of everything superfluous, so the hypervisor is not distracted by extraneous tasks. Hardware reliability can be ensured by buying dependable equipment, which the overall savings on servers makes affordable, and that helps put hardware problems out of mind for a long time. The decision to adopt virtualization should be carefully weighed; with careful planning, the result promises far fewer problems than several aging low-cost servers in a traditional configuration.

It is difficult to find an expert to deploy a virtualization system.

Good IT specialists are always in demand, and virtualization is no exception. The good news is that the main products in this sphere, from Microsoft, Citrix, and VMware, are well documented, and specialists regularly meet with company representatives and system integrators who can answer the most pressing questions. So even an inexperienced specialist will not be working in a vacuum. Of course, you should not entrust your infrastructure to a student or a moonlighting administrator: he will gain experience, but what will happen to the company? Fortunately, more and more professional system administrators now have basic skills in building virtualization systems.

Virtualization is a panacea for all problems.

When it comes to improving manageability, efficiency, and energy consumption, virtualization really can work wonders. But it will not do so by itself. Some IT specialists fail to study the problem from all sides, believing that moving to virtual solutions will fix everything. It is not a magic pill: without effective management and a deliberate focus on the benefits of virtualization, it will not deliver the desired effect.

Virtualization is not suitable for high-performance, I/O-intensive applications.

This myth took shape long ago, when the first hypervisors had just appeared and such complexes made irrational use of the host server's resources. Virtualization technology has since moved far ahead: VMware recently demonstrated a version of its ESX Server capable of performing more than one hundred thousand I/O operations per second on a single host.
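For a feel of how such figures are measured, here is a toy random-read probe in Python, a simplified stand-in for real benchmarks such as fio. The file path, file size, and block size are illustrative:

```python
# Toy random-read IOPS probe (a stand-in for real tools like fio).
# Creates a scratch file, then times 4 KiB random reads for one second.
# Uses os.pread, so it assumes a Unix-like guest; results include the
# page cache and are only a rough indication.
import os, random, time

PATH = "scratch.bin"
FILE_SIZE = 64 * 1024 * 1024     # 64 MiB scratch file
BLOCK = 4096

with open(PATH, "wb") as f:       # build the scratch file
    f.write(os.urandom(FILE_SIZE))

fd = os.open(PATH, os.O_RDONLY)
ops, deadline = 0, time.time() + 1.0
while time.time() < deadline:
    offset = random.randrange(0, FILE_SIZE - BLOCK)
    os.pread(fd, BLOCK, offset)   # one 4 KiB random read
    ops += 1
os.close(fd)
os.remove(PATH)

print(f"~{ops} read ops/second (rough figure, cache included)")
```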

To use virtual machines, you need to know Linux.

In the first versions of hypervisors, VMware's among them, some controls were reachable only from the command line of a Linux console. Although that way of working with the host is still available, most administrators no longer use it: many hypervisors are now managed from Windows through a graphical interface. Hypervisors keep getting simpler and more understandable, helping specialists master the technology.

Virtualization is a software layer that slows down applications.

This myth is only partially true. Some vendors, such as VMware and Microsoft, do offer solutions that run on top of Windows or Linux. But VMware ESX(i), for one, is a hypervisor that runs on bare metal, which allows server resources to be used to the fullest, without an operating system as an intermediate software layer.

You cannot virtualize Microsoft Exchange and SQL Server.

Several years ago, when single-core processors were the standard, such services, with their constant workloads, were not recommended for virtualization. But modern platforms run several processors with 4 or 8 cores each, and now even the most demanding services can be successfully implemented in a virtual environment. The key to distributing the load is proper planning and an understanding of the technology.


Virtualization works only with servers.

Many companies benefit from desktop virtualization as well. It brings centralized management, a uniform approach, and better disaster recovery options. Using a thin client or a client application, you can connect to your desktop from anywhere in the world. Disk imaging technologies can also reduce storage requirements by eliminating unnecessary duplicate copies.
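As a sketch of why disk imaging saves space, here is a minimal block-level deduplication example: identical blocks shared by several desktop images are stored only once. It is a simplification of what real imaging products do:

```python
# Minimal sketch of block-level deduplication: identical 4 KiB blocks
# across desktop images are stored only once, keyed by their hash.
# A simplification of what real disk-imaging products do.
import hashlib

BLOCK = 4096

def dedup_store(images: dict[str, bytes]) -> None:
    store: dict[str, bytes] = {}      # hash -> unique block
    total = 0
    for name, data in images.items():
        for i in range(0, len(data), BLOCK):
            block = data[i:i + BLOCK]
            store.setdefault(hashlib.sha256(block).hexdigest(), block)
            total += 1
    print(f"{total} blocks stored as {len(store)} unique blocks")

# Two "desktop images" that share most of their content (illustrative).
base = b"\x00" * BLOCK * 100
dedup_store({"desktop-a": base + b"user-a", "desktop-b": base + b"user-b"})
```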

Virtualization is not safe.

Any software can be considered insecure. However, by following best practices for networking, storage systems, and operating systems, you can build a genuinely secure environment. Virtualization lets you define your own security standards, set up policies, and run tests to ensure that the environment meets the minimum requirements.
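Such a policy test can be as simple as checking each machine's settings against a baseline. In this minimal sketch, the setting names and values are invented for illustration and do not come from any vendor's schema:

```python
# Minimal sketch: test VM settings against a security baseline.
# Setting names and values are illustrative, not a vendor schema.
BASELINE = {"ssh_password_auth": False, "snapshots_encrypted": True}

vms = {
    "vm1": {"ssh_password_auth": False, "snapshots_encrypted": True},
    "vm2": {"ssh_password_auth": True,  "snapshots_encrypted": True},
}

for name, settings in vms.items():
    failures = [k for k, want in BASELINE.items() if settings.get(k) != want]
    print(f"{name}: {'OK' if not failures else 'FAILS ' + ', '.join(failures)}")
```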
