Desktop virtualisation is software technology that separates the desktop environment and its associated applications from the physical client device used to access it. In other words, the operating system and software no longer run directly on your PC; they run on a hypervisor platform, which in turn is installed on your PC (or server).
So the first question you may ask is why bother? – Good Question!
Well, traditionally software is installed onto an operating system, which is installed onto a PC or server. See diagram below.
If any of the hardware changes – new disk drives, new controllers, a new graphics card, etc. – the operating system has to be updated to reflect the new hardware, and in turn it's possible that the software will have to be updated too. This can be as easy as starting up your PC/server, seeing "New Hardware Detected" and "Updating Now", and all going tickety-boo; a couple of restarts later your machine is happy. BUT it can also mean hunting down drivers and having the operating system make changes here, there and everywhere to cope with the change in the underlying hardware. Sometimes this goes very wrong, and on the odd occasion can have you reaching for the install CD with the task of rebuilding your machine.

This makes upgrading hardware a worry, and is sometimes the reason people don't upgrade at all. Four years later it's off to the local store to buy the next new thing, everything configured against the new hardware, and so the cycle repeats. This is also true of server operating systems and, believe me, on a server there are a whole lot more people relying on you to get it up and running pronto.
In a virtualised environment, instead of installing your operating system directly onto your hardware, you install the hypervisor onto the hardware instead. Your operating system is then installed as a virtual machine within the hypervisor. At this point the hardware is abstracted from the OS and software. See diagram below.
If the hardware changes, the only part that has to be updated to reflect the change is the hypervisor layer. This layer upgrades very easily; in fact I've never, to date (touch wood), seen a hypervisor layer fail to cope with new hardware. The virtual desktop and its software don't need to change at all; they simply carry on talking to the hypervisor as they always did. This makes hardware upgrades even easier. In fact, it means you can theoretically take a virtual desktop from one PC and put it straight onto a new, higher-spec PC and nothing need change. The whole desktop will look, run and be identical to how it was previously. It takes the danger out of upgrades and leaves users with the same desktop experience they've always had.
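The abstraction described above can be sketched in a few lines of Python. Everything here – the class and device names included – is invented purely for illustration and doesn't correspond to any real hypervisor's API; the point is simply that the guest OS only ever sees the hypervisor's stable virtual devices, never the physical ones.

```python
# Conceptual sketch only: all names here are hypothetical, invented to
# illustrate hardware abstraction; no real hypervisor works this way in code.

class HostHardware:
    """The physical machine; swapping this out models a hardware upgrade."""
    def __init__(self, disk_model, gpu_model):
        self.disk_model = disk_model
        self.gpu_model = gpu_model

class Hypervisor:
    """Presents the same stable virtual devices regardless of the host below."""
    VIRTUAL_DISK = "virt-disk-v1"
    VIRTUAL_GPU = "virt-gpu-v1"

    def __init__(self, host):
        self.host = host  # only the hypervisor knows about the real hardware

    def devices_seen_by_guest(self):
        # The guest OS is shown these virtual devices, never self.host.
        return {"disk": self.VIRTUAL_DISK, "gpu": self.VIRTUAL_GPU}

# Upgrade the physical machine underneath the same hypervisor layer.
old_host = HostHardware(disk_model="IDE-160GB", gpu_model="GeForce-6600")
new_host = HostHardware(disk_model="NVMe-2TB", gpu_model="RTX-4070")

before = Hypervisor(old_host).devices_seen_by_guest()
after = Hypervisor(new_host).devices_seen_by_guest()
print(before == after)  # True: the guest never notices the new hardware
```

The guest's view is identical before and after the upgrade, which is exactly why the virtual desktop carries on untouched.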
Alongside this, there is a whole heap of extra benefits that come from having your desktop virtualised. The best one, I think, is snapshots – the ability to capture your desktop at a point in time; very useful if you're about to install something new and want a rollback point in case it doesn't work.
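The snapshot-and-rollback idea can be modelled as a toy in Python. This is a hypothetical sketch, not any hypervisor's actual snapshot mechanism (real snapshots work at the disk and memory level, not on a Python dict) – it just shows the save-point/roll-back workflow described above.

```python
import copy

# Toy model of snapshot and rollback; class and state names are invented
# for illustration and do not reflect any real hypervisor's implementation.
class VirtualDesktop:
    def __init__(self):
        self.state = {"installed": ["office-suite"], "theme": "light"}
        self._snapshots = {}

    def snapshot(self, name):
        # Deep-copy so later changes don't leak into the saved point in time.
        self._snapshots[name] = copy.deepcopy(self.state)

    def rollback(self, name):
        self.state = copy.deepcopy(self._snapshots[name])

vm = VirtualDesktop()
vm.snapshot("before-install")
vm.state["installed"].append("dodgy-new-app")  # the new install goes wrong...
vm.rollback("before-install")                  # ...so roll back to the snapshot
print(vm.state["installed"])  # ['office-suite']
```

Note the deep copy: a shallow copy would share the inner list, and the "rollback" would quietly keep the bad install – the same reason real snapshots must capture complete state, not references to it.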
Remote desktop virtualisation
Remote desktop virtualisation implementations operate as a client/server computing environment. Application execution takes place on a remote operating system which is linked to the local client device over a network using a remote display protocol through which the user interacts with applications. All applications and data used remain on the remote system with only display, keyboard, and mouse information communicated with the local client device, which may be a conventional PC/laptop, a thin client device, a tablet, or even a smartphone. A common implementation of this approach is to host multiple desktop operating system instances on a server hardware platform running a hypervisor. This is generally referred to as “Virtual Desktop Infrastructure” or “VDI”. It should be noted that VDI is often used incorrectly to refer to any desktop virtualisation implementation.
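The client/server split described above – applications and data staying on the remote system, with only display, keyboard and mouse traffic crossing the network – can be sketched conceptually in Python. Everything below is hypothetical and hugely simplified; real remote display protocols (RDP and the like) are far more sophisticated. The point is that the client holds no application data at all.

```python
# Conceptual sketch of remote desktop virtualisation's client/server split.
# All names are invented for illustration; no real protocol is modelled here.

class RemoteDesktopServer:
    """Applications and data live here; clients never receive the data itself."""
    def __init__(self):
        self.document = ""  # sensitive application data stays server-side

    def handle_key(self, key):
        self.document += key   # application logic executes remotely
        return self.render()   # only the rendered screen travels back

    def render(self):
        return f"[screen showing: {self.document}]"

class ThinClient:
    """Sends keystrokes and displays frames; holds no application data."""
    def __init__(self, server):
        self.server = server
        self.screen = ""

    def type(self, key):
        # Outbound: a keystroke. Inbound: a picture of the screen. Nothing else.
        self.screen = self.server.handle_key(key)

server = RemoteDesktopServer()
client = ThinClient(server)
for key in "hi":
    client.type(key)

print(client.screen)                 # [screen showing: hi]
print(hasattr(client, "document"))   # False: the data never left the server
```

This is also why the security scenario in the list below works: even a compromised or stolen endpoint contains only stale screen images, never the data itself.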
Remote desktop virtualisation is frequently used in the following scenarios:
- In distributed environments with high availability requirements and where desk-side technical support is not readily available, such as branch office and retail environments.
- In environments where high network latency degrades the performance of conventional client/server applications.
- In environments where remote access and data security requirements create conflicting requirements that can be addressed by retaining all (application) data within the data center with only display, keyboard, and mouse information communicated with the remote client.
It is also used as a means of providing access to Windows applications on non-Windows endpoints, including tablets, smartphones and non-Windows desktop PCs and laptops.
Remote desktop virtualisation is also used as a means of resource sharing, to provide low-cost desktop computing services in environments where providing every user with a dedicated desktop PC is either too expensive or otherwise unnecessary.