Lately I’ve been noticing what I think is an interesting yet disturbing new trend: whenever a problem initially seems unusual or unexplainable, it’s not uncommon to quickly point to virtualization as the likely source of the issue. I fully understand that virtualization adds a new variable to the basic equation, and that it represents greater complexity due to more moving parts. But it seems to get rushed to the head of the suspect list. In some respects it reminds me of the days when many tech support calls started off with “How much memory do you have? Oh, just that much? That’s your problem.”


Now that’s not to say that virtualization is without some additional concerns that must be added to the mix, some of which can radically affect results. But as far as true issues that virtualization introduces, I’ve run into just two so far. First, some guest operating system and database monitoring tools are very sensitive to the real-time clock, and any skew between the host and the guest VM may yield slight variations that can affect some results. Second, some older Windows versions have weaker memory management that doesn’t seem to work 100% reliably on VMs. Shy of these, I’ve yet to encounter any problem where I could point the finger at the virtual machine software (e.g., VMware).
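To make the clock-skew point concrete, here’s a minimal Python sketch (the timed_query helper and its names are mine, purely for illustration). It shows why a timing-sensitive monitoring tool can report slightly different figures inside a VM: the wall clock can be stepped or slewed mid-measurement by the hypervisor’s time synchronization, while a monotonic counter always ticks forward steadily.

```python
import time

def timed_query(run_query):
    """Time a single operation two ways: with the wall clock
    (which hypervisor time sync or NTP may adjust mid-measurement)
    and with a monotonic counter (immune to clock adjustments)."""
    wall_start = time.time()          # wall clock: subject to skew corrections
    mono_start = time.perf_counter()  # monotonic: never jumps backward or forward

    run_query()

    wall_elapsed = time.time() - wall_start
    mono_elapsed = time.perf_counter() - mono_start

    # On a guest whose clock is being corrected to match the host, these two
    # figures can disagree; the monotonic one is the trustworthy measurement.
    drift = wall_elapsed - mono_elapsed
    return wall_elapsed, mono_elapsed, drift

if __name__ == "__main__":
    wall, mono, drift = timed_query(lambda: time.sleep(0.5))
    print(f"wall: {wall:.6f}s  monotonic: {mono:.6f}s  drift: {drift:+.6f}s")
```

A monitoring tool that times everything off the wall clock, as many older ones do, will faithfully report those small drifts as variations in the thing being measured, which is exactly the kind of puzzling result that gets blamed on the VM.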


So when someone says “I can’t explain problem X, and it doesn’t occur on a non-virtualized machine that’s set up exactly the same,” please accept such arguments only after a fair and reasonable amount of verification. Let’s not waste precious resources and cycles chasing such an easy scapegoat. There are issues where virtualization is legitimately the culprit; it’s just not most of the time. So let’s not promote it from obscurity to the head of the class.