This one has me tearing my hair out!
I've been running a nested ESXi lab on Workstation for a while, and a few days ago the VMs nested on the ESXi host stopped booting. They take 100% of one of my physical cores and won't boot, and I can't get in to see what's hogging the CPU. Other VMs on Workstation outside the nested environment are running fine. My setup:
My machine:
Quad-core i5-4460, 32GB of RAM, Samsung SSD
Workstation 11:
2x ESXi hosts, each with 2 single-core vCPUs allocated (relevant .vmx settings below)
2x Server 2012 Core, 2x Server 2012 with a GUI; all 4 have 1 single-core vCPU allocated and were running on one host with no issues
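For anyone comparing setups: the nested hosts depend on VT-x/EPT being passed through to the ESXi VMs. These are the .vmx lines worth double-checking on each ESXi VM (a minimal sketch based on my config; the guestOS value depends on your ESXi version):

    guestOS = "vmkernel5"
    vhv.enable = "TRUE"
    numvcpus = "2"
    cpuid.coresPerSocket = "1"

vhv.enable is what the "Virtualize Intel VT-x/EPT or AMD-V/RVI" checkbox in the VM's processor settings writes out; without it, the nested ESXi can't power on 64-bit guests at all.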
I had moved the files for one of the ESXi hosts to another drive, so I initially put the performance issues down to that. However, it hasn't improved and the lab has become unusable.
I uninstalled Workstation entirely and reinstalled. I also upgraded to 12.5; still unusable.
I installed a fresh ESXi host and one Server 2012 R2 VM on Workstation and am running only those two VMs, and I still can't even install a VM on the ESXi host: it takes 100% of the core allocated to it and stays there. The setup files take half an hour or more to load!! Screenshots below are from the initial boot of a VM on ESXi (top is the Win10 VM, bottom is the ESXi host).
So my question: does anyone have any idea WHY this is happening?? Or any tips on running nested virtualisation? (As I say, it was running perfectly fine until a few days ago.) I've messed around with resource allocation within vSphere, given the host fewer vCPUs, more vCPUs, and none of it is working!! I can't understand how this was working perfectly fine and now won't even let me install a VM!!
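In case it helps with diagnosis, this is roughly how I've been watching the CPU from the ESXi side over SSH (assuming SSH / the ESXi Shell is enabled on the host):

    # From an SSH session to the ESXi host:
    esxtop        # starts in the CPU view; press 'c' to get back to it
    # Find the stuck VM's world in the list. High %RDY means the guest wants
    # CPU but isn't being scheduled; high %USED with a frozen guest means
    # something is genuinely spinning inside or underneath the VM.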
Thanks,
Dermot
As per a few other posts here, I uninstalled antivirus (Avast in my case), problem solved!! I can't believe it was that simple but there you go.
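For anyone else hitting this: a quick way to check whether something on the Windows side (AV, Hyper-V, etc.) is sitting on the virtualisation extensions is Sysinternals Coreinfo from an admin command prompt:

    coreinfo -v
    # -v dumps only the virtualization-related features: look for VMX/EPT
    # (Intel) or SVM/NP (AMD), and for HYPERVISOR, which indicates a
    # hypervisor is already running underneath Windows.

I'm assuming Avast's hardware-assisted virtualisation feature was what grabbed VT-x in my case; after uninstalling it, the nested VMs boot normally again.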