Greetings,
Like the title says, we're having issues vMotioning VMs from one host to another. The very strange part of this is that we can vMotion from Host-1 to Host-2 instantly while the VM is running, but if we vMotion from Host-2 to Host-1, it starts at 20%, then takes about 30 minutes per 1% of progress, and when it hits around 70% it dies. I have this log from the events:
"Failed waiting for data. Error 195887167. Connection closed by remote host, possibly due to timeout. 2023-01-12T02:34:20.356701Z The migration was canceled because the amount of changing memory for the virtual machine was greater than the available network bandwidth. Attempt the migration again when the virtual machine is not as busy or more network bandwidth is available. 2023-01-12T02:34:20.70838Z vMotion migration [167772298:4986640234740089536] failed to read stream keepalive: Connection closed by remote host, possibly due to timeout".
I have NOT rebooted Host-2 because it's running two very important VMs that need to be online 24/7. I have double-checked all of our networking configuration, including the physical side.
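In case it helps anyone hitting the same error: one quick way to verify the vMotion network path itself is to ping between the hosts' vMotion vmkernel interfaces from an ESXi shell. The vmknic name (`vmk1` here) and the peer IP are assumptions for illustration; substitute your own, and skip the jumbo-frame test if your vMotion network uses the default 1500 MTU:

```shell
# From Host-2's ESXi shell, ping Host-1's vMotion vmkernel IP via the
# vMotion vmknic (vmk1 assumed). -d sets the don't-fragment bit;
# -s 8972 sends a full jumbo-frame payload (9000 MTU minus IP/ICMP headers).
vmkping -I vmk1 -d -s 8972 192.168.50.11

# If the jumbo-frame ping fails but a standard-size one succeeds, suspect an
# MTU mismatch somewhere on the path (vmknic, vSwitch, or physical switch port).
vmkping -I vmk1 -d -s 1472 192.168.50.11
```

An asymmetric failure like mine (fast one direction, stalling the other) can show up here as jumbo pings working Host-1→Host-2 but not Host-2→Host-1.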
We rebooted our switches, and it seems like vMotion is back to normal speeds... I'm assuming it was a 10G card issue.
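For anyone else suspecting their 10G card: before (or instead of) rebooting switches, the NIC's link state and error counters can be checked from the ESXi shell. `vmnic2` is just a placeholder for whichever uplink carries your vMotion traffic:

```shell
# List the physical NICs with link status, speed, duplex, and driver.
esxcli network nic list

# Dump counters for the suspected uplink (vmnic2 assumed). Watch for
# receive/transmit errors or drops climbing while a vMotion is in progress.
esxcli network nic stats get -n vmnic2
```

Climbing error counters on one host's uplink would point at the card, cable, or switch port rather than the vMotion configuration.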
Are you using VMware Cloud on AWS?
I am not; I might have posted this thread in the wrong category. I am hosting these servers myself in our office. Am I able to change the category of this post?
Next time, please post these questions under vSphere and not VMware Cloud on AWS 🙂