I have a cluster with 6 vSphere ESXi 5 hosts. They all share a distributed switch with multiple VLANs configured.
Regardless of which host I create a VM on or which VLAN it uses, when the VM first boots up its network connectivity is impaired. The VM boots fine, and the NIC shows connected. However, when trying to access the network the problems start.
On Windows Server 2008 R2, outbound pings work fine. However, web browsing in IE returns malformed data messages, and Network Location Awareness does not function (so the network reports limited connectivity).
On web servers, trying to hit the web page results in an "abnormally terminated connect" error.
Somewhat by accident, I found that if I vMotion the VM to another host it works fine. This has held true across all of my hosts. The issue seems to only affect the VM on the host it originally started on.
Any ideas would be helpful.
Hello, do you get these errors on all VMs, or only on certain VMs at certain times?
How many physical ports do you have per server? What load balancing teaming policy are you using?
My first guess is a configuration mismatch between the vmnics and the physical switch ports, but only on some ports, which would explain why the problem goes away when the VMs are moved. When pings go through but other traffic does not, this can be caused by a so-called duplex mismatch: one side of the link has duplex hard-set (e.g., full) while the other side is set to auto-negotiate.
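To check for a duplex mismatch on the ESXi side, you can compare the negotiated speed/duplex of each uplink against what the physical switch ports report. A quick sketch using the standard ESXi 5 CLI (the vmnic name `vmnic0` is just an example, substitute your own uplinks):

```shell
# List all physical NICs with their link state, speed and duplex
# as negotiated on each host. Any NIC showing Half duplex, or a
# speed that differs from its switch port, is a suspect.
esxcli network nic list

# Show full details (including auto-negotiation status) for one
# uplink; repeat for each vmnic that backs the distributed switch.
esxcli network nic get -n vmnic0
```

Run this on the host where the freshly booted VM misbehaves, then check the matching switch port (e.g., `show interface status` on a Cisco switch) to confirm both ends agree. If one end is forced to 1000/Full while the other is on auto, set both to auto-negotiate rather than hard-coding both sides.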