VMware Cloud Community
stmux
Enthusiast

esx linkstate down

I have two identical hosts:

AMD 8350 eight core

Asrock 990FX Extreme 3

32 GB RAM

2-Intel 82571EB Dual Gigabit NICS

2-2TB Seagate Hard Drives

Hosts ESXi 5.5.0.2068190

vSphere with Operations Manager

vCenter Server 5.5.0.2001466

Since the latest update, one of the hosts keeps getting an esx.linkstate.down error. A red exclamation mark appears on the host in the vSphere Client, the host's connection to vCenter Server stops, and Operations Manager shows a red badge. All VMs on the host appear to be running and show green arrows, and I can ping all of them over the internal network. The external/Internet-connected NIC, however, is no longer reachable. This never occurs on the second, identical host. The behavior is strange because the hosts connect to vCenter Server via the internal network. Does anyone have advice on how to figure this out?

Thanks,

mux

2 Replies
MyuFox
Enthusiast

Look at the logs. If the uplink is being lost, it is an issue with the physical network: the NIC, the cable, or the switch. Check the NIC firmware and driver versions, and also check the switch logs. What do they say?
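To make that concrete (a sketch, not part of the original reply): on the host, `esxcli network nic list` shows the driver in use for each vmnic, and link transitions are logged in /var/log/vmkernel.log. Assuming typical vmkernel wording along the lines of "vmnicX: link down" (verify the exact format against your own log, since it varies by driver), a small script like this can pull out the flap history:

```python
import re

# Hypothetical vmkernel.log excerpt for illustration only; real message
# formats vary by NIC driver, so adjust the pattern to match your log.
SAMPLE_LOG = """\
2015-01-10T02:14:01Z vmkernel: vmnic1: link down
2015-01-10T02:14:07Z vmkernel: vmnic1: link up, 1000 Mbps, full duplex
2015-01-10T03:41:55Z vmkernel: vmnic1: link down
"""

# Timestamp, NIC name, and up/down state from one log line.
LINK_EVENT = re.compile(
    r"^(?P<ts>\S+)\s+.*\b(?P<nic>vmnic\d+):\s+link\s+(?P<state>up|down)"
)

def link_events(log_text):
    """Return (timestamp, nic, state) tuples for every link transition."""
    events = []
    for line in log_text.splitlines():
        m = LINK_EVENT.search(line)
        if m:
            events.append((m.group("ts"), m.group("nic"), m.group("state")))
    return events

for ts, nic, state in link_events(SAMPLE_LOG):
    print(ts, nic, state)
```

Clusters of down/up pairs at odd hours usually point at a cable, switch port, or power-management issue rather than at ESXi itself.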

stmux
Enthusiast

The error occurred again last night. The log shows: esx.linkstate.down, "Host cannot communicate with vCenter Server." Both physical NICs show as up and linked. I shut both servers down and compared the BIOS settings; all are identical. However, I found the CPU in the host that does not have the problem idling at 48 degrees C, while the CPU in the host that keeps having the problem idles at 60 degrees C. If it idles at 60 C, I can imagine it is well above that under load. This leads me to believe the system may be becoming unstable. I am going to swap the CPUs and see what happens.

Thanks for your feedback,

mux
