VMware Cloud Community
jdamhoff
Contributor

Adding another host: new host not seeing existing SAN

I have 2 older hosts on vSphere 6.0 and a Dell Equallogic SAN. I'm trying to add a new Dell host so I can eventually retire one of the older hosts. The problem is that the new server cannot see the Equallogic. I have added the new server into the hostgroup on the SAN, copied the iSCSI identifier and added it to the identifier inside vmware on the iSCSI software adapter. I'm missing something but I'm not sure what it is. Thank you for any feedback!

6 Replies
a_p_
Leadership

Welcome to the Community,

I have added the new server into the hostgroup on the SAN, ...

I assume that you copied the ESXi host's IQN, and used it for the host object on the storage system.

... copied the iSCSI identifier and added it to the identifier inside vmware on the iSCSI software adapter

Not sure about this one. What you need to do on the ESXi host - after creating the iSCSI software adapter, configuring networking, ... - is to simply add the EqualLogic's IP address (or the group IP address) to the Dynamic Discovery targets.
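
If you prefer the command line, roughly the same can be done from the ESXi Shell. This is only a sketch - the adapter name (vmhba64) and the EqualLogic group IP (192.168.16.100) are placeholders you'd replace with your own values:

# enable the software iSCSI adapter, if not done already
esxcli iscsi software set --enabled=true
# add the EqualLogic group IP as a Dynamic Discovery (send target) address
esxcli iscsi adapter discovery sendtarget add --adapter=vmhba64 --address=192.168.16.100:3260
# rescan the adapter so the volumes show up
esxcli storage core adapter rescan --adapter=vmhba64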

André

frostyk
Enthusiast

Did you set up a VMkernel port (vmk) for iSCSI on the host's vSwitch? Check this step-by-step guide:

DiskStation Manager - Knowledge Base | Synology Inc.

This is for Synology specifically, but it's a good step-by-step regardless of the storage being used.
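
If it helps, the rough esxcli equivalent of those steps looks like this - just a sketch, with the port group name, vmk/vmhba numbers and the IP address as examples to adjust for your setup:

# create a port group for iSCSI on the vSwitch with the storage uplinks
esxcli network vswitch standard portgroup add --portgroup-name=iSCSI --vswitch-name=vSwitch1
# create a VMkernel interface on that port group with an IP on the iSCSI subnet
esxcli network ip interface add --interface-name=vmk2 --portgroup-name=iSCSI
esxcli network ip interface ipv4 set --interface-name=vmk2 --ipv4=192.168.16.31 --netmask=255.255.255.0 --type=static
# bind the VMkernel interface to the software iSCSI adapter (port binding)
esxcli iscsi networkportal add --adapter=vmhba64 --nic=vmk2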

jdamhoff
Contributor

Thank you a.p. and frostyk. Both of your suggestions were very helpful. I am now able to see the SAN, but I'm having issues when I attempt to vMotion a virtual machine to the new server. Specifically, this error:

vMotion migration [-1408233452:1489590957618118] failed to read stream keepalive: Connection closed by remote host, possibly due to timeout

Migration to host <192.168.16.30> failed with error msg.vmk.status.VMK_MIG_CONN_CLOSED (195887167).

Migration [-1408233452:1489590957618118] failed to connect to remote host <172.16.16.20> from host <192.168.16.30>: Timeout.

vMotion migration [-1408233452:1489590957618118] failed to create a connection with remote host <172.16.16.20>: The ESX hosts failed to connect over the VMotion network

Failed waiting for data. Error 195887371. The ESX hosts failed to connect over the VMotion network.

I have the networking set up almost identically to one of the old existing hosts.

  • Service console IP changed
  • vSwitch2: IP is the next in the pattern, 16.30; the existing hosts use 16.10 & 16.20
  • vSwitch3: IPs are the next in the pattern, 255.17 & 18; the existing hosts use 13, 14, 15 & 16

My question is: how do I narrow down where the problem is now?

New host networking (screenshot attached: pastedImage_0.png)

Existing host networking (screenshot attached: pastedImage_3.png)

a_p_
Leadership

Please double-check that only the vMotion port groups on the hosts have "vMotion" checked/enabled in their settings, and that all hosts can reach each other over the vMotion network (using vmkping <vmotion-IP-Addresses>).
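
For example, from the new host (vmk2 and 172.16.16.10 are just placeholders for your actual vMotion VMkernel interface and the other host's vMotion IP):

# basic reachability over the vMotion VMkernel interface
vmkping -I vmk2 172.16.16.10
# same test with don't-fragment and a 1472-byte payload to rule out MTU problems (use ~8972 with jumbo frames)
vmkping -I vmk2 -d -s 1472 172.16.16.10
# show which services (vMotion, Management, ...) the interface is tagged for
esxcli network ip interface tag get -i vmk2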

Btw., it seems you've been using ESX for a long time. The "Service Console" is something that was only available on classic ESX hosts and doesn't exist on ESXi anymore. I'd actually consider sticking with a single Management port group on ESXi.

Something else to consider - with the environment you have - is to set up Multiple-NIC vMotion in vSphere (2007467) | VMware KB to take advantage of the two vmnics. This can effectively double the vMotion speed.
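
Roughly, the second vMotion port group on a standard vSwitch could be set up like this - again only a sketch, where the port group names, vmk/vmnic numbers and the IP address are examples:

# two vMotion port groups with mirrored active/standby uplinks
esxcli network vswitch standard portgroup add --portgroup-name=vMotion-1 --vswitch-name=vSwitch2
esxcli network vswitch standard portgroup add --portgroup-name=vMotion-2 --vswitch-name=vSwitch2
esxcli network vswitch standard portgroup policy failover set --portgroup-name=vMotion-1 --active-uplinks=vmnic2 --standby-uplinks=vmnic3
esxcli network vswitch standard portgroup policy failover set --portgroup-name=vMotion-2 --active-uplinks=vmnic3 --standby-uplinks=vmnic2
# second vMotion VMkernel interface, tagged for vMotion
esxcli network ip interface add --interface-name=vmk3 --portgroup-name=vMotion-2
esxcli network ip interface ipv4 set --interface-name=vmk3 --ipv4=172.16.16.21 --netmask=255.255.255.0 --type=static
esxcli network ip interface tag add -i vmk3 -t VMotion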

André

NelsonCandela
Enthusiast

Dear jdamhoff,

I ran into the same problem when trying to move certain machines, with the following error messages:

vMotion migration [176163381:1513151641969917] (0-74571369681560) failed to receive 68/68 bytes from the remote host <xxx.xxx.xxx.xxx>: Connection closed by remote host, possibly due to timeout

vMotion migration [176163381:1513151641969917] failed to send init message to the remote host <xxx.xxx.xxx.xxx>

Migration to host <xxx.xxx.xxx.xxx> failed with error msg.vmk.status.VMK_MIG_CONN_CLOSED (195887167).

The only thing that helped in my case was to disable EVC mode on that cluster - et voilà - vMotion worked again.

Other than that I have no explanation for this weird behaviour ...

NelsonCandela
Enthusiast

I did some more research, so I'd like to add a more detailed explanation.

The VMs I could not migrate had not been shut down in a while.

The underlying problem here was that a VM, as long as it hasn't been shut down, keeps running with the older EVC baseline (newer: Sandy Bridge, older: Nehalem). The EVC setting of the cluster these VMs reside in had been changed a few months ago, and the VMs hadn't been moved since then.
A VM will only pick up the latest cluster EVC baseline after being shut down completely and powered on again; a guest OS reboot won't suffice. Disabling EVC mode then at least allows a proper vMotion, but in order to actually apply the cluster's EVC baseline, the VMs will eventually have to be shut down first and then powered back on, so that everything works together again.