wbuntin
Contributor

vMotion Network Configuration Problems

Hey all,

First post, and I need some help from you folks.

The setup is as follows:

2 ESX 4.1 hosts, Dell R710s with 12 NICs each.

2 Dell PowerConnect 6224 switches

1 Dell MD3200i PowerVault

vSwitches are configured on both hosts for vMotion traffic and are called vSwitch2, with one VMkernel port called vMotion. Each vSwitch has 4 NICs attached and all are active. Both servers use the same vmnic ports as well. The vMotion VMkernel port on each host has a unique IP address in the same range as the one on the other server.
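For reference, that setup can be built from the ESX service console roughly like this (just a sketch; the vmnic numbers and the 192.168.102.x addresses are my own placeholders, and vMotion itself still gets enabled on the VMkernel port through the vSphere Client):

# create vSwitch2 and attach the four active uplinks (vmnic numbers assumed)
esxcfg-vswitch -a vSwitch2
esxcfg-vswitch -L vmnic2 vSwitch2
esxcfg-vswitch -L vmnic3 vSwitch2
esxcfg-vswitch -L vmnic4 vSwitch2
esxcfg-vswitch -L vmnic5 vSwitch2
# add the vMotion port group and tag it with the vMotion VLAN
esxcfg-vswitch -A vMotion vSwitch2
esxcfg-vswitch -v 102 -p vMotion vSwitch2
# give this host's VMkernel port its unique address in the vMotion range
esxcfg-vmknic -a -i 192.168.102.11 -n 255.255.255.0 vMotion

The second host gets the same commands with its own address, e.g. 192.168.102.12.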

Both pSwitches are configured with the same VLANs on the same ports as each other, and both work independently of each other, not stacked. Each switch is connected to the production LAN by means of an uplink port to a separate switch on the production LAN.

The current VLANs are as follows:

Default:  Production
VLAN101:  iSCSI
VLAN102:  vMotion

Each VLAN utilises the same ports on both switches.
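On the PowerConnect side, each 6224 has roughly the following per-port VLAN setup (a sketch from memory only; the port number 1/g10 and the general/tagged mode are assumptions, the real ports and mode may differ):

console# configure
console(config)# vlan database
console(config-vlan)# vlan 101
console(config-vlan)# vlan 102
console(config-vlan)# exit
console(config)# interface ethernet 1/g10
console(config-if-1/g10)# switchport mode general
console(config-if-1/g10)# switchport general allowed vlan add 102 tagged
console(config-if-1/g10)# exit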

Now the problem is this: if I connect my vMotion pNICs to the same switch, vMotion works perfectly. To get some redundancy, I would like to connect 2 pNICs to each switch, so that if one switch fails vMotion will still work. I have tested the setup using vmkping from each host to the other, and if all the pNICs are connected to the same switch then everything works perfectly; it is only when they are split across both switches that it does not.
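The test itself is just a vmkping from each host's service console to the other host's vMotion address, for example (addresses as assumed above):

# from host 1, ping host 2's vMotion VMkernel address
vmkping 192.168.102.12
# and check the uplink link state and the vSwitch layout
esxcfg-nics -l
esxcfg-vswitch -l

The idea would be to repeat the vmkping while pulling the cables to one switch at a time to prove the redundancy.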

What would you recommend in this scenario?

Thanks for your help, and if you need any more info please ask.

Ace007
Enthusiast

Hi,

I suggest you keep at least 2 pNICs connected to each of your switches.

Create a new vSwitch and move the VMkernel and Service Console ports to it, and use another vSwitch for only VMs. Then you will have something like this:

vSwitch 01 = VMkernel port and Service Console, with 2 physical NICs

vSwitch 02 = virtual machine port group, with 2 physical NICs

It is safe, and I also have the same thing in my network.

I hope it will help you.
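If it helps, the layout I mean looks roughly like this from the command line (only a sketch; the vmnic numbers and the port group name are assumptions, yours will differ):

# vSwitch0: Service Console and VMkernel, with two uplinks
esxcfg-vswitch -L vmnic0 vSwitch0
esxcfg-vswitch -L vmnic1 vSwitch0
# vSwitch1: virtual machine traffic only, with its own two uplinks
esxcfg-vswitch -a vSwitch1
esxcfg-vswitch -A "VM Network" vSwitch1
esxcfg-vswitch -L vmnic2 vSwitch1
esxcfg-vswitch -L vmnic3 vSwitch1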

wbuntin
Contributor

Hey all,

Just thought I'd drop in with an update on this issue.

Removed all the cabling from the server and found that one of the PCI devices was causing ESXi to purple screen.

Removed both of the Broadcom NICs from the server, reseated the riser and the NICs, and rebooted the server.

vMotion is now working again.

I wondered if I'd caused a loop on the switch and the switch had shut down the ports, but I checked and double-checked all the cabling, even going to the extent of re-cabling to different ports and with different coloured cables.

Anyway, problem resolved, and I now have my systems working as they should be.

Thanks, Ace, for the reply, but it wasn't really what I needed.
