1 Reply Latest reply on Feb 12, 2016 2:42 PM by a.p.

    vMotion Performance

    CCDC Lurker
      • Installed 3 new hosts – Dell R430 with 8 network ports each (1Gb/s each)
      • Shared storage - EqualLogic PS6100SX
        • dedicated SAN with 4 NICs from each host spread across two switches
      • Hosts are licensed with vSphere Essentials Plus
      • All three hosts are running ESXi 5.5u3

      I have vMotion up and running, but I am disappointed in the performance. A very small VM (2 vCPUs and 4GB of RAM) that runs just our anti-virus software takes about 45 seconds to migrate. I figure if I ever needed to migrate all the VMs off a host for some serious reason, it would take far too long to fully evacuate the host.


      I am using one of the iSCSI vmkernel ports on our SAN network for vMotion. This port uses jumbo frames and is limited to 1Gb/s. I believe the poor vMotion performance is due to a few issues:


      1. Shared network traffic with the SAN. I am sure there is contention with SAN traffic. I'm also not sure jumbo frames are the best choice for vMotion traffic.
      2. Bandwidth being limited to 1Gb/s. Not sure if this is as important as the issue above.
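(As a sanity check before spending money, the current vmkernel setup can be inspected from an ESXi SSH session. The interface name `vmk1` and the target IP below are placeholders, not values from this thread — substitute your own.)

```shell
# Inspect vmkernel interfaces and their MTU on an ESXi 5.5 host
esxcli network ip interface list

# Show which services (e.g. VMotion) are tagged on a given vmkernel port
esxcli network ip interface tag get -i vmk1

# Show link speed per physical NIC
esxcli network nic list

# Verify jumbo frames actually pass end-to-end without fragmentation
# (8972 bytes = 9000-byte MTU minus IP/ICMP headers)
vmkping -d -s 8972 192.168.130.10
```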


      I believe I need to physically separate the vMotion network. The big question is whether this new network needs to run at 1Gb/s or 10Gb/s to increase performance. Of course 10GBase-T comes at a large cost, but do I need to spend big dollars on:

      • 2-port 10GBase-T NICs in each host (I have room in each host for this)
      • Small 8-port 10GBase-T switch


      If I need a physically separate vMotion network, I guess it will need its own dedicated IP scheme. We have 10.0.0.x for our devices network and 192.168.130.x for our SAN, so we could use 192.168.140.x for vMotion.


      Before I start spending a bunch of money, I wanted to get input from others who may be more knowledgeable about vMotion than I am.


      Thanks for your time and help. I look forward to your comments and suggestions.

        • 1. Re: vMotion Performance
          a.p. Guru

          vMotion can easily saturate NICs, so you shouldn't run vMotion on the same NICs as the storage traffic. Anyway, other than purchasing a 10Gbit/s infrastructure, you could also add e.g. an additional 4-port NIC to each host and configure Multi-NIC vMotion, which will speed up the migration by roughly 4x (up to 16 1Gbit/s NICs are supported).
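          A minimal esxcli sketch of Multi-NIC vMotion on a standard vSwitch: one vmkernel port per uplink, with each port group pinned to a different active NIC. All switch, port group, uplink, and address names below are examples, not values from this environment — adjust them to match your hosts.

          ```shell
          # Two vMotion port groups on the same vSwitch, each pinned to one uplink
          esxcli network vswitch standard portgroup add -p vMotion-1 -v vSwitch2
          esxcli network vswitch standard portgroup add -p vMotion-2 -v vSwitch2
          esxcli network vswitch standard portgroup policy failover set -p vMotion-1 --active-uplinks=vmnic4
          esxcli network vswitch standard portgroup policy failover set -p vMotion-2 --active-uplinks=vmnic5

          # One vmkernel interface per port group, on the dedicated vMotion subnet
          esxcli network ip interface add -i vmk3 -p vMotion-1
          esxcli network ip interface add -i vmk4 -p vMotion-2
          esxcli network ip interface ipv4 set -i vmk3 -I 192.168.140.11 -N 255.255.255.0 -t static
          esxcli network ip interface ipv4 set -i vmk4 -I 192.168.140.12 -N 255.255.255.0 -t static

          # Tag both interfaces for vMotion so it can use them in parallel
          esxcli network ip interface tag add -i vmk3 -t VMotion
          esxcli network ip interface tag add -i vmk4 -t VMotion
          ```

          Repeat the same layout on every host (with unique IPs); vMotion then streams across all tagged interfaces at once.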