somedude1234
Contributor

Apologies in advance for the huge post...

Here are the details of my current setup:

Motherboard: SuperMicro X8SIA-F

  • Intel 3420 chipset (for Lynnfield based Xeons)
  • ICH10R SATA controller with 6x ports (Vendor: 8086, Device: 3B22)
    • I have this passed through to my Nexenta NAS VM via VT-d
    • There are 5x Samsung F4 2TB HDDs connected to this SATA controller
  • Two intel USB 2.0 controllers (Vendor: 8086, Devices: 3B34 and 3B3C)
    • One of these USB controllers (Device 3B34) is passed through to my Win7 workstation VM via VT-d
    • To the passed through controller, I connect a keyboard, mouse, sound card, and webcam
  • IPMI v2.0 via Winbond WPCM450 BMC chip
    • The BMC chip includes a legacy PCI video core that is identified as a Matrox G200eW (Vendor: 102B, Device: 532)
    • This is connected to the on-board VGA port and is also accessible via the remote IPMI console
    • This is the video adapter that ESXi is using for the console

Processor: Intel Xeon X3440

RAM: 16GB total (4x4GB) Registered ECC DDR3 (Kingston KVR1066D3Q8R7S/4G)

Add-on Cards:

  • XFX Radeon HD 6850 ZDFC (AMD GPU, Vendor: 1002, Device: 6739)
    • This is a PCIe 2.0 x16 device
    • The audio device shows up as (Vendor: 1002, Device: AA88)
    • This card (including its audio device) is passed through to my Win7 workstation VM via VT-d
  • Promise SATA300 TX4 PCI (Vendor: 105A, Device: 3D17)
    • There is only a single device connected to this card, an OCZ Vertex2 60GB SSD
    • I installed ESXi onto the SSD, and with the leftover space I created a datastore that holds the Nexenta and Win7 VMs

ESXi version: 4.1.0, 260247

Virtual machines:

  • NexentaStor [v3.0.4] (NAS - 1x vCPU + 4GB RAM reserved)
    • VT-d devices: On-board Intel ICH10R SATA controller
    • 5x 2TB HDDs used to create a RAIDZ1 Zpool which is exported to ESXi via NFS and the rest of the network via CIFS
    • VM Set to auto-start after ESXi power on
  • Windows 7 x64 [v6.1.7601] (Workstation - 4x vCPU + 2816 MB RAM reserved)
    • VT-d devices: AMD 6850 GPU+audio; Intel USB controller
    • VMware Tools v8.3.2, build-257589
    • Display adapters shown in Device Manager (note that, unfortunately, both devices are ENABLED):
      • AMD Radeon HD 6800 Series, Driver v8.850.0.0 dated 4/19/2011
      • VMware SVGA 3D (Microsoft Corporation - WDDM), Driver v7.14.1.40 dated 3/1/2010
    • For initial setup, I left display output enabled for both the VMware adapter (accessed via remote vSphere) as well as the physical displays connected to the AMD 6850
    • After I was confident that the 6850 was working reliably, including after rebooting the Win7 VM as well as the entire ESXi system, I right-clicked on the desktop, selected "Screen Resolution", and simply disabled screen output on the VMware adapter. That eliminates the problem of the mouse disappearing off of the desktop on the physical monitors and onto the virtual VMware display. If I ever need to access the console via vSphere, I simply re-enable that display output, but this is rarely needed since RDP works most of the time.
    • Note that I am not disabling the VMware SVGA 3D adapter in device manager, simply disabling the display output in "Control Panel\Appearance and Personalization\Display\Screen Resolution".
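In case it helps anyone debugging their own setup: the VT-d assignments above end up as pciPassthruN entries in the VM's .vmx file, normally written by the vSphere Client when you add the devices. A sketch of what mine amounts to, using my vendor/device IDs; the pciPassthruN.id bus addresses below are placeholders, yours will differ:

```
pciPassthru0.present = "TRUE"
pciPassthru0.vendorId = "1002"
pciPassthru0.deviceId = "6739"
pciPassthru0.id = "05:00.0"
pciPassthru1.present = "TRUE"
pciPassthru1.vendorId = "1002"
pciPassthru1.deviceId = "aa88"
pciPassthru1.id = "05:00.1"
pciPassthru2.present = "TRUE"
pciPassthru2.vendorId = "8086"
pciPassthru2.deviceId = "3b34"
pciPassthru2.id = "00:1d.0"
```

pciPassthru0/1 are the 6850's GPU and HDMI audio functions, pciPassthru2 is the USB controller. I wouldn't hand-edit these unless the vSphere UI isn't cooperating.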

Suggestions for anyone encountering BSODs when booting a Win7 VM which uses a VT-d GPU:

  1. Double-check the amount of RAM configured on your VM. It has been stated by others in this thread that anything over 2GB can cause problems. I was able to push mine up to 2816 MB through trial and error; this configuration works for me, YMMV. My suggestion is to start at 1.5GB and get the GPU stable before trying to push the VM's RAM up.
  2. Remember to ensure that 100% of your configured VM RAM is reserved for any VM which takes VT-d devices.
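For point 2, the full reservation shows up in the .vmx as sched.mem.min equal to memsize. You should set this through the vSphere Client's Resources tab rather than by hand; this sketch (using my 2816 MB figure) is just what the setting amounts to:

```
memsize = "2816"
sched.mem.min = "2816"
```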

Other suggestions:

  1. Disable sleep mode in the Control Panel power management settings. Whenever my Win7 VM went to sleep, I couldn't wake it back up via the USB keyboard or mouse (note my USB controller is passed through via VT-d). The only thing that worked for me was to connect via the remote vSphere console and click on the black screen. This would wake up the VM so I could log back in via the physical console. This is something to try if you ever encounter a black screen on a previously working VT-d GPU display.
  2. Set cpuid.coresPerSocket per VMware KB1010184 if you're having trouble getting all of your cores to show up in the Win7 VM (not really specific to passing through GPUs, but it is a tweak I had to do for my setup).
  3. Don't rely solely on Flash to determine whether ESXi + hardware + drivers are playing nicely together. Flash has a history of issues when it comes to GPU acceleration. My validation steps consisted of many VM reboot cycles to ensure I wasn't going to encounter any more BSODs during startup. This was followed by full-screen video playback in various players (VLC, XBMC, MPC-HC, Windows Media Player), and then testing with a number of games.
  4. Note that you might not see your VM's boot sequence on the VT-d GPU display. On my system, when the VM is rebooting, the physical screens are black until the Win7 login screen appears. The actual boot sequence (VM POST, Win7 loading screen) is visible via the vSphere remote console. I guess the VMware display adapter defaults to being the primary. There might be a way to change this in the VM BIOS, but I haven't bothered to do so.
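Regarding item 2 above, the KB1010184 tweak is a one-line .vmx addition. For my 4-vCPU Win7 VM, this presents all four vCPUs as a single quad-core socket:

```
numvcpus = "4"
cpuid.coresPerSocket = "4"
```

With coresPerSocket unset, ESXi presents each vCPU as its own single-core socket, and Windows 7's two-socket license limit means anything past the second socket is ignored.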