No, I don't know anyone else who has had success with this board with GFX passthrough. I can pass through other items fine, just not the GFX card.
I have 2GB assigned to the VM, and I've tried the pciHole.start/end config change. You need to have the VMware SVGA adapter installed, otherwise it doesn't detect the ATI card properly. I have tried Catalyst versions 12.4, 12.6 and 12.7. Windows BSODs the moment I plug in the monitor or enable it; installing the drivers etc. is fine. That's why I wanted to see if anyone else could definitely say that trying a different GFX card helped them, or whether it won't work at all.
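For anyone else wanting to try the PCI hole change I mentioned, the settings go in the VM's .vmx file. Something like the below is what's usually suggested; the values are in MB and are only example numbers, not guaranteed to suit your board or memory layout:

```
# Added to the passthrough VM's .vmx file.
# Reserves an MMIO "hole" so the GPU's BARs can be mapped.
# 1200/2200 are commonly suggested example values only.
pciHole.start = "1200"
pciHole.end = "2200"
```

Power the VM off completely before editing the file, or the change gets overwritten.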
In my experience the card makes no difference: I have tried 3 cards from 2 generations and get 100% the same results with all cards and all versions of Catalyst. Based on my findings, I think it depends on the main board, and also on which port on the main board you use. On both of my boards only 1 PCIe port works for graphics passthrough.
ASRock Z77 Extreme4-M with an HD7850 here. Just posting to show that ASRock boards seem to have trouble passing through GPUs, while other devices are fine.
Like Artwright, I get a BSOD the moment I plug the monitor in. Win 7 x64 boot also results in the same BSOD if the monitor is already plugged in.
I swapped over to an HD4670; slightly different BSOD scenarios, but I also couldn't get it working.
It would be great if anyone having success with recent-generation ASRock Intel boards could post with any details.
I got this working on Windows Multipoint Server 2011. I am using a FirePro v5800. The key was this article: http://kb.vmware.com/selfservice/microsites/search.do?language=en_US&cmd=displayKC&externalId=1011709
The default SVGA drivers don't allow multiple monitors. The OS effectively ignores other GPUs. My AMD showed up in device manager, but was simply not usable.
Once I updated the drivers, I had control again.
I am trying to enable RemoteFX compression for remote sessions (Terminal Services). While I'm getting close, I still don't think hardware compression is working.
So I have had my ESXi 5.0 install running for about a month or so with VMDirectPath passing through my GPU.
The build has worked for both the 4970 1GB from my previous desktop and also works with the 7970 that I now have.
One caveat: you cannot install the AMD Vision Center; doing so results in the VM not being able to load again.
32GB (4x8GB) G.Skill Ares DDR3-1333 CL9
EVGA GeForce 210
Adaptec 1430SA (FreeNAS)
VisionTek Reference 7970 (Windows)
OCZ Vertex LE 50GB
2 x WD Caviar Blue
4 x 500GB (Adaptec 1430SA)
FreeNAS - Testing iSCSI + ZFS
Ubuntu 12.04 - Software Development
Windows 7 - Gaming and ESXi Management
Windows 7 - Testing
Good info. This is one of the cheaper builds I've seen reported to work here. Most of the working builds for ESXi seem to use server-class Intel boards (X58, X79). I still cannot get my Shuttle SH67H3 + i5-2500 + ATI 5670 combination to work, which seems like an ESXi limitation, because GPU passthrough has been reported to work with this system using XenServer. The same person did report success with ESXi and the Shuttle SX58J3, but I don't want to go as old as LGA1366. There is now a Shuttle SX79R5 but it's pricey -- about $500 for the barebones system and $300 for a quad-core Sandy Bridge-E. On the other hand, the AMD FX-8120 you're using is an 8-core CPU for only $160 on Newegg. I prefer a box with a small footprint, so I will have to do some research now on AMD Micro-ATX and Mini-ITX boards and cases.
Looking back I see that BAM279 was successful with an AMD Opteron system last year. Perhaps AMD is the way to go for an affordable desktop-class GPU passthrough solution? For Intel, I continue to wonder what is special about the server chipsets versus desktop chipsets regarding VT-d/GPU.
My i7 870 on a Q57M-SH2 is a cheap Intel build and M-ATX. All in an HTPC case and running for more than 12 months without fault. Uptime of over 100 days now, and it only got turned off before that to fit more RAM.
I would not compare VT-d on ESXi with Xen, since Xen can do GPU passthrough on systems with no VT-d; it must use a different way of emulating the GPU inside the guest OS.
I am using the secondary PCI Express slot; the GeForce 210 is in the primary slot, as I would prefer to still have a console when I need it.
Also, the 2 USB 3.0 controllers (4 ports total) are given to the Windows gaming VM, as that makes it easy to have USB input to the VM.
My only pet peeve right now is that I have a Blu-ray drive, but ESXi seems to limit the speed at which the VM can read data from the drive to a speed that makes it impossible to actually watch a movie without first using software to rip it from the disc (AnyDVD HD, to be exact).