I am using VMware ESXi and I am trying to set up a Windows 7 guest with an ATI Radeon video card passed through to it. I actually had this working on a previous system, but I had to reinstall. Now when I do this the guest fails to start and I get the following:
Error message from localhost.XXXXXXXXXXX:
PCIPassthru 004:00.0: Guest tried to (null)map
32 device pages (with base address of 0xb5d20)
to a range occupied by main memory. This is
outside of the PCI Hole. Add pciHole.start =
"2909" to the configuration file and then power
on the VM.
(event log entry: error, 12/23/2010 1:04:36 PM, media, User)
When I do as it asks, the guest now starts but gets an immediate BSOD concerning memory management. Any ideas on why this is occurring, and why it worked at one point but now fails?
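For reference, the pciHole options go in the VM's .vmx configuration file. A sketch of what that looks like (the actual start value comes from the error message, and the end value is optional; treat both numbers as placeholders for your own setup):

```
# .vmx fragment: reserve an MMIO hole for the passed-through device
pciHole.start = "2909"   # start of the hole in MB, as suggested by the error
pciHole.end   = "4096"   # optional end of the hole in MB
```

The VM must be powered off before editing the .vmx, and the edited file re-registered or the VM reloaded for the change to take effect.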
And what exactly from that screen suggests to you that something crashed?
The only thing you can deduce from there is that something MAY have frozen.
And it didn't. If you'd read the posts more carefully, you'd see that we keep saying: that message is normal when you pass through the console card. At the moment of cnic_register, the card is handed to the VM, so the console can no longer output to it.
This is the stage where you connect with VNC or similar, install the AMD drivers, and try to make the VM show up on the monitor.
Ah!
Sorry, I have been through this whole thread at least 4 times, but there is a limit to how much you can remember.
Will give this a go later
I managed to get it running by not installing 'AMD Vision Engine Control Center', but I am assuming this is what gives you the option to manage specific preferences on the graphics card?
Such as overscan, I think it's called, as I currently have a black border around my monitor when running it at its native resolution. I've seen this before with many ATI cards, where it defaults to 7.5% when it needs to be 0%.
I have no Catalyst Control Center when this isn't installed.
Have I missed something again?
You could remove the overscan via regedit, but it's a bit fiddly; I would try installing the Catalyst Control Center.
I did try that, found a value of '1' and changed to '0' with no success.
Also tried installing Vision Center alongside the drivers; the VM powers off immediately when hitting that part of the installer.
The same happens if I install the driver, reboot, then try to install Vision Center on its own.
Besides that, DXVA seems to be working, with CPU usage at around 20% running a 1080p MKV H264 movie; will try some more codecs etc.
CPU hit 80°C. I think I may need a better cooler that fits in a 2U case!!
Got 3 systems running in total
Full spec:
Gigabyte GA-990FXA-UD3
AMD FX8150
Corsair 32GB
Tagan 480w PSU
Asus HD6450
Asus HD5450
Unknown HD4350
I have successfully got DXVA running on all of them, as far as I am aware, using the ffdshow DXVA codec or the LAV codec, with CPU usage generally below 25%; playback seems really smooth too.
Upon further testing on the VM with the HD6450
Using MPC-HC DXVA codec results in a green screen
PowerDVD, mounting an actual Blu-ray rip, results in no ability to play back the disc; if I navigate to the m2ts files and play them, same green-screen issue.
Noticed that there are a few mentions of this previously too
My test setup (with oversized fan!)
EDIT: upon further thought, I don't have DXVA on the HD6450!! Hmm.
Hi All,
I've studied this thread for the past few days and have gotten my system working, and thought I'd add my specs (I'll update the Google doc too).
Hardware
ESXi 5.0 Update 1
Motherboard : ASUS P8Z77-M PRO
BIOS Version : 1708
CPU : i7 3770
Video Card : ATI HD3600 Series (installed in slot 3; other slots didn't work)
BIOS Settings (changed from the defaults, as found in this thread; sorry, I forgot who posted them)
Primary Video to : PCIe
Disabled IGPU
Disabled onboard Audio
Enabled VT-D *CPU virtualisation*
VM OS
Windows 7 x64, updates as of 12/12/12
Dual CPU
2GB RAM (did try more RAM but it didn't work; will play with those pciHole settings later)
Steps
Install OS
Update OS
shutdown
Add Radeon PCI Passthrough to VM
boot
Installed (drivers only) 12-6-legacy_vista_win7_64_dd_ccc_whql
When the drivers installed, my VM screen went to the default backdrop and the ESXi console went to the primary screen.
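For anyone editing by hand instead of through the vSphere client, the "Add Radeon PCI Passthrough" step ends up writing entries like these into the VM's .vmx file. This is a sketch only: the vendor/device IDs and the PCI address below are placeholders, and the real values must match what the host reports for your card:

```
# .vmx fragment: PCI passthrough device entry (values are placeholders)
pciPassthru0.present  = "TRUE"
pciPassthru0.vendorId = "0x1002"    # ATI/AMD vendor ID
pciPassthru0.deviceId = "0x9598"    # your card's device ID
pciPassthru0.id       = "04:00.0"   # host PCI address of the card
```

The device must first be marked for passthrough on the host (Configuration > Advanced Settings) and the host rebooted before any VM can claim it.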
thanks for the great thread and the help!
Hello guys,
I got a HD 7770 successfully passed through. Is there any chance to get the video output passed back to the hypervisor so I can use VMware View to "stream" the desktop around?
You could use VNC or remote desktop to the VM, but I don't know of any way to re-export it to the vSphere client.
@Herb0ne Nope, that's the catch with passthrough. It doesn't work for a remote desktop with the View agent. This is an alternative approach for media-center-type PCs, or maybe a network status monitor, where you want the direct display output.
If you want to stream, which is what I wanted to do, you will need one of the big-boy video cards: a Quadro 2k/3k/4k-series beast to do the work. Those will handle the offload you want, but an off-the-shelf card for home use won't. I expect View to catch up and start using more cards in the future, but as of today it's limited.
If you're dead set on remote 3D for a lab, you'll need to switch to Xen, or 2k8 R2 with RemoteFX on Hyper-V.
Thanks for the quick help!
@rmathis1984 So you mean a Quadro card can pass the video output back to the hypervisor? Would it also be possible with a FirePro?
I get the green screen as well, using some codecs and Flash Player in IE. If you disable hardware acceleration in Flash, that cures that problem; for the green screen on codecs, I just find a codec that works. The best codec seems to be the ATI MPEG Decoder that you get with the extended packages (HydraVision or the AVIVO package); this works with H264 despite the confusing name. For me this gave the lowest CPU usage and perfect video.
I am now looking at passing a card through to the new Hyper-V Server 2012, a free version of Hyper-V that hopefully will support RemoteFX on its guest VMs. I will have to upgrade to 5.0 for nested VMs, though.
MB: ASUS sabertooth x79
GPU-1: radeon HD 6870
GPU-2: asus 5440
ESXi: 5.0.1
RAM: 32GB
CPU: i7-3930k
This thread seems as good a place as any to post this. I have a question regarding using 2 graphics cards in the host:
I was "hoping" to be able to let the host/ESXi use the ASUS GPU, and I could then pass through the HD6870 to a VM.
Yet when both GPUs are installed, the monitor connected to the ASUS 5440 doesn't work. It works if I take the HD6870 out.
Though I can use vSphere to log in to ESXi, so maybe I don't need a second card for the host, though I thought it was possible.
Doesn't this version of ESXi support two GPUs, or is this a motherboard/BIOS issue?
tia
I spoke too soon, I think.
I was able to see both ATI cards via vSphere.
I was able to assign them as passthrough.
Added the HD 5440 to a Linux VM. lspci lists the HD 5440 controller along with the VMware SVGA II adapter. So far so good.
Now to get this VM to make use of the ATI card and not the VMware virtual one.
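One way to steer X toward the passed-through card (a sketch, assuming the open-source radeon driver and that the card sits at PCI address 13:00.0 inside the guest; adjust to your own lspci output) is to pin the BusID in an xorg.conf Device section. Note that X wants the bus number in decimal, so hex 0x13 becomes 19:

```
Section "Device"
    Identifier "PassthroughRadeon"
    Driver     "radeon"
    BusID      "PCI:19:0:0"   # 13:00.0 in lspci; 0x13 hex = 19 decimal
EndSection
```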
Anyone with a currently working solution try the 5.1 update that supposedly fixes some of the passthrough issues of 5.1 yet? Debating whether to try that over a few days off, but was wondering how it was looking.
It looks like ESX 5.0 update 2 was released today which supports Solaris 11 and other bugfixes, might be a good alternative for those who cannot upgrade to 5.1 due to passthrough issues.
ESXi= 5.1.0 build 838463
MB= ASUS Sabertooth x79
CPU= i7-3930k
GPU1= radeon HD 6870
GPU2= radeon HD 5450
Ram= 32GB
VMs on= synology NAS
I'm able to pass through either GPU to Linux VMs, and the Linux VMs "see" the hardware, but they can't initialize it. The VMs are Fedora 17 Xfce x86-64 and Linux Mint 14 Cinnamon x86-64. I've searched many forums and tried the AMD proprietary Catalyst drivers as well as the open-source radeon ones.
Has anyone gotten these cards to work on Linux VMs who can pass along some sage advice to help out?
tia
dmesg output:
[drm:radeon_get_bios] *ERROR* unable to locate a BIOS ROM
radeon 0000:13:00.0: Fatal error during GPU init
radeon: probe of 0000:13:00.0 failed with error -22
#--------------------------------------------------------------------------------------------------------
lspci -v | grep -i vga
00:0f.0 VGA compatible controller: VMware SVGA II Adapter (prog-if 00 [VGA controller]), subsystem: VMware SVGA II Adapter
13:00.0 VGA compatible controller: ATI Technologies Inc Cedar PRO [Radeon HD 5450] (prog-if 00 [VGA controller])
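Since the radeon driver is complaining it can't locate a BIOS ROM, one thing worth checking (a sketch, not a guaranteed fix; the sysfs paths assume a real device and root access in the guest) is whether the card's ROM is readable at all through sysfs. The address extraction below runs against a hard-coded copy of the lspci line above so it is reproducible:

```shell
# Pull the PCI address out of an lspci line (sample hard-coded for illustration)
lspci_line='13:00.0 VGA compatible controller: ATI Technologies Inc Cedar PRO [Radeon HD 5450]'
addr=$(printf '%s\n' "$lspci_line" | grep -i 'radeon' | cut -d' ' -f1)
echo "$addr"    # 13:00.0

# On the real guest (as root), the VBIOS can then be read back via sysfs:
#   echo 1 > /sys/bus/pci/devices/0000:${addr}/rom   # enable ROM access
#   cat  /sys/bus/pci/devices/0000:${addr}/rom > vbios.rom
#   echo 0 > /sys/bus/pci/devices/0000:${addr}/rom   # disable again
# If that read fails, the guest never got a usable ROM image from passthrough,
# which matches the "unable to locate a BIOS ROM" error from the radeon driver.
```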
Did you try ESXi 5.0? There are PCI passthrough issues with 5.1.
Are you really using a 3930K?
Does it work on Win7?
I believe the K models cannot do VT-d.
ref: http://ark.intel.com/products/63697/Intel-Core-i7-3930K-Processor-12M-Cache-up-to-3_80-GHz