Should be a quick question for someone who has used either vDGA or vSGA.
Can these forms of graphics acceleration be used on a bog-standard ESXi 5.1 host running standard VMs (i.e. not in a VMware Horizon View environment)?
I have a customer who wants to run some beefy graphics applications on their VMs but only needs a single ESXi host and, say, 6 VMs. Can this be done with a standalone ESXi host, or is a Horizon View environment required to take advantage of vDGA/vSGA?
It depends on what you are going to do with them. If you are going to use the GPUs for processing (I like to use the example of mining for bitcoins) then it should work just fine. If you are going to RDP into the VMs and try to use them for graphics-intensive work, then you are out of luck, since the software will not see the virtualized graphics card as its display adapter and will not use it.
You will also have problems if your plan is to just use the console view through ESXi, since that goes very goofy once the GPU is assigned.
I agree. One alternative is the "View Agent Direct Connect plugin" that comes with View 5.3; with it you do not need any View broker or AD to get access to a virtual desktop, you just connect directly to it and log in with a local account.
Then you could utilize vSGA/vDGA fully, and the cost would be minimal since you only have to buy View Premier for 10 users (vSphere licensing is included).
We have a couple of standalone ESXi hosts, and the application we are testing has growing video needs. We already use the passthrough feature, but it dedicates the video card to one VM at a time. To access our VMs we use "DameWare Mini Remote", which mirrors the video card's display and thus has none of the graphics limitations of RDP.
So back to the original poster's question: how can vSGA be used on a standalone ESXi host? I have not found any documentation about how to configure GPU sharing in our context. Can it be done with the vSphere Client by simply editing the VM's settings? From the ESXi command line?
I really want to get away from multibooting and passing through video cards! 😉
I don't see why you couldn't use vSGA in that way. All you have to do is install the driver on the hypervisor, start the X server on the hypervisor, then edit the virtual hardware to enable 3d graphics.
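For what it's worth, the host-side setup should be the same on a standalone host as in a View environment, since it all happens at the ESXi level. A rough sketch of those steps (the VIB filename below is only a placeholder; use the actual vSGA driver package from your GPU vendor):

```shell
# Put the host in maintenance mode, then install the vendor's vSGA driver VIB.
# (The filename is a placeholder for whatever package you downloaded.)
esxcli software vib install -v /vmfs/volumes/datastore1/NVIDIA-vSGA-driver.vib

# Reboot, then make sure the Xorg service (which brokers the shared GPU) is running.
/etc/init.d/xorg start

# Optional sanity checks: list VMs currently using the GPU, and query the card.
gpuvm
nvidia-smi
```

On the VM side, ticking "Enable 3D Support" on the video card in the VM's settings (which writes `mks.enable3d = "TRUE"` into the .vmx, optionally with a larger `svga.vramSize`) is what makes the guest's SVGA adapter use the shared GPU.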
Sure, but it is hard to convince management to buy a $900 video card with a "maybe" and no documentation from VMware.
I have found the documentation about installing the driver, but has vSGA actually been used on a standalone ESXi host, and how can it be configured?
Have you looked in the manual?
As far as the documentation goes, I have read everything about vSGA and found nothing about a standalone ESXi host. Basically, this forum is my last resort before dropping the whole idea.
So I just want to know how (if possible) one can configure GPU sharing once everything is installed (hardware + driver). It might be implicit and obvious to everyone, but some screenshots or any documentation would be really helpful. Does a new video card show up in the VM's settings in the vSphere Client?
Any visual aid would be really appreciated! 🙂
As far as vDGA goes, the passthrough feature does the trick. For that to work, you need compatible hardware. This is what it looks like in the host's configuration panel:
And the VM will have to add a PCI device that is the video card like this:
If your host has only one video card, the vSphere Client console will no longer work. Otherwise it works fine on XP and Win7, but only for one VM at a time.
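For reference, after adding the PCI device through the client, the VM's .vmx file ends up with passthrough entries along these lines (the address and IDs below are placeholders for your card's actual values, which the client fills in for you):

```
pciPassthru0.present = "TRUE"
pciPassthru0.id = "xx:xx.x"        # placeholder: the card's PCI address
pciPassthru0.deviceId = "0xXXXX"   # placeholder device/vendor IDs
pciPassthru0.vendorId = "0xXXXX"
```

One caveat I've seen mentioned: with more than about 2 GB of RAM assigned to the VM, a `pciHole.start = "2048"` entry may also be needed for passthrough to work.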
Thanks for all the replies and apologies for going AWOL on the topic for so long.
I like the idea of using the View Agent Direct Connect plugin, which sounds promising. As has been made clear, it really depends on how the VMs are being accessed (e.g. via RDP or via a View connection using PCoIP). Our use case has not materialized, but if another one arises I'll be clearer now on how to approach the use of either vSGA or vDGA.