VMware Communities
sunny88
Contributor

Vmware doesn't detect graphics card In Linux (Bumblebee), How to use graphics card in vmware using bumblebee Linux?

Hello friends,

Recently I have been working on Ubuntu 11.10 and have tried to run MS Windows games using different methods: Wine, PlayOnLinux, CrossOver, etc.

I also gave VMware Workstation a shot to achieve this goal. In my testing I noticed that VMware Workstation doesn't detect the graphics cards in my case (dual GPUs: Intel HD 3000 and Nvidia GeForce GT 540M 2 GB), not even through Bumblebee (using optirun vmware). I have a Dell XPS 15 (Intel Core i7 CPU and the GPUs mentioned above).

I have used Bumblebee to run my games under Wine and it works well for Call of Duty, but while experimenting with VMware I got a DirectX error. While troubleshooting I found that VMware had not detected my graphics card. So I think the VMware team should look into this problem so that their products keep up with new technologies and support a wider range of platforms and hardware. I'm not much of a technical guy, but I think the VMware team should look at Bumblebee's method of utilising Optimus / dual graphics cards to solve this problem.
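For reference, the Bumblebee invocation is along these lines (a sketch: optirun is Bumblebee's wrapper, and the glxinfo check, from the mesa-utils package, is the usual way to confirm the discrete GPU is actually active):

```shell
# Confirm Bumblebee routes rendering to the discrete NVIDIA GPU
# (glxinfo comes from the mesa-utils package):
optirun glxinfo | grep "OpenGL renderer"

# Launch VMware Workstation through Bumblebee:
optirun vmware
```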

Advice is free; it's up to VMware whether to take it, and it won't bother me if they don't. 🙂

Regards,

Sunny Dhaliwal.

10 Replies
continuum
Immortal

VMs always use a virtualized VMware SVGA II video card, so there is no need to detect any NVIDIA hardware; it will not be used inside the VM anyway.


________________________________________________
Do you need support with a VMFS recovery problem ? - send a message via skype "sanbarrow"
I do not support Workstation 16 at this time ...

Slacs8
Contributor

Ulli Hankeln wrote:

VMs always use a virtualised VMware SVGA 2 video card - and so there is no need to detect any NVIDIA stuff as it will not be used inside the VM anyway

Your guest OS doesn't need any drivers because it's going to use the virtual card anyway, but VMware uses the default card on the host system, so people with Optimus cards don't get to use the "good" card for their guest machines. I'm having the same problem. Even running "$ optirun vmware MYVM", the VM manager may be using the card, but it doesn't pass it along to the guests :S

Is anyone more savvy on vmware that understands the issue and could point me in the right direction?

Jwils84
Contributor

I was wondering the same thing: how do you use Bumblebee to get the guests to use the NVIDIA card instead of the integrated Intel graphics? Nobody knows?

WoodyZ
Immortal

Jwils84 wrote: I was wondering the same thing: how do you use Bumblebee to get the guests to use the NVIDIA card instead of the integrated Intel graphics? Nobody knows?

What explicitly and specifically did you not comprehend in continuum's reply, Re: Vmware doesn't detect graphics card In Linux (Bumblebee), How to use graphics card in vmware usi...?  The Guest OS running in the Virtual Machine does not see the Host's GPU as it uses a virtualized VMware SVGA II graphics adapter!

Jwils84
Contributor

WoodyZ wrote:

What explicitly and specifically did you not comprehend in continuum's reply, Re: Vmware doesn't detect graphics card In Linux (Bumblebee), How to use graphics card in vmware usi...?  The Guest OS running in the Virtual Machine does not see the Host's GPU as it uses a virtualized VMware SVGA II graphics adapter!

What "explicitly and specifically" do you not comprehend about computer hardware? I'm no VMware expert fanboy with 17k posts that give me special privileges to be both condescending and unhelpful to other users, but I'm pretty sure VMware isn't so "totally wizard" that it magically bypasses the use of any hardware GPU at all.

I assumed that the virtualized graphics adapter was some sort of software layer that the guest sees as its video adapter, but that was interfacing with the host's hardware gpu in reality. I wouldn't have assumed that the virtual SVGA II adapter was doing all the work of a GPU itself, since it wouldn't make much sense or be very efficient to not utilize the hardware GPU at all. Considering that pretty early on in the evolution of the personal computer GPUs became necessary to separate graphics processing from other tasks and take that load off the system memory and CPU, I wouldn't imagine that the designers of VMware would say "Hey, let's not utilize the hardware GPU, let's throw all the weight of Graphics processing onto the CPU! That should make VMs run as inefficiently as possible!"

So I guess if my assumptions are wrong and the VMs actually don't utilize the hardware GPU in any way, then that is what I did not "explicitly and specifically comprehend" about previous responses, as no one "explicitly and specifically" stated that the VMware SVGA II adapter doesn't utilize any hardware GPU. What I read was that it doesn't utilize specific drivers, and even that I'm unsure of, as I imagine if I wonked my graphics drivers on the host, the guest wouldn't display very well either.

So, if I'm correct in thinking that the virtual adapter is mostly a go-between that utilizes the host's GPU and driver set, rather than having the guest interact directly with the hardware GPU and needing specific drivers for it installed within the guest, then on systems with two GPUs it shouldn't be such a ridiculous idea that it may be possible to control which GPU the VMware SVGA II adapter is actually interfacing with. Since no one has stated anything such as "At this point, VMware can only utilize the integrated GPU and it is not possible to have it use the discrete GPU," I thought there was still room for discussion and exploration of further options on the matter.

If I'm wrong in my assumptions on how the virtual adapter interacts with the physical hardware feel free to correct me with some actual information. If you don't have any information to add about the workings of the virtual adapter and how to or whether it's possible to get it to utilize the discrete graphics rather than the integrated that's fine, but it also means you have nothing to contribute to this thread and your posting here serves no purpose other than to inflate your post count and your ego, neither of which really needs it.

continuum
Immortal

> and your posting here serves no purpose other than to inflate your post count and your ego

It does not work like that - a high post count only reminds you of one question: why am I so stupid as to waste my time answering the same boring questions over and over again?

Especially this question: I have an NVIDIA video card in my host - why does my guest complain when I install the NVIDIA drivers?

Every new user of Workstation, VirtualBox or whatever will face that question while installing his very first VM.

Most users then just follow common sense along this line of thought:
1. VMs are designed to be portable from one host to another, even if the hosts use very different hardware.
2. What would happen if I started this XP VM with NVIDIA drivers on my host with ATI?

... hmmm - that would probably result in a crash ... ?

3. VMs don't crash if you move them to a different host.
---------------------------------------------------------------------

This leaves only one conclusion: portability only works if a VM always uses the same drivers, no matter which host is used.
This also means that a VM always uses the same virtual video card.
Which one? They would then probably check Device Manager to find out which card that is.
Device Manager will tell you that your virtual mainboard uses an Intel chipset, Intel or VMware network cards, an LSI SCSI or LSI SAS SCSI controller, and a VMware SVGA II video card.

Other users, who prefer reading manuals over thinking for themselves, read about virtual hardware in the manual.
Searching this forum or Google would also bring up the answer easily.

In a Linux forum, question number one would either get you an "RTFM" or be moved to the idiots' corner - have you ever visited the BackTrack forum?
Here we answer such questions - but we may react a little bored occasionally ;)


Anyway ... a VM actually has two main options for the SVGA card: VMware SVGA II, and VMware SVGA II with 3D support.
Then you have further options to assign the amount of video RAM and the maximum screen size.

3D support is only offered on hosts whose video cards can execute a minimum set of instructions.
If a host cannot do that, 3D support will be disabled.

In no case will a guest interact with the host's GPU directly, as this would defeat the idea that "virtual hardware must be independent from the host hardware".

To cut it short ... what can you do to improve the video performance of a VM?

- In the guest: all you can do is try different VMware SVGA video drivers.
- On the host: read the manual about modifying svga parameters in the vmx file, and google "name of the host's video driver + vmware".
Maybe a better one exists than the one you use.

That's all - you are done. No other options, no way to make it look or act like your NVIDIA ...
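For what it's worth, the svga-related vmx tweaks mentioned above look something like this (a sketch only; parameter names as commonly documented, values are illustrative - check the manual for your Workstation version, and edit the .vmx file only while the VM is powered off):

```ini
mks.enable3d = "TRUE"
svga.autodetect = "FALSE"
svga.vramSize = "134217728"
svga.maxWidth = "1920"
svga.maxHeight = "1080"
```

mks.enable3d requests 3D acceleration for the guest; svga.vramSize is given in bytes (128 MB here); svga.maxWidth and svga.maxHeight cap the guest resolution, and svga.autodetect must be FALSE for the explicit values to take effect.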



ninjacoding
Contributor

I recommend you check this thread:

http://communities.vmware.com/message/2112212#415654

Jwils84
Contributor

Ulli Hankeln wrote:

> It does not work like that - a high post count only reminds you of one question: why am I so stupid as to waste my time answering the same boring questions over and over again?

> [...]

> That's all - you are done, no other options, no way to make it look or act like your NVIDIA ...

I get your point about high post count and wasting your time on the same questions over and over. Thing is, no one is forcing anyone to answer the question, and responding to a question with condescension and NOT answering it or supplying information is a waste of everyone's time, not just the poster. Sometimes it's probably best just to leave the question alone, rather than sitting on the site, looking at the most current posts, then replying with nothing to offer besides condescension.

Other than that, I appreciate the information you provided. Unfortunately none of it really applies to my question or the assumptions I put forth. I didn't try to install NVIDIA drivers in a VM; I never have and I never would. Like I said multiple times, my assumption is that the virtual adapter, which is used in ALL VMs, interfaces with the current host OS and its particular drivers... that's necessary for portability as well as ease of setup.

The entire question has nothing to do with installing drivers anywhere. It's a matter of... hey, I have 2 video cards in my system because a lot of laptops come that way these days. Most mobile processors have integrated GPUs on the chip, and a lot of laptops also add in a discrete card. It's handy, integrated for low power usage, discrete for higher graphics performance. The graphics drivers for linux don't currently support the optimus technology very well for auto switching, etc., so most of us use bumblebee to switch back and forth when needed. However, switching to our discrete card, or starting VMware with the instruction to use the discrete card, does not seem to make the guest use the discrete card rather than the integrated.

Sure, it's a bit arbitrary: because of the uniform virtual adapter and drivers, we probably wouldn't gain the full benefit of using the discrete card. However, we might see some boosts, or gain some necessary compatibility for certain applications, because the virtual adapter does rely on the hardware GPU in some ways; a different hardware GPU will mean a difference in how the virtual adapter behaves, even though the virtual adapter is always the same.

Anyway, it's all rather irrelevant now. Another kindly user pointed me in the right direction with a simple, succinct post linking to another thread currently discussing ways to get this working.

Cheers.

Jwils84
Contributor

ninjacoding wrote:

I recommend you check this thread:

http://communities.vmware.com/message/2112212#415654

Perfect! Thanks so much for the helpful response.

allquixotic
Enthusiast

The problem is that the VMware "experts" were answering with no clue at all what Bumblebee is or does. So saying "Bumblebee" doesn't convey any information to them, and they're too lazy to look it up.

Basically, Bumblebee is an LD_PRELOAD hack that loads the NVIDIA implementation of OpenGL on the host side (which is ultimately what the "SVGA II" graphics in the guest uses anyway). But the setuid bit on vmware-vmx makes the dynamic loader ignore LD_PRELOAD, and vmware-vmx refuses to run if it's not setuid root. So we're at an impasse.
