I'm planning on running a quad-core Intel Xeon server with 2 virtual machines using VMware Server. One of the virtual machines will be running XP, and the other will be Windows Server 2003. The Server 2003 VM will be running a funky application that is designed to run at 100% CPU utilization at all times. I'm curious how I should configure VMware to allow the best performance for the host, the Server 2003 VM, and the XP VM. I was told the best way is to specifically allocate a single CPU core to the Server 2003 instance in the configuration file (something similar to processor0.use=TRUE, I think), but wanted other opinions. Thanks for any input.
What you are referring to is setting processor affinity. I would not recommend that — the vmkernel schedules the vCPUs onto cores and will ensure each VM gets enough resources. So without setting affinity, the vmkernel will ensure that the vCPU of the W2K3 server gets an entire core.
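For reference, if you did want to pin a VM anyway on ESX, affinity is set in the VM's .vmx file. A minimal sketch (the specific core mask "1" here is just an example value, and this option applies to ESX rather than VMware Server):

```
# .vmx fragment — restrict this VM's vCPU scheduling to physical core 1 (example value)
sched.cpu.affinity = "1"
```

As noted above, leaving this unset and letting the scheduler place vCPUs is generally the better choice.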
Definitely — you can always add additional vCPUs later if you find the app needs them. But remember that both the OS and the application need to support multiple CPUs for you to utilize the additional vCPUs.
If you're using VMware Server (rather than ESX), you won't have the ability to set processor affinity because the host OS controls the scheduling of CPU resources. Your best bet will be to create your W2K3 VM with a single vCPU and let it use 100% of that. That will leave ample CPU capacity available for the host and your XP VM (which I would also create as a single-vCPU VM).
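To illustrate the single-vCPU setup described above, here's a minimal .vmx sketch — a hedged example, not a complete config; the display name and guest OS identifier are illustrative, and only the vCPU count is the point:

```
# .vmx fragment — give the W2K3 VM exactly one virtual CPU
displayName = "W2K3-AppServer"    # example name
guestOS = "winnetstandard"        # Windows Server 2003 Standard
numvcpus = "1"                    # single vCPU; the busy app gets one full core's worth
```

The host scheduler then balances that one busy vCPU against the XP VM and the host itself across the remaining cores.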
Technical Director, Virtualization
VMware Communities User Moderator