Folks,
What's the maximum CPU % utilization above which a server running applications shouldn't be considered for virtualization? Is it 70% or more?
Everything can be virtualized given the right hardware and configuration.
In the past, CPU- or I/O-intensive applications (big databases) were deployed on physical hardware to avoid any overhead from virtualization, but with today's hardware and hypervisors I don't think this is an issue anymore.
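One rough way to sanity-check a consolidation decision is to compare the aggregate CPU demand of the candidate servers against the headroom you want to keep on the virtualization host. The sketch below is purely illustrative — the server specs, clock speeds, and the 70% headroom target are assumptions for the example, not vendor guidance.

```python
# Hypothetical capacity-planning sketch: estimate whether a set of
# physical servers can be consolidated onto one virtualization host.
# All figures (cores, GHz, utilization, 70% headroom) are made-up examples.

def consolidation_fits(candidates, host_ghz_total, headroom=0.70):
    """candidates: list of (cores, ghz_per_core, avg_utilization) tuples.
    Returns (demand_ghz, capacity_ghz, fits), where fits is True when the
    aggregate demand stays under the headroom fraction of host capacity."""
    demand = sum(cores * ghz * util for cores, ghz, util in candidates)
    capacity = host_ghz_total * headroom
    return demand, capacity, demand <= capacity

# Three example servers: (cores, GHz per core, average CPU utilization)
servers = [(2, 2.4, 0.15), (4, 2.0, 0.30), (2, 3.0, 0.60)]
demand, capacity, fits = consolidation_fits(servers, host_ghz_total=8 * 2.6)
print(f"demand={demand:.1f} GHz, capacity={capacity:.1f} GHz, fits={fits}")
```

A server running at 70% average utilization isn't automatically disqualified — it just consumes a bigger slice of the host, so it leaves less room for other guests.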
CPU utilization depends on many factors. What exactly do you want to know?
BTW, are you talking about the CPU utilization of the guest or of the host?
AWo
VCP 3 & 4