Benchmarking

Introduction

Many VMware users wish to perform performance analysis on their own virtual deployments. This page collects information on setting up and executing your own tests.

General Best Practices

Always measure performance from a native (non-virtual) system. Time measurements taken inside a virtual machine can be subject to small inaccuracies. Many benchmarks produce results by summing the times of a large number of small operations, so these small inaccuracies can accumulate into a large error. See Time-based Measurements in Virtual Machines for more information on this subject. The only way to guarantee correct measurement is to run the measurement tool on a native system. This is easy for client-server test architectures but may require a clever setup for in-guest testing.
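
For example, in a client-server test all of the timing can live on the native client. The following is a minimal sketch in Python, assuming a hypothetical HTTP endpoint served from the virtual machine (the URL and request count are placeholders); every timestamp is taken on the native client, so in-guest timer inaccuracy cannot affect the result.

    # Minimal sketch: drive a server running in the VM from a native client,
    # taking all timestamps on the native (non-virtual) machine.
    # The URL and request count are placeholders for illustration.
    import time
    import urllib.request

    URL = "http://vm-under-test.example.com:8080/work"   # hypothetical endpoint in the VM
    REQUESTS = 1000

    latencies = []
    for _ in range(REQUESTS):
        start = time.perf_counter()                       # native-host clock, not the guest's
        urllib.request.urlopen(URL).read()
        latencies.append(time.perf_counter() - start)

    latencies.sort()
    total = sum(latencies)
    print("requests:   %d" % REQUESTS)
    print("throughput: %.1f req/s" % (REQUESTS / total))
    print("mean:       %.2f ms" % (1000 * total / REQUESTS))
    print("p95:        %.2f ms" % (1000 * latencies[int(0.95 * REQUESTS)]))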

Always ensure an apples-to-apples comparison. Make sure that the benchmark or application under test is constrained by the same resources in both environments. For instance, if the virtual machine is configured with 512 MB of RAM and two virtual CPUs, restrict the native system to the same resources if a virtual-to-native comparison is desired.
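
On Linux, for example, one way to constrain the native system is to boot it with kernel parameters that limit visible memory and processors. As a sketch, matching the 512 MB / two-vCPU configuration above (adjust the values to your own virtual machine configuration):

    mem=512M maxcpus=2

appended to the kernel line in the boot loader configuration.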

Collect accurate host-based performance statistics. Guest OS performance metrics (such as CPU utilization) are not accurate. Use VirtualCenter or esxtop to collect accurate performance counters during the test. See Performance Monitoring and Analysis for more information.
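
For unattended collection during a benchmark run, esxtop can also be run in batch mode and the resulting CSV analyzed afterward. A minimal sketch, assuming a 5-second sampling interval for a 10-minute run (adjust the interval and sample count to your test duration):

    esxtop -b -d 5 -n 120 > benchmark-counters.csv

This records 120 samples at 5-second intervals to a CSV file on the host, which can then be correlated with the benchmark results.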

Application Benchmarking

Microsoft Exchange.

Microsoft SQL Server.

Subsystem Benchmarking

Storage

Internally at VMware we have used Iometer for a variety of storage analyses. See Storage System Performance Analysis with Iometer for more information.
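
Iometer itself is driven through its GUI or a saved configuration file, but the flavor of the measurement is easy to illustrate. The following is a minimal Python sketch of a sequential-read throughput test, assuming a pre-created test file large enough to defeat caching (the path and block size are placeholders); a real storage analysis should use Iometer or a comparable tool, which controls access patterns, outstanding I/Os, and caching far more rigorously.

    # Illustrative sketch only: sequential-read throughput of a pre-created test file.
    # The path and block size are placeholders; use a file much larger than RAM
    # so that OS caching does not dominate the result.
    import os
    import time

    TEST_FILE = "/data/iotest.dat"        # hypothetical test file
    BLOCK_SIZE = 64 * 1024                # 64 KB reads

    fd = os.open(TEST_FILE, os.O_RDONLY)
    total_bytes = 0
    start = time.time()
    while True:
        buf = os.read(fd, BLOCK_SIZE)
        if not buf:
            break
        total_bytes += len(buf)
    elapsed = time.time() - start
    os.close(fd)

    print("read %.1f MB in %.2f s (%.1f MB/s)" %
          (total_bytes / 1e6, elapsed, total_bytes / 1e6 / elapsed))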
