You're limited to 256 columns in Excel 2003. I upgraded to Excel 2007 for exactly this reason; it supports 16,384 columns.
Grab a perl or awk guru and they will be able to help you parse your data and throw out all the stuff you don't need.
I've actually been through the exercise of cleaning out many of the columns I don't want but even if I just include the disk fields, there are still too many columns for Excel to import.
Guess I'm off to buy my perl guy a case of beer...
Just open up the output of the batch mode in Windows perfmon. Within perfmon point to the CSV file output as a log-file source, and select the counters you want to view.
Perl is the best for this. We use it to parse a local csv file and expose stats over snmp.
Care to elaborate on that comment? "Perl is the best for this. We use it to parse a local csv file and expose stats over snmp."
If you know specifically what columns you want in terms of the column number, you can use the 'cut' command.
e.g. to run esxtop in batch mode in the background and dump only columns 1 and 6 to a file called output.log:
esxtop -b | cut -d, -f1,6 >> output.log &
Unfortunately this doesn't help when you need to pick columns based on the column header rather than the column number. For instance, I want to dump out only the '% Used' and '% Ready' columns, i.e. all columns whose header contains the string '% Used' or '% Ready'. That would give a much more manageable output file. I'm still trying to work out how to do that (maybe someone can help?)
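One way to do header-based selection is with awk: read the first line of the CSV, record the positions of the columns whose headers match, then print only those positions for every subsequent line. Here's a sketch demonstrated on a tiny stand-in CSV (the header and data values are made up for illustration); in practice you'd pipe `esxtop -b` into the awk part instead of the printf:

```shell
# Keep the first (timestamp) column plus any column whose header
# contains '% Used' or '% Ready'. The sample CSV below is a stand-in
# for esxtop batch output, which puts the counter names on line 1.
printf '%s\n' \
  'Time,"vm1 % Used","vm1 % Ready","vm1 % Wait"' \
  '10:00,12,3,85' \
  '10:05,15,4,81' |
awk -F, '
NR == 1 {
    # Remember which column numbers have a matching header.
    for (i = 1; i <= NF; i++)
        if ($i ~ /% Used/ || $i ~ /% Ready/)
            keep[++n] = i
}
{
    # Print the timestamp column, then each remembered column.
    out = $1
    for (j = 1; j <= n; j++)
        out = out FS $keep[j]
    print out
}'
```

For the real thing you'd replace the printf with `esxtop -b` and redirect to a file, same as the cut example above. One caveat: this splits naively on commas, so it will misbehave if any field value itself contains a comma inside the quotes.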