I'm trying to put together a VERY cheap whitebox ESX server. Preferably in a small minitower enclosure, cheap dual core CPU (AMD or Intel) and 4GB of memory. Storage is to be external (via an Adaptec 29160 SCSI card), so not an issue here.
I have been looking everywhere for chipsets that will work with ESX. However, it is very hard to find this information. Out of a number of forum posts here and there, I figured out some chipsets that will/should work with ESX:
Nvidia C51PV / Nvidia MCP51
NVIDIA nForce4 SLI X16
NVIDIA nForce 2200 Professional
VIA KM 266
Anyone able to extend this list?
You might have a look at these links also...
Hardware recommendations to build a cheap ESX server - http://www.vmweekly.com/articles/hardware_recommendations_to_build_cheap_esx_server/1/
White box/Home ESX system - http://www.vmware.com/community/thread.jspa?messageID=620124
ESX on non-supported hardware to learn with - http://www.vmware.com/community/thread.jspa?threadID=77560
Community supported hardware/software for Vmware Infrastructure - http://www.vmware.com/vmtn/resources/communitysupport/
Thanks for your reply. I already looked into all of these links, in fact I looked through most hits on VMTN containing the word "whitebox". I derived the list of chipsets from these posts.
I am hoping people have ESX running (or have had it running) on certain hardware, and know the chipsets used.
Hi, I have been looking at the ASUS P5M2/SAS; it is supposed to be a great platform for a whitebox ESX host. Frankly, though, I am not very enthusiastic about this board. The onboard SAS controller is usable by ESX (detected as SCSI), but it is a crippled version (less cache etc.). Why not buy an "affordable" mainboard and plug in the real thing? In any case, I am looking to spend the price of a P5M2/SAS on the entire system (excl. storage), so it is far over budget for me. I have been looking at something like:
AMD Athlon 64 X2 EE 4200+
Intel Core 2 Duo E4400 or E4500 (VT?)
Asus P5L-VM 1394
Both setups would get 4GB of memory and external SCSI storage (using Adaptec 29160 cards). I should be able to build these, with case and power supply, for about 400 apiece.
The big question is of course: WILL they run ESX? (And, without wanting to start a flame war... which setup would be faster?)
Message was edited by:
Erik Zandboer (E4500 might be the smarter choice)
@mcwill: You might be right there. However, if the list is to be usable for others, it is better to specify which chipsets WILL work. Otherwise you could end up buying a board whose chipset is simply not on the unsupported list... and then you can only HOPE it was not left out by mistake. A list stating which chipsets DO work is much safer for potential whitebox buyers, I think...
Keep the ideas coming!
I have to disagree: just because two boards share the same chipset doesn't mean they will work equally well with ESX. The BIOS can also affect compatibility, which means you need to specify boards, not chipsets.
For example, on two AMD boards with the same chipset, one BIOS allowed switching the MPS table between versions 1.1 and 1.4, whereas the other was locked at 1.4. At MPS 1.4, ESX would only run on either board with APIC disabled; but since one board allowed MPS v1.1, it could run ESX without the need to disable APIC.
IMHO the two main problem areas for whitebox compatibility are NICs and storage adapters; as these are board-specific and not chipset-specific, it may be worthwhile compiling a list of boards along with whether the onboard NIC and storage are ESX-compatible.
Abit KV8 Pro, NIC = No, Storage = No
Erik, google for ESX ASUS and you'll find plenty of blogs/sites reporting working combinations.
The P5M2/SAS is listed on many, but you'll find more affordable setups, for example here:
Please read the ASUS manuals if you plan to install 4GB. Most modern ASUS motherboards reserve a sizeable chunk of memory for other "critical functions".
I have built two ESX servers based on the ASUS P5B-V motherboard; I installed 4GB using four 1GB memory modules, but the board only presents 2.8GB of memory. A BIOS setting can override the memory reservations, which makes the full 4GB addressable, but the system isn't stable after that.
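A quick way to check this before committing to ESX (a sketch, assuming you can boot any Linux live CD on the candidate board; the exact messages vary per kernel):

```shell
# Boot a Linux live CD on the board with all 4GB installed, then:
free -m               # a "total" close to 4096 means the BIOS remaps properly
dmesg | grep -i e820  # the BIOS memory map shows which ranges are reserved
```

If the total stays around 2.8GB here, ESX will not see the missing memory either.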
About the P5M2/SAS board: I think the board is excellent for home/test-lab use. Sure, it won't be as fast as big production ESX servers, but it's still fast enough for occasional testing.
It offers a disk controller and two onboard NICs that work in ESX. I bought a P5B-V and had to buy a disk controller and a couple of NICs to get ESX to work. It would have been cheaper for me if the P5M2 had been available at the time.
For a test/learning lab, has anyone tried an AMD chipset like the AMD SB600/690G as used on the Gigabyte MA69GM-S2H motherboard? It takes 16GB of RAM, has PCIe x16 and x4 slots, and supports AMD X2 processors.
This looks like a good M/B, as does the M61PM-S2, which has an nForce chipset on it.
Is anyone using either of these boards to test or learn on, and do they work with ESX 3.0.1 or 3.0.2?
ASRock ALIVENF6G-VSTA AM2, NVIDIA GeForce 6100 or 6150SE / nForce 430 (depending on the version), Micro ATX AMD Motherboard - Retail
EDIT: the onboard NIC of course does not work with ESX. I added two Intel Pro/1000 MT PCI Gigabit Ethernet cards, tested connecting to an Openfiler VM, installed a VM on Openfiler iSCSI storage, and booted the virtual machine that way.
Currently 4GB of RAM is installed (2x2GB), so I have the option of going up to 8GB if I need to.
ESX installed to IDE drive.
Could not get ESX to install to Compact Flash with a CF-to-IDE adapter using a USB optical drive (but I could install Openfiler that way on another machine with the same motherboard).
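For anyone repeating the Openfiler iSCSI test: on ESX 3.x the software iSCSI initiator can be enabled from the service console roughly like this (a sketch only; the Openfiler IP is a made-up example, and the vmhba number can differ per system):

```shell
# ESX 3.x service console -- enable the software iSCSI initiator
esxcfg-firewall -e swISCSIClient           # open the firewall for iSCSI traffic
esxcfg-swiscsi -e                          # enable the software iSCSI stack
vmkiscsi-tool -D -a 192.168.1.50 vmhba40   # point discovery at the Openfiler box
esxcfg-swiscsi -s                          # rescan so the new LUNs show up
```

After the rescan, the Openfiler LUN can be formatted as VMFS from the VI client as usual.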
Mikelane, thanks for this input! I just confirmed yesterday that this board also works with ESX 3.0.2:
MSI K9AG Neo2-Digital
This board is really cheap, has an ATX form factor, three PCI slots (plus PCI-e x16 and two x1 slots), onboard VGA, and supports up to 8GB of memory. Test config:
MSI K9AG Neo2-Digital
AMD 64 X2 6000+
2x 1GB DDR2-667 dimms
1x Adaptec 2940UW
1x SCSI hdd 146GB/10000
1x Intel pro1000GT NIC
The onboard NIC is not recognized, as expected. Strangely enough, I got an unexpected error during the install from a 3.0.2 Update 1 CD, just before the copying of files started. My guess is that installing 3.0.2 and then upgrading to Update 1 will work (not tested though).
Interesting choice. I'm looking for just such a configuration. I want to connect this machine, with one IDE drive (for VirtualCenter), to an external iSCSI box running Openfiler with 1.5TB of data.
I want this config for ESX (for around 580 euro):
MSI K9AG Neo2-Digital
AMD X2 6000+
4 x 1GB DDR2-667
1 X DVD-ROM IDE
1 X IDE HD 40 GB
2x Intel Pro1000GT NIC
Is your configuration working properly now?
The reason for choosing this board was that it is ATX (and not uATX) while still having onboard video. I like having three PCI slots (although you might also be able to use PCIe x1 networking cards).
So far I have tested the ESX install, but have not run any VMs on it yet. I was satisfied when ESX booted its kernel and stated it was ready at IP address x.x.x.x. This system is not owned by me; I was just allowed to perform this test. I am still waiting for my own hardware to be shipped, so no further testing for now.
I left out the DVD drive (I connect one for the install, then remove it again). I wanted to create a cheap setup, and AMD is the way to go at the moment, mainly because Intel CPUs supporting VT technology cost 130 euro or more, while the AMD variant is available in all current AMD CPUs. When you look at an AMD 6000+, the race between Intel and AMD starts all over again, because VT then becomes available for CPUs in that price range... I decided that 2x 2.1GHz is more than enough CPU power when you put 4GB inside (8-10 VMs max). 8GB is four times more expensive because you then need 2GB DDR2 modules, so it is simply not an option (I am going to use two of these systems in a DRS/HA environment).
I must say that I have not tested the board with an IDE drive attached (I was using an Adaptec 2940UW with a 146GB/10K SCSI disk). IDE should work, but there is no guarantee. Remember that this board has only one IDE port (as do most mainboards nowadays).
When testing, I also had an Intel Pro1000GT installed, which worked fine.
Thanks for your quick response. The problem is that VT technology is too expensive. I want to use this machine at home to virtualize an Exchange server, a Linux firewall, a domain controller and a webserver, so no more than 4 to 6 VMs I think... but the speed should be alright.
The mainboard has one IDE controller, as you say, but I can connect one IDE drive as master and the DVD-ROM as slave, right? I can then disconnect the DVD after installation (if I like). You say this is a test case for you? Do you also get whitebox hardware, or just a server or something?
Good to hear that the Intel NICs will work. I wanted an ASUS with the integrated SAS controller (it seems like almost everyone got that board for their whitebox), but because of the iSCSI Openfiler setup this is far too expensive. I couldn't find a solution with Intel processors, because I could not find a cheap supported motherboard with a 1333MHz bus speed that would let me use an Intel 6750 CPU (the 6600 is very expensive at the moment because of its 1066MHz bus).
I hope you can tell me about the VM performance as soon as possible.
Yes, you can attach two devices to a single IDE port; there should be no issue there. The only possible issue would be installing and running ESX on IDE. It works on all setups I have seen so far, but as with all whiteboxes... try before you buy if possible. There simply is no guarantee.
For 4-6 VMs I would go for the much cheaper AMD 64 X2 4000+. More than enough speed for a home environment.
I ordered two "complete" systems. I plan to run them in a test setup in conjunction with an EONstor. This enables VMotion and HA and makes a cool testing environment.
In my opinion, anyone buying the SAS mainboard is not thinking it through. The only reason to pick that mainboard is the support for the integrated controller. But it is "crippled", and way too expensive. Buy a cheap mainboard (like the one I chose), and then buy a real SATA controller as an add-on: better performance, better price. The EONstor uses a simple SCSI host bus adapter, so any cheap Adaptec will do for me; buying the ASUS with the SAS controller was no option for me anyway.
As soon as I get the hardware, I'll replace one of my current servers (which are both Intel P4-HT 2.8GHz). After that, I just have to VMotion all VMs to the new server, and I should have an indication of the speed pretty soon.
Great, I think I will order the system tomorrow. I hope the IDE thing will work; otherwise I will go and find a second-hand Adaptec SCSI card with a cheap second-hand SCSI disk.
The AMD 64 X2 4000+ is a good suggestion, especially because of the low power usage and the special price.
The EONstor seems like a good option for you. I didn't realize that the performance of the SAS onboard controller was so bad.
The VMotion thing is pretty cool indeed. Let me know how it works out!
Please DO remember that you CANNOT store VMs on an IDE disk! You will need SCSI somewhere, even for a whitebox. Only SCSI, iSCSI and NFS are supported for creating the VMFS filesystems the VMs must sit on. That is exactly why people buy the SAS board: you can connect SATA to it, and it reports the disks as being SCSI...
You can get around this issue by installing an NFS server inside the service console. There are some postings about the topic on this forum. In that case you would store your VMs on an NFS share that happens to be local to your system. It is not great for performance, but it works.
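For reference, that workaround looks roughly like this from the service console (a sketch only and entirely unsupported; the partition path, subnet, IP address and datastore label are made-up examples):

```shell
# ESX 3.x service console -- export a local ext3 partition over NFS
echo "/vmstore 192.168.1.0/24(rw,no_root_squash,sync)" >> /etc/exports
esxcfg-firewall -e nfsClient     # let the VMkernel NFS client through
service nfs start                # start the NFS server in the console
chkconfig nfs on                 # keep it running across reboots
# mount the export back as a VMkernel NFS datastore via the console's IP
esxcfg-nas -a -o 192.168.1.10 -s /vmstore localnfs
```

The VMkernel then sees "localnfs" as an NFS datastore, so the VMs can live on the local IDE/SATA disk after all.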
And I would not call the SAS board a "bad performer"; it is just that the onboard controller has crippled cache. For less money you can buy a "cheap" mainboard with a separate LSI SATA controller, which will perform better.