So I've been following this thread as a lurker for a couple of weeks, and just wanted to share my specific success story. My eventual goal was to build an ESXi cluster that was not only a lab for work (I'm a Windows/Linux systems admin for an enterprise hosting company), but also allowed me to virtualize some of my home servers and HTPCs. To do this cost-effectively, I went with all AMD hardware (a mix of server and desktop boards), and ended up with a 3-node, 32-core, 128GB RAM cluster.

You can also see a more detailed version of this, with pictures of the build and screenshots of the ESXi screens, at: http://thehomeserverblog.com/esxi/esxi-5-0-amd-whitebox-server-for-500-with-passthrough-iommu-build-2/ I'm also compiling a list of vetted builds for ESXi whiteboxes, and I'd love to have anyone add any specific builds there that they have running.

The most success I've had is with a whitebox I built from consumer parts. Considering that it took a bunch of research and some failures along the way, I thought I would post my list and specific configuration in case someone wants to duplicate or learn from this. Although I did have some headaches here and there, overall this has been a painless build. Note that I'm running 4 GB NICs in every node simply because this is a lab, too, and I've got all my traffic properly segregated, but they are not at all necessary in the long run.

With one of the nodes, I experimented with passthrough, and got not only a domain controller running with 8 x 2TB drives passed through to it (4 from the mobo, 4 from a RAID card), but also a working HTPC with a passed-through video card that functions as my primary XBMC HTPC and gaming center for the living room (Steam, MAME, Dolphin, etc.). I passed through a video card and USB ports, and ran USB over CAT6 to a powered USB hub in the living room. All of this is in a 2U case in a custom-built home server rack. It's all running stably, of course with a few limitations.
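For anyone trying to replicate the passthrough setup: once IOMMU (AMD-Vi) is enabled in the BIOS and the device is toggled for passthrough on the host, the vSphere Client writes entries along these lines into the VM's .vmx file. This is only a sketch — the PCI addresses below are placeholders (yours will differ), and in a real .vmx you'd let the client generate these rather than hand-typing them:

```
pciPassthru0.present = "TRUE"
pciPassthru0.id = "02:00.0"
pciPassthru1.present = "TRUE"
pciPassthru1.id = "02:00.1"
memsize = "2048"
sched.mem.min = "2048"
```

The two pciPassthru entries mirror the fact that a video card with HDMI audio shows up as two PCI functions, both of which need passing through. The sched.mem.min line reflects that VMDirectPath requires the VM's memory to be fully reserved — worth keeping in mind when sizing RAM for a passthrough VM.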
My hardware list ended up like this:

Motherboard: ASRock 970 Extreme3
CPU: AMD FX-8120 Zambezi 3.1GHz Socket AM3+ 125W Eight-Core
RAM: 32GB (4x8GB) DDR3-1333

The slot configuration on the mobo looks like this:

PCI-e x16: Radeon HD6670 (passthrough to VM)
PCI-e x4: LSI SAS3041E 4-Port SAS/SATA (passthrough to VM)
PCI-e x1: GB NIC (RealTek 8168, used by ESXi host)
PCI-e x1: GB NIC (RealTek 8168, used by ESXi host)
PCI: GB NIC (RealTek 8169, used by ESXi host)
PCI: ATI Rage XL Pro 8MB PCI video card (console video)

Drives: Interestingly enough, if you pass through the on-board SATA controller on this board (there are 5 SATA ports), the 5th port actually stays available for use by the ESXi host. This is nice because, as you know, VMs with passthroughs are not eligible for VMotion anyway. This allowed me to install ESXi to a hard drive and have a local datastore for the HTPC, which wasn't going anywhere anyway. This freed up the other USB ports for passthrough if I wanted them.

"Local" drive as datastore: 1TB Hitachi Ultrastar
"NAS" drives passed to VM: 8 x 2TB WD Green drives

HOMESERVER

The first of two "passthrough" VMs in this setup is my domain controller/game server/NAS. It's running SBS 2011 Essentials, and has the motherboard SATA controller passed through as well as the LSI card, for a total of 8 x 2TB Green drives. This worked flawlessly and required not a bit of configuration. The SATA controller and LSI controller "just worked": I assigned them, booted up, Windows installed the hardware, and it was off and running. I used FlexRAID to software-RAID these drives into a single ~12.75TB volume that holds my media (movies, TV, music) and profiles for the house accounts, and serves out Windows shares for various folders. In addition, it runs an in-house WoW server and a Minecraft server.

HTPC

The second and final "passthrough" VM on this node is the primary HTPC for the house.
This has Windows 7 Ultimate 32-bit installed and runs XBMC, Steam (w/~200 games), MAME, Dolphin Emulator, and a small host of other games and emulators. The HD6670 showed up as two devices (one dependent on the other), so both are passed through; the second is the HDMI sound card. I had some initial flakiness with the HDMI sound, but after two reboots once I installed the drivers, this seemed to disappear. Video/sound runs over a 50' shielded HDMI cable to my TV in the living room. Once I had video on the TV, I selected it as my primary device and completely disabled the "other" display, which is the console. USB also works: I'm running USB over CAT6 (with an adapter) to a hub where I hooked up a wireless HTPC keyboard, an Xbox 360 Wireless Controller PC adapter, a Bluetooth adapter for my WiiMotes (Dolphin emulator), my HTPC remote, and so on. No issues here that I've noticed either. Hardware acceleration, according to XBMC, is working, and I can watch 1080p YouTube videos without issue.

Thoughts About the Build

RAM: I have not been able to get above 2GB of RAM on the HTPC VM and remain stable, but I haven't had much incentive; 2GB works for my particular application. That said, my next project is to virtualize a work computer running 3 monitors using this same scheme, and I *will* need more RAM for it. Thus, I'm going to be pushing the limits there to see what I can do.

USB: I realize my application is pretty specific, but I'm running USB over CAT6 (my whole house is wired with shielded CAT6A) to a powered USB hub, and this works wonderfully. I run about 50' and get 10 USB ports at the end. Nothing I've plugged in has failed or given me any issue.

Sound: HDMI sound was pretty flaky for a while, and I had almost given up when, out of the blue, it started working. In the meantime, I was using a USB 7.1 surround sound card that worked perfectly over USB and pumped sound to my home theatre sound system.

Cost of the Whitebox: Total w/deals off eBay was $530.
The LSI card was $15, the video card was $8, the GB NICs were $6, the RAM was $120, and so on. I consider this a great deal for an 8-core, 32GB ESXi node. I have two of these, plus an ASUS KGPE-D16 running dual Opteron 6128s (16 cores total) w/64GB of RAM. Other equipment for the lab includes two 2-bay NAS boxes delivering iSCSI targets for high availability, 2 x 24-port gigabit smart switches, a Juniper SSG5, and a 3000VA rack-mount UPS. Total lab cost was just under $2,000.

Next project: With the success of the passthroughs, I'm moving on to virtualizing the PCs in the house, and the other HTPC. My eventual goal is to have all but a single PC in the house virtualized, all running on passthrough video and USB. I'll continue to share anything I learn as I move forward.
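One footnote on the storage math in the HOMESERVER section: the ~12.75TB figure for the 8 x 2TB FlexRAID pool works out if you assume one drive's worth of parity (my assumption — the post doesn't say how many parity units were configured), combined with the usual decimal-TB vs. binary-TiB shrink:

```python
# Sanity-checking the ~12.75TB FlexRAID pool reported for 8 x 2TB drives.
# Assumption (mine, not the OP's): one drive's worth of parity, so 7 data drives.
data_drives = 8 - 1                   # 8 drives minus one parity unit
bytes_per_drive = 2 * 10**12          # "2TB" as marketed (decimal bytes)
usable_bytes = data_drives * bytes_per_drive
usable_tib = usable_bytes / 2**40     # what Windows displays as "TB"
print(f"{usable_tib:.2f}")            # ~12.73, matching the quoted ~12.75TB
```

In other words, you "lose" one drive to parity and then another ~9% to the TB/TiB difference in how Windows reports capacity.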