I'm hoping to pick up several (cheap) servers to each run ESXi 5.5, hopefully building out a vCD cluster for a lab environment. First question: has anyone had luck installing ESXi 5.5 on these servers?
Assuming vCD was possible, I'd have to come up with a shared storage solution. I do not have a separate NAS server, so I've been toying with the idea of installing ESXi to, and booting from, a 16GB USB thumb drive in each server. I also have five 2TB 7200 RPM SATA drives (nothing fancy, but one per server). For shared storage, I was then thinking of running one minimal Ubuntu VM per server (off the USB drive or the 2TB SATA disk), where each VM would be a node in some distributed filesystem like GlusterFS. The aggregated SATA disks would then be presented to the vCD cluster via NFS.
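For what it's worth, here is a rough sketch of what those storage VMs would run. Hostnames (gnode1..gnode4), the brick path, and the volume name are all made up for illustration, and this assumes you sacrifice one of the five disks to keep an even brick count for replica 2:

```shell
# Hypothetical sketch: one Ubuntu node VM per host, each exporting its
# 2TB SATA disk as a Gluster brick under /export/brick1.

# On gnode1, join the other peers into the trusted pool:
gluster peer probe gnode2
gluster peer probe gnode3
gluster peer probe gnode4

# Create a distributed-replicated volume (replica 2) so losing a single
# node doesn't take the datastore down, then start it:
gluster volume create labvol replica 2 \
    gnode1:/export/brick1 gnode2:/export/brick1 \
    gnode3:/export/brick1 gnode4:/export/brick1
gluster volume start labvol

# On each ESXi host, mount the volume as an NFS datastore:
esxcli storage nfs add --host=gnode1 --share=/labvol --volume-name=gluster-ds
```

Keep in mind replica 2 halves your usable capacity (roughly 4TB from four 2TB bricks); without replication, a single dead disk would take part of the datastore with it.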
Does that setup even sound feasible? How might you tweak it (e.g., suck it up and build a separate NAS server)? Any complaints, tips, or tricks with GlusterFS?
Many thanks, and do note... this is as much a science project as it is a lab build-out. Suggestions are very much welcomed!
vCD is possible as soon as you have a vSphere environment. You need shared storage to turn on DRS, but the type doesn't matter (NFS, iSCSI, Fibre Channel).
I'd really suggest looking up blogs/forums on white-boxing ESXi or finding cheap, compatible servers. Some of the VMUGs even have white-boxing sessions if there's one in your area. I attended one in Indianapolis that was just open discussion on home labs.
Were you able to get ESXi 5.5 installed on the CS24-SC? Mine doesn't load; it hangs on loading a module. I was able to load 5.1 successfully.
Tried both the Dell .iso and the VMware .iso.
I also have a Dell CS24-SC server running ESXi 5.0. Upon trying to upgrade, it gets stuck booting ESXi 5.5 with the message:
Relocating modules and starting up the kernel
Does anyone have any ideas?
Loading ESXi installer issue "Relocating modules and starting up the kernel" - https://communities.vmware.com/thread/326371?start=30
You need to do what's mentioned in dallo's post above.
Yep, I had already seen that thread. I added the additional commands to the installer and it worked fine. However, once it was upgraded to 5.5, it had the same problem when booting off the ESXi install. I entered the commands one more time manually, enabled SSH, and then modified boot.cfg in the bootbank directory, adding the additional commands to the kernelopt line.
To save time for everyone: add this to the boot options and to the boot.cfg kernelopt line, making sure to leave a space after the last entry.
Now it boots without manual intervention.
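For anyone else doing this over SSH, the boot.cfg edit above can be scripted. This is a sketch only: it assumes the kernel option in question is ignoreHeadless=TRUE (the workaround commonly cited for this hang; substitute whatever the linked thread actually specifies), and it runs against a temp copy here. On the host, the file is /bootbank/boot.cfg.

```shell
# Demonstrated on a temp copy; on the actual host use /bootbank/boot.cfg.
BOOTCFG=$(mktemp)
printf 'kernelopt=no-auto-partition\n' > "$BOOTCFG"   # sample kernelopt line

# Hypothetical option; replace with the commands from dallo's post.
OPT="ignoreHeadless=TRUE"

# Append only if not already present. The sed '&' keeps the existing
# options, and the space separates the new entry, as noted above.
grep -q "$OPT" "$BOOTCFG" || sed -i "s|^kernelopt=.*|& $OPT|" "$BOOTCFG"

cat "$BOOTCFG"
# -> kernelopt=no-auto-partition ignoreHeadless=TRUE
```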
Proof - http://i.imgur.com/P7bqcjm.jpg
I also encountered some weird issue where ESXi 5.5 wouldn't take my static IP address; I reset the networking settings to defaults and things started working again.
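If it happens again, the static address can also be set from the ESXi shell instead of the DCUI. A sketch, assuming the management interface is vmk0 and using made-up example addresses:

```shell
# Example addresses only; adjust for your network.
esxcli network ip interface ipv4 set -i vmk0 -t static -I 192.168.1.50 -N 255.255.255.0

# The default gateway is set separately:
esxcfg-route 192.168.1.1
```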
You'll probably realize that there is no activity on the second NIC.
See above, I haven't done it yet.
P.S. If you find some cheap RAM that can be used in this server, please post.
Thanks. The other NIC is showing up fine in vSphere; does that mean it's working? I bought 16GB of ECC RAM for it off eBay for $85 last year, and it works a treat.
WOW! Here is 16GB for $36 with free shipping - 1 Lot 4 Pcs Hynix 4x4GB 2Rx4 PC2 5300P 555 12 667 MHz DDR2 HYMP151P72CP4Y5 | eBay
Here is a bunch more of the RAM I bought - HYNIX 2Rx4 | eBay