C_Ruhnke
Contributor

VMware ESX 3.5 Install/Upgrade reboot error "Mounting root failed."

Configuration: Dell PE 6650, 4 CPU, 4GB, PERC 3/DC, 3 36GB drives configured as JBOD.

Trying to install ESX 3.5 from a CD burned with the ISO image from www.vmware.com/download.

After the install finishes, when the server reboots it terminates with the message "Mounting root failed. Dropping into basic maintenance shell."

The same thing happens if I try to upgrade a successful installation of ESX 3.0.2 to ESX 3.5.

Booting to the "Service Console only" mode is successful and the installation root disk is successfully mounted.

This sounds like something is hosed in the VMware startup scripts.

Anyone else had this problem? Any suggestions to resolve this problem?

Thanks!

37 Replies
99matt99
Contributor

01:00.0 RAID bus controller: Adaptec Adaptec Rocket (rev 02)

Subsystem: International Business Machines ServeRAID 8k/8k-l8

Flags: bus master, fast devsel, latency 0, IRQ 22

Memory at e7a00000 (64-bit, non-prefetchable)

Memory at e7e00000 (64-bit, prefetchable)

I/O ports at 5000

Expansion ROM at <unassigned>

Capabilities: #10

Capabilities: Message Signalled Interrupts: 64bit+ Queue=0/2 Enable-

phenneberry
Contributor

On the IBM xSeries 336 running VMware ESX 3.5, lspci -v shows:

00:1f.2 IDE interface: Intel Corporation 82801EB (ICH5) SATA Controller

On the same machine running ESX 3.0.2, which works fine, lspci -v shows:

00:1f.2 IDE interface: Intel Corporation Unknown device 24d1

sthoppay_wipro
Enthusiast

I have a Dell 6850. I executed the following command to increase the queue depth:

esxcfg-module -s ql2xmaxqdepth=64 qla2300_707_vmw

and then ran esxcfg-boot -r.

After reboot, I got the same error "Mounting root failed" and a BusyBox prompt.

I can log in to troubleshooting mode without any issues, and it mounts the root volume.

I tried executing

esxcfg-boot -p

esxcfg-boot -r

but no luck.
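Since the boot failure here followed the module-option change, one possible recovery route is to undo the option from the troubleshooting-mode console and then rebuild the boot configuration. This is only a sketch; it assumes esxcfg-module accepts an empty option string to clear a previously set option, which you should verify on your host first:

```shell
# From the troubleshooting-mode (service console) shell:

# Clear the option string previously set on the QLogic module
# (intended to reverse the ql2xmaxqdepth=64 change made earlier)
esxcfg-module -s "" qla2300_707_vmw

# Rebuild the boot configuration so the old option is no longer
# baked into the boot image (same command the thread uses)
esxcfg-boot -r

# Then reboot normally
reboot
```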

mphodge
Enthusiast

Managed to fix my "mounting root failed"...

Downgraded the PERC 3/DC firmware from 199D to 199A and now ESX 3.5 boots up :)

VMware officially does not support this RAID card; I don't know why!

mtestino
Contributor

Where can the 199A firmware be downloaded? I've searched the Dell site but could only find the latest version.

Thanks!

wolvie724
Contributor

Here is the link to 199A:

If you need to do the same with other PERC cards, go to the current firmware page. Click the firmware you want to download, and under the Download Now link you will see a link called Other Versions. Click that link and you will get links to all the old firmware versions.

C_Ruhnke
Contributor

Confirmed! Downgrading the firmware on the PERC 3/DC controller from rev 199D to rev 199A resulted in a bootable VMware 3.5 server.

mtestino
Contributor

Thank you!

Svante
Enthusiast

Same/similar problem here: HP BL460c, internal RAID used for the ESX installation. Had 3.0.2 with all patches installed, running just fine, and used the 3.0-to-3.5 tarball to upgrade. Everything seemed fine, no disks anywhere near full, no error messages during the upgrade. After reboot, it can't mount root. I figured the quickest solution for me was to do a clean 3.5 installation rather than trying to find out what went wrong. Feels cleaner anyway, but of course it's a serious problem for those not having that option.

The clean 3.5 installation went fine, and it boots. Having some iSCSI problems now, but that is another story... Back on 3.0.2 for now (I had mirrored disks, so I could go back to 3.0.2 with minimal effort).

henka01
Contributor

I had the same issue during the upgrade from 3.0.2 to 3.5 with esxupdate on 4 Dell PE 2950s. This solved the problem for me:

DeGreat
Contributor

I had the same issue. I have a PE2600 with a PERC 4/Di. I had the latest firmware and was getting the same error. I rolled back the firmware to the following, and 3.5 works great.

http://support.us.dell.com/support/downloads/download.aspx?c=us&l=en&s=gen&releaseid=R71754&formatcn...

Let me know if this helped.

cybermage
Contributor

Running firmware 199A does indeed allow 3.5 to boot, and it appears to be working.

I have however another problem. If I install 2-3 VMs on the local VMFS it works fine. Any subsequent VMs seem to get disk corruption.

I would run chkdsk on a VM and it would find errors; it would then repair them, but if I run chkdsk again it finds errors again, and they are not the same errors.

I have tried creating new VMs and restoring existing VMs; nothing seems to work past around 2-3 VMs.

3.0.2 works just fine. Even VMs that seemed to have corruption work if I leave the VMFS volume intact when downgrading from 3.5 to 3.0.2.

Rolling back to 3.0.2 again. :(

DeGreat
Contributor

I have only 2 VMs on my PE2600, and they are in production. I would love to add a couple more to test. Are the VMs getting corrupted the new VMs or the old VMs?

Rkelly
Contributor

I was able to get mine working. Call technical support. Apparently, in the ESX 3.5 build I had, there was an RPM that did not run during the upgrade. VMware has since released a new build that corrects this problem.

cybermage
Contributor

I tried it with both existing and new VMs. I rebuilt the server between each test.

Also, installing 3.0.2 over 3.5 and keeping only the VMFS makes those VMs work again. It is really strange.

I actually installed the latest ISO thinking it might have to do with the RPM issue, but that wasn't it.

shaymandel
Contributor

I ran into the same issue with ESX 3.5, running an HP ProLiant DL145 with 2 CPUs and 4GB RAM. I tried starting in troubleshooting mode and got this error:

Initialization of vmkernel failed, status 0xbad0013

I found a solution for this here:

and followed the description there, which said that the DIMM memory modules should be installed two per CPU, not four on one CPU. This solved my issue.

I am posting this here because the initial issue I saw was "Mounting root failed", and I invested quite some time (and a lot of work) following the messages in this thread without luck.

I hope this will help some more people.

cheipler
Contributor

I had a similar issue today after updating the ql4xmaxqdepth setting for my QLogic 4062s on 2 of my ESX boxes. After rebooting I was getting "Mounting root failed". I ended up rebooting into troubleshooting mode and removing the last line in the /etc/vmware/esx.conf file, which referenced the setting change I had made (it looked something like "vmkernel....esxcfg-module -options ql4xmaxqdepth=200........").

Anyway, hope that helps someone. Now I have to find out what the proper value is, seeing as the EqualLogic documents say it should be 200.
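For anyone scripting the same cleanup, here is a minimal sketch against a mock esx.conf. The file contents and the key name (`/vmkernel/module/qla4022.o/options`) are hypothetical examples, not the real format from an ESX 3.5 host; check the actual offending line on your host before deleting anything, and always keep a backup:

```shell
# Create a mock esx.conf with a few entries, including the
# offending module-options line (hypothetical key names)
cat > /tmp/esx.conf <<'EOF'
/adv/Misc/HostName = "esx01"
/vmkernel/module/qla4022.o/options = "ql4xmaxqdepth=200"
EOF

# Back up the file, then delete any line mentioning ql4xmaxqdepth
cp /tmp/esx.conf /tmp/esx.conf.bak
sed -i '/ql4xmaxqdepth/d' /tmp/esx.conf

# Show the cleaned file
cat /tmp/esx.conf
```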

TomHowarth
Leadership

Thread moved to the correct sub-forum

If you found this or any other answer useful, please consider using the Helpful or Correct buttons to award points.

Tom Howarth VCP / vExpert

VMware Communities User Moderator

Blog: www.planetvm.net

Contributing author for the upcoming book "VMware vSphere and Virtual Infrastructure Security: Securing ESX and the Virtual Environment" (http://my.safaribooksonline.com/9780136083214), currently available on Rough Cuts.
