10 Replies Latest reply on Aug 19, 2009 5:25 AM by nalinu

    After ESXi Upgrade VM storage Lost

    leo.freeman Lurker

      I have had a working ESXi 3.5.0 host with several VMs for several months now.

      Today I checked VMware Infrastructure Update and it showed that Firmware, VM Tools

and Infrastructure Client updates were available. I told it to apply the updates,

and they completed without incident. Following the update,

      VMware Infrastructure Client connected and everything was normal - all VMs were running.

      The Update had indicated that a server reboot was necessary, so I used VI Client to

reboot the server (VMware ESX Server 3i, 3.5.0, 123629).

      Reconnecting VI Client after the reboot showed the message

      "The VMware ESX Server does not have persistent storage" and all my VMs are gone.

Under the vmhost -> Configuration -> Storage Adapters tab I see my two 250GB SATA

disks, vmhba1 and vmhba100. The hardware is a Dell PowerEdge 840 with two disks.

      The vmhost -> Configuration -> Storage tab is blank and that is where I used to see my

      VM datastore.

A rescan from VI Client does nothing. SSHing to the host and running esxcfg-rescan vmhba1

results in the message "vmhba1 does not support rescanning".
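
Since esxcfg-rescan refuses the local adapter, the next things I plan to try from the console are below (command names are from the ESX 3.x docs and from memory, so I'm not certain they all behave the same in the 3i busybox shell):

      ~ # vmkfstools -V            # ask the vmkernel to re-read VMFS volumes
      ~ # esxcfg-vmhbadevs -m      # map vmhba partitions to VMFS volume UUIDs
      ~ # ls /vmfs/volumes/        # datastores should appear here as UUID symlinks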


fdisk -l still shows the VMFS partitions on both disks:

      ~ # fdisk -l


      Disk /dev/disks/vmhba100:0:0:0: 250.0 GB, 250000000000 bytes

      255 heads, 63 sectors/track, 30394 cylinders

      Units = cylinders of 16065 * 512 = 8225280 bytes


                         Device Boot    Start       End    Blocks   Id  System

      /dev/disks/vmhba100:0:0:1             1     30395 244140593+  fb  VMFS


      Disk /dev/disks/vmhba1:0:0:0: 250.0 GB, 250000000000 bytes

      64 heads, 32 sectors/track, 238418 cylinders

      Units = cylinders of 2048 * 512 = 1048576 bytes


                       Device Boot    Start       End    Blocks   Id  System

      /dev/disks/vmhba1:0:0:1             5       750    763904    5  Extended

      /dev/disks/vmhba1:0:0:2           751      4845   4193280    6  FAT16

      /dev/disks/vmhba1:0:0:3          4846    238419 239179345   fb  VMFS

      /dev/disks/vmhba1:0:0:4   *         1         4      4080    4  FAT16 <32M

      /dev/disks/vmhba1:0:0:5             5        52     49136    6  FAT16

      /dev/disks/vmhba1:0:0:6            53       100     49136    6  FAT16

      /dev/disks/vmhba1:0:0:7           101       210    112624   fc  VMKcore

      /dev/disks/vmhba1:0:0:8           211       750    552944    6  FAT16

      Partition table entries are not in disk order
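
For my own sanity I filtered that output down to just the entries with Id fb (the VMFS partition type), using a throwaway script like this (the sample lines are pasted from the fdisk output above; the helper name is mine, nothing here is part of ESXi):

```python
# Throwaway filter: pull out the partitions fdisk flags with Id fb (VMFS).
FDISK_OUTPUT = """\
/dev/disks/vmhba100:0:0:1             1     30395 244140593+  fb  VMFS
/dev/disks/vmhba1:0:0:3            4846    238419 239179345   fb  VMFS
/dev/disks/vmhba1:0:0:7             101       210    112624   fc  VMKcore
"""

def vmfs_partitions(fdisk_text):
    """Return the device column of every row whose Id column is 'fb'."""
    hits = []
    for line in fdisk_text.splitlines():
        fields = line.split()
        # A data row ends with "... <Id> <System>"; Id fb marks VMFS.
        if len(fields) >= 6 and fields[-2].lower() == "fb":
            hits.append(fields[0])
    return hits

print(vmfs_partitions(FDISK_OUTPUT))
# -> ['/dev/disks/vmhba100:0:0:1', '/dev/disks/vmhba1:0:0:3']
```

Both disks still carry a VMFS partition, so as far as I can tell the data itself should be intact.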


      fstab shows:

      ~ # cat /etc/fstab

      none                    /proc                   procfs    defaults        0 0

      none                    /vmfs/volumes           vcfs      defaults        0 0

      none                    /tmp                    visorfs   2,128,tmp       0 0
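
If I understand ESXi correctly, VMFS datastores never appear in fstab at all; the vmkernel mounts them under /vmfs/volumes on its own, which would explain why fstab looks normal while the Storage tab is blank. So I'm assuming the useful check is whether anything shows up under /vmfs/volumes and whether the logs mention an LVM/VMFS error at boot (log path guessed from ESX 3.x; it may differ on 3i):

      ~ # ls -l /vmfs/volumes/
      ~ # grep -i -e vmfs -e lvm /var/log/messages | tail -20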


So, how do I gather up my marbles and get this thing functioning again?

      The disks seem to be there, but ESXi is not mounting them.

I am looking for a clean recovery procedure before I try a bull-in-the-china-shop

hack job. I really, really do not want to lose my VMs.


      Thanks, Leo