WilliamOoi
Contributor

Missing VMware Capacity

Hi,

I need your help with a VMware issue. I created a VMFS 6 datastore on a 1.82TB disk, but after the datastore was created, I only see a 931.25GB datastore. It seems about 924.14GB is missing.

Local Storage (1.82TB)

[Screenshot: WilliamOoi_0-1630395510203.png]

After the datastore was created, only 931.25GB is shown

[Screenshot: WilliamOoi_1-1630395531587.png]

I can see partitions (3) and (10), but I am not able to use them.

[Screenshot: WilliamOoi_2-1630395552679.png]

a_p_
Leadership

The screenshot shows two VMFS partitions!?
Please remember that VMware only supports a single VMFS datastore per LUN/device.

Why did you create an additional datastore instead of expanding the existing one?
Expanding the VMFS datastore (Partition 3 by default) on the installation disk has to be done from the command line though.
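
For reference, the general command-line procedure to grow the VMFS datastore on the installation disk looks roughly like the following. The device name and sector numbers are placeholders; the real values have to be taken from your own partition table, and the space behind Partition 3 must of course be free (i.e. not occupied by another partition):

  1. partedUtil getptbl /vmfs/devices/disks/<naa.ID>  (note Partition 3's start sector and the disk's last usable sector)
  2. partedUtil resize /vmfs/devices/disks/<naa.ID> 3 <originalStartSector> <newEndSector>
  3. vmkfstools --growfs /vmfs/devices/disks/<naa.ID>:3 /vmfs/devices/disks/<naa.ID>:3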

André

WilliamOoi
Contributor

Thanks for pointing this out to me. I am a newbie with VMware. What should I do to use the other VMFS partition to expand the default VMFS partition?

a_p_
Leadership

In order to give you advice, it's important to understand the current state.

  1. What does the partition layout look like for the RAID1 LUN (279GB)? (See the note below this list for a CLI command that lists it.)
  2. Did you expand the RAID5 LUN after ESXi had already been installed?
  3. Do you have a backup of all your VMs?
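
For question 1, a screenshot of the partition layout is fine; alternatively, it can be listed from the ESXi shell with a command along these lines (the device name is a placeholder for the RAID1 LUN's naa.ID):

partedUtil getptbl /vmfs/devices/disks/<naa.ID>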

André

WilliamOoi
Contributor

Thanks.

  1. What does the partition layout look like for the RAID1 LUN (279GB)?

Ans: The RAID 1 LUN (279GB) looks fine and works as a normal datastore after creation. It does not have a missing partition.

  2. Did you expand the RAID5 LUN after ESXi had already been installed?

Ans: During the RAID 5 LUN datastore creation, I chose to use the full disk, but it seems the disk ended up partitioned into two VMFS partitions. It is strange.

  3. Do you have a backup of all your VMs?

Ans: Yes, I do. I plan to redo everything if this is the fastest solution.

a_p_
Leadership

I was actually thinking of the partition structure (a screenshot like the one you posted for the RAID5 LUN) when I asked you for the partition layout of the RAID1 LUN. Does the RAID1 LUN contain just a VMFS datastore, or does it have other partitions too?

André

WilliamOoi
Contributor

[Screenshot: WilliamOoi_0-1630587077596.png]

The RAID 1 LUN looks fine; I don't understand why this happened only to the RAID 5 LUN. Should I proceed with the CLI, or rebuild?

 

a_p_
Leadership

According to the partition layout, you have an ESXi installation on both volumes!?
It would now be interesting to find out which one is the active one (the boot volume). In case it's the RAID1 volume, then you could - after backing up your VMs - delete the partition table on the RAID5 volume, and create a new datastore using the complete disk space.

André

WilliamOoi
Contributor

My environment is as follows: I am using 1 host with 2 disks, RAID1 - 300GB and RAID5 - 1.8TB.

I then created a datastore on each of these two RAID disks: Datastore RAID1 - 300GB, and Datastore RAID5 - 1.8TB. I don't know why or how the RAID5 datastore disk got partitioned into two, with only one partition usable. Sorry to mention again that I am quite a newbie in this field, so I am seeking any advice or solution from you guys.

a_p_
Leadership

Ok, let's find out what may be the best way to fix this.

Please enable the SSH service (Manage -> Services) on the ESXi host and use e.g. PuTTY to connect to the host.
Once connected, resize the command window to avoid line breaks and make the output easier to read.

Now run the following commands, and paste the output (as plain text please) in your next reply:

  1. esxcli storage filesystem list
  2. ls -lh /dev/disks/
  3. ls -l /
  4. vmkfstools -P /vmfs/volumes/<UUID>  (the <UUID> that shows up in the output for "bootbank" from the previous command)

André

WilliamOoi
Contributor

Hi,

I've attached my SSH logs for your reference. Shall I use vmkfstools -P /vmfs/volumes/5c7d226c-f72b53ca-1efb-20677ce19980 for my UUID? This is my RAID5 UUID.

 

 

[root@MUMSESXi1:~] esxcli storage filesystem list
Mount Point                                        Volume Name  UUID                                 Mounted  Type            Size          Free
-------------------------------------------------  -----------  -----------------------------------  -------  ------  ------------  ------------
/vmfs/volumes/5c7d1b5c-129f5a5e-96fd-20677ce19980  VMFS_RAID1   5c7d1b5c-129f5a5e-96fd-20677ce19980     true  VMFS-6  291789340672   55068065792
/vmfs/volumes/5c7d226c-f72b53ca-1efb-20677ce19980  VMFS_RAID5   5c7d226c-f72b53ca-1efb-20677ce19980     true  VMFS-6  999922073600  153391988736
/vmfs/volumes/a9b8bd7e-9f22501c-df96-1ed21149dc83               a9b8bd7e-9f22501c-df96-1ed21149dc83     true  vfat       261853184      69181440
/vmfs/volumes/5c7d1b5d-23f0614a-53e7-20677ce19980               5c7d1b5d-23f0614a-53e7-20677ce19980     true  vfat      4293591040    4271046656
/vmfs/volumes/5c7d1b52-15bbccba-fac4-20677ce19980               5c7d1b52-15bbccba-fac4-20677ce19980     true  vfat       299712512      80486400
/vmfs/volumes/2eae7f7f-21ea1025-dbaf-604640e515b1               2eae7f7f-21ea1025-dbaf-604640e515b1     true  vfat       261853184     261844992
[root@MUMSESXi1:~] ls -lh /dev/disks/
total 4492787500
-rw-------    1 root     root      279.4G Sep  6 03:42 naa.600508b1001c4909f9a5fb2b2fb5d774
-rw-------    1 root     root        4.0M Sep  6 03:42 naa.600508b1001c4909f9a5fb2b2fb5d774:1
-rw-------    1 root     root        4.0G Sep  6 03:42 naa.600508b1001c4909f9a5fb2b2fb5d774:2
-rw-------    1 root     root      272.0G Sep  6 03:42 naa.600508b1001c4909f9a5fb2b2fb5d774:3
-rw-------    1 root     root      250.0M Sep  6 03:42 naa.600508b1001c4909f9a5fb2b2fb5d774:5
-rw-------    1 root     root      250.0M Sep  6 03:42 naa.600508b1001c4909f9a5fb2b2fb5d774:6
-rw-------    1 root     root      110.0M Sep  6 03:42 naa.600508b1001c4909f9a5fb2b2fb5d774:7
-rw-------    1 root     root      286.0M Sep  6 03:42 naa.600508b1001c4909f9a5fb2b2fb5d774:8
-rw-------    1 root     root        2.5G Sep  6 03:42 naa.600508b1001c4909f9a5fb2b2fb5d774:9
-rw-------    1 root     root        1.8T Sep  6 03:42 naa.600508b1001cdf29816ef9a152af1315
-rw-------    1 root     root        4.0M Sep  6 03:42 naa.600508b1001cdf29816ef9a152af1315:1
-rw-------    1 root     root      931.4G Sep  6 03:42 naa.600508b1001cdf29816ef9a152af1315:10
-rw-------    1 root     root        4.0G Sep  6 03:42 naa.600508b1001cdf29816ef9a152af1315:2
-rw-------    1 root     root      924.1G Sep  6 03:42 naa.600508b1001cdf29816ef9a152af1315:3
-rw-------    1 root     root      250.0M Sep  6 03:42 naa.600508b1001cdf29816ef9a152af1315:5
-rw-------    1 root     root      250.0M Sep  6 03:42 naa.600508b1001cdf29816ef9a152af1315:6
-rw-------    1 root     root      110.0M Sep  6 03:42 naa.600508b1001cdf29816ef9a152af1315:7
-rw-------    1 root     root      286.0M Sep  6 03:42 naa.600508b1001cdf29816ef9a152af1315:8
-rw-------    1 root     root        2.5G Sep  6 03:42 naa.600508b1001cdf29816ef9a152af1315:9
lrwxrwxrwx    1 root     root          36 Sep  6 03:42 vml.0200000000600508b1001c4909f9a5fb2b2fb5d7744c4f47494341 -> naa.600508b1001c4909f9a5fb2b2fb5d774
lrwxrwxrwx    1 root     root          38 Sep  6 03:42 vml.0200000000600508b1001c4909f9a5fb2b2fb5d7744c4f47494341:1 -> naa.600508b1001c4909f9a5fb2b2fb5d774:1
lrwxrwxrwx    1 root     root          38 Sep  6 03:42 vml.0200000000600508b1001c4909f9a5fb2b2fb5d7744c4f47494341:2 -> naa.600508b1001c4909f9a5fb2b2fb5d774:2
lrwxrwxrwx    1 root     root          38 Sep  6 03:42 vml.0200000000600508b1001c4909f9a5fb2b2fb5d7744c4f47494341:3 -> naa.600508b1001c4909f9a5fb2b2fb5d774:3
lrwxrwxrwx    1 root     root          38 Sep  6 03:42 vml.0200000000600508b1001c4909f9a5fb2b2fb5d7744c4f47494341:5 -> naa.600508b1001c4909f9a5fb2b2fb5d774:5
lrwxrwxrwx    1 root     root          38 Sep  6 03:42 vml.0200000000600508b1001c4909f9a5fb2b2fb5d7744c4f47494341:6 -> naa.600508b1001c4909f9a5fb2b2fb5d774:6
lrwxrwxrwx    1 root     root          38 Sep  6 03:42 vml.0200000000600508b1001c4909f9a5fb2b2fb5d7744c4f47494341:7 -> naa.600508b1001c4909f9a5fb2b2fb5d774:7
lrwxrwxrwx    1 root     root          38 Sep  6 03:42 vml.0200000000600508b1001c4909f9a5fb2b2fb5d7744c4f47494341:8 -> naa.600508b1001c4909f9a5fb2b2fb5d774:8
lrwxrwxrwx    1 root     root          38 Sep  6 03:42 vml.0200000000600508b1001c4909f9a5fb2b2fb5d7744c4f47494341:9 -> naa.600508b1001c4909f9a5fb2b2fb5d774:9
lrwxrwxrwx    1 root     root          36 Sep  6 03:42 vml.0200000000600508b1001cdf29816ef9a152af13154c4f47494341 -> naa.600508b1001cdf29816ef9a152af1315
lrwxrwxrwx    1 root     root          38 Sep  6 03:42 vml.0200000000600508b1001cdf29816ef9a152af13154c4f47494341:1 -> naa.600508b1001cdf29816ef9a152af1315:1
lrwxrwxrwx    1 root     root          39 Sep  6 03:42 vml.0200000000600508b1001cdf29816ef9a152af13154c4f47494341:10 -> naa.600508b1001cdf29816ef9a152af1315:10
lrwxrwxrwx    1 root     root          38 Sep  6 03:42 vml.0200000000600508b1001cdf29816ef9a152af13154c4f47494341:2 -> naa.600508b1001cdf29816ef9a152af1315:2
lrwxrwxrwx    1 root     root          38 Sep  6 03:42 vml.0200000000600508b1001cdf29816ef9a152af13154c4f47494341:3 -> naa.600508b1001cdf29816ef9a152af1315:3
lrwxrwxrwx    1 root     root          38 Sep  6 03:42 vml.0200000000600508b1001cdf29816ef9a152af13154c4f47494341:5 -> naa.600508b1001cdf29816ef9a152af1315:5
lrwxrwxrwx    1 root     root          38 Sep  6 03:42 vml.0200000000600508b1001cdf29816ef9a152af13154c4f47494341:6 -> naa.600508b1001cdf29816ef9a152af1315:6
lrwxrwxrwx    1 root     root          38 Sep  6 03:42 vml.0200000000600508b1001cdf29816ef9a152af13154c4f47494341:7 -> naa.600508b1001cdf29816ef9a152af1315:7
lrwxrwxrwx    1 root     root          38 Sep  6 03:42 vml.0200000000600508b1001cdf29816ef9a152af13154c4f47494341:8 -> naa.600508b1001cdf29816ef9a152af1315:8
lrwxrwxrwx    1 root     root          38 Sep  6 03:42 vml.0200000000600508b1001cdf29816ef9a152af13154c4f47494341:9 -> naa.600508b1001cdf29816ef9a152af1315:9
[root@MUMSESXi1:~] ls -l /
total 1197
lrwxrwxrwx    1 root     root            49 Aug 13 02:56 altbootbank -> /vmfs/volumes/2eae7f7f-21ea1025-dbaf-604640e515b1
drwxr-xr-x    1 root     root           512 Aug 13 02:55 bin
lrwxrwxrwx    1 root     root            49 Aug 13 02:56 bootbank -> /vmfs/volumes/a9b8bd7e-9f22501c-df96-1ed21149dc83
-r--r--r--    1 root     root        528545 Aug  8  2018 bootpart.gz
-r--r--r--    1 root     root        414341 Aug  8  2018 bootpart4kn.gz
drwxr-xr-x   19 root     root           512 Sep  6 03:42 dev
drwxr-xr-x    1 root     root           512 Sep  6 03:31 etc
drwxr-xr-x    1 root     root           512 Aug 13 02:55 lib
drwxr-xr-x    1 root     root           512 Aug 13 02:55 lib64
-r-x------    1 root     root         54559 Aug 13 02:01 local.tgz
lrwxrwxrwx    1 root     root             6 Aug 13 02:56 locker -> /store
drwxr-xr-x    1 root     root           512 Aug 13 02:55 mbr
drwxr-xr-x    1 root     root           512 Aug 13 02:55 opt
drwxr-xr-x    1 root     root        131072 Sep  6 03:42 proc
lrwxrwxrwx    1 root     root            29 Aug 13 02:56 productLocker -> /locker/packages/vmtoolsRepo/
lrwxrwxrwx    1 root     root             4 Aug  8  2018 sbin -> /bin
lrwxrwxrwx    1 root     root            49 Aug 13 02:56 scratch -> /vmfs/volumes/5c7d1b5d-23f0614a-53e7-20677ce19980
lrwxrwxrwx    1 root     root            49 Aug 13 02:56 store -> /vmfs/volumes/5c7d1b52-15bbccba-fac4-20677ce19980
drwxr-xr-x    1 root     root           512 Aug 13 02:55 tardisks
drwxr-xr-x    1 root     root           512 Aug 13 02:55 tardisks.noauto
drwxrwxrwt    1 root     root           512 Sep  6 03:42 tmp
drwxr-xr-x    1 root     root           512 Aug 13 02:55 usr
drwxr-xr-x    1 root     root           512 Aug 13 02:56 var
drwxr-xr-x    1 root     root           512 Aug 13 02:55 vmfs
drwxr-xr-x    1 root     root           512 Aug 13 02:55 vmimages
lrwxrwxrwx    1 root     root            18 Aug  8  2018 vmupgrade -> /locker/vmupgrade/

 

a_p_
Leadership

>>> Shall I use vmkfstools -P /vmfs/volumes/5c7d226c-f72b53ca-1efb-20677ce19980 for my UUID? This is my RAID5 UUID.

It's the bootbank's UUID that's needed, i.e.

vmkfstools -P /vmfs/volumes/a9b8bd7e-9f22501c-df96-1ed21149dc83

André

WilliamOoi
Contributor

Hi,

I've attached my CLI logs, and I've also pasted them here.

 

 

[root@MUMSESXi1:~] esxcli storage filesystem list
Mount Point                                        Volume Name  UUID                                 Mounted  Type            Size          Free
-------------------------------------------------  -----------  -----------------------------------  -------  ------  ------------  ------------
/vmfs/volumes/5c7d1b5c-129f5a5e-96fd-20677ce19980  VMFS_RAID1   5c7d1b5c-129f5a5e-96fd-20677ce19980     true  VMFS-6  291789340672   55068065792
/vmfs/volumes/5c7d226c-f72b53ca-1efb-20677ce19980  VMFS_RAID5   5c7d226c-f72b53ca-1efb-20677ce19980     true  VMFS-6  999922073600  153391988736
/vmfs/volumes/a9b8bd7e-9f22501c-df96-1ed21149dc83               a9b8bd7e-9f22501c-df96-1ed21149dc83     true  vfat       261853184      69181440
/vmfs/volumes/5c7d1b5d-23f0614a-53e7-20677ce19980               5c7d1b5d-23f0614a-53e7-20677ce19980     true  vfat      4293591040    4267507712
/vmfs/volumes/5c7d1b52-15bbccba-fac4-20677ce19980               5c7d1b52-15bbccba-fac4-20677ce19980     true  vfat       299712512      80486400
/vmfs/volumes/2eae7f7f-21ea1025-dbaf-604640e515b1               2eae7f7f-21ea1025-dbaf-604640e515b1     true  vfat       261853184     261844992
[root@MUMSESXi1:~] ls -lh /dev/disks/
total 4492787500
-rw-------    1 root     root      279.4G Sep  7 02:12 naa.600508b1001c4909f9a5fb2b2fb5d774
-rw-------    1 root     root        4.0M Sep  7 02:12 naa.600508b1001c4909f9a5fb2b2fb5d774:1
-rw-------    1 root     root        4.0G Sep  7 02:12 naa.600508b1001c4909f9a5fb2b2fb5d774:2
-rw-------    1 root     root      272.0G Sep  7 02:12 naa.600508b1001c4909f9a5fb2b2fb5d774:3
-rw-------    1 root     root      250.0M Sep  7 02:12 naa.600508b1001c4909f9a5fb2b2fb5d774:5
-rw-------    1 root     root      250.0M Sep  7 02:12 naa.600508b1001c4909f9a5fb2b2fb5d774:6
-rw-------    1 root     root      110.0M Sep  7 02:12 naa.600508b1001c4909f9a5fb2b2fb5d774:7
-rw-------    1 root     root      286.0M Sep  7 02:12 naa.600508b1001c4909f9a5fb2b2fb5d774:8
-rw-------    1 root     root        2.5G Sep  7 02:12 naa.600508b1001c4909f9a5fb2b2fb5d774:9
-rw-------    1 root     root        1.8T Sep  7 02:12 naa.600508b1001cdf29816ef9a152af1315
-rw-------    1 root     root        4.0M Sep  7 02:12 naa.600508b1001cdf29816ef9a152af1315:1
-rw-------    1 root     root      931.4G Sep  7 02:12 naa.600508b1001cdf29816ef9a152af1315:10
-rw-------    1 root     root        4.0G Sep  7 02:12 naa.600508b1001cdf29816ef9a152af1315:2
-rw-------    1 root     root      924.1G Sep  7 02:12 naa.600508b1001cdf29816ef9a152af1315:3
-rw-------    1 root     root      250.0M Sep  7 02:12 naa.600508b1001cdf29816ef9a152af1315:5
-rw-------    1 root     root      250.0M Sep  7 02:12 naa.600508b1001cdf29816ef9a152af1315:6
-rw-------    1 root     root      110.0M Sep  7 02:12 naa.600508b1001cdf29816ef9a152af1315:7
-rw-------    1 root     root      286.0M Sep  7 02:12 naa.600508b1001cdf29816ef9a152af1315:8
-rw-------    1 root     root        2.5G Sep  7 02:12 naa.600508b1001cdf29816ef9a152af1315:9
lrwxrwxrwx    1 root     root          36 Sep  7 02:12 vml.0200000000600508b1001c4909f9a5fb2b2fb5d7744c4f47494341 -> naa.600508b1001c4909f9a5fb2b2fb5d774
lrwxrwxrwx    1 root     root          38 Sep  7 02:12 vml.0200000000600508b1001c4909f9a5fb2b2fb5d7744c4f47494341:1 -> naa.600508b1001c4909f9a5fb2b2fb5d774:1
lrwxrwxrwx    1 root     root          38 Sep  7 02:12 vml.0200000000600508b1001c4909f9a5fb2b2fb5d7744c4f47494341:2 -> naa.600508b1001c4909f9a5fb2b2fb5d774:2
lrwxrwxrwx    1 root     root          38 Sep  7 02:12 vml.0200000000600508b1001c4909f9a5fb2b2fb5d7744c4f47494341:3 -> naa.600508b1001c4909f9a5fb2b2fb5d774:3
lrwxrwxrwx    1 root     root          38 Sep  7 02:12 vml.0200000000600508b1001c4909f9a5fb2b2fb5d7744c4f47494341:5 -> naa.600508b1001c4909f9a5fb2b2fb5d774:5
lrwxrwxrwx    1 root     root          38 Sep  7 02:12 vml.0200000000600508b1001c4909f9a5fb2b2fb5d7744c4f47494341:6 -> naa.600508b1001c4909f9a5fb2b2fb5d774:6
lrwxrwxrwx    1 root     root          38 Sep  7 02:12 vml.0200000000600508b1001c4909f9a5fb2b2fb5d7744c4f47494341:7 -> naa.600508b1001c4909f9a5fb2b2fb5d774:7
lrwxrwxrwx    1 root     root          38 Sep  7 02:12 vml.0200000000600508b1001c4909f9a5fb2b2fb5d7744c4f47494341:8 -> naa.600508b1001c4909f9a5fb2b2fb5d774:8
lrwxrwxrwx    1 root     root          38 Sep  7 02:12 vml.0200000000600508b1001c4909f9a5fb2b2fb5d7744c4f47494341:9 -> naa.600508b1001c4909f9a5fb2b2fb5d774:9
lrwxrwxrwx    1 root     root          36 Sep  7 02:12 vml.0200000000600508b1001cdf29816ef9a152af13154c4f47494341 -> naa.600508b1001cdf29816ef9a152af1315
lrwxrwxrwx    1 root     root          38 Sep  7 02:12 vml.0200000000600508b1001cdf29816ef9a152af13154c4f47494341:1 -> naa.600508b1001cdf29816ef9a152af1315:1
lrwxrwxrwx    1 root     root          39 Sep  7 02:12 vml.0200000000600508b1001cdf29816ef9a152af13154c4f47494341:10 -> naa.600508b1001cdf29816ef9a152af1315:10
lrwxrwxrwx    1 root     root          38 Sep  7 02:12 vml.0200000000600508b1001cdf29816ef9a152af13154c4f47494341:2 -> naa.600508b1001cdf29816ef9a152af1315:2
lrwxrwxrwx    1 root     root          38 Sep  7 02:12 vml.0200000000600508b1001cdf29816ef9a152af13154c4f47494341:3 -> naa.600508b1001cdf29816ef9a152af1315:3
lrwxrwxrwx    1 root     root          38 Sep  7 02:12 vml.0200000000600508b1001cdf29816ef9a152af13154c4f47494341:5 -> naa.600508b1001cdf29816ef9a152af1315:5
lrwxrwxrwx    1 root     root          38 Sep  7 02:12 vml.0200000000600508b1001cdf29816ef9a152af13154c4f47494341:6 -> naa.600508b1001cdf29816ef9a152af1315:6
lrwxrwxrwx    1 root     root          38 Sep  7 02:12 vml.0200000000600508b1001cdf29816ef9a152af13154c4f47494341:7 -> naa.600508b1001cdf29816ef9a152af1315:7
lrwxrwxrwx    1 root     root          38 Sep  7 02:12 vml.0200000000600508b1001cdf29816ef9a152af13154c4f47494341:8 -> naa.600508b1001cdf29816ef9a152af1315:8
lrwxrwxrwx    1 root     root          38 Sep  7 02:12 vml.0200000000600508b1001cdf29816ef9a152af13154c4f47494341:9 -> naa.600508b1001cdf29816ef9a152af1315:9
[root@MUMSESXi1:~] ls -l /
total 1197
lrwxrwxrwx    1 root     root            49 Aug 13 02:56 altbootbank -> /vmfs/volumes/2eae7f7f-21ea1025-dbaf-604640e515b1
drwxr-xr-x    1 root     root           512 Aug 13 02:55 bin
lrwxrwxrwx    1 root     root            49 Aug 13 02:56 bootbank -> /vmfs/volumes/a9b8bd7e-9f22501c-df96-1ed21149dc83
-r--r--r--    1 root     root        528545 Aug  8  2018 bootpart.gz
-r--r--r--    1 root     root        414341 Aug  8  2018 bootpart4kn.gz
drwxr-xr-x   19 root     root           512 Sep  7 02:12 dev
drwxr-xr-x    1 root     root           512 Sep  7 00:32 etc
drwxr-xr-x    1 root     root           512 Aug 13 02:55 lib
drwxr-xr-x    1 root     root           512 Aug 13 02:55 lib64
-r-x------    1 root     root         54559 Aug 13 02:01 local.tgz
lrwxrwxrwx    1 root     root             6 Aug 13 02:56 locker -> /store
drwxr-xr-x    1 root     root           512 Aug 13 02:55 mbr
drwxr-xr-x    1 root     root           512 Aug 13 02:55 opt
drwxr-xr-x    1 root     root        131072 Sep  7 02:12 proc
lrwxrwxrwx    1 root     root            29 Aug 13 02:56 productLocker -> /locker/packages/vmtoolsRepo/
lrwxrwxrwx    1 root     root             4 Aug  8  2018 sbin -> /bin
lrwxrwxrwx    1 root     root            49 Aug 13 02:56 scratch -> /vmfs/volumes/5c7d1b5d-23f0614a-53e7-20677ce19980
lrwxrwxrwx    1 root     root            49 Aug 13 02:56 store -> /vmfs/volumes/5c7d1b52-15bbccba-fac4-20677ce19980
drwxr-xr-x    1 root     root           512 Aug 13 02:55 tardisks
drwxr-xr-x    1 root     root           512 Aug 13 02:55 tardisks.noauto
drwxrwxrwt    1 root     root           512 Sep  7 02:11 tmp
drwxr-xr-x    1 root     root           512 Aug 13 02:55 usr
drwxr-xr-x    1 root     root           512 Aug 13 02:56 var
drwxr-xr-x    1 root     root           512 Aug 13 02:55 vmfs
drwxr-xr-x    1 root     root           512 Aug 13 02:55 vmimages
lrwxrwxrwx    1 root     root            18 Aug  8  2018 vmupgrade -> /locker/vmupgrade/
[root@MUMSESXi1:~] vmkfstools -P /vmfs/volumes/a9b8bd7e-9f22501c-df96-1ed21149dc83
vfat-0.04 (Raw Major Version: 0) file system spanning 1 partitions.
File system label (if any):
Mode: private
Capacity 261853184 (63929 file blocks * 4096), 69181440 (16890 blocks) avail, max supported file size 0
Disk Block Size: 512/0/0
UUID: a9b8bd7e-9f22501c-df96-1ed21149dc83
Partitions spanned (on "disks"):
naa.600508b1001c4909f9a5fb2b2fb5d774:5
Is Native Snapshot Capable: NO
[root@MUMSESXi1:~]
a_p_
Leadership

According to the commands' output, the system boots from the small device (naa.600508b1001c4909f9a5fb2b2fb5d774) - the bootbank volume a9b8bd7e-9f22501c-df96-1ed21149dc83 spans partition naa.600508b1001c4909f9a5fb2b2fb5d774:5.

What you may do is make sure that you back up everything from the large disk, then delete all partitions, and finally create a new datastore on it.

To delete the partitions, select Storage -> Devices -> naa.600508b1001cdf29816ef9a152af1315 -> Actions: "Clear partition table"

Since I'm not aware of the large disk's history, you may - after backing up the VMs on the VMFS_RAID5 datastore, and before deleting the partition table - delete this datastore to see whether the other VMFS partition (Partition 3) shows up in the GUI (you may need to rescan the adapter and/or reboot the host). This is just to ensure that this VMFS partition doesn't contain any valuable data.
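
If you prefer to verify this from the shell, the rescan and the partition-table check can also be done with commands along these lines (the device name is the large RAID5 disk from your earlier output):

  1. esxcli storage core adapter rescan --all
  2. partedUtil getptbl /vmfs/devices/disks/naa.600508b1001cdf29816ef9a152af1315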

André

WilliamOoi
Contributor

Thank you very much, André, for spending so much time troubleshooting this with me. I will back up my VMs and delete the partitions.
