Tom31415
Enthusiast

Missing parent of the virtual disk

Hello,

I found out that one of my disks is causing a problem because its parent is missing (it is a snapshot).

# govc disk.ls
govc: retrieve "5cca7884-aa5f-4e6e-89d3-f4bef9e1a50b": ServerFaultCode: A general system error occurred: The parent of this virtual disk could not be opened: api = DiskLib_Open, diskPath->CValue() = /vmfs/volumes/vsan:52e071236c8cc082-86e2156602f6684c/82b4bb5f-d24b-a89f-7d02-0cc47aa8de6e/a4cdb25f088244e59e3196f8aa717d26-000021.vmdk
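
(For context, govc talks to vCenter through the usual GOVC_* environment variables; a sketch with placeholder values, the hostname and credentials below are illustrative only:)

# Placeholder govc connection settings - substitute your own vCenter and credentials:
export GOVC_URL='https://vcenter.example.local'
export GOVC_USERNAME='administrator@vsphere.local'
export GOVC_PASSWORD='...'
export GOVC_INSECURE=1
govc disk.ls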

And this is how it looks on the vSAN datastore; I just followed the parentFileNameHint parameter:

[root@esx1:/vmfs/volumes/vsan:52e071236c8cc082-86e2156602f6684c/82b4bb5f-d24b-a89f-7d02-0cc47aa8de6e] cat a4cdb25f088244e59e3196f8aa717d26-000021.vmdk
# Disk DescriptorFile
version=5
encoding="UTF-8"
CID=9a705cec
parentCID=a2c64ecc
createType="vsanSparse"
parentFileNameHint="a4cdb25f088244e59e3196f8aa717d26-000010.vmdk"
# Extent description
RW 104857600 VSANSPARSE "vsan://52e071236c8cc082-86e2156602f6684c/f64a0160-3631-f53a-3b7f-0cc47aa8ddd6"

# The Disk Data Base 
#DDB

ddb.longContentID = "55db0e33c45f3d841ac1426a9a705cec"
ddb.sidecars = "fcdmdsidecar,a4cdb25f088244e59e3196f8aa717d26-000021-5b2d72ca1986b005.vmfd"
[root@esx1:/vmfs/volumes/vsan:52e071236c8cc082-86e2156602f6684c/82b4bb5f-d24b-a89f-7d02-0cc47aa8de6e] cat a4cdb25f088244e59e3196f8aa717d26-000010.vmdk
# Disk DescriptorFile
version=5
encoding="UTF-8"
CID=a2c64ecc
parentCID=c4dbcc60
createType="vsanSparse"
parentFileNameHint="a4cdb25f088244e59e3196f8aa717d26-000008.vmdk"
# Extent description
RW 104857600 VSANSPARSE "vsan://52e071236c8cc082-86e2156602f6684c/3bb00060-11eb-d669-ef35-0cc47aa8ddd6"

# The Disk Data Base 
#DDB

ddb.fcd.snap.CreateTime = "1610697465782370"
ddb.fcd.snap.Desc = ""
ddb.fcd.snap.Id = "63 b3 eb 88 c4 d8 4f 37-89 bd 45 18 d7 d9 3f d2"
ddb.longContentID = "fd0330f6148670f6ac0e36c3a2c64ecc"
ddb.sidecars = "fcdmdsidecar,a4cdb25f088244e59e3196f8aa717d26-000010-3ef0edf410ed4a06.vmfd"
[root@esx1:/vmfs/volumes/vsan:52e071236c8cc082-86e2156602f6684c/82b4bb5f-d24b-a89f-7d02-0cc47aa8de6e] cat a4cdb25f088244e59e3196f8aa717d26-000008.vmdk
# Disk DescriptorFile
version=5
encoding="UTF-8"
CID=c4dbcc60
parentCID=fb42af0c
createType="vsanSparse"
parentFileNameHint="a4cdb25f088244e59e3196f8aa717d26-000005.vmdk"
# Extent description
RW 104857600 VSANSPARSE "vsan://52e071236c8cc082-86e2156602f6684c/21940060-fa94-3b5c-427e-0cc47aa8ddd6"

# The Disk Data Base 
#DDB

ddb.fcd.snap.CreateTime = "1610657854344769"
ddb.fcd.snap.Desc = ""
ddb.fcd.snap.Id = "46 7a 0d f0 07 c2 47 52-84 ad a3 9d ab 02 9d 8d"
ddb.longContentID = "8ac41b028b10f4ce8451f834c4dbcc60"
ddb.sidecars = "fcdmdsidecar,a4cdb25f088244e59e3196f8aa717d26-000008-f265bfe5ae2db1de.vmfd"
[root@esx1:/vmfs/volumes/vsan:52e071236c8cc082-86e2156602f6684c/82b4bb5f-d24b-a89f-7d02-0cc47aa8de6e] cat a4cdb25f088244e59e3196f8aa717d26-000005.vmdk
cat: can't open 'a4cdb25f088244e59e3196f8aa717d26-000005.vmdk': No such file or directory
[root@esx1:/vmfs/volumes/vsan:52e071236c8cc082-86e2156602f6684c/82b4bb5f-d24b-a89f-7d02-0cc47aa8de6e]
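
(For reference, the same walk can be scripted instead of cat-ing each descriptor by hand; a rough sketch, assuming the descriptor format shown above and running from the namespace directory:)

# Follow parentFileNameHint from a given descriptor until the chain ends or breaks.
d=a4cdb25f088244e59e3196f8aa717d26-000021.vmdk
while [ -n "$d" ]; do
    [ -f "$d" ] || { echo "MISSING: $d"; break; }
    echo "found: $d"
    # Naive parse of the parent hint; assumes one hint per descriptor, as above.
    d=$(sed -n 's/^parentFileNameHint="\(.*\)"/\1/p' "$d")
done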

There are more files from the same disk, but they are not part of the same chain:

-rw-------    1 root     root        135168 Jan 14  2021 a4cdb25f088244e59e3196f8aa717d26-000001-23d1e54c3f26b476.vmfd
-rw-------    1 root     root           608 Jan 14  2021 a4cdb25f088244e59e3196f8aa717d26-000001.vmdk
-rw-------    1 root     root        135168 Jan 14  2021 a4cdb25f088244e59e3196f8aa717d26-000002-3bfdd1921d0f7a89.vmfd
-rw-------    1 root     root           608 Jan 14  2021 a4cdb25f088244e59e3196f8aa717d26-000002.vmdk
-rw-------    1 root     root        135168 Jan 14  2021 a4cdb25f088244e59e3196f8aa717d26-000003-f4bb70a78877c3bf.vmfd
-rw-------    1 root     root           608 Jan 14  2021 a4cdb25f088244e59e3196f8aa717d26-000003.vmdk
-rw-------    1 root     root        135168 Feb  1 09:12 a4cdb25f088244e59e3196f8aa717d26-000004-986a405a7ff8a124.vmfd
-rw-------    1 root     root           608 Feb  1 09:12 a4cdb25f088244e59e3196f8aa717d26-000004.vmdk
-rw-------    1 root     root        135168 Feb  1 09:11 a4cdb25f088244e59e3196f8aa717d26-000006-1e479848b9a7ee2c.vmfd
-rw-------    1 root     root           608 Feb  1 09:12 a4cdb25f088244e59e3196f8aa717d26-000006.vmdk
-rw-------    1 root     root        135168 Feb  1 09:11 a4cdb25f088244e59e3196f8aa717d26-000008-f265bfe5ae2db1de.vmfd
-rw-------    1 root     root           608 Feb  1 09:12 a4cdb25f088244e59e3196f8aa717d26-000008.vmdk
-rw-------    1 root     root        135168 Feb  1 09:11 a4cdb25f088244e59e3196f8aa717d26-000010-3ef0edf410ed4a06.vmfd
-rw-------    1 root     root           608 Feb  1 09:11 a4cdb25f088244e59e3196f8aa717d26-000010.vmdk
-rw-------    1 root     root        135168 Feb  1 09:10 a4cdb25f088244e59e3196f8aa717d26-000011-4eceb58323bb5266.vmfd
-rw-------    1 root     root           608 Feb  1 09:11 a4cdb25f088244e59e3196f8aa717d26-000011.vmdk
-rw-------    1 root     root        135168 Jan 15 09:09 a4cdb25f088244e59e3196f8aa717d26-000014-b4c5ba36170adee0.vmfd
-rw-------    1 root     root           601 Jan 15 09:09 a4cdb25f088244e59e3196f8aa717d26-000014.vmdk
-rw-------    1 root     root        135168 Jan 15 08:25 a4cdb25f088244e59e3196f8aa717d26-000021-5b2d72ca1986b005.vmfd
-rw-------    1 root     root           472 Feb  1 09:11 a4cdb25f088244e59e3196f8aa717d26-000021.vmdk
-rw-------    1 root     root        135168 Jan 14  2021 a4cdb25f088244e59e3196f8aa717d26-000024-9de6bc8913f23493.vmfd
-rw-------    1 root     root           608 Jan 15 09:09 a4cdb25f088244e59e3196f8aa717d26-000024.vmdk
-rw-------    1 root     root        135168 Jan 14  2021 a4cdb25f088244e59e3196f8aa717d26-000025-5026379f45f190a8.vmfd
-rw-------    1 root     root           608 Jan 14  2021 a4cdb25f088244e59e3196f8aa717d26-000025.vmdk
-rw-------    1 root     root        135168 Jan 14 23:11 a4cdb25f088244e59e3196f8aa717d26-9d67426037a5704a.vmfd
-rw-------    1 root     root           945 Jan 14 23:11 a4cdb25f088244e59e3196f8aa717d26.vmdk

And the relations established by the parentFileNameHint parameter are:

base disk <- 14 <- 24 <- 25 <- 1 <- 2 <- 3 <- 4 <- 6
missing 5 <- 8 <- 10 <- 21
                   ^- 11
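
(A quick way to cross-check these relations in one pass; grep prefixes each match with its file name when given multiple files:)

grep -E '^(CID|parentCID|parentFileNameHint)' a4cdb25f088244e59e3196f8aa717d26-*.vmdk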

It is a snapshot from the Kubernetes backup tool Kasten, which probably got some error from vCenter and did not remove the snapshots/disk. I have to use the govc disk.ls command to solve another problem, and I cannot because of this. Can you help me remove this broken disk, please?

4 Replies
scott28tt
VMware Employee

There are areas of VMTN for vSphere and vSAN; I have reported this thread to ask the moderators to move it to one of those areas.


---------------------------------------------------------------

Although I am a VMware employee, I contribute to VMware Communities voluntarily (i.e. not in any official capacity).
VMware Training & Certification blog
Tom31415
Enthusiast

[Attached image: Tom31415_0-1626425939753.png]

Highlighted relations between the vmdk files. I just need to remove it all; there is no important data.
TheBobkin
Champion

@Tom31415, this is nothing to do with vSAN, but it can be remediated by pointing the 'hard disk' of the VM to the last viable snapshot in the chain (a4cdb25f088244e59e3196f8aa717d26-000006.vmdk).

You can do this either by editing the .vmx so that this snapshot is pointed to instead of the current vmdk (a4cdb25f088244e59e3196f8aa717d26-000021.vmdk) and then reloading/re-registering the VM,

or

via the vSphere UI (click the VM > Edit Settings > detach the current Hard Disk > Add Existing Hard Disk > select a4cdb25f088244e59e3196f8aa717d26-000006.vmdk).
The implication is of course that any data written to the snapshots 'above' this point will be gone.
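
For the .vmx route, the edit would look something like this (a sketch; the scsi0:0 disk label and the VM id are placeholders for illustration):

# In the VM's .vmx, repoint the disk at the last viable snapshot
# (scsi0:0 is an assumed label - check your own .vmx):
scsi0:0.fileName = "a4cdb25f088244e59e3196f8aa717d26-000006.vmdk"

# Then make ESXi re-read the .vmx:
vim-cmd vmsvc/getallvms          # find the VM's id
vim-cmd vmsvc/reload <vmid>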

Tom31415
Enthusiast

The disk was not mounted to any VM, but your point brought me to a solution. I changed the content of the vmdk files and made a valid chain using the CID, parentCID and parentFileNameHint parameters, then mounted the last one to my test VM and simply deleted the disk with the option to remove it from the datastore. I had 5 broken disks, but they are gone now and the govc disk.ls command is working again.
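
In descriptor terms the edit was essentially this (a sketch with placeholders, since the exact values depend on your own chain):

# In the orphaned child descriptor (e.g. a4cdb25f088244e59e3196f8aa717d26-000008.vmdk),
# repoint the parent link at a descriptor that still exists, and make parentCID
# match that file's CID line. Both values below are placeholders:
parentCID=<CID of the chosen parent descriptor>
parentFileNameHint="<existing parent descriptor>.vmdk"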

Thank you
