VMware Cloud Community
JeremyJT
Contributor

Failed Snapshot Deletion after Veeam Backup

As a fairly new VMware user, I'd appreciate any help with what feels like a daunting task; I'm definitely swimming out of my depth here.

To begin, I recently recovered a failed ESXi RAID array and copied all of my VM directories to a brand new Dell R540 with ESXi 6.7.0 installed. The failed host was part of a vCenter setup with a replica server, but both are now out of commission. With the new host and the VMs back up and running, I set up Veeam (v10), and most of the backups have gone smoothly.

Unfortunately, three of my production VMs cause the Veeam backup jobs to fail, and each has a snapshot that I can't delete. What's more, Veeam's temporary snapshots don't get removed when the job fails, so I'm now facing an ever-growing snapshot chain. Both the Delete All and Consolidate commands fail, and I'm not sure how to proceed.

JeremyJT_0-1613683777862.png

JeremyJT_1-1613683842448.png

JeremyJT_2-1613683906366.png

Here's a snippet from vmware.log that appears repeatedly:

2021-02-14T06:32:22.118Z| vmx| I125: VigorTransportProcessClientPayload: opID=4fd39f31-22-1ccb seq=1228406: Receiving Snapshot.Delete request.
2021-02-14T06:32:22.122Z| vmx| I125: SNAPSHOT: SnapshotDeleteWork '/vmfs/volumes/6001b534-4c2a9dc6-f56e-2cea7f7f7cc5/374EPS_Clone/374EPS_Clone.vmx' : 139
2021-02-14T06:32:22.122Z| vmx| I125: DISKLIB-VMFS : "/vmfs/volumes/6001b534-4c2a9dc6-f56e-2cea7f7f7cc5/374EPS_Clone/374EPS_Clone-000007-sesparse.vmdk" : open successful (5) size = 4677914624, hd = 0. Type 19
2021-02-14T06:32:22.123Z| vmx| I125: DISKLIB-VMFS : "/vmfs/volumes/6001b534-4c2a9dc6-f56e-2cea7f7f7cc5/374EPS_Clone/374EPS_Clone-000007-sesparse.vmdk" : closed.
2021-02-14T06:32:22.123Z| vmx| I125: DISKLIB-VMFS : "/vmfs/volumes/6001b534-4c2a9dc6-f56e-2cea7f7f7cc5/374EPS_Clone/374EPS_Clone-000006-sesparse.vmdk" : open successful (5) size = 493924352, hd = 0. Type 19
2021-02-14T06:32:22.123Z| vmx| I125: DISKLIB-VMFS : "/vmfs/volumes/6001b534-4c2a9dc6-f56e-2cea7f7f7cc5/374EPS_Clone/374EPS_Clone-000006-sesparse.vmdk" : closed.
2021-02-14T06:32:22.123Z| vmx| I125: DISKLIB-VMFS : "/vmfs/volumes/6001b534-4c2a9dc6-f56e-2cea7f7f7cc5/374EPS_Clone/374EPS_Clone-000005-sesparse.vmdk" : open successful (5) size = 511524864, hd = 0. Type 19
2021-02-14T06:32:22.123Z| vmx| I125: DISKLIB-VMFS : "/vmfs/volumes/6001b534-4c2a9dc6-f56e-2cea7f7f7cc5/374EPS_Clone/374EPS_Clone-000005-sesparse.vmdk" : closed.
2021-02-14T06:32:22.123Z| vmx| I125: DISKLIB-VMFS : "/vmfs/volumes/6001b534-4c2a9dc6-f56e-2cea7f7f7cc5/374EPS_Clone/374EPS_Clone-000004-sesparse.vmdk" : open successful (5) size = 1261408256, hd = 0. Type 19
2021-02-14T06:32:22.123Z| vmx| I125: DISKLIB-VMFS : "/vmfs/volumes/6001b534-4c2a9dc6-f56e-2cea7f7f7cc5/374EPS_Clone/374EPS_Clone-000004-sesparse.vmdk" : closed.
2021-02-14T06:32:22.124Z| vmx| I125: DISKLIB-VMFS : "/vmfs/volumes/6001b534-4c2a9dc6-f56e-2cea7f7f7cc5/374EPS_Clone/374EPS_Clone-000003-sesparse.vmdk" : open successful (5) size = 2211381248, hd = 0. Type 19
2021-02-14T06:32:22.124Z| vmx| I125: DISKLIB-VMFS : "/vmfs/volumes/6001b534-4c2a9dc6-f56e-2cea7f7f7cc5/374EPS_Clone/374EPS_Clone-000003-sesparse.vmdk" : closed.
2021-02-14T06:32:22.124Z| vmx| I125: DISKLIB-VMFS : "/vmfs/volumes/6001b534-4c2a9dc6-f56e-2cea7f7f7cc5/374EPS_Clone/374EPS_Clone-000002-sesparse.vmdk" : open successful (5) size = 443559936, hd = 0. Type 19
2021-02-14T06:32:22.124Z| vmx| I125: DISKLIB-VMFS : "/vmfs/volumes/6001b534-4c2a9dc6-f56e-2cea7f7f7cc5/374EPS_Clone/374EPS_Clone-000002-sesparse.vmdk" : closed.
2021-02-14T06:32:22.124Z| vmx| I125: DISKLIB-VMFS : "/vmfs/volumes/6001b534-4c2a9dc6-f56e-2cea7f7f7cc5/374EPS_Clone/374EPS_Clone-flat.vmdk" : open successful (5) size = 107374182400, hd = 0. Type 3
2021-02-14T06:32:22.124Z| vmx| I125: DISKLIB-VMFS : "/vmfs/volumes/6001b534-4c2a9dc6-f56e-2cea7f7f7cc5/374EPS_Clone/374EPS_Clone-flat.vmdk" : closed.
2021-02-14T06:32:22.124Z| vmx| I125: SNAPSHOT: SnapshotDiskTreeAddFromSnapshot: Trying to add snapshot 374EPS_Clone-Snapshot138.vmsn to disk /vmfs/volumes/6001b534-4c2a9dc6-f56e-2cea7f7f7cc5/374EPS_Clone/374EPS_Clone.vmdk which already has snapshot 374EPS_Clone-Snapshot5.vmsn.
2021-02-14T06:32:22.124Z| vmx| I125: SNAPSHOT: SnapshotGenerateDeleteDisks Failed to fetch disk tree: One of the disks in this virtual machine is already in use by a virtual machine or by a snapshot (21)
2021-02-14T06:32:22.124Z| vmx| I125: SNAPSHOT: SnapshotDeleteNode failed: One of the disks in this virtual machine is already in use by a virtual machine or by a snapshot (21)
2021-02-14T06:32:22.124Z| vmx| I125: SNAPSHOT: Snapshot_Delete failed: One of the disks in this virtual machine is already in use by a virtual machine or by a snapshot (21)
2021-02-14T06:32:22.124Z| vmx| I125: VigorTransport_ServerSendResponse opID=4fd39f31-22-1ccb seq=1228406: Completed Snapshot request with messages.

The VM directory:

JeremyJT_3-1613684136054.png

Hardware usage:

JeremyJT_4-1613684236289.png

 

Here's the snapshot tree:

JeremyJT_5-1613684315170.png

Thank you all in advance!

 

1 Reply
Arvind_Kumar11
Enthusiast

Please check the KB article below to see if it resolves your issue:

https://kb.vmware.com/s/article/2150414 

Alternatively, you can restore the most recent backup of this VM, which will not have the snapshots, and use that restored machine instead.

Or you can back up the content of the affected virtual disk and remove it from the VM's hardware.

Once it is removed, consolidate the remaining disks, add a new virtual disk, and restore the content from the backup.
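Before removing any disks, it may also be worth attempting the snapshot deletion from the ESXi shell. This is only a rough sketch using the host's built-in `vim-cmd` and `vmkfstools` tools; `<vmid>` and `<datastore-uuid>` are placeholders you would fill in yourself, and the VM name is taken from the log excerpt in the original post:

```shell
# Look up the VM's ID by name
vim-cmd vmsvc/getallvms | grep 374EPS_Clone

# Show the snapshot tree as the host sees it (replace <vmid> with the ID found above)
vim-cmd vmsvc/snapshot.get <vmid>

# Try deleting all snapshots (with consolidation) from the shell
vim-cmd vmsvc/snapshot.removeall <vmid>

# If deletion still fails with "disk in use", check whether the base disk is locked
vmkfstools -D /vmfs/volumes/<datastore-uuid>/374EPS_Clone/374EPS_Clone-flat.vmdk
```

If `vmkfstools -D` reports a lock owner (a non-zero MAC address), something such as a stale Veeam proxy or another host still has the disk open, which would match the "already in use" error in the log.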

 

Regards,

Arvind
