VMware Cloud Community
opoet3
Enthusiast

An error occurred while saving the snapshot

Hi

When I try to create a snapshot on ESXi 7.0.2, build 17867351, I get this error:

An error occurred while saving the snapshot: msg.changetracker.MIRRORCOPYSTATUS. An error occurred while taking a snapshot: msg.changetracker.MIRRORCOPYSTATUS.

How can I solve this error?

 

16 Replies
kenobi79
Enthusiast

Hi,

check whether this Veeam KB article describes your problem:

https://www.veeam.com/kb4003

 

Bye - Riccardo Panzieri
https://www.i3piccioni.it
opoet3
Enthusiast

Thanks

This is not related to my problem.

eode
Enthusiast

Hi!

Got the same error today. It was triggered from Veeam, but the failure itself relates to taking the snapshot of my VM before backup.

Finding
I did some quick troubleshooting in the vmware.log of the affected VM and found a locking issue like this:
"2021-05-16T15:38:49.953Z| vmx| | W003: DISKLIB-CBT : ChangeTrackerESX_GetMirrorCopyProgress: Failed to copy mirror: Lost previously held disk lock".

Resolution/workaround
Since the lock is held locally by the host, I simply vMotioned the VM to another host in my cluster so that the lock would be re-issued by the new host. Retrying the snapshot, it completed successfully (and backup is working again).
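If you want to confirm which host holds the lock before migrating, vmkfstools can report the lock state and owner (a sketch; the path is a placeholder, and on many builds the owner's MAC address is logged to /var/log/vmkernel.log rather than printed on stdout):

vmkfstools -D /vmfs/volumes/[datastore-name]/[vm-name]/[vm-name]-flat.vmdk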

opoet3
Enthusiast

Thanks.

I had found this solution before, but it is only temporary: after a few days the error recurs on other virtual machines, and after vMotioning those to other hosts, it recurs on yet more virtual machines.

eode
Enthusiast

Interesting, I'll have to wait and see if the same pattern happens here. Backup went fine yesterday and last night (as usual). It has only occurred on one (1) VM for me so far.

 

- I also noticed that an automatic DRS vMotion occurred on the VM earlier the same day (some hours before I got the snapshot/locking issues).

Q: Maybe you could check whether the same pattern occurred in your environment, e.g. vMotions/changes on the VM before the snapshot issues occurred? If so, lowering the DRS migration threshold could also help "slow down" the occurrence (not a permanent solution, but the impact/frequency may be reduced). A quick way to check is sketched below.
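One quick way to check on the host (a sketch; placeholder path as usual) - vMotion activity shows up as MigrateSetState transitions in the VM's own log:

grep -i "migrate" /vmfs/volumes/[datastore-name]/[vm-name]/vmware.log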

 

I'm also running ESXi 7.0.2 build-17867351, in a 4-node vSAN cluster.

Q: Are you also running vSAN, or is this a regular VMFS/NFS datastore?

opoet3
Enthusiast

I disabled DRS a long time ago, but it didn't have much effect. Our infrastructure uses a VMFS datastore.

nachogonzalez
Commander

Do you see any consolidation errors on the VM?
What happens if you open the Snapshot manager?
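If you prefer checking from the host shell, something like this should show the consolidation flag (a sketch; [vmid] comes from the first command's output):

vim-cmd vmsvc/getallvms
vim-cmd vmsvc/get.summary [vmid] | grep -i consolidation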

opoet3
Enthusiast

No consolidation error.

Snapshot manager is empty.

continuum
Immortal

1. Check your Veeam VM whether it still has mounted VMDKs that should not be mounted.
2. Check your Windows VMs whether they report VSS problems, and deal with the errors (a quick check is sketched below).
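For point 2, a quick check inside the Windows guest (standard Windows command; run it in the VM, not on the host) - any writer in a failed state needs attention:

vssadmin list writers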


________________________________________________
Do you need support with a VMFS recovery problem? Send a message via Skype "sanbarrow".
I do not support Workstation 16 at this time ...

opoet3
Enthusiast

Thanks.

1. No.

2. No.

NathanosBlightc
Commander

Please search for more details on possible causes in the vmware.log file by running the following command, especially right after you try to take the snapshot:

cat /vmfs/volumes/[datastore-name]/[vm-name]/vmware.log

You can also pipe the output through grep to search for related keywords:

cat /vmfs/volumes/[datastore-name]/[vm-name]/vmware.log | grep -i "snapshot"
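For this particular failure it may also help to widen the filter to the CBT subsystem (same placeholder path):

grep -iE "snapshot|disklib-cbt" /vmfs/volumes/[datastore-name]/[vm-name]/vmware.log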

Please mark my comment as the Correct Answer if this solution resolved your problem
eode
Enthusiast

As you can see in my first reply, I'm getting a disk-lock error from the DISKLIB-CBT process. The workaround that cleared the lock for me was migrating the VM to another host, and I have not had any locking issues since. There may be an issue with CBT/CTK, which can also be disabled at the VM level (if it only occurs on some VMs, etc.; a sketch of the reset procedure is at the end of this post). There may also be other underlying issues with the hosts, etc.

Previous post
https://communities.vmware.com/t5/VMware-vSphere-Discussions/An-error-occurred-while-saving-the-snap...

The error I'm referring to

2021-05-16T15:38:49.953Z| vmx| | W003: DISKLIB-CBT   : ChangeTrackerESX_GetMirrorCopyProgress: Failed to copy mirror: Lost previously held disk lock

 

Since this process fails, the subsequent snapshot request step (SnapshotPrepareTakeDoneCB) also fails, hence the error seen in the vSphere Client (my guess):

2021-05-16T15:38:49.976Z| vmx| | I005: SNAPSHOT: SnapshotPrepareTakeDoneCB: Failed to prepare block track.
2021-05-16T15:38:49.976Z| vmx| | I005: SNAPSHOT: SnapshotPrepareTakeDoneCB: Prepare phase complete (Could not get mirror copy status).

 

I wrote a short guest post with some more details; it includes the full vmware.log of a VM with snapshot issues while trying to trigger a manual snapshot (a regular snapshot via the vSphere Client). The relevant part is of course the DISKLIB-CBT warning regarding the lock.

https://vninja.net/2021/05/18/error-occurred-while-saving-snapshot-msg.changetracker.mirrorcopystatu...

 

I would start with verifying that it's the same error (i.e. you see the same error in the vmware.log), then go from there. If this is production with SnS, just open an SR with VMware, of course.
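If it does turn out to be CBT-related, the usual reset procedure is worth a try (a sketch of the standard CBT-reset steps, so verify against the official KB before running it: power the VM off, set ctkEnabled = "FALSE" plus the per-disk scsiX:Y.ctkEnabled = "FALSE" in the VM's advanced configuration, then remove the leftover change-tracking files before powering on and re-enabling; note the next backup run will read the full disk once):

rm /vmfs/volumes/[datastore-name]/[vm-name]/*-ctk.vmdk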

opoet3
Enthusiast

VM Log:


2021-06-01T12:19:06.622Z| vmx| | I005: SnapshotVMX_TakeSnapshot start: 'Snap1', deviceState=0, lazy=0, quiesced=0, forceNative=0, tryNative=1, saveAllocMaps=0
2021-06-01T12:19:06.634Z| vmx| | I005: DiskLib_IsVMFSSparseSupported: vmfssparse is not supported on /vmfs/volumes/5b29d5a0-b2fa8328-8e5f-d89d6716de70/Dashboard: f532.
2021-06-01T12:19:06.635Z| vmx| | I005: DISKLIB-LIB_CREATE   : DiskLibCreateCreateParam: Selecting the default child type as SeSparse for /vmfs/volumes/5b29d5a0-b2fa8328-8e5f-d89d6716de70/Dashboard/Dashboard_2-000001.vmdk.
2021-06-01T12:19:06.635Z| vmx| | I005: DISKLIB-LIB_CREATE   : DiskLibCreateCreateParam: seSparse grain size is set to 8 for '/vmfs/volumes/5b29d5a0-b2fa8328-8e5f-d89d6716de70/Dashboard/Dashboard_2-000001.vmdk'
2021-06-01T12:19:06.648Z| vmx| | I005: DISKLIB-CBT   :ChangeTrackerESX_CreateMirror: Created mirror node /vmfs/devices/svm/8c44171-c79195b-cbtmirror.
2021-06-01T12:23:49.935Z| vmx| | W003: DISKLIB-CBT   : ChangeTrackerESX_GetMirrorCopyProgress: Failed to copy mirror: Busy
2021-06-01T12:23:49.936Z| vmx| | I005: DISKLIB-LIB_BLOCKTRACK   : DiskLibBlockTrackMirrorProgress: Failed to get mirror status of block track info file /vmfs/volumes/5b29d5a0-b2fa8328-8e5f-d89d6716de70/Dashboard/Dashboard_2-ctk.vmdk.
2021-06-01T12:23:49.936Z| vmx| | I005: DISKLIB-CBT   :ChangeTrackerESX_DestroyMirror: Destroyed mirror node 8c44171-c79195b-cbtmirror.
2021-06-01T12:23:50.095Z| vmx| | I005: SNAPSHOT: SnapshotPrepareTakeDoneCB: Failed to prepare block track.
2021-06-01T12:23:50.095Z| vmx| | I005: SNAPSHOT: SnapshotPrepareTakeDoneCB: Prepare phase complete (Could not get mirror copy status).
2021-06-01T12:23:50.095Z| vmx| | I005: SnapshotVMXPrepareTakeDoneCB: Prepare phase failed: Could not get mirror copy status (5).
2021-06-01T12:23:50.095Z| vmx| | I005: SnapshotVMXTakeSnapshotComplete: Done with snapshot 'Snap1': 0
2021-06-01T12:23:50.095Z| vmx| | I005: SnapshotVMXTakeSnapshotComplete: Snapshot 0 failed: Could not get mirror copy status (5).
2021-06-01T12:23:50.095Z| vmx| | I005: VigorTransport_ServerSendResponse opID=kpdh768k-19175-auto-eso-h5:70004817-18-12-91a8 seq=1539514: Completed Snapshot request with messages.
2021-06-01T12:23:57.957Z| vmx| | I005: VigorTransportProcessClientPayload: opID=kpdh768k-21286-auto-gfc-h5:70004983-8b-55-9919 seq=1539862: Receiving PowerState.InitiatePowerOff request.
2021-06-01T12:23:57.958Z| vmx| | I005: Vix: [vmxCommands.c:558]: VMAutomation_InitiatePowerOff. Trying hard powerOff
2021-06-01T12:23:57.958Z| vmx| | I005: VigorTransport_ServerSendResponse opID=kpdh768k-21286-auto-gfc-h5:70004983-8b-55-9919 seq=1539862: Completed PowerState request with messages.
2021-06-01T12:23:57.958Z| vmx| | I005: Stopping VCPU threads...
2021-06-01T12:23:57.958Z| vcpu-1| | I005: VMMon_WaitForExit: vcpu-1: worldID=2531273
2021-06-01T12:23:57.958Z| vcpu-2| | I005: VMMon_WaitForExit: vcpu-2: worldID=2531274
2021-06-01T12:23:57.959Z| vcpu-0| | I005: VMMon_WaitForExit: vcpu-0: worldID=2531269
2021-06-01T12:23:57.959Z| vcpu-3| | I005: VMMon_WaitForExit: vcpu-3: worldID=2531275
2021-06-01T12:23:57.961Z| svga| | I005: SVGA thread is exiting the main loop
2021-06-01T12:23:57.961Z| vmx| | I005: MKS/SVGA threads are stopped
2021-06-01T12:23:57.962Z| vmx| | I005: 
2021-06-01T12:23:57.962Z| vmx| | I005+ OvhdMem: Final (Power Off) Overheads
2021-06-01T12:23:57.962Z| vmx| | I005:                                                       reserved      |          used
2021-06-01T12:23:57.962Z| vmx| | I005: OvhdMem excluded                                  cur    max    avg |    cur    max    avg
2021-06-01T12:23:57.962Z| vmx| | I005: OvhdMem OvhdUser_MainMem                    :  2097152 2097152      - | 2097152 2097152      -
2021-06-01T12:23:57.962Z| vmx| | I005: OvhdMem OvhdUser_VmxText                    :    7168   7168      - |      0      0      -
2021-06-01T12:23:57.962Z| vmx| | I005: OvhdMem OvhdUser_VmxTextLibs                :   15360  15360      - |      0      0      -
2021-06-01T12:23:57.962Z| vmx| | I005: OvhdMem Total excluded                      :  2119680 2119680      - |      -      -      -
2021-06-01T12:23:57.962Z| vmx| | I005: OvhdMem Actual maximum                      :         2119680        |             -
2021-06-01T12:23:57.962Z| vmx| | I005+ 
2021-06-01T12:23:57.962Z| vmx| | I005:                                                       reserved      |          used
2021-06-01T12:23:57.962Z| vmx| | I005: OvhdMem paged                                     cur    max    avg |    cur    max    avg
2021-06-01T12:23:57.962Z| vmx| | I005: OvhdMem OvhdUser_STATS_vmm                  :       8      8      - |      0      0      -
2021-06-01T12:23:57.962Z| vmx| | I005: OvhdMem OvhdUser_STATS_device               :       4      4      - |      0      0      -
2021-06-01T12:23:57.962Z| vmx| | I005: OvhdMem OvhdUser_SvgaMobFallback            :    4096   4096      - |      0      0      -
2021-06-01T12:23:57.962Z| vmx| | I005: OvhdMem OvhdUser_DiskLibMemUsed             :    3075   3075      - |      0    129      -
2021-06-01T12:23:57.962Z| vmx| | I005: OvhdMem OvhdUser_SvgaSurfaceTable           :       6      6      - |      1      1      -
2021-06-01T12:23:57.962Z| vmx| | I005: OvhdMem OvhdUser_SvgaBESurfaceTable         :       4      4      - |      4      4      -
2021-06-01T12:23:57.962Z| vmx| | I005: OvhdMem OvhdUser_SvgaSDirtyCache            :      96     96      - |      0      0      -
2021-06-01T12:23:57.962Z| vmx| | I005: OvhdMem OvhdUser_SvgaCursor                 :      10     10      - |     10     10      -
2021-06-01T12:23:57.962Z| vmx| | I005: OvhdMem OvhdUser_SvgaPPNList                :     768    768      - |      0      0      -
2021-06-01T12:23:57.962Z| vmx| | I005: OvhdMem OvhdUser_VmxGlobals                 :    2450   2450      - |      0      0      -
2021-06-01T12:23:57.962Z| vmx| | I005: OvhdMem OvhdUser_VmxGlobalsLibs             :    3584   3584      - |      0      0      -
2021-06-01T12:23:57.962Z| vmx| | I005: OvhdMem OvhdUser_VmxHeap                    :    8704   8704      - |      0      0      -
2021-06-01T12:23:57.962Z| vmx| | I005: OvhdMem OvhdUser_VmxMks                     :      33     33      - |      1      1      -
2021-06-01T12:23:57.962Z| vmx| | I005: OvhdMem OvhdUser_VmxMksRenderOps            :     675    675      - |    675    675      -
2021-06-01T12:23:57.962Z| vmx| | I005: OvhdMem OvhdUser_VmxMks3d                   :    8192   8192      - |      0      0      -
2021-06-01T12:23:57.962Z| vmx| | I005: OvhdMem OvhdUser_VmxMksScreenTemp           :    8450   8450      - |      0      0      -
2021-06-01T12:23:57.962Z| vmx| | I005: OvhdMem OvhdUser_VmxMksVnc                  :    6123   6123      - |      0      0      -
2021-06-01T12:23:57.962Z| vmx| | I005: OvhdMem OvhdUser_VmxMksScreen               :    8195   8195      - |      0      0      -
2021-06-01T12:23:57.962Z| vmx| | I005: OvhdMem OvhdUser_VmxMksSVGAVO               :    4096   4096      - |      0      0      -
2021-06-01T12:23:57.962Z| vmx| | I005: OvhdMem OvhdUser_VmxMksSwbCursor            :    2560   2560      - |      5      5      -
2021-06-01T12:23:57.962Z| vmx| | I005: OvhdMem OvhdUser_VmxPhysMemErrPages         :      10     10      - |      0      0      -
2021-06-01T12:23:57.962Z| vmx| | I005: OvhdMem OvhdUser_VmxSLEntryBuf              :     128    128      - |      0      0      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdUser_VmxThreads                 :   10240  10240      - |      0      0      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem Total paged                         :   71507  71507      - |    696    825      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem Actual maximum                      :          71507        |           825
2021-06-01T12:23:57.963Z| vmx| | I005+ 
2021-06-01T12:23:57.963Z| vmx| | I005:                                                       reserved      |          used
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem nonpaged                                  cur    max    avg |    cur    max    avg
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdUser_SharedArea                 :     139    139      - |    100    100      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdUser_BusMemTraceBitmap          :      67     67      - |      0      0      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdUser_VIDE_KSEG                  :      16     16      - |     16     16      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdUser_VGA                        :      64     64      - |     64     64      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdUser_BalloonMPN                 :       1      1      - |      1      1      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdUser_LocalApic                  :       4      4      - |      4      4      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdUser_BusError                   :       1      1      - |      1      1      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdUser_VBIOS                      :       8      8      - |      8      8      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdUser_VnicGuest                  :      32     32      - |     32     32      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdUser_VnicMmap                   :       2      2      - |      2      2      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdUser_LSIBIOS                    :       4      4      - |      4      4      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdUser_LSIRings                   :       4      4      - |      0      0      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdUser_SAS1068BIOS                :       4      4      - |      0      0      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdUser_SBIOS                      :      16     16      - |      0      0      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdUser_AHCIBIOS                   :      16     16      - |     16     16      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdUser_FlashRam                   :     128    128      - |    128    128      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdUser_SVGAFB                     :    1024   1024      - |   1024   1024      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdUser_SVGAMEM                    :      64    512      - |     64     64      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdUser_HDAudioReg                 :       3      3      - |      0      0      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdUser_EHCIRegister               :       1      1      - |      0      0      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdUser_XhciRegister               :       1      1      - |      0      0      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdUser_HyperV                     :       2      2      - |      0      0      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdUser_ExtCfg                     :       4      4      - |      4      4      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdUser_vhvCachedVMCS              :       4      4      - |      0      0      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdUser_vhvNestedAPIC              :       4      4      - |      0      0      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdUser_MonNuma                    :     338    338      - |      0      0      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdUser_NVDC                       :       1      1      - |      0      0      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem Total nonpaged                      :    1952   2400      - |   1468   1468      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem Actual maximum                      :           2400        |          1468
2021-06-01T12:23:57.963Z| vmx| | I005+ 
2021-06-01T12:23:57.963Z| vmx| | I005:                                                       reserved      |          used
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem anonymous                                 cur    max    avg |    cur    max    avg
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdMon_Alloc                       :     316    316      - |    111    231      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdMon_BusMemFrame                 :    2114   2123      - |   2114   2114      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdMon_BusMem2MInfo                :      32     32      - |     32     32      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdMon_BusMem1GInfo                :       1      1      - |      1      1      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdMon_BusMemZapListMPN            :       1      1      - |      1      1      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdMon_BusMemPreval                :      16     16      - |      0     16      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdMon_MonAS                       :       4      4      - |      1      1      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdMon_GuestMem                    :      96     96      - |     96     96      -
2021-06-01T12:23:57.963Z| vmx| | I005: OvhdMem OvhdMon_TC                          :    2052   2528      - |   1908   2387      -
2021-06-01T12:23:57.964Z| vmx| | I005: OvhdMem OvhdMon_BusMemMonAS                 :       8      8      - |      8      8      -
2021-06-01T12:23:57.964Z| vmx| | I005: OvhdMem OvhdMon_HVNuma                      :       8      8      - |      0      0      -
2021-06-01T12:23:57.964Z| vmx| | I005: OvhdMem OvhdMon_HV                          :       4      4      - |      4      4      -
2021-06-01T12:23:57.964Z| vmx| | I005: OvhdMem OvhdMon_HVMSRBitmap                 :       1      1      - |      0      0      -
2021-06-01T12:23:57.964Z| vmx| | I005: OvhdMem OvhdMon_VHVGuestMSRBitmap           :       4      4      - |      0      0      -
2021-06-01T12:23:57.964Z| vmx| | I005: OvhdMem OvhdMon_VHV                         :      12     12      - |      0      0      -
2021-06-01T12:23:57.964Z| vmx| | I005: OvhdMem OvhdMon_Numa                        :      46     46      - |     14     38      -
2021-06-01T12:23:57.964Z| vmx| | I005: OvhdMem OvhdMon_NumaTextRodata              :     230    427      - |      0    197      -
2021-06-01T12:23:57.964Z| vmx| | I005: OvhdMem OvhdMon_NumaDataBss                 :     108    108      - |    104    104      -
2021-06-01T12:23:57.964Z| vmx| | I005: OvhdMem OvhdMon_NumaLargeData               :    2048   2048      - |      0   1981      -
2021-06-01T12:23:57.964Z| vmx| | I005: OvhdMem OvhdMon_BaseWired                   :      64     68      - |     60     60      -
2021-06-01T12:23:57.964Z| vmx| | I005: OvhdMem OvhdMon_Bootstrap                   :       0   1883      - |      0    345      -
2021-06-01T12:23:57.964Z| vmx| | I005: OvhdMem OvhdMon_GPhysTraced                 :     860    860      - |     87    105      -
2021-06-01T12:23:57.964Z| vmx| | I005: OvhdMem OvhdMon_GPhysHWMMU                  :    4282   4282      - |    634    838      -
2021-06-01T12:23:57.964Z| vmx| | I005: OvhdMem OvhdMon_GPhysNoTrace                :     270    270      - |     73     73      -
2021-06-01T12:23:57.964Z| vmx| | I005: OvhdMem OvhdMon_PhysMemGart                 :     104    104      - |     96     96      -
2021-06-01T12:23:57.964Z| vmx| | I005: OvhdMem OvhdMon_PhysMemErr                  :       9      9      - |      0      0      -
2021-06-01T12:23:57.964Z| vmx| | I005: OvhdMem OvhdMon_VIDE                        :       4      4      - |      0      0      -
2021-06-01T12:23:57.964Z| vmx| | I005: OvhdMem OvhdMon_BusLogic                    :       4      4      - |      0      0      -
2021-06-01T12:23:57.964Z| vmx| | I005: OvhdMem OvhdMon_Ahci                        :      32     32      - |      1      1      -
2021-06-01T12:23:57.964Z| vmx| | I005: OvhdMem OvhdMon_LSIRings                    :       8      8      - |      8      8      -
2021-06-01T12:23:57.964Z| vmx| | I005: OvhdMem OvhdMon_Hba                         :       2      2      - |      2      2      -
2021-06-01T12:23:57.964Z| vmx| | I005: OvhdMem OvhdMon_VProbe                      :       1      1      - |      0      0      -
2021-06-01T12:23:57.964Z| vmx| | I005: OvhdMem Total anonymous                     :   12741  15310      - |   5355   8739      -
2021-06-01T12:23:57.964Z| vmx| | I005: OvhdMem Actual maximum                      :          13414        |          7062
2021-06-01T12:23:57.964Z| vmx| | I005+ 
2021-06-01T12:23:57.964Z| vmx| | I005: OvhdMem: memsize 8192 MB VMK fixed 3349 pages var(mem) 529 pages var(cpu) 22 cbrcOverhead 0 pages total 7673 pages
2021-06-01T12:23:57.964Z| vmx| | I005: VMMEM: Maximum Reservation: 348MB (MainMem=8192MB) VMK=29MB
2021-06-01T12:23:57.965Z| vmx| | I005: Tools: ToolsRunningStatus_Exit, delayedRequest is 0x0
2021-06-01T12:23:57.965Z| vmx| | I005: Tools: Changing running status: 1 => 0.
2021-06-01T12:23:57.965Z| vmx| | I005: Tools: [RunningStatus] Last heartbeat value 2023813 (last received 0s ago)
2021-06-01T12:23:57.965Z| vmx| | I005: SVMotion_PowerOff: Not running Storage vMotion. Nothing to do
2021-06-01T12:23:57.966Z| vmx| | I005: GuestRpc: Closing channel 0 connection 5
2021-06-01T12:23:57.966Z| vmx| | I005: GuestRpc: Reinitializing Channel 0(toolbox)
2021-06-01T12:23:57.966Z| vmx| | I005: GuestMsg: Channel 0, Cannot unpost because the previous post is already completed
2021-06-01T12:23:57.966Z| vmx| | I005: Tools: [AppStatus] Last heartbeat value 2023813 (last received 0s ago)
2021-06-01T12:23:57.966Z| vmx| | I005: TOOLS: appName=toolbox, oldStatus=1, status=0, guestInitiated=0.
2021-06-01T12:23:57.971Z| vmx| | I005: Destroying virtual dev for scsi0:0 vscsi=9881
2021-06-01T12:23:57.971Z| vmx| | I005: VMMon_VSCSIStopVports: No such target on adapter
2021-06-01T12:23:57.981Z| vmx| | I005: SVMotion_PowerOff: Not running Storage vMotion. Nothing to do
2021-06-01T12:23:57.982Z| mks| | I005: MKS-RenderMain: Stopped MKSBasicOps
2021-06-01T12:23:57.982Z| mks| | I005: MKS PowerOff
2021-06-01T12:23:57.982Z| svga| | I005: SVGA thread is exiting
2021-06-01T12:23:57.983Z| mks| | I005: MKS thread is exiting
2021-06-01T12:23:57.983Z| vmx| | W003: 
2021-06-01T12:23:57.983Z| vmx| | I005: scsi0:0: numIOs = 0 numMergedIOs = 0 numSplitIOs = 0 ( 0.0%)
2021-06-01T12:23:57.983Z| vmx| | I005: Closing disk 'scsi0:0'
2021-06-01T12:23:57.997Z| vmx| | I005: DISKLIB-CBT   : Shutting down change tracking for untracked fid 174997872.
2021-06-01T12:23:57.997Z| vmx| | I005: DISKLIB-CBT   : Successfully disconnected CBT node.
2021-06-01T12:23:58.026Z| vmx| | I005: DISKLIB-VMFS  : "/vmfs/volumes/5b29d5a0-b2fa8328-8e5f-d89d6716de70/Dashboard/Dashboard_2-flat.vmdk" : closed.
2021-06-01T12:23:58.037Z| vmx| | I005: Vix: [mainDispatch.c:1164]: VMAutomationPowerOff: Powering off.
2021-06-01T12:23:58.038Z| vmx| | I005: WORKER: asyncOps=1024 maxActiveOps=1 maxPending=2 maxCompleted=1
2021-06-01T12:24:01.700Z| vmx| | I005: Vix: [mainDispatch.c:4205]: VMAutomation_ReportPowerOpFinished: statevar=1, newAppState=1873, success=1 additionalError=0
2021-06-01T12:24:01.700Z| vmx| | I005: Vix: [mainDispatch.c:4223]: VMAutomation: Ignoring ReportPowerOpFinished because the VMX is shutting down.
2021-06-01T12:24:01.701Z| vmx| | A000: ConfigDB: Setting cleanShutdown = "TRUE"
2021-06-01T12:24:01.750Z| vmx| | I005: Vix: [mainDispatch.c:4205]: VMAutomation_ReportPowerOpFinished: statevar=0, newAppState=1870, success=1 additionalError=0
2021-06-01T12:24:01.750Z| vmx| | I005: Vix: [mainDispatch.c:4223]: VMAutomation: Ignoring ReportPowerOpFinished because the VMX is shutting down.
2021-06-01T12:24:01.750Z| vmx| | I005: Transitioned vmx/execState/val to poweredOff
2021-06-01T12:24:01.750Z| vmx| | I005: Vigor_ClientRequestCb: failed to do op=5 on unregistered device 'PowerState' (cmd=(null))
2021-06-01T12:24:01.751Z| vmx| | I005: VMX idle exit
2021-06-01T12:24:01.785Z| vmx| | I005: Vix: [mainDispatch.c:815]: VMAutomation_LateShutdown()
2021-06-01T12:24:01.785Z| vmx| | I005: Vix: [mainDispatch.c:770]: VMAutomationCloseListenerSocket. Closing listener socket.
2021-06-01T12:24:01.787Z| vmx| | I005: Flushing VMX VMDB connections
2021-06-01T12:24:01.787Z| vmx| | I005: VigorTransport_ServerCloseClient: Closing transport D5045A2770 (err = 0)
2021-06-01T12:24:01.787Z| vmx| | I005: VigorTransport_ServerDestroy: server destroyed.
2021-06-01T12:24:01.789Z| vmx| | I005: VMX exit (0).
2021-06-01T12:24:01.790Z| vmx| | I005: OBJLIB-LIB: ObjLib cleanup done.
2021-06-01T12:24:01.790Z| vmx| | I005: AIOMGR-S : stat o=584 r=900 w=294 i=22804 br=13063668 bw=639480
2021-06-01T12:24:01.790Z| vmx| | W003: VMX has left the building: 0.

Moderator note by wila: Pulled your post from the spam queue and moved the log details into a spoiler.

 

opoet3
Enthusiast

UP
