VMware Cloud Community
cypherx
Hot Shot

Migrate 6.0u3 Windows to 6.7u3f VCSA failure: "Analytics Service registration with Component Manager failed"

Hi all,

I'm posting here due to the slow response time on our case with VMware.  I had to roll back (power the old VM back on and rejoin it to AD) our Windows vCenter Server 6, since this migration did not work.  I thought I had a smooth upgrade going after I found out I had to grant our VMware service account the "Replace a process level token" privilege in Windows.  Once I fixed that minor thing, the migration assistant worked well.


The VCSA deployed, the Windows vCenter machine powered off, the new appliance joined AD, and then after some time it gave an error.

The error was "Analytics Service registration with Component Manager failed."  I downloaded the log bundle, and inside it more details of the error are found in /var/log/firstboot/analytics_firstboot.py_14589_stderr.log:

2020-04-13T16:28:06.223Z  Failed to register Analytics Service with Component Manager: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:719)

2020-04-13T16:28:06.231Z  Traceback (most recent call last):
  File "/usr/lib/vmware-analytics/firstboot/analytics_firstboot.py", line 214, in register_with_cm
    cloudvm_sso_cm_register(keystore, cisreg_spec, key_alias, dyn_vars, isPatch=is_patch)
  File "/usr/lib/vmware-cm/bin/cloudvmcisreg.py", line 706, in cloudvm_sso_cm_register
    serviceId = do_lsauthz_operation(cisreg_opts_dict)
  File "/usr/lib/vmware/site-packages/cis/cisreglib.py", line 997, in do_lsauthz_operation
    ls_obj = LookupServiceClient(ls_url, retry_count=60)
  File "/usr/lib/vmware/site-packages/cis/cisreglib.py", line 307, in __init__
    self._init_service_content()
  File "/usr/lib/vmware/site-packages/cis/cisreglib.py", line 287, in do_retry
    return req_method(self, *args, **kargs)
  File "/usr/lib/vmware/site-packages/cis/cisreglib.py", line 297, in _init_service_content
    self.service_content = si.RetrieveServiceContent()
  File "/usr/lib/vmware/site-packages/pyVmomi/VmomiSupport.py", line 557, in <lambda>
    self.f(*(self.args + (obj,) + args), **kwargs)
  File "/usr/lib/vmware/site-packages/pyVmomi/VmomiSupport.py", line 363, in _InvokeMethod
    return self._stub.InvokeMethod(self, info, args)
  File "/usr/lib/vmware/site-packages/pyVmomi/SoapAdapter.py", line 1385, in InvokeMethod
    conn.request('POST', self.path, req, headers)
  File "/usr/lib/python3.5/http/client.py", line 1123, in request
    self._send_request(method, url, body, headers)
  File "/usr/lib/python3.5/http/client.py", line 1168, in _send_request
    self.endheaders(body)
  File "/usr/lib/python3.5/http/client.py", line 1119, in endheaders
    self._send_output(message_body)
  File "/usr/lib/python3.5/http/client.py", line 944, in _send_output
    self.send(msg)
  File "/usr/lib/python3.5/http/client.py", line 887, in send
    self.connect()
  File "/usr/lib/vmware/site-packages/pyVmomi/SoapAdapter.py", line 1032, in connect
    six.moves.http_client.HTTPSConnection.connect(self)
  File "/usr/lib/python3.5/http/client.py", line 1277, in connect
    server_hostname=server_hostname)
  File "/usr/lib/python3.5/ssl.py", line 385, in wrap_socket
    _context=self)
  File "/usr/lib/python3.5/ssl.py", line 760, in __init__
    self.do_handshake()
  File "/usr/lib/python3.5/ssl.py", line 996, in do_handshake
    self._sslobj.do_handshake()
  File "/usr/lib/python3.5/ssl.py", line 641, in do_handshake
    self._sslobj.do_handshake()
ssl.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:719)

2020-04-13T16:28:06.233Z  Exception: Traceback (most recent call last):
  [... same traceback and ssl.SSLError as above ...]

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/vmware-analytics/firstboot/analytics_firstboot.py", line 314, in main
    fb.register_with_cm(analytics_int_http, is_patch)
  File "/usr/lib/vmware-analytics/firstboot/analytics_firstboot.py", line 225, in register_with_cm
    problem_id='install.analytics.cmregistration.failed')
cis.baseCISException.BaseInstallException: {
    "componentKey": "analytics",
    "detail": [
        {
            "id": "install.analytics.cmregistration.failed",
            "localized": "Analytics Service registration with Component Manager failed.",
            "translatable": "Analytics Service registration with Component Manager failed."
        }
    ],
    "problemId": "install.analytics.cmregistration.failed",
    "resolution": {
        "id": "install.analytics.cmregistration.failed.res",
        "localized": "Please search for these symptoms in the VMware Knowledge Base for any known issues and possible resolutions. If none can be found, collect a support bundle and open a support request.",
        "translatable": "Please search for these symptoms in the VMware Knowledge Base for any known issues and possible resolutions. If none can be found, collect a support bundle and open a support request."
    }
}

2020-04-13T16:28:06.233Z  VMware Analytics Service firstboot failed

I googled this and found this VMware KB:

https://kb.vmware.com/s/article/67198

This led me down the rabbit hole of enabling SSH and the bash shell so I could WinSCP our root CA PEM file onto the box. I did this and copied our root CA certificate (the CA is a Windows Server 2012 R2 certificate authority) to /etc/ssl/certs, then published it:

/usr/lib/vmware-vmafd/bin/dir-cli trustedcert publish --cert /etc/ssl/certs/Root-CA.cert.pem

It imported successfully (or so it told me).
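For anyone hitting the same thing, two quick sanity checks I could have run at that point from the appliance shell (my own suggestions, not from the KB):

/usr/lib/vmware-vmafd/bin/dir-cli trustedcert list
/usr/lib/vmware-vmafd/bin/vecs-cli force-refresh

The first lists what the directory service currently trusts (the Windows root CA should show up there after the publish), and as I understand it the second pushes the published roots back down into the VECS TRUSTED_ROOTS store, which the publish alone may not do right away.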

I hit Retry and it was the same error. I then rebooted the new VCSA, thinking maybe it needed a reboot for the change to take effect. It rebooted but changed its IP address to the final IP the old Windows vCenter had, so I changed it back to the temporary IP address using the DCUI so the migration wizard could contact it. I still got the same error when retrying from the migration wizard running on my Windows box: "Analytics Service registration with Component Manager failed".

I then followed this guide to try to replace the vSphere 6.0 Machine SSL certificate with a VMCA-issued certificate:

https://kb.vmware.com/s/article/2112279

However, I got an error telling me to check certificate-manager.log.  An excerpt from that log right around the ERROR lines:

2020-04-13T16:52:47.565Z INFO certificate-manager MACHINE_SSL_CERT certificate replaced successfully. SerialNumber and Thumbprint changed.
2020-04-13T16:52:47.662Z INFO certificate-manager lstool command currently being executed is : ['/usr/java/jre-vmware/bin/java', '-Djava.security.properties=/etc/vmware/java/vmware-override-java.security', '-cp', '/usr/lib/vmidentity/tools/lib/lookup-client.jar:/usr/lib/vmidentity/tools/lib/*', '-Dlog4j.configuration=tool-log4j.properties', 'com.vmware.vim.lookup.client.tool.LsTool', 'get-site-id', '--no-check-cert', '--url',
2020-04-13T16:52:49.487Z ERROR certificate-manager 'lstool get-site-id' failed: 1
2020-04-13T16:52:49.490Z ERROR certificate-manager Error while replacing Machine SSL Cert, please see /var/log/vmware/vmcad/certificate-manager.log for more information.
2020-04-13T16:52:49.490Z ERROR certificate-manager 'lstool get-site-id' failed: 1
2020-04-13T16:52:49.492Z INFO certificate-manager Performing rollback of Machine SSL Cert...
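In hindsight, since both failures boiled down to certificate verification against the appliance's own endpoints, one thing worth checking (my own suggestion, not from the KB) is what certificate the reverse proxy is actually presenting:

echo | openssl s_client -connect localhost:443 2>/dev/null | openssl x509 -noout -subject -issuer -dates -fingerprint

If the issuer shown there is not in the TRUSTED_ROOTS store, or the cert is expired, the CERTIFICATE_VERIFY_FAILED errors above would make sense.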

Now, not sure what to do, and just crickets from VMware support... I powered off the VCSA, powered the Windows vCenter back on, rejoined it to AD, rebooted, and now we're back to vCenter 6.0 Update 3.

Any ideas?  The VMware certificates have always put us through utter hell in the Windows environment, and it seems it's going to continue that way on the VCSA.

We really would like to migrate off the Windows Server 2008 R2 VMs running vCenter 6.0.  This is our DR site; I haven't even touched production yet, and I wouldn't be surprised if our production site gives us difficulties too.  I can't even install the latest Windows vCenter 6.0 patch (error 1603 starting some service), and none of the KB articles helped (removing some VMware Java components, etc.), so I can't wait to get off the Windows platform and upgrade.

DR vCenter 6.0.0 build 14510545 (attempting first) - running on fully patched Windows Server 2008 R2

SRM virtual appliance 8.2

vSphere Replication virtual appliance 8.2 (it's receiving inbound replication from HQ)

2 ESXi 6.0.0 build 15169789 hosts - HP DL380 G8s; eventually we'll take them to the latest ESXi 6.5 build using HP's custom image. These hosts are not officially supported past 6.5.

HQ vCenter 6.0.0 build 9313458 (will do second) - running on fully patched Windows Server 2008 R2

SRM virtual appliance 8.2

vSphere Replication virtual appliance 8.2 (it's replicating 28 VMs to the DR site)

8 ESXi 6.0.0 build 15169789 hosts - Dell FC640s; eventually we'll take them to the latest ESXi 6.7 build using Dell's custom image.

3 Replies
cypherx
Hot Shot

Support says to reset the Machine SSL certificate on the Windows vCenter and then do the upgrade.  I believe that is option 3 in certificate-manager.  I did try that and it hung at 85% "starting services" for about an hour, then rolled back.  I reverted to my prior snapshot, and this time I'm trying option 8, reset all certificates.  So far I'm in the same boat: it's currently at 85%.  In services.msc I see the vCenter service in the "starting" state, but it never starts.  I'll wait it out before reverting back to my snapshot.

cypherx
Hot Shot

Even option 8 (Reset all certificates) in the certificate-manager script hangs at 85% starting services.  What I see in services.msc is that the VMware vCenter service is stuck in the "starting" state.

I seem to remember going through this before, when our certificates expired and we had to renew them.  The trick is that once you've waited a bit and the service is still sitting in "starting" (it never will start), just reboot the Windows machine.  When Windows Server comes back up and you log in, all of the certs have changed anyway (it seems).  When I try to log into the web client it can't be trusted, since the cert is now signed by itself and not by our CA, and the C# client shows a cert warning you can ignore and log in past.
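One thing that helps while it sits at 85% (my own habit, not something support told me): check which services are actually up from an elevated command prompt on the Windows vCenter, assuming the default install path:

"C:\Program Files\VMware\vCenter Server\bin\service-control" --status

If only the vCenter service (vpxd) is stuck in starting, the reboot trick above has worked for me; if core services like vmafdd or vmdird are also down, something bigger is wrong.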

So now that the certs appear to be reset, I'm trying the migration again.  Currently it's copying data from the source vCenter Server to the target vCenter Server.  The estimated time the vCenter Server Appliance Installer gave me to migrate everything is just over 2 hours; this is a DR site on spinning disks, so it's not a speed demon.  If successful, I can work with support to get new certificates on the new appliance so the web client is trusted.  I know without that I can't use Chrome (some message about HSTS security with no option to "proceed anyway").

When our DR site is fully operational, along with its other components, we can look at doing the HQ site, which is larger and more critical.  We have dvSwitches at HQ, but they are version 6.  We also have 8 ESXi hosts at HQ, but everything is on SSD storage.

cypherx
Hot Shot
Accepted Solution

VMware support had to get involved.

To successfully migrate Windows vCenter 6.0 U3 to the 6.7 VCSA in my situation:

1 - First, take a snapshot of your vCenter.

2 - On the Windows vCenter, reset all certificates (option 8 in certificate-manager; default locations of the tool are noted just below this list).

     When certificate-manager hangs for a few minutes at 85% starting services, just reboot the Windows vCenter.

3 - Now proceed with the upgrade/migration using the 6.7 VCSA installer ISO mounted to the vCenter VM.  Then mount it to another VM on the same host and run the Windows portion.

4 - Because the certs were reset, the migration works.

5 - When done, I can access it, but I need to use incognito mode to bypass the HSTS browser warning since the cert is the default one and not trusted.

6 - Fixing the machine certs was giving me issues: on the VCSA, the certificate-manager script would hang at 85% starting services and roll back.  At this point I brought in VMware tech support.
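(In case it helps anyone, the default locations of the certificate-manager tool referenced above, assuming standard install paths:

On the Windows vCenter: "C:\Program Files\VMware\vCenter Server\vmcad\certificate-manager"
On the VCSA: /usr/lib/vmware-vmca/bin/certificate-manager

Run it as an administrator / as root and it presents the numbered option menu.)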

VMware support took over from here.  Here are some of their notes; they also (not captured in the notes) downloaded a program called JXplorer to view and delete some entries in the directory.

The total call took about 2 1/2 hours, including pulling in a Site Recovery Manager expert to help with repairing and upgrading the licensing of SRM and vSphere Replication.  The vSphere Replication "Bad exit code 1" was fixed by powering off the VR appliance while the cert expert at VMware deleted a tree in JXplorer related to it.  When the VR appliance came back on, I was able to save the changes and have it register, accepting the new certificate when prompted.  We had to repair the HQ and DR sites in SRM again, as well as restart the SRM service on each site's SRM virtual appliance.

VCSA : drvcenter.domain.com

Number of stores in the vCenter:

root@drvcenter [ /tmp ]# /usr/lib/vmware-vmafd/bin/vecs-cli store list
MACHINE_SSL_CERT
TRUSTED_ROOTS
TRUSTED_ROOT_CRLS
machine
vsphere-webclient
vpxd
vpxd-extension
SMS
BACKUP_STORE
STS_INTERNAL_SSL_CERT
APPLMGMT_PASSWORD
data-encipherment

STS_INTERNAL_SSL_CERT is the store that gets inherited from the 5.x versions of vCenter:
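(My own note, for anyone checking this on their box: you can see whether the legacy STS_INTERNAL_SSL_CERT entry has drifted from the Machine SSL cert by listing both and comparing subject, issuer and validity dates:

/usr/lib/vmware-vmafd/bin/vecs-cli entry list --store MACHINE_SSL_CERT --text
/usr/lib/vmware-vmafd/bin/vecs-cli entry list --store STS_INTERNAL_SSL_CERT --text )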

*************************

Below are the certs in the trusted store:

Alias : 61a0c6df223b0d3c875b1777e58d03f84fbe9a5f : issuer for STS, vmdird cert
Alias : 7d719c7ce6fae2313767c71e5fd1e2bd0d9de14a
Alias : bdab88a13be919ea69686b270e92286f8c54230b
Alias : 46f3c6c1314f570e9ceba58191a94b9656c26ae9
Alias : d66832bcc10c01a0b33ae246e5fb0c3eba9ab024
Alias : 3393868114bf27419c76075359ba098577f9a429
Alias : 0fce94ed8eeaae8a79c3376db87b2d5bc86cc12e : VMCA root

Took an export of the Machine SSL cert and the STS cert, deleted the STS store entry, and updated it with the Machine SSL cert:

/usr/lib/vmware-vmafd/bin/vecs-cli entry getcert --store MACHINE_SSL_CERT --alias __MACHINE_CERT --output /backup/Machine_SSL.crt
/usr/lib/vmware-vmafd/bin/vecs-cli entry getkey --store MACHINE_SSL_CERT --alias __MACHINE_CERT --output /backup/Machine_SSL.key
/usr/lib/vmware-vmafd/bin/vecs-cli entry getcert --store STS_INTERNAL_SSL_CERT --alias __MACHINE_CERT --output /backup/STS.crt
/usr/lib/vmware-vmafd/bin/vecs-cli entry getkey --store STS_INTERNAL_SSL_CERT --alias __MACHINE_CERT --output /backup/STS.key
/usr/lib/vmware-vmafd/bin/vecs-cli entry delete --store STS_INTERNAL_SSL_CERT --alias __MACHINE_CERT --server localhost --upn administrator@drvsphere.local -y
/usr/lib/vmware-vmafd/bin/vecs-cli entry create --store STS_INTERNAL_SSL_CERT --alias __MACHINE_CERT --cert /backup/Machine_SSL.crt --key /backup/Machine_SSL.key --server localhost --upn administrator@drvsphere.local
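(Again my note, not support's: after recreating the entry you can confirm the two stores now hold the same certificate by exporting both and comparing thumbprints; the /tmp filenames are just examples:

/usr/lib/vmware-vmafd/bin/vecs-cli entry getcert --store MACHINE_SSL_CERT --alias __MACHINE_CERT --output /tmp/machine.crt
/usr/lib/vmware-vmafd/bin/vecs-cli entry getcert --store STS_INTERNAL_SSL_CERT --alias __MACHINE_CERT --output /tmp/sts.crt
openssl x509 -noout -fingerprint -sha1 -in /tmp/machine.crt
openssl x509 -noout -fingerprint -sha1 -in /tmp/sts.crt

Both should print the same SHA1 fingerprint.)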

Replaced the STS cert using the script.

Took a backup of all the certs except the VMCA root:

/usr/lib/vmware-vmafd/bin/vecs-cli entry getcert --store TRUSTED_ROOTS --alias 61a0c6df223b0d3c875b1777e58d03f84fbe9a5f --output /backup/61a0c6df223b0d3c875b1777e58d03f84fbe9a5f.cer
/usr/lib/vmware-vmafd/bin/vecs-cli entry getcert --store TRUSTED_ROOTS --alias 7d719c7ce6fae2313767c71e5fd1e2bd0d9de14a --output /backup/7d719c7ce6fae2313767c71e5fd1e2bd0d9de14a.cer
/usr/lib/vmware-vmafd/bin/vecs-cli entry getcert --store TRUSTED_ROOTS --alias bdab88a13be919ea69686b270e92286f8c54230b --output /backup/bdab88a13be919ea69686b270e92286f8c54230b.cer
/usr/lib/vmware-vmafd/bin/vecs-cli entry getcert --store TRUSTED_ROOTS --alias 46f3c6c1314f570e9ceba58191a94b9656c26ae9 --output /backup/46f3c6c1314f570e9ceba58191a94b9656c26ae9.cer
/usr/lib/vmware-vmafd/bin/vecs-cli entry getcert --store TRUSTED_ROOTS --alias d66832bcc10c01a0b33ae246e5fb0c3eba9ab024 --output /backup/d66832bcc10c01a0b33ae246e5fb0c3eba9ab024.cer
/usr/lib/vmware-vmafd/bin/vecs-cli entry getcert --store TRUSTED_ROOTS --alias 3393868114bf27419c76075359ba098577f9a429 --output /backup/3393868114bf27419c76075359ba098577f9a429.cer

Deleted them from the TRUSTED_ROOTS store:

/usr/lib/vmware-vmafd/bin/vecs-cli entry delete --store TRUSTED_ROOTS --alias 61a0c6df223b0d3c875b1777e58d03f84fbe9a5f -y
/usr/lib/vmware-vmafd/bin/vecs-cli entry delete --store TRUSTED_ROOTS --alias 7d719c7ce6fae2313767c71e5fd1e2bd0d9de14a -y
/usr/lib/vmware-vmafd/bin/vecs-cli entry delete --store TRUSTED_ROOTS --alias bdab88a13be919ea69686b270e92286f8c54230b -y
/usr/lib/vmware-vmafd/bin/vecs-cli entry delete --store TRUSTED_ROOTS --alias 46f3c6c1314f570e9ceba58191a94b9656c26ae9 -y
/usr/lib/vmware-vmafd/bin/vecs-cli entry delete --store TRUSTED_ROOTS --alias d66832bcc10c01a0b33ae246e5fb0c3eba9ab024 -y
/usr/lib/vmware-vmafd/bin/vecs-cli entry delete --store TRUSTED_ROOTS --alias 3393868114bf27419c76075359ba098577f9a429 -y

Unpublished them from the VMDIR database:

/usr/lib/vmware-vmafd/bin/dir-cli trustedcert unpublish --cert /backup/61a0c6df223b0d3c875b1777e58d03f84fbe9a5f.cer --login administrator@drvsphere.local
/usr/lib/vmware-vmafd/bin/dir-cli trustedcert unpublish --cert /backup/7d719c7ce6fae2313767c71e5fd1e2bd0d9de14a.cer --login administrator@drvsphere.local
/usr/lib/vmware-vmafd/bin/dir-cli trustedcert unpublish --cert /backup/bdab88a13be919ea69686b270e92286f8c54230b.cer --login administrator@drvsphere.local
/usr/lib/vmware-vmafd/bin/dir-cli trustedcert unpublish --cert /backup/46f3c6c1314f570e9ceba58191a94b9656c26ae9.cer --login administrator@drvsphere.local
/usr/lib/vmware-vmafd/bin/dir-cli trustedcert unpublish --cert /backup/d66832bcc10c01a0b33ae246e5fb0c3eba9ab024.cer --login administrator@drvsphere.local
/usr/lib/vmware-vmafd/bin/dir-cli trustedcert unpublish --cert /backup/3393868114bf27419c76075359ba098577f9a429.cer --login administrator@drvsphere.local
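If you end up doing the same cleanup, the per-alias steps above can be rolled into one loop. This is only a sketch of what support did by hand; the alias list, backup path and SSO login are placeholders to adjust for your environment:

for alias in 61a0c6df223b0d3c875b1777e58d03f84fbe9a5f 7d719c7ce6fae2313767c71e5fd1e2bd0d9de14a; do
  # export the cert first so it can be republished if needed
  /usr/lib/vmware-vmafd/bin/vecs-cli entry getcert --store TRUSTED_ROOTS --alias "$alias" --output /backup/"$alias".cer
  # remove it from the VECS TRUSTED_ROOTS store
  /usr/lib/vmware-vmafd/bin/vecs-cli entry delete --store TRUSTED_ROOTS --alias "$alias" -y
  # unpublish it from the directory so a refresh doesn't bring it back
  /usr/lib/vmware-vmafd/bin/dir-cli trustedcert unpublish --cert /backup/"$alias".cer --login administrator@drvsphere.local
done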

And then replaced the cert on vCenter, which went fine.
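One follow-up note from me: when certs are swapped manually with vecs-cli (rather than through certificate-manager, which restarts things for you), the services need to be bounced to pick up the change, something like:

service-control --stop --all
service-control --start --all

(run from the appliance shell; it takes a while on a slow DR box).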
