This is the same issue discussed in the KB below, but removing the VIB should not harm anything. I am unsure why it shows as non-compliant; you could try removing the one that is 1493 KB and installing the one that is 1602 KB.
If you remove the existing one from the list, then only that one is removed, so when the host is scanned against the baseline, the missing VIB can lead to a non-compliant result.
Interesting. I see in the kb: "Workaround: Create a custom baseline without the conflicting vib."
I've done this and it does seem to work. With that VIB removed, the host shows as non-compliant with the default built-in baseline. I'll have to make sure the default baseline remains unattached, since another admin is likely to come along in a few days/weeks and apply it (thus reinstalling the conflicting VIB).
So I think I may have brought this on myself by having the HPE vibsdepot as a patch source in addition to the default VMware repository. I just reset VUM back to defaults and it seems to have cleared all of this up.
Just to add to this discussion, we have an identical problem to the one you've reported. In our case, we are adding two new HPE Gen10 Blade Servers to a system with 630FLB and 534M HBAs, Virtual Connect switches, and a 3PAR system, all operating over an iSCSI SAN, so we are definitely using hardware that depends on this VIB and related drivers.
Like you, we started with the HPE 6.7U2 Custom ISO. Based on HPE's Technical White Paper at:
we were aware of the conflicting VIB at the outset of this exercise (see Appendix O in that document). However, as you have noted, deleting this VIB does not resolve the problem, and after many attempts we just keep going around in circles, always coming back to a state where we have duplicate, incompatible "elx-esx-libelxima" VIBs. One shows up as "elx-esx-libelxima.so" while the other usually shows up as "elx-esx-libelxima.so-8169922." The creation dates for these two variants vary, depending on the order in which we try to remediate the hosts. I've also tried manually installing the latest version of this VIB after deleting the duplicates, but that actually made the remediation problem worse.
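For anyone fighting the same duplicates, here is a small sketch of how I'd script the cleanup. The VIB names come from this thread; the version and date on the second sample line are placeholders, so substitute your host's actual `esxcli software vib list` output:

```shell
# Sample `esxcli software vib list` lines showing both duplicate variants.
# Names are from this thread; the second line's version/date are placeholders.
dupes='elx-esx-libelxima.so          12.0.1108.0-03  ELX  VMwareCertified  2018-08-23
elx-esx-libelxima.so-8169922  12.0.1108.0-03  ELX  VMwareCertified  2019-04-10'

# Extract just the VIB names, one per line, to feed into
#   esxcli software vib remove -n <name>
echo "$dupes" | awk '{print $1}'
# -> elx-esx-libelxima.so
# -> elx-esx-libelxima.so-8169922
```

On the host itself you would pipe the live listing instead: `esxcli software vib list | awk '/libelxima/{print $1}'`.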
We also see problems with the "hpe-driver-bundle-670," of which there are five versions: 10.2.0, 10.3.0, 10.3.5, 10.4.0, and 10.4.1. Our ESXi host is actually using the 10.4.1 bundle, which is the latest, from April 10, 2019. After we remediate one of our new ESXi Gen10 hosts, we see that we are non-compliant with the VMware predefined "Non-Critical Host Patches" baseline. What is interesting is that there are only two patches that are not compliant:
But this makes no sense, given that the 10.4.1 bundle is what we actually have installed.
Honestly, this is a maddening problem. Even following HPE's instructions to delete the "elx-esx-libelxima" VIB results in a situation where remediating restores the two conflicting VIBs, which then block further remediation. It isn't clear which vendor is the source of this problem, but it seems absurd to be stuck with it either way.
We're still rolling with the VUM defaults. We haven't put the HPE vibsdepot URL back in (as much as we'd like to).
We have exactly the same problem here with our new HPE Gen10 server. Has anyone found a newer version of this VIB yet?
I ran into the same issue with some Synergy SY480 Compute nodes. Followed this guide:
esxcli software vib list | grep ELX
Command Result –
elx-esx-libelxima.so 12.0.1108.0-03 ELX VMwareCertified 2018-08-23
Now to remove the VIB, use the command mentioned below.
esxcli software vib remove -n elx-esx-libelxima.so
This worked for me. After the reboot, I was able to scan the hosts and update them with VUM.
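To double-check after the reboot that nothing named libelxima is left, something like this works (the sample listing below is an illustrative stand-in; on a live host, pipe the real `esxcli software vib list` output instead):

```shell
# Simulated post-removal `esxcli software vib list` output; the driver names
# here (lpfc, brcmfcoe) are just illustrative stand-ins for real lines.
vib_list='lpfc      12.0.1108.0-1OEM  ELX  VMwareCertified  2019-04-10
brcmfcoe  12.0.1278.0-1OEM  ELX  VMwareCertified  2019-04-10'

# On a live host: esxcli software vib list | grep -c libelxima
# grep -c exits non-zero when it finds nothing, hence the || true.
count=$(echo "$vib_list" | grep -c libelxima || true)
if [ "$count" -eq 0 ]; then
  echo "libelxima removed"   # prints only when no variant is left
fi
```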
I'm hitting these problems too. I used vibsdepot with 5.5 and 6.5 without issues, so why the problems with 6.7? I'm confused.
I am hitting the same problem here. Is there any solution that gets rid of it and still lets us:
a) follow the HPE baseline
b) follow the ESXi patches
I will add to this, as it has consumed my entire weekend and then some. I have tried nearly everything myself. The issue involves both HPE driver bundles, 10.3.0 and 10.3.5. I have tried installing them independently, together, 10.3.5 first, 10.3.0 second... nothing works.
As someone else said, I even installed 10.4.1 by itself. It works, but VUM then complains that the 10.3.0 and 10.3.5 patches are needed. These show up in the default non-critical patches baseline.
I have also tried just updating the host with ESXi670-Update02, and it still complains that the patches above are missing. The other problem: if I install U2 and then the 10.3.x updates, uninstalling the 10.3.x updates makes VUM complain that U2 is not installed, and I have to install it all over again.
This problem has been lingering since April, and after hours of digging I can find nothing anywhere that fixes it, only workarounds.
I opened an SR with VMware; they told me it's a third-party problem and that I had to open an SR with HPE, so I did. The painful answer and solution was first to uninstall the conflicting VIB and then to add a modified download source: http://vibsdepot.hpe.com/hpe/apr.11.2019/index-drv.xml
With this, it works again. But keep two things in mind: first, the download source should only stay modified until HPE provides the next ISO image, and second, you will only be able to update 6.7 hosts!
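One detail worth calling out about that modified URL, as I understand HPE's depot layout (this is my reading, not official guidance): `index-drv.xml` is the drivers-only index, while the usual `index.xml` serves the full bundle set, so the path you point VUM at determines what it sees:

```shell
# The modified download source from HPE's SR answer above.
depot_url='http://vibsdepot.hpe.com/hpe/apr.11.2019/index-drv.xml'

# The trailing file name decides what VUM sees: index-drv.xml should serve
# only the driver bundles, while index.xml would serve the full depot.
index_file="${depot_url##*/}"
if [ "$index_file" = "index-drv.xml" ]; then
  echo "drivers-only index"   # prints for the modified URL
fi
```

If you ever swap the date segment (e.g. apr.11.2019) for a newer drop, the same check tells you which flavor of index you are about to attach.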