Update command:
# esxcli software profile update -p ESXi-6.7.0-20190402001-standard -d https://hostupdate.vmware.com/software/VUM/PRODUCTION/main/vmw-depot-index.xml
[OSError]
[Errno 28] No space left on device
Please refer to the log file for more details.
Swap config:
Swap is enabled, using datastore1, which has more than 100 GB of free space.
Please see /var/log/esxupdate.log below:
2019-04-16T00:41:48Z esxupdate: 2099451: Ramdisk: INFO: Unmounting manual tardisk /tardisks.noauto/esxupdt-2099451
2019-04-16T00:41:48Z esxupdate: 2099451: Ramdisk: INFO: Unmounting manual tardisk /tardisks.noauto/weaselin-2099451
2019-04-16T00:41:48Z esxupdate: 2099451: root: ERROR: Traceback (most recent call last):
2019-04-16T00:41:48Z esxupdate: 2099451: root: ERROR: File "/usr/lib/vmware/esxcli-software", line 470, in <module>
2019-04-16T00:41:48Z esxupdate: 2099451: root: ERROR: main()
2019-04-16T00:41:48Z esxupdate: 2099451: root: ERROR: File "/usr/lib/vmware/esxcli-software", line 461, in main
2019-04-16T00:41:48Z esxupdate: 2099451: root: ERROR: ret = CMDTABLE[command](options)
2019-04-16T00:41:48Z esxupdate: 2099451: root: ERROR: File "/usr/lib/vmware/esxcli-software", line 213, in ProfileUpdateCmd
2019-04-16T00:41:48Z esxupdate: 2099451: root: ERROR: nohwwarning=opts.nohwwarning)
2019-04-16T00:41:48Z esxupdate: 2099451: root: ERROR: File "/build/mts/release/bora-10764712/bora/build/esx/release/vmvisor/esxupdate/lib64/python3.5/site-packages/vmware/esximage/Transaction.py", line 375, in UpdateProfileFromDepot
2019-04-16T00:41:48Z esxupdate: 2099451: root: ERROR: File "/tmp/esx-update-2099451/usr/lib/vmware/weasel/util/upgrade_precheck.py", line 2161, in cliUpgradeAction
2019-04-16T00:41:48Z esxupdate: 2099451: root: ERROR: File "/tmp/esx-update-2099451/usr/lib/vmware/weasel/util/upgrade_precheck.py", line 997, in _parseVmwareVersion
2019-04-16T00:41:48Z esxupdate: 2099451: root: ERROR: File "/build/mts/release/bora-10764712/bora/build/esx/release/vmvisor/sys-boot/lib64/python3.5/subprocess.py", line 514, in getoutput
2019-04-16T00:41:48Z esxupdate: 2099451: root: ERROR: File "/build/mts/release/bora-10764712/bora/build/esx/release/vmvisor/sys-boot/lib64/python3.5/subprocess.py", line 495, in getstatusoutput
2019-04-16T00:41:48Z esxupdate: 2099451: root: ERROR: File "/build/mts/release/bora-10764712/bora/build/esx/release/vmvisor/sys-boot/lib64/python3.5/subprocess.py", line 316, in check_output
2019-04-16T00:41:48Z esxupdate: 2099451: root: ERROR: File "/build/mts/release/bora-10764712/bora/build/esx/release/vmvisor/sys-boot/lib64/python3.5/subprocess.py", line 383, in run
2019-04-16T00:41:48Z esxupdate: 2099451: root: ERROR: File "/build/mts/release/bora-10764712/bora/build/esx/release/vmvisor/sys-boot/lib64/python3.5/subprocess.py", line 676, in __init__
2019-04-16T00:41:48Z esxupdate: 2099451: root: ERROR: File "/build/mts/release/bora-10764712/bora/build/esx/release/vmvisor/sys-boot/lib64/python3.5/subprocess.py", line 1228, in _execute_child
2019-04-16T00:41:48Z esxupdate: 2099451: root: ERROR: OSError: [Errno 28] No space left on device
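For anyone hitting this traceback: Errno 28 during a profile update usually points at a full ramdisk (/tmp, /var), a full bootbank/locker vfat partition, or a dangling /locker symlink, not at the VMFS datastore itself. A quick triage sketch (my addition, not from the thread; guarded so it is a no-op on anything other than an ESXi host):

```shell
# Triage for "No space left on device" during esxcli software profile update.
# Exit early if this is not an ESXi host, so nothing below runs by accident.
command -v esxcli >/dev/null 2>&1 || { echo "esxcli not found; run this on an ESXi host"; exit 0; }

vdf -h                                  # ramdisk and tardisk usage
esxcli system visorfs ramdisk list      # ramdisk sizes, reservations, peak usage
df -h | grep vfat                       # bootbank/altbootbank/store partitions
ls -l /locker /productLocker /scratch   # these symlinks should resolve, not dangle
```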
Hi jimgreeny and welcome to the community!
Can you run vdf -h and return the results here?
Kind regards.
Hi ThompsG, thanks for your reply.
1. VDF Output:
[root@DESKTOP-974PNND:~] vdf -h
Tardisk Space Used
vmx.v00 102M 102M
vim.v00 110M 110M
sb.v00 186M 186M
s.v00 59M 59M
net55_r8.t00 4M 4M
ata_liba.v00 296K 294K
ata_pata.v00 48K 44K
ata_pata.v01 36K 35K
ata_pata.v02 40K 36K
ata_pata.v03 40K 37K
ata_pata.v04 44K 41K
ata_pata.v05 40K 38K
ata_pata.v06 36K 35K
ata_pata.v07 40K 39K
block_cc.v00 88K 84K
bnxtnet.v00 544K 542K
bnxtroce.v00 304K 303K
brcmfcoe.v00 2M 2M
char_ran.v00 52K 51K
ehci_ehc.v00 96K 95K
elxiscsi.v00 516K 514K
elxnet.v00 644K 641K
hid_hid.v00 64K 60K
i40en.v00 540K 537K
iavmd.v00 180K 176K
igbn.v00 328K 325K
ima_qla4.v00 1M 1M
ipmi_ipm.v00 44K 40K
ipmi_ipm.v01 84K 82K
ipmi_ipm.v02 104K 102K
iser.v00 236K 234K
ixgben.v00 492K 490K
lpfc.v00 2M 2M
lpnic.v00 644K 641K
lsi_mr3.v00 348K 344K
lsi_msgp.v00 512K 508K
lsi_msgp.v01 532K 531K
lsi_msgp.v02 508K 507K
misc_cni.v00 24K 22K
misc_dri.v00 1M 1M
mtip32xx.v00 256K 254K
ne1000.v00 640K 639K
nenic.v00 268K 267K
net_bnx2.v00 288K 286K
net_bnx2.v01 2M 2M
net_cdc_.v00 28K 25K
net_cnic.v00 148K 145K
net_e100.v00 312K 311K
net_e100.v01 356K 353K
net_enic.v00 176K 175K
net_fcoe.v00 80K 79K
net_forc.v00 128K 125K
net_igb.v00 324K 321K
net_ixgb.v00 412K 409K
net_libf.v00 76K 75K
net_mlx4.v00 356K 355K
net_mlx4.v01 240K 236K
net_nx_n.v00 1M 1M
net_tg3.v00 316K 312K
net_usbn.v00 56K 53K
net_vmxn.v00 112K 108K
nfnic.v00 500K 496K
nhpsa.v00 568K 565K
nmlx4_co.v00 728K 724K
nmlx4_en.v00 756K 752K
nmlx4_rd.v00 296K 294K
nmlx5_co.v00 1M 1M
nmlx5_rd.v00 256K 255K
ntg3.v00 116K 115K
nvme.v00 288K 284K
nvmxnet3.v00 172K 171K
nvmxnet3.v01 172K 170K
ohci_usb.v00 64K 63K
pvscsi.v00 120K 118K
qcnic.v00 284K 281K
qedentv.v00 2M 2M
qfle3.v00 2M 2M
qfle3f.v00 1M 1M
qfle3i.v00 336K 332K
qflge.v00 504K 500K
sata_ahc.v00 88K 87K
sata_ata.v00 60K 56K
sata_sat.v00 68K 66K
sata_sat.v01 48K 46K
sata_sat.v02 52K 48K
sata_sat.v03 40K 39K
sata_sat.v04 36K 34K
scsi_aac.v00 180K 178K
scsi_adp.v00 452K 450K
scsi_aic.v00 296K 292K
scsi_bnx.v00 284K 280K
scsi_bnx.v01 204K 203K
scsi_fni.v00 244K 242K
scsi_hps.v00 212K 211K
scsi_ips.v00 108K 106K
scsi_isc.v00 44K 42K
scsi_lib.v00 212K 211K
scsi_meg.v00 100K 99K
scsi_meg.v01 176K 173K
scsi_meg.v02 96K 93K
scsi_mpt.v00 460K 459K
scsi_mpt.v01 504K 501K
scsi_mpt.v02 432K 429K
scsi_qla.v00 292K 289K
shim_isc.v00 16K 15K
shim_isc.v01 16K 15K
shim_lib.v00 44K 43K
shim_lib.v01 44K 43K
shim_lib.v02 24K 20K
shim_lib.v03 24K 20K
shim_lib.v04 12K 9K
shim_lib.v05 12K 9K
shim_vmk.v00 220K 217K
shim_vmk.v01 232K 228K
shim_vmk.v02 236K 232K
smartpqi.v00 268K 265K
uhci_usb.v00 64K 63K
usb_stor.v00 164K 162K
usbcore_.v00 320K 318K
vmkata.v00 204K 201K
vmkfcoe.v00 936K 935K
vmkplexe.v00 48K 46K
vmkusb.v00 984K 980K
vmw_ahci.v00 280K 276K
xhci_xhc.v00 236K 235K
elx_esx_.v00 4M 4M
btldr.t00 956K 952K
esx_dvfi.v00 492K 488K
esx_ui.v00 14M 14M
esxupdt.v00 580K 576K
weaselin.t00 4M 4M
lsu_hp_h.v00 152K 149K
lsu_inte.v00 44K 41K
lsu_lsi_.v00 240K 237K
lsu_lsi_.v01 480K 477K
lsu_lsi_.v02 264K 260K
lsu_lsi_.v03 560K 557K
lsu_smar.v00 96K 93K
native_m.v00 816K 815K
qlnative.v00 2M 2M
rste.v00 828K 825K
vmware_e.v00 188K 186K
vsan.v00 41M 41M
vsanheal.v00 7M 7M
vsanmgmt.v00 23M 23M
xorg.v00 3M 3M
imgdb.tgz 1M 1M
state.tgz 16K 14K
-----
Ramdisk Size Used Available Use% Mounted on
root 32M 2M 29M 7% --
etc 28M 172K 27M 0% --
opt 32M 0B 32M 0% --
var 48M 344K 47M 0% --
tmp 256M 8K 255M 0% --
iofilters 32M 0B 32M 0% --
shm 1024M 0B 1024M 0% --
hostdstats 182M 1M 180M 0% --
2. Swap Config:
I am having exactly the same issue with 4 hosts.
All of them have 80% free storage, but it still says "full disk".
Same here!
Trying to upgrade two 6.5.x hosts that boot from dual 8 GB SD cards, using the Dell custom 6.7 U1 ISO, and I get the same error. After booting back into the old 6.5 I see 100% usage on one of the VFAT partitions.
It was an Essentials Plus customer, so we decided to reinstall ESXi instead of upgrading.
Regards
Joerg
I managed to get it fixed.
Run the commands below to install the locker VIB manually:
cd /tmp
esxcli software vib install -f -v /tmp/VMware_locker_tools-light_6.5.0-0.23.5969300.vib
Hi Centosuser,
Thanks for the reply.
I tried reinstalling the tools-light component over SSH, rebooted the system, and then tried to update the image profile again, but I still get the error.
# esxcli software vib list | grep tools
tools-light 10.3.5.10430147-12986307 VMware VMwareCertified 2019-04-17
Anyway, I will create a customized ISO of 6.7 U2 and reinstall my system.
Hi,
when I tried to upgrade to 6.7 U2, the same error occurred, but after executing the following commands I successfully upgraded my server:
cd /tmp
esxcli software vib install -f -v /tmp/VMware_locker_tools-light_10.3.5.10430147-12986307.vib
esxcli software profile update -d https://hostupdate.vmware.com/software/VUM/PRODUCTION/main/vmw-depot-index.xml -p ESXi-6.7.0-20190402001-standard
I have the same problem: I cannot upgrade from Update 1 to Update 2 via the command line. I already adjusted the swap, updated the tools-light locker VIB to the version mentioned earlier, and rebooted the system afterwards, but I still get the same message that there is not enough space.
Check df -h. Usually one of the vfat partitions has filled up, causing the issue.
Look for something like the output below and clear space. You might have to remove some unwanted VIBs.
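The bootbank/altbootbank vfat volumes are only about 250 MB, so a stale VIB can fill one even when the datastores are nearly empty. A minimal sketch of the "look for a full vfat partition" check (the sample lines and the 60% threshold are my own choices for illustration; on a live host you would pipe `df -h` straight into the awk filter):

```shell
# Sample `df -h` lines, as posted below in this thread:
sample='vfat       249.7M 157.4M  92.3M  63% /vmfs/volumes/7e6fbb7b-04b7c69d-edc1-acc0ac21db86
vfat         4.0G  46.2M   4.0G   1% /vmfs/volumes/5b3f22be-08fa7fbb-6f98-000af7704150'

# Print any vfat partition that is more than 60% full (threshold is arbitrary).
# Fields: 1=filesystem 2=size 3=used 4=available 5=use% 6=mount point.
full=$(printf '%s\n' "$sample" | awk '$1 == "vfat" { sub(/%/, "", $5); if ($5 + 0 > 60) print $6 " is " $5 "% full" }')
echo "$full"
```

To free space on a flagged partition, `esxcli software vib list` and `esxcli software vib remove -n <vibname>` can be used to drop VIBs you no longer need.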
What I Wish Everyone Knew About ESXi Partition
[root@xxxx:~] ls -l
total 1189
lrwxrwxrwx 1 root root 49 May 20 04:47 altbootbank -> /vmfs/volumes/63f8bd77-7e77496c-db06-c9243800b975
drwxr-xr-x 1 root root 512 May 20 04:45 bin
lrwxrwxrwx 1 root root 49 May 20 04:47 bootbank -> /vmfs/volumes/7e6fbb7b-04b7c69d-edc1-acc0ac21db86
-r--r--r-- 1 root root 530628 Jan 8 03:30 bootpart.gz
-r--r--r-- 1 root root 414642 Jan 8 03:30 bootpart4kn.gz
drwxr-xr-x 19 root root 512 Jun 9 06:33 dev
drwxr-xr-x 1 root root 512 Jun 9 05:48 etc
drwxr-xr-x 1 root root 512 May 20 04:45 lib
drwxr-xr-x 1 root root 512 May 20 04:45 lib64
-r-x------ 1 root root 49116 May 20 04:43 local.tgz
lrwxrwxrwx 1 root root 6 May 20 04:47 locker -> /store
drwxr-xr-x 1 root root 512 May 20 04:45 mbr
drwxr-xr-x 1 root root 512 May 20 04:45 opt
drwxr-xr-x 1 root root 131072 Jun 9 06:33 proc
lrwxrwxrwx 1 root root 29 May 20 04:47 productLocker -> /locker/packages/vmtoolsRepo/
lrwxrwxrwx 1 root root 4 Jan 8 03:13 sbin -> /bin
lrwxrwxrwx 1 root root 49 May 20 04:47 scratch -> /vmfs/volumes/5b3f22be-08fa7fbb-6f98-000af7704150
lrwxrwxrwx 1 root root 49 May 20 04:47 store -> /vmfs/volumes/5b3f22b8-6652cebc-9236-000af7704150
drwxr-xr-x 1 root root 512 May 20 04:45 tardisks
drwxr-xr-x 1 root root 512 May 20 04:45 tardisks.noauto
drwxrwxrwt 1 root root 512 Jun 9 06:06 tmp
drwxr-xr-x 1 root root 512 May 20 04:45 usr
drwxr-xr-x 1 root root 512 May 20 04:47 var
drwxr-xr-x 1 root root 512 May 20 04:45 vmfs
drwxr-xr-x 1 root root 512 May 20 04:45 vmimages
lrwxrwxrwx 1 root root 18 Jan 8 03:13 vmupgrade -> /locker/vmupgrade/
[root@xxxx:~] df -h
Filesystem Size Used Available Use% Mounted on
NFS 7.7T 7.3T 390.5G 95% /vmfs/volumes/exit15_ISOs
VMFS-5 271.2G 85.3G 185.9G 31% /vmfs/volumes/is-tse-d127_1
VMFS-6 499.8G 245.0G 254.7G 49% /vmfs/volumes/Shared-Datastore
vfat 249.7M 157.4M 92.3M 63% /vmfs/volumes/7e6fbb7b-04b7c69d-edc1-acc0ac21db86
vfat 285.8M 174.3M 111.5M 61% /vmfs/volumes/5b3f22b8-6652cebc-9236-000af7704150
vfat 4.0G 46.2M 4.0G 1% /vmfs/volumes/5b3f22be-08fa7fbb-6f98-000af7704150
vfat 249.7M 157.2M 92.5M 63% /vmfs/volumes/63f8bd77-7e77496c-db06-c9243800b975
I'm having the same problem with a fresh install of 6.7u2. Same error message and log output as the original poster. Also like the original poster, my issue does not seem related to the locker VIB. jimgreeny, did you ever figure this out?
I was getting exactly the same problem, and in the end I gave up trying to update from a scripted download. Instead, I grabbed the update bundle zip file from https://my.vmware.com/group/vmware/patch#search
1. download the zip file
2. upload it to a datastore
3. I renamed it to update.zip (optional, really)
4. esxcli software vib update -d /vmfs/volumes/<DATASTORE-UUID>/update.zip
5. reboot
I still have no idea why the other method was causing the Errno 28 despite trying all of the swap/locker/cleanup solutions, but this one worked, and as I am typing the host is now up to date.
Hope it helps someone else.
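The five steps above can be sketched as the commands below. The datastore path and bundle name are examples, not values from any host in this thread, and the `sources profile list` step is my addition so you can see what the bundle contains before applying it. The whole thing is guarded so it exits harmlessly anywhere other than an ESXi host:

```shell
# Offline-bundle update flow (sketch): upload the patch zip to a datastore,
# inspect it, then update from it instead of from the online depot.
command -v esxcli >/dev/null 2>&1 || { echo "esxcli not found; run this on an ESXi host"; exit 0; }

DEPOT=/vmfs/volumes/datastore1/update.zip   # example path: the uploaded bundle (steps 2-3)

# Optional: list the image profiles contained in the bundle first.
esxcli software sources profile list -d "$DEPOT"

# Update all installed VIBs from the bundle (step 4), then reboot (step 5).
esxcli software vib update -d "$DEPOT"
reboot
```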
Many thanks - it helped me, all the other hints did not work but this one.
This was driving me crazy too, until i found your solution, THANKYOU.
Thank you VERY much PsyMan2000 for taking the time to provide this solution. It worked for me too.
Thank you. Not only did this allow me to upgrade my ESXi 6.7 system, but it was also the first time I used an "offline" method.
Since my work involves ESXi systems that are NOT connected to the Internet, this will be very helpful!
Thanks again!