SergiiM
Contributor

Ramdisk is full, no space left on root

Hi all,

I have spent several days trying to figure out this issue, but I've given up and decided to ask the community for help.

So, there is a standard error: no space left on device

vdf -h

-----

Ramdisk                   Size      Used Available Use% Mounted on

root                       32M       32M        0B 100% --

etc                        28M      324K       27M   1% --

tmp                       192M       60K      191M   0% --

hostdstats                303M        4M      298M   1% --

I deleted all the logs in /var/log, which helped, but after several hours the same issue came back.

I saw that /var/lib/sfcb/registration was about 30 MB and moved some files related to the LSI controller out of it.

So,

/var # du -sh

18.1M   .

But it did not help at all.

vdf -h

-----

Ramdisk                   Size      Used Available Use% Mounted on

root                       32M       32M        0B 100% --

etc                        28M      324K       27M   1% --

tmp                       192M       60K      191M   0% --

hostdstats                303M        4M      298M   1% --

Could you please help me figure out what the issue is?

Thank you for any help.

1 Solution

Accepted Solutions
daphnissov
Immortal

I'm thinking you should reboot this host to reset the root ramdisk, as it may not be releasing space properly. It's also probably worth applying patch 11, which came out in mid-September, as it fixes a number of issues.

17 Replies
daphnissov
Immortal

From / do a du -sh * and find out which filesystem is consuming the majority of the space.
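If it helps, here is a generic, self-contained sketch of the same idea (the directory names are only stand-ins, created under /tmp for illustration): piping du through sort ranks the directories by size so the space hog stands out immediately.

```shell
# Illustrative only: build a throwaway tree, then rank its
# subdirectories by size, largest first (same idea as du -sh * from /).
mkdir -p /tmp/duscan/big /tmp/duscan/small
dd if=/dev/zero of=/tmp/duscan/big/blob bs=1024 count=512 2>/dev/null
echo tiny > /tmp/duscan/small/note.txt
du -sk /tmp/duscan/* | sort -rn   # the "big" directory sorts to the top
rm -rf /tmp/duscan
```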

SergiiM
Contributor

Hi daphnissov,

Here is the output of the command:

du -sh *

0       0

4.0K    altbootbank

122.4M  bin

4.0K    bootbank

296.0K  bootpart.gz

19.9T   dev

12.0M   etc

176.2M  lib

28.5M   lib64

4.0K    locker

116.0K  mbr

33.3M   opt

3.0M    proc

4.0K    productLocker

4.0K    sbin

4.0K    scratch

4.0K    store

538.7M  tardisks

4.0K    tardisks.noauto

68.0K   tmp

175.7M  usr

18.1M   var

4.7T    vmfs

12.0K   vmimages

4.0K    vmupgrade

daphnissov
Immortal

Also from / do a df -h and post the output.

SergiiM
Contributor

~ # df -h

Filesystem   Size   Used Available Use% Mounted on

VMFS-5     556.8G 380.3G    176.4G  68% /vmfs/volumes/DS-2-SASR5

VMFS-5     459.8G 450.4G      9.3G  98% /vmfs/volumes/DS-1-SATAR1

VMFS-5       5.5T   3.9T      1.6T  71% /vmfs/volumes/DS-3-SATAR5

vfat         4.0G   7.8M      4.0G   0% /vmfs/volumes/52503a3f-3cd0e176-d335-0025900000df

vfat       249.7M 172.0M     77.7M  69% /vmfs/volumes/38ac0f7b-66d0957e-466d-f5900f1ac9f5

vfat       249.7M 172.7M     77.0M  69% /vmfs/volumes/816b367d-043df0d7-be3d-c68ed45ff302

vfat       285.8M 203.6M     82.2M  71% /vmfs/volumes/52503a26-7e96441c-ce45-0025900000df

daphnissov
Immortal

That appears to be fine. Post output of ls -lh /

bhards4
Hot Shot

Hi,

Remove the unnecessary files from the /var and /tmp folders.
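One cautious way to do that (a generic sketch; /tmp/prunedemo is just a stand-in directory created for the example) is to list old files first and only delete after reviewing the list:

```shell
# Illustrative: list files older than 7 days before deleting anything,
# which is safer than a blanket rm. /tmp/prunedemo stands in for /var/log.
mkdir -p /tmp/prunedemo
touch -t 202001010000 /tmp/prunedemo/old.log   # backdated file
touch /tmp/prunedemo/new.log                   # current file
find /tmp/prunedemo -type f -mtime +7          # lists only old.log
# once reviewed, the same find can delete: find ... -mtime +7 -delete
rm -rf /tmp/prunedemo
```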

-Sachin

SergiiM
Contributor

ls -lh /

total 509

-rw-r--r--    1 root     root           0 Sep 12 15:15 0

lrwxrwxrwx    1 root     root          49 May 25 17:23 altbootbank -> /vmfs/volumes/816b367d-043df0d7-be3d-c68ed45ff302

drwxr-xr-x    1 root     root         512 May 29 10:34 bin

lrwxrwxrwx    1 root     root          49 May 25 17:23 bootbank -> /vmfs/volumes/38ac0f7b-66d0957e-466d-f5900f1ac9f5

-r--r--r--    1 root     root      293.0K Mar 21  2017 bootpart.gz

drwxr-xr-x    1 root     root         512 Nov  2 16:05 dev

drwxr-xr-x    1 root     root         512 Nov  2 14:20 etc

drwxr-xr-x    1 root     root         512 May 25 17:22 lib

drwxr-xr-x    1 root     root         512 May 29 10:34 lib64

lrwxrwxrwx    1 root     root           6 May 25 17:23 locker -> /store

drwxr-xr-x    1 root     root         512 May 25 17:22 mbr

drwxr-xr-x    1 root     root         512 Jun  2 14:27 opt

drwxr-xr-x    1 root     root      128.0K Nov  2 16:05 proc

lrwxrwxrwx    1 root     root          22 May 25 17:23 productLocker -> /locker/packages/5.5.0

lrwxrwxrwx    1 root     root           4 Mar 21  2017 sbin -> /bin

lrwxrwxrwx    1 root     root          49 May 25 17:23 scratch -> /vmfs/volumes/52503a3f-3cd0e176-d335-0025900000df

lrwxrwxrwx    1 root     root          49 May 25 17:23 store -> /vmfs/volumes/52503a26-7e96441c-ce45-0025900000df

drwxr-xr-x    1 root     root         512 Jun  2 14:27 tardisks

drwxr-xr-x    1 root     root         512 May 25 17:22 tardisks.noauto

drwxrwxrwt    1 root     root         512 Nov  2 16:01 tmp

drwxr-xr-x    1 root     root         512 May 25 17:22 usr

drwxr-xr-x    1 root     root         512 May 25 17:23 var

drwxr-xr-x    1 root     root         512 May 25 17:22 vmfs

drwxr-xr-x    1 root     root         512 May 25 17:22 vmimages

lrwxrwxrwx    1 root     root          17 Mar 21  2017 vmupgrade -> /locker/vmupgrade

SergiiM
Contributor

Hi Sachin,

I did.

/tmp # du -sh

52.0K   .

/var # du -sh

18.1M   .

Most of the disk space is in /var/lib/sfcb, but I am not sure whether I can delete anything there.

daphnissov
Immortal

Show your version:  vmware -v

SergiiM
Contributor

vmware -v

VMware ESXi 5.5.0 build-5230635

daphnissov
Immortal

Show output of /var filesystem:  du -sh /var/*

SergiiM
Contributor

du -sh /var/*

4.0K    /var/core

544.0K  /var/db

17.0M   /var/lib

8.0K    /var/lock

132.0K  /var/log

4.0K    /var/opt

380.0K  /var/run

20.0K   /var/spool

4.0K    /var/tmp

daphnissov
Immortal

Strange. What hardware is this running on? Can you post your vmkwarning.log file from /var/log? And check your available inodes with stat -f /.

SergiiM
Contributor

It is running on a Supermicro server with an X8DTL motherboard.

The last entry in the vmkwarning.log file was half a year ago.

stat -f /

  File: "/"

    ID: 100000000 Namelen: 127     Type: visorfs

Block size: 4096

Blocks: Total: 279975     Free: 132683     Available: 132683

Inodes: Total: 524288     Free: 515563
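For what it's worth, those figures can be sanity-checked by converting the free block count to MiB (note that stat -f / on ESXi reports the visorfs totals, which appear to cover the whole in-memory filesystem rather than just the 32 MB root ramdisk):

```shell
# Sanity check: free space implied by the stat -f output above
# (free block count and 4096-byte block size copied from that output).
free_blocks=132683
block_size=4096
echo $(( free_blocks * block_size / 1024 / 1024 ))   # prints 518 (MiB free)
```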

daphnissov
Immortal

I'm thinking you should reboot this host to reset the root ramdisk, as it may not be releasing space properly. It's also probably worth applying patch 11, which came out in mid-September, as it fixes a number of issues.

ssavant
Contributor

We had a similar problem after upgrading some drivers the other day, but just on a few of our hosts.

If you run this on the host:

esxcli system visorfs ramdisk list

you can verify whether the ramdisk is actually full.

We spoke to VMware support about this, and the solution was simply to reboot the host one more time; after that the problem was gone.

SergiiM
Contributor

Hi all,

A reboot saved the day, so the issue is solved. I have also updated the server to the latest ESXi version.

Thanks for your help, guys!
