ncarde
Enthusiast

Fresh VI3 Builds: Current patch list won't fit in a 2GB /tmp partition?


Hello,

How many of you build a new VI-3 Host and then apply all patches?

I just found out that a 2GB /tmp partition won't cut it if you want to apply all of them after the initial build -- I had to temporarily tuck them into our 4GB '/' space.

Will this cause anyone to re-think their partition size scheme (discussed in this excellent thread: http://www.vmware.com/community/thread.jspa?threadID=46345&start=100&tstart=0)?

That 3 or 4 GB /tmp space that some of you chose is looking mighty nice now -- too bad I already built my hosts. :(


Accepted Solutions
depping
Leadership

http://vmprofessional.com/material/esx-autopatch.html

Use the one above; it will unpack, install, and delete each tar before moving on to the next patch. Store your patches on an NFS share or a SAN-based VMFS volume -- that way you don't need an enormous amount of extra disk space in your /var/updates.

If you find these posts helpful, please award points using the Helpful/Correct buttons.

Duncan

8 Replies
conradsia
Hot Shot

Are you using the update script to run them or unpacking them and running them manually?

ncarde
Enthusiast

"Are you using the update script to run them or unpacking them and running them manually?"

I stage all of the update tar balls in:

/tmp/updates

Then I run the following against each individual update (by month):

esxupdate --noreboot -r file:

When I look at /tmp/updates I'm seeing 2.1GB of disk consumption:

[root@host root]# vdf -h /tmp/updates

Filesystem Size Used Avail Use% Mounted on

/dev/cciss/c0d0p2 4.0G 2.1G 1.8G 54% /tmp/updates

I suppose I could untar them to a different location or just use the / partition and make sure I rm -rf it afterward...
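For comparison with the delete-as-you-go scripts mentioned later in this thread, the manual flow above amounts to something like this (bundle names are hypothetical, and esxupdate is only echoed): nothing gets removed between updates, which is why the staged bundles and anything extracted from them pile up to 2.1GB in /tmp/updates.

```shell
#!/bin/sh
# Manual staging: every bundle stays in place for the whole run, so the
# tarballs and their extracted trees all count against /tmp at once.
stage_and_apply() {
    for tarball in "$1"/*.tgz; do
        [ -e "$tarball" ] || continue
        # on a real host: esxupdate --noreboot -r "file://${tarball%.tgz}" update
        echo "would update from $(basename "$tarball")"
        # note: no cleanup between patches -- disk usage only ever grows
    done
}
stage_and_apply /tmp/updates
```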

conradsia
Hot Shot

You should look into using one of these scripts for your patches. I prefer this one: http://www.rtfm-ed.co.uk/?page_id=343

http://www.rtfm-ed.co.uk/downloads/esx_apply_patches.sh

But there is another one that I haven't tried here:

http://vmprofessional.com/material/esx-autopatch.html

For the first script, all you do is download the patches into /var/updates and then run the script; it will extract the patches and run them in the correct order, from oldest to newest. I put the patches on a CD, copy them into the folder, and just let it run. I have never had an issue with space because it deletes them after they are run.

There is also a script on the forum somewhere that you can run on your ftp server as a task that will automatically go out and fetch new patches.

ncarde
Enthusiast

Very cool - thank you.

If I'm reading the script correctly, though, it would still fail in my scenario -- unless I'm missing something, it doesn't remove the patches or extracted folders until the very end.

What I saw was that I was able to stage all of the patches in /tmp/updates and extract them there in place, but in the course of updating via esxupdate it complained:

"Error locking /var/run/esxupdate.pid [Error 28] No space left on device"

Now, I fully believe I'm missing something, because otherwise the script you mentioned would be failing for many people besides myself -- unless the recently added June patches pushed something over the tipping point for 2GB /var partitions...

conradsia
Hot Shot

The July patches added about another 100MB, so it probably won't fit nicely on a CD any more. I only have 2GB in /var and I have never run into this problem; I just ran some updates about an hour ago using these steps.

Can you post your df -h when there are no patches in /var/?

Can you du -h /var as well? What is the total size of /var without the patches?

From my patch folder the patches were 686MB compressed; uncompressed, that shouldn't fill up two gigs even at a 50% compression ratio, so I suspect there is some more data lingering around your /var partition.
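As a back-of-the-envelope check of that math (686MB is the figure above; treating 50% compression as roughly 2x expansion on disk is an assumption, not a measurement):

```shell
# Rough space math for a 2GB /var during patching.
compressed_mb=686                            # downloaded patch tarballs
unpacked_mb=$(( compressed_mb * 2 ))         # ~50% compression => ~2x extracted
both_mb=$(( compressed_mb + unpacked_mb ))   # tarballs AND extracted trees
echo "unpacked alone: ${unpacked_mb} MB"
echo "staged + unpacked: ${both_mb} MB"
```

The unpacked patches alone (~1.4GB) fit in the ~1.8GB that df shows free on a fresh 2GB /var, but holding the tarballs and the extracted trees at the same time (~2GB) does not -- which would be consistent with hitting "No space left on device" only when nothing is deleted between patches.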

Message was edited by:

conradsia


ncarde
Enthusiast

Thank you -- this is what my /var looks like after a fresh build:

[root@system root]# df -h /var

Filesystem Size Used Avail Use% Mounted on

/dev/cciss/c0d0p6 2.0G 155M 1.8G 9% /var

[root@system root]# du -h /var

16K /var/lost+found

16M /var/lib/rpm

4.0K /var/lib/games

4.0K /var/lib/misc

4.0K /var/lib/alternatives

8.0K /var/lib/ntp

4.0K /var/lib/dhcp

4.0K /var/lib/nfs/statd

8.0K /var/lib/nfs

59M /var/lib/vmware/hostd/docroot/client

4.0K /var/lib/vmware/hostd/docroot/downloads

948K /var/lib/vmware/hostd/docroot/sdk

60M /var/lib/vmware/hostd/docroot

4.0K /var/lib/vmware/hostd/journal

1.6M /var/lib/vmware/hostd/stats

62M /var/lib/vmware/hostd

62M /var/lib/vmware

77M /var/lib

4.0K /var/tmp

4.0K /var/log/vmware/webAccess

372K /var/log/vmware

160K /var/log/initrdlogs

4.0K /var/log/vmksummary.d

24K /var/log/oldconf

908K /var/log

4.0K /var/cache/man/X11R6/cat1

4.0K /var/cache/man/X11R6/cat2

4.0K /var/cache/man/X11R6/cat3

4.0K /var/cache/man/X11R6/cat4

4.0K /var/cache/man/X11R6/cat5

4.0K /var/cache/man/X11R6/cat6

4.0K /var/cache/man/X11R6/cat7

4.0K /var/cache/man/X11R6/cat8

4.0K /var/cache/man/X11R6/cat9

4.0K /var/cache/man/X11R6/catn

44K /var/cache/man/X11R6

4.0K /var/cache/man/cat1

4.0K /var/cache/man/cat2

4.0K /var/cache/man/cat3

4.0K /var/cache/man/cat4

4.0K /var/cache/man/cat5

4.0K /var/cache/man/cat6

4.0K /var/cache/man/cat7

4.0K /var/cache/man/cat8

4.0K /var/cache/man/cat9

4.0K /var/cache/man/catn

4.0K /var/cache/man/local/cat1

4.0K /var/cache/man/local/cat2

4.0K /var/cache/man/local/cat3

4.0K /var/cache/man/local/cat4

4.0K /var/cache/man/local/cat5

4.0K /var/cache/man/local/cat6

4.0K /var/cache/man/local/cat7

4.0K /var/cache/man/local/cat8

4.0K /var/cache/man/local/cat9

4.0K /var/cache/man/local/catn

44K /var/cache/man/local

324K /var/cache/man

4.0K /var/cache/samba/winbindd_privileged

8.0K /var/cache/samba

4.0K /var/cache/yum

340K /var/cache

4.0K /var/db

4.0K /var/local

4.0K /var/lock/subsys

8.0K /var/lock

4.0K /var/nis

4.0K /var/opt

4.0K /var/preserve

4.0K /var/run/console

4.0K /var/run/sudo

4.0K /var/run/netreport

4.0K /var/run/saslauthd

4.0K /var/run/winbindd

4.0K /var/run/vmware/root_0/1184505652291137_101060

4.0K /var/run/vmware/root_0/1184505678143804_1261

4.0K /var/run/vmware/root_0/1184588815042812_6053

16K /var/run/vmware/root_0

36K /var/run/vmware

92K /var/run

4.0K /var/spool/lpd

4.0K /var/spool/mail

4.0K /var/spool/cron

4.0K /var/spool/repackage

20K /var/spool

4.0K /var/yp/binding

12K /var/yp

4.0K /var/core

4.0K /var/empty/sshd

8.0K /var/empty

5.7M /var/pegasus/bin

14M /var/pegasus/lib

4.0K /var/pegasus/repository/root/classes

4.0K /var/pegasus/repository/root/instances

4.0K /var/pegasus/repository/root/qualifiers

16K /var/pegasus/repository/root

1.8M /var/pegasus/repository/root#PG_InterOp/classes

1.4M /var/pegasus/repository/root#PG_InterOp/instances

252K /var/pegasus/repository/root#PG_InterOp/qualifiers

3.4M /var/pegasus/repository/root#PG_InterOp

40K /var/pegasus/repository/root#PG_Internal/classes

4.0K /var/pegasus/repository/root#PG_Internal/instances

252K /var/pegasus/repository/root#PG_Internal/qualifiers

300K /var/pegasus/repository/root#PG_Internal

21M /var/pegasus/repository/vmware#esxv2/classes

20K /var/pegasus/repository/vmware#esxv2/instances

256K /var/pegasus/repository/vmware#esxv2/qualifiers

22M /var/pegasus/repository/vmware#esxv2

25M /var/pegasus/repository

120K /var/pegasus/mofs

4.0K /var/pegasus/vmware/install_queue

8.0K /var/pegasus/vmware

45M /var/pegasus

123M /var

ncarde
Enthusiast

Thank you all for the replies.
