VMware Cloud Community
loungehostmaste
Enthusiast

VDP and gsan.log

Is there any way to reduce the log spam of the VDP appliance?

Fresh deployment of vSphereDataProtection-6.1.5.ova on 2016-12-06, and until today 280 such 25 MB files have shown up.
Since the appliance itself is thin-provisioned on NFS, it just keeps growing uselessly, and 50-75 MB per day is way too much.

What makes this even worse is that UNMAP does not work on NFS, and we regularly copy the whole appliance to a USB3 hard disk with "cp -R --sparse=always" (a rough sketch of that copy job is below).
A log sample is at the bottom - frankly, if something goes wrong I would rather just re-deploy anyway.
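For context, a minimal sketch of what that copy job might look like, assuming a hypothetical /mnt/nfs/vdp-appliance mount for the appliance files and /mnt/usb-backup as the USB3 target (both paths are assumptions, not taken from the actual setup):

#!/bin/bash
# Sketch only: copy the thin-provisioned appliance files to a USB3 disk,
# keeping the copy sparse so unallocated blocks take no space on the target.
# SRC and DST are hypothetical paths - adjust to the real environment.
SRC="/mnt/nfs/vdp-appliance"
DST="/mnt/usb-backup/vdp-appliance"

/bin/mkdir -p "$DST"
/bin/cp -R --sparse=always "$SRC/." "$DST/"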

[root@vmware-recovery:/etc/cron.daily]$ cat /usr/local/bin/remove-gsan-logs.sh

#!/bin/bash
/usr/bin/find /data01/ -mtime +3 -type f -regex ".*\/gsan\.log\.[0-9][0-9].*" -exec /bin/ls -lh {} \;
/usr/bin/find /data01/ -mtime +3 -type f -regex ".*\/gsan\.log\.[0-9][0-9].*" -exec /bin/rm -f {} \;
/usr/bin/find /data02/ -mtime +3 -type f -regex ".*\/gsan\.log\.[0-9][0-9].*" -exec /bin/ls -lh {} \;
/usr/bin/find /data02/ -mtime +3 -type f -regex ".*\/gsan\.log\.[0-9][0-9].*" -exec /bin/rm -f {} \;
/usr/bin/find /data03/ -mtime +3 -type f -regex ".*\/gsan\.log\.[0-9][0-9].*" -exec /bin/ls -lh {} \;
/usr/bin/find /data03/ -mtime +3 -type f -regex ".*\/gsan\.log\.[0-9][0-9].*" -exec /bin/rm -f {} \;
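For what it's worth, the same 3-day retention could be expressed as a single loop over the data partitions - a sketch only, assuming the same /data01-/data03 layout as above:

#!/bin/bash
# Sketch: same retention as the script above, but looping over the partitions.
# For each gsan.log.NNN older than 3 days, list it first (so the cron mail
# still records what was removed) and then delete it.
for dir in /data01 /data02 /data03; do
    /usr/bin/find "$dir" -mtime +3 -type f -regex ".*\/gsan\.log\.[0-9][0-9].*" \
        -exec /bin/ls -lh {} \; -exec /bin/rm -f {} \;
done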

[harry@srv-rhsoft:/downloads]$ cat remove-gsan-logs.log | wc -l
280


Part of the delete log:

-r--r----- 4 admin admin 25M Dec 29 00:33 /data01/cur/gsan.log.014
-r--r----- 4 admin admin 25M Dec 28 12:53 /data01/cur/gsan.log.015
-r--r----- 4 admin admin 25M Dec 28 03:49 /data01/cur/gsan.log.016
-r--r----- 4 admin admin 25M Dec 27 20:29 /data01/cur/gsan.log.017
-r--r----- 4 admin admin 25M Dec 27 08:51 /data01/cur/gsan.log.018
-r--r----- 4 admin admin 25M Dec 27 00:36 /data01/cur/gsan.log.019
-r--r----- 4 admin admin 25M Dec 26 12:59 /data01/cur/gsan.log.020
-r--r----- 4 admin admin 25M Dec 26 03:53 /data01/cur/gsan.log.021
-r--r----- 4 admin admin 25M Dec 25 17:10 /data01/cur/gsan.log.022
-r--r----- 4 admin admin 25M Dec 25 07:05 /data01/cur/gsan.log.023
-r--r----- 4 admin admin 25M Dec 24 21:17 /data01/cur/gsan.log.024
-r--r----- 4 admin admin 25M Dec 24 09:42 /data01/cur/gsan.log.025
-r--r----- 4 admin admin 25M Dec 24 01:20 /data01/cur/gsan.log.026
-r--r----- 4 admin admin 25M Dec 23 14:07 /data01/cur/gsan.log.027
-r--r----- 4 admin admin 25M Dec 23 05:00 /data01/cur/gsan.log.028
-r--r----- 4 admin admin 25M Dec 22 18:36 /data01/cur/gsan.log.029
-r--r----- 4 admin admin 25M Dec 22 07:53 /data01/cur/gsan.log.030
-r--r----- 4 admin admin 25M Dec 21 23:45 /data01/cur/gsan.log.031
-r--r----- 4 admin admin 25M Dec 21 11:05 /data01/cur/gsan.log.032
-r--r----- 4 admin admin 25M Dec 21 01:34 /data01/cur/gsan.log.033
......................

[root@vmware-recovery:/data01/cp.20180101051643]$ tail -n 10 gsan.log.010

2017/12/29-07:59:26.78420 {0.0} [srvm-14292621#srv:2729]  servloop::acctrequest login username=root

2017/12/29-07:59:26.78507 {0.0} [srvm-14292621#srv:104]  syslog: msg=[/usr/local/avamar/bin/avmaint --flagfile=/usr/local/avamar/etc/usersettings.cfg --password=**************** --vardir=/usr/local/avamar/var --server=vmware-recovery --id=root --bindir=/usr/local/avamar/bin getclientmsgs --server=vmware-recovery.thelounge.net --hfsport=27000 --conntimeout=120 --timeout=120 --vardir=/usr/local/avamar/var --startoffset=22188496] host=[vmware-recovery] time=[Fri Dec 29 08:59:26 2017 CEST] build=[version=7.2.80-133 date=Jul 13 2017 08:07:49 msg=13-10 SSL=TLSv1   OpenSSL 1.0.2a-fips 19 Mar 2015 Zlib=1.2.7 LZO=1.08 Jul 12 2002 platform=Linux OS=SLES-64 Processor=x86_64]

2017/12/29-07:59:26.78872 {0.0} [srvm-14292621#0x8ea22b0 clientaddr=192.168.196.114:57533 ismaint=1 type=avmaint access=uname=root uid=0 priv=enabled,create,read,backup,access,move,delete,maint,manage,fullmanage,noticketrequired,superuser,ignoreacls,readdir,mclogin,opt1,opt2 avail=modes=00pu:3179]  servmain::dispatcherloop end

2017/12/29-07:59:26.78882 {0.0} [srvm-14292621#0x8ea22b0 clientaddr=192.168.196.114:57533 ismaint=1 type=avmaint access=uname=root uid=0 priv=enabled,create,read,backup,access,move,delete,maint,manage,fullmanage,noticketrequired,superuser,ignoreacls,readdir,mclogin,opt1,opt2 avail=modes=00pu:3179]  servmain::execute end

2017/12/29-07:59:27.03422 {0.0} [acpt0-27000-192.168.196.114]  Socket buffer size (recv) requested 131072, was set to 131072

2017/12/29-07:59:27.03425 {0.0} [acpt0-27000-192.168.196.114]  Socket buffer size (send) requested 131072, was set to 131072

2017/12/29-07:59:27.03457 {0.0} [srvm-14292637#0x8ea22b0 clientaddr=192.168.196.114:51043 ismaint=1 type=unknown access=<noaccess> avail=<noavail>:3358]  servmain::execute start

2017/12/29-07:59:27.03459 {0.0} [srvm-14292637#0x8ea22b0 clientaddr=192.168.196.114:51043 ismaint=1 type=unknown access=<noaccess> avail=modes=0000:3358]  servmain::dispatcherloop start

2017/12/29-07:59:27.03484 {0.0} [srvm-14292637#srv:078]  servloop::handle_getserverconnection 192.168.196.114:27000 -> 192.168.196.114:27000

2017/12/29-07:59:27.03666 {0.0} [srvm-14292637#srv:872]  servloop::acctrequest login username=root

1 Reply
admin
Immortal

You can try this: VMware Knowledge Base

Regards,

Randhir
