VMware Cloud Community
NePet
Contributor

vpxd-svcs logs full of tomcat-exec "VMware vim-java 1.0" POST /invsvc/vmomi/sdk/ HTTP/1.1 entries

Hello

I have the classic error "The /storage/log filesystem is out of disk space or inodes"
on a VCSA appliance, version 6.5.0.33200.

So I checked the log folder and found 6.5 GB of log files in /storage/log/vmware/vpxd-svcs/,
with the vpxd-svcs-access-<date> logs full of entries like this:

[25/Apr/2021:21:55:34 +0000] 127.0.0.1 488 tomcat-exec-23 200 "VMware vim-java 1.0" POST /invsvc/vmomi/sdk/ HTTP/1.1
[25/Apr/2021:21:55:34 +0000] 127.0.0.1 6494 tomcat-exec-221 200 "VMware vim-java 1.0" POST /invsvc/vmomi/sdk/ HTTP/1.1
[25/Apr/2021:21:55:34 +0000] 127.0.0.1 6482 tomcat-exec-3 200 "VMware vim-java 1.0" POST /invsvc/vmomi/sdk/ HTTP/1.1
[25/Apr/2021:21:55:34 +0000] 127.0.0.1 6480 tomcat-exec-2 200 "VMware vim-java 1.0" POST /invsvc/vmomi/sdk/ HTTP/1.1
[25/Apr/2021:21:55:34 +0000] 127.0.0.1 490 tomcat-exec-2 200 "VMware vim-java 1.0" POST /invsvc/vmomi/sdk/ HTTP/1.1
[25/Apr/2021:21:55:34 +0000] 127.0.0.1 6472 tomcat-exec-228 200 "VMware vim-java 1.0" POST /invsvc/vmomi/sdk/ HTTP/1.1
[25/Apr/2021:21:55:34 +0000] 127.0.0.1 488 tomcat-exec-70 200 "VMware vim-java 1.0" POST /invsvc/vmomi/sdk/ HTTP/1.1
[25/Apr/2021:21:55:34 +0000] 127.0.0.1 6482 tomcat-exec-269 200 "VMware vim-java 1.0" POST /invsvc/vmomi/sdk

I did not find any tips on this. I would appreciate help clarifying what it is. Thank you.
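For anyone hitting the same thing, the space checks boil down to something like the sketch below. The /storage/log paths assume the stock VCSA layout; the 100 MB threshold is just an example cutoff.

```shell
# Which partition is full?
df -h /storage/log 2>/dev/null

LOGROOT=${LOGROOT:-/storage/log}   # stock VCSA log partition (assumption)
if [ -d "$LOGROOT" ]; then
  # Largest log directories under /storage/log/vmware
  du -sh "$LOGROOT"/vmware/* 2>/dev/null | sort -rh | head -20
  # Individual files over 100 MB (example threshold)
  find "$LOGROOT" -xdev -type f -size +100M 2>/dev/null | head -20
fi
```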


5 Replies
Ajay1988
VMware Employee

I don't think that's an issue; those entries are expected in these log files.

Can you first share the output of "df -h", and of "du -shc *" under /storage/log and /storage/log/vmware/vpxd-svcs?

If you think your queries have been answered
Mark this response as "Correct" or "Helpful".

Regards,
AJ
NePet
Contributor

Hi AJ,

---
Filesystem Size Used Avail Use% Mounted on
devtmpfs 4.9G 0 4.9G 0% /dev
tmpfs 4.9G 16K 4.9G 1% /dev/shm
tmpfs 4.9G 684K 4.9G 1% /run
tmpfs 4.9G 0 4.9G 0% /sys/fs/cgroup
/dev/sda3 11G 5.9G 4.2G 59% /
tmpfs 4.9G 1.3M 4.9G 1% /tmp
/dev/mapper/netdump_vg-netdump 985M 1.3M 932M 1% /storage/netdump
/dev/mapper/imagebuilder_vg-imagebuilder 9.8G 23M 9.2G 1% /storage/imagebuilder
/dev/sda1 120M 35M 80M 31% /boot
/dev/mapper/seat_vg-seat 9.8G 175M 9.1G 2% /storage/seat
/dev/mapper/autodeploy_vg-autodeploy 9.8G 25M 9.2G 1% /storage/autodeploy
/dev/mapper/dblog_vg-dblog 15G 1.7G 13G 12% /storage/dblog
/dev/mapper/core_vg-core 25G 45M 24G 1% /storage/core
/dev/mapper/db_vg-db 9.8G 218M 9.0G 3% /storage/db
/dev/mapper/updatemgr_vg-updatemgr 99G 1.7G 92G 2% /storage/updatemgr
/dev/mapper/log_vg-log 9.8G 9.8G 0 100% /storage/log

----
root@vcsa [ /var/log ]# du -sh *
120K audit
0 auth.log
0 btmp
4.0K btmp-20210501
724K cloud-init.log
0 cm
0 cron
720K devicelist
8.0K fbStatusInt.json
1.6M firstboot
0 installer.log
4.0K invoked_procs
1.3G journal
4.0K lastlog
4.0K lastlog.1
4.0K lastlog.16.gz
0 messages
176K messages.1
20K messages.10.gz
20K messages.11.gz
20K messages.12.gz
20K messages.13.gz
20K messages.14.gz
20K messages.15.gz
20K messages.16.gz
20K messages.2.gz
20K messages.3.gz
20K messages.4.gz
20K messages.5.gz
20K messages.6.gz
20K messages.7.gz
20K messages.8.gz
20K messages.9.gz
4.0K postthaw
4.0K prefreeze
5.0M procstate
1.8M procstate-20190226.gz
1.8M procstate-20190227.gz
1.8M procstate-20190228.gz
1.8M procstate-20190301.gz
1.8M procstate-20190302.gz
1.9M procstate-20190303.gz
1.8M procstate-20190304.gz
0 remote
8.0K restore
27M rpmcheck
28K sa
720K sgidlist
84K stigreport.log
720K suidlist
4.0K tallylog
4.0K tallylog.1
4.0K tallylog.16.gz
24K tmp-cmd.out
0 vmware
4.0K vmware-network.1.log
4.0K vmware-network.2.log
4.0K vmware-network.3.log
4.0K vmware-network.4.log
4.0K vmware-network.5.log
4.0K vmware-network.6.log
4.0K vmware-network.log
28K vmware-vgauthsvc.log.0
60K vmware-vmsvc.log
28K wtmp
4.0K YaST2

---
root@vcsa [ /storage/log/vmware/vpxd-svcs ]# du -sh *
116K authz-event.log
140K authz-ldapdump.ldif
0 startup-error.log
4.0K tagging-ldapdump.ldif
20K vmware-vpxd-svcs-gc.log.0
8.0K vmware-vpxd-svcs-gc.log.1
8.0K vmware-vpxd-svcs-gc.log.2
4.0K vmware-vpxd-svcs-gc.log.3
12K vmware-vpxd-svcs-gc.log.4
0 vmware-vpxd-svcs-gc.log.5
8.0K vmware-vpxd-svcs-gc.log.6
12K vmware-vpxd-svcs-gc.log.7
0 vmware-vpxd-svcs-gc.log.8
0 vmware-vpxd-svcs-gc.log.9.current
115M vpxd-svcs-access-.2021-04-08.log.gz
115M vpxd-svcs-access-.2021-04-09.log.gz
118M vpxd-svcs-access-.2021-04-10.log.gz
118M vpxd-svcs-access-.2021-04-11.log.gz
116M vpxd-svcs-access-.2021-04-12.log.gz
116M vpxd-svcs-access-.2021-04-13.log.gz
116M vpxd-svcs-access-.2021-04-14.log.gz
116M vpxd-svcs-access-.2021-04-15.log.gz
118M vpxd-svcs-access-.2021-04-16.log.gz
125M vpxd-svcs-access-.2021-04-17.log.gz
125M vpxd-svcs-access-.2021-04-18.log.gz
123M vpxd-svcs-access-.2021-04-19.log.gz
123M vpxd-svcs-access-.2021-04-20.log.gz
123M vpxd-svcs-access-.2021-04-21.log.gz
123M vpxd-svcs-access-.2021-04-22.log.gz
121M vpxd-svcs-access-.2021-04-23.log.gz
125M vpxd-svcs-access-.2021-04-24.log.gz
125M vpxd-svcs-access-.2021-04-25.log.gz
123M vpxd-svcs-access-.2021-04-26.log.gz
123M vpxd-svcs-access-.2021-04-27.log.gz
123M vpxd-svcs-access-.2021-04-28.log.gz
123M vpxd-svcs-access-.2021-04-29.log.gz
123M vpxd-svcs-access-.2021-04-30.log.gz
123M vpxd-svcs-access-.2021-05-01.log.gz
120M vpxd-svcs-access-.2021-05-02.log.gz
122M vpxd-svcs-access-.2021-05-03.log.gz
122M vpxd-svcs-access-.2021-05-04.log.gz
3.4G vpxd-svcs-access-.2021-05-05.log
85M vpxd-svcs-access-.2021-05-05.log.gz
187M vpxd-svcs-access-.2021-05-06.log
0 vpxd-svcs-access-.2021-05-06.log.gz
408K vpxd-svcs-access-.2021-05-07.log.gz
464K vpxd-svcs-access-.2021-05-08.log.gz
684K vpxd-svcs-access-.2021-05-09.log
0 vpxd-svcs-access-.2021-05-09.log.gz
132K vpxd-svcs-access-.2021-05-10.log
0 vpxd-svcs-access-.2021-05-10.log.gz
24K vpxd-svcs-access-.2021-05-11.log.gz
36K vpxd-svcs-access-.2021-05-12.log.gz
32K vpxd-svcs-access-.2021-05-13.log.gz
32K vpxd-svcs-access-.2021-05-14.log.gz
28K vpxd-svcs-access-.2021-05-15.log.gz
2.2M vpxd-svcs-access-.2021-05-16.log
0 vpxd-svcs-access-.2021-05-16.log.gz
212K vpxd-svcs-access-.2021-05-17.log
0 vpxd-svcs-access-.2021-05-17.log.gz
1.3M vpxd-svcs-access-.2021-05-18.log
0 vpxd-svcs-access-.2021-05-18.log.gz
1.9M vpxd-svcs-access-.2021-05-19.log
0 vpxd-svcs-access-.2021-05-19.log.gz
40K vpxd-svcs-access-.2021-05-20.log
368K vpxd-svcs.log
112K vpxd-svcs.log.10.gz
112K vpxd-svcs.log.11.gz
112K vpxd-svcs.log.12.gz
112K vpxd-svcs.log.13.gz
108K vpxd-svcs.log.14.gz
112K vpxd-svcs.log.15.gz
112K vpxd-svcs.log.16.gz
116K vpxd-svcs.log.17.gz
112K vpxd-svcs.log.18.gz
112K vpxd-svcs.log.19.gz
88K vpxd-svcs.log.1.gz
112K vpxd-svcs.log.20.gz
112K vpxd-svcs.log.21.gz
112K vpxd-svcs.log.22.gz
112K vpxd-svcs.log.23.gz
112K vpxd-svcs.log.24.gz
112K vpxd-svcs.log.25.gz
112K vpxd-svcs.log.26.gz
112K vpxd-svcs.log.27.gz
112K vpxd-svcs.log.28.gz
108K vpxd-svcs.log.29.gz
112K vpxd-svcs.log.2.gz
112K vpxd-svcs.log.30.gz
112K vpxd-svcs.log.31.gz
112K vpxd-svcs.log.32.gz
112K vpxd-svcs.log.33.gz
112K vpxd-svcs.log.34.gz
112K vpxd-svcs.log.35.gz
112K vpxd-svcs.log.36.gz
108K vpxd-svcs.log.37.gz
112K vpxd-svcs.log.38.gz
112K vpxd-svcs.log.39.gz
112K vpxd-svcs.log.3.gz
112K vpxd-svcs.log.40.gz
112K vpxd-svcs.log.41.gz
112K vpxd-svcs.log.42.gz
112K vpxd-svcs.log.43.gz
108K vpxd-svcs.log.44.gz
112K vpxd-svcs.log.45.gz
112K vpxd-svcs.log.46.gz
112K vpxd-svcs.log.47.gz
112K vpxd-svcs.log.48.gz
112K vpxd-svcs.log.49.gz
112K vpxd-svcs.log.4.gz
112K vpxd-svcs.log.50.gz
112K vpxd-svcs.log.5.gz
112K vpxd-svcs.log.6.gz
108K vpxd-svcs.log.7.gz
108K vpxd-svcs.log.8.gz
112K vpxd-svcs.log.9.gz
4.0K vpxd-svcs-runtime.log-0.stderr
4.0K vpxd-svcs-runtime.log-0.stdout
4.0K vpxd-svcs-runtime.log-1.stderr
4.0K vpxd-svcs-runtime.log-1.stdout
4.0K vpxd-svcs-runtime.log-2.stderr
4.0K vpxd-svcs-runtime.log-2.stdout
4.0K vpxd-svcs-runtime.log-3.stderr
4.0K vpxd-svcs-runtime.log-3.stdout
4.0K vpxd-svcs-runtime.log-4.stderr
4.0K vpxd-svcs-runtime.log-4.stdout
4.0K vpxd-svcs-runtime.log.stderr
4.0K vpxd-svcs-runtime.log.stdout

 

wila
Immortal

Moderator note: FYI fished your reply out of the spam queue and removed the duplicate post.

--
Wil

| Author of Vimalin. The virtual machine Backup app for VMware Fusion, VMware Workstation and Player |
| More info at vimalin.com | Twitter @wilva
Ajay1988
VMware Employee

It looks like vpxd-svcs-access-.2021-05-05.log did not roll over.

cat /dev/null > vpxd-svcs-access-.2021-05-05.log

Also clean up the files inside the journal directory; those are not needed.

Also check whether vpxd-svcs.log contains any repetitive errors.
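A sketch of that cleanup, assuming the paths shown above. Truncating with cat /dev/null rather than deleting the file keeps the service's open file handle valid, so the space is actually freed without a restart. The 500M vacuum size and the "error" grep pattern are example choices, not VMware-prescribed values.

```shell
# Truncate the oversized access log in place (keeps the open file handle valid)
LOG=/storage/log/vmware/vpxd-svcs/vpxd-svcs-access-.2021-05-05.log
if [ -f "$LOG" ]; then
  cat /dev/null > "$LOG"
fi

# Reclaim journal space, if journalctl is available (example size cap)
command -v journalctl >/dev/null 2>&1 && journalctl --vacuum-size=500M

# Surface repetitive errors in the main service log
SVCLOG=/storage/log/vmware/vpxd-svcs/vpxd-svcs.log
if [ -f "$SVCLOG" ]; then
  grep -i error "$SVCLOG" | sort | uniq -c | sort -rn | head
fi
```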


Regards,
AJ
zik
Enthusiast

It may be expected, but is it useful?

If it is intended for debugging, I would expect a flag to turn it off and log only errors.
