VirtualCenter 2.5 connecting to remote ESX 3.5 servers

Yahkin (Contributor):

Ok, here's the setup.

I have one ESX 3.5 server located in our colocation facility. It sits behind a NAT firewall. The management IP address is 192.168.124.100, and the VMkernel IP address is 192.168.124.101. The VMkernel IP is used to connect to our NFS server at 192.168.124.9.

I have our VirtualCenter server here at my location. It also sits behind a NAT firewall.

I have attempted in vain to connect to the ESX server directly using port forwarding on the firewall. For some reason, ports 80 and 443 are required for VirtualCenter to add the host, and those ports are already in use on the colo firewall for real applications (even though everything I read says only 902 and 903 should be required). So my fallback plan is to use SSH tunneling.

This works... kinda. I have PuTTY installed on the VirtualCenter server and I make the connection to the remote network. I have it set up to forward ports 80, 443, 902, and 903. I then tell VirtualCenter to connect to localhost. Voila, it works! ...for about two minutes, then it gives me a "not responding" message. GRR! So as of right now I am managing the server in two-minute bursts. Quite annoying. It's not a bandwidth issue, as we have 15 Mbps on this end and 100 Mbps at the colo. A console connection stays up just fine; it's just VirtualCenter that returns the "not responding" message. Any ideas?
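For anyone poking at the same setup: a quick way to see whether the tunnels themselves are still alive when VC flips to "not responding" is to probe each forwarded port on localhost. This is just a rough sketch (ports match my PuTTY config; note that the local side of a forward can accept even when the remote end is dead, so success is necessary but not sufficient):

```python
# Rough sketch: probe each SSH-forwarded port on localhost with a
# plain TCP connect. Ports match the PuTTY tunnel config above.
import socket

for port in (80, 443, 902, 903):
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.settimeout(5)                         # don't hang on a dead tunnel
    rc = s.connect_ex(("127.0.0.1", port))  # returns 0 on success
    if rc == 0:
        print("port %d: tunnel endpoint accepting connections" % port)
    else:
        print("port %d: connect failed (errno %d)" % (port, rc))
    s.close()
```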

Problem 2: When I sent the server out there, I had one running VM and one template. When the server arrived, I connected to the NFS share where I want to keep my VMs and tried to create a VM from the template. This returns an "Unable to Connect" message. It sees the NFS share just fine, but I cannot move, deploy, or do anything on the NFS mount. I read that port 2049 is needed to the VMkernel IP, so I tried to tunnel that as well... but I'm guessing it's trying to go to the 192.168.124.101 address directly from VirtualCenter. That is of course not going to work. Any tips on making this work?
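Since the datastore traffic has to originate on the ESX side anyway, a first sanity check is whether the NFS server even answers on TCP 2049 from the host. A minimal sketch, run from the ESX service console (this only exercises the console NIC; the datastore traffic itself rides the VMkernel interface, so "vmkping 192.168.124.9" is the authoritative test of that path):

```python
# Minimal sketch: from the ESX service console, check whether the NFS
# server answers on TCP 2049. This tests the service console NIC; the
# datastore traffic actually uses the VMkernel interface, so also run
# "vmkping 192.168.124.9" to test that path.
import socket

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.settimeout(5)
rc = s.connect_ex(("192.168.124.9", 2049))  # standard NFS port
if rc == 0:
    print("NFS server reachable on TCP 2049")
else:
    print("cannot reach NFS server (errno %d)" % rc)
s.close()
```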

I'm seriously starting to think that VirtualCenter only works if it's on the same damn network as the servers. Since the colo is entirely Linux, this would be a PITA. HELP!

Accepted solution from admin (Immortal):

Regarding problem #1: the VC agent on the ESX host (vpxa) sends UDP heartbeats back to the VC server on port 902. If the VC server does not receive any heartbeats from the host, it will mark it as not responding.
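If you want to confirm the heartbeats are actually making it through the firewalls, something like the sketch below will show incoming datagrams. Stop the VC service first so UDP 902 is free to bind; this is just a diagnostic sketch, not anything official:

```python
# Diagnostic sketch: listen for vpxa heartbeat datagrams on UDP 902.
# Run on the VirtualCenter server with the VC service stopped,
# otherwise the port is already bound by VC itself.
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 902))           # all interfaces, UDP 902
print("waiting for heartbeats on UDP 902...")
while True:
    data, addr = sock.recvfrom(4096)  # heartbeats are small datagrams
    print("got %d bytes from %s:%d" % (len(data), addr[0], addr[1]))
```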

Yahkin (Contributor):

Hmm, well that rules out plain SSH tunnels, since SSH port forwarding only carries TCP. So let's try the direct approach then:

To access it directly, I will need to forward TCP ports 902 and 903 from VirtualCenter to the ESX host, and UDP 902 from the ESX host back to VirtualCenter. Are there any other ports I will need to make this work?

Thanks.

Yahkin (Contributor):

OK, the UDP heartbeat info has solved problem 1. I am still able to use the SSH tunnel; I just needed to forward UDP 902 back to the VirtualCenter server through the firewalls. Worked like a charm. I'm marking this as answered and will start a new thread for the NFS issue.
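For anyone else setting this up: before trusting vpxa itself, you can test the reverse UDP path with a throwaway datagram from the ESX service console toward the public address on the VC side that gets forwarded to UDP 902, paired with a listener like the sketch in the accepted answer. The address below is a placeholder; substitute your own:

```python
# Throwaway sketch: send one datagram along the heartbeat path, from
# the ESX service console toward the public address that the VC-side
# firewall forwards to the VirtualCenter server on UDP 902.
# 203.0.113.10 is a placeholder; substitute your forwarded address.
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(b"heartbeat-path-test", ("203.0.113.10", 902))
sock.close()
print("test datagram sent; check the listener on the VC side")
```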

Thanks!
