VMware Cloud Community
grob115
Enthusiast

VMs not powered up upon host startup as expected

Hi, a question on delayed startup of VMs on an ESXi 4.0 host.  Despite having set up the host to automatically start the guest VMs when the host boots, all I saw is that the Event log marked the VMs as starting.  The actual VMs, when clicked on, still showed the red stop button, and until I pressed the green play button to start them up, they remained in this "starting" state.

Could this be because I put the host into Maintenance Mode before I powered the server down?  Attached is a picture of the issue.

7 Replies
a_p_
Leadership

While a host is in Maintenance Mode, some of its functions are deactivated. One of those functions is powering on VMs on the host. I assume you pressed the play button after exiting Maintenance Mode!?

André

grob115
Enthusiast

Hello, thanks.  I exited Maintenance Mode and didn't see any activity, so I manually pressed the Play button.  What's the point of entering Maintenance Mode then?

a_p_
Leadership

Maintenance Mode makes sure the administrator controls the host during e.g. an upgrade, and can define the point in time when the upgrade is done by exiting Maintenance Mode. It is usually used in clustered environments to automatically migrate/vMotion VMs off a host before working on it. DRS and other features also respect Maintenance Mode and will not start or migrate any VMs on/to this host.

André

grob115
Enthusiast

Can you tell me how I can also schedule the exit of Maintenance mode?

a_p_
Leadership

grob115 wrote:

Can you tell me how I can also schedule the exit of Maintenance mode?

You can't (except by scripting).
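The scripting route mentioned above could, for example, run on the host itself. The following is only a sketch, assuming local shell access (Tech Support Mode/SSH) is enabled on the ESXi host; the crontab path shown is that of the busybox crond shipped with ESXi.

```shell
# Sketch only -- assumes shell access to the ESXi host is enabled.
# Exit Maintenance Mode immediately from the host shell:
vim-cmd hostsvc/maintenance_mode_exit

# To schedule the exit, a line like the following could be appended to the
# busybox crontab (/var/spool/cron/crontabs/root), e.g. exit at 03:30 daily:
# 30 3 * * *  vim-cmd hostsvc/maintenance_mode_exit
```

Note that such a crontab edit is not persistent across reboots by default, so this is more of a workaround than a supported scheduling feature.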

What was the reason you entered Maintenance Mode? If you shut down/reboot a host without entering Maintenance Mode, you'll receive a warning message. However, you don't have to enter Maintenance Mode if you don't want/need it.

André

Alceryes
Enthusiast

Yeah, in my test environment I haven't found a good reason to use Maintenance Mode either. I have all my guests shut down and start up on a schedule whenever I have to power cycle my host, and I get the message every time. You can safely ignore that warning message.

(Maybe VMware should change the wording a bit so it doesn't sound like you're doing something wrong?)  :smileyconfused:

Dracolith
Enthusiast

Alceryes wrote:

Yeah, in my test environment, I haven't found a good reason to use maintenance mode either. I have all my guests shutdown and startup on a schedule whenever I have to power cycle my host and get the message every time. You can safely ignore that warning message.

From the screenshot, it looks like you're connected directly to the host.

Normally, when using Maintenance Mode, you would log in to vCenter, tell the host to enter Maintenance Mode with the production VMs still running, and vMotion them to a different host.

In fact, a host cannot enter Maintenance Mode until all VMs are migrated off (or suspended/powered down).

After the VMs are migrated to a different host, that other host is responsible for starting them on boot.

Once a host is in Maintenance Mode, it no longer runs VMs, so VMs cannot automatically start on boot.

If you had configured the host to auto-start VMs on boot, that will not work until you are finished with Maintenance Mode. As soon as you exit Maintenance Mode, you can start VMs again.

So yeah, everything you described is the documented, expected behavior...
