VMware Cloud Community
jimbob201110141
Contributor

ESXi 5.1 Dell OpenManage / Lockdown mode

This is probably a really basic question, but I am new to this aspect of ESXi and cannot seem to find an answer.

I am setting up a server (PowerEdge R515) running ESXi 5.1 (free license). I access it using the vSphere Client and do not have vCenter.

To configure the hardware (e.g., set up RAID), I installed the "Dell OpenManage Offline Bundle and VIB for ESXi" on the ESXi host. I use the Dell OpenManage Server Administrator managed node to connect.

Initially, this all works great. However, after some time (maybe a day), I see a message (attached) that the OpenManage login has failed because "Lockdown mode is enabled in the managed node." I have never changed the Lockdown mode settings, and they are disabled in the ESXi web UI. Also, in the vSphere Client, under Configuration > Security Profile, no options appear concerning Lockdown mode.

If I reboot, I am once again able to connect with OpenManage, but I get the same login error after a day or so.

I must be missing something basic here.  Any ideas?  Thanks much!


10 Replies
jagars
Contributor

I'm having the same problem. Does anyone have a solution to this?

adriancomsa
Contributor

Hi,

I am having the same issue after upgrading to ESXi 5.1 and installing OMSA through Update Manager on a Dell R710 host.

Is there any way the plugin enables Lockdown mode without the check showing it as enabled?

We need a resolution fast, as I have around 50 R710 boxes. Should I expect the same issue on all of them?

Thanks,

Adrian

vivari
Enthusiast

Lockdown mode should be disabled first. To disable it, follow the steps below; any one of these options is enough.

To configure Lockdown mode from the DCUI:

1. Log directly into the ESXi host.
2. Open the DCUI on the host.
3. Press F2 for Initial Setup.
4. Toggle the Configure Lockdown Mode setting.

Enabling or disabling Lockdown mode using the ESXi Shell:

You can run these commands from the ESXi Shell (or the vSphere CLI) to verify the status of Lockdown mode and to enable or disable it.

ESXi 5.x and 4.1:

To check if Lockdown mode is enabled: vim-cmd -U dcui vimsvc/auth/lockdown_is_enabled
To disable Lockdown mode: vim-cmd -U dcui vimsvc/auth/lockdown_mode_exit
To enable Lockdown mode: vim-cmd -U dcui vimsvc/auth/lockdown_mode_enter

Enabling or disabling Lockdown mode using PowerCLI:

To enable Lockdown mode, run:
(Get-VMHost <hostname> | Get-View).EnterLockdownMode()

To verify the setting, run:
Get-VMHost | Select Name,@{N="LockDown";E={$_.Extensiondata.Config.adminDisabled}} | ft -auto

To disable Lockdown mode, run:
(Get-VMHost <hostname> | Get-View).ExitLockdownMode()
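The ESXi Shell commands above can be chained into a quick check-and-clear sketch (run over SSH on the host; this only combines the same vim-cmd calls already listed):

```shell
# Clear Lockdown mode only if the host actually reports it as enabled.
if [ "$(vim-cmd -U dcui vimsvc/auth/lockdown_is_enabled)" = "true" ]; then
    vim-cmd -U dcui vimsvc/auth/lockdown_mode_exit
fi
```

These vim-cmd calls only exist on an ESXi host, so run this on the host itself, not a management workstation.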
adriancomsa
Contributor

Hi Vivari,

Thanks for your message.

In the ESXi Shell, Lockdown mode is not enabled.

Could it be configured but the check not show it?

Thanks.

Later EDIT:

Please see below the output of the check for whether it is enabled:

~ #  vim-cmd -U dcui vimsvc/auth/lockdown_is_enabled
false

The message is clear: lockdown is not enabled.

Still, OpenManage sees it as enabled.

Any suggestions?

Thanks,

Adrian

adriancomsa
Contributor

Hi,

The OMSA agents started working correctly after restarting the management agents on the affected host.

Cheers,

Adrian
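For anyone searching later: "restarting the management agents" can be done from the DCUI (Troubleshooting Options > Restart Management Agents) or over SSH. A minimal sketch of the SSH route on ESXi 5.x, using the stock init scripts (hostd is the vSphere API daemon that OMSA ultimately talks through):

```shell
# Restart the core ESXi 5.x management agents over SSH.
/etc/init.d/hostd restart    # vSphere API daemon
/etc/init.d/vpxa restart     # vCenter agent (skip if the host is not managed by vCenter)

# Or restart all management services in one go:
/sbin/services.sh restart
```

Note that either route briefly drops vSphere Client connections to the host; running VMs are not affected.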

jagars
Contributor

Restarting the management agents fixed it for me as well. Thank you.

adriancomsa
Contributor

It seems that this is not solving the issue for good, as after a few hours the message is back.

Still looking for an answer to this.

jimbob201110141
Contributor

I was having the same intermittent problem when I posted this question. After the last reboot of the entire system (not just the management agents) about two weeks ago, however, the problem has not yet returned. Unfortunately, I don't have a real solution for you ...

SilvioSantoZ
Contributor

Exactly the same problem here.

Confirmed on three R710 boxes, both with and without H800 and MD1220 storage.

One of them also has an MD1000 DAS, and disk performance is very bad (on VMware 4.1 U3 it was good/acceptable).

Exactly the same VMware ESXi 5.1 + OMSA setup has been working flawlessly on an R720xd for a few weeks now.

I also have a T410 which seems to work (not confirmed, since it's too recent), but it does not have a PERC card; it has the el-cheapo one whose name I can't remember.

Somehow this all seems to be related to the PERC.

Jason510
Contributor

I've been having the same issue for months now across several 5.1 and 5.1 U1 systems. They include R510, R710, and R720 servers. They all have internal PERC controllers with direct-attached storage. I found a workaround at the Dell TechCenter forums:

http://en.community.dell.com/techcenter/systems-management/f/4494/p/19491101/20322503.aspx#20322503

SSH into your ESXi host and run:

/sbin/services.sh restart

This allowed me to use OpenManage Server Administrator across all of the hosts.
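Worth noting: services.sh restarts every management service, which briefly drops vSphere Client connections. OMSA itself talks to the host through the CIM broker (sfcbd), so restarting just that service may be enough; this is an assumption on my part, since the thread only confirms the full restart:

```shell
# Restart only the CIM broker that OpenManage Server Administrator connects to
# (ESXi 5.x init script).
/etc/init.d/sfcbd-watchdog restart

# Verify it came back up:
/etc/init.d/sfcbd-watchdog status
```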
