This is probably a really basic question, but I am new to this aspect of esxi, and cannot seem to find an answer.
I am setting up a server (a PowerEdge R515) running ESXi 5.1 (free license). I access it using the vSphere Client and do not have vCenter.
To configure the hardware (e.g., set up RAID), I installed the "Dell OpenManage Offline Bundle and VIB for ESXi" on the ESXi host, and I connect to it using Dell OpenManage Server Administrator (Managed Node).
Initially, this all works great. However, after some time (maybe a day), I see a message (attached) saying the OpenManage login has failed because "Lockdown mode is enabled in the managed node." I have never changed the Lockdown mode settings, and they show as disabled in the ESXi web UI. Also, in the vSphere Client, under Configuration > Security Profile, no options appear concerning Lockdown mode.
If I reboot, I am once again able to connect with OpenManage, but I get the same login error after a day or so.
I must be missing something basic here. Any ideas? Thanks much!
I'm having the same problem. Does anyone have a solution to this?
Hi,
I am having the same issue after upgrading to ESXi 5.1 and installing OMSA through Update Manager on a Dell R710 host.
Is there any way the plugin enables Lockdown mode without the check reflecting it?
We need a resolution fast, as I have around 50 R710 boxes. Should I expect the same issue on all of them?
Thanks,
Adrian
Hi Vivari,
Thanks for your message.
In the ESXi Shell, Lockdown mode is not enabled.
Could it be configured somewhere but not reported by the check?
Thanks.
Later EDIT:
Please see below the output of the check for whether Lockdown mode is enabled:
~ # vim-cmd -U dcui vimsvc/auth/lockdown_is_enabled
false
The output is clear: Lockdown mode is not enabled.
Still, OpenManage sees it as enabled.
Any suggestions?
Thanks,
Adrian
Hi,
The OMSA agents started working correctly after restarting the management agents on the affected host.
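For reference, on ESXi 5.x the management agents can be restarted over SSH roughly like this (a sketch; the exact init scripts present can vary by build):

```shell
# Restart the ESXi host agent (hostd) and the vCenter agent (vpxa).
# These init scripts exist on ESXi 5.x; other builds may differ.
/etc/init.d/hostd restart
/etc/init.d/vpxa restart

# Alternatively, restart all management services at once:
/sbin/services.sh restart
```

Either way, running VMs are not affected; only the management services are restarted.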
Cheers,
Adrian
Restarting the management agents fixed it for me as well. Thank you.
It seems this does not solve the issue for good; after a few hours the message is back.
Still looking for an answer to this.
I was having the same intermittent problem when I posted this question. After the last reboot of the entire system (not just the management agent) about two weeks ago, however, the problem has not yet returned. Unfortunately, I don't have a real solution for you ...
Exactly the same problem here.
Confirmed on three R710 boxes, with and without H800 and MD1220 storage.
One of them also has an MD1000 DAS, and its disk performance is very poor (it was good/acceptable on VMware ESXi 4.1U3).
Exactly the same VMware ESXi 5.1 + OMSA setup has worked flawlessly on an R720xd for a few weeks now.
I also have a T410 which seems to work (not confirmed, since it's too recent), but it does not have a PERC card; it has the cheap controller whose name I can't remember.
Somehow this all seems to be related to PERC.
I've been having the same issue for months now across several 5.1 and 5.1u1 systems. They include R510, R710, and R720 servers. They all have internal PERC controllers with direct attached storage. I found a workaround at the Dell TechCenter forums:
http://en.community.dell.com/techcenter/systems-management/f/4494/p/19491101/20322503.aspx#20322503
SSH into your ESXi host and run:
/sbin/services.sh restart
This allowed me to use OpenManage Server Administrator across all of the hosts.
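Since the error tends to come back after a few hours, a possible stopgap (just a sketch, not a supported fix) is to schedule that restart with ESXi's busybox crond. Note that crontab edits on ESXi do not survive a reboot unless re-applied, e.g. from /etc/rc.local.d/local.sh:

```shell
# Sketch: restart the management agents every night at 03:00.
# Assumes ESXi 5.x, where the root crontab lives at this path.
echo "0 3 * * * /sbin/services.sh restart" >> /var/spool/cron/crontabs/root

# Restart crond so it picks up the new entry (busybox path on ESXi 5.x).
kill $(cat /var/run/crond.pid)
/usr/lib/vmware/busybox/bin/busybox crond
```

This only papers over the symptom; the underlying cause (OMSA misreading the Lockdown mode state) remains.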