I have no idea how to explain this other than just saying what I have and what I'm trying to do.
So I have a Synology 1513+ NAS with two 4TB enterprise drives. I created an iSCSI target and a 500GB LUN on it.
I've created an iSCSI datastore with the iSCSI adapter. It's linked to the Synology NAS.
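For reference, the host-side wiring is roughly this from the ESXi shell (the adapter name and NAS IP below are placeholders, not my actual values):

```shell
# Enable the software iSCSI adapter and point it at the Synology target.
# vmhba33 and 192.168.1.50 are example values -- substitute your own.
esxcli iscsi software set --enabled=true
esxcli iscsi adapter discovery sendtarget add --adapter=vmhba33 --address=192.168.1.50:3260
# Rescan so the new LUN shows up for datastore creation
esxcli storage core adapter rescan --all
```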
My VM host is an Intel NUC with a Core i5 and 16GB RAM, and it's running hypervisor 5.5, the latest and greatest ISO. If you're not familiar with the Intel NUC, it's a small 4in by 4in by 1in block that's pretty beefy for its size. It's also low power and low cost. The only problem is it only has one NIC.
On the host I created a VM on the iSCSI data store.
I can boot up the VM and everything appears fine, but it's not able to obtain an IP address. Even if I set a static IP I still can't get anywhere.
I've heard of other people doing this, so I didn't think it would be a problem. Is it a problem to have a VM on an iSCSI datastore trying to obtain an IP address over the same NIC that the iSCSI traffic is on? I'm pretty sure I've done this before, so I don't understand why it isn't working now.
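In case it helps anyone check the same thing: sharing one NIC between iSCSI and VM traffic is normally fine, and this is how I'd sanity-check it from the ESXi shell (the NAS IP is a placeholder):

```shell
# Show which port groups and vmkernel ports share the single uplink
esxcfg-vswitch -l
# Check physical NIC link state and speed
esxcfg-nics -l
# Verify the host itself can still reach the NAS over the vmkernel port
# (192.168.1.50 is an example NAS IP)
vmkping 192.168.1.50
```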
Any thoughts?
Much appreciated!
Thanks!
OK, so I've moved the VM off the iSCSI HBA and onto the local mSATA SSD storage. I detached the iSCSI target, deleted the datastore, disabled the iSCSI adapter, and rebooted the host. I deleted the guest VM but saved the VMDK disk file, recreated the VM on the SSD, moved the VMDK from the iSCSI datastore to the SSD, and attached the disk to the new VM (not necessarily in that order). I booted up the VM, running Win2K8 R2 Enterprise from my MSDN subscription, and it still cannot obtain an IP address. I get 169.x.x.x.
With the Intel NUC the NIC driver isn't embedded in the hypervisor install, so I had to inject the driver using the ESXi Customizer. I know of people who got it working on 5.1 U1. It looks like it does not work with 5.5 at all.
The only thing I can think of is that the Intel NIC driver itself is not compatible with VMware ESXi 5.5.
This is what I used: NanoLab – Running VMware vSphere on Intel NUC – Part 2 | tekhead.org
I have a DC53427HYE too, and it works fine with ESXi 5.5...
What switch do you use? A Cisco switch with port security enabled could cause that issue.
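To illustrate what I mean (Catalyst-style CLI; the interface name is just an example): an ESXi host presents one MAC address per VM on top of the host's own, so a port-security limit of one MAC can block the VMs while the host itself still works.

```
! IOS-style CLI -- GigabitEthernet0/1 is an example interface.
! Check whether port security is limiting MACs on the ESXi uplink:
show port-security interface GigabitEthernet0/1

! If so, either raise the MAC limit or disable it on that port:
configure terminal
interface GigabitEthernet0/1
 no switchport port-security
end
```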
Did you update your ESXi to 5.5, or is it a fresh install?
Please provide:
~ # ethtool -i vmnic0
~ # vmware -v
~ # esxcfg-nics -l
~ # esxcfg-vswitch -l
~ # ethtool -i vmnic0
driver: e1000e
version: 1.3.10a-NAPI
firmware-version: 0.15-4
bus-info: 0000:00:19.0
~ # vmware -v
VMware ESXi 5.5.0 build-1439689
~ # esxcfg-nics -l
Name PCI Driver Link Speed Duplex MAC Address MTU Description
vmnic0 0000:00:19.00 e1000e Up 1000Mbps Full ec:a8:6b:f9:9b:3f 1500 Intel Corporation 82579LM Gigabit Network Connection
~ # esxcfg-vswitch -l
Switch Name Num Ports Used Ports Configured Ports MTU Uplinks
vSwitch0 1536 7 128 1500 vmnic0
PortGroup Name VLAN ID Used Ports Uplinks
VM Network 0 2 vmnic0
vSAN 0 1 vmnic0
Management Network 0 1 vmnic0
Hi
Welcome to the communities.
My first suggestion: if it was working fine earlier, check whether you applied any update.
Upgrading the firmware might also help to overcome this.
Take care!
Thanks for your reply I greatly appreciate your time.
I will get that info tomorrow.
I have a Cisco sg300-10 managed switch, Cisco SG300-10 (SRW2008-K9-NA) 10-port Gigabit Managed Switch - Newegg.com.
It's a fresh install of ESXi, using ESXi-Customizer to inject the NIC drivers into the ISO. I have to do that, otherwise VMware says there are no supported NICs installed.
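For anyone else injecting drivers this way, here's a quick check that the injected driver actually made it into the build (run on the host; the grep pattern is just based on the e1000e driver I injected):

```shell
# List installed VIBs and look for the injected e1000e driver package
esxcli software vib list | grep -i e1000
# Confirm the 82579LM NIC is claimed by a driver and shows link up
esxcli network nic list
```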
I installed VMware onto a USB drive using VMware Workstation 10. After the install I plugged the USB drive into the NUC, configured it, and it booted up fine, if I recall.
I haven't done much troubleshooting today. I do want to try different drivers that I found on another site.
If I remember correctly I also had these problems before upgrading the bios version on the NUC, try that first!
// Linjo
I will try that also.
So somehow it is working now. I didn't make any changes that I can recall. Before bed I turned everything off, pretty fed up with it. Today, on Christmas Eve, I turned on the NUC and tried again. After booting up the NUC and then the VM, I was able to get an IP address. It's working now. What's weird is that I tried rebooting and shutting it off last night before posting here, and it didn't fix it then.
I recreated my datastore on iSCSI and it worked fine. I'm running two VMs now, both located on my NAS. Performance is fine. It's taking a while installing 129 updates, though.
So ya that's really weird. I'm not sure what to make of it. In my 20 years in IT I've never seen this before. I'll update whatever firmware and drivers I can think of.
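If I do end up swapping drivers, my understanding is the install goes something like this (the VIB path is just an example for whatever driver file I download):

```shell
# Install a replacement NIC driver VIB -- the path is an example only.
# --no-sig-check is needed for community/unsigned driver packages.
esxcli software vib install -v /vmfs/volumes/datastore1/net-e1000e.vib --no-sig-check
# Reboot so the new driver module loads
reboot
```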
Now that it is somehow working, I've been able to play with it a bit. I'm actually kind of disappointed in the performance. I wasn't expecting anything grand, but it's constantly running in turbo mode just doing Windows updates. I'm contemplating taking it back and maybe building a 6-core tower instead. It might not be as efficient on power, though.
Meh ...