Hi, I'd like some comments on presenting a datastore to two different datacenters.
Both sites have very similar setups: each has an EMC Unity array and ESXi 7+ hosts, and the same iSCSI networking is available at both sites over 10G links, so all ESXi hosts are on the same iSCSI networks.
Right now only cold svMotions work between the datacenters. Unless I'm missing something, in order to do live vMotions from one datacenter to the other, all hosts will need the shared datastore mounted.
Though it seems like it would be fine, I have some apprehensions about doing this.
When Googling I found that it may raise some alarms, like the "Datastore is in multiple datacenters" warning mentioned in a few forum comments and articles.
Can anyone comment on doing this? Is there a need to worry about data corruption or datastore locking issues? Are there other drawbacks to this scenario? The main driver is being able to vMotion from one datacenter to another without having to shut down guests. I found mentions that this is not best practice, but why? My guess is that when there are multiple datacenters there is often a WAN involved, which could mean very high latency, but in my situation this is all on a local 10G LAN.
I look forward to hearing your comments.