In my lab I have a 3-node vSAN cluster. I reinstalled ESXi on two of the nodes and swapped the disks around to break the RAID, so to speak. After the reinstall, both hosts purple screen (PSOD) with a message saying multiple file systems were detected. Other than booting a Linux live disc and wiping the disks, is there a better way to destroy the old vSAN? I am assuming that when the new ESXi boots it sees the partitions on the other drives and freaks out. I am using BOSS cards in my Dell servers. When I go into the Lifecycle Controller I cannot see my other disks; I only see the BOSS ones. However, when I install ESXi I do see the other disks that held the old vSAN. How can I destroy my old vSAN?
@RadarG, is it PSODing with a vSAN module mentioned in the backtrace? (e.g. PLOG, LSOM, CMMDS, virsto)
If so, you can disable all the vSAN modules pre-boot using the steps here https://kb.vmware.com/s/article/66996 and then delete the partitions off the vSAN devices (with the full understanding that the data on them will be gone).
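For reference, once you can get to the ESXi shell with the vSAN modules disabled, the cleanup would look roughly like the below. This is a minimal sketch: the naa.* name is a placeholder for your actual device, and deleting the partitions destroys whatever is on them.

# Check which disks vSAN considers in use, and list all devices
vdq -q
esxcli storage core device list

# Inspect the partition table on a suspect device (placeholder name)
partedUtil getptbl /vmfs/devices/disks/naa.xxxxxxxxxxxxxxxx

# Delete each partition shown (vSAN devices normally carry two)
partedUtil delete /vmfs/devices/disks/naa.xxxxxxxxxxxxxxxx 1
partedUtil delete /vmfs/devices/disks/naa.xxxxxxxxxxxxxxxx 2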
Note that if it is PSODing for a different reason (e.g. an ESXi OS filesystem), the above won't help. In that case I would advise removing one of the possibly duplicated devices and formatting it on another server/computer before adding it back (assuming it can't be done via iDRAC).
Perhaps you're hitting something like this:
https://kb.vmware.com/s/article/2000476
It looks like breaking the RAID somehow left multiple ESXi OS filesystems present, and this was triggered when you re-installed.
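If you can get a host to boot (for example, after pulling the suspect disks), you could confirm whether ESXi is seeing duplicate OS volumes; a quick check, assuming shell access:

# Lists all volumes ESXi sees, including the vfat bootbank partitions;
# extra BOOTBANK-style vfat entries would point at a second ESXi install
esxcli storage filesystem list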
If it's the OS disk you need to zap, use a Linux live CD and find the disks (rough commands below).
vSAN disks will have 2 partitions, while an ESXi OS boot disk will have several (5, if I recollect).
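Roughly what that looks like from the live environment, assuming /dev/sdX stands in for the disk you identify (these commands are destructive, so double-check the target first):

# Match the disk by size/model/serial before touching anything
lsblk -o NAME,SIZE,MODEL,SERIAL

# Remove all filesystem and partition-table signatures from the disk
wipefs -a /dev/sdX

# Alternatively, zap the GPT and MBR structures with sgdisk
sgdisk --zap-all /dev/sdX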
Hi @RadarG, did one of these KBs answer your question? If so, please click the "verified solution" button to better help your peers find this information!
Thanks!
Jamie - Digital Support Team