SMB Storage Device Not Ready For VMware
Drobo recently released the latest version of the Drobo Elite. They should have kept it in the oven a little longer. We decided to use this SMB storage device because it is VMware Certified, low-cost, and has built-in 2-disk redundancy.
This Drobo has 8 bays and can handle any size, speed and type of disk. We filled it with 2TB 10K SATA drives from Western Digital.
The original design consisted of two Dell R510 servers running vSphere 4.1 with fail-over mode enabled. All VMs would reside on the Drobo.
The first sign of trouble appeared during a tech support call to Drobo. I needed some help configuring the iSCSI connections so I could add the datastores to the VMware hosts. The tech I spoke to didn’t know how yet, as the model was brand new. Uh-oh.
I figured it out on my own. The Drobo Dashboard software needs to be installed on a physical machine on the same network as the iSCSI connections. From there, you can create partitions and set the RAID level. I chose the 2-disk redundancy level. I would regret that decision later. I created several 2TB partitions.
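For the curious, the capacity math works out roughly like this. This is a back-of-the-envelope sketch, not Drobo's exact BeyondRAID algorithm; I'm assuming dual-disk redundancy reserves capacity equivalent to the two largest drives, and that partitions are capped at 2TB by the VMFS LUN limit in vSphere 4.x:

```python
# Rough usable capacity for an 8-bay array with 2-disk redundancy.
# Assumption (not Drobo's published algorithm): the redundancy
# overhead equals the capacity of the two largest drives.
DRIVE_TB = 2.0
BAYS = 8

raw_tb = BAYS * DRIVE_TB           # 16.0 TB raw
usable_tb = (BAYS - 2) * DRIVE_TB  # ~12.0 TB after dual redundancy

print(f"Raw: {raw_tb} TB, usable (approx.): {usable_tb} TB")
# vSphere 4.x caps a VMFS datastore LUN at roughly 2 TB,
# hence carving the usable space into 2 TB partitions:
print(f"Approx. 2 TB partitions: {int(usable_tb // 2)}")
```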
Once the partitions were made, I was able to add the datastores to both vSphere hosts and all looked good. I copied the ISO files up and installed Small Business Server 2011 without a hitch.
The virtual SBS server had a dedicated gigabit NIC on its own vSwitch. The vNIC was the emulated Intel E1000. During troubleshooting I did try the VMXNET3 vNIC and it made no real difference in throughput. The physical NIC was connected to a Juniper EX2200 switch with the MTU size set to 1500 on the switch and vNIC.
So, we get onsite and install the infrastructure. All is fine until we actually start copying files from physical machines on the network to the SBS server. The throughput was very poor. Just using RDP to get into the server was agonizingly slow. I got that rock-in-my-stomach feeling that we may be in trouble. The symptoms:
– 16 Kbps to 1 MB/sec throughput on the vNIC.
– Latency in the 800 ms range for Reads and Writes to the datastore.
– Timeouts were likely but unconfirmed. We had some file copy resets and errors.
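To put those throughput numbers in perspective, here's some quick math (the calculation and the "healthy gigabit" baseline are mine, not from the original measurements): even at the best case of 1 MB/sec, basic file operations become painful.

```python
# Illustrative math: time to copy a 1 GB file at the throughput we
# were seeing, versus a healthy gigabit link (~112 MB/sec assumed).
GIB_MB = 1024  # 1 GiB expressed in MiB

def copy_seconds(size_mb, mb_per_sec):
    """Seconds to move size_mb megabytes at mb_per_sec."""
    return size_mb / mb_per_sec

worst_s = copy_seconds(GIB_MB, 16 / 8 / 1024)  # 16 Kbps ~= 0.002 MB/sec
slow_s = copy_seconds(GIB_MB, 1)               # the 1 MB/sec ceiling we saw
healthy_s = copy_seconds(GIB_MB, 112)          # assumed clean gigabit rate

print(f"1 GB at 16 Kbps:  ~{worst_s / 86400:.0f} days")
print(f"1 GB at 1 MB/sec: ~{slow_s / 60:.0f} minutes")
print(f"1 GB at gigabit:  ~{healthy_s:.0f} seconds")
```

Roughly six days at the low end, seventeen minutes at the high end, versus seconds on a healthy link. No wonder RDP was agonizing.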
What we tried:
1. Jumbo Frames on the vNIC and Juniper switch. It helped a little at first, but the latency in reads/writes continued.
2. Added the VMXNET3 adapter and ran the Fix My Network wizard in SBS to establish the new vNIC.
3. Swapped the Ethernet cables with new CAT6.
4. Moved the iSCSI connections off the switch. Configured crossover cables and connected directly to the vHost.
None of those made any difference. We figured at that point that it was probably the overhead of the 2-disk redundancy on the Drobo. It’s a software-based RAID and most likely resource intensive.
Now we were neck deep in trouble. The client had been in the office the day after the installation. Mail Services, Internet Access, BES and File Access were all very slow and had disconnects.
Finally, we said the hell with it. We had to move the virtual SBS off of the Drobo. The problem was the vHosts weren’t designed with enough disk space to house the SBS as-is. I used VMware Converter to run a V2V conversion. In the process I shrunk the disks to fit on the vHost.
A 120 GB disk took 8 hours to get from the Drobo to the vHost over the cross-connected iSCSI connections.
Once on the vHost, the network throughput increased to 20 MB/sec. Latency dropped to 16 ms from 800 ms. No network drops, and all services responded quickly.
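Running the numbers on that move and the recovery (the figures are from above; the arithmetic is just a sanity check):

```python
# Sanity-checking the before/after figures.
move_gb, move_hours = 120, 8  # the V2V copy off the Drobo
effective_mb_s = move_gb * 1024 / (move_hours * 3600)

latency_before_ms, latency_after_ms = 800, 16

print(f"Effective V2V copy rate: {effective_mb_s:.1f} MB/sec")
print(f"Latency improvement:     {latency_before_ms / latency_after_ms:.0f}x")
```

The 8-hour copy works out to about 4.3 MB/sec effective throughput off the Drobo, and the latency drop is a 50x improvement. That's the difference between an unusable array and local disk.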
We plan on dismissing the Drobo, Office Space-style. I’ll be the one with the bat.
Categories: IT Pros