Installing Nexenta Core Platform 3.0.1 With Nappit on VMware vSphere 4.x

I have been mulling over what exactly the eventual replacement for my Windows Home Server might be one day – and Nexenta is something that I have been pondering for a while.

The Nexenta Core Platform (NCP) is what the commercial (and community) versions of Nexenta (NexentaStor) are built upon.

NCP is based on an Ubuntu userland with an OpenSolaris kernel. NexentaStor (Community) has a Web Management User Interface (WMUI) and an 18TB limit for storage. NCP has a community-developed WMUI called napp-it.

I decided to install NCP and napp-it to get a feel for NCP over the NexentaStor Community edition, as I have not yet decided how much storage I might want to use Nexenta for. My plan is to use mirroring to provide basic redundancy rather than other forms of RAID, and for some storage pools I might mirror three drives together rather than two – so I can see this strategy eating into the 18TB limit of NexentaStor Community (although hopefully not too quickly). I guess I don’t want to feel limited with my next storage server.
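The mirroring strategy above can be sketched with standard ZFS commands. This is a minimal example only – the pool name and device names are hypothetical placeholders, not from this installation:

```shell
# Create a pool from a three-way mirror (hypothetical device names).
# Usable capacity equals one drive; any two drives in the vdev can fail.
zpool create tank mirror c1t1d0 c1t2d0 c1t3d0

# A plain two-way mirror, by contrast, would be:
#   zpool create tank mirror c1t1d0 c1t2d0

# Check pool health and layout.
zpool status tank
```

This illustrates why three-way mirrors consume capacity quickly: each pool holds three full copies of the data, so 18TB of raw disk yields only 6TB of usable space.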

I am still pondering the pros and cons of virtualizing NCP on VMware vSphere versus running two physical boxes, but for now let’s look at installing NCP in a vSphere virtual machine.

Note the following keys used during installation:

  • Up and Down arrow keys move the cursor up and down between input fields and check-boxes,
  • Spacebar marks your selection,
  • Tab cycles through the options,
  • Enter confirms your choice and proceeds to the next step.

First download the Nexenta .iso and copy it to your vSphere datastore.

Create a new virtual machine and specify the following Guest Operating System properties – Linux and Ubuntu (64-bit).

I configured 4GB of RAM, the default LSI Logic Parallel SCSI controller, and a 12GB virtual hard disk.

Finally point the virtual CD-ROM of the virtual machine to the uploaded Nexenta .iso and boot the virtual machine.

Enter a password for root, then press the down arrow key and re-enter your password. Press tab to highlight the OK button and then press Enter.

Log in as root (or log in as another user and use su to gain root permissions).

At this point I tried to install napp-it but discovered that I did not have an IP address. The fix was as follows:

svcadm disable svc:/network/physical:default
svcadm enable svc:/network/physical:nwam
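To confirm that the switch to NWAM took effect, the SMF service states can be checked with the standard OpenSolaris service tools – a quick sketch:

```shell
# nwam should report "online" and the default physical service "disabled".
svcs svc:/network/physical:nwam svc:/network/physical:default

# If nwam ends up in the "maintenance" state, clear it and let SMF retry:
# svcadm clear svc:/network/physical:nwam
```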

I entered the following command to check that I had an IP address:

ifconfig -a

Now we can install the napp-it web interface for Nexenta:

wget -O - www.napp-it.org/nappit | perl

Open your preferred browser and enter http://<server-ip>:81 to manage your Nexenta installation.



Installing FreeNAS 8 on VMware vSphere (ESXi)

FreeNAS is an Open Source Storage Platform, and version 8 benefits not only from a complete rewrite – it also boasts a new web interface and support for the ZFS filesystem. It is also worth mentioning that FreeNAS supports Advanced Format drives (something that my Windows Home Server does not).

The features of ZFS are many but it is the data integrity and large capacity support that caught my attention when I first started to ponder alternatives to Windows Home Server (WHS).

The other ZFS contender that has piqued my interest is Nexenta, whose community edition has an 18TB limit (although you can run the Nexenta Core version with a community-developed GUI without any storage limit). One key difference to be aware of (and this will be a moving goalpost) is the version of ZFS that each of these projects is running. At the moment, for example, it looks like FreeNAS does not support de-duplication of data while Nexenta does.
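For reference, de-duplication in ZFS (where the pool version supports it) is a per-dataset property. A hedged sketch, assuming a hypothetical pool and dataset named tank/media:

```shell
# Enable deduplication on a dataset (requires a ZFS pool version with
# dedup support – which, at the time of writing, Nexenta has and FreeNAS
# does not). Note that dedup tables are RAM-hungry on large pools.
zfs set dedup=on tank/media

# Check the property and the resulting pool-wide dedup ratio.
zfs get dedup tank/media
zpool get dedupratio tank
```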

FreeNAS is designed to run from a flash drive, which is nice in that all of your hard drives can be dedicated to storage. A 1GB drive is the recommended minimum, and it can hold several FreeNAS images – so you can roll back to a previous installation if you run into trouble during an upgrade, for example. I will be installing FreeNAS to a 1GB virtual hard disk instead.

I have yet to decide between FreeNAS and Nexenta – but for today I wanted to get the ball rolling by installing FreeNAS 8 in VMware vSphere 4.x.

First download the FreeNAS 8 .iso and copy it to your vSphere datastore.

Create a new virtual machine and specify the following Guest Operating System properties – Other and FreeBSD (64-bit).

I configured 4GB of RAM, the default LSI Logic Parallel SCSI controller, and a 1GB virtual hard disk.

Finally point the virtual CD-ROM of the virtual machine to the uploaded FreeNAS .iso and boot the virtual machine:

Press 1 to begin installation:

Press Enter to install to default device:

Press Enter again to install to the VMware virtual disk:

Press Enter again to confirm installation to hard drive (installation to USB is the preferred method for FreeNAS but this is not practical on ESXi):

Wait for FreeNAS to copy the image to the virtual drive:

Reboot the virtual machine:

As you can see there are various options for configuring FreeNAS when it boots. For now I will take a quick look at the Web interface.

Open Firefox and browse to the IP address of your FreeNAS installation (as detailed in the previous screenshot).

Use admin for the Username and freenas for the password.

Welcome to the FreeNAS interface!

As you can see there are plenty of options available to configure FreeNAS and (now that installation is complete) that will be the topic of a later blog-post.

Amahi / Greyhole – Alternatives to Windows Home Server (Vail)?

Microsoft dropped the Drive Extender (DE) functionality from Windows Home Server (Vail) this week, to the surprise and disbelief of the WHS community.

Drive Extender allowed users to dynamically shrink and grow their WHS storage pool and enable folder-level duplication to protect important files. In Vail, DE promised some great new features – such as real-time data duplication, background storage operations, uninterrupted media streaming and more. At this juncture I have to say that I am very disappointed with DE being dropped from Vail and find it hard to see how Vail will gain much traction in the market without it.

I had recently been pondering what the alternatives to WHS might be. I was already aware of a similar technology for Linux called Greyhole (found in products such as Amahi) and was also looking into Nexenta – a ZFS-based server platform.

My plan is to evaluate the basic workings of both in virtual machines before I make any decisions – but I did receive a timely email from the Amahi group today promoting Amahi (and Greyhole) as a WHS (Vail) alternative.

The feature set for Greyhole certainly looks to be good – though there are apparently still some “obscure” bugs to be worked out. This is probably to be expected given that Greyhole is still in Beta. I can’t say that the prospect of trusting my data to Beta software gives me much comfort – but equally WHS had some very bad issues in its early days that took a long time to be resolved. This explains why I was not an early adopter of WHS either.

One of the key tests for me will be to see whether the ‘automatic free space balancing across disks’ interrupts other server operations such as streaming media. I certainly don’t expect it to, but I have not seen anything documenting how this feature works either.

Another feature that sways me towards Greyhole is that drives can be removed from a Greyhole storage pool and read directly by another (Linux) computer. This was certainly something that I appreciated in WHS (although it was not to be the case in Vail).

After Windows 7 I had high hopes for Vail, and it really is a shame that Microsoft has thrown in the towel with Drive Extender. While DE certainly was not perfect in WHS version 1, it was a big step in the right direction for the consumer market, as it was a huge part of what made WHS “simple” to administer. I will wait and see what the final version of Vail looks like next year – but at this point I am hardly waiting with bated breath.

I still plan to use WHS to backup Windows machines and will probably run it as a virtual machine again in the future.