Build Your Own 32-Core Home Lab Server

While working on my newest Pluralsight course this past fall, I decided I wanted an easier way to create virtual machines on my local home network. Sure, it’s easy enough to spin up a few VMs on my iMac, but at some point the limitations of 16 GB of RAM and 4 processor cores were going to catch up with me. For this course in particular, I wanted to be able to run a half dozen or more VMs simultaneously to simulate a real corporate network with all or most components of the Elastic Stack running on it. What I really needed was a cheap virtual machine host with a lot of CPU cores, plenty of RAM, and some reasonably fast disks. Before getting started on the course, I set out to build a server exclusively for the purpose of serving virtual machines on my network.

The centerpiece of the server build is the Intel Xeon E5-2670 CPU. The 2670 was released in 2012 with 8 cores (16 with Hyper-Threading) and a full 20 MB of L3 cache. By any measure it was a 64-bit x86 workhorse. The 2670 originally sold for around $1,500 each, far more than the roughly $1,000 I was hoping to spend on this entire server. As of this writing, however, you can find them on eBay for around $80. Combine two of them and you end up with 32 logical cores, which is an awful lot to spread around for virtual machines.

The Intel Xeon E5-2670 2.6 GHz

I found a used two-pack of the CPUs for $168 on eBay and paired them with a new ASRock EP2C602 dual-socket motherboard, which I got for $293, also on eBay. The EP2C602 is an older model board and uses DDR3 memory. That’s slower than newer DDR4 RAM, but the DDR3 board and RAM were a lot less expensive. In fact, I found a used 64 GB kit of ECC RAM for $105. Again, I wasn’t really interested in raw speed. I needed capacity in terms of RAM size and total logical CPU cores so I could run a lot of VMs simultaneously.

The ASRock EP2C602 dual socket LGA2011 motherboard

For the rest of the parts, I went directly to Amazon. To supply power I got an EVGA SuperNOVA G2 750-watt unit. The important thing to know here is that this power supply has two CPU power connectors, which a dual-socket motherboard needs. If you are doing a build like this with a dual-socket board, make sure your power supply can properly power both CPUs. The processors are cooled with two Cooler Master Hyper 212 fan/heat sink combos.

For the disks, I knew I wanted an SSD for the operating system, but for VM storage I wanted a RAID 0 striped setup with cheap disks to get as much bang for my buck as possible. I settled on two Western Digital 1 TB 7200 RPM drives that I planned to stripe together for a total of 2 TB of space. That would be plenty of breathing room, and striping meant roughly twice the throughput (with the trade-off that a single drive failure would take out the whole array). For the OS drive I went with a basic KingSpec 64 GB SSD since it would only host the operating system. A run-of-the-mill Asus DVD-RW SATA optical drive rounds out the internals.

All of this is housed in a Phanteks Enthoo Pro full tower case, which has plenty of room for the extra-large motherboard, drives, and cooling. Importantly, this case supports the SSI EEB motherboard form factor, a 12″ × 13″ board. That’s the size of the ASRock dual-socket board I mentioned earlier, and I wanted to make sure it would fit in the case I bought.

The Phanteks Enthoo Pro full tower case

The research I put into all these parts paid off because they all went together perfectly. I chose VMware ESXi version 6.5 as my host OS since that’s what we use at work and I know it pretty well. The free version of ESXi does not include some of the features you get with vSphere, like cloning and external storage, but for my needs it would be perfectly fine. Other good options would be Citrix XenServer or Microsoft Hyper-V. (I may rebuild this machine at some point to fiddle with one or both of those, but to date I’ve stuck with VMware.)

VMware host status screen showing 32 logical CPUs, 64 GB of RAM, and disk capacity
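
If you want the same kind of host and VM inventory from a script rather than the web client, the ESXi API is reachable from Python via the pyvmomi library. Here is a minimal sketch that lists the registered VMs and their power state; the host name and credentials are placeholders for whatever your own ESXi box uses.

```python
# Minimal sketch: list VMs on an ESXi host with pyvmomi.
# Assumptions: "pip install pyvmomi", a host reachable at "esxi.local",
# and root credentials -- adjust all three for your own lab.
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

# Home lab hosts usually have a self-signed certificate, so skip verification.
ctx = ssl._create_unverified_context()

si = SmartConnect(host="esxi.local", user="root", pwd="secret", sslContext=ctx)
try:
    content = si.RetrieveContent()
    # Walk the inventory and collect every VirtualMachine object.
    view = content.viewManager.CreateContainerView(
        content.rootFolder, [vim.VirtualMachine], True)
    for vm in view.view:
        print(f"{vm.name}: {vm.runtime.powerState}")
    view.Destroy()
finally:
    Disconnect(si)
```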

I _did_ run into one problem. VMware is very particular about what it considers a real hardware RAID controller, and it refused to recognize the onboard “software” RAID controller on the ASRock motherboard. It would only see the disks as individual volumes. After consulting the VMware compatibility list and doing some eBay research, I ordered a used LSI MegaRAID 9260 controller for $119, along with a 3ware multi-lane SFF-8087 to SATA III cable to connect the Western Digital storage drives to the controller. After installing the LSI card, VMware was happy and I was able to create a single 2 TB storage volume using RAID 0 striping.

Neither XenServer nor VMware will recognize most “software” RAID controllers
LSI MegaRAID SATAIII Controller
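
If you would rather not do the whole setup in the card’s boot-time configuration utility, LSI’s storcli tool can create the virtual drive from a shell as well. The snippet below is a rough sketch that drives storcli from Python; the controller number (/c0) and the enclosure:slot IDs for the two drives are assumptions you would confirm from the card’s own listing first.

```python
# Rough sketch: create a RAID 0 virtual drive on an LSI MegaRAID card
# using the storcli64 command-line tool. Assumptions: storcli64 is on the
# PATH, the card is controller 0, and the two WD drives sit in enclosure
# 252, slots 0 and 1 -- verify with the "show" command before creating.
import subprocess

def storcli(args: str) -> str:
    """Run a storcli64 command and return its text output."""
    result = subprocess.run(
        ["storcli64", *args.split()],
        capture_output=True, text=True, check=True)
    return result.stdout

# List the controller, enclosures, and physical drives to find the IDs.
print(storcli("/c0 show"))

# Stripe the two 1 TB drives into a single RAID 0 virtual drive.
print(storcli("/c0 add vd type=raid0 drives=252:0-1"))
```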

Since adding the LSI MegaRAID card I haven’t had any hardware compatibility problems at all, and the system has been running great. I can run all the virtual machines at the same time with pretty good performance. The VMware client tools install automatically in Windows VMs, and for Linux I use the open-vm-tools package that is available in most modern distributions.

Running Windows and Linux VMs

My costs are broken out below. As you can see, I went a little over my $1,000 budget, mostly because of the extra hardware RAID card, but in the end it was worth it. Being able to create as many local VMs on my home network as I wanted while building my Elastic Stack course was invaluable. You could certainly save a good chunk of money by going with a single Xeon CPU and a single-socket motherboard. You’d only get 16 logical cores that way, but that’s still not a bad rig.

All the parts for the build

Cost Breakdown:


About JP Toto

JP is a DevOps developer in Philadelphia, PA. He works at eMoney Advisor by day and attempts to cook by night.
