In the previous posts we have seen how to set up our personal home lab, for which I used an Intel NUC. While it packs interesting features, it is lacking in terms of network devices available for VMware vSphere deployments. Hence in this article we'll explore the option of adding VMware vSphere drivers for new network adapters. In fact, the Intel NUC ships with a built-in Intel I219 network adapter along with a soldered Intel Wireless 8260 M.2 antenna.
Not particularly appealing if you would like to test scenarios where the built-in network card becomes your single point of failure. A simple workaround is to add external network cards using the available USB 3.0 ports. They provide enough bandwidth to achieve Gigabit throughput, very similar to real Gigabit interfaces. For this reason it can be handy to install new VMware vSphere drivers to support even more network cards, should the built-in drivers not recognise the new hardware.
The good news is that the Intel NUC ships with 4x USB 3.0 ports (2 front / 2 back) plus an additional 2x USB 2.0 internally through a header. A quick look at shopping sites also shows that the cost of USB 3.0 network adapters is very reasonable. Ideally you might want to add up to 3 USB network adapters, which adds 3 more vNICs to your vSphere ESXi Host. With a total of 4 vNICs per Host, the possibilities for home lab scenarios are endless:
- Physically separate vNICs for different traffic types (Management, VM traffic, vMotion, iSCSI, snapshots, etc.)
- Custom TCP Stacks in ESXi Hosts
- Custom UpLink configuration for Standard and Distributed Switches
- NIC Teaming, High Availability, Load Balancing scenarios
- Custom configurations for VMware HA, FT, DRS and a lot more
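As a taste of the scenarios above, a NIC teaming failover order can be set from the ESXi command line once the extra vNICs are in place. The vSwitch and vmnic names below are just examples; adjust them to your own Host:

```shell
# Set the built-in NIC (vmnic0) as active and a USB NIC (vmnic32)
# as standby on vSwitch0 (names are examples)
esxcli network vswitch standard policy failover set \
    --vswitch-name=vSwitch0 \
    --active-uplinks=vmnic0 \
    --standby-uplinks=vmnic32
```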
So with these in mind, in this post I would like to cover how to install extra VMware vSphere drivers for additional network cards. For this purpose I have successfully tested the setup with the following USB network adapters, all based on the Realtek r8152 driver:
All of them are based on the same network driver. For consistency I also decided to buy the exact same set of USB network adapters I used when testing the first Intel NUC. At this point we are ready to start.
Install VMware vSphere Drivers for new network cards
The installation of the new drivers to add the USB Network Adapters comes in a VIB format (vSphere Installation Bundle) that can be considered as a sort of archive where all the necessary resources are stored to perform the installation of new drivers, update the existing ones and even install patches. In this case I’m using a VIB file compiled from Jose Gomes and available from his excellent post.
So first of all, let's upload the drivers to the Datastore.
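With SSH enabled on the Host, one quick way to upload the VIB is with scp; the IP address and datastore path below are examples for illustration:

```shell
# Copy the VIB from your workstation to the Host's local datastore
# (Host IP and datastore name are examples; adjust to your environment)
scp r8152-2.06.0-4_esxi60.vib root@192.168.1.10:/vmfs/volumes/datastore1/
```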
Since we'll be using the ESXi command line to install the drivers, we need to enable access by making sure these services are available:
- Direct Console UI
- ESXi Shell
- SSH
If we want to use the ESXi console directly, all 3 services should be up and running. Alternatively, we can enable just the SSH service and then use an SSH client like PuTTY to access the ESXi Host console. Of course, in both cases let's also make sure the necessary ports are open.
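These services can be toggled from the vSphere client, or, if you already have access to the Host console, with vim-cmd; a minimal sketch:

```shell
# Start the SSH and ESXi Shell services from the Host console
vim-cmd hostsvc/start_ssh
vim-cmd hostsvc/start_esx_shell
```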
At this point, once we are logged in, we can issue the following command to query the configured and working network connections in a standard ESXi installation:
esxcli network nic list
which returns the built-in network card as expected on “vmnic0”.
Let's connect our USB 3.0 network adapters to our switch, making sure at least the link lights are blinking. Since the new VIB is not in the officially supported list of ESXi extensions, we need to change the Host Image Profile Acceptance Level to Community Supported in order to install our custom VIB. This setting can be found under Manage > Settings > Security Profile for the Host where we intend to install the VIB.
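The same acceptance level change can also be made from the ESXi command line:

```shell
# Check the current acceptance level, then relax it to CommunitySupported
esxcli software acceptance get
esxcli software acceptance set --level=CommunitySupported
```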
Let's go back to our PuTTY SSH session and verify the location of the VIB package. Once located, we can simply issue the command:
esxcli software vib install -v “path-to-vib-file”
As per screenshot below I’m installing the most recent version. At the time of writing that is r8152-2.06.0-4_esxi60.vib.
As the screenshot is showing the installation is successful and does not require Host reboot.
At this point, if we issue the earlier command again to list the available network cards, we should get something like this. In my case there are two extra NICs: vmnic32 and vmnic33.
At this point we can use our new vNICs, or even add more by plugging in additional USB adapters, although there seems to be a performance drop in TX compared to RX. In total, 4 vNICs per Host should be more than sufficient for most home lab testing scenarios.
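To actually put a new vNIC to work, it can be attached as an uplink to an existing standard switch; again, the vmnic and vSwitch names are examples:

```shell
# Attach the USB NIC as an additional uplink on vSwitch0
esxcli network vswitch standard uplink add \
    --uplink-name=vmnic32 \
    --vswitch-name=vSwitch0
```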
Another useful command lists the installed VIBs, which helps us determine which versions are installed along with other useful information.
esxcli software vib list | grep -i r8152
I would also like to note the following:
- As a precaution, power down all VMs or move them with vMotion to a different Host
- Before installing any new VMware vSphere driver or package, put the Host in Maintenance Mode
- Once the packages have been installed, exit Maintenance Mode and restart the VMs
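Entering and exiting Maintenance Mode can also be scripted from the same SSH session; a minimal sketch:

```shell
# Enter Maintenance Mode before installing the VIB...
esxcli system maintenanceMode set --enable true
# ...and exit it once the installation is complete
esxcli system maintenanceMode set --enable false
```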
This concludes this quick post about installing new VMware vSphere Drivers to add network cards into our ESXi Hosts.