From Automation Noob to…Automation Noob with a Plan

Note: This post is being published alongside a lightning talk of the same title being delivered at Nutanix .NEXT 2018. It contains links to the various resources mentioned during the talk.

I’ve been working in IT for 15 years and I think my story is very similar to that of many of my peers. I had some programming courses in college, but decided that I was more interested in infrastructure and chose the route of a systems administrator over that of an application developer.

Early in my career most of my tasks were GUI or CLI driven. Although I would occasionally script repetitive tasks, that usually consisted of googling until I found someone else’s script that I could easily change for my purposes. Most of the coding knowledge I gained in college I either forgot or found quickly outdated.

Fast forward to the current IT infrastructure landscape, and automation and infrastructure as code are taking over the data center. Not wanting to let my skills languish, I have embarked upon improving them to embrace the role of an infrastructure developer.

The purpose of this blog post is not to teach the reader how to do anything specific, but to share the methods and tools I’ve found useful as I attempt to grow my skill set. My hope is that someone undergoing the same career change will find it useful. I’ve heard many tips and tricks, some of which I have taken to heart as I work toward my goal. I will devote a bit of time to each of these:

  • “Learn a language.”
  • “Pick a configuration management tool.”
  • “Learn to consume and leverage APIs.”
  • “Have a problem to solve.”

I’m going to take these one by one and break down my approach so far.

Learn a Language

When I first heard this, I was unsure which language to learn and where to start. I actually cheated a bit on this one and chose two languages: Python and PowerShell. Both are powerful, widespread, and well suited to most of the tasks I want to automate.

PowerShell

To get away from googling other people’s PowerShell scripts and create something myself, I wanted to actually understand the language and how it handles objects.
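As a small illustration of what “handling objects” means in practice: everything that moves through the PowerShell pipeline is a structured object with properties rather than plain text, so you can filter and sort on those properties directly. This is a generic example and not tied to any particular product.

# Processes are objects, so filter and sort on their properties instead of parsing text
Get-Process |
    Where-Object { $_.WorkingSet64 -gt 100MB } |
    Sort-Object WorkingSet64 -Descending |
    Select-Object Name, Id, WorkingSet64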

I’d heard many mentions of the YouTube series “Learn PowerShell in a Month of Lunches” by Don Jones. I made my way through this series and found it very valuable in understanding PowerShell. There is a companion book by Don and Jeff Hicks available as well. I have not purchased the book myself, but I have heard good things.

Another great way to learn PowerShell, or any number of other technologies, is Pluralsight. Jeff Hicks, one of the authors of the previously mentioned book, also authored a course named “Automation with PowerShell Scripts.” I am still making my way through this course, but like pretty much everything on Pluralsight, it is high quality. If you have access to Pluralsight as a resource, I highly recommend taking advantage of it.

Python

I was even less familiar with Python than I was with PowerShell before deciding to enhance my automation skills. Although I understand general programming principles, I needed to learn the syntax from the beginning. Codecademy is a great, free resource for learning the basics of a language. I made my way through their Learn Python course just to get the basics under my belt.

Codecademy was a great way to start understanding Python syntax, but it left me with a lot of questions about actual use cases and how to start scripting. Enter another “freeish” resource, Packt. I say freeish because Packt gives away a free ebook every day. I check the site daily and have noticed that some books pop up multiple times. Many of these are Python books, and the one I have been spending my time on in particular is Learning Python by Fabrizio Romano. My favorite method of learning is to have a Python interpreter and the book open in side-by-side windows on my laptop. Not pictured is the code editor I keep open in the background or on another monitor.

[Image: the Learning Python ebook and a Python interpreter side by side]

Another resource worth mentioning is Google’s free Python course. I’ve only looked at this briefly and have not spent much time on it yet. It appears to be high quality and easy to follow, and at $0 the price is right!

Pick a Configuration Management Tool

There are many of these out there and choosing the right one can be difficult. What makes one better than another? You’ve got Puppet, Chef, and Ansible for starters. Fortunately for me, the decision was easy: my employer at the time was already using Puppet, so I just decided to dive in.

The first thing I did was download the Puppet Learning VM. This is a free VM from Puppet that walks you through the functionality of Puppet, not just by reading the included lessons (pictured below), but by accessing the VM itself over SSH and actually interacting with Puppet.

[Image: the Puppet Learning VM lessons]

You can run this VM in a homelab, or even on an old PC or laptop, provided you install something like VMware Workstation or VirtualBox. Learning Puppet takes place in the form of quests: you are given an objective at the end of a module that you must achieve within the Puppet VM, and a progress indicator on the screen informs you when you have completed various tasks. This is one of the coolest “getting started” guides I have ever come across and I cannot recommend it highly enough.

As great as the Puppet Learning VM was, I wanted to strike out on my own and really get to know the product. To do this, I used old hardware at work to set up a lab and mirror our production environment as best I could. When things didn’t work, my troubleshooting efforts usually led me to the Puppet User Group on Google.

This is a great resource for anyone who has questions while trying to learn Puppet. I’ll be honest and admit that I’m not very active in this group. I mostly just subscribe to the daily digest, but reading other people’s questions and seeing how they resolved their issues has been very helpful for me.

Where things got really interesting with Puppet was when I broke my vCloud Director lab entirely. By recycling many of the Puppet modules and manifests from my production environment I managed to overwrite the response file of my vCD lab. These response files are specific to each installation of vCD and are not portable. Although this was a lab and I could just start over, I was determined to fix it in order to get a better understanding of vCD. This taught me an important lesson about paying attention to the entirety of the configuration you will be enforcing, regardless of the tool you are using.

Learn to consume and leverage APIs

This one is the newest to me and I am only getting started. Recently vBrownBag hosted a series called “API Zero to Hero” and I watched them all. I have also downloaded Postman and use it to play with an API whenever I want to learn a little more.

When it comes to actually doing something with an API, I am interested in leveraging the API in Netbox, the excellent DCIM tool by Jeremy Stretch of DigitalOcean, to provision and assign things like IP addresses and VLANs and save myself some manual work. A rough sketch of what I have in mind is below.
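Here is a minimal sketch of talking to the Netbox REST API from PowerShell. The URL, token, and prefix ID are placeholders, and endpoint details can vary between Netbox versions, so treat this as a starting point rather than a finished script.

# Base URL and API token for a hypothetical Netbox instance
$netbox  = "https://netbox.example.com/api"
$headers = @{ Authorization = "Token $env:NETBOX_TOKEN"; Accept = "application/json" }

# List IP addresses already recorded in Netbox
$ips = Invoke-RestMethod -Uri "$netbox/ipam/ip-addresses/?limit=50" -Headers $headers
$ips.results | Select-Object address, description

# Ask Netbox for the next available IP in a prefix (prefix ID 1 is just an example)
$newIp = Invoke-RestMethod -Method Post -Uri "$netbox/ipam/prefixes/1/available-ips/" `
    -Headers $headers -ContentType "application/json" `
    -Body (@{ description = "Provisioned by automation" } | ConvertTo-Json)
$newIp.address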

Have a problem to solve

All the learning in the world won’t do you much good if you don’t apply it. Shortly after acquiring a little bit of PowerShell knowledge, I felt comfortable enough to write some basic scripts that helped me save time in the Veeam console. I delivered a vBrownBag tech talk on this at VMworld 2017, and you can view it in the video below if you are interested.

Long story short: I wanted to quickly change a single setting on dozens of Veeam backup jobs. Rather than click through all of them, I dug through the Veeam PowerShell documentation and tested things out until I came up with seven lines of code that let me toggle storage integration on or off for all backup jobs.
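I won’t reproduce the exact seven lines here, but a minimal sketch of that kind of change with the Veeam PowerShell snap-in looks something like the following. The SanIntegrationOptions property path is an assumption and can differ between Veeam versions, so verify it against the documentation for your release before running anything like this.

# Load the Veeam snap-in if it is not already loaded (Veeam B&R 9.x era)
Add-PSSnapin VeeamPSSnapin -ErrorAction SilentlyContinue

# Toggle storage (SAN) integration off for every backup job
# NOTE: SanIntegrationOptions/UseSanSnapshots is an assumed property name -- check your version
foreach ($job in Get-VBRJob) {
    $options = $job.GetOptions()
    $options.SanIntegrationOptions.UseSanSnapshots = $false
    Set-VBRJobOptions -Job $job -Options $options | Out-Null
}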

Bonus Tips

Now that I’ve walked through my approach to the various pieces of advice I received over the years, I’m going to leave you with a couple more.

Pick a code editor and use it

We are long past the days of quickly editing .conf files in vi or parsing data in Notepad++. There are numerous code editors available and I’ve tried a few, like Sublime Text and Atom. The one I’ve settled on as my favorite is Microsoft Visual Studio Code. It’s robust, lightweight, and extensible. What more could you want?

Use version control

Project.ps1, ProjectV2.ps1, and ProjectV2final.ps1 are not acceptable change tracking in an era of Infrastructure as Code. It’s important to learn and use version control as you create and update your automation projects. The most popular and widely used tool for this is Git, and more specifically GitHub if you want to host your code publicly, or GitLab if you want to host the repositories yourself.

To get started learning Git, once again I would recommend Codecademy if you are completely new. They have a Learn Git course that is a suitable introduction. If you already have the basics handled, then you may want to learn some of the more advanced concepts outlined in the vBrownBag #Commitmas series. For a frame of reference, the basic day-to-day Git workflow only takes a handful of commands, as shown below.
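The repository name, file name, and remote URL in this sketch are placeholders; the commands themselves work the same from a PowerShell prompt or any other shell.

# Create a repository for a new automation project
git init veeam-automation
cd veeam-automation

# Stage a script and record the first change (the file name is just an example)
git add Toggle-StorageIntegration.ps1
git commit -m "Initial version of storage integration toggle"

# Publish to a hosted remote such as GitHub or GitLab (the URL is a placeholder)
git remote add origin https://github.com/example/veeam-automation.git
git push -u origin master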

Making sense of it all

At this point you can tell that I have my work cut out for me. I have listed several different skills that I am trying to learn or plan on learning to stitch together into a useful skill set. If you find yourself in the same situation, the best encouragement I can give you is to keep at it.

You may not be able to sit and learn everything you need without interruption. I certainly haven’t. But whenever I have some spare time available I go back to the latest course or YouTube series I’m working on and try to make progress. If you are reading this post and find yourself in a similar position, I encourage you to do the same and keep at it!

Slides from the lightning talk related to this post are available here. Links to all the resources from this blog post are embedded in the slides as well.

News from Nutanix .NEXT 2018

Nutanix is holding its annual .NEXT conference in New Orleans this year and has made several new announcements and enhancements to its platform. I will be highlighting some of my favorites below.

Flow

Flow is the name of the Nutanix Software-Defined Networking (SDN) offering. As of version 5.6, Nutanix had the capability to perform microsegmentation of workloads. With the release of Flow, Nutanix is extending its SDN capabilities to include network visibility, network automation, and service chaining.

[Image: Nutanix Flow overview]

Flow is available for customers using the Nutanix Acropolis Hypervisor (AHV). There will be no additional software to install. Many of the SDN enhancements coming to AHV are a result of the recent Nutanix acquisition of Netsil.

[Image: Nutanix Flow and Netsil]

If you are in New Orleans for .NEXT this week there are a few sessions/labs that will be a great opportunity to learn more about SDN in Acropolis.

[Image: Flow sessions and labs at .NEXT]

Era

Era is Nutanix’s first foray into PaaS and is a database management solution. Using Era, Nutanix customers will be able to manage their databases natively on their Nutanix clusters.

[Image: Era copy data management]

Era’s Time Machine feature will allow customers to stream their database to their Nutanix cluster and clone or restore to any point in time, up to the most recent transaction.

[Image: Era Time Machine]

The first release of Era will support Oracle and PostgreSQL databases.

[Image: Era 1.0 supported databases]

Long term, the Era vision aligns well with the overall Nutanix philosophy of giving customers choice. The ability to manage a database of the customer’s choice in the cloud of their choice is the reality Nutanix is aiming for with Era.

[Image: the long-term Era vision]

Beam

Beam is the result of the Nutanix acquisition of Botmetric. It is targeted at giving customers visibility into and control of costs across multiple clouds. In the first release, Beam will support Nutanix private clouds as well as AWS and Azure, with GCP support coming in a future release.

[Images: Nutanix Beam — Botmetric, NTC, and cost optimization slides]

It’s clear from the announcements at .NEXT 2018 that hybrid and multi-cloud strategies are the goal for Nutanix and its platform: giving customers freedom of choice and the ability to put the right services in place to enable their business to succeed in the 21st century.

How to Create a DellEMC XC CORE Solution With the Dell Solutions Configurator

Nutanix and DellEMC announced a new way to purchase a Nutanix solution on Dell hardware this week in the form of Dell XC CORE. Customers can purchase hardware delivered to give a turnkey Nutanix experience while purchasing the software separately, preventing a license from being tied to a specific appliance.

For Nutanix partner resellers, there has been a bit of confusion regarding how we can configure and quote XC CORE nodes for customers seeking these options. After a little bit of searching, I have completed a couple of these configurations for my customers.

It looks like some partners have different versions of the Dell configurator, with some versions allowing these options to be selected at the time of solution creation. The version I have access to does not, so I had to dig a bit to find where I could configure XC CORE nodes.

New Solution

After creating a new solution, I navigated down to the Dell XC nodes that were previously available to me.

[Image: device selection in the Dell Solutions Configurator]

By selecting a Dell XC node that was listed in the XC Core datasheet and expanding the first item, “XC640ENT” in my case, I found a new option: Dell EMC XC640ENT XC CORE.

[Image: the Dell EMC XC640ENT XC CORE option in the configurator]

The rest of the configuration was as familiar as any Nutanix configuration and I was able to complete the solution according to the sizing data I had already configured.

Proud to be Selected as a Veeam Vanguard 2018

I woke up on the morning of March 6, 2018 to a very pleasant surprise: the email pictured below from Rick Vanover, Director of Product Strategy at Veeam Software, inviting me to become a member of the Veeam Vanguard.

[Image: the Veeam Vanguard invitation email]

Needless to say I immediately thanked Rick for selecting me and accepted his invitation. I had the pleasure of meeting several members of the Veeam Vanguard when attending VeeamON last year. A few of them encouraged me to apply and I want to specifically thank Matt Crape, Rhys Hammond, and Jim Jones for their support.

 

I gained my VMCE certification while in New Orleans and have since led the majority of Veeam presales discussions on behalf of my employer. I also had the opportunity to deliver a vBrownBag tech talk on the VMTN stage at VMworld last year about my experience automating repetitive tasks in Veeam Backup & Replication using the Veeam PowerShell snap-in. I look forward to continued involvement in the Veeam community and to being an active participant in the Veeam Vanguard program.

 

Building an On-Demand Cloud Lab

This time last week, I was in Palo Alto, California for the VMUG leader summit, which was great. Prior to the summit, I submitted an entry into a new VMUG program called SPARK.

To participate in SPARK, leaders were asked to submit two PowerPoint slides outlining an idea for a way to provide value back to the VMUG community. We were also asked to submit a video that would be played at the summit before all leaders present voted for their favorite submission. The Indianapolis VMUG submission was an on-demand cloud lab built on VMware Cloud on AWS.

It was a tight race, but I’m proud to say that Indianapolis VMUG won the first VMUG SPARK contest and will receive $5,000 in funding to help make this idea a reality.

Here’s the point of the post, though: I can’t do this alone, and in fact I don’t want to. I want to involve the community to make it as great as it can be. I want to hear others’ ideas on how we can create an infrastructure that can be spun up quickly and easily and help other VMUG members learn new skills.

We will be holding our initial planning calls after the new year at whatever date and time works best for the most participants. If you would like to participate, please reach out to me via Twitter. My DMs are wide open and all are welcome. We can make a great platform for everyone to learn on if we work together!

 

Backup pfSense from Ubuntu 14.04

Back in mid-2016, GitHub user James Lavoy released a Python script to back up pfSense. I was excited because pfSense didn’t (and still doesn’t) have a built-in backup scheduler. Sure, you can back up manually from the GUI, but I don’t trust myself to remember to do that every time I make a change to my config.

I downloaded the script and changed the necessary options to point it to my pfSense box and supplied the credentials necessary for backup. I unfortunately received the following error:

AttributeError: 'module' object has no attribute '_create_unverified_context'

I quickly found out that this was because my OS was running Python 2.7.6, and SSLContext was introduced in Python 2.7.9. I reported this issue on James’s GitHub and he suggested installing Python 2.7.9 or later from another PPA, as coding around the issue would require a complete rewrite of the script.

James has updated the README.md to indicate this requirement, but his instructions are a bit out of date. The PPA specified is no longer maintained and the fkrull/deadsnakes PPA should be used instead. To install Python 2.7.12 along with mechanize and meet all the requirements of this script, run the following commands:

sudo add-apt-repository ppa:fkrull/deadsnakes
sudo apt-get update
sudo apt-get install python2.7 python-pip
pip install mechanize

Before running the script, make sure you edit it to point to your pfSense box’s IP address (and HTTPS port if necessary) and supply the correct credentials. Whether you are running the script manually or on a schedule, you will need to specify the path to Python 2.7.9+, as it is not necessarily what the “python” command invokes by default. For example, I have to run:

/usr/local/lib/python2.7.12/bin/python /media/backups/pfsense-backup-master/pfsense_backup.py

In order to automate this, you’ll want to add a cron job. Do so by editing your crontab:

crontab -e

My crontab entry looks like this:

40 14 * * * /usr/local/lib/python2.7.12/bin/python /media/backups/pfsense-backup-master/pfsense_backup.py

This will run the script every day at 2:40 pm. Why 2:40? Why not?

Some may be wondering why I didn’t just upgrade my VM to Ubuntu 16.04. I tried, but many services failed to load or lost their config after the upgrade, so I rolled back to a snapshot. Ubuntu 14.04 will continue to receive updates until 2019, as it is an LTS release, and I anticipate migrating off the server by then anyway.

If you are like me and don’t need to be on the latest and greatest OS, but want to be able to use scripts like this, hopefully this will help.

OVA Template Deployment Stuck “Validating”? Try PowerCLI!

I recently made the switch from working as a customer to working as a Solutions Architect at a VAR. I had bought a number of Intel servers from various OEMs during my career, but never Cisco UCS. However, I have plenty of customers these days who already run UCS or are interested in adding UCS to their infrastructure.

For this reason I decided to download the Cisco UCS Platform Emulator. The UCS Platform Emulator is a free tool that allows risk-free experimentation in a UCS Manager environment. It can be downloaded as a .zip containing all virtual disks and metadata, or simply as a single .ova file for easy deployment. Naturally I opted for the .ova file, as I have a full vSphere environment running in my homelab thanks to VMUG Advantage.

Once I had the bits in hand, I fired up the new HTML5 vSphere Client and started the “Deploy OVF Template” wizard. Even though the HTML5 client is new to me, the wizard was intuitive and similar to what I was used to with the C# client and the Flash-based vSphere Web Client. I hit a roadblock at one point, though, when the wizard displayed a message that it was “Validating” and appeared to make no progress.

[Image: the Deploy OVF Template wizard stuck at “Validating”]

Ooookay, well I guess I’ll fire up the Flash client, wait for it to load and deploy from there.

[Image: the vSphere Web Client reporting the operation is unsupported]

Well, it looks like I can no longer deploy templates from the vSphere Web Client in vSphere 6.5. Apparently my choices are troubleshooting the HTML5 client or nothing… or are they?

Enter PowerCLI

I’ve spent the last 12-16 months familiarizing myself with PowerCLI, so this was the perfect opportunity to see if there was a way to deploy my template without needing the GUI. I quickly found the Import-VApp cmdlet, which is thoroughly documented here.

Running through the available options, I constructed the test below:

Import-VApp -Source \\192.168.2.6\Data\HomeLab\CiscoUCS\UCSPE_3.1.2e.ova -Name UCSPE -VMHost (Get-VMHost -Name esx06.kennalbone.com) -Datastore (Get-Datastore -Name NFS-FS2-ProductionFast) -DiskStorageFormat Thin -Location (Get-ResourcePool -Name Normal) -Whatif
What if: Performing the operation "Importing '\\192.168.2.6\Data\HomeLab\CiscoUCS\UCSPE_3.1.2e.ova'" on target "Host 'esx06.kennalbone.com'".

Adding -WhatIf allows you to test a command before making any actual changes to objects in PowerCLI/PowerShell. With this test behind me, I dropped the -WhatIf parameter and deployed my OVA file for real.

Import-VApp -Source \\192.168.2.6\Data\HomeLab\CiscoUCS\UCSPE_3.1.2e.ova -Name UCSPE -VMHost (Get-VMHost -Name esx06.kennalbone.com) -Datastore (Get-Datastore -Name NFS-FS2-ProductionFast) -DiskStorageFormat Thin -Location (Get-ResourcePool -Name Normal)

The deployment went by so quickly that I wasn’t sure everything had completed properly. A quick check with Get-VM showed that the new VM did exist.

Get-VM -Name UCSPE

Name                 PowerState Num CPUs MemoryGB
----                 ---------- -------- --------
UCSPE                PoweredOff 1        1.000

A quick power on and check of the VM console showed that the VM was in fact deployed and booting properly.

[Image: the UCS Platform Emulator VM console]

You can try this for yourself. Just replace the text in the angle brackets with the appropriate information for your own environment.

Import-VApp -Source <FullPathtoTemplate.ova> -Name <VMName> -VMHost (Get-VMHost -Name <ESXiHostName>) -Datastore (Get-Datastore -Name <DatastoreName>) -DiskStorageFormat Thin -Location (Get-ResourcePool -Name <ResourcePoolName>)