OpenHacking away

Last week I coached a team during a 3-day OpenHack. The team members were new to DevOps but they had incredible energy and we all learned quite a bit. 

We used GitHub and Azure DevOps to create a blue/green deployment pipeline for a few APIs, and we put branch protection on the trunk branch, including a pull request build/test pipeline. We also used Azure Boards and GitHub Issues to track work and bugs.

Good times.  🙂

Azure App Configuration service

The other day I was asked if it’s possible to put objects, arrays, etc., as the value for a configuration variable. Hmm… good question.

I figured, how about JSON to the rescue? And it worked.

I created a simple .NET Core console app that reads a JSON string from Azure App Configuration service and parses the JSON into an int array.

In App Configuration service…

  • The Key name is: foo

  • The JSON string looks like this: {"value" : [0,1,2,3,4]}

  • I set the Content type to JSON, but that isn’t required.

The output from the program is:
this is a test: {"value" : [0,1,2,3,4]}
element is: 0
element is: 1
element is: 2
element is: 3
element is: 4

The sample program is located in my GitHub repo here.
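For reference, here’s a minimal sketch of what that console app could look like. This is a rough reconstruction, not the exact sample from the repo: it assumes the Microsoft.Extensions.Configuration.AzureAppConfiguration package and a connection string in an environment variable (the variable name is my own invention), and it parses the value with System.Text.Json.

```csharp
using System;
using System.Text.Json;
using Microsoft.Extensions.Configuration;

class Program
{
    static void Main()
    {
        // Assumed environment variable name holding the App Configuration connection string.
        var connectionString =
            Environment.GetEnvironmentVariable("APP_CONFIG_CONNECTION_STRING");

        // Pull configuration from Azure App Configuration service.
        var config = new ConfigurationBuilder()
            .AddAzureAppConfiguration(connectionString)
            .Build();

        // The value stored under the key "foo" is a JSON string.
        string json = config["foo"];
        Console.WriteLine($"this is a test: {json}");

        // Parse the JSON string and walk the "value" array as ints.
        using var doc = JsonDocument.Parse(json);
        foreach (var element in doc.RootElement.GetProperty("value").EnumerateArray())
        {
            Console.WriteLine($"element is: {element.GetInt32()}");
        }
    }
}
```

The JSON parsing part works the same with or without the Content type set on the key — the content type is just metadata.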

Here are a couple of handy references:


New domain name servers

I just now created a DNS zone in the Azure portal to take care of my domain. Then I copied the name servers from Azure and updated the ones in GoDaddy. Didn’t take long at all.

Grabbed the name servers in the Azure Portal

And updated the ones on GoDaddy…
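If you’d rather do the same thing from the command line, the Azure CLI can create the zone and spit out the name servers to paste into GoDaddy. The resource group and zone names below are placeholders:

```shell
# Create the DNS zone (assumed resource group and domain names).
az network dns zone create --resource-group my-rg --name example.com

# Show the four Azure name servers to copy into the registrar.
az network dns zone show --resource-group my-rg --name example.com \
    --query nameServers --output tsv
```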

Big change of direction…

Over the past two weeks I’ve been doing a lot more AKS studies, which included this excellent MS Learn content. It walks you through creating a well-featured AKS cluster, soup to nuts. You owe it to yourself to take a look.

But I’ve also been doing a lot of studying of the Microsoft Power Platform, specifically Power Apps, Power Automate, and Power Virtual Agents. It’s a no-code/low-code environment, which sounds nice and easy, but the learning curve has been steeper than I expected. I recently found a bunch of YouTube videos that I’m working my way through.

Anyway, that’s been taking up a lot of my time. Soooooo, I decided to shift gears regarding this AKS project.

Miniblog is a nice engine, but all my content currently lives on my existing WordPress site, and I didn’t want to have to try to migrate all my old posts into Miniblog. So I used the Bitnami WordPress image, threw it into an AKS cluster, and pointed my old domain name at it. I then exported all my WordPress content from the old site and imported it into the new one. Worked like a charm. These instructions gave me a great head start. And I pointed Open Live Writer at it so I can compose from my laptop. Easy peasy.
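If you want to try the same move, the Bitnami WordPress Helm chart gets you most of the way there. Release and namespace names below are placeholders, and this assumes your kubectl context already points at the AKS cluster:

```shell
# Add the Bitnami chart repo and install WordPress into the cluster.
helm repo add bitnami https://charts.bitnami.com/bitnami
helm install my-blog bitnami/wordpress --namespace blog --create-namespace

# Grab the external IP of the LoadBalancer service to point DNS at.
kubectl get svc --namespace blog my-blog-wordpress \
    -o jsonpath='{.status.loadBalancer.ingress[0].ip}'
```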

I still need to . . .

  • [] create a post about my adventures in converting Miniblog into docker-able source code and putting it into an AKS cluster
  • [] create a post about creating the Azure DevOps pipeline from the Azure DevOps Service in the Azure portal – 5 minutes and done. Maybe a video would be better, since writing all the stuff down would take a lot longer.
  • [] and a few other odds and ends.

So, head on over and enjoy. . .

A new domain. . .

I mapped my old domain name to the app service plan. So there’s another item for the TODO list once I get external storage set up:

– [] Copy all the blog posts from the old site over to the new one
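Mapping the old domain onto the web app was a one-liner with the Azure CLI (the app and resource group names here are placeholders — substitute your own):

```shell
# Map a custom hostname onto an existing App Service web app.
az webapp config hostname add --webapp-name my-blog-app \
    --resource-group my-rg --hostname www.example.com
```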

Baby steps on the journey . . .

The site in its brand-new-baby form is alive and running in a Linux container-backed Azure website here. Again, just getting started with it, so nothing fancy at this point. I’ve learned a ton of stuff just getting this far: not just following some hands-on lab or quickstart tutorial, but having to look things up and try things out. Lots to blog about. I’ve been keeping a blog forever on WordPress, but I’m going to use the new site to blog about this process.

Also, I used the Azure Portal’s DevOps Project feature to create the site with a docker container backing it, Azure Container Registry, an Azure DevOps CI/CD pipeline linked to this GitHub repo, etc. It took only 5 minutes to stand it up by answering 5 simple questions. (I’ll eventually put the blog in an Azure Kubernetes Service cluster, but I have a bit more learning / work to do so I don’t leave it hackable. 🙂)

Update 2020.05.11 – I’m not going to keep the new domain name; I’m too cheap to pay that much every year. 🙂 I’ve decided to revive my old domain name. I also bought a custom domain name, which is where it lives. Current next steps are to create external storage for the blog so it doesn’t get blown away with each new container deployment. It’s currently using storage within the container, which isn’t good. As in, when I commit this change to master and the CI/CD pipeline runs, it’s going to blow away the container and anything I’ve posted on the site. 🙂 And I need to add an SSL cert to the site so it can do HTTPS…
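One way to fix the storage problem on a Linux web app is to mount an Azure Files share into the container so the content survives redeployments. A sketch with the Azure CLI — every name here is a placeholder, and the mount path assumes the default Bitnami/WordPress content location:

```shell
# Mount an Azure Files share into the web app's container so blog
# content persists across container deployments. All names assumed.
az webapp config storage-account add \
    --resource-group my-rg --name my-blog-app \
    --custom-id blogcontent --storage-type AzureFiles \
    --account-name mystorageacct --share-name blog-content \
    --access-key "$STORAGE_KEY" --mount-path /var/www/html
```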

The beginning of my journey to take a dotnet core app and stuff it into AKS. Properly.

The app is based on Miniblog.Core by Mads Kristensen.

The ultimate goal is to

  1. Run Miniblog.Core safely and securely in AKS, with all the proper network / security / health checks / secrets management / yada yada yada in place.
  2. Deploy updates to the engine in AKS through GitHub CI/CD pipelines (including container scans, etc.), using a blue/green pattern.
  3. The Docker image will be stored in the Azure Container Registry.
  4. And other things. . .

For now I’ll document my efforts along the way here in the MD file. Once I get the engine up & running in AKS I’ll start documenting things there.


I’m running Ubuntu in WSL (Windows Subsystem for Linux) on Windows 10.

I also have Docker Desktop. It includes the Docker Engine, CLI client, Kubernetes, etc.

And I’m using Linux containers, not Windows Containers.
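Given that setup, a first cut at containerizing the app might look roughly like the multi-stage Dockerfile below. The image tags and project path are my assumptions, not necessarily what the repo uses:

```dockerfile
# Build stage: restore and publish the app (project path assumed).
FROM mcr.microsoft.com/dotnet/core/sdk:3.1 AS build
WORKDIR /src
COPY . .
RUN dotnet publish src/Miniblog.Core.csproj -c Release -o /app/publish

# Runtime stage: small ASP.NET Core image, Linux container.
FROM mcr.microsoft.com/dotnet/core/aspnet:3.1
WORKDIR /app
COPY --from=build /app/publish .
EXPOSE 80
ENTRYPOINT ["dotnet", "Miniblog.Core.dll"]
```

The multi-stage build keeps the SDK out of the final image, which matters once the image starts going through container scans in the pipeline.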

The app

I cloned the Miniblog.Core repo from

I created a list of interesting links for Azure DevOps pipelines, GitHub repos…

There are a ton of DevOps best practices regarding people, process and tooling. Blog posts, books, videos, etc. DevOps is all about a transformative journey. It’s not just about CI/CD.

Even so, over the weekend I pulled together a list of best practices for Azure DevOps pipelines and GitHub repos. Some of it is overview info; some links contain hands-on walkthroughs. You might notice there’s a bit of a Kubernetes flavor sprinkled around. That’s because I’ve been spending some time coming up to speed. There’s a lot to learn in that space. Anyway, here’s what I have so far:

I’ve created a gist over on GitHub. I’ll post new items over there as I find them. . .

Azure Data Factory, Databricks, and encryption at rest

During a call with one of my clients, the question came up about encryption of data at rest for Data Factory and Databricks. The good news is that yep, the data is encrypted at rest.

Here are some references. . .

Data Factory Security:

Azure Data Factory does not store any data except for linked service credentials for cloud data stores, which are encrypted by using certificates.

Databricks uses Azure Data Lake Storage, which is encrypted at rest:

Azure Data Lake Storage:

Azure Data Lake Storage Gen2 is a set of capabilities dedicated to big data analytics, built on Azure Blob storage. Data Lake Storage Gen2 is the result of converging the capabilities of our two existing storage services, Azure Blob storage and Azure Data Lake Storage Gen1. Features from Azure Data Lake Storage Gen1, such as file system semantics, directory, and file level security and scale are combined with low-cost, tiered storage, high availability/disaster recovery capabilities from Azure Blob storage.