Javier Caceres – jacace

Javier's blog about C#, Software Architecture and Design

Quick Start with CHEF

Hello colleagues,

This is not a long Chef tutorial; it is simply a summary of the most important commands to get up and running with Chef.

Commands to create a recipe:

create directory <dir>, e.g.: c:\chef\cookbooks

knife cookbook create <cookbook_name>

knife cookbook test <cookbook_name>

knife cookbook upload <cookbook_name>

Commands to attach a recipe to a node and set up attribute values:

knife node run_list add <node_name> <cookbook_name>

knife exec -E 'nodes.transform("*:<node_name>") {|n| n.normal["<attribute_section_name>"] = { "label1" => "value1", "label2" => "value2" } }'

Commands to run recipes:

Locally: chef-client -o "recipe[<cookbook_name>::<recipe_name>]"

Remotely: knife ssh 'addresses:xxx.xxx.xxx.xxx' 'sudo chef-client -o "recipe[<cookbook_name>::<recipe_name>]"'



Javier Caceres


Review of Deep Learning Documentation

Recently I was invited to write some material about Artificial Intelligence, Machine Learning and Deep Learning. However, there are already several good resources on these topics, so in the interest of knowledge transfer, and to make your learning curve smoother, I have instead selected, filtered and summarized the most relevant information in this (short) blog post.

Let's start with some basic definitions [1]. Artificial Intelligence is an umbrella term for any program that can sense, reason, react and adapt. Machine Learning refers to algorithms whose performance improves over time as they are exposed to more data (ML is a subset of AI). Deep Learning is a subset of ML that specializes in multilayered neural networks, which can learn from vast amounts of data. There are different categories of ML problems, including supervised learning, unsupervised learning and reinforcement learning.

Training a model in ML means feeding data to the ML algorithm and iteratively improving the model to minimize errors and maximize accuracy, so that it can maneuver on unfamiliar terrain (i.e., generalize to data it has not seen before).
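A minimal sketch of that training loop in plain Python (the data set, the one-parameter model and the learning rate below are invented for illustration; they do not come from any of the cited material):

```python
# "Training" as an iterative loop: adjust a parameter to reduce the error
# measured on the data. Here the data follows y = 2*x, so the loop should
# drive the weight w toward 2.0.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # (input, expected output) pairs

w = 0.0    # model: prediction = w * x, starting from an untrained weight
lr = 0.05  # learning rate: how big a step to take on each iteration

for epoch in range(200):
    # gradient of the mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad  # step against the gradient to reduce the error

print(round(w, 3))  # ends up very close to 2.0
```

Each pass reduces the error a little; after enough iterations the model has "learned" the numeric dependency hidden in the data.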

Machine Learning tasks and problems can be classified by their output type [2]: in (statistical) classification, inputs (a.k.a. events) are grouped into classes or labels (e.g., spam filters), whereas in regression (analysis) the output is continuous because a numeric dependency is predicted/approximated (e.g., the price of a house). Classification and regression problems belong to the supervised learning category.

There are different types of artificial neural networks [3], including Multi-Layer Perceptrons (MLP), Convolutional Neural Networks (CNN) and Recurrent Neural Networks (RNN). There are also sub-types of recurrent networks (beyond the vanilla RNN), including Long Short-Term Memory (LSTM) and Gated Recurrent Units (GRU), amongst others. A practical example of an MLP could be an algorithm that takes an image of handwriting (e.g., digits) as input and translates it to its digital representation.
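To make the MLP idea concrete, here is a toy forward pass in Python: input "pixels" go through one hidden layer, and the output class with the highest score is the prediction. All sizes and weights below are made-up numbers for illustration, not a trained handwriting model:

```python
import math

def sigmoid(z):
    # squashing activation used by each hidden neuron
    return 1.0 / (1.0 + math.exp(-z))

def forward(pixels, w_hidden, w_out):
    # hidden layer: weighted sum of the inputs, passed through the activation
    hidden = [sigmoid(sum(w * p for w, p in zip(row, pixels))) for row in w_hidden]
    # output layer: one raw score per class
    scores = [sum(w * h for w, h in zip(row, hidden)) for row in w_out]
    return scores.index(max(scores))  # index of the predicted class

# 3 "pixels", 2 hidden units, 2 classes -- purely illustrative weights
w_hidden = [[0.5, -0.2, 0.1],
            [-0.3, 0.8, 0.4]]
w_out = [[1.0, -1.0],
         [-1.0, 1.0]]

print(forward([1.0, 0.0, 0.0], w_hidden, w_out))  # predicts class 0
```

A real handwriting recognizer works the same way, just with many more inputs (one per pixel), larger layers and weights obtained by training rather than by hand.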

Artificial neural networks are made of neurons, connections, weights and a learning rule/algorithm. The learning process involves selecting/defining a cost (loss) function, which maps the values of one or more variables to a real number representing the cost, and computing the gradient of that loss function (a gradient is a vector-valued function) via backpropagation.
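In standard textbook notation (not taken verbatim from the cited talks), the learning rule boils down to the gradient-descent update applied to every weight:

```latex
w_{ij} \leftarrow w_{ij} - \eta \, \frac{\partial L}{\partial w_{ij}}
```

where \(\eta\) is the learning rate and \(L\) is the loss; backpropagation is the application of the chain rule to compute \(\partial L / \partial w_{ij}\) efficiently, layer by layer, from the output back to the input.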


There are several current and future applications of AI/ML/DL, including content customization (social media), recommender systems for retail (incl. property recommenders for real estate), intelligent bots (incl. customer service and conversational commerce), gaming (to anticipate moves in a game), sports performance and investment portfolio management (see https://en.wikipedia.org/wiki/Algorithmic_trading).

Frameworks and Tools

There are multiple tools and frameworks; the most popular options are the following open source Python libraries: TensorFlow (by Google), Caffe (by UC Berkeley) and Theano (by Université de Montréal), among others. Intel has optimized these popular libraries to run much faster on Intel architecture [4], more specifically on Intel Xeon (codename Broadwell) and Intel Xeon Phi (codename Knights Landing).

Microsoft also offers platforms for Machine Learning (incl. services such as Cognitive Services, the Bot Framework and Azure Machine Learning, and infrastructure such as Azure Cosmos DB), solutions for end users (e.g., the Dynamics 365 AI-based solution for customer service) and frameworks (e.g., Visual Studio Code Tools for AI [5], the Microsoft Cognitive Toolkit and the Cortana Skills Kit). There are also benchmarks [6] suggesting that CNTK outperforms TensorFlow.


[1] https://software.intel.com/en-us/articles/how-to-get-started-as-a-developer-in-ai

[2] https://software.intel.com/en-us/videos/machine-learning-introduction-regression-and-classification

[3] https://software.intel.com/en-us/videos/deep-learning-102-webinar-introduction-to-neural-networks

[4] https://software.intel.com/en-us/articles/tensorflow-optimizations-on-modern-intel-architecture

[5] https://marketplace.visualstudio.com/items?itemName=ms-toolsai.vscode-ai

[6] https://docs.microsoft.com/en-us/cognitive-toolkit/reasons-to-switch-from-tensorflow-to-cntk



Javier Andrés Cáceres Alvis

Intel Black Belt Software Developer



Tools to lint Dockerfiles in VS and others

All Dockerfile linters are either command line tools or web pages, and they borrow their rules from Docker's "Best practices for writing Dockerfiles". I am listing here the most popular tools to lint Dockerfiles.

  1. hadolint: an open source linter written in Haskell. Rules are essentially defined in code, as seen in the Rule.hs source file here. There is also a fork of it here.
  2. dockerfile lint: this is the linter provided by Project Atomic. This linter runs on Node.js. Rules are defined via regular expressions, as seen here. This linter is based on the dockerfile checker, which is written in Python and also reads its rules from a YAML file called dockerfile_rules.yaml.
  3. Dockerlint: another Node.js linter. Its rules are written in CoffeeScript.
  4. Plug-ins for Visual Studio Code: Visual Studio Code plug-ins offer the richest developer experience. They offer autocomplete for Dockerfiles and Docker Compose files, plus syntax highlighting. The most complete plugin is Docker Support from Microsoft; there are other plug-ins, including Docker Linter and CodeLift (which requires an online account).
  5. Linter for dockerfile: this is a project from RedHat Labs. I didn't find any details about its internal workings.
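To illustrate the regular-expression approach that dockerfile lint takes, here is a minimal sketch in Python. The two rules below are made up for the example; the real rule sets live in each linter's own rule files:

```python
import re

# Each "rule" is a pattern plus a message, applied line by line -- a toy
# version of defining lint rules as regular expressions.
RULES = [
    (re.compile(r"^FROM\s+\S+:latest\s*$"), "avoid the mutable 'latest' tag"),
    (re.compile(r"^MAINTAINER\b"), "MAINTAINER is deprecated, use a LABEL"),
]

def lint(dockerfile_text):
    """Return (line_number, message) pairs for every rule violation."""
    problems = []
    for number, line in enumerate(dockerfile_text.splitlines(), start=1):
        for pattern, message in RULES:
            if pattern.search(line):
                problems.append((number, message))
    return problems

sample = "FROM ubuntu:latest\nMAINTAINER someone\nRUN apt-get update\n"
print(lint(sample))  # flags line 1 (latest tag) and line 2 (MAINTAINER)
```

Real linters add severity levels, rule IDs and exemptions on top of this basic match-and-report loop.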



Javier Andrés Cáceres Alvis

Microsoft Most Valuable Professional – MVP

Intel Black Belt Software Developer


Docker Tools for TFS

This blog entry collects my experiences trying out the 3 Docker extensions available in the Visual Studio Team Services marketplace.

  1. Docker Integration (1397 downloads): this extension is provided by Microsoft and enables users to build Docker images, run Docker images, or execute Docker commands. This plug-in is very similar to the Jenkins Docker Plugin and to other plug-ins in the sense that you need to provide the connection details of a Docker host to run the commands on it. The main difference is that this plug-in allows you to run Docker Compose files. This type of application is known as a multi-container Docker application, and they can be deployed to an existing cluster in Azure Container Services (ACS). One of the things that I don't like about this plugin is that it needs a build agent in addition to a Linux host, which seems like overkill to me.
  2. Docker build task (162 downloads): this open source plugin offers the same basic functionality as the Docker Integration plug-in, including the capability to build a Dockerfile and a feature to push images to a Docker registry. If you don't supply SSH connection details, the Docker commands will be executed locally. Another cool thing about this build task is that it provides a Docker image for a build agent.
  3. Container Security (34 downloads): this is a build step which scans and locks down container images by running a deep scan for vulnerabilities. I am not familiar with this tool beyond reading the plugin description, but it seems to check the components and packages referenced by the Dockerfile against a database of vulnerabilities and report any known vulnerability.


Javier Andrés Cáceres Alvis

Microsoft Most Valuable Professional – MVP

Intel Black Belt Software Developer

Openshift Extensions for Visual Studio and TFS

Click2cloud is the only company providing tools for Openshift in the Visual Studio Marketplace (formerly known as the Visual Studio Gallery). From the number of downloads, I can tell that these tools are not popular, and they differ very little from one another (in other words, they all seem to do the same thing).

  1. Click2Cloud Visual Studio 2012+ Extension for Red Hat OpenShift 2 (142 downloads): this plugin allows you to deploy Windows nodes (to run .NET apps) in Openshift. It targets Openshift 2.x, so I didn't look at it any further.
  2. Linux Cloud Tool for Visual Studio (252 downloads): this plugin provides similar functionality to the Openshift plug-in for Eclipse. In other words, you can create an application and configure the build/deployment config in Openshift. You can also complete the same tasks using the Openshift command line interface and/or the Openshift Web UI.
  3. Click2Cloud Docker Extension for Visual Studio (156 downloads): this plug-in allows you to connect to and pull images from Docker registries. It also allows you to connect to a remote Docker host to run containers. You can also deploy containers to Openshift 3. This plugin is similar to the "Docker Tools plug-in 2.1 for Eclipse".
  4. Click2Cloud Container Extensions for Visual Studio (566 downloads): this plugin provides functionality similar to the Openshift plug-in for Eclipse plus the Docker Tools plug-in 2.1 for Eclipse. This means that you can deploy apps to Openshift from Visual Studio, and you can connect to and pull images from Docker registries.

After briefly reviewing these plugins, the Click2Cloud Container Extensions for Visual Studio seems worth reviewing further, because it is the most complete and the most popular (in relative figures).

I also reviewed the Visual Studio Team Services marketplace and found only one plugin for Openshift, called "Openshift Deploy Tools" by Almatoolbox. This free plugin adds two build steps to the build definition. These build steps allow you to invoke a new build in Openshift and to tag an image on a different target repo (a.k.a. promotion). This plugin simply wraps Openshift's oc commands in PowerShell scripts and requires you to install the Openshift CLI.

Jenkins plugins for Openshift

I searched for Openshift plugins and found the ones listed below. Generally speaking, these plugins allow you to do the same as the Openshift API, they're not very popular (based on the number of downloads), and they lack documentation (except for the Deployer plugin).

a) OpenShift Login Plugin: 0 downloads
b) OpenShift Sync Plugin: 160+ downloads
c) OpenShift Pipeline Plugin: 250+ downloads
d) Openshift Deployer plugin: 120+ downloads
I installed the first two (a and b) but didn't see any additional build steps, or anything additional in the post-build steps.
c) The Pipeline plug-in added the Build Trigger option "OpenShift Jenkins Pipeline Builder" and the following build steps:
  1. Create Resource
  2. Delete Resource (there are multiple options here, you can delete by key, by YAML/JSON file or by specifying a Label)
  3. Exec (to open a shell connection to a pod)
  4. Tag image
  5. Cancel Build
  6. Trigger Build
  7. Trigger Deployment
  8. Verify Build
  9. Verify Deployment
  10. Verify Service
  11. Scale Deployment (also available as a post build step)


d) The Deployer plug-in, on the other hand, added the "Deploy" and "Delete" Openshift application build steps.


In terms of scripting, if you're not familiar with the syntax for Openshift pipelines, you can simply click on "Pipeline Syntax" to generate the pipeline script aided by the UI.

Different pipelines have different options; the only ones common to all of them are the cluster API URL, the authorization token and the project name (these were, for example, the only options required to tag my sample app image).


OpenShift 3.3

On a side note, Openshift 3.3 was recently released (Sept 2016) and pipelines were integrated into Openshift. This means that 'the same OpenShift build steps are available in the classic "freestyle" jobs as Jenkins Pipeline DSL methods'. Another scenario for Openshift is to use it in conjunction with the Kubernetes Jenkins plugin to dynamically provision slave instances, as demonstrated here. I tried to install this plug-in, but it didn't come up in the list of available plug-ins.


Javier Andrés Cáceres Alvis

Microsoft Most Valuable Professional – MVP

Intel Black Belt Software Developer




Jenkins Plugins for Docker

There are more than a dozen Jenkins plug-ins for Docker. This blog post reviews the most popular and useful plugins available for Docker including the Docker Plugin, the Docker Pipeline Plugin, the Docker build step plugin and the CloudBees Docker Build and Publish plugin. You can install any plug-in from Home > Manage Jenkins > Manage Plugins > Available and then you can filter using the provided textbox.

a) The Docker Plugin is a plugin to "dynamically provision a slave, run a single build, then tear-down that slave". The only drawback of this solution is that you need to define the Docker hosts in advance.

  1. The Docker hosts are defined under Manage Jenkins > Configure System > Cloud > Add a new cloud > Docker.
  2. (screenshots: adding the cloud and configuring the Docker hosts)
  3. You can have multiple "Clouds" (the most important parameter is the container cap). Then you will need to set up the base image that will be used to spin up new containers. The image suggested by the plugin author is "evarga/jenkins-slave", but you can use any image as long as it has a JDK to build apps and Jenkins can connect to it.
  4. Once your Cloud is set up, you can create a free style project and add a Docker build step.
  5. (screenshot: using the cloud in a free style project)
  6. Last but not least, you can modify the project properties to commit and push the resulting image on the successful completion of the job.
  7. (screenshot: pushing the image on job completion)


b) Docker Pipeline plug-in: if you feel more comfortable with scripting (i.e., with creating a Jenkins project of type pipeline), then this plugin allows you to write code to pull an image, create a container and install packages into it, as seen in the code snippet below.

docker.image('maven:3.3.3-jdk-8').inside {
  git '…your-sources…'
  sh 'mvn -B clean install'
}

c) Docker build step: this plugin adds a plethora of Docker command options, as seen in the image below. These options simply eliminate the need to write a script to execute the same commands manually.


d) CloudBees Docker Build and Publish plugin: this is a simple plugin that accesses a Docker host to build and publish a Docker image (this is very similar to the Docker Plugin, step 7). This plugin needs a Dockerfile.


This build step is very similar to the “Docker Plugin” build step seen below:



In summary: the "Docker build step" plugin and the "CloudBees Docker Build and Publish" plugin don't add much value, since this functionality is already included in the "Docker Plugin".

The "Docker Plugin" allows you to set up multiple Docker hosts and is more flexible in terms of distributing the workload and setting up the host/image parameters. The "Docker Pipeline Plugin" is very flexible and ideal for customizing your build. It would be cool to have a mix of these two plug-ins.


Javier Andrés Cáceres Alvis

Microsoft Most Valuable Professional – MVP

Intel Black Belt Software Developer


My presentation on the Docker Global Mentor Week 2016 – Dublin

Last week I presented the session "Developing Web Apps in the Net Core For Docker" at the Docker Global Mentor Week 2016 – Dublin. This is the material I used in the presentation:

And this is the link that I referenced in the talk.

With special thanks to the attendees, I want to share the following awesome pics:


Thanks for reading,

Javier Andrés Cáceres Alvis

Microsoft Most Valuable Professional – MVP

Intel Black Belt Software Developer

Openshift plug-in for Eclipse

I recently reviewed the official Eclipse openshift plug-in from RedHat.

To get this plug-in, I searched for "openshift" in the Eclipse Marketplace, then selected "JBoss Tools 4.2.3" and clicked Install. Once installed, I saw an option to create new apps.

To create a new app, you will need to enter the Openshift server URL, the username and the password (picture #2).

Next, in step #3, the "New Openshift Application" wizard will allow you to select a template to build your application on. A template is a Docker image with a set of variables and a middleware stack. In my example I used the Tomcat template.


In the next step you need to enter the git repo (incl. the build triggers) and select the ports (image #4 above).

One of the things that I didn’t like about the plug-in is that it requires the Openshift command line utility and that it clones the code locally (in your dev machine).

After completing the new application wizard, your application is built and deployed to Openshift. You can browse your project and update the build and deployment configuration by editing the YAML (image #8).


Javier Andrés Cáceres Alvis

Microsoft Most Valuable Professional – MVP

Intel Black Belt Software Developer


