Tuesday 28th February, 2017
9:00am to 12:30pm
For decades, software engineers proudly shouted "It works on my machine!"
They handed their application packages over to an operations department that was condemned to make them run on staging and production systems.
With Docker's container-based approach, software engineers are now encouraged to engage in making their packages run anywhere, not just on their own machines.
Docker offers a model to package all components of an application, including its infrastructure, and distribute it across almost all major platforms.
While many organisations are still hesitant to put Docker into production,
it is usually easy to convince management to containerize the development infrastructure.
With container-based tools at hand, developers can run and debug a full-blown set of development tools on their local machines.
Build engineers can develop their build jobs locally, and QA engineers can manage their testing environments with very little hardware.
In this workshop we lay the groundwork with some Docker basics and then take a deep dive into containerizing popular components of a typical development tool chain.
We start by 'dockerizing' your favorite command-line tool, such as Gradle or AsciiDoctor.
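To give a flavour of what 'dockerizing' a command-line tool looks like, here is a minimal sketch using AsciiDoctor. It assumes the asciidoctor/docker-asciidoctor image from Docker Hub and a README.adoc in the current directory; adjust names to your own tool and files.

```shell
# Run AsciiDoctor from a container instead of installing it locally.
# The current directory is mounted into the container so the tool can
# read the source file and write the rendered HTML back to the host.
docker run --rm \
  -v "$(pwd)":/documents \
  asciidoctor/docker-asciidoctor \
  asciidoctor README.adoc
```

The same pattern (mount the working directory, run the tool, let the container be removed afterwards) works for Gradle and most other command-line tools.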
Next we will build a Jenkins master/agent build server using docker-machine and docker-compose and show how to scale agents as the build load rises.
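A minimal sketch of such a setup as a Compose file is shown below; the agent image name and its connection details are assumptions and will need to be adapted to your Jenkins version and agent configuration.

```yaml
# docker-compose.yml -- sketch of a Jenkins master/agent setup.
version: '2'
services:
  master:
    image: jenkins/jenkins:lts
    ports:
      - "8080:8080"    # web UI
      - "50000:50000"  # port for inbound (JNLP) agents
    volumes:
      - jenkins_home:/var/jenkins_home
  agent:
    image: jenkins/jnlp-slave          # era-specific agent image; an assumption
    environment:
      - JENKINS_URL=http://master:8080 # agents reach the master by service name
volumes:
  jenkins_home:
```

With a file like this in place, additional agents can be started on demand, e.g. with `docker-compose scale agent=4`.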
Finally we add build agents as Docker containers that are themselves able to build Docker images and run Docker containers (Docker-in-Docker, DinD, and Docker-outside-of-Docker, DooD).
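The DooD variant can be sketched in one command: instead of running a nested daemon, the container talks to the host's Docker daemon through the mounted socket. The official docker image is used here for illustration.

```shell
# "Docker outside of Docker" (DooD): mount the host's Docker socket
# into the container. The docker CLI inside the container then controls
# the host daemon, so containers it starts are siblings, not children.
docker run --rm \
  -v /var/run/docker.sock:/var/run/docker.sock \
  docker \
  docker ps
```

DinD, by contrast, runs a complete second Docker daemon inside the container, which isolates builds better but comes with its own caveats.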
To participate in this interactive workshop, please bring your own laptop with the latest docker-machine and docker-compose installed.
As Docker involves heavy network traffic, please also bring access to your own cloud account (something lightweight such as DigitalOcean, or whatever is familiar to you).
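Provisioning a workshop host on DigitalOcean with docker-machine looks roughly like this; the machine name is arbitrary, and the access token must come from your own account.

```shell
# Create a Docker host on DigitalOcean (requires a valid API token in
# the DIGITALOCEAN_ACCESS_TOKEN environment variable).
docker-machine create \
  --driver digitalocean \
  --digitalocean-access-token "$DIGITALOCEAN_ACCESS_TOKEN" \
  workshop-node

# Point the local docker CLI at the newly created remote host.
eval "$(docker-machine env workshop-node)"
```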