What happens when a bank starts “to play” with containers

A session at Incontro DevOps Italia 2017

The adoption of container technology, in its new incarnation from Docker Inc., continues to grow at rocket speed. Its success spans from small startups to large enterprises, since all recognize the benefits this technology brings. However, much like at the edge of a black hole or when travelling at the speed of light, when the IT department of a bank (or, more generally, of a large enterprise) approaches container technology, it faces dynamics far less linear than the concept of OS-level virtualization itself. The “players” in the game also include network and security policies, legacy workloads, shifting roles and responsibilities, operations management and, most importantly, integration with existing, mostly custom, dev/ops pipeline tools. IT has to deal with huge investments made in the past that do not point toward developers leveraging pre-baked images and sysadmins easily managing container deployments. This session presents a sort of prescriptive guide, based on experience gained from the real use case of a major bank, covering the points of attention and caveats that enterprise IT professionals may encounter when they start to fit containers into their application delivery processes and existing enterprise frameworks.



Date: Fri 10th March 2017
