How to use Ansible to provision an EC2 instance with an app running in a Docker container

I created this suite of Ansible playbooks to provision a basic AWS (Amazon Web Services) infrastructure on EC2 with a Staging instance, and to deploy a webapp to the Staging instance, where it runs in a Docker container pulled from Docker Hub.

First, a Docker image is built locally and pushed to a private Docker Hub repository. The EC2 SSH key and Security Groups are then created, and a Staging instance is provisioned. Next, the Docker image is pulled on the Staging instance and a Docker container is started from it, with nginx set up on the instance to proxy web requests to the container. Finally, a DNS entry for the Staging instance is added in Route 53.
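As a rough sketch of how the provision-and-deploy steps can be expressed in Ansible (assuming the amazon.aws and community.docker collections; the host names, AMI ID, region and image name below are placeholders of mine, not the values used in the actual playbooks):

```yaml
# Illustrative sketch only: provision a Staging EC2 instance, then run the webapp container on it.
# All names, the AMI ID, region and Docker Hub repository are placeholders.
- name: Provision the Staging instance
  hosts: localhost
  connection: local
  gather_facts: false
  tasks:
    - name: Launch the Staging EC2 instance
      amazon.aws.ec2_instance:
        name: staging
        key_name: staging-key              # SSH key created in an earlier playbook
        security_group: staging-sg         # Security Group created in an earlier playbook
        instance_type: t3.micro
        image_id: ami-0123456789abcdef0    # placeholder AMI
        region: eu-west-1
        wait: true
      register: staging

    - name: Add the new instance to the in-memory inventory
      ansible.builtin.add_host:
        name: "{{ staging.instances[0].public_ip_address }}"
        groups: staging
        ansible_user: ubuntu

- name: Deploy the webapp container on the Staging instance
  hosts: staging
  become: true
  tasks:
    - name: Pull the image from the private Docker Hub repository
      community.docker.docker_image:
        name: myorg/mywebapp:latest        # placeholder repository
        source: pull

    - name: Start the container (nginx on the host proxies requests to this port)
      community.docker.docker_container:
        name: mywebapp
        image: myorg/mywebapp:latest
        state: started
        restart_policy: always
        published_ports:
          - "8080:80"
```

The first play talks to the AWS API from the control machine, while the second runs on the newly provisioned instance, mirroring the provision-then-deploy split described above; the nginx proxy and Route 53 steps would follow as separate plays.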

This is a simple Ansible framework to serve as a basis for building Docker images for your webapp and deploying them as containers on Amazon EC2. It can be expanded in multiple ways, the most obvious being to add an auto-scaled Production environment with Docker containers and a load balancer. (For Ansible playbooks suitable for provisioning an auto-scaled Production environment, check out my previous article and associated files “How to use Ansible for automated AWS provisioning”.) More complex apps could also be split across multiple Docker containers to handle front-end and back-end components separately, and that could be added as needed.

Continue reading “How to use Ansible to provision an EC2 instance with an app running in a Docker container”

How to automate provisioning and deployment of RabbitMQ with cert-manager on a Kubernetes cluster in GKE within GCP

I was brought in by a startup to set up their core infrastructure in a way that functioned as needed and could be automated for safe and efficient provisioning and deployment. The key requirement was making RabbitMQ accept only secure certificate-based connections – the AMQPS protocol rather than AMQP – for security and compliance purposes. This needed to be done within a Kubernetes cluster for storage and shared state via StatefulSets, ease of scaling and deployment, and general flexibility. It also needed to run on GCP (Google Cloud Platform), which the startup already used and did not want to move away from at this stage, so GKE (Google Kubernetes Engine) was used for the Kubernetes cluster.
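To give a concrete idea of what “AMQPS only” means in practice, here is a minimal sketch of a RabbitMQ configuration that disables the plain AMQP listener and enables only the TLS listener, assuming the cert-manager-issued certificate secret is mounted into the StatefulSet pods at /etc/rabbitmq/certs (the namespace, names and paths are illustrative, not necessarily those used in my setup):

```yaml
# Illustrative ConfigMap sketch: rabbitmq.conf with only the TLS (AMQPS) listener enabled.
# Certificate paths assume a cert-manager-issued TLS secret mounted at /etc/rabbitmq/certs.
apiVersion: v1
kind: ConfigMap
metadata:
  name: rabbitmq-config
  namespace: rabbitmq
data:
  rabbitmq.conf: |
    listeners.tcp = none                   # disable plain AMQP entirely
    listeners.ssl.default = 5671           # AMQPS
    ssl_options.cacertfile = /etc/rabbitmq/certs/ca.crt
    ssl_options.certfile   = /etc/rabbitmq/certs/tls.crt
    ssl_options.keyfile    = /etc/rabbitmq/certs/tls.key
    ssl_options.verify     = verify_peer
    ssl_options.fail_if_no_peer_cert = false
```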

Getting certificates for RabbitMQ within Kubernetes required setting up cert-manager for certificate management, which in turn needed ingress-nginx to allow the incoming connections used for Let’s Encrypt verification, so that certificates could be issued.
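For reference, a typical cert-manager issuer for this kind of setup looks roughly like the following, using Let’s Encrypt’s ACME endpoint with HTTP-01 validation routed through ingress-nginx (the issuer name and email address are placeholders; the actual manifests are in the repository):

```yaml
# Illustrative ClusterIssuer sketch: Let's Encrypt with HTTP-01 challenges served via ingress-nginx.
apiVersion: cert-manager.io/v1
kind: ClusterIssuer
metadata:
  name: letsencrypt-prod
spec:
  acme:
    server: https://acme-v02.api.letsencrypt.org/directory
    email: admin@example.com               # placeholder contact address
    privateKeySecretRef:
      name: letsencrypt-prod-account-key
    solvers:
      - http01:
          ingress:
            class: nginx
```

A Certificate resource referencing this issuer then produces the TLS secret that the RabbitMQ pods can mount.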

I successfully solved the problems and fulfilled the requirements, although it’s still a “work in progress” to some extent. Some of the config is a little “rough and ready” and could be improved with more modularisation and better use of variables and secrets. Also, while the initial cluster provisioning is fully automated with Terraform, the rest is currently only semi-automated, so there is room for further improvement.

All the code and documentation are available in my GitHub repository. Below I will explain the whole process from start to finish.

Continue reading “How to automate provisioning and deployment of RabbitMQ with cert-manager on a Kubernetes cluster in GKE within GCP”