In this scenario we’re going to create a serverless application. We will be using cloud services such as AWS API Gateway, Lambda, and DynamoDB, deploying them with Terraform.

The Lambda function will have both read and write access to the DynamoDB table.
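To make that concrete, here is a minimal sketch of what such a handler might look like behind API Gateway. The table name, key attribute, and event shape are assumptions for illustration, not the project’s actual configuration:

```python
import json
import boto3

# "example-items" and its "id" key are assumptions for illustration;
# the real names come from the Terraform configuration.
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("example-items")

def lambda_handler(event, context):
    # Write path: store the POSTed JSON body as an item.
    if event.get("httpMethod") == "POST":
        item = json.loads(event["body"])
        table.put_item(Item=item)
        return {"statusCode": 201, "body": json.dumps(item)}

    # Read path: look an item up by its primary key.
    item_id = (event.get("pathParameters") or {}).get("id")
    response = table.get_item(Key={"id": item_id})
    return {
        "statusCode": 200,
        "body": json.dumps(response.get("Item", {}), default=str),
    }
```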

Serverless computing is a cloud computing model in which a cloud provider automatically manages the provisioning and allocation of compute resources. This contrasts with traditional cloud computing where the user is responsible for directly managing virtual servers.

A few things needed:


In the tutorial that follows, you specify the VPC, subnets, and security groups when you create the DB instance. You also specify them when you create the EC2 instance to host your web server. The VPC, subnets, and security groups are required for the DB instance and the web server to communicate. After the VPC is set up, this tutorial shows you how to create the DB instance and install the web server. You connect your web server to your DB instance in the VPC using the DB instance endpoint.
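As a quick illustration of that last step, a web server script might reach the database like this (a hedged sketch; the endpoint, credentials, and database name are placeholders):

```python
import pymysql  # pip install pymysql

# All values below are placeholders: use the DB instance endpoint
# shown in the RDS console and your own credentials.
connection = pymysql.connect(
    host="mydb.abc123xyz.us-east-1.rds.amazonaws.com",  # hypothetical endpoint
    user="admin",
    password="your-password",
    database="sample",
    connect_timeout=5,
)

with connection.cursor() as cursor:
    cursor.execute("SELECT VERSION()")  # simple check that the DB is reachable
    print(cursor.fetchone())

connection.close()
```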

We are going to complete the following steps:


In this scenario we are going to create an AWS Lambda function in Python that automatically processes any JSON files uploaded to an S3 bucket into a DynamoDB table.

In DynamoDB I’ve gone ahead and created a table called “employees” with a primary key of employee ID. Yours can be named anything you like.
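Here is a minimal sketch of the kind of handler we’ll end up with, assuming the bucket’s put events trigger the function and each uploaded file holds one employee record or a list of them (the key attribute inside each record is whatever you chose above):

```python
import json
import urllib.parse
import boto3

s3 = boto3.client("s3")
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("employees")  # the table created above

def lambda_handler(event, context):
    # Each S3 put event can carry one or more records.
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

        # Fetch the uploaded JSON file and parse it.
        obj = s3.get_object(Bucket=bucket, Key=key)
        employees = json.loads(obj["Body"].read())

        # Accept either a single object or a list of them.
        if isinstance(employees, dict):
            employees = [employees]

        # Write each employee record into the table.
        for employee in employees:
            table.put_item(Item=employee)
```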


In this scenario, we have a directory that is very cluttered, and we need to organize it and back it up at the same time. We can do this by writing a script that checks a local folder for specific file extensions, backs them up, and moves them to their appropriate directories in S3.

First, we need a user that has API keys to access S3.
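With those keys configured locally (for example via aws configure), a minimal sketch of the script might look like this; the bucket name, source folder, and extension mapping are all illustrative:

```python
import os
import boto3

# All names below are illustrative; adjust to your own bucket and folders.
BUCKET = "my-backup-bucket"
SOURCE_DIR = "cluttered-folder"

# Map file extensions to their destination "directories" (key prefixes) in S3.
DESTINATIONS = {
    ".jpg": "images/",
    ".png": "images/",
    ".txt": "documents/",
    ".csv": "spreadsheets/",
}

s3 = boto3.client("s3")  # uses the API keys configured for our new user

for filename in os.listdir(SOURCE_DIR):
    _, ext = os.path.splitext(filename)
    prefix = DESTINATIONS.get(ext.lower())
    if prefix is None:
        continue  # skip extensions we aren't backing up
    path = os.path.join(SOURCE_DIR, filename)
    s3.upload_file(path, BUCKET, prefix + filename)
    print(f"Backed up {filename} to s3://{BUCKET}/{prefix}{filename}")
```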


I have uploaded my files to GitHub so you can follow along.

This is a scenario where we are going to host a static website from a Docker container. For a lot of people this simple exercise is their first interaction with Docker, or with containers in general.

A container is a runnable instance of an image. You can create, start, stop, move, or delete a container using the Docker API or CLI. You can connect a container to one or more networks, attach storage to it, or even create a new image based on its current state.
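You would typically drive this lifecycle with the CLI; purely to illustrate the same create/start/stop/delete operations programmatically, here is a minimal sketch using the Python Docker SDK (the image, port, and container name are assumptions):

```python
import docker  # pip install docker

client = docker.from_env()

# Create and start a container from an image, publishing port 80 on 8080.
container = client.containers.run(
    "nginx:alpine",
    detach=True,
    ports={"80/tcp": 8080},
    name="static-site",  # hypothetical name for this example
)

print(container.status)  # lifecycle state reported by the Docker API
container.stop()         # stop the running container...
container.remove()       # ...and delete it
```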


We’re going to spin up an Ubuntu Server 20.04 instance, install updates and all of the packages needed for Python 3. Lastly, we’re going to run a program to test that it works.
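As a preview, here is a minimal sketch that launches such an instance with boto3 and installs the Python 3 packages through user data; the AMI ID and key pair name are placeholders you would swap for your own:

```python
import boto3

# Shell commands EC2 runs on first boot: update and install Python 3 tooling.
USER_DATA = """#!/bin/bash
apt-get update -y && apt-get upgrade -y
apt-get install -y python3 python3-pip
python3 -c 'print("hello from EC2")' > /tmp/test.txt
"""

ec2 = boto3.client("ec2", region_name="us-east-1")
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder: an Ubuntu Server 20.04 AMI for your region
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
    KeyName="my-key-pair",  # hypothetical key pair for SSH access
    UserData=USER_DATA,
)
print(response["Instances"][0]["InstanceId"])
```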

I was a little rusty with AWS, having not used it for a couple of months. I wanted a quick little project to refresh my memory and had been reading a couple of different articles on the subject, so I decided to combine them into one project for fun. Follow along; it’s really quite simple to get running.

Two things we need:



I was recently asked about the database experience I’ve had. Realistically, I had completed a few projects spinning up RDS Aurora instances, but that was about it. I really wanted to demonstrate my flexibility and ability to adapt by using tools that are new to me. It was recommended that I run PostgreSQL in Docker and access it with DBeaver.

That’s exactly what I’m going to show you here today.
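As a taste of where we’re headed, here is a minimal sketch that starts PostgreSQL in a container using the Python Docker SDK (the version tag, container name, and password are placeholders, and the walkthrough itself may simply use docker run):

```python
import docker  # pip install docker

client = docker.from_env()

# Run PostgreSQL in a container, exposing it on the default port for DBeaver.
postgres = client.containers.run(
    "postgres:14",  # assumed tag; any recent version works
    detach=True,
    name="local-postgres",
    environment={"POSTGRES_PASSWORD": "example-password"},  # placeholder credential
    ports={"5432/tcp": 5432},
)
print(postgres.status)
# In DBeaver, connect to localhost:5432 as user "postgres" with the password above.
```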

We’re going to need a few things.


In this scenario, I am going to show you how to completely configure and deploy an AWS VPC with the aid of the powerful tool Terraform. Because it is Infrastructure as Code (IaC), this shows just how easy it can be to replicate resources.

Feel free to follow along; I have uploaded all the files to GitHub.

Here is the layout of the VPC we will be creating. It will host two EC2 instances: a web server in a public subnet and a database in a private subnet.
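The walkthrough builds all of this with Terraform. Purely to make the layout concrete, here is the same skeleton sketched with boto3 instead (not the article’s approach, and the CIDR blocks are illustrative):

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# The VPC itself, with room for both subnets.
vpc = ec2.create_vpc(CidrBlock="10.0.0.0/16")
vpc_id = vpc["Vpc"]["VpcId"]

# Public subnet for the web server, private subnet for the database.
public = ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.1.0/24")
private = ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.2.0/24")

# An internet gateway attached to the VPC is what makes the public subnet "public"
# (its route table sends 0.0.0.0/0 through the gateway; the private one does not).
igw = ec2.create_internet_gateway()
ec2.attach_internet_gateway(
    InternetGatewayId=igw["InternetGateway"]["InternetGatewayId"],
    VpcId=vpc_id,
)
```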


I have uploaded all my files to GitHub so you can follow along.

I am going to take you through the steps of setting up an environment for automated building, testing, and deployment, using containers and services hosted in the cloud.

What we need:


I have uploaded all the files to GitHub so you can follow along.

Docker Swarm is an open-source container orchestration platform and Docker’s native clustering engine. It allows you to manage multiple containers deployed across multiple host machines.

One of the key benefits of Docker Swarm is the high level of availability it offers applications. In a swarm there are multiple worker nodes and at least one manager node, which is responsible for allocating the worker nodes’ resources and ensuring that the cluster operates efficiently.
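To make the manager/worker split concrete, here is a minimal sketch using the Python Docker SDK; the advertise address, service name, and image are placeholders, and the walkthrough itself may use the plain docker CLI:

```python
import docker  # pip install docker

client = docker.from_env()

# Turn this host into a swarm manager.
client.swarm.init(advertise_addr="192.168.1.10")  # placeholder manager IP

# Print the token worker nodes use to join the cluster.
print(client.swarm.attrs["JoinTokens"]["Worker"])

# Deploy a replicated service: the swarm spreads three replicas across the nodes
# and reschedules them if a node goes down, which is the availability benefit above.
client.services.create(
    "nginx:alpine",
    name="web",
    mode=docker.types.ServiceMode("replicated", replicas=3),
)
```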

Let’s start setting up our cluster in…

Nick Rondeau

Breaking into the DevOps world one project at a time
