Problem statement

  • Use Azure Container Registry for deployment.
  • Provision using Terraform.


Prerequisite :

  • Azure CLI installed and configured in the OS


First, we need to create a Docker image and push it to the Azure Container Registry; after that, we write the Terraform code for provisioning.

Step-1)Create and push an image to the Azure Container Registry
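A minimal sketch of this step with the Azure CLI and Docker — the resource group (rg-demo), registry name (myacrdemo), and image tag (myapp:v1) are hypothetical placeholders, so substitute your own:

```shell
# Create a resource group and a Basic-tier container registry
# (rg-demo / myacrdemo are placeholder names).
az group create --name rg-demo --location eastus
az acr create --resource-group rg-demo --name myacrdemo --sku Basic

# Log Docker in to the registry, then build, tag, and push the image.
az acr login --name myacrdemo
docker build -t myapp:v1 .
docker tag myapp:v1 myacrdemo.azurecr.io/myapp:v1
docker push myacrdemo.azurecr.io/myapp:v1
```

The registry login server always follows the `<registry-name>.azurecr.io` pattern, which is why the tag must include it before the push.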

Step-2)Write Terraform Code
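One way the Terraform code can look — a hedged sketch that runs the pushed image as an Azure Container Instance; the names (rg-demo, myacrdemo, myapp:v1) carry over from the placeholder names used while pushing, and the credentials are passed in as variables:

```hcl
# Sketch only: resource group, registry, and image names are placeholders.
variable "acr_username" {}
variable "acr_password" {}

provider "azurerm" {
  features {}
}

resource "azurerm_container_group" "app" {
  name                = "myapp-aci"
  location            = "eastus"
  resource_group_name = "rg-demo"
  ip_address_type     = "Public"
  os_type             = "Linux"

  # Credentials so ACI can pull from the private registry.
  image_registry_credential {
    server   = "myacrdemo.azurecr.io"
    username = var.acr_username
    password = var.acr_password
  }

  container {
    name   = "myapp"
    image  = "myacrdemo.azurecr.io/myapp:v1"
    cpu    = "0.5"
    memory = "1.0"

    ports {
      port     = 80
      protocol = "TCP"
    }
  }
}
```

After `terraform init` and `terraform apply`, the container group's public IP serves the deployed image.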

Problem Statement:

  • For multi-cloud, I am using AWS and Azure.
  • AWS for 1 master and 1 node.
  • Azure for 1 node.

Solution: First, we need to launch instances in AWS and Azure; for provisioning, I am using Terraform.


Prerequisite :

  • Have an Azure account
  • AWS CLIv2 installed and configured
  • Azure CLI installed and configured

Step-1)Launching Instances in AWS and Azure
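The AWS half of this step can be sketched in Terraform like this — the AMI id is left as a variable and the key name reuses the `Key_Name` placeholder from later steps, so treat both as assumptions:

```hcl
# Hedged sketch: one master and one node in AWS.
# var.ami_id and the key name are placeholders.
variable "ami_id" {}

provider "aws" {
  region = "ap-south-1"
}

resource "aws_instance" "master" {
  ami           = var.ami_id
  instance_type = "t2.micro"
  key_name      = "Key_Name"
  tags          = { Name = "k8s-master" }
}

resource "aws_instance" "node" {
  ami           = var.ami_id
  instance_type = "t2.micro"
  key_name      = "Key_Name"
  tags          = { Name = "k8s-node-aws" }
}
```

The Azure node is provisioned similarly with the `azurerm` provider's `azurerm_linux_virtual_machine` resource, plus the virtual network, subnet, and network interface resources it depends on.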


To solve this, I am showing two methods: the first is manual and the second is automated. By the way, I prefer automation 😁.

First Method (Manual)

  • Pull the CentOS image:
docker pull centos
  • Start the Docker container, sharing the host's network:
docker run --net=host -it --name c1 centos
  • Install one GUI application; for example, I am installing Firefox.
  • After the installation, let's start Firefox.
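Inside the container, the install-and-launch part looks roughly like this — assuming the host is running an X server on display :0 (both the display number and the need for `xhost` are assumptions about your setup):

```shell
# Inside the c1 container (it shares the host's network via --net=host).
yum install -y firefox

# Point the container's GUI at the host's X server. :0 is an assumption;
# the host may also need to run `xhost +local:` to accept the connection.
export DISPLAY=:0
firefox &
```

Because the container shares the host network, the X11 connection works without publishing any ports.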

Problem Statement

  • Create roles that will configure the master node and slave nodes separately.
  • Launch a WordPress pod and a MySQL database connected to it in the respective slaves.
  • Expose the WordPress pod so that the client is able to hit the WordPress IP on its respective port.


  • For WordPress and MySQL, we need to create YAML files for k8s and deploy them.
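Those YAML files can be sketched along these lines — the passwords, database name, and image tags are placeholders, and the NodePort service is what lets a client hit WordPress on a node IP:

```yaml
# Hedged sketch: credentials and names below are illustrative placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: mysql
spec:
  replicas: 1
  selector:
    matchLabels:
      app: mysql
  template:
    metadata:
      labels:
        app: mysql
    spec:
      containers:
      - name: mysql
        image: mysql:5.7
        env:
        - name: MYSQL_ROOT_PASSWORD
          value: "redhat"          # placeholder password
        - name: MYSQL_DATABASE
          value: "wpdb"
---
apiVersion: v1
kind: Service
metadata:
  name: mysql                      # WordPress reaches MySQL by this name
spec:
  selector:
    app: mysql
  ports:
  - port: 3306
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: wordpress
spec:
  replicas: 1
  selector:
    matchLabels:
      app: wordpress
  template:
    metadata:
      labels:
        app: wordpress
    spec:
      containers:
      - name: wordpress
        image: wordpress:5.8
        env:
        - name: WORDPRESS_DB_HOST
          value: "mysql"
        - name: WORDPRESS_DB_USER
          value: "root"
        - name: WORDPRESS_DB_PASSWORD
          value: "redhat"          # must match MYSQL_ROOT_PASSWORD above
        - name: WORDPRESS_DB_NAME
          value: "wpdb"
---
apiVersion: v1
kind: Service
metadata:
  name: wordpress
spec:
  type: NodePort                   # exposes WordPress on every node's IP
  selector:
    app: wordpress
  ports:
  - port: 80
```

Apply with `kubectl apply -f <file>.yml`, then `kubectl get svc wordpress` shows the assigned node port to hit from the client.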

Step-1)Launch EC2 instances on AWS Cloud and configure a k8s multi-node cluster

ansible-galaxy collection install rootritesh.k8s_cluster
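Once the collection is installed, a playbook along these lines can apply it to the hosts — the role names shown are illustrative assumptions, so check the collection's own documentation for the actual names it ships:

```yaml
# Illustrative only: the role names inside rootritesh.k8s_cluster may differ.
- hosts: master
  roles:
    - rootritesh.k8s_cluster.k8s_master   # hypothetical role name

- hosts: slave
  roles:
    - rootritesh.k8s_cluster.k8s_slave    # hypothetical role name
```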

Learn More:

  • We only need to copy the files from local…

Problem Statement

Ansible Roles to Configure K8S Multi-Node Cluster over AWS Cloud.


For this problem statement, I am using an Ansible collection. In this collection, I am creating 3 roles: one for EC2, the second for the Kubernetes master, and the third for the Kubernetes slave 🙂.
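The skeleton for the collection and its three roles can be scaffolded like this (the collection namespace and role names follow the ones described above):

```shell
# Create the collection skeleton, then scaffold the three roles inside it.
ansible-galaxy collection init rootritesh.k8s_cluster
cd rootritesh/k8s_cluster/roles
ansible-galaxy role init ec2
ansible-galaxy role init k8s_master
ansible-galaxy role init k8s_slave
```

Each `role init` creates the standard tasks/, vars/, handlers/, and templates/ layout, so the configuration for each node type stays separated.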


Prerequisite :

  • Dynamic Inventory configured.
  • AWS CLI configured.

Problem Statement:

  • Use the multi-threading concept to send and receive data in parallel from both the server sides. Observe the challenges that you face in achieving this using UDP.

For this Problem Statement, we need two OSes 😅.

Python Code:
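The full chat program needs the real IPs of the two OSes; as a minimal local sketch of the multi-threading part (assuming a single machine and the loopback address instead of two systems), one thread can wait for a datagram while another sends one:

```python
import socket
import threading

# Local sketch: in the real exercise the sender and receiver run on two
# different OSes; here both ends use 127.0.0.1 so one machine can
# demonstrate the parallel send/receive idea.

def receive_one(sock, results):
    # Blocks in its own thread until a single datagram arrives.
    data, addr = sock.recvfrom(1024)
    results.append(data.decode())

def send_one(message, target):
    # Fire-and-forget: UDP gives no delivery guarantee or acknowledgement.
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(message.encode(), target)

recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.bind(("127.0.0.1", 0))      # port 0 = let the OS pick a free port
target = recv_sock.getsockname()

results = []
t_recv = threading.Thread(target=receive_one, args=(recv_sock, results))
t_send = threading.Thread(target=send_one, args=("hello", target))

# Both threads run in parallel: one waits for data while the other sends.
t_recv.start()
t_send.start()
t_recv.join()
t_send.join()
recv_sock.close()

print(results)
```

The UDP-specific challenge to observe: unlike TCP, nothing tells the sender whether the datagram arrived, so message loss and ordering have to be handled (or tolerated) by the application itself.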


Prerequisite :

  • Inventory configured
  • AWS CLIv2 installed and Configured

Step-1)Configuring Dynamic inventory

  • For the dynamic inventory, download the ec2.py and ec2.ini scripts from the given URL, and paste them into the inventory folder:
  • Now add this path in ansible.cfg.
  • After that, you also need to copy the key.pem used to launch the ec2 instances.
  • After copying your key, restrict its permissions with the following command:
chmod 600 Key_Name.pem
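The resulting ansible.cfg can look roughly like this — the inventory path, remote user, and key name are placeholders for whatever your setup uses:

```ini
# Sketch of ansible.cfg; paths, user, and key name are placeholders.
[defaults]
inventory         = /root/inventory        ; folder holding ec2.py / ec2.ini
remote_user       = ec2-user
private_key_file  = /root/Key_Name.pem
host_key_checking = False
```

Note that ec2.py itself must be executable (`chmod +x ec2.py`) and your AWS credentials must be available in the environment for the dynamic inventory to query EC2.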



First, we need to delete all the IP addresses from the routing tables of all the OSes. After that, add System A's IP to the routing tables of System B and System C; at last, add the IPs of System B and System C to System A's routing table. Huh, that's simple 😅.

Step-1)Delete all the rules from all the Systems:

route del -net <network> netmask <netmask> gw <gateway> dev enp0s3 (NIC card)
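After the cleanup, the host routes described above can be added like this — the three addresses are hypothetical examples (A=192.168.1.10, B=192.168.1.20, C=192.168.1.30), all assumed reachable through the enp0s3 NIC:

```shell
# On System B and on System C: allow traffic only to System A.
route add -host 192.168.1.10 dev enp0s3

# On System A: allow traffic to both System B and System C.
route add -host 192.168.1.20 dev enp0s3
route add -host 192.168.1.30 dev enp0s3
```

With only these host routes present, B and C can each talk to A but not to each other.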

Prerequisite :

  • Inventory configured.

Step-1)Retrieve the distro.

  • After we retrieve the distro, we need to make variable files respective to the distro, like — RedHat-8.yml, Ubuntu-16.yml, etc.
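A quick way to see the distro facts Ansible gathers for each host is an ad-hoc call to the setup module:

```shell
# Shows ansible_distribution, ansible_distribution_major_version, etc.
ansible all -m setup -a "filter=ansible_distribution*"
```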

Step-2)Writing Playbook
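The playbook can then load the matching variable file per host — a sketch, where `pkg_name` is an illustrative variable assumed to be defined inside each distro .yml (e.g. RedHat-8.yml, Ubuntu-16.yml):

```yaml
# Sketch: pick the vars file from the host's own distro facts.
- hosts: all
  tasks:
    - name: Load distro-specific variables
      include_vars: "{{ ansible_distribution }}-{{ ansible_distribution_major_version }}.yml"

    - name: Install the package named in the distro file
      package:
        name: "{{ pkg_name }}"      # illustrative variable from the vars file
        state: present
```

Using the generic `package` module keeps the tasks themselves distro-agnostic; only the variable files differ.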

Prerequisite :

✅ Have an AWS account

Step-1)Launch ec2 instance

  • Go to Services > EC2 > Launch EC2 instance.
  • Select the Amazon Linux 2 AMI.
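The same launch can be done from AWS CLIv2 — the AMI id below is a placeholder (look up the current Amazon Linux 2 AMI id for your region) and the key name is assumed:

```shell
# Placeholders: substitute a real Amazon Linux 2 AMI id and your key pair.
aws ec2 run-instances \
    --image-id ami-xxxxxxxxxxxxxxxxx \
    --instance-type t2.micro \
    --key-name Key_Name \
    --count 1
```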

Ritesh Singh

DevOps/Cloud Enthusiast
