Organizations adopting containers and Kubernetes will end up running and maintaining multiple Kubernetes clusters: some serve as production, some as QA, while others are short-lived dev environments. Development teams often require third-party, off-the-shelf applications (aka cluster add-ons) such as Prometheus, Grafana, Elasticsearch, Kafka and Argo Workflows.

Kubernetes administrators are responsible for bootstrapping the add-ons required by the development teams. …

GitOps Tools

In this post, I am going to explain what GitOps and ArgoCD are and how to implement the GitOps deployment model in your local development environment with k3s/k3d and ArgoCD.
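At its core, the GitOps model with ArgoCD means declaring the desired state of your applications in a Git repository and letting ArgoCD continuously reconcile the cluster to that state. As a minimal sketch, assuming ArgoCD is already installed in an argocd namespace and using a placeholder repository URL, path and application name, a declarative Application resource looks like this:

# Register a Git repository path as an ArgoCD Application (repo URL and paths are placeholders).
kubectl apply -n argocd -f - <<EOF
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: demo-app
  namespace: argocd
spec:
  project: default
  source:
    repoURL: https://github.com/example/gitops-demo.git  # placeholder repository
    targetRevision: HEAD
    path: manifests/demo-app                             # placeholder path inside the repo
  destination:
    server: https://kubernetes.default.svc               # the cluster ArgoCD itself runs in
    namespace: demo
  syncPolicy:
    automated:
      prune: true      # remove resources that were deleted from Git
      selfHeal: true   # revert manual drift back to the state in Git
EOF

Once this resource exists, ArgoCD pulls the manifests from the repository and applies them to the target namespace, so deployments happen by committing to Git rather than by running kubectl against the cluster by hand.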

You may want a local development environment for deploying applications using the GitOps model for the following reasons:

  • Evaluate the GitOps practice in a local environment such as your personal laptop
  • Set up a production-like environment locally for development, testing and debugging
  • When it is difficult to provision Kubernetes clusters in your organization (authorizations, delays, etc.)
  • When your organization is about to implement Kubernetes but still…

A local dev environment for Argo Workflows helps us develop and test workflow configurations; however, setting up Argo Workflows can be a pain at times. To address this, we will look at how to set up a lightweight Kubernetes cluster using k3s and k3d and run Argo Workflows on our local machine.
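As a rough sketch, assuming a local k3d cluster is already running (cluster creation is shown further down) and with the release version left as a placeholder, installing Argo Workflows comes down to applying the project's install manifest into a dedicated namespace:

# Install Argo Workflows into its own namespace.
# Replace <version> with a release tag from the Argo Workflows releases page;
# the exact manifest name can differ between releases.
kubectl create namespace argo
kubectl apply -n argo -f https://github.com/argoproj/argo-workflows/releases/download/<version>/install.yaml

# Wait for the workflow controller and the Argo server to come up.
kubectl get pods -n argo

# Optionally expose the Argo UI locally (it listens on port 2746 by default).
kubectl -n argo port-forward svc/argo-server 2746:2746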

What is Argo Workflows?

Argo Workflows is an open source, container-native workflow engine for orchestrating parallel jobs on Kubernetes. It is implemented as a Kubernetes CRD (Custom Resource Definition).

  • Define workflows where each step in the workflow is a container (see the example below).
  • Model multi-step workflows as a sequence of tasks…
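To make the "each step is a container" idea concrete, here is the classic hello-world sketch: a Workflow with a single template that runs the docker/whalesay image. It is submitted with kubectl create because the manifest uses generateName rather than a fixed name:

# Submit a minimal Workflow: one step, one container.
kubectl create -n argo -f - <<EOF
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: hello-world-   # each submission gets a unique name
spec:
  entrypoint: whalesay         # the template to start with
  templates:
  - name: whalesay
    container:
      image: docker/whalesay
      command: [cowsay]
      args: ["hello world"]
EOF

# Watch the workflow run to completion.
kubectl get workflows -n argo --watch

Multi-step workflows build on the same idea: additional templates are chained together as steps or as a DAG, and each of them is still just a container.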

There are a lot of reasons why you might want your own personal Kubernetes cluster. A cluster on your development machine gives you fast iteration times in a production-like environment.

There are several options for setting up a Kubernetes cluster on a local machine for development or testing purposes. But with a full-blown Kubernetes cluster running on your local machine, you will soon hit a wall if you want to play with a multi-node cluster or multiple clusters on the same machine.

To address this issue, we will look at how to set up a lightweight Kubernetes cluster using k3s…
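As a minimal sketch (the cluster names and node counts below are arbitrary), k3d lets you create a multi-node cluster, and even a second cluster next to it, with a couple of commands:

# Create a cluster with one server and two agent nodes (k3d v4+ syntax).
k3d cluster create dev --servers 1 --agents 2

# A second, independent cluster can run on the same machine.
k3d cluster create staging --agents 1

# k3d adds a kubeconfig context per cluster (prefixed with "k3d-"); switch between them as needed.
kubectl config get-contexts
kubectl config use-context k3d-dev
kubectl get nodes

When you are done, k3d cluster delete dev tears the cluster down again, since every node is just a container on your machine.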

In the last blog, I explained observability, the difference between monitoring and observability, and why observability is important in a microservice-style architecture. In this blog, I will explain some strategies and best practices for implementing observability.

To recap, observability is the practice of instrumenting systems with tools that gather actionable data providing not only the when of an error or issue but, more importantly, the why. The latter is what teams need in order to respond quickly and resolve emergencies in modern software. …

Microservices are hard

In the last decade, we saw a significant shift in how modern, internet-scale applications are built. Cloud computing and containerization technologies like Docker enabled a new breed of distributed system designs commonly referred to as microservices. Large companies such as Uber, Google, Airbnb, Netflix and Twitter have leveraged microservices to build highly scalable and reliable systems and to deliver features faster to their customers. Many organizations are moving to microservices so that their developers can independently develop and deploy their services, without having to plan or coordinate their activities with other teams.

Despite the benefits and eager adoption…

Suren Raju

Site Reliability Engineer linkedin.com/in/surenraju/
