
DAPR over K8S

This project explains how to install Dapr on Kubernetes (K8S) with Ansible and Helm.

DAPR Overview

Dapr is a middleware framework for cloud stacks. It helps developers create and use resilient, secured microservices, mainly on a Kubernetes infrastructure[1]. Dapr, short for Distributed Application Runtime, provides several building blocks for developers, integrators, and operators who want to run their applications on a Kubernetes cluster. The main idea of Dapr is to wrap and abstract all third-party components behind a standardized API. Third-party providers can easily integrate their components into the Dapr stack without any modification to business application code.

At the time of writing, the stable version of Dapr.io is 1.8.4 (2022-08-22).

Building blocks

Dapr brings a lot of useful features, such as:

  • Service-to-service invocation

    • Service call mechanism

  • State management

    • A state store for long-lived data

  • Publish and subscribe

    • A pub/sub solution over Kafka or another message-oriented middleware (MoM)

  • Secrets

    • Keeps sensitive information in a secret store

  • Configuration

    • Manage properties/values/parameters for client applications

API

Because Dapr abstracts all backends behind an API, it provides two kinds of API.

  • REST API

    • Building blocks are exposed through a REST API; applications can call components directly through it

  • Code API

    • Depending on the application language, Dapr supplies several client libraries to talk to the REST API from inside the application.
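As an illustration of the REST API, a Dapr sidecar listens on HTTP port 3500 by default; a state-store call could look like the sketch below. The component name `statestore` and the key/value payload are assumptions for the example, not part of this project.

```shell
# Save a key/value pair through the local Dapr sidecar (default HTTP port 3500).
# "statestore" is a hypothetical component name; adjust it to your configuration.
curl -X POST http://localhost:3500/v1.0/state/statestore \
  -H "Content-Type: application/json" \
  -d '[{"key": "order_1", "value": {"status": "pending"}}]'

# Read the value back through the same abstraction, regardless of the backend store.
curl http://localhost:3500/v1.0/state/statestore/order_1
```

The application only ever talks to `localhost:3500`; swapping the backing store (Redis, Cosmos DB, etc.) is a Dapr component configuration change, not a code change.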

HELM

Helm is the package manager for software running on Kubernetes. It provides the same user experience as yum or apt, but focused on Kubernetes technologies. Like Kubernetes, Helm is YAML-first. Developers can create powerful templates for their applications or solutions. When an operator needs to install a solution, Helm merges the Helm package (chart) with specific values (environment values, K8S cluster values, package values, business values).
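As a sketch of what the playbooks automate, a manual Helm installation of the Dapr chart could look like this (the `my-values.yaml` file is a hypothetical environment-specific override):

```shell
# Add the Dapr chart repository, then install the chart into dapr-system,
# merging the chart defaults with environment-specific values.
helm repo add dapr https://dapr.github.io/helm-charts/
helm repo update
helm install dapr dapr/dapr \
  --namespace dapr-system --create-namespace \
  --values my-values.yaml   # assumed override file; omit to use chart defaults
```

The `--values` flag is where the merge described above happens: chart defaults plus your overrides produce the final rendered manifests.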

Ansible

Ansible is an automation tool used worldwide. It helps IT operations teams automate technical configuration and deployment over any infrastructure: on-premise, cloud, hybrid cloud, edge, etc. By using a playbook, a YAML file describing all the tasks to run, operators can control, deploy, and check lots of devices, services, VMs, and cloud infrastructure. Ansible separates execution from configuration, which means you can smoothly reuse any playbook with another environment configuration.
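The execution/configuration split can be sketched as follows; the `inventories/prod` and `vars/prod.yaml` paths are hypothetical, not files shipped with this project:

```shell
# The same playbook can be reused against another environment:
# pass a different inventory and variable file instead of editing the playbook.
ansible-playbook dapr-system.yaml -i inventories/prod -e @vars/prod.yaml

# Preview what would change without applying anything.
ansible-playbook dapr-system.yaml --check --diff
```

`--check` (dry-run) combined with `--diff` is a safe way to validate a playbook against a new environment before a real run.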

Let’s run

To use this project you need at least :

  • Openshift 4.10 / Kubernetes 1.23

  • Ansible 2.13+

  • Helm 3.9+

  • A user with the cluster-admin role
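A quick sanity check of these prerequisites before running any playbook might look like this (`oc` applies only on OpenShift):

```shell
# Verify tool versions against the minimums listed above.
kubectl version --short     # expect Kubernetes 1.23+ (server side)
ansible --version           # expect Ansible 2.13+
helm version --short        # expect Helm 3.9+
oc whoami                   # OpenShift only: confirm which user you are logged in as
```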

Install Dapr System with Helm

What the playbook does :

  1. Create dapr-system NS

  2. Install Helm Chart Repo (DAPR and REDIS)

  3. Install DAPR Helm

  4. Install DAPR-Addon Helm

    1. Route to DAPR Dashboard

    2. Zipkin Configuration

  5. Install Redis Helm
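Once the playbook has finished, you can verify the result with standard kubectl/helm commands; exact pod names vary by Dapr version, so this is a sketch rather than expected output:

```shell
# The Dapr control plane and Redis should be Running in dapr-system.
kubectl --namespace dapr-system get pods
kubectl --namespace dapr-system get svc

# The Helm releases installed by the playbook should be listed here.
helm list --namespace dapr-system
```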

To run this playbook you need :

Clone this repository

git clone --recurse-submodules [email protected]:gautric/dapr-over-k8s.git

Execute the playbook

ansible-playbook dapr-system.yaml

Check Dapr & Zipkin Dashboard deployment

  • For Dapr Dashboard (Mac users)

    open -a Firefox   http://`kubectl --namespace dapr-system get Route -l name=dapr-dashboard --no-headers -o custom-columns=":spec.host" `

You should get this kind of output in your browser

dapr dashboard
  • For Zipkin Dashboard (Mac users)

    open -a Firefox   http://`kubectl --namespace dapr-system get Route -l name=dapr-zipkin --no-headers -o custom-columns=":spec.host" `

You should get this kind of output in your browser

dapr zipkin

Install Dapr Sample with Helm

What the playbook does :

  1. Create dapr-sample NS

  2. Copy/Paste redis password for Dapr Component[2]

  3. Deploy Node App

  4. Deploy Python App
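For step 2, the Redis password can also be retrieved manually. Assuming the Redis Helm chart follows the Bitnami convention of storing the password in a `redis` secret under the `redis-password` key (an assumption to verify against your chart), the command would be:

```shell
# Retrieve and decode the Redis password the Dapr state-store component needs.
# Secret name and key follow the Bitnami chart convention (assumed here).
kubectl --namespace dapr-system get secret redis \
  -o jsonpath="{.data.redis-password}" | base64 --decode
```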

Execute the playbook

ansible-playbook dapr-sample.yaml

Check client / server application with log

kubectl --namespace dapr-sample  logs `kubectl --namespace dapr-sample get pods -l app=node --no-headers -o custom-columns=":metadata.name"` node

and

kubectl --namespace dapr-sample  logs `kubectl --namespace dapr-sample get pods -l app=python --no-headers -o custom-columns=":metadata.name"` python

Install Dapr Service Invocation with Helm

What the playbook does :

  1. Create dapr-service-invocation NS

  2. Deploy Service Invocation with Helm chart

    1. Checkout App (build, deploy, imagestream)

    2. Order-Process App (build, deploy, imagestream)

    3. Dapr Configuration for ServiceInvocation

Execute the playbook

ansible-playbook dapr-service-invocation.yaml

Check client / server application with log

kubectl --namespace dapr-service-invocation  logs `kubectl --namespace dapr-service-invocation get pods -l app=checkout --no-headers -o custom-columns=":metadata.name"` checkout -f

and

kubectl --namespace dapr-service-invocation  logs `kubectl --namespace dapr-service-invocation get pods -l app=order-processor --no-headers -o custom-columns=":metadata.name"` order-processor -f

Install Dapr Pub Sub with Helm

What the playbook does :

  1. Create dapr-pub-sub NS

  2. Deploy Pub Sub with Helm chart

    1. Checkout App (build, deploy, imagestream)

    2. Order-Process App (build, deploy, imagestream)

Execute the playbook

ansible-playbook dapr-pub-sub.yaml

Check client / server application with log

kubectl --namespace dapr-pub-sub  logs `kubectl --namespace dapr-pub-sub get pods -l app=checkout --no-headers -o custom-columns=":metadata.name"` checkout -f

and

kubectl --namespace dapr-pub-sub  logs `kubectl --namespace dapr-pub-sub get pods -l app=order-processor --no-headers -o custom-columns=":metadata.name"` order-processor -f

Install Dapr Pub Sub and Redis with Helm

What the playbook does :

  1. Create dapr-pubsub-config NS

  2. Deploy Pub Sub with Helm chart

    1. Pub-App

    2. Sub-App

    3. Kafka cluster and Topic creation

    4. Dapr Configuration for PubSub over Redis

Execute the playbook

ansible-playbook dapr-pub-sub-config.yaml

Check client / server application with log

kubectl --namespace dapr-pubsub-config  logs `kubectl --namespace dapr-pubsub-config get pods -l app=pub-app --no-headers -o custom-columns=":metadata.name"` pub-app -f

and

kubectl --namespace dapr-pubsub-config  logs `kubectl --namespace dapr-pubsub-config get pods -l app=sub-app --no-headers -o custom-columns=":metadata.name"` sub-app -f

Install Dapr Pub Sub and Kafka with Helm

What the playbook does :

  1. Create dapr-pubsub-kafka and dapr-kafka NS

  2. Deploy Pub Sub with Helm chart

    1. Pub-App

    2. Sub-App

    3. Kafka cluster and Topic creation

    4. Dapr Configuration for PubSub over Kafka

    5. Kafka UI

Execute the playbook

ansible-playbook dapr-pub-sub-kafka.yaml

Post some events

curl -X POST http://`kubectl --namespace dapr-pubsub-kafka get Route pub-app  --no-headers -o custom-columns=":spec.host"`/publish -H Content-Type:application/json --data @samples/pub-sub-config/messages/gadget.json

or

curl -X POST http://`kubectl --namespace dapr-pubsub-kafka get Route pub-app  --no-headers -o custom-columns=":spec.host"`/publish -H Content-Type:application/json --data @samples/pub-sub-config/messages/widget.json

or

curl -X POST http://`kubectl --namespace dapr-pubsub-kafka get Route pub-app  --no-headers -o custom-columns=":spec.host"`/publish -H Content-Type:application/json --data @samples/pub-sub-config/messages/thingamajig.json

Check client / server application with log

kubectl --namespace dapr-pubsub-kafka  logs `kubectl --namespace dapr-pubsub-kafka get pods -l app=pub-app --no-headers -o custom-columns=":metadata.name"` pub-app -f

and

kubectl --namespace dapr-pubsub-kafka  logs `kubectl --namespace dapr-pubsub-kafka get pods -l app=sub-app --no-headers -o custom-columns=":metadata.name"` sub-app -f

Retrieve messages in the Kafka UI console

  • For Kafka-UI Dashboard (Mac users)

    open -a Firefox "http://`kubectl --namespace dapr-pubsub-kafka get Route kafka-ui  --no-headers -o custom-columns=":spec.host"`/ui/clusters/pubsub-dapr/topics/pubsub-dapr-topic/messages?filterQueryType=STRING_CONTAINS&attempt=0&limit=100&seekDirection=FORWARD&seekType=OFFSET&seekTo=0::0"

You should get this kind of output :

Kafka UI Dashboard
Figure 1. Kafka UI Dashboard

Install Dapr Pub Sub and AMQStream / Prometheus / Grafana with Helm

What the playbook does :

  1. Create kafka-metrics NS

  2. Deploy Pub Sub with Helm chart

    1. Pub-App

    2. Sub-App

    3. Kafka cluster and Topic creation

    4. Dapr Configuration for PubSub over Kafka

    5. Prometheus and Grafana integration

    6. Kafka UI

Note
Don’t forget to change some values inside values.yaml

Execute the playbook

ansible-playbook dapr-pub-sub-amqstream.yaml

Post some events

curl -X POST http://`kubectl --namespace kafka-metrics get Route pub-app  --no-headers -o custom-columns=":spec.host"`/publish -H Content-Type:application/json --data @samples/pub-sub-config/messages/gadget.json

or

curl -X POST http://`kubectl --namespace kafka-metrics get Route pub-app  --no-headers -o custom-columns=":spec.host"`/publish -H Content-Type:application/json --data @samples/pub-sub-config/messages/widget.json

or

curl -X POST http://`kubectl --namespace kafka-metrics get Route pub-app  --no-headers -o custom-columns=":spec.host"`/publish -H Content-Type:application/json --data @samples/pub-sub-config/messages/thingamajig.json

Check client / server application with log

kubectl --namespace kafka-metrics  logs `kubectl --namespace kafka-metrics get pods -l app=pub-app --no-headers -o custom-columns=":metadata.name"` pub-app -f

and

kubectl --namespace kafka-metrics  logs `kubectl --namespace kafka-metrics get pods -l app=sub-app --no-headers -o custom-columns=":metadata.name"` sub-app -f

Retrieve messages in the Kafka UI console

  • For Kafka-UI Dashboard (Mac users)

    open -a Firefox "http://`kubectl --namespace kafka-metrics get Route kafka-ui  --no-headers -o custom-columns=":spec.host"`/ui/clusters/pubsub-dapr/topics/pubsub-dapr-topic/messages?filterQueryType=STRING_CONTAINS&attempt=0&limit=100&seekDirection=FORWARD&seekType=OFFSET&seekTo=0::0"

You should get this kind of output :

Kafka UI Dashboard
Figure 2. Kafka UI Dashboard

View message metrics in the Grafana console

  • For Grafana Dashboard (Mac users)

    open -a Firefox https://`kubectl --namespace kafka-metrics get Route grafana  --no-headers -o custom-columns=":spec.host"`

As usual, the default credentials for the Grafana UI are admin / admin

You should get this kind of output :

General Dashboard
Figure 3. General Dashboard
Messages I/O
Figure 4. Messages I/O

1. A local installation is also possible: https://docs.dapr.io/operations/hosting/self-hosted/
2. to check
