KEDA jobs with Azure Storage Queues

This sample shows how to use KEDA to automatically schedule Kubernetes Jobs based on an Azure Storage Queue trigger.

Create an Azure Storage Queue

Using the Azure CLI, create a Resource Group, a Storage Account, and a Queue:

STORAGE_ACCOUNT_NAME=storageaccountname
export QUEUE_NAME=keda-queue
az group create -l westus -n hello-keda
az storage account create -g hello-keda -n $STORAGE_ACCOUNT_NAME
export AZURE_STORAGE_CONNECTION_STRING=$(az storage account show-connection-string --name $STORAGE_ACCOUNT_NAME --query connectionString -o tsv)
az storage queue create -n $QUEUE_NAME

Storage account names must be globally unique, so choose your own value for the STORAGE_ACCOUNT_NAME variable.

Build and push the queue consumer container image

The queue-consumer directory contains a simple Python script that consumes a single message from an Azure Storage Queue and sleeps for 30 seconds, simulating a very simple job.

The script requires two environment variables (see the sketch after this list):

  • AzureWebJobsStorage: the Azure Storage connection string you obtained in the previous step.
  • QUEUE_NAME: the name of the queue to read from.
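
The script itself lives in the queue-consumer directory; purely as an illustrative sketch, a minimal consumer along these lines could be written with the azure-storage-queue v12 SDK (the SDK version and exact calls below are assumptions, not the repository's code):

import os
import time

from azure.storage.queue import QueueClient

# Both values are injected into the Job's pod as environment variables.
connection_string = os.environ["AzureWebJobsStorage"]
queue_name = os.environ["QUEUE_NAME"]

queue_client = QueueClient.from_connection_string(connection_string, queue_name)

# Take one message off the queue, "process" it for 30 seconds, then delete it.
for message in queue_client.receive_messages(messages_per_page=1, visibility_timeout=60):
    print(f"Processing message: {message.content}")
    time.sleep(30)  # simulate a very simple 30-second job
    queue_client.delete_message(message)
    break  # one message per Job run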

For KEDA to schedule the jobs, the container image must be pushed to a container registry. For example, to use Docker Hub:

export REGISTRY=tomconte
cd queue-consumer/
docker build -t queue-consumer .
docker tag queue-consumer $REGISTRY/queue-consumer
docker push $REGISTRY/queue-consumer

Replace the value of REGISTRY with your own Docker Hub profile name.

Install KEDA

Follow the instructions in the KEDA documentation to deploy KEDA on your Kubernetes cluster.

Create the KEDA ScaledObject

The azurequeue_scaledobject_jobs.yaml configuration defines the trigger and the specification of the job to run. You will need to check a few values:

  • Set the container image name
  • Check that the queue names are correct

You will also need to create a secret to store the Azure Storage connection string:

kubectl create secret generic secrets --from-literal=AzureWebJobsStorage=$AZURE_STORAGE_CONNECTION_STRING

You can then create the ScaledObject:

kubectl apply -f azurequeue_scaledobject_jobs.yaml

Nothing will happen right away since our queue is empty!

Send some messages to the queue

The send_messages.py script sends a batch of messages to the queue. First, install its dependencies:

pip install -r requirements.txt

You will also need to configure the AzureWebJobsStorage environment variable (assuming you defined QUEUE_NAME above):

export AzureWebJobsStorage=$AZURE_STORAGE_CONNECTION_STRING

For example, send a hundred messages to the queue:

python send_messages.py 100
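
The repository's send_messages.py is the reference implementation; as a rough sketch only, a minimal sender built on the azure-storage-queue v12 SDK (again an assumption, not the actual script) might look like this:

import os
import sys

from azure.storage.queue import QueueClient

# Uses the same environment variables configured above.
connection_string = os.environ["AzureWebJobsStorage"]
queue_name = os.environ["QUEUE_NAME"]

queue_client = QueueClient.from_connection_string(connection_string, queue_name)

# The number of messages to send is passed on the command line, e.g. 100.
count = int(sys.argv[1])
for i in range(count):
    queue_client.send_message(f"message {i}")
print(f"Sent {count} messages to queue '{queue_name}'")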

Watch KEDA at work

Now you can watch jobs being automatically scheduled until the queue has been drained:

kubectl get jobs
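
To confirm that the queue is actually draining, you can also poll its approximate message count; a quick sketch, again assuming the azure-storage-queue v12 SDK:

import os

from azure.storage.queue import QueueClient

queue_client = QueueClient.from_connection_string(
    os.environ["AzureWebJobsStorage"], os.environ["QUEUE_NAME"]
)

# The count is an estimate maintained by the storage service, not an exact figure.
properties = queue_client.get_queue_properties()
print(f"Approximately {properties.approximate_message_count} messages remaining")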

If anything goes wrong, it can be useful to check the logs of the KEDA operator for error messages:

KOP=$(kubectl get pods -n keda -l app=keda-operator -o name)
kubectl logs $KOP keda-operator -n keda
