
azure-edge-extensions-aio-datahistorian's Introduction

Azure Edge Extensions AIO Data Historian

InfluxDB-based Data Historian, deployed and managed by Azure IoT Operations.

Features

This project framework provides the following features:

  • InfluxDB deployment and configuration.
  • Secret generation with Azure Key Vault and access using Azure Key Vault Secrets Store CSI Driver.
  • Telegraf pipelines for transforming data and writing it into InfluxDB.
  • Azure IoT Operations Orchestrator for deploying and configuring the Data Historian in the cluster.
  • Azure IoT Operations Data Processor for managing the data that's captured in the Data Historian.

Getting Started

Prerequisites

  • (Optional, for Windows) WSL installed and set up.
  • Azure CLI available on the command line where this will be deployed.
  • Terraform available on the command line where this will be deployed.
  • Cluster with Azure IoT Operations deployed -> This project assumes azure-edge-extensions-aio-iac-terraform was used, however, any cluster with AIO deployed will work.
  • Owner access to a Resource Group with an existing cluster configured and connected to Azure Arc.
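The prerequisites above can be checked quickly from the shell where you plan to deploy. The commands below are standard Azure CLI and Terraform checks; the resource group and cluster names are placeholders, not values defined by this repo:

```shell
# Confirm the Azure CLI and Terraform are on the PATH.
az version
terraform -version

# Confirm you are logged in and pointed at the right subscription.
az account show --query "{name:name, id:id}" -o table

# Confirm the Arc-connected cluster exists (names are placeholders;
# requires the connectedk8s Azure CLI extension).
az connectedk8s show -g <resource-group> -n <cluster-name> -o table
```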

Quickstart

  1. Log in to the Azure CLI:
    az login --tenant <tenant>.onmicrosoft.com
    • Make sure your subscription is the one that you would like to use: az account show.
    • Change to the subscription that you would like to use if needed:
      az account set -s <subscription-id>
  2. Add a <unique-name>.auto.tfvars file to the root of the deploy directory that contains the following:
    // <project-root>/deploy/<unique-name>.auto.tfvars
    
    name = "<unique-name>"
    location = "<location>"
  3. From the deploy directory execute the following:
    terraform init
    terraform apply

Usage

After the Terraform in this project has been applied, you should be able to connect to your cluster using the az connectedk8s proxy command:

az connectedk8s proxy -g rg-<unique-name> -n arc-<unique-name>

If you have a simulator set up from azure-edge-extensions-aio-iac-terraform or from the Azure IoT Operations install, then there should be data flowing into the topics in the Azure IoT Operations MQ broker. You can validate that data is flowing by using kubectl exec to open a shell in a pod in your cluster that has mqttui set up. If you deployed the azure-edge-extensions-aio-iac-terraform repo, then you should already have this pod. To exec into this pod, run the following command:

kubectl exec -it -n aio deployments/mqtt-client -c mqtt-client -- sh

Once inside the pod you can run the following command to open mqttui and subscribe to all of the topics:

mqttui -b mqtts://aio-mq-dmqtt-frontend:8883 -u '$sat' --password $(cat /var/run/secrets/tokens/mq-sat) --insecure

NOTE: This method was adapted from Azure IoT Operations Documentation - Verify data is flowing and can be referenced if needed.

Connect to InfluxDB

If data is flowing through the topics of your broker, then InfluxDB should be ingesting it. Connect to your InfluxDB instance locally by first configuring a kubectl port-forward:

kubectl port-forward service/influxdb-influxdb2 8086:80 -n aio

Then open a web browser and navigate to http://localhost:8086. Next, get your admin username and password either by referring to the Azure Key Vault secret that was added when the Terraform was applied, or, if you configured the username and password directly, by using those values instead.
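If the credentials live in Key Vault, they can be read with the Azure CLI. The vault and secret names below are placeholders, since the actual names depend on how the Terraform in this repo provisioned them:

```shell
# List secrets to find the InfluxDB admin credentials (names are placeholders).
az keyvault secret list --vault-name <vault-name> -o table

# Show the secret's value once you know its name.
az keyvault secret show --vault-name <vault-name> --name <secret-name> \
  --query value -o tsv
```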

Navigate to the datahistorian default bucket and run a query to see the measurements that have been added.
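A minimal query to list what has landed in the bucket can be run from the Data Explorer in the InfluxDB UI, or with the influx CLI if it is installed locally. This sketch assumes the default datahistorian bucket name; the org and token are placeholders for your own values:

```shell
# List the distinct measurements written to the "datahistorian" bucket,
# using the Flux schema package (org and token are placeholders).
influx query --org <org> --token <token> '
  import "influxdata/influxdb/schema"
  schema.measurements(bucket: "datahistorian")
'
```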


azure-edge-extensions-aio-datahistorian's Issues

Can the AIO data processor be made configurable in variables.tf?

Please provide us with the following information:

This issue is for a: (mark with an x)

- [ ] bug report -> please search issues before submitting
- [x] feature request
- [ ] documentation issue or request
- [ ] regression (a behavior that used to work and stopped in a new release)

Minimal steps to reproduce

I have my own Linux k3s cluster with AIO installed, and I am using this repo to add a local historian.
When I configure variables in <custom_name>.auto.tfvars by referring to the configurable variables in variables.tf, I have the following questions/thoughts:

  1. When executing "terraform apply", it reports an error that the data processor resource is not found. It seems that the code automatically assigns the name "dp-<var.name>" to the data processor in use; however, in my own cluster the data processor is installed with a different name.
    I see that the data processor name is not exposed as a configurable variable in variables.tf. Should we add this feature?
    (screenshot)
  2. A suggestion: should we extend the required variables in .auto.tfvars beyond name and location? I suppose most users of this repo will have a cluster with their own custom naming. In this case, the must-have variables could be:
    (screenshot)
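A hypothetical sketch of what the requested change could look like in variables.tf. The variable names and defaults here are illustrative only, not the repo's actual ones:

```hcl
# Hypothetical additions to variables.tf (names are illustrative).

variable "dp_instance_name" {
  description = "Name of the existing AIO Data Processor instance. When null, fall back to the current dp-<name> convention."
  type        = string
  default     = null
}

variable "resource_group_name" {
  description = "Existing resource group holding the Arc-connected cluster. When null, fall back to rg-<name>."
  type        = string
  default     = null
}

variable "cluster_name" {
  description = "Name of the existing Arc-connected cluster. When null, fall back to arc-<name>."
  type        = string
  default     = null
}
```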

Any log messages given by the failure

Expected/desired behavior

OS and Version?

Windows 7, 8 or 10. Linux (which distribution). macOS (Yosemite? El Capitan? Sierra?)

Versions

Mention any other details that might be useful


Thanks! We'll be in touch soon.
