
kafka-helm-charts's Introduction


Helm Charts for Lenses, Lenses SQL Runners, Apache Kafka Connect, and other components

This repo contains Helm Charts for Apache Kafka components.

Add the repo:

helm repo add lensesio https://lensesio.github.io/kafka-helm-charts/
helm repo update

Stream Reactor

For the Stream Reactor and Kafka Connect charts, any environment variable beginning with CONNECT is used to build the Kafka Connect properties file, and the Connect cluster is started with this file in distributed mode. Any environment variable starting with CONNECTOR is used to build the connector properties file, which is posted into the Connect cluster to start the connector.
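As a rough illustration (a sketch only, not the charts' actual entrypoint scripts), the translation strips the prefix, lowercases the name, and replaces underscores with dots:

# Illustrative sketch: turn CONNECT_* variables into a properties file.
# e.g. CONNECT_GROUP_ID=my-cluster becomes group.id=my-cluster
env | while IFS='=' read -r key value; do
  case "$key" in
    CONNECT_*)
      prop=$(printf '%s' "${key#CONNECT_}" | tr '[:upper:]' '[:lower:]' | tr '_' '.')
      printf '%s=%s\n' "$prop" "$value"
      ;;
  esac
done > connect.properties

CONNECTOR_* variables would be handled the same way to produce the connector properties file.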

Lenses

Documentation for the Lenses chart can be found here.

Lenses SQL runners

Documentation for the Lenses SQL Runner chart can be found here.

SSL/SASL

The connectors support SSL and SASL on Kafka. For this you need to provide the base64-encoded contents of the keystore and truststore.

The key/truststores are added to a secret and mounted into /mnt/secrets.

For SASL, you need to provide the base64-encoded keytab file contents. Note that the keytab path in the jaas.conf must be set to /mnt/secrets.
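For illustration, a minimal jaas.conf entry pointing at the mounted keytab might look like the following sketch; the keytab file name and principal are placeholders:

# Sketch of a jaas.conf referencing the keytab mounted from the secret.
cat <<'EOF' > jaas.conf
KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="/mnt/secrets/kafka-client.keytab"
  principal="kafka-client@EXAMPLE.COM";
};
EOF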

If the connector requires SSL (for example, Cassandra), provide the base64-encoded contents for the key/truststores. For connectors supporting TLS, the certs will be mounted via secrets into /mnt/connector-secrets, and any connector config parameters are set automatically.

For base64 encoding

Make sure the encoded output does not contain line breaks:

openssl base64 < client.keystore.jks | tr -d '\n'
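For example, to encode both stores and pass them as chart values; the chart name and value keys below are hypothetical, so check the chart's values.yaml for the real names:

KEYSTORE=$(openssl base64 < client.keystore.jks | tr -d '\n')
TRUSTSTORE=$(openssl base64 < client.truststore.jks | tr -d '\n')
# hypothetical chart name and value keys; see the chart's values.yaml
helm install lensesio/<chart> --set ssl.keyStore="$KEYSTORE" --set ssl.trustStore="$TRUSTSTORE"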

Building/Testing

Run package.sh; this in turn calls scripts/lint.sh, which performs linting checks on the charts and also checks that we aren't going to overwrite existing charts.

If all is good, check in, tag, and push the docs folder. The charts are hosted on the GitHub page.

Contribute

Contributions are welcome for any Kafka Connector or any other component that is useful for building data streaming pipelines.

Signing

For integrity we sign the Helm Charts. For more information, see https://helm.sh/docs/topics/provenance/.

The steps to do so are as follows:

  • Create a GPG key, noting the name of the key, which is later used to sign the charts:
  gpg --full-generate-key
  • Then export and encrypt the key:
  gpg --export-secret-keys > key.gpg
  openssl aes-256-cbc -k "HELM_KEY_PASSPHRASE" -in key.gpg -out key.gpg.enc
  • Commit key.gpg.enc to Git.

The package.sh then needs to be updated with the key name, and an environment variable HELM_KEY_PASSPHRASE created in the Travis build settings with the passphrase used to encrypt the key.
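In the build, the key then has to be decrypted and imported before packaging. A sketch of that step, assuming a Helm 2-style secret keyring; the key name and chart path are placeholders:

# Decrypt and import the signing key, then package and sign a chart.
openssl aes-256-cbc -d -k "$HELM_KEY_PASSPHRASE" -in key.gpg.enc -out key.gpg
gpg --import key.gpg
helm package --sign --key 'KEY_NAME' --keyring ~/.gnupg/secring.gpg charts/<chart>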

kafka-helm-charts's People

Contributors

andmarios, andrewstevenson, antwnis, awsbot-labs, crystalmethod, deanlinaras, deiga, fabriziofortino, georgeyord, ginal, kujon, maknihamdi, panosjee, richard-mathie, sdlyu, spartakos87, spirosoik, stephansmit, stheppi, tanenbaum, ulfox, vcandeau, wrijvordt


kafka-helm-charts's Issues

[charts/kafka-connect-mongo-sync] Missing fields in values.yaml

The following configuration variables appear in the chart templates but are not documented and do not appear in values.yaml:

avroSchemas
bootstrapServers
clusterName
logLevel
passwordKey
restPort
secretsRef
schemaRegistryURL

In addition, ssl.persistentVolume is misnamed as ssl.persistentVolumes.

Fully documenting these variables would greatly speed up the process of understanding and deploying the Helm template.

secret.yml has no "apiVersion" key?

Running helm install fails if apiVersion is not set.

Kubectl apply failed. Error: error: error validating "/tmp/app-template891208987": error validating data: apiVersion not set; if you choose to ignore these errors, turn validation off with --validate=false : exit status 1
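For reference, Kubernetes Secrets use apiVersion: v1. A minimal sketch of a valid manifest; the name and data are placeholders:

# Minimal valid Secret manifest; metadata.name and data are placeholders.
cat <<'EOF' > secret.yml
apiVersion: v1
kind: Secret
metadata:
  name: example-secret
type: Opaque
data: {}
EOF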

helm install lenses fails

I am running an AKS (Azure) cluster.

Kubernetes Version

Client Version: version.Info{Major:"1", Minor:"10", GitVersion:"v1.10.3", GitCommit:"2bba0127d85d5a46ab4b778548be28623b32d0b0", GitTreeState:"clean", BuildDate:"2018-05-21T09:17:39Z", GoVersion:"go1.9.3", Compiler:"gc", Platform:"windows/amd64"}
Server Version: version.Info{Major:"1", Minor:"11", GitVersion:"v1.11.3", GitCommit:"a4529464e4629c21224b3d52edfe0ea91b072862", GitTreeState:"clean", BuildDate:"2018-09-09T17:53:03Z", GoVersion:"go1.10.3", Compiler:"gc", Platform:"linux/amd64"}

Helm version

Client: &version.Version{SemVer:"v2.11.0", GitCommit:"2e55dbe1fdb5fdb96b75ff144a339489417b146b", GitTreeState:"clean"}
Server: &version.Version{SemVer:"v2.11.0", GitCommit:"2e55dbe1fdb5fdb96b75ff144a339489417b146b", GitTreeState:"clean"}

I have run

helm repo add landoop https://landoop.github.io/kafka-helm-charts/

and

helm repo update.

This succeeds.

But when I run

helm install lenses --name lenses --namespace lenses

I get the following error:

Error: release lenses failed: the server could not find the requested resource

I have cloned the git repo locally and run the charts from my machine. Still the same result.

Could you please give me some pointers as to why this fails?

Elastic6 sink broken chart

Helm chart for elastic6-sink seems to be broken.

There's a duplicated clusterName value in the values.yaml file: first it says it is the name of the consumer group, and then it says it's the name of the Elasticsearch cluster.

In addition, the image datamountaineer/kafka-connect-elastic6:1.2.0, which is the default value, does not exist.

lenses chart: Improve Ingress TLS settings: cert-manager support / secret name

The secret name for the ingress TLS configuration is derived from the release name; this should really be configurable, e.g. "lenses-tls-certificate" or whatever naming convention you follow.

The tls section should also contain the hostname so that cert-manager, if installed, can automatically provision Let's Encrypt certificates.

I will provide a PR...

Failed to find any class that implements Connector

I tried using the InfluxDB sink connector as well as the Elastic sink connector in the Helm charts, and I see the below error:

Error: the Kafka Connect API returned: Failed to find any class that implements Connector and which name matches com.datamountaineer.streamreactor.connect.elastic5.ElasticSinkConnector, available connectors are: PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSinkConnector, name='org.apache.kafka.connect.file.FileStreamSinkConnector', version='1.0.0-cp1', encodedVersion=1.0.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSourceConnector, name='org.apache.kafka.connect.file.FileStreamSourceConnector', version='1.0.0-cp1', encodedVersion=1.0.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockConnector, name='org.apache.kafka.connect.tools.MockConnector', version='1.0.0-cp1', encodedVersion=1.0.0-cp1, type=connector, typeName='connector', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSinkConnector, name='org.apache.kafka.connect.tools.MockSinkConnector', version='1.0.0-cp1', encodedVersion=1.0.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSourceConnector, name='org.apache.kafka.connect.tools.MockSourceConnector', version='1.0.0-cp1', encodedVersion=1.0.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.SchemaSourceConnector, name='org.apache.kafka.connect.tools.SchemaSourceConnector', version='1.0.0-cp1', encodedVersion=1.0.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSinkConnector, name='org.apache.kafka.connect.tools.VerifiableSinkConnector', version='1.0.0-cp1', encodedVersion=1.0.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSourceConnector, name='org.apache.kafka.connect.tools.VerifiableSourceConnector', version='1.0.0-cp1', encodedVersion=1.0.0-cp1, type=source, typeName='source', location='classpath'} (500)
done.

This is my config:

KAFKA_HEAP_OPTS: -Xmx256M
CONNECT_LOG4J_ROOT_LOGLEVEL: INFO
CONNECT_GROUP_ID: elastic
CONNECT_BOOTSTRAP_SERVERS: kafka:9092
CONNECT_REST_PORT: 8083
CONNECT_CONFIG_STORAGE_TOPIC: connect-elastic-configs
CONNECT_OFFSET_STORAGE_TOPIC: connect-elastic-offsets
CONNECT_STATUS_STORAGE_TOPIC: connect-elastic-statuses
CONNECT_KEY_CONVERTER: org.apache.kafka.connect.storage.StringConverter
CONNECT_VALUE_CONVERTER: org.apache.kafka.connect.json.JsonConverter
CONNECT_KEY_CONVERTER_SCHEMA_REGISTRY_URL:
CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_URL:
CONNECT_INTERNAL_KEY_CONVERTER: org.apache.kafka.connect.json.JsonConverter
CONNECT_INTERNAL_VALUE_CONVERTER: org.apache.kafka.connect.json.JsonConverter
CONNECT_REST_ADVERTISED_HOST_NAME:
CONNECT_PLUGIN_PATH: /etc/landoop/jars
CONNECTOR_NAME: elastic-sink-0
CONNECTOR_GROUP_ID: elastic-sink-0
CONNECTOR_CONNECTOR_CLASS: com.datamountaineer.streamreactor.connect.elastic5.ElasticSinkConnector
CONNECTOR_TASKS_MAX: 1
CONNECTOR_CONNECT_ELASTIC_CLUSTER_NAME: elastic
CONNECTOR_CONNECT_ELASTIC_MAX_RETRIES: 20
CONNECTOR_CONNECT_ELASTIC_WRITE_TIMEOUT: 300000
CONNECTOR_CONNECT_ELASTIC_ERROR_POLICY: THROW
CONNECTOR_CONNECT_ELASTIC_XPACK_PLUGINS:
CONNECTOR_CONNECT_ELASTIC_URL_PREFIX: elasticsearch
CONNECTOR_CONNECT_PROGRESS_ENABLED: true
CONNECTOR_CONNECT_ELASTIC_XPACK_SETTINGS:
CONNECTOR_CONNECT_ELASTIC_URL: localhost:9300
CONNECTOR_CONNECT_ELASTIC_KCQL: SELECT * FROM data
CONNECTOR_CONNECT_ELASTIC_BATCH_SIZE: 4000
CONNECTOR_CONNECT_ELASTIC_USE_HTTP: tcp
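The available-connectors list in the error suggests the Stream Reactor jars were not on the worker's plugin path (only the built-in FileStream and Mock connectors were loaded). One way to confirm which connector classes the worker actually loaded is the Connect REST API; the host and port here are assumed:

# Diagnostic: list the plugins the Connect worker can see.
curl -s http://localhost:8083/connector-plugins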

Why Statefulset?

Didn't find a better place to ask, but I just want to understand why the Landoop connectors are deployed as a StatefulSet and not as a normal Deployment if they are stateless. I have checked the Confluent Inc Helm charts as well, and they instead define it as a simple Deployment.

I went through the Landoop documentation and just didn't find the reason; it only says to deploy as a StatefulSet and have a headless service.

Thanks

Can not read the license file:/data/license.json

Hi! I'm trying to install the Lenses Helm chart with the command:

helm install lenses --name lenses --namespace lenses -f lenses/config.yml

lenses/config.yml contains such block:

lenses:
  license: |
    This
    is
    a
    license

All seems to be OK, but after the "helm install" command I see these errors in the pod's logs:

2018-07-10 11:23:16,869 ERROR [c.l.k.l.Main$:156] Can not read the license file:/data/license.json
2018-07-10 11:23:16,870 ERROR [c.l.k.l.Main$:157] Lenses will shutdown.

So what's wrong with my license?
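One thing worth checking (an assumption, not a confirmed fix): the chart expects the raw license JSON, so placeholder text or stray YAML indentation under lenses.license would produce an unreadable /data/license.json. With Helm 2.10+, the file contents can be passed in directly, assuming the chart reads the license from the lenses.license value:

# Hypothetical alternative: inject the license file contents directly.
helm install lenses --name lenses --namespace lenses --set-file lenses.license=license.json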

Cannot login to fresh installation

Hi! I just deployed Lenses with Helm. I didn't change the default username/password, so they should be admin/admin, as described in the Lenses documentation. But when I try to log in with admin/admin I see the error "Could not login". If I go to the browser developer console, I see these messages:

Failed at method [POST] [lenses/api/login] with error:
{"data":{"success":false,"token":null,"user":null,"schemaRegistryDelete":true},"status":401,"config":{"method":"POST","transformRequest":[null],"transformResponse":[null],"jsonpCallbackParam":"callback","url":"/api/login","data":{"user":"admin","password":"*******"},"dataType":"json","headers":{"Content-Type":"application/json","Accept":"application/json, text/plain"}},"statusText":"Unauthorized"}

vendor.689882bfec3849ba2cda.bundle.js:1 Failed at method [GET] [lenses/api/user/profile] with error:
{"data":"CredentialsMissing","status":401,"config":{"method":"GET","transformRequest":[null],"transformResponse":[null],"jsonpCallbackParam":"callback","url":"/api/user/profile","dataType":"json","headers":{"Accept":"application/json, text/plain"}},"statusText":"Unauthorized"}

So what are the default login credentials for a fresh Lenses installation?
