
Confluent's .NET Client for Apache Kafka™

confluent-kafka-dotnet is Confluent's .NET client for Apache Kafka and the Confluent Platform.

Features:

  • High performance - confluent-kafka-dotnet is a lightweight wrapper around librdkafka, a finely tuned C client.

  • Reliability - There are a lot of details to get right when writing an Apache Kafka client. We get them right in one place (librdkafka) and leverage this work across all of our clients (also confluent-kafka-python and confluent-kafka-go).

  • Supported - Commercial support is offered by Confluent.

  • Future proof - Confluent, founded by the creators of Kafka, is building a streaming platform with Apache Kafka at its core. It's a high priority for us that client features keep pace with core Apache Kafka and components of the Confluent Platform.

confluent-kafka-dotnet is derived from Andreas Heider's rdkafka-dotnet. We're fans of his work and were very happy to have been able to leverage rdkafka-dotnet as the basis of this client. Thanks Andreas!

Referencing

confluent-kafka-dotnet is distributed via NuGet. We provide three packages:

  • Confluent.Kafka [net45, netstandard1.3] - The core client library.
  • Confluent.Kafka.Avro [net452, netstandard2.0] - Provides a serializer and deserializer for working with Avro serialized data with Confluent Schema Registry integration.
  • Confluent.SchemaRegistry [net452, netstandard1.4] - Confluent Schema Registry client (a dependency of Confluent.Kafka.Avro).

To install Confluent.Kafka from within Visual Studio, search for Confluent.Kafka in the NuGet Package Manager UI, or run the following command in the Package Manager Console:

Install-Package Confluent.Kafka -Version 0.11.5

To add a reference to a dotnet core project, execute the following at the command line:

dotnet add package -v 0.11.5 Confluent.Kafka

Development Branch

We have started working towards a 1.0 release of the library, which will occur after we add idempotence and transaction features. In order to best accommodate these and other changes, we will be making breaking changes to the API in that release. You can track our progress on the 1.0-experimental branch (as well as the corresponding packages on nuget.org). We have already added an AdminClient as well as support for message headers and custom timestamps, amongst other things. Note that all work on this branch is subject to change and should not be considered production ready. All feedback is very welcome! You can review the current CHANGELOG here.

Also, NuGet packages corresponding to all release branch commits are available from the following NuGet package source (note: this is not a web URL - you should specify it in the NuGet package manager): https://ci.appveyor.com/nuget/confluent-kafka-dotnet. The version suffix of these packages matches the AppVeyor build number. You can see which commit a particular build number corresponds to by looking at the AppVeyor build history.
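
One way to use this feed (a sketch, not the only option) is to register it as an additional package source in a NuGet.config file next to your solution; the source key names below are arbitrary:

<configuration>
  <packageSources>
    <!-- the default public feed -->
    <add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
    <!-- the AppVeyor CI feed for release branch builds -->
    <add key="confluent-kafka-dotnet-ci" value="https://ci.appveyor.com/nuget/confluent-kafka-dotnet" />
  </packageSources>
</configuration>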

Usage

Take a look in the examples directory for example usage. The integration tests also serve as good examples.

For an overview of configuration properties, refer to the librdkafka documentation.

Basic Producer Example

using System;
using System.Text;
using System.Collections.Generic;
using Confluent.Kafka;
using Confluent.Kafka.Serialization;

public class Program
{
  public static void Main()
  {
    var config = new Dictionary<string, object> 
    { 
        { "bootstrap.servers", "localhost:9092" } 
    };

    // null is passed for the key serializer because the key type is Null.
    using (var producer = new Producer<Null, string>(config, null, new StringSerializer(Encoding.UTF8)))
    {
      // Blocking on .Result waits for the delivery report of this single message.
      var dr = producer.ProduceAsync("my-topic", null, "test message text").Result;
      Console.WriteLine($"Delivered '{dr.Value}' to: {dr.TopicPartitionOffset}");
    }
  }
}
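
Blocking on the delivery report of every message, as .Result does above, limits throughput. A minimal variation (a sketch assuming the 0.11.x API shown above, including the Flush(TimeSpan) overload; the message count and 10-second timeout are arbitrary) is to produce without blocking on each message and flush before disposing the producer. This snippet belongs inside the using block above:

// Produce several messages without waiting for each delivery report.
for (int i = 0; i < 100; ++i)
{
    producer.ProduceAsync("my-topic", null, $"test message {i}");
}

// Block until all outstanding produce requests have completed (or the timeout expires).
producer.Flush(TimeSpan.FromSeconds(10));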

Basic Consumer Example

using System;
using System.Text;
using System.Collections.Generic;
using Confluent.Kafka;
using Confluent.Kafka.Serialization;

public class Program
{
  public static void Main()
  {
    var conf = new Dictionary<string, object> 
    { 
      { "group.id", "test-consumer-group" },
      { "bootstrap.servers", "localhost:9092" },
      { "auto.commit.interval.ms", 5000 },
      { "auto.offset.reset", "earliest" }
    };

    // null is passed for the key deserializer because the key type is Null.
    using (var consumer = new Consumer<Null, string>(conf, null, new StringDeserializer(Encoding.UTF8)))
    {
      // Raised for every message read from a subscribed topic.
      consumer.OnMessage += (_, msg)
        => Console.WriteLine($"Read '{msg.Value}' from: {msg.TopicPartitionOffset}");

      // Raised on critical errors, e.g. connection failures or all brokers down.
      consumer.OnError += (_, error)
        => Console.WriteLine($"Error: {error}");

      // Raised when a consumed message has an associated error, e.g. a deserialization failure.
      consumer.OnConsumeError += (_, msg)
        => Console.WriteLine($"Consume error ({msg.TopicPartitionOffset}): {msg.Error}");

      consumer.Subscribe("my-topic");

      while (true)
      {
        // Poll dispatches queued events (messages, errors) to the handlers registered above.
        consumer.Poll(TimeSpan.FromMilliseconds(100));
      }
    }
  }
}
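
By default, offsets are committed automatically in the background. If you want to commit only after a message has actually been processed, a hedged sketch (assuming the CommitAsync(Message) overload available in the 0.11.x API) is to disable auto commit in the configuration and commit from the message handler. This replaces the OnMessage handler inside the using block above:

// In the configuration dictionary, add:
//   { "enable.auto.commit", false }

consumer.OnMessage += (_, msg) =>
{
    Console.WriteLine($"Read '{msg.Value}' from: {msg.TopicPartitionOffset}");

    // Commit the offset of the message just processed.
    consumer.CommitAsync(msg).Wait();
};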

AvroGen tool

The Avro serializer and deserializer provided by Confluent.Kafka.Avro can be used with the GenericRecord class or with specific classes generated using the avrogen tool, which is available via NuGet (.NET Core 2.1 required):

dotnet tool install -g Confluent.Apache.Avro.AvroGen

Usage:

avrogen -s your_schema.avsc .
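
avrogen reads a standard Avro schema definition. As a purely hypothetical example, a file named your_schema.avsc might contain:

{
  "namespace": "com.example",
  "type": "record",
  "name": "User",
  "fields": [
    { "name": "name", "type": "string" },
    { "name": "favorite_number", "type": "int" }
  ]
}

Running the tool against this schema generates a C# class (User in this example) that can then be used with the Avro serializer and deserializer.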

Confluent Cloud

The Confluent Cloud example demonstrates how to configure the .NET client for use with Confluent Cloud.
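
A hedged sketch of the relevant client configuration (the property names are standard librdkafka settings; the placeholder values and the CA certificate path are assumptions that must be replaced with your own):

var config = new Dictionary<string, object>
{
    { "bootstrap.servers", "<ccloud bootstrap servers>" },
    { "security.protocol", "SASL_SSL" },
    { "sasl.mechanisms", "PLAIN" },
    { "sasl.username", "<ccloud api key>" },
    { "sasl.password", "<ccloud api secret>" },
    // Path to a CA certificate bundle; the location varies by platform.
    { "ssl.ca.location", "/usr/local/etc/openssl/cert.pem" }
};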

Known Issues

The mechanism used by librdkafka to poll simultaneously for both new application and socket events is not supported on Windows. If you are on Windows and experiencing poor latency (which may happen in low throughput scenarios in particular), as a workaround, set socket.blocking.max.ms to 1 to limit the time librdkafka will block waiting for network events to 1ms (the trade-off being higher CPU usage). We will optimize the librdkafka control loop for use on Windows in a future version of the library.
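
For example, the workaround described above is applied alongside the rest of the client configuration:

var config = new Dictionary<string, object>
{
    { "bootstrap.servers", "localhost:9092" },
    // Limit blocking waits for network events to 1 ms on Windows (increases CPU usage).
    { "socket.blocking.max.ms", 1 }
};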

Build

To build the library or any test or example project, run the following from within the relevant project directory:

dotnet restore
dotnet build

To run an example project, run the following from within the example's project directory:

dotnet run <args>

Tests

Unit Tests

From within the test/Confluent.Kafka.UnitTests directory, run:

dotnet test

Integration Tests

From within the Confluent Platform (or Apache Kafka) distribution directory, run the following two commands (in separate terminal windows) to set up a single-broker test Kafka cluster:

./bin/zookeeper-server-start ./etc/kafka/zookeeper.properties

./bin/kafka-server-start ./etc/kafka/server.properties

Now use the bootstrap-topics.sh script in the test/Confluent.Kafka.IntegrationTests directory to set up the prerequisite topics:

./bootstrap-topics.sh <confluent platform path> <zookeeper>

then:

dotnet test

Copyright (c) 2016-2017 Confluent Inc., 2015-2016, Andreas Heider
