
kafka-connect-tools's People

Contributors

56quarters, andrewstevenson, andreybratus, antwnis, johnhofman, jwang47, rollulus, waffle-iron

kafka-connect-tools's Issues

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0

I tried to build this on Azure Pipelines and got the error above, so I ran the build again with --warning-mode all to see which deprecated features were used, and got the messages below:

git clone https://github.com/lensesio/kafka-connect-tools.git
Cloning into 'kafka-connect-tools'...
cd kafka-connect-tools
gradle buildCli --warning-mode all
Welcome to Gradle 6.3!

Here are the highlights of this release:
 - Java 14 support
 - Improved error messages for unexpected failures

For more details see https://docs.gradle.org/6.3/release-notes.html

Starting a Gradle Daemon (subsequent builds will be faster)

> Configure project :
The maven plugin has been deprecated. This is scheduled to be removed in Gradle 7.0. Please use the maven-publish plugin instead. Consult the upgrading guide for further information: https://docs.gradle.org/6.3/userguide/upgrading_version_5.html#legacy_publication_system_is_deprecated_and_replaced_with_the_publish_plugins
 	at build_bp759d9y9oh4pwffs7cy18zv6$_run_closure2.doCall(D:\a\r1\a\kafka-connect-tools\build.gradle:38)
 	(Run with --stacktrace to get the full stack trace of this deprecation warning.)
 The compile configuration has been deprecated for dependency declaration. This will fail with an error in Gradle 7.0. Please use the implementation configuration instead. Consult the upgrading guide for further information: https://docs.gradle.org/6.3/userguide/upgrading_version_5.html#dependencies_should_no_longer_be_declared_using_the_compile_and_runtime_configurations
 	at build_bp759d9y9oh4pwffs7cy18zv6$_run_closure3.doCall(D:\a\r1\a\kafka-connect-tools\build.gradle:54)
 	(Run with --stacktrace to get the full stack trace of this deprecation warning.)
 The testCompile configuration has been deprecated for dependency declaration. This will fail with an error in Gradle 7.0. Please use the testImplementation configuration instead. Consult the upgrading guide for further information: https://docs.gradle.org/6.3/userguide/upgrading_version_5.html#dependencies_should_no_longer_be_declared_using_the_compile_and_runtime_configurations
 	at build_bp759d9y9oh4pwffs7cy18zv6$_run_closure3.doCall(D:\a\r1\a\kafka-connect-tools\build.gradle:61)
 	(Run with --stacktrace to get the full stack trace of this deprecation warning.)
 The testRuntime configuration has been deprecated for dependency declaration. This will fail with an error in Gradle 7.0. Please use the testRuntimeOnly configuration instead. Consult the upgrading guide for further information: https://docs.gradle.org/6.3/userguide/upgrading_version_5.html#dependencies_should_no_longer_be_declared_using_the_compile_and_runtime_configurations
 	at build_bp759d9y9oh4pwffs7cy18zv6$_run_closure3.doCall(D:\a\r1\a\kafka-connect-tools\build.gradle:63)
 	(Run with --stacktrace to get the full stack trace of this deprecation warning.)
 The AbstractArchiveTask.baseName property has been deprecated. This is scheduled to be removed in Gradle 7.0. Please use the archiveBaseName property instead. See https://docs.gradle.org/6.3/dsl/org.gradle.api.tasks.bundling.AbstractArchiveTask.html#org.gradle.api.tasks.bundling.AbstractArchiveTask:baseName for more details.
	at build_bp759d9y9oh4pwffs7cy18zv6$_run_closure5.doCall(D:\a\r1\a\kafka-connect-tools\build.gradle:77)
	(Run with --stacktrace to get the full stack trace of this deprecation warning.)

> Task :clean UP-TO-DATE
> Task :compileJava NO-SOURCE
> Task :compileScala
> Task :processResources NO-SOURCE
> Task :classes
> Task :shadowJar
> Task :fatJar
> Task :buildCli FAILED

FAILURE: Build failed with an exception.
4 actionable tasks: 3 executed, 1 up-to-date

* What went wrong:
Execution failed for task ':buildCli'.
> A problem occurred starting process 'command 'bin/package.sh''

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1m 30s

I'm not a Java developer, so I don't know anything about Maven or Gradle. Am I doing something wrong?

Unable to validate sink properties

Per the documentation, I should be able to validate the sink properties via

connect-cli validate < kudu_sink.properties

However, I get

Error: Please specify the connector-name

Tried

connect-cli validate kudu-sink < kudu_sink.properties

but that throws

java.lang.Exception:  Error: the Kafka Connect API returned status code 400
	at com.datamountaineer.connect.tools.RestKafkaConnectApi.non2xxException(RestKafkaConnectApi.scala:122)
	at com.datamountaineer.connect.tools.RestKafkaConnectApi.com$datamountaineer$connect$tools$RestKafkaConnectApi$$req(RestKafkaConnectApi.scala:139)
	at com.datamountaineer.connect.tools.RestKafkaConnectApi$$anonfun$connectorPluginsValidate$1.apply(RestKafkaConnectApi.scala:270)
	at com.datamountaineer.connect.tools.RestKafkaConnectApi$$anonfun$connectorPluginsValidate$1.apply(RestKafkaConnectApi.scala:270)
	at scala.util.Try$.apply(Try.scala:192)
	at com.datamountaineer.connect.tools.RestKafkaConnectApi.connectorPluginsValidate(RestKafkaConnectApi.scala:270)
	at com.datamountaineer.connect.tools.ExecuteCommand$.apply(Cli.scala:71)
	at com.datamountaineer.connect.tools.Cli$.main(Cli.scala:194)
	at com.datamountaineer.connect.tools.Cli.main(Cli.scala)
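
For reference, the same validation can be exercised directly against the Connect worker's REST API, whose response body is often easier to read than the CLI exception; a 400 there typically explains what was rejected (for example, a missing or mismatched connector.class). Below is a minimal Scala sketch using scalaj-http; the worker URL, plugin name and config JSON are placeholders, not values taken from this issue.

import scalaj.http.Http

object ValidateDirectly extends App {
  // Placeholders: adjust the worker URL, plugin name and config to your setup.
  val worker = "http://localhost:8083"
  val plugin = "KuduSinkConnector"
  val config =
    """{"connector.class":"KuduSinkConnector","name":"kudu-sink","topics":"orders"}"""

  val response = Http(s"$worker/connector-plugins/$plugin/config/validate")
    .postData(config)
    .method("PUT")
    .header("Content-Type", "application/json")
    .asString

  // A 400 body explains why the request was rejected; a 200 body carries
  // per-field error messages and an overall error_count.
  println(s"${response.code}: ${response.body}")
}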

Build failed with an exception

When I run gradle buildCli, I get:

FAILURE: Build failed with an exception.

* Where:
Build file '/Users/brian.lipp/Documents/GitHub/kafka-connect-tools/build.gradle' line: 39

* What went wrong:
A problem occurred evaluating root project 'kafka-connect-cli'.
> Failed to apply plugin [id 'com.github.maiflai.scalatest']
   > java.lang.UnsupportedOperationException (no error message)

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 0s

I'm using Gradle 4.7.

Exception occurs when running an elastic sink

java.lang.NoSuchMethodError: io.netty.util.internal.ObjectUtil.checkPositive(ILjava/lang/String;)I
at io.netty.util.NettyRuntime$AvailableProcessorsHolder.setAvailableProcessors(NettyRuntime.java:44)
at io.netty.util.NettyRuntime$AvailableProcessorsHolder.availableProcessors(NettyRuntime.java:70)
at io.netty.util.NettyRuntime.availableProcessors(NettyRuntime.java:98)
at org.elasticsearch.transport.netty4.Netty4Utils.setAvailableProcessors(Netty4Utils.java:83)
at org.elasticsearch.transport.netty4.Netty4Transport.<init>(Netty4Transport.java:138)
at org.elasticsearch.transport.Netty4Plugin.lambda$getTransports$0(Netty4Plugin.java:93)
at org.elasticsearch.client.transport.TransportClient.buildTemplate(TransportClient.java:176)
at org.elasticsearch.client.transport.TransportClient.<init>(TransportClient.java:268)
at org.elasticsearch.transport.client.PreBuiltTransportClient.<init>(PreBuiltTransportClient.java:127)
at org.elasticsearch.transport.client.PreBuiltTransportClient.<init>(PreBuiltTransportClient.java:113)
at org.elasticsearch.transport.client.PreBuiltTransportClient.<init>(PreBuiltTransportClient.java:103)
at com.sksamuel.elastic4s.TcpClientConstructors$class.transport(TcpClient.scala:104)
at com.sksamuel.elastic4s.TcpClient$.transport(TcpClient.scala:112)
at com.datamountaineer.streamreactor.connect.elastic5.KElasticClient$.getClient(KElasticClient.scala:47)
at com.datamountaineer.streamreactor.connect.elastic5.ElasticWriter$.apply(ElasticWriter.scala:45)
at com.datamountaineer.streamreactor.connect.elastic5.ElasticSinkTask.start(ElasticSinkTask.scala:44)
at org.apache.kafka.connect.runtime.WorkerSinkTask.initializeAndStart(WorkerSinkTask.java:232)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:145)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:146)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:190)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

Configure a custom read timeout interval to handle SocketTimeoutException

Hi.

Is it possible to configure a custom read timeout interval for HTTP requests? The stack trace below was generated by CLI v0.5. This is not a client-side bug, since the timeout obviously depends on how quickly the server responds, but a configurable interval would be useful in some cases.

java.net.SocketTimeoutException: Read timed out
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at sun.net.www.protocol.http.HttpURLConnection$10.run(HttpURLConnection.java:1890)
at sun.net.www.protocol.http.HttpURLConnection$10.run(HttpURLConnection.java:1885)
at java.security.AccessController.doPrivileged(Native Method)
at sun.net.www.protocol.http.HttpURLConnection.getChainedException(HttpURLConnection.java:1884)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1457)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1441)
at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:480)
at scalaj.http.HttpRequest.exec(Http.scala:351)
at scalaj.http.HttpRequest.execute(Http.scala:322)
at scalaj.http.HttpRequest.asString(Http.scala:537)
at com.datamountaineer.connect.tools.ScalajHttpClient$.request(RestKafkaConnectApi.scala:62)
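
For context, scalaj-http (which the stack trace shows the CLI using for its requests) accepts per-request connect and read timeouts, so a configurable read timeout could be wired through roughly as in the sketch below. The readTimeoutMs parameter is hypothetical; the CLI does not currently expose such an option.

import scalaj.http.{Http, HttpResponse}

// Sketch only: scalaj-http takes connect and read timeouts per request, so a CLI
// option could be passed down to something like this.
def requestWithReadTimeout(url: String, readTimeoutMs: Int): HttpResponse[String] =
  Http(url)
    .timeout(5000, readTimeoutMs) // connect timeout (ms), read timeout (ms)
    .asString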

CLI does not correctly validate sink properties file

My sink properties file had an error. Instead of

connect.kudu.master

I had

connect.kudu.kaster

The CLI created a new connector and reported the properties as valid. However, the Confluent Connect log recorded a stream of errors and exceptions, which went away after correcting the property.
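
For comparison, the Connect worker's validate endpoint reports an error_count for a submitted configuration, so a check like the hedged sketch below could run before the CLI declares the properties valid. The worker URL and plugin name are placeholders, and the error_count extraction is deliberately simplistic.

import scalaj.http.Http

// Sketch: ask the worker how many configuration errors it sees for this config.
def configErrorCount(workerUrl: String, plugin: String, configJson: String): Int = {
  val body = Http(s"$workerUrl/connector-plugins/$plugin/config/validate")
    .postData(configJson)
    .method("PUT")
    .header("Content-Type", "application/json")
    .asString
    .body
  // Pull the "error_count" field out of the JSON response; 0 means no errors found.
  "\"error_count\"\\s*:\\s*(\\d+)".r
    .findFirstMatchIn(body)
    .map(_.group(1).toInt)
    .getOrElse(0)
}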

Basic auth support?

Great CLI! Does it include basic auth support? I see no references to it in the repo.
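
For context, scalaj-http, which the CLI uses for its REST calls (per the stack traces elsewhere in this tracker), ships a basic-auth helper, so support would roughly amount to the sketch below. The user/password parameters are hypothetical, not existing CLI options.

import scalaj.http.Http

// Hypothetical sketch: if the CLI grew user/password options, scalaj-http's auth
// helper would add the Authorization: Basic header to each request.
def getWithBasicAuth(url: String, user: String, password: String): String =
  Http(url)
    .auth(user, password)
    .asString
    .body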

Properties parsing regex rejects valid properties

Hi,

According to the documentation for the Java Properties class, properties may be split over multiple lines. However, the regex currently used to parse these properties doesn't support this.

For example, this setting:

fields.whitelist=\
     id,\
     timestamp,\
     search_id

... should be parsed into the JSON

{"fields.whitelist":"id,timestamp,search_id"}

But instead it is parsed into

{"fields.whitelist":"\\"}

I'd like to propose changing the ExecuteCommand.propsToMap method to use the Properties class instead of a regex. If this is OK, I can submit a PR for this. Thanks!
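
For concreteness, here is a minimal sketch of that proposal, assuming propsToMap receives the raw properties text as a single string; the actual signature in the CLI may differ.

import java.io.StringReader
import java.util.Properties
import scala.collection.JavaConverters._

// Sketch of the proposed change: let java.util.Properties handle escaping and
// line continuations, then convert the result to an immutable Scala Map.
def propsToMap(raw: String): Map[String, String] = {
  val props = new Properties()
  props.load(new StringReader(raw))
  props.asScala.toMap
}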

Error SocketTimeoutException when trying to create cassandra connector

This is my config file, cassandra-sink-distributed-orders.properties:
name=cassandra-sink-orders
connector.class=com.datamountaineer.streamreactor.connect.cassandra.sink.CassandraSinkConnector
tasks.max=1
topics=orders-topic
connect.cassandra.export.route.query=INSERT INTO orders SELECT * FROM orders-topic
connect.cassandra.contact.points=localhost
connect.cassandra.port=9042
connect.cassandra.key.space=demo
connect.cassandra.contact.points=localhost
connect.cassandra.username=cassandra
connect.cassandra.password=cassandra

This is the command to create the connector:
./cli create cassandra-sink-orders < cassandra-sink-distributed-orders.properties

I'm getting this after a few seconds:

java.net.SocketTimeoutException: Read timed out
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at sun.net.www.protocol.http.HttpURLConnection$10.run(HttpURLConnection.java:1926)
at sun.net.www.protocol.http.HttpURLConnection$10.run(HttpURLConnection.java:1921)
at java.security.AccessController.doPrivileged(Native Method)
at sun.net.www.protocol.http.HttpURLConnection.getChainedException(HttpURLConnection.java:1920)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1490)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1474)
at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:480)
at scalaj.http.HttpRequest.exec(Http.scala:351)
at scalaj.http.HttpRequest.execute(Http.scala:322)
at scalaj.http.HttpRequest.asString(Http.scala:537)
at com.datamountaineer.connect.tools.ScalajHttpClient$.request(RestKafkaConnectApi.scala:39)
at com.datamountaineer.connect.tools.RestKafkaConnectApi.com$datamountaineer$connect$tools$RestKafkaConnectApi$$req(RestKafkaConnectApi.scala:129)
at com.datamountaineer.connect.tools.RestKafkaConnectApi$$anonfun$addConnector$1.apply(RestKafkaConnectApi.scala:164)
at com.datamountaineer.connect.tools.RestKafkaConnectApi$$anonfun$addConnector$1.apply(RestKafkaConnectApi.scala:165)
at scala.util.Try$.apply(Try.scala:192)
at com.datamountaineer.connect.tools.RestKafkaConnectApi.addConnector(RestKafkaConnectApi.scala:164)
at com.datamountaineer.connect.tools.ExecuteCommand$.apply(Cli.scala:55)
at com.datamountaineer.connect.tools.Cli$.main(Cli.scala:167)
at com.datamountaineer.connect.tools.Cli.main(Cli.scala)
Caused by: java.net.SocketTimeoutException: Read timed out
at java.net.SocketInputStream.socketRead0(Native Method)
at java.net.SocketInputStream.socketRead(SocketInputStream.java:116)
at java.net.SocketInputStream.read(SocketInputStream.java:171)
at java.net.SocketInputStream.read(SocketInputStream.java:141)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
at sun.net.www.http.HttpClient.parseHTTPHeader(HttpClient.java:704)
at sun.net.www.http.HttpClient.parseHTTP(HttpClient.java:647)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1569)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1474)
at scalaj.http.HttpRequest.exec(Http.scala:349)
... 11 more

Build problems with Java 11?

Using ./gradlew buildCli, I am getting this error from the build:

FAILURE: Build failed with an exception.

  • What went wrong:
    Could not determine java version from '11.0.2'.

  • Try:
    Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.

Pause and resume commands use HTTP POST instead of PUT

Hi there,

First off, great project, thanks so much for creating it.

I noticed that the pause and resume commands use POST instead of PUT when talking to the Kafka Connect REST API.

Specifically, these methods should be using PUT:

https://github.com/datamountaineer/kafka-connect-tools/blob/master/src/main/scala/com/datamountaineer/connect/tools/RestKafkaConnectApi.scala#L198

https://github.com/datamountaineer/kafka-connect-tools/blob/master/src/main/scala/com/datamountaineer/connect/tools/RestKafkaConnectApi.scala#L206

This is the result printed from the Connect Worker:

[2017-02-08 17:09:54,503] INFO 127.0.0.1 - - [08/Feb/2017:22:09:54 +0000] "POST /connectors/sqlserver-test-sink/pause HTTP/1.1" 405 323  2 (org.apache.kafka.connect.runtime.rest.RestServer:60)

Note the HTTP 405 response.

The API for these is specified here:

http://docs.confluent.io/3.1.2/connect/restapi.html#put--connectors-(string-name)-pause

I can submit a PR for these changes if you'd like. Thanks!
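
For illustration, a minimal sketch of the PUT variant using scalaj-http, which RestKafkaConnectApi already uses; the worker URL here is a placeholder.

import scalaj.http.Http

// Sketch of the fix: pause and resume are PUT endpoints with empty bodies in the
// Connect REST API, and a successful call returns 202 Accepted.
def pauseConnector(workerUrl: String, name: String): Int =
  Http(s"$workerUrl/connectors/$name/pause")
    .method("PUT")
    .asString
    .code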
