lensesio / kafka-connect-tools
Kafka Connect Tooling
License: Apache License 2.0
In order for other tools (see Confluent Platform CSD) to pick up & use
Tried to build this on Azure Pipelines and got the error above, so I ran the build again with "--warning-mode all" to see the deprecated features in use and got the messages below:
git clone https://github.com/lensesio/kafka-connect-tools.git
Cloning into 'kafka-connect-tools'...
cd kafka-connect-tools
gradle buildCli --warning-mode all
Welcome to Gradle 6.3!
Here are the highlights of this release:
- Java 14 support
- Improved error messages for unexpected failures
For more details see https://docs.gradle.org/6.3/release-notes.html
Starting a Gradle Daemon (subsequent builds will be faster)
> Configure project :
The maven plugin has been deprecated. This is scheduled to be removed in Gradle 7.0. Please use the maven-publish plugin instead. Consult the upgrading guide for further information: https://docs.gradle.org/6.3/userguide/upgrading_version_5.html#legacy_publication_system_is_deprecated_and_replaced_with_the_publish_plugins
at build_bp759d9y9oh4pwffs7cy18zv6$_run_closure2.doCall(D:\a\r1\a\kafka-connect-tools\build.gradle:38)
(Run with --stacktrace to get the full stack trace of this deprecation warning.)
The compile configuration has been deprecated for dependency declaration. This will fail with an error in Gradle 7.0. Please use the implementation configuration instead. Consult the upgrading guide for further information: https://docs.gradle.org/6.3/userguide/upgrading_version_5.html#dependencies_should_no_longer_be_declared_using_the_compile_and_runtime_configurations
at build_bp759d9y9oh4pwffs7cy18zv6$_run_closure3.doCall(D:\a\r1\a\kafka-connect-tools\build.gradle:54)
(Run with --stacktrace to get the full stack trace of this deprecation warning.)
The testCompile configuration has been deprecated for dependency declaration. This will fail with an error in Gradle 7.0. Please use the testImplementation configuration instead. Consult the upgrading guide for further information: https://docs.gradle.org/6.3/userguide/upgrading_version_5.html#dependencies_should_no_longer_be_declared_using_the_compile_and_runtime_configurations
at build_bp759d9y9oh4pwffs7cy18zv6$_run_closure3.doCall(D:\a\r1\a\kafka-connect-tools\build.gradle:61)
(Run with --stacktrace to get the full stack trace of this deprecation warning.)
The testRuntime configuration has been deprecated for dependency declaration. This will fail with an error in Gradle 7.0. Please use the testRuntimeOnly configuration instead. Consult the upgrading guide for further information: https://docs.gradle.org/6.3/userguide/upgrading_version_5.html#dependencies_should_no_longer_be_declared_using_the_compile_and_runtime_configurations
at build_bp759d9y9oh4pwffs7cy18zv6$_run_closure3.doCall(D:\a\r1\a\kafka-connect-tools\build.gradle:63)
(Run with --stacktrace to get the full stack trace of this deprecation warning.)
The AbstractArchiveTask.baseName property has been deprecated. This is scheduled to be removed in Gradle 7.0. Please use the archiveBaseName property instead. See https://docs.gradle.org/6.3/dsl/org.gradle.api.tasks.bundling.AbstractArchiveTask.html#org.gradle.api.tasks.bundling.AbstractArchiveTask:baseName for more details.
at build_bp759d9y9oh4pwffs7cy18zv6$_run_closure5.doCall(D:\a\r1\a\kafka-connect-tools\build.gradle:77)
(Run with --stacktrace to get the full stack trace of this deprecation warning.)
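The warnings above each name their replacement directly. As an illustration only, a hypothetical build.gradle fragment applying those replacements (the actual plugin, dependency, and task blocks in this repository's build.gradle will differ):

```groovy
// Hypothetical migration sketch based on the deprecation messages above:
plugins {
    id 'maven-publish'                 // replaces the legacy 'maven' plugin
}

dependencies {
    implementation 'some.group:some-artifact:1.0'       // was: compile
    testImplementation 'some.group:test-artifact:1.0'   // was: testCompile
    testRuntimeOnly 'some.group:runtime-artifact:1.0'   // was: testRuntime
}

jar {
    archiveBaseName = 'connect-cli'    // was: baseName
}
```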
> Task :clean UP-TO-DATE
> Task :compileJava NO-SOURCE
> Task :compileScala
> Task :processResources NO-SOURCE
> Task :classes
> Task :shadowJar
> Task :fatJar
> Task :buildCli FAILED
FAILURE: Build failed with an exception.
4 actionable tasks: 3 executed, 1 up-to-date
* What went wrong:
Execution failed for task ':buildCli'.
> A problem occurred starting process 'command 'bin/package.sh''
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 30s
I'm not a Java developer, so I don't know much about Maven or Gradle. Am I doing something wrong?
curl -X PUT -H "Content-Type: application/json" -d "{}" http://localhost:8083/connector-plugins
Per documentation I should be able to validate the sink properties via
connect-cli validate < kudu_sink.properties
However I get
Error: Please specify the connector-name
Tried
connect-cli validate kudu-sink < kudu_sink.properties
but that throws
java.lang.Exception: Error: the Kafka Connect API returned status code 400
at com.datamountaineer.connect.tools.RestKafkaConnectApi.non2xxException(RestKafkaConnectApi.scala:122)
at com.datamountaineer.connect.tools.RestKafkaConnectApi.com$datamountaineer$connect$tools$RestKafkaConnectApi$$req(RestKafkaConnectApi.scala:139)
at com.datamountaineer.connect.tools.RestKafkaConnectApi$$anonfun$connectorPluginsValidate$1.apply(RestKafkaConnectApi.scala:270)
at com.datamountaineer.connect.tools.RestKafkaConnectApi$$anonfun$connectorPluginsValidate$1.apply(RestKafkaConnectApi.scala:270)
at scala.util.Try$.apply(Try.scala:192)
at com.datamountaineer.connect.tools.RestKafkaConnectApi.connectorPluginsValidate(RestKafkaConnectApi.scala:270)
at com.datamountaineer.connect.tools.ExecuteCommand$.apply(Cli.scala:71)
at com.datamountaineer.connect.tools.Cli$.main(Cli.scala:194)
at com.datamountaineer.connect.tools.Cli.main(Cli.scala)
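For reference, Connect's validation endpoint is PUT /connector-plugins/{connector-class}/config/validate, and the class segment in the URL generally has to correspond to the connector.class in the submitted config; a mismatch or missing required keys is a common cause of a 400. A small sketch (Python, helper name hypothetical) of building that request:

```python
import json

def build_validate_request(base_url, connector_class, config):
    """Build (method, url, body) for Kafka Connect's config validation endpoint.

    The path segment is the connector class; the body is the flat config map,
    which should itself contain 'connector.class' matching that segment.
    """
    url = f"{base_url}/connector-plugins/{connector_class}/config/validate"
    return "PUT", url, json.dumps(config)

method, url, body = build_validate_request(
    "http://localhost:8083",
    "com.datamountaineer.streamreactor.connect.kudu.sink.KuduSinkConnector",
    {
        "connector.class": "com.datamountaineer.streamreactor.connect.kudu.sink.KuduSinkConnector",
        "connect.kudu.master": "localhost:7051",
    },
)
```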
Hi @andrewstevenson,
Is it possible to support multiple --endpoints?
That way we could manage a Connect cluster with a single CLI.
Looking forward to your kind reply. :)
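One way such a feature could fan a command out across several workers, sketched in Python (the function and behavior are hypothetical, not part of connect-cli):

```python
def fan_out(endpoints, connector, action):
    """Build one REST URL per Connect endpoint for the same connector action."""
    return [f"{ep.rstrip('/')}/connectors/{connector}/{action}" for ep in endpoints]

urls = fan_out(["http://worker1:8083", "http://worker2:8083/"], "my-sink", "status")
# → one URL per worker, trailing slashes normalized
```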
If they don't match, the framework will die in distributed mode while looking up the configs in Kafka.
when I run gradle buildCli
I get
FAILURE: Build failed with an exception.
* Where:
Build file '/Users/brian.lipp/Documents/GitHub/kafka-connect-tools/build.gradle' line: 39
* What went wrong:
A problem occurred evaluating root project 'kafka-connect-cli'.
> Failed to apply plugin [id 'com.github.maiflai.scalatest']
> java.lang.UnsupportedOperationException (no error message)
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 0s
I'm using Gradle 4.7
It should be possible to get rid of the JVM startup overhead using the GraalVM Native Image mechanism. I will take a look at this later, but have you folks already experimented with this route?
java.lang.NoSuchMethodError: io.netty.util.internal.ObjectUtil.checkPositive(ILjava/lang/String;)I
at io.netty.util.NettyRuntime$AvailableProcessorsHolder.setAvailableProcessors(NettyRuntime.java:44)
at io.netty.util.NettyRuntime$AvailableProcessorsHolder.availableProcessors(NettyRuntime.java:70)
at io.netty.util.NettyRuntime.availableProcessors(NettyRuntime.java:98)
at org.elasticsearch.transport.netty4.Netty4Utils.setAvailableProcessors(Netty4Utils.java:83)
at org.elasticsearch.transport.netty4.Netty4Transport.&lt;init&gt;(Netty4Transport.java:138)
at org.elasticsearch.transport.Netty4Plugin.lambda$getTransports$0(Netty4Plugin.java:93)
at org.elasticsearch.client.transport.TransportClient.buildTemplate(TransportClient.java:176)
at org.elasticsearch.client.transport.TransportClient.&lt;init&gt;(TransportClient.java:268)
at org.elasticsearch.transport.client.PreBuiltTransportClient.&lt;init&gt;(PreBuiltTransportClient.java:127)
at org.elasticsearch.transport.client.PreBuiltTransportClient.&lt;init&gt;(PreBuiltTransportClient.java:113)
at org.elasticsearch.transport.client.PreBuiltTransportClient.&lt;init&gt;(PreBuiltTransportClient.java:103)
at com.sksamuel.elastic4s.TcpClientConstructors$class.transport(TcpClient.scala:104)
at com.sksamuel.elastic4s.TcpClient$.transport(TcpClient.scala:112)
at com.datamountaineer.streamreactor.connect.elastic5.KElasticClient$.getClient(KElasticClient.scala:47)
at com.datamountaineer.streamreactor.connect.elastic5.ElasticWriter$.apply(ElasticWriter.scala:45)
at com.datamountaineer.streamreactor.connect.elastic5.ElasticSinkTask.start(ElasticSinkTask.scala:44)
at org.apache.kafka.connect.runtime.WorkerSinkTask.initializeAndStart(WorkerSinkTask.java:232)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:145)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:146)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:190)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Hi.
Is it possible to configure a custom read timeout for HTTP requests? The stack trace below was generated by CLI v0.5. This isn't a client-side bug, since the timeout obviously depends on how long the server takes to respond, but a configurable value could be useful in some cases.
java.net.SocketTimeoutException: Read timed out
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at sun.net.www.protocol.http.HttpURLConnection$10.run(HttpURLConnection.java:1890)
at sun.net.www.protocol.http.HttpURLConnection$10.run(HttpURLConnection.java:1885)
at java.security.AccessController.doPrivileged(Native Method)
at sun.net.www.protocol.http.HttpURLConnection.getChainedException(HttpURLConnection.java:1884)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1457)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1441)
at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:480)
at scalaj.http.HttpRequest.exec(Http.scala:351)
at scalaj.http.HttpRequest.execute(Http.scala:322)
at scalaj.http.HttpRequest.asString(Http.scala:537)
at com.datamountaineer.connect.tools.ScalajHttpClient$.request(RestKafkaConnectApi.scala:62)
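The CLI is Scala over scalaj-http, which exposes a per-request timeout option, so wiring a flag through would be the clean fix; as a stopgap on the JVM, the sun.net.client.defaultReadTimeout system property may also apply to HttpURLConnection-based clients. As an illustration of the requested knob, a process-wide default in Python:

```python
import socket

def configure_read_timeout(seconds):
    """Set a process-wide default timeout for new sockets (affects urllib too)."""
    socket.setdefaulttimeout(seconds)
    return socket.getdefaulttimeout()

timeout = configure_read_timeout(30.0)
```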
Assume an environment variable is set to locate the Confluent install, and start the connector cluster on the specified hosts.
Allow setting log4j and connector properties.
My sink properties file had an error. Instead of
connect.kudu.master
I had
connect.kudu.kaster
The CLI created a new connector and said the properties were valid. However, the Confluent Connect log logged a bunch of errors and exceptions, which went away after correcting the property.
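Typos like this could be flagged by comparing the submitted keys against the config keys the connector actually defines (which Connect's validate response lists). A sketch of that check (Python, helper name hypothetical):

```python
def unknown_keys(submitted, known_keys):
    """Return submitted config keys the connector doesn't define.

    Anything the user supplied that isn't among the connector's known
    config keys (e.g. 'connect.kudu.kaster') can be flagged before the
    connector is created.
    """
    return sorted(set(submitted) - set(known_keys))

typos = unknown_keys(
    {"connect.kudu.kaster": "host:7051", "tasks.max": "1"},
    ["connect.kudu.master", "tasks.max", "topics"],
)
# → ['connect.kudu.kaster']
```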
Great CLI! Does it include basic auth support? I see no references to it in the repo.
Hi,
According to the documentation for the Java Properties class, properties may be split over multiple lines. However, the regex currently used to parse these properties doesn't support this.
For example, this setting:
fields.whitelist=\
id,\
timestamp,\
search_id
... should be parsed into the JSON
{"fields.whitelist":"id,timestamp,search_id"}
But instead it is parsed into
{"fields.whitelist":"\\"}
I'd like to propose changing the ExecuteCommand.propsToMap method to use the Properties class instead of a regex. If this is OK, I can submit a PR for this. Thanks!
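The continuation semantics are easy to state; a Python sketch of what Properties-style parsing would produce for the example above (java.util.Properties also strips leading whitespace from continuation lines, which the strip() call mimics):

```python
def parse_properties(text):
    """Parse key=value properties, honoring backslash line continuations."""
    props = {}
    pending = []
    for raw in text.splitlines():
        line = raw.strip()
        if not line or line.startswith(("#", "!")):
            continue  # skip blanks and comment lines
        if line.endswith("\\"):
            pending.append(line[:-1])  # continuation: drop the backslash, keep going
            continue
        pending.append(line)
        key, _, value = "".join(pending).partition("=")
        props[key.strip()] = value.strip()
        pending = []
    return props

example = (
    "fields.whitelist=\\\n"
    "id,\\\n"
    "timestamp,\\\n"
    "search_id\n"
)
result = parse_properties(example)
# → {'fields.whitelist': 'id,timestamp,search_id'}
```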
This is my config file cassandra-sink-distributed-orders.properties:
name=cassandra-sink-orders
connector.class=com.datamountaineer.streamreactor.connect.cassandra.sink.CassandraSinkConnector
tasks.max=1
topics=orders-topic
connect.cassandra.export.route.query=INSERT INTO orders SELECT * FROM orders-topic
connect.cassandra.contact.points=localhost
connect.cassandra.port=9042
connect.cassandra.key.space=demo
connect.cassandra.contact.points=localhost
connect.cassandra.username=cassandra
connect.cassandra.password=cassandra
This the command to create the connector :
./cli create cassandra-sink-orders < cassandra-sink-distributed-orders.properties
I'm getting this after a few seconds :
java.net.SocketTimeoutException: Read timed out
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at sun.net.www.protocol.http.HttpURLConnection$10.run(HttpURLConnection.java:1926)
    at sun.net.www.protocol.http.HttpURLConnection$10.run(HttpURLConnection.java:1921)
    at java.security.AccessController.doPrivileged(Native Method)
    at sun.net.www.protocol.http.HttpURLConnection.getChainedException(HttpURLConnection.java:1920)
    at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1490)
    at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1474)
    at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:480)
    at scalaj.http.HttpRequest.exec(Http.scala:351)
    at scalaj.http.HttpRequest.execute(Http.scala:322)
    at scalaj.http.HttpRequest.asString(Http.scala:537)
    at com.datamountaineer.connect.tools.ScalajHttpClient$.request(RestKafkaConnectApi.scala:39)
    at com.datamountaineer.connect.tools.RestKafkaConnectApi.com$datamountaineer$connect$tools$RestKafkaConnectApi$$req(RestKafkaConnectApi.scala:129)
    at com.datamountaineer.connect.tools.RestKafkaConnectApi$$anonfun$addConnector$1.apply(RestKafkaConnectApi.scala:164)
    at com.datamountaineer.connect.tools.RestKafkaConnectApi$$anonfun$addConnector$1.apply(RestKafkaConnectApi.scala:165)
    at scala.util.Try$.apply(Try.scala:192)
    at com.datamountaineer.connect.tools.RestKafkaConnectApi.addConnector(RestKafkaConnectApi.scala:164)
    at com.datamountaineer.connect.tools.ExecuteCommand$.apply(Cli.scala:55)
    at com.datamountaineer.connect.tools.Cli$.main(Cli.scala:167)
    at com.datamountaineer.connect.tools.Cli.main(Cli.scala)
Caused by: java.net.SocketTimeoutException: Read timed out
    at java.net.SocketInputStream.socketRead0(Native Method)
    at java.net.SocketInputStream.socketRead(SocketInputStream.java:116)
    at java.net.SocketInputStream.read(SocketInputStream.java:171)
    at java.net.SocketInputStream.read(SocketInputStream.java:141)
    at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
    at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
    at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
    at sun.net.www.http.HttpClient.parseHTTPHeader(HttpClient.java:704)
    at sun.net.www.http.HttpClient.parseHTTP(HttpClient.java:647)
    at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1569)
    at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1474)
    at scalaj.http.HttpRequest.exec(Http.scala:349)
    ... 11 more
Using ./gradlew buildCli, I am getting this error from the build:
FAILURE: Build failed with an exception.
* What went wrong:
Could not determine java version from '11.0.2'.
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.
Required for publication to Maven, and better than empty docs.
Hi there,
First off, great project, thanks so much for creating it.
I noticed that the pause and resume commands use POST when talking to the Kafka Connect REST API instead of PUT. Specifically, these methods should be using PUT:
This is the result printed from the Connect Worker:
[2017-02-08 17:09:54,503] INFO 127.0.0.1 - - [08/Feb/2017:22:09:54 +0000] "POST /connectors/sqlserver-test-sink/pause HTTP/1.1" 405 323 2 (org.apache.kafka.connect.runtime.rest.RestServer:60)
Note the HTTP 405 response.
The API for these is specified here:
http://docs.confluent.io/3.1.2/connect/restapi.html#put--connectors-(string-name)-pause
I can submit a PR for these changes if you'd like. Thanks!
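The fix amounts to switching the HTTP verb on those two calls; a sketch of the corrected requests (Python, helper names hypothetical):

```python
def pause_request(base_url, connector):
    # Kafka Connect expects PUT /connectors/{name}/pause; POST returns 405
    return "PUT", f"{base_url}/connectors/{connector}/pause"

def resume_request(base_url, connector):
    # Likewise PUT /connectors/{name}/resume
    return "PUT", f"{base_url}/connectors/{connector}/resume"

pause = pause_request("http://localhost:8083", "sqlserver-test-sink")
resume = resume_request("http://localhost:8083", "sqlserver-test-sink")
```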