flink-shaded's Introduction

Apache Flink Shaded Dependencies

This repository contains a number of shaded dependencies for the Apache Flink project.

The purpose of these dependencies is to provide a single instance of a shaded dependency in the Flink distribution, instead of each individual module shading the dependency.

Shaded dependencies contained here do not expose any transitive dependencies. They may or may not be self-contained.

When using these dependencies it is recommended to work directly against the shaded namespaces.

Sources

We currently do not release jars containing the shaded sources due to the unanswered legal questions raised here.

However, it is possible to build these jars locally by cloning the repository and calling mvn clean package -Dshade-sources.

About

Apache Flink is an open source project of The Apache Software Foundation (ASF).

flink-shaded's Issues

Simplify artifact version scheme

The current version scheme for the released artifacts is a bit confusing:

Current: flink-shaded-asm-5-1.0-5.0.4
Proposed: flink-shaded-asm_5.0.4-1.0
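The difference between the two schemes can be shown with a small helper (hypothetical, for illustration only; the module name, ASM major version, flink-shaded release, and upstream version are taken from the example above):

```java
public class ArtifactNames {

    // Current scheme: <module>-<upstreamMajor>-<flinkShadedVersion>-<upstreamVersion>
    static String current(String module, int upstreamMajor, String shadedVersion, String upstreamVersion) {
        return module + "-" + upstreamMajor + "-" + shadedVersion + "-" + upstreamVersion;
    }

    // Proposed scheme: <module>_<upstreamVersion>-<flinkShadedVersion>
    static String proposed(String module, String upstreamVersion, String shadedVersion) {
        return module + "_" + upstreamVersion + "-" + shadedVersion;
    }

    public static void main(String[] args) {
        System.out.println(current("flink-shaded-asm", 5, "1.0", "5.0.4"));
        System.out.println(proposed("flink-shaded-asm", "5.0.4", "1.0"));
    }
}
```

The proposed form drops the redundant major-version infix and puts the upstream version where Scala-style suffixes usually go, so the flink-shaded release number always comes last.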

Netty jar still references netty-router

We no longer bundle netty-router in the netty jar, but the shade-plugin relocation still refers to netty-router and the licenses for netty-router are still included in the jar.

Update developerConnection

With the move to gitbox the following line in the root pom needs to be adjusted:
<developerConnection>scm:git:https://git-wip-us.apache.org/repos/asf/flink-shaded.git</developerConnection>

It should point to https://github.com/apache/flink-shaded.git
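The corrected entry would look roughly like this (a sketch of the relevant pom fragment only; the surrounding <scm> element is assumed to match the root pom):

```xml
<scm>
  <developerConnection>scm:git:https://github.com/apache/flink-shaded.git</developerConnection>
</scm>
```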

Bump netty to 4.0.56

With the reworked network stack we may be able to finally update our netty version, which is also a prerequisite for Java 9 support.

Jackson dependencies not properly hidden

Dependencies that are defined in parent poms (in this case flink-shaded-jackson-parent) are still visible to Maven for child modules, even if the children shade them.

We have to explicitly declare the dependencies in the child poms and keep only dependencyManagement entries in the parent.
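A sketch of the resulting layout (only the relevant elements are shown; jackson-databind stands in for the full set of shaded artifacts):

```xml
<!-- parent pom (flink-shaded-jackson-parent): versions only, no direct dependencies -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-databind</artifactId>
      <version>${jackson.version}</version>
    </dependency>
  </dependencies>
</dependencyManagement>

<!-- child pom: declares the dependency explicitly, inheriting the managed version -->
<dependencies>
  <dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
  </dependency>
</dependencies>
```

With this split, the dependency is only on the compile classpath of modules that actually declare it, so shading in one child no longer leaks through the parent to its siblings.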

Add release script

We should have a release-script for flink-shaded like we do for Flink.

Extend flink-shaded-jackson with a module containing jackson-module-jsonSchema

The RESTAPIGenerator uses jackson-module-jsonSchema to generate a JSON schema for requests/responses. Unfortunately, it isn't enough for flink-docs to simply declare the dependency, since the module works against the original jackson annotations and not the relocated ones we're using. As such we need a new module that contains the jsonSchema module and ideally bundles jackson as well, to remain self-contained.

Add force-shading module

To consolidate shading-related modules we should move force-shading from flink to flink-shaded.

Publish maven artifact containing shaded sources

When working against the shaded dependencies it is not possible to jump to the source of a class, since we don't build source jars.

The source jars can be created by adding the following lines to the maven-shade-plugin configuration:

<createSourcesJar>true</createSourcesJar>
<shadeSourcesContent>true</shadeSourcesContent>
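In context, these lines would sit inside the plugin's configuration block, roughly like this (a sketch only; the actual relocation settings and executions are omitted):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <!-- also produce a jar of the sources alongside the binary jar -->
    <createSourcesJar>true</createSourcesJar>
    <!-- apply the configured relocations to the source content as well -->
    <shadeSourcesContent>true</shadeSourcesContent>
  </configuration>
</plugin>
```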

This is mostly blocked by legal questions:

  • Do we have to modify the release process for these jars?
    • Do we have to vote on the shaded source release?
    • Do we have to modify NOTICE/LICENSE files?
  • Are we allowed to publish a sources jar that isn't actually the source of the project?
  • Are we allowed to publish a sources jar that cannot necessarily be compiled to the binary?
    • If the shaded sources and binary are the result of two separate shading processes, we cannot guarantee this.

Shaded netty does not work with epoll activated

If epoll is activated via the (undocumented) taskmanager.network.netty.transport: epoll configuration option, the netty stack will crash with

2017-12-19 11:45:53,859 ERROR org.apache.flink.runtime.taskmanager.TaskManager              - Error while starting up taskManager
java.lang.NoClassDefFoundError: io/netty/util/NetUtil
	at java.lang.ClassLoader$NativeLibrary.load(Native Method)
	at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1941)
	at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1824)
	at java.lang.Runtime.load0(Runtime.java:809)
	at java.lang.System.load(System.java:1086)
	at org.apache.flink.shaded.netty4.io.netty.util.internal.NativeLibraryLoader.load(NativeLibraryLoader.java:193)
	at org.apache.flink.shaded.netty4.io.netty.channel.epoll.Native.<clinit>(Native.java:49)
	at org.apache.flink.shaded.netty4.io.netty.channel.epoll.EpollEventArray.<clinit>(EpollEventArray.java:40)
	at org.apache.flink.shaded.netty4.io.netty.channel.epoll.EpollEventLoop.<init>(EpollEventLoop.java:65)
	at org.apache.flink.shaded.netty4.io.netty.channel.epoll.EpollEventLoopGroup.newChild(EpollEventLoopGroup.java:76)
	at org.apache.flink.shaded.netty4.io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:64)
	at org.apache.flink.shaded.netty4.io.netty.channel.MultithreadEventLoopGroup.<init>(MultithreadEventLoopGroup.java:49)
	at org.apache.flink.shaded.netty4.io.netty.channel.epoll.EpollEventLoopGroup.<init>(EpollEventLoopGroup.java:61)
	at org.apache.flink.shaded.netty4.io.netty.channel.epoll.EpollEventLoopGroup.<init>(EpollEventLoopGroup.java:49)
	at org.apache.flink.runtime.io.network.netty.NettyClient.initEpollBootstrap(NettyClient.java:160)
	at org.apache.flink.runtime.io.network.netty.NettyClient.init(NettyClient.java:80)
	at org.apache.flink.runtime.io.network.netty.NettyConnectionManager.start(NettyConnectionManager.java:52)
	at org.apache.flink.runtime.io.network.NetworkEnvironment.start(NetworkEnvironment.java:289)
	at org.apache.flink.runtime.taskexecutor.TaskManagerServices.fromConfiguration(TaskManagerServices.java:160)
	at org.apache.flink.runtime.taskmanager.TaskManager$.startTaskManagerComponentsAndActor(TaskManager.scala:2003)
	at org.apache.flink.runtime.taskmanager.TaskManager$.runTaskManager(TaskManager.scala:1832)
	at org.apache.flink.runtime.taskmanager.TaskManager$$anonfun$1.apply$mcV$sp(TaskManager.scala:1944)
	at org.apache.flink.runtime.taskmanager.TaskManager$$anonfun$1.apply(TaskManager.scala:1922)
	at org.apache.flink.runtime.taskmanager.TaskManager$$anonfun$1.apply(TaskManager.scala:1922)
	at scala.util.Try$.apply(Try.scala:192)
	at org.apache.flink.runtime.akka.AkkaUtils$.retryOnBindException(AkkaUtils.scala:759)
	at org.apache.flink.runtime.taskmanager.TaskManager$.runTaskManager(TaskManager.scala:1922)
	at org.apache.flink.runtime.taskmanager.TaskManager$.selectNetworkInterfaceAndRunTaskManager(TaskManager.scala:1691)
	at org.apache.flink.runtime.taskmanager.TaskManager$$anon$2.call(TaskManager.scala:1592)
	at org.apache.flink.runtime.taskmanager.TaskManager$$anon$2.call(TaskManager.scala:1590)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1556)
	at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
	at org.apache.flink.runtime.taskmanager.TaskManager$.main(TaskManager.scala:1590)
	at org.apache.flink.runtime.taskmanager.TaskManager.main(TaskManager.scala)
Caused by: java.lang.ClassNotFoundException: io.netty.util.NetUtil
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 36 more

The cause is that libnetty-transport-native-epoll.so is not aligned with the shaded namespace; we can fix this as described in netty/netty#6665 and netty/netty@075a54a:

  • System.setProperty("io.netty.packagePrefix", "org.apache.flink.shaded."); inside projects using flink-shaded
  • rename META-INF/native/libnetty-transport-native-epoll.so to META-INF/native/liborg-apache-flink-shaded-netty-transport-native-epoll.so
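The first workaround amounts to setting the property before any shaded Netty class is loaded, e.g. at the very start of the entry point (a sketch; the class name is hypothetical and the prefix must match Flink's actual relocation pattern):

```java
public class EpollWorkaround {

    public static void main(String[] args) {
        // Must run before any shaded Netty native code is loaded, so that Netty
        // derives the relocated native library name (liborg-apache-flink-shaded-...)
        // instead of the original libnetty-transport-native-epoll.so.
        System.setProperty("io.netty.packagePrefix", "org.apache.flink.shaded.");

        // ... start the TaskManager / initialize the Netty transport here ...
    }
}
```

The second workaround (renaming the .so inside META-INF/native) avoids the need for every downstream project to set the property, which is why it is the more attractive fix for flink-shaded itself.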

[FLINK-11219] Upgrade Jackson version to 2.9.6

  1. Upgrade the Jackson version to 2.9.6, because the Jackson version supported by Calcite 1.18.0 has been upgraded to 2.9.6.
  2. Upgrade the Jackson version shaded by flink-shaded accordingly, because many Flink modules depend on flink-shaded.

Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Hadoop is not in the classpath/dependencies

Hi Team,

Recently I have been using S3 as the HA storage directory for Flink 1.13.1, as configured below, but I get an error when deploying the JobManager and TaskManager to Kubernetes. I've already added the jar to the s3-fs-hadoop folder under /opt/flink/plugins. I'm not sure whether I'm missing the shaded Hadoop jar; if so, which version do I need? Thanks in advance!

flink-conf.yaml
high-availability.storageDir: s3://dsw-dia-test/recovery
state.backend: filesystem
state.checkpoints.dir: s3://dsw-dia-test/checkpoints
state.backend.fs.checkpointdir: s3://dsw-dia-test/checkpoints
s3.path.style.access: true
fs.allowed-fallback-filesystems: s3
s3.endpoint: s3.us-south.cloud-object-storage.appdomain.cloud
s3.access-key: *************
s3.secret-key: ***************

The file system plugins:
drwxr-xr-x 1 flink flink 4096 Jul 4 11:35 s3-fs-hadoop
drwxr-xr-x 1 flink flink 4096 Jul 4 11:35 s3-fs-presto

flink@63f9c0076cfc:~/plugins/s3-fs-hadoop$ ls -lrt
total 19796
-rw-r--r-- 1 flink flink 20269950 Jul 2 02:25 flink-s3-fs-hadoop-1.13.1.jar

flink@63f9c0076cfc:~/plugins/s3-fs-presto$ ls -lrt
total 32692
-rw-r--r-- 1 flink flink 33474159 May 25 12:20 flink-s3-fs-presto-1.13.1.jar

The error is below:

org.apache.flink.runtime.entrypoint.ClusterEntrypointException: Failed to initialize the cluster entrypoint StandaloneSessionClusterEntrypoint.
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startCluster(ClusterEntrypoint.java:212) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runClusterEntrypoint(ClusterEntrypoint.java:600) [flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.runtime.entrypoint.StandaloneSessionClusterEntrypoint.main(StandaloneSessionClusterEntrypoint.java:59) [flink-dist_2.11-1.13.1.jar:1.13.1]
Caused by: java.io.IOException: Could not create FileSystem for highly available storage path (s3://dsw-dia-test/recovery/flink)
at org.apache.flink.runtime.blob.BlobUtils.createFileSystemBlobStore(BlobUtils.java:92) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.runtime.blob.BlobUtils.createBlobStoreFromConfig(BlobUtils.java:76) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.runtime.highavailability.HighAvailabilityServicesUtils.createHighAvailabilityServices(HighAvailabilityServicesUtils.java:115) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.createHaServices(ClusterEntrypoint.java:353) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.initializeServices(ClusterEntrypoint.java:311) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runCluster(ClusterEntrypoint.java:239) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.lambda$startCluster$1(ClusterEntrypoint.java:189) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.runtime.security.contexts.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:28) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startCluster(ClusterEntrypoint.java:186) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
... 2 more
Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Could not find a file system implementation for scheme 's3'. The scheme is not directly supported by Flink and no Hadoop file system to support this scheme could be loaded. For a full list of supported file systems, please see https://ci.apache.org/projects/flink/flink-docs-stable/ops/filesystems/.
at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:530) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.core.fs.FileSystem.get(FileSystem.java:407) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.core.fs.Path.getFileSystem(Path.java:274) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.runtime.blob.BlobUtils.createFileSystemBlobStore(BlobUtils.java:89) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.runtime.blob.BlobUtils.createBlobStoreFromConfig(BlobUtils.java:76) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.runtime.highavailability.HighAvailabilityServicesUtils.createHighAvailabilityServices(HighAvailabilityServicesUtils.java:115) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.createHaServices(ClusterEntrypoint.java:353) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.initializeServices(ClusterEntrypoint.java:311) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runCluster(ClusterEntrypoint.java:239) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.lambda$startCluster$1(ClusterEntrypoint.java:189) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.runtime.security.contexts.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:28) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startCluster(ClusterEntrypoint.java:186) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
... 2 more
Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Hadoop is not in the classpath/dependencies.
at org.apache.flink.core.fs.UnsupportedSchemeFactory.create(UnsupportedSchemeFactory.java:55) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:526) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.core.fs.FileSystem.get(FileSystem.java:407) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.core.fs.Path.getFileSystem(Path.java:274) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.runtime.blob.BlobUtils.createFileSystemBlobStore(BlobUtils.java:89) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.runtime.blob.BlobUtils.createBlobStoreFromConfig(BlobUtils.java:76) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.runtime.highavailability.HighAvailabilityServicesUtils.createHighAvailabilityServices(HighAvailabilityServicesUtils.java:115) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.createHaServices(ClusterEntrypoint.java:353) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.initializeServices(ClusterEntrypoint.java:311) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runCluster(ClusterEntrypoint.java:239) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.lambda$startCluster$1(ClusterEntrypoint.java:189) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.runtime.security.contexts.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:28) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startCluster(ClusterEntrypoint.java:186) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
... 2 more

Add flink-shaded-jackson

We should add a shaded fasterxml.jackson dependency.

Included artifacts:

  • jackson-core
  • jackson-databind
  • jackson-annotations

Version: 2.7.4
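The dependency section of the new module would look roughly like this (a sketch using the artifacts and version listed above; relocation configuration omitted):

```xml
<dependencies>
  <dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-core</artifactId>
    <version>2.7.4</version>
  </dependency>
  <dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>2.7.4</version>
  </dependency>
  <dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-annotations</artifactId>
    <version>2.7.4</version>
  </dependency>
</dependencies>
```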
