apache / linkis

Apache Linkis builds a computation middleware layer to facilitate connection, governance and orchestration between the upper applications and the underlying data engines.

Home Page: https://linkis.apache.org/

License: Apache License 2.0

Shell 1.31% Java 46.66% Scala 27.68% Python 0.10% JavaScript 3.35% HTML 0.02% SCSS 0.87% Vue 7.15% CSS 0.03% Dockerfile 0.06% Smarty 0.07% Ruby 10.04% TypeScript 2.50% Less 0.17%
sql spark hive pyspark livy linkis engine storage resource-manager application-manager


linkis's Issues

Log framework is not unified

In the Linkis source code I found that the logging framework is not unified: most places use slf4j + log4j2, but some places use plain log4j. That is not good style and makes error inspection harder.
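
For illustration, a minimal sketch of the unified pattern being requested, with logging done only through the slf4j facade (the class here is illustrative, not taken from the Linkis codebase):

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class EngineStarter {

    // Obtain the logger through the slf4j facade; log4j2 stays a runtime binding only.
    private static final Logger logger = LoggerFactory.getLogger(EngineStarter.class);

    public void start(String engineName) {
        // Parameterized messages keep the logging style consistent across modules.
        logger.info("Starting engine {}", engineName);
        try {
            // ... engine start-up logic ...
        } catch (Exception e) {
            // Always pass the throwable so stack traces are preserved uniformly.
            logger.error("Failed to start engine {}", engineName, e);
        }
    }
}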

create UDF fail


ERROR INFO:
{"method":null,"status":1,"message":"errCode: 52004 ,desc: You must register IOClient before you can use proxy mode.(必须先注册IOClient,才能使用代理模式) ,ip: bdpdwc010003 ,port: 12345 ,serviceKind: cloud-publicservice","data":{}}

The start-all.sh script's log output is not detailed enough

1. I am on a CDH platform. After compiling and installing, running the start script prints the following log:
We will start all linkis applications, it will take some time, please wait
<-------------------------------->
Begin to start Eureka Server
INFO: + End to start Eureka Server
<-------------------------------->
<-------------------------------->
Begin to start Gateway
INFO: + End to start Gateway
<-------------------------------->
<-------------------------------->
Begin to start Public Service
INFO: + End to start Public Service
<-------------------------------->
<-------------------------------->
Begin to start metadata
INFO: + End to start Metadata
<-------------------------------->
<-------------------------------->
Begin to start Resource Manager
INFO: + End to start Resource Manager
<-------------------------------->
sleep 15 seconds to wait RM to be ready
<-------------------------------->
Begin to start Spark Entrance
End to end Spark Entrance started
<-------------------------------->
<-------------------------------->
Begin to Spark Engine Manager
End to start Spark Engine Manager
<-------------------------------->
<-------------------------------->
Begin to start Hive Entrance
End to start Hive Entrance
<-------------------------------->
<-------------------------------->
Begin to start Hive Engine Manager
End to start Hive Engine Manager
<-------------------------------->
<-------------------------------->
Begin to start Python Entrance
End to start Python Entrance
<-------------------------------->
<-------------------------------->
Begin to start Python Engine Manager
End to start Python Engine Manager
<-------------------------------->
start-all shell script executed completely
But the services did not actually start, and there is no indication of what went wrong. Starting each service individually from the installation directory works fine.

run hql script fail

HQL
select * from hduser05db.alte5new limit 100

ERROR
2019-07-25 22:12:48.012 ERROR Request engine failed, possibly due to insufficient resources or background process error(请求引擎失败,可能是由于资源不足或后台进程错误)!
2019-07-25 22:12:48.012 ERROR DWCException{errCode=20010, desc='ClassNotFoundException: org.apache.hadoop.mapreduce.TaskAttemptContext', ip='bdpdwc010003', port=12350, serviceKind='hiveEntrance'}
at com.webank.wedatasphere.linkis.entrance.execute.EngineRequester$EngineInitThread.notifyThread(EngineRequester.scala:108)

linkis support hbase

(screenshot)
I saw here that Linkis supports HBase. Could you tell me whether Linkis currently supports accessing multiple HBase clusters at the same time?

import csv file to hive: script log shows an ERROR, but the script result succeeded

RUN HQL script
val source = """{"path":"/tmp/linkis/leeli/Result_asda.sql_07221642.csv","pathType":"share","encoding":"utf-8","fieldDelimiter":",","hasHeader":true,"sheet":"","quote":"","escapeQuotes":false}"""
val destination = """{"database":"leeli_ind","tableName":"Result_asda","importData":false,"isPartition":false,"partition":"","partitionValue":"","isOverwrite":false,"columns":[{"name":"Pregnancies","comment":"","type":"string","dateFormat":""},{"name":"Glucose","comment":"","type":"string","dateFormat":""},{"name":"BloodPressure","comment":"","type":"string","dateFormat":""},{"name":"SkinThickness","comment":"","type":"string","dateFormat":""},{"name":"Insulin","comment":"","type":"string","dateFormat":""},{"name":"BMI","comment":"","type":"string","dateFormat":""},{"name":"DiabetesPedigreeFunction","comment":"","type":"string","dateFormat":""},{"name":"Age","comment":"","type":"string","dateFormat":""},{"name":"Outcome","comment":"","type":"string","dateFormat":""}]}"""
com.webank.wedatasphere.linkis.engine.imexport.LoadData.loadDataToTable(spark,source,destination)

ERROR LOG info:

2019-07-25 18:21:29.304 ERROR [sparkEngineEngine-Thread-4] org.apache.hadoop.hdfs.KeyProviderCache 87 createKeyProviderURI - Could not find uri with key [dfs.encryption.key.provider.uri] to create a keyProvider !!

Compile ERROR

mvn -N install
[INFO] Scanning for projects...
Downloading: http://maven.oschina.net/content/groups/public/org/springframework/cloud/spring-cloud-dependencies/Finchley.RELEASE/spring-cloud-dependencies-Finchley.RELEASE.pom
[ERROR] [ERROR] Some problems were encountered while processing the POMs:
[WARNING] 'build.pluginManagement.plugins.plugin.(groupId:artifactId)' must be unique but found duplicate declaration of plugin org.apache.maven.plugins:maven-deploy-plugin @ line 245, column 25
[ERROR] Non-resolvable import POM: Could not transfer artifact org.springframework.cloud:spring-cloud-dependencies:pom:Finchley.RELEASE from/to nexus-osc (http://maven.oschina.net/content/groups/public/): maven.oschina.net @ line 127, column 25
[ERROR] The build could not read 1 project -> [Help 1]
[ERROR]
[ERROR] The project com.webank.wedatasphere.linkis:linkis:0.5.0 (C:\Users\neiljianliu\Linkis\pom.xml) has 1 error
[ERROR] Non-resolvable import POM: Could not transfer artifact org.springframework.cloud:spring-cloud-dependencies:pom:Finchley.RELEASE from/to nexus-osc (http://maven.oschina.net/content/groups/public/): maven.oschina.net @ line 127, column 25: Unknown host maven.oschina.net -> [Help 2]
[ERROR]

After initialization and startup, opening the web page throws an exception

After starting Linkis and logging in through the web page, an "operation failed" exception is reported, as follows:
2019-07-29 18:23:32.886 [ERROR] [qtp1356763258-66 ] c.w.w.l.s.r.RestfulCatchAOP (83) [apply] - operation failed(操作失败)s com.webank.wedatasphere.linkis.filesystem.exception.WorkSpaceException: User local root directory does not exist, please contact administrator to add(用户本地根目录不存在,请联系管理员添加)
at com.webank.wedatasphere.linkis.filesystem.restful.api.FsRestfulApi.getUserRootPath(FsRestfulApi.java:122) ~[linkis-workspace-0.5.0.jar:?]
at com.webank.wedatasphere.linkis.filesystem.restful.api.FsRestfulApi$$FastClassBySpringCGLIB$$1392dc82.invoke() ~[linkis-workspace-0.5.0.jar:?]
at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:204) ~[spring-core-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.invokeJoinpoint(CglibAopProxy.java:746) ~[spring-aop-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163) ~[spring-aop-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.springframework.aop.aspectj.MethodInvocationProceedingJoinPoint.proceed(MethodInvocationProceedingJoinPoint.java:88) ~[spring-aop-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at com.webank.wedatasphere.linkis.server.restful.RestfulCatchAOP$$anonfun$1.apply(RestfulCatchAOP.scala:48) ~[linkis-module-0.5.0.jar:?]
at com.webank.wedatasphere.linkis.server.restful.RestfulCatchAOP$$anonfun$1.apply(RestfulCatchAOP.scala:48) ~[linkis-module-0.5.0.jar:?]
at com.webank.wedatasphere.linkis.common.utils.Utils$.tryCatch(Utils.scala:48) ~[linkis-common-0.5.0.jar:?]
at com.webank.wedatasphere.linkis.server.package$.catchMsg(package.scala:57) ~[linkis-module-0.5.0.jar:?]
at com.webank.wedatasphere.linkis.server.package$.catchIt(package.scala:89) ~[linkis-module-0.5.0.jar:?]
at com.webank.wedatasphere.linkis.server.restful.RestfulCatchAOP.dealResponseRestful(RestfulCatchAOP.scala:47) ~[linkis-module-0.5.0.jar:?]
at sun.reflect.GeneratedMethodAccessor63.invoke(Unknown Source) ~[?:?]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_181]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_181]
at org.springframework.aop.aspectj.AbstractAspectJAdvice.invokeAdviceMethodWithGivenArgs(AbstractAspectJAdvice.java:644) ~[spring-aop-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.springframework.aop.aspectj.AbstractAspectJAdvice.invokeAdviceMethod(AbstractAspectJAdvice.java:633) ~[spring-aop-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.springframework.aop.aspectj.AspectJAroundAdvice.invoke(AspectJAroundAdvice.java:70) ~[spring-aop-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:174) ~[spring-aop-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke(ExposeInvocationInterceptor.java:92) ~[spring-aop-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:185) ~[spring-aop-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:688) ~[spring-aop-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at com.webank.wedatasphere.linkis.filesystem.restful.api.FsRestfulApi$$EnhancerBySpringCGLIB$$4e176bd2.getUserRootPath() ~[linkis-workspace-0.5.0.jar:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_181]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_181]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_181]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_181]
at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory$1.invoke(ResourceMethodInvocationHandlerFactory.java:81) ~[jaxrs-ri-2.21.jar:2.21.]
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:144) ~[jaxrs-ri-2.21.jar:2.21.]
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:161) ~[jaxrs-ri-2.21.jar:2.21.]
at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$ResponseOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:160) ~[jaxrs-ri-2.21.jar:2.21.]
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:99) ~[jaxrs-ri-2.21.jar:2.21.]
at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:389) ~[jaxrs-ri-2.21.jar:2.21.]
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:347) ~[jaxrs-ri-2.21.jar:2.21.]
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:102) ~[jaxrs-ri-2.21.jar:2.21.]
at org.glassfish.jersey.server.ServerRuntime$2.run(ServerRuntime.java:309) ~[jaxrs-ri-2.21.jar:2.21.]
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:271) ~[jaxrs-ri-2.21.jar:2.21.]
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:267) ~[jaxrs-ri-2.21.jar:2.21.]
at org.glassfish.jersey.internal.Errors.process(Errors.java:315) ~[jaxrs-ri-2.21.jar:2.21.]
at org.glassfish.jersey.internal.Errors.process(Errors.java:297) ~[jaxrs-ri-2.21.jar:2.21.]
at org.glassfish.jersey.internal.Errors.process(Errors.java:267) ~[jaxrs-ri-2.21.jar:2.21.]
at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:317) ~[jaxrs-ri-2.21.jar:2.21.]
at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:292) ~[jaxrs-ri-2.21.jar:2.21.]
at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:1139) ~[jaxrs-ri-2.21.jar:2.21.]
at org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:460) ~[jaxrs-ri-2.21.jar:2.21.]
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:386) ~[jaxrs-ri-2.21.jar:2.21.]
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:334) ~[jaxrs-ri-2.21.jar:2.21.]
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:221) ~[jaxrs-ri-2.21.jar:2.21.]
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:865) ~[jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1655) ~[jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.websocket.server.WebSocketUpgradeFilter.doFilter(WebSocketUpgradeFilter.java:215) ~[websocket-server-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) ~[jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
at com.webank.wedatasphere.linkis.server.security.SecurityFilter.doFilter(SecurityFilter.scala:100) ~[linkis-module-0.5.0.jar:?]
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) ~[jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:200) ~[spring-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) ~[jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.springframework.boot.actuate.metrics.web.servlet.WebMvcMetricsFilter.filterAndRecordMetrics(WebMvcMetricsFilter.java:158) ~[spring-boot-actuator-2.0.3.RELEASE.jar:2.0.3.RELEASE]
at org.springframework.boot.actuate.metrics.web.servlet.WebMvcMetricsFilter.filterAndRecordMetrics(WebMvcMetricsFilter.java:126) ~[spring-boot-actuator-2.0.3.RELEASE.jar:2.0.3.RELEASE]
at org.springframework.boot.actuate.metrics.web.servlet.WebMvcMetricsFilter.doFilterInternal(WebMvcMetricsFilter.java:111) ~[spring-boot-actuator-2.0.3.RELEASE.jar:2.0.3.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) ~[jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.springframework.boot.actuate.web.trace.servlet.HttpTraceFilter.doFilterInternal(HttpTraceFilter.java:90) ~[spring-boot-actuator-2.0.3.RELEASE.jar:2.0.3.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) ~[jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:99) ~[spring-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) ~[jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.springframework.web.filter.HttpPutFormContentFilter.doFilterInternal(HttpPutFormContentFilter.java:109) ~[spring-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) ~[jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.springframework.web.filter.HiddenHttpMethodFilter.doFilterInternal(HiddenHttpMethodFilter.java:93) ~[spring-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) ~[jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:200) ~[spring-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) ~[jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:533) ~[jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:146) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548) ~[jetty-security-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:257) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1595) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1317) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:473) ~[jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1564) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1219) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.server.Server.handle(Server.java:531) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:352) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:260) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:281) ~[jetty-io-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:102) ~[jetty-io-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:118) ~[jetty-io-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:333) ~[jetty-util-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:310) ~[jetty-util-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:168) ~[jetty-util-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(EatWhatYouKill.java:132) ~[jetty-util-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:762) [jetty-util-9.4.11.v20180605.jar:9.4.11.v20180605]
at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:680) [jetty-util-9.4.11.v20180605.jar:9.4.11.v20180605]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_181]

You need to manually create the path configured as WORKSPACE_PATH=file:///tmp/linkis/ in config.sh, and create a directory under it with the same name as the login user.
Suggestion: during initialization, complete all related configuration for the initializing user, so that no extra debugging is needed once the deployment is up.
Suggestion: include in the log the details of the directory that needs to be created, so that users are not left unsure what to do even after reading the log.

Version 0.8.0: the stop-all.sh script cannot stop the linkis-metadata process

Describe the bug
In version 0.8.0, the stop-all.sh script cannot stop the linkis-metadata process.

To Reproduce
Steps to reproduce the behavior:

  1. ./bin/start-all.sh
  2. ./bin/stop-all.sh

Expected behavior
All related processes should be stopped, but the linkis-metadata process is not stopped.


Additional context
The value of the variable DATABASE_NAME in stop-all.sh is not consistent with the value of the same variable in start-all.sh, so the corresponding process cannot be stopped correctly.

duplicate maven-deploy-plugin in pom.xml

[WARNING] 'build.pluginManagement.plugins.plugin.(groupId:artifactId)' must be unique but found duplicate declaration of plugin org.apache.maven.plugins:maven-deploy-plugin @ line 245, column 25

Support for connecting to Linkis via JDBC

I've noticed that the interfaces Linkis currently exposes are REST and WebSocket. Could Linkis also provide a JDBC interface in a later version, so that applications such as Tableau are able to connect to Linkis?

Readme and Ubuntu Support

install.sh does not support Ubuntu.

error :

sudo: yum: command not found
Failed to + install expect

Hive 2.3.6 build fails

Describe the bug
org.htrace:htrace-core:jar:3.0.4
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for linkis 0.9.0:
[INFO]
[INFO] linkis ............................................. SUCCESS [ 0.127 s]
[INFO] linkis-common ...................................... SUCCESS [ 14.829 s]
[INFO] linkis-protocol .................................... SUCCESS [ 11.010 s]
[INFO] linkis-scheduler ................................... SUCCESS [ 10.213 s]
[INFO] linkis-module ...................................... SUCCESS [ 22.047 s]
[INFO] linkis-cloudRPC .................................... SUCCESS [ 22.046 s]
[INFO] linkis-mybatis ..................................... SUCCESS [ 8.621 s]
[INFO] linkis-httpclient .................................. SUCCESS [ 8.497 s]
[INFO] linkis-storage ..................................... SUCCESS [ 13.553 s]
[INFO] linkis-resourcemanager-common ...................... SUCCESS [ 17.230 s]
[INFO] linkis-resourcemanager-client ...................... SUCCESS [ 8.956 s]
[INFO] linkis-resourcemanager-server ...................... SUCCESS [ 38.023 s]
[INFO] linkis-udf ......................................... SUCCESS [ 14.450 s]
[INFO] linkis-application ................................. SUCCESS [ 13.878 s]
[INFO] linkis-jobhistory .................................. SUCCESS [ 11.678 s]
[INFO] linkis-configuration ............................... SUCCESS [ 12.561 s]
[INFO] linkis-variable .................................... SUCCESS [ 11.228 s]
[INFO] linkis-workspace ................................... SUCCESS [ 15.583 s]
[INFO] linkis-metadata .................................... FAILURE [ 14.605 s]
[INFO] linkis-ujes-engine ................................. SKIPPED
[INFO] linkis-ujes-enginemanager .......................... SKIPPED
[INFO] linkis-ujes-entrance ............................... SKIPPED
[INFO] linkis-ujes-spark-engine ........................... SKIPPED
[INFO] linkis-ujes-spark-enginemanager .................... SKIPPED
[INFO] linkis-ujes-spark-entracne ......................... SKIPPED
[INFO] linkis-hive-engine ................................. SKIPPED
[INFO] linkis-hive-engineManager .......................... SKIPPED
[INFO] linkis-hive-entrance ............................... SKIPPED
[INFO] linkis-python-engine ............................... SKIPPED
[INFO] linkis-python-enginemanager ........................ SKIPPED
[INFO] linkis-python-entracne ............................. SKIPPED
[INFO] linkis-pipeline-engine ............................. SKIPPED
[INFO] linkis-pipeline-enginemanager ...................... SKIPPED
[INFO] linkis-pipeline-entrance ........................... SKIPPED
[INFO] linkis-jdbc-entrance ............................... SKIPPED
[INFO] linkis-gateway-core ................................ SKIPPED
[INFO] linkis-gateway-springcloudgateway .................. SKIPPED
[INFO] linkis-gateway-ujes-support ........................ SKIPPED
[INFO] linkis-gateway-httpclient-support .................. SKIPPED
[INFO] linkis-eureka-server ............................... SKIPPED
[INFO] publicservice ...................................... SKIPPED
[INFO] linkis-ujes-client ................................. SKIPPED
[INFO] wedatasphere-linkis ................................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 04:29 min
[INFO] Finished at: 2019-09-29T16:31:38+08:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-assembly-plugin:3.0.0:single (make-assembly) on project linkis-metadata: Assembly is incorrectly configured: linkis-metadata: Assembly is incorrectly configured: linkis-metadata:
[ERROR] Assembly: linkis-metadata is not configured correctly: One or more filters had unmatched criteria. Check debug log for more information.
[ERROR] -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-assembly-plugin:3.0.0:single (make-assembly) on project linkis-metadata: Assembly is incorrectly configured: linkis-metadata
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:215)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:156)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:148)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:56)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
at org.apache.maven.cli.MavenCli.execute (MavenCli.java:956)
at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke (Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:282)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:225)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:406)
at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:347)
Caused by: org.apache.maven.plugin.MojoFailureException: Assembly is incorrectly configured: linkis-metadata
at org.apache.maven.plugins.assembly.mojos.AbstractAssemblyMojo.execute (AbstractAssemblyMojo.java:538)
at org.apache.maven.plugins.assembly.mojos.SingleAssemblyMojo.execute (SingleAssemblyMojo.java:58)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:137)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:210)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:156)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:148)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:56)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
at org.apache.maven.cli.MavenCli.execute (MavenCli.java:956)
at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke (Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:282)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:225)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:406)
at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:347)
[ERROR]
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :linkis-metadata

NullPointerException caused by missing linkis.properties file

Describe the bug
A NullPointerException is thrown when the linkis.properties file does not exist. There is an info log statement that references the file.

To Reproduce
Steps to reproduce the behavior:

  1. delete the file of linkis.properties
  2. start application

Describe the solution you'd like
Delete this info log line.
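
A minimal sketch of the kind of guard that would avoid the NullPointerException, assuming the file is read from the classpath; the class and method names here are illustrative and not the actual Linkis code:

import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public final class LinkisPropertiesLoader {

    private static final Logger logger = LoggerFactory.getLogger(LinkisPropertiesLoader.class);

    // Illustrative helper: returns an empty Properties object instead of null
    // when linkis.properties is missing, so callers never dereference null.
    public static Properties loadLinkisProperties() {
        Properties props = new Properties();
        try (InputStream in = LinkisPropertiesLoader.class
                .getClassLoader().getResourceAsStream("linkis.properties")) {
            if (in == null) {
                // Warn once here instead of later referencing a file that does not exist.
                logger.warn("linkis.properties not found on the classpath, using defaults");
                return props;
            }
            props.load(in);
        } catch (IOException e) {
            logger.error("Failed to read linkis.properties", e);
        }
        return props;
    }
}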

Linkis support for the Flink computing engine

Description: add Linkis support for Flink, including FlinkSQL (Stream SQL and Batch SQL) and Scala support. We hope to build on Flink 1.9 as a release iteration.

run python script fail

python
print 123

ERROR
2019-07-25 22:08:37.398 INFO [qtp1192854820-28] com.webank.wedatasphere.linkis.filesystem.service.FsService 79 getFileSystem - com.webank.wedatasphere.linkis.filesystem.exception.WorkSpaceException: Requesting IO-Engine to initialize fileSystem failed!(请求IO-Engine初始化fileSystem失败!)
2019-07-25 22:08:37.398 INFO [qtp1192854820-28] com.webank.wedatasphere.linkis.filesystem.service.FsService 87 getFileSystem - leeli gets the hdfs type filesystem using a total of 1 milliseconds(leeli获取hdfs类型的filesystem一共使用了1毫秒)
2019-07-25 22:08:37.398 ERROR [qtp1192854820-28] com.webank.wedatasphere.linkis.server.restful.RestfulCatchAOP 83 apply - operation failed(操作失败)s com.webank.wedatasphere.linkis.filesystem.exception.WorkSpaceException: The user has obtained the filesystem for more than 2s. Please contact the administrator.(用户获取filesystem的时间超过2s,请联系管理员)
at com.webank.wedatasphere.linkis.filesystem.restful.api.FsRestfulApi.fsValidate(FsRestfulApi.java:101) ~[linkis-workspace-0.5.0.jar:?]
at com.webank.wedatasphere.linkis.filesystem.restful.api.FsRestfulApi.getDirFileTrees(FsRestfulApi.java:258) ~[linkis-workspace-0.5.0.jar:?]
at com.webank.wedatasphere.linkis.filesystem.restful.api.FsRestfulApi$$FastClassBySpringCGLIB$$1392dc82.invoke() ~[linkis-workspace-0.5.0.jar:?]
at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:204) ~[spring-core-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.invokeJoinpoint(CglibAopProxy.java:746) ~[spring-aop-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163) ~[spring-aop-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.springframework.aop.aspectj.MethodInvocationProceedingJoinPoint.proceed(MethodInvocationProceedingJoinPoint.java:88) ~[spring-aop-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at com.webank.wedatasphere.linkis.server.restful.RestfulCatchAOP$$anonfun$1.apply(RestfulCatchAOP.scala:48) ~[linkis-module-0.5.0.jar:?]
at com.webank.wedatasphere.linkis.server.restful.RestfulCatchAOP$$anonfun$1.apply(RestfulCatchAOP.scala:48) ~[linkis-module-0.5.0.jar:?]
at com.webank.wedatasphere.linkis.common.utils.Utils$.tryCatch(Utils.scala:48) [linkis-common-0.5.0.jar:?]
at com.webank.wedatasphere.linkis.server.package$.catchMsg(package.scala:57) [linkis-module-0.5.0.jar:?]
at com.webank.wedatasphere.linkis.server.package$.catchIt(package.scala:89) [linkis-module-0.5.0.jar:?]
at com.webank.wedatasphere.linkis.server.restful.RestfulCatchAOP.dealResponseRestful(RestfulCatchAOP.scala:47) [linkis-module-0.5.0.jar:?]
at sun.reflect.GeneratedMethodAccessor66.invoke(Unknown Source) ~[?:?]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_141]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_141]
at org.springframework.aop.aspectj.AbstractAspectJAdvice.invokeAdviceMethodWithGivenArgs(AbstractAspectJAdvice.java:644) [spring-aop-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.springframework.aop.aspectj.AbstractAspectJAdvice.invokeAdviceMethod(AbstractAspectJAdvice.java:633) [spring-aop-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.springframework.aop.aspectj.AspectJAroundAdvice.invoke(AspectJAroundAdvice.java:70) [spring-aop-5.0.7.RELEASE.jar:5.0.7.RELEASE]

Who is using Apache Linkis?

Who is using Linkis

We’d like to thank everyone in this community for your constant support of Linkis. We’re confident that, with our effort and your support, this community could grow more prosperous and serve a greater number of users.

Our Intentions

  1. Linkis cannot grow without the voice from the community.
  2. Linkis desires more contributions from more partners.
  3. Linkis provides a series of enterprise level features as a computation middleware. We hope to get closer to the practical scenarios to plan the roadmap for future releases of Linkis.

Our Expectation

We would appreciate it if you could leave us a comment with the following information:

  • Your company, college or any other organization
  • Your city & nation
  • Your contact information: Weibo, email or WeChat
  • Your practical business scenarios

Sample:

  • Organization: Webank
  • Location: Shenzhen, China
  • Contact information: [email protected]
  • Business scenario: Served as the middleware of script submission/execution for the data analysis tool Scriptis, the data visualization tool and the task scheduling tool.

Thanks again!!!
Your support is the biggest motivator for the progress of open-sourcing Linkis!

Yours sincerely,
Linkis Team



Linkis should support dividing publicService into multiple micro-services

Is your feature request related to a problem? Please describe.
no

Describe the solution you'd like
Linkis should support splitting publicService into multiple micro-services: split the interface services of publicService into JobHistory, FileSystem, Configuration, UDF, Variable and other services, which is convenient for front-end calls.
We could use a parameter to control whether publicService is split or not (see the sketch below).
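
A minimal sketch of what such a switch could look like with Spring Boot's conditional configuration, assuming a hypothetical property such as linkis.publicservice.combined (the property and class names are illustrative, not existing Linkis configuration):

import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class PublicServiceModuleConfig {

    // Placeholder for one of the services that could be split out
    // (JobHistory, FileSystem, Configuration, UDF, Variable, ...).
    public static class JobHistoryService { }

    // Registered only when the hypothetical flag says publicservice runs as one
    // combined service; with the flag set to false, JobHistory would instead be
    // deployed as its own micro-service and this bean is simply not created.
    @Bean
    @ConditionalOnProperty(name = "linkis.publicservice.combined",
                           havingValue = "true", matchIfMissing = true)
    public JobHistoryService embeddedJobHistoryService() {
        return new JobHistoryService();
    }
}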

Describe alternatives you've considered
no

Chinese formatting in some script files prevents them from running

install.sh, start-all, stop-all and other files contain Chinese formatting and cannot run.
In addition, each Linkis component contains yml files; although they were converted with dos2unix, the Eureka configuration part of the yml files was not fully converted, so reading the yml files fails.
PS: found when using the lite edition.

install.sh has a problem!

install.sh contains this line:
ssh $SERVER_IP "cd $SERVER_HOME/$SERVERNAME/lib;" $SERVER_CONF_PATH"

The quotation marks in this line do not match.

hive progress bug

Hi, I have set up your system in my cluster and it can run Hive SQL; it works awesome. But sometimes the Hive MapReduce progress is not shown correctly, as the attached screenshot shows. Can you fix it?
(screenshot)

Job configuration granularity should be finer

Is your feature request related to a problem? Please describe.
When I submit Spark jobs, I found that I cannot set configuration separately for different jobs submitted within a short time. The console works, but I am tired of changing the settings frequently, and I hope the configuration granularity can be finer.

Describe the solution you'd like
I hope I can set some startup configuration when I run a job (or rerun a history job), just like variables, and that this configuration can override the console settings (see the sketch below).
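
A minimal sketch of the requested precedence, assuming per-job startup parameters simply overlay the console defaults (all names and property keys here are illustrative):

import java.util.HashMap;
import java.util.Map;

public final class JobConfigResolver {

    // Console-level defaults are applied first, then per-job startup parameters
    // override them, so two jobs submitted close together can differ.
    public static Map<String, String> resolve(Map<String, String> consoleDefaults,
                                              Map<String, String> jobStartupParams) {
        Map<String, String> effective = new HashMap<>(consoleDefaults);
        effective.putAll(jobStartupParams);
        return effective;
    }

    public static void main(String[] args) {
        Map<String, String> console = new HashMap<>();
        console.put("spark.executor.memory", "4g");
        console.put("spark.executor.cores", "2");

        Map<String, String> job = new HashMap<>();
        job.put("spark.executor.memory", "8g"); // per-job override

        // Effective config: memory=8g (from the job), cores=2 (from the console).
        System.out.println(resolve(console, job));
    }
}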

Describe alternatives you've considered
I think persisting the settings in the script file may be more efficient than in the database.

Additional context
nothing

JDBC Engine Support for Linkis

I noticed that Linkis currently supports Spark, Hive and Python engines. I'd like to implement a JDBC engine to connect to TiDB, MySQL, PostgreSQL, HiveServer2 or Spark ThriftServer (a sketch of the core idea follows).
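
A minimal sketch of the core of such an engine, assuming it simply wraps plain java.sql calls against whichever backend URL is configured (the connection details are illustrative):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
import java.sql.Statement;

public final class JdbcEngineSketch {

    // Executes one SQL statement against the configured backend
    // (TiDB, MySQL, PostgreSQL, HiveServer2 and Spark ThriftServer all speak JDBC).
    public static void execute(String url, String user, String password, String sql)
            throws SQLException {
        try (Connection conn = DriverManager.getConnection(url, user, password);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(sql)) {
            ResultSetMetaData meta = rs.getMetaData();
            while (rs.next()) {
                StringBuilder row = new StringBuilder();
                for (int i = 1; i <= meta.getColumnCount(); i++) {
                    row.append(meta.getColumnLabel(i)).append('=')
                       .append(rs.getObject(i)).append(' ');
                }
                System.out.println(row);
            }
        }
    }

    public static void main(String[] args) throws SQLException {
        // Illustrative MySQL URL; a real engine would read this from its own configuration.
        execute("jdbc:mysql://localhost:3306/test", "user", "password", "SELECT 1");
    }
}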
