
easyml's People

Contributors

callmevan, floydlin, hdsgdhr, nkxujun, sinllychen


easyml's Issues

Move Dockerfile to GitHub

Currently we need to download the Dockerfile and all of its dependent files and configuration files from Google Drive or Baidu Cloud.

I have checked the files on Google Drive and found many tars of open source projects. Are they official tars, or has easyml customized some of them? If they are official tars, why not just add the download links in the Dockerfile?

BTW, I think putting the files and configuration on GitHub would be a good move. The community could then make suggestions about building and deploying.
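If the tars are indeed unmodified official releases, the suggested change would look roughly like the following Dockerfile sketch (the base image, mirror URL, and Hadoop version here are illustrative assumptions, not the project's actual dependencies):

```dockerfile
# Illustrative sketch only: fetch an official release tarball at build time
# instead of shipping it via a cloud drive. URL and version are examples.
FROM centos:7

RUN yum install -y curl tar \
 && curl -fSL https://archive.apache.org/dist/hadoop/common/hadoop-2.7.3/hadoop-2.7.3.tar.gz \
      -o /tmp/hadoop.tar.gz \
 && tar -xzf /tmp/hadoop.tar.gz -C /opt \
 && rm /tmp/hadoop.tar.gz
```

This way the Dockerfile itself documents exactly which upstream artifacts are used, and the community can review or pin them in pull requests.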

System security concerns

Hello, your platform is a convenient, fast, visual tool for big-data processing and analysis.
But I have a question: the platform lets users upload programs and algorithms directly, and even upload shell scripts straight to the server.
How do you guarantee the system's security and stability, and how do you prevent malicious or abnormal script files from harming the system?

gwt compile error

The compile fails with the error below.
Environment: OS X 10.11.6
IDE: IntelliJ IDEA
Has anyone else run into this?

Loading inherited module 'eml.studio.EMLStudio'
[ERROR] A generator must extend com.google.gwt.core.ext.Generator
Loading modules
eml.studio.EMLStudio
Loading inherited module 'eml.studio.EMLStudio'
[ERROR] A generator must extend com.google.gwt.core.ext.Generator
[ERROR] Line 27: Unexpected exception while processing element 'generate-with'
[ERROR] Line 27: Unexpected exception while processing element 'generate-with'
com.google.gwt.core.ext.UnableToCompleteException: (see previous log entries)

at com.google.gwt.dev.cfg.ModuleDefSchema$BodySchema.__generate_with_begin(ModuleDefSchema.java:483)
at sun.reflect.GeneratedMethodAccessor20.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.google.gwt.dev.util.xml.HandlerMethod.invokeBegin(HandlerMethod.java:230)
at com.google.gwt.dev.util.xml.ReflectiveParser$Impl.startElement(ReflectiveParser.java:294)
at org.apache.xerces.parsers.AbstractSAXParser.startElement(Unknown Source)
at org.apache.xerces.impl.dtd.XMLDTDValidator.startElement(Unknown Source)
at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanStartElement(Unknown Source)
at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl$FragmentContentDispatcher.dispatch(Unknown Source)
at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanDocument(Unknown Source)
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
at org.apache.xerces.parsers.XMLParser.parse(Unknown Source)
at org.apache.xerces.parsers.AbstractSAXParser.parse(Unknown Source)
at org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser.parse(Unknown Source)
at com.google.gwt.dev.util.xml.ReflectiveParser$Impl.parse(ReflectiveParser.java:347)
at com.google.gwt.dev.util.xml.ReflectiveParser$Impl.access$200(ReflectiveParser.java:68)
at com.google.gwt.dev.util.xml.ReflectiveParser.parse(ReflectiveParser.java:418)
at com.google.gwt.dev.cfg.ModuleDefLoader.nestedLoad(ModuleDefLoader.java:326)
at com.google.gwt.dev.cfg.ModuleDefLoader.load(ModuleDefLoader.java:246)
at com.google.gwt.dev.cfg.ModuleDefLoader.doLoadModule(ModuleDefLoader.java:195)
at com.google.gwt.dev.cfg.ModuleDefLoader.loadFromResources(ModuleDefLoader.java:172)
at com.google.gwt.dev.codeserver.Recompiler.loadModule(Recompiler.java:141)
at com.google.gwt.dev.codeserver.Recompiler.compile(Recompiler.java:82)
at com.google.gwt.dev.codeserver.ModuleState.<init>(ModuleState.java:54)
at com.google.gwt.dev.codeserver.CodeServer.start(CodeServer.java:88)
at com.google.gwt.dev.codeserver.CodeServer.main(CodeServer.java:51)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.google.gwt.dev.shell.SuperDevListener$1.run(SuperDevListener.java:112)
[ERROR] Failure while parsing XML

build throws an exception

The following exception is thrown during compilation. Is any additional configuration required?

log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
java.lang.RuntimeException: not found: includeSourceMapUrl
	at com.google.gwt.dev.codeserver.Recompiler.overrideConfig(Recompiler.java:630)
	at com.google.gwt.dev.codeserver.Recompiler.loadModule(Recompiler.java:499)
	at com.google.gwt.dev.codeserver.Recompiler.initWithoutPrecompile(Recompiler.java:204)
	at com.google.gwt.dev.codeserver.Outbox.maybePrecompile(Outbox.java:89)
	at com.google.gwt.dev.codeserver.Outbox.<init>(Outbox.java:61)
	at com.google.gwt.dev.codeserver.CodeServer.makeOutboxTable(CodeServer.java:192)
	at com.google.gwt.dev.codeserver.CodeServer.start(CodeServer.java:151)
	at com.google.gwt.dev.codeserver.CodeServer.main(CodeServer.java:104)
	at com.google.gwt.dev.codeserver.CodeServer.main(CodeServer.java:55)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at com.google.gwt.dev.shell.SuperDevListener.runCodeServer(SuperDevListener.java:112)
	at com.google.gwt.dev.shell.SuperDevListener.start(SuperDevListener.java:91)
	at com.google.gwt.dev.DevMode.ensureCodeServerListener(DevMode.java:666)
	at com.google.gwt.dev.DevModeBase.doStartup(DevModeBase.java:810)
	at com.google.gwt.dev.DevMode.doStartup(DevMode.java:551)

How can I add more algorithms or models to the left-side list?

Hello, after setting up the runtime server environment I can now access my own EasyML site,
but the left panel lists only a few algorithms. Do I need to add algorithms or models myself via the 'Upload Program' button at the top?
I'm a complete beginner in machine learning; my current goal is just to set up the platform. Any guidance is appreciated!

Errors importing the project; maven clean fails

As in #18, running maven clean produces the following errors, mostly concerning pom.xml:

[INFO] Scanning for projects...
[ERROR] [ERROR] Some problems were encountered while processing the POMs:
[WARNING] 'dependencies.dependency.systemPath' for eml:typeparser:jar should not point at files within the project directory, ${project.basedir}/lib/typeparser-1.0.jar will be unresolvable by dependent projects @ line 134, column 16
[WARNING] 'dependencies.dependency.systemPath' for eml:uploader:jar should not point at files within the project directory, ${project.basedir}/lib/uploader-1.1.0.jar will be unresolvable by dependent projects @ line 146, column 16
[WARNING] 'dependencies.dependency.systemPath' for eml:gwt-links:jar should not point at files within the project directory, ${project.basedir}/lib/gwt-links-1.3.jar will be unresolvable by dependent projects @ line 153, column 16
[ERROR] 'dependencies.dependency.systemPath' for jdk.tools:jdk.tools:jar must specify an absolute path but is ${JAVA_HOME}/lib/tools.jar @ line 160, column 16
[ERROR] The build could not read 1 project -> [Help 1]
[ERROR]
[ERROR] The project eml:EMLstudio:0.1 (/home/stefan/KD/EasyML-master/pom.xml) has 1 error
[ERROR] 'dependencies.dependency.systemPath' for jdk.tools:jdk.tools:jar must specify an absolute path but is ${JAVA_HOME}/lib/tools.jar @ line 160, column 16
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/ProjectBuildingException
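The fatal error above is the jdk.tools systemPath: Maven does not resolve `${JAVA_HOME}` as written in the pom. A common workaround (a sketch, not an official patch from the project) is to point the systemPath at a property Maven does resolve, and to run Maven under a JDK rather than a bare JRE:

```xml
<!-- Sketch of a workaround; groupId/artifactId are taken from the error
     above, the version value is illustrative. ${java.home} resolves to the
     running JRE, so step up one directory to reach the JDK's tools.jar. -->
<dependency>
  <groupId>jdk.tools</groupId>
  <artifactId>jdk.tools</artifactId>
  <version>1.7</version>
  <scope>system</scope>
  <systemPath>${java.home}/../lib/tools.jar</systemPath>
</dependency>
```

If Maven is launched from a JRE instead of a JDK, `${java.home}/../lib/tools.jar` will not exist either, so check `mvn -version` reports a JDK path first.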


How do I write an algorithm package in Java for upload?

Hello, I tried writing an algorithm package in Java and uploading it to the practice site you provide, but it failed to run. So let me ask: how should the Java program be written? I'm mainly confused about the input and output parameters. Do they receive a path and open the corresponding file? If so, which file should I write my output to? I haven't used Scala, so the examples are hard for me to follow. Any guidance is appreciated!
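For illustration only — the sketch below assumes, as the question guesses, that an EasyML program receives an input file path and an output file path as command-line arguments; the actual platform convention may differ, and the class name and mean computation are purely hypothetical:

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Collections;
import java.util.List;

// Hypothetical EasyML-style program: reads one number per line from the
// input path and writes their mean to the output path.
public class MeanProgram {

    // Pure computation: mean of one number per line (0.0 for empty input).
    static double mean(List<String> lines) {
        double sum = 0;
        for (String line : lines) sum += Double.parseDouble(line.trim());
        return lines.isEmpty() ? 0.0 : sum / lines.size();
    }

    public static void main(String[] args) {
        // Assumed convention (unverified): args[0] is the input file path and
        // args[1] the output file path, both supplied by the platform.
        try {
            List<String> lines = Files.readAllLines(Paths.get(args[0]));
            Files.write(Paths.get(args[1]),
                        Collections.singletonList(Double.toString(mean(lines))));
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```

Under this assumption the platform would pass both paths when it launches the jar, so the program never hard-codes file names; it only reads `args[0]` and writes `args[1]`.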

Is there a specification for user-written algorithm packages?

I went through the project's documentation and introduction, but found no description of a specification for users writing their own algorithm packages. Could you provide a simple demo job (like the one in the attached figure) that documents the interface conventions the packages must follow so they chain together correctly? Ideally as a new example project.

Drawing tool

What was the diagramming tool in the UI built with? D3? Please let me know, thanks.

A small environment-setup question

Hello, I'm a first-year graduate student and very interested in your platform, but some things are unclear to me. I'm on Windows, with three CentOS virtual machines under VMware; the Hadoop and Spark clusters are installed and working normally. I'd like to deploy your platform on my cluster. Could you briefly explain the next steps? Many thanks!

Problems encountered during environment setup

1. With the EML package downloaded from Baidu Cloud, run_containers.sh fails on nkxujun/emlsql; docker images shows the local image is actually named nkxujun/mysql_eml. After substituting the name, it runs.

2. In the "Configure local hosts" step, the hosts screenshot is somewhat misleading; the configuration should actually follow the IPs specified in run_containers.sh.

3. Following the QuickStart to the last step, http://hadoop-master:18080/EMLStudio/ opens normally, but every other request returns 500. (Unresolved.)

4. Running sh /root/start-oozie.sh reports Error: IO_ERROR : java.net.ConnectException: Connection refused; the QuickStart screenshot shows the same. Can this error simply be ignored?

5. When run from IDEA, the site is reachable via localhost. Is running from IDEA equivalent to accessing http://hadoop-master:18080/EMLStudio/?
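Regarding point 2 above, the hosts entries should mirror the container IPs. A sketch — the addresses and slave hostnames below are illustrative placeholders, not the project's actual values; copy the IPs actually assigned in run_containers.sh:

```text
# /etc/hosts — illustrative only; use the IPs assigned in run_containers.sh
172.17.0.2   hadoop-master
172.17.0.3   hadoop-slave1
172.17.0.4   hadoop-slave2
```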

Standalone version problem

Hello, I imported the project and ran it (in IDEA). The page opens, but clicking Login reports a wrong password, and neither Register nor Guest login responds at all. IDEA shows this error:
The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
at com.mysql.jdbc.SQLError.createCommunicationsException(SQLError.java:1121)
at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:357)
at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2482)
at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2519)
at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2304)
That is just an excerpt. Is something misconfigured on my side, or is this a different error?

After submitting a job, Oozie reports an org.apache.oozie.action.ActionExecutor.convertException error

Hello. I set up the environment following the QuickStart. The first run works, but following the same steps afterwards, submitted jobs fail with an execution error, even though Oozie itself appears to be running normally. Could you help me figure out the cause?
The Oozie log is as follows:
2017-07-26 06:23:24,857 WARN ParameterVerifier:542 - SERVER[hadoop-master] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[-] ACTION[-] The application does not define formal parameters in its XML definition
2017-07-26 06:23:24,874 WARN LiteWorkflowAppService:542 - SERVER[hadoop-master] USER[-] GROUP[-] TOKEN[-] APP[-] JOB[-] ACTION[-] libpath [/EML/oozie/APP-PATH-f34582f8-4948-4fc5-84d3-44337e03b1c2/lib] does not exist
2017-07-26 06:23:25,040 INFO ActionStartXCommand:539 - SERVER[hadoop-master] USER[root] GROUP[-] TOKEN[] APP[Examples of distributed abalone age prediction] JOB[0000004-170726060341179-oozie-root-W] ACTION[0000004-170726060341179-oozie-root-W@:start:] Start action [0000004-170726060341179-oozie-root-W@:start:] with user-retry state : userRetryCount [0], userRetryMax [0], userRetryInterval [10]
2017-07-26 06:23:25,041 WARN ActionStartXCommand:542 - SERVER[hadoop-master] USER[root] GROUP[-] TOKEN[] APP[Examples of distributed abalone age prediction] JOB[0000004-170726060341179-oozie-root-W] ACTION[0000004-170726060341179-oozie-root-W@:start:] [0000004-170726060341179-oozie-root-W@:start:]Action status=DONE
2017-07-26 06:23:25,041 WARN ActionStartXCommand:542 - SERVER[hadoop-master] USER[root] GROUP[-] TOKEN[] APP[Examples of distributed abalone age prediction] JOB[0000004-170726060341179-oozie-root-W] ACTION[0000004-170726060341179-oozie-root-W@:start:] [0000004-170726060341179-oozie-root-W@:start:]Action updated in DB!
2017-07-26 06:23:25,241 INFO ActionStartXCommand:539 - SERVER[hadoop-master] USER[root] GROUP[-] TOKEN[] APP[Examples of distributed abalone age prediction] JOB[0000004-170726060341179-oozie-root-W] ACTION[0000004-170726060341179-oozie-root-W@LibSVM2LabeledPoint-15bc2b9f8ca-3180] Start action [0000004-170726060341179-oozie-root-W@LibSVM2LabeledPoint-15bc2b9f8ca-3180] with user-retry state : userRetryCount [0], userRetryMax [0], userRetryInterval [10]
2017-07-26 06:23:25,339 WARN ActionStartXCommand:542 - SERVER[hadoop-master] USER[root] GROUP[-] TOKEN[] APP[Examples of distributed abalone age prediction] JOB[0000004-170726060341179-oozie-root-W] ACTION[0000004-170726060341179-oozie-root-W@LibSVM2LabeledPoint-15bc2b9f8ca-3180] Error starting action [LibSVM2LabeledPoint-15bc2b9f8ca-3180]. ErrorType [ERROR], ErrorCode [IndexOutOfBoundsException], Message [IndexOutOfBoundsException: Index: 0, Size: 0]
org.apache.oozie.action.ActionExecutorException: IndexOutOfBoundsException: Index: 0, Size: 0
at org.apache.oozie.action.ActionExecutor.convertException(ActionExecutor.java:401)
at org.apache.oozie.action.hadoop.JavaActionExecutor.submitLauncher(JavaActionExecutor.java:915)
at org.apache.oozie.action.hadoop.JavaActionExecutor.start(JavaActionExecutor.java:1071)
at org.apache.oozie.command.wf.ActionStartXCommand.execute(ActionStartXCommand.java:217)
at org.apache.oozie.command.wf.ActionStartXCommand.execute(ActionStartXCommand.java:62)
at org.apache.oozie.command.XCommand.call(XCommand.java:280)
at org.apache.oozie.service.CallableQueueService$CompositeCallable.call(CallableQueueService.java:323)
at org.apache.oozie.service.CallableQueueService$CompositeCallable.call(CallableQueueService.java:252)
at org.apache.oozie.service.CallableQueueService$CallableWrapper.run(CallableQueueService.java:174)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.IndexOutOfBoundsException: Index: 0, Size: 0
at java.util.ArrayList.rangeCheck(ArrayList.java:653)
at java.util.ArrayList.get(ArrayList.java:429)
at org.apache.oozie.action.hadoop.JavaActionExecutor.addSystemShareLibForAction(JavaActionExecutor.java:545)
at org.apache.oozie.action.hadoop.JavaActionExecutor.addAllShareLibs(JavaActionExecutor.java:637)
at org.apache.oozie.action.hadoop.JavaActionExecutor.setLibFilesArchives(JavaActionExecutor.java:628)
at org.apache.oozie.action.hadoop.JavaActionExecutor.submitLauncher(JavaActionExecutor.java:822)
... 10 more
2017-07-26 06:23:25,340 WARN ActionStartXCommand:542 - SERVER[hadoop-master] USER[root] GROUP[-] TOKEN[] APP[Examples of distributed abalone age prediction] JOB[0000004-170726060341179-oozie-root-W] ACTION[0000004-170726060341179-oozie-root-W@LibSVM2LabeledPoint-15bc2b9f8ca-3180] Setting Action Status to [DONE]
2017-07-26 06:23:25,460 INFO ActionEndXCommand:539 - SERVER[hadoop-master] USER[root] GROUP[-] TOKEN[] APP[Examples of distributed abalone age prediction] JOB[0000004-170726060341179-oozie-root-W] ACTION[0000004-170726060341179-oozie-root-W@LibSVM2LabeledPoint-15bc2b9f8ca-3180] ERROR is considered as FAILED for SLA
2017-07-26 06:23:25,615 INFO ActionStartXCommand:539 - SERVER[hadoop-master] USER[root] GROUP[-] TOKEN[] APP[Examples of distributed abalone age prediction] JOB[0000004-170726060341179-oozie-root-W] ACTION[0000004-170726060341179-oozie-root-W@fail] Start action [0000004-170726060341179-oozie-root-W@fail] with user-retry state : userRetryCount [0], userRetryMax [0], userRetryInterval [10]
2017-07-26 06:23:25,615 WARN ActionStartXCommand:542 - SERVER[hadoop-master] USER[root] GROUP[-] TOKEN[] APP[Examples of distributed abalone age prediction] JOB[0000004-170726060341179-oozie-root-W] ACTION[0000004-170726060341179-oozie-root-W@fail] [0000004-170726060341179-oozie-root-W@fail]Action status=DONE
2017-07-26 06:23:25,615 WARN ActionStartXCommand:542 - SERVER[hadoop-master] USER[root] GROUP[-] TOKEN[] APP[Examples of distributed abalone age prediction] JOB[0000004-170726060341179-oozie-root-W] ACTION[0000004-170726060341179-oozie-root-W@fail] [0000004-170726060341179-oozie-root-W@fail]Action updated in DB!

[Important] Notes on the project build environment and development environment setup

Hi everyone. Many users have recently reported that the EML project will not run in IDEA, and quite a few environment-setup problems have come up during development, so here are some notes:

1. Project build environment

1) For IDEA, you need a version that ships with, or supports installing, the GWT plugin. According to feedback, IDEA Community Edition currently does not support the GWT plugin; the version we test internally is ideaIU-15.0.3 (education), which bundles the GWT plugin. (Note: packaging the project under IDEA currently has some problems; we recommend packaging with Eclipse.)

2) Besides IDEA, you can also build the EML project with Eclipse, provided that Eclipse has the GWT plugin and a Maven environment installed. For online installation of the plugin, see the guide on installing the GWT plugin in Eclipse; a GWT plugin build for Eclipse 4.4 (Luna) can be downloaded from the link provided there.
The steps are:

a. Import the project: File -> Import -> Import Existing Projects into Workspace.

b. Maven clean: right-click the project -> Run As -> maven clean.

c. Maven compile: right-click the project -> Run As -> maven build, enter compile as the goal and check "skip tests" (compiling the tests is normally unnecessary).

Note: if compilation fails with a java.tools error, replace the default jre7 under JRE System Library with a JDK.

d. GWT compile: right-click the project -> Run As -> maven build, enter gwt:compile as the goal and check "skip tests". This compile takes quite a while; please be patient.

e. Run the project: right-click the project -> Run As -> Web Application (GWT Super Dev Mode) -> select index.html.

f. Once it is running, a local link appears under Development Mode; click it to open the EML home page.

g. Packaging and deployment: right-click the project -> Run As -> Maven build, enter package as the goal. On success, a correspondingly named war file appears under the project's target directory; copy it into a Tomcat container to deploy.

Note: always run steps c and d before packaging; otherwise the war will contain the development build rather than the compiled one.

2. Development environment setup

Since many problems have come up when building the environment with a Docker cluster, we will shortly publish detailed wiki articles on installing an EML cluster on Windows and on Ubuntu. Stay tuned!

Jobs cannot run

The environment is set up: the server runs Docker on Ubuntu, accessed from Windows 7.
After setup, I hit two problems:
1. From Windows 7 I can reach http://hadoop-master:18080/EMLStudio/#login normally, but logging in always fails.
2. After compiling in the IDE, I can log in at http://127.0.0.1:8888/EMLStudio.html#monitor, but running the bundled example always ends in "submit failed". The IDE reports the following error:

[mkdirs]hdfs://hadoop-master:9000/EML/oozie/APP-PATH-7c5498ac-f557-4a10-9a03-b2971d789875
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /EML/oozie/APP-PATH-7c5498ac-f557-4a10-9a03-b2971d789875/workflow.xml could only be replicated to 0 nodes instead of minReplication (=1). There are 2 datanode(s) running and 2 node(s) are excluded in this operation.

at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1571)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getNewBlockTargets(FSNamesystem.java:3107)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3031)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:725)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:492)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)

at org.apache.hadoop.ipc.Client.call(Client.java:1347)
at org.apache.hadoop.ipc.Client.call(Client.java:1300)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
at com.sun.proxy.$Proxy44.addBlock(Unknown Source)
at sun.reflect.GeneratedMethodAccessor206.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy44.addBlock(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:330)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1231)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1078)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:514)

Is the demo version identical to the one released on GitHub? What are the differences?

I didn't pull the Docker image; I only ran mvn package and deployed the war.
Comparing the UI against the demo, quite a few things are missing. Did I skip a step?
Also, running jobs through Oozie seems to require a spark.jar. Is that only present in the image? Is its source not released yet?
I also don't see the operations toolbar.
Thanks!

Images display incorrectly

On CentOS 7 the Docker setup runs fine, but after building the code and logging in, some images fail to display.
One broken image points at 127.0.0.1:8888/EMLStudio/clear.cache.gif, yet that path appears nowhere in the source code. Also, the home page is served on port 8888 rather than the 18080 stated in the tutorial.

Running index.html fails: Unknown argument: -superDevMode

Hi developers,

Compiling EasyML with Eclipse succeeds, but running it with GWT fails with the following error:

Unknown argument: -superDevMode
Google Web Toolkit 2.5.0
DevMode [-noserver] [-port port-number | "auto"] [-whitelist whitelist-string] [-blacklist
 blacklist-string] [-logdir directory] [-logLevel level] [-gen dir] [-bindAddress
host-name-or-address] [-codeServerPort port-number | "auto"] [-server
servletContainerLauncher[:args]] [-startupUrl url] [-war dir] [-deploy dir] 
[-extra dir] [-workDir dir] module[s] 

where 
  -noserver        Prevents the embedded web server from running
  ......
  module[s]        Specifies the name(s) of the module(s) to host

My environment:

Eclipse IDE for C/C++ Developers

Version: Oxygen.1a Release (4.7.1a)
Build id: 20171005-1200

Plugins installed from the Eclipse Marketplace:
GWT Eclipse Plugin 3.0.0.201710131939
Eclipse.org - m2e 1.8.2.20171007-0217

The run configuration for the index page is shown in the attached screenshots.

How can this problem be solved? Thanks!

Error creating a job

The environment is mostly set up, but creating a job reports the following error:

<workflow-app xmlns="uri:oozie:workflow:0.4" name="aaa">
    <start to="end"/>
    <kill name="fail">
        <message>Map/Reduce failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>
[mkdirs]hdfs://master:9000/EML/oozie/APP-PATH-72385ded-559a-459b-b4d5-78182334a34b
E0504 : E0504: App directory [/EML/oozie/APP-PATH-72385ded-559a-459b-b4d5-78182334a34b] does not exist

PS:
1. Oozie can run jobs on its own.
2. Could it be that the program failed to create the directory on HDFS? The directory indeed cannot be found on the backend. Any pointers on solving this would be much appreciated.

[Important] About the EasyML Wiki

Hello, GitHub developers: the EasyML wiki is now open at https://github.com/ICT-BDA/EasyML/wiki.

We have published detailed articles about EasyML on the wiki, including the user's guide, debugging tips for developers, common problems when installing or running EasyML, and more.

Welcome to the wiki, and join us in developing EasyML!

Does it support dynamic integration of third-party libraries?

Implementing this whole system is a substantial engineering effort. Does it support dynamically plugging in third-party libraries? It would be much more attractive if it did. What I mean is: add an interface-description layer on top of the system, so that a third-party machine learning library only needs to supply an interface description file to have its visual drag-and-drop components generated dynamically. That would make the whole system considerably cooler and more appealing.

The environment on Google Drive cannot be built; files are missing

Building with the files downloaded from Google Drive fails with the following error:

mv: cannot stat '/tmp/EMLStudio.war': No such file or directory

Investigation shows the folder is missing some files required by the build; please update the files on Google Drive to the latest version. Thanks!

[Important] Easy Machine Learning WeChat exchange group

Hi, GitHub developers:
To facilitate follow-up technical exchanges, our team has set up a WeChat exchange group for Easy Machine Learning. Anyone interested in the project is welcome to join; the group's QR code is shown below. Welcome aboard!

(QR code image)

Account invalidation notice

Because of a server-room outage, the database could not be restored, so accounts previously registered on the public site no longer work. We apologize for the inconvenience. If you need access now, please register again or log in with the Guest account. Thank you for your understanding.

Submitting a job fails with the error below (latest Baidu Cloud package, running locally from IDEA)

Hello, I installed the latest package from Baidu Cloud, started the project locally from IDEA, and opened the site. Submitting a job produces the following error:
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /EML/oozie/APP-PATH-c55523eb-74fb-4d1e-b72c-055b26481b39/workflow.xml could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1571)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getNewBlockTargets(FSNamesystem.java:3107)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3031)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:725)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:492)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)

at org.apache.hadoop.ipc.Client.call(Client.java:1347)
at org.apache.hadoop.ipc.Client.call(Client.java:1300)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
at com.sun.proxy.$Proxy46.addBlock(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy46.addBlock(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:330)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1226)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1078)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:514)

A great tool, but a few questions after trying it out

  • Can I upload models that are not Java based (e.g. an RNN exported from TensorFlow)? The README has an "upload program" screenshot; is the model shown there a jar?

  • When I open Edit Configuration in IDEA, there is no GWT Configuration. Going back to step 2, there was nothing left to do there either (I have already run maven clean & compile, and External Libraries matches the screenshot), yet I cannot start the app following the quick start. Are there extra pitfalls in configuring GWT?

Does it have an online version?

Very remarkable work.
The real problem, though, is that one has to configure a machine before running this library, which is burdensome and time-consuming.
Is there an online version, so we could log in directly through a browser without downloading or configuring anything? Given that the main interface is the web, it seems quite doable, and it could make EML dramatically more popular.
Also, the link you provided (http://159.226.40.104:18080/dev/) is not working.

About the gwt compile error: this may help

The EasyML pom defaults gwtVersion to 2.5.0, which matters in two places during compilation:
1) 2.5.0 selects the corresponding GWT dependency jars to download;
2) the gwt-maven-plugin build plugin, also version 2.5.0, must match the above.
My development environment is Win7 + IntelliJ IDEA 2017.2 + SOCKS proxies for both the IDE and Maven (to fetch blocked packages).
I imported the project into IDEA per the official docs; Maven then downloads the dependencies and plugins automatically. Every dependency jar downloaded fine, but the gwt-maven-plugin 2.5.0 plugin would not. I looked up the plugin's available versions in the Maven repository and tried 2.7.0 (changing both gwtVersion and the plugin version to it); after refreshing Maven everything downloaded, but gwt compile then failed. Relative to the default 2.5.0, 2.7.0 is apparently missing some classes. The error:
[INFO] --- gwt-maven-plugin:2.7.0:compile (default-cli) @ EMLstudio ---
[INFO] Loading inherited module 'eml.studio.IndexPage'
[INFO] [ERROR] Line 18: Property 'gwt.logging.popupHandler' not found
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
I went on to try 2.8.0 and 2.5.1; both fail, so only the default 2.5.0 seems to work. But the IDE could not download the 2.5.0 plugin (the proxy is configured; ordinary jars are no problem). Running Maven from the command line instead downloads the 2.5.0 gwt-maven-plugin successfully, and the build then succeeds. After refreshing the Maven imports in the IDE, the previously missing plugin is picked up as well. Done.

To summarize:
1) Don't import the project into the IDE first; cd to the project root and build from the command line:
mvn clean compile -DskipTests
2) GWT compile: mvn gwt:clean, then mvn gwt:compile
3) Only then import the project into the IDE.
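For reference, the version pinning described above corresponds to pom.xml entries roughly like the following (a sketch of a typical gwt-maven-plugin setup, not a verbatim copy of EasyML's pom):

```xml
<!-- Keep the GWT SDK version and the plugin version in sync (2.5.0 here). -->
<properties>
  <gwtVersion>2.5.0</gwtVersion>
</properties>

<build>
  <plugins>
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>gwt-maven-plugin</artifactId>
      <version>2.5.0</version>
    </plugin>
  </plugins>
</build>
```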
