muhammadbilalyar / hadoop-on-window
HADOOP 2.8.0 (22 March, 2017) INSTALLATION ON WINDOWS 10
For installing Hadoop 3.0, the installation steps remain the same except for the last step:
the NameNode web UI port changed from 50070 to 9870.
You might want to configure the HDFS directories as file URIs in hdfs-site.xml, otherwise you may get java.net.URISyntaxException: Illegal character in opaque part at index 2:
<property>
<name>dfs.namenode.name.dir</name>
<value>file:///C:/hadoop-2.8.0/data/namenode</value>
</property>
<property>
<name>dfs.datanode.data.dir</name>
<value>file:///C:/hadoop-2.8.0/data/datanode</value>
</property>
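The file:/// form matters because Hadoop parses these values as URIs, and a bare Windows path such as C:\hadoop-2.8.0\data\namenode contains characters (the drive colon and backslashes) that java.net.URI rejects. One way to produce the expected URI form is Python's pathlib (a sketch; the path is just an example):

```python
from pathlib import PureWindowsPath

# Convert a native Windows path into the file:/// URI form that
# dfs.namenode.name.dir / dfs.datanode.data.dir expect.
namenode_dir = PureWindowsPath(r"C:\hadoop-2.8.0\data\namenode")
print(namenode_dir.as_uri())  # file:///C:/hadoop-2.8.0/data/namenode
```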
Also, I think the JAVA_HOME and HADOOP_HOME environment variables should be set without the trailing /bin.
Otherwise very good step by step instructions. Thanks.
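The trailing-\bin point above can be checked mechanically; a minimal sketch (the helper name is illustrative, not part of Hadoop):

```python
import os

def looks_misconfigured(var: str) -> bool:
    """Return True if the environment variable ends in a trailing bin segment."""
    value = os.environ.get(var, "")
    return value.rstrip("\\/").lower().endswith(("\\bin", "/bin"))

for var in ("JAVA_HOME", "HADOOP_HOME"):
    if looks_misconfigured(var):
        print(f"{var} should point at the install root, without the trailing \\bin")
```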
Hi Bilal
Thanks for posting the steps for the Hadoop installation; they are very helpful. While starting all services, I was able to start the namenode and datanode successfully. However, I am facing issues with the ResourceManager and NodeManager. I've pasted the errors below. I'd appreciate your help. Thanks.
18/05/20 13:17:53 FATAL resourcemanager.ResourceManager: Error starting ResourceManager
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/server/timelineservice/collector/TimelineCollectorManager
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
at java.lang.Class.getDeclaredMethods(Class.java:1975)
at com.google.inject.spi.InjectionPoint.getInjectionPoints(InjectionPoint.java:662)
at com.google.inject.spi.InjectionPoint.forInstanceMethodsAndFields(InjectionPoint.java:356)
at com.google.inject.spi.InjectionPoint.forInstanceMethodsAndFields(InjectionPoint.java:375)
at com.google.inject.internal.BindingBuilder.toInstance(BindingBuilder.java:82)
at org.apache.hadoop.yarn.server.resourcemanager.webapp.RMWebApp.setup(RMWebApp.java:56)
at org.apache.hadoop.yarn.webapp.WebApp.configureServlets(WebApp.java:160)
at com.google.inject.servlet.ServletModule.configure(ServletModule.java:53)
at com.google.inject.AbstractModule.configure(AbstractModule.java:59)
at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:223)
at com.google.inject.spi.Elements.getElements(Elements.java:101)
at com.google.inject.internal.InjectorShell$Builder.build(InjectorShell.java:133)
at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:103)
at com.google.inject.Guice.createInjector(Guice.java:95)
at com.google.inject.Guice.createInjector(Guice.java:72)
at com.google.inject.Guice.createInjector(Guice.java:62)
at org.apache.hadoop.yarn.webapp.WebApps$Builder.build(WebApps.java:356)
at org.apache.hadoop.yarn.webapp.WebApps$Builder.start(WebApps.java:401)
at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startWepApp(ResourceManager.java:1116)
at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1226)
at org.apache.hadoop.service.AbstractService.start(AbstractService.java:194)
at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.main(ResourceManager.java:1422)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.yarn.server.timelineservice.collector.TimelineCollectorManager
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 36 more
18/05/20 13:17:53 INFO ipc.Server: Stopping server on 8032
18/05/20 13:43:56 ERROR nodemanager.NodeManager: Error starting NodeManager
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.net.ConnectException: Your endpoint configuration is wrong; For more details see: http://wiki.apache.org/hadoop/UnsetHostnameOrPort
at org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl.serviceStart(NodeStatusUpdaterImpl.java:258)
at org.apache.hadoop.service.AbstractService.start(AbstractService.java:194)
at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:121)
at org.apache.hadoop.yarn.server.nodemanager.NodeManager.serviceStart(NodeManager.java:454)
at org.apache.hadoop.service.AbstractService.start(AbstractService.java:194)
at org.apache.hadoop.yarn.server.nodemanager.NodeManager.initAndStartNodeManager(NodeManager.java:837)
at org.apache.hadoop.yarn.server.nodemanager.NodeManager.main(NodeManager.java:897)
Caused by: java.net.ConnectException: Your endpoint configuration is wrong; For more details see: http://wiki.apache.org/hadoop/UnsetHostnameOrPort
at sun.reflect.GeneratedConstructorAccessor28.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:824)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1497)
at org.apache.hadoop.ipc.Client.call(Client.java:1439)
at org.apache.hadoop.ipc.Client.call(Client.java:1349)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:227)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
at com.sun.proxy.$Proxy73.registerNodeManager(Unknown Source)
at org.apache.hadoop.yarn.server.api.impl.pb.client.ResourceTrackerPBClientImpl.registerNodeManager(ResourceTrackerPBClientImpl.java:73)
at sun.reflect.GeneratedMethodAccessor11.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
at com.sun.proxy.$Proxy74.registerNodeManager(Unknown Source)
at org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl.registerWithRM(NodeStatusUpdaterImpl.java:363)
at org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl.serviceStart(NodeStatusUpdaterImpl.java:252)
... 6 more
Caused by: java.net.ConnectException: Connection refused: no further information
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:687)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:790)
at org.apache.hadoop.ipc.Client$Connection.access$3500(Client.java:411)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1554)
at org.apache.hadoop.ipc.Client.call(Client.java:1385)
... 22 more
18/05/20 13:43:56 INFO nodemanager.NodeManager: SHUTDOWN_MSG:
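A commonly reported cause of this NoClassDefFoundError on Hadoop 2.8.x for Windows is that hadoop-yarn-server-timelineservice-*.jar sits in share\hadoop\yarn\timelineservice, which is not on the ResourceManager's classpath; the usual workaround is to copy it one level up into share\hadoop\yarn. Once the ResourceManager starts, the NodeManager's "Connection refused" during registration typically clears up as well, since it was failing to reach the ResourceManager's tracker port. A sketch of the copy (the function name is illustrative; adjust the install root to your layout):

```python
import shutil
from pathlib import Path

def copy_timeline_jar(hadoop_home: str) -> list:
    """Copy the timeline-service jar(s) up into the YARN classpath directory."""
    src_dir = Path(hadoop_home, "share", "hadoop", "yarn", "timelineservice")
    dst_dir = Path(hadoop_home, "share", "hadoop", "yarn")
    copied = []
    for jar in src_dir.glob("hadoop-yarn-server-timelineservice-*.jar"):
        shutil.copy(jar, dst_dir)
        copied.append(jar.name)
    return copied

# Example (adjust to your install root):
# copy_timeline_jar(r"C:\hadoop-2.8.0")
```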
The command (hdfs namenode -format) is not working.
Error message:
C:\Spark\hadoop\New folder\hadoop-2.8.0.tar\hadoop-2.8.0\hadoop-2.8.0\bin>hdfs namenode -format
'C:\Spark\hadoop\New' is not recognized as an internal or external command,
operable program or batch file.
'-classpath' is not recognized as an internal or external command,
operable program or batch file.
Could you please help me troubleshoot this issue?
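The 'C:\Spark\hadoop\New' is not recognized error is the classic symptom of a space in the install path: the hdfs.cmd/hadoop.cmd scripts build command lines that break at "New folder". The usual fix is to extract Hadoop to a space-free path such as C:\hadoop-2.8.0 and point HADOOP_HOME there. A trivial check (a sketch; the helper name is made up):

```python
def path_is_hadoop_safe(path: str) -> bool:
    """Hadoop's Windows .cmd scripts mishandle unquoted spaces in paths."""
    return " " not in path

print(path_is_hadoop_safe(r"C:\Spark\hadoop\New folder\hadoop-2.8.0"))  # False
print(path_is_hadoop_safe(r"C:\hadoop-2.8.0"))                          # True
```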
Hi,
Nice work. I am able to run the namenode and datanode, but the ResourceManager and NodeManager are not starting.
I am getting the error below.
yarn run v1.3.2
error Couldn't find a package.json file in "C:\Hadoop\sbin"
info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command
Can you please suggest.
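The yarn run v1.3.2 / package.json output means the command resolved to the Node.js Yarn package manager rather than Hadoop's yarn.cmd: whichever directory appears first on PATH wins. Running %HADOOP_HOME%\sbin\start-yarn.cmd by its full path, or reordering PATH, avoids the clash. The first-match rule can be illustrated with shutil.which (a sketch; the directories are temporary stand-ins, and extension handling differs on Windows):

```python
import os
import shutil
import stat
import tempfile

# Two directories each providing a "yarn" executable; which() returns
# the one from the directory listed first, just as the shell would run it.
first, second = tempfile.mkdtemp(), tempfile.mkdtemp()
for d in (first, second):
    exe = os.path.join(d, "yarn")
    with open(exe, "w") as f:
        f.write("#!/bin/sh\n")
    os.chmod(exe, os.stat(exe).st_mode | stat.S_IEXEC)

search_path = os.pathsep.join([first, second])
print(shutil.which("yarn", path=search_path))  # resolves into `first`
```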
In the article "Step by step Hadoop 2.8.0 installation on Window 10", in etc/hadoop/yarn-site.xml, should it be
<property>
<name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
<value>org.apache.hadoop.mapred.ShuffleHandler</value>
</property>
instead of
<property>
<name>yarn.nodemanager.auxservices.mapreduce.shuffle.class</name>
<value>org.apache.hadoop.mapred.ShuffleHandler</value>
</property>
?
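For reference, the Hadoop single-node setup documentation spells this pair of properties with an underscore in the service name (mapreduce_shuffle), matching the value registered under yarn.nodemanager.aux-services:

<property>
<name>yarn.nodemanager.aux-services</name>
<value>mapreduce_shuffle</value>
</property>
<property>
<name>yarn.nodemanager.aux-services.mapreduce_shuffle.class</name>
<value>org.apache.hadoop.mapred.ShuffleHandler</value>
</property>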
This is the error I am facing:
2023-03-15 22:03:58,379 INFO nodemanager.NodeManager: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NodeManager
STARTUP_MSG: host = DESKTOP-T8AUQ59/192.168.43.26
STARTUP_MSG: args = []
STARTUP_MSG: version = 3.3.0
STARTUP_MSG: classpath = C:\hadoop-3.3.0\etc\hadoop;C:\hadoop-3.3.0\share\hadoop\common;... [several hundred jars under share\hadoop\{common,hdfs,yarn,mapreduce} trimmed for readability] ...;C:\hadoop-3.3.0\etc\hadoop\nm-config\log4j.properties
STARTUP_MSG: build = https://gitbox.apache.org/repos/asf/hadoop.git -r aa96f1871bfd858f9bac59cf2a81ec470da649af; compiled by 'brahma' on 2020-07-06T18:44Z
STARTUP_MSG: java = 1.8.0_202
************************************************************/
2023-03-15 22:04:01,132 INFO resourceplugin.ResourcePluginManager: No Resource plugins found from configuration!
2023-03-15 22:04:01,132 INFO resourceplugin.ResourcePluginManager: Found Resource plugins from configuration: null
2023-03-15 22:04:01,140 INFO resourceplugin.ResourcePluginManager: The pluggable device framework is not enabled. If you want, please set true to yarn.nodemanager.pluggable-device-framework.enabled
2023-03-15 22:04:02,601 INFO event.AsyncDispatcher: Registering class org.apache.hadoop.yarn.server.nodemanager.containermanager.container.ContainerEventType for class org.apache.hadoop.yarn.server.nodemanager.containermanager.ContainerManagerImpl$ContainerEventDispatcher
2023-03-15 22:04:02,642 INFO event.AsyncDispatcher: Registering class org.apache.hadoop.yarn.server.nodemanager.containermanager.application.ApplicationEventType for class org.apache.hadoop.yarn.server.nodemanager.containermanager.ContainerManagerImpl$ApplicationEventDispatcher
2023-03-15 22:04:02,647 INFO event.AsyncDispatcher: Registering class org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.event.LocalizationEventType for class org.apache.hadoop.yarn.server.nodemanager.containermanager.ContainerManagerImpl$LocalizationEventHandlerWrapper
2023-03-15 22:04:02,651 INFO event.AsyncDispatcher: Registering class org.apache.hadoop.yarn.server.nodemanager.containermanager.AuxServicesEventType for class org.apache.hadoop.yarn.server.nodemanager.containermanager.AuxServices
2023-03-15 22:04:02,667 INFO event.AsyncDispatcher: Registering class org.apache.hadoop.yarn.server.nodemanager.containermanager.monitor.ContainersMonitorEventType for class org.apache.hadoop.yarn.server.nodemanager.containermanager.monitor.ContainersMonitorImpl
2023-03-15 22:04:02,670 INFO event.AsyncDispatcher: Registering class org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainersLauncherEventType for class org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainersLauncher
2023-03-15 22:04:02,674 INFO event.AsyncDispatcher: Registering class org.apache.hadoop.yarn.server.nodemanager.containermanager.scheduler.ContainerSchedulerEventType for class org.apache.hadoop.yarn.server.nodemanager.containermanager.scheduler.ContainerScheduler
2023-03-15 22:04:02,694 INFO tracker.NMLogAggregationStatusTracker: the rolling interval seconds for the NodeManager Cached Log aggregation status is 600
2023-03-15 22:04:02,783 INFO event.AsyncDispatcher: Registering class org.apache.hadoop.yarn.server.nodemanager.ContainerManagerEventType for class org.apache.hadoop.yarn.server.nodemanager.containermanager.ContainerManagerImpl
2023-03-15 22:04:02,785 INFO event.AsyncDispatcher: Registering class org.apache.hadoop.yarn.server.nodemanager.NodeManagerEventType for class org.apache.hadoop.yarn.server.nodemanager.NodeManager
2023-03-15 22:04:02,963 INFO impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties
2023-03-15 22:04:03,074 INFO impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
2023-03-15 22:04:03,074 INFO impl.MetricsSystemImpl: NodeManager metrics system started
2023-03-15 22:04:03,099 INFO health.NodeHealthScriptRunner: Missing location for the node health check script "script".
2023-03-15 22:04:03,139 INFO nodemanager.DirectoryCollection: Disk Validator 'basic' is loaded.
2023-03-15 22:04:03,157 INFO nodemanager.DirectoryCollection: Disk Validator 'basic' is loaded.
2023-03-15 22:04:03,248 INFO nodemanager.NodeResourceMonitorImpl: Using ResourceCalculatorPlugin : org.apache.hadoop.yarn.util.ResourceCalculatorPlugin@38089a5a
2023-03-15 22:04:03,252 INFO event.AsyncDispatcher: Registering class org.apache.hadoop.yarn.server.nodemanager.containermanager.loghandler.event.LogHandlerEventType for class org.apache.hadoop.yarn.server.nodemanager.containermanager.loghandler.NonAggregatingLogHandler
2023-03-15 22:04:03,256 INFO event.AsyncDispatcher: Registering class org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.sharedcache.SharedCacheUploadEventType for class org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.sharedcache.SharedCacheUploadService
2023-03-15 22:04:03,256 INFO containermanager.ContainerManagerImpl: AMRMProxyService is disabled
2023-03-15 22:04:03,257 INFO localizer.ResourceLocalizationService: per directory file limit = 8192
2023-03-15 22:04:04,307 ERROR nodemanager.NodeManager: Error starting NodeManager
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$POSIX.stat(Ljava/lang/String;)Lorg/apache/hadoop/io/nativeio/NativeIO$POSIX$Stat;
at org.apache.hadoop.io.nativeio.NativeIO$POSIX.stat(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$POSIX.getStat(NativeIO.java:608)
at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfoByNativeIO(RawLocalFileSystem.java:823)
at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:737)
at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:705)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService.initializeLocalDir(ResourceLocalizationService.java:1451)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService.initializeLocalDirs(ResourceLocalizationService.java:1421)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService.serviceInit(ResourceLocalizationService.java:261)
at org.apache.hadoop.service.AbstractService.init(AbstractService.java:164)
at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:109)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.ContainerManagerImpl.serviceInit(ContainerManagerImpl.java:327)
at org.apache.hadoop.service.AbstractService.init(AbstractService.java:164)
at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:109)
at org.apache.hadoop.yarn.server.nodemanager.NodeManager.serviceInit(NodeManager.java:494)
at org.apache.hadoop.service.AbstractService.init(AbstractService.java:164)
at org.apache.hadoop.yarn.server.nodemanager.NodeManager.initAndStartNodeManager(NodeManager.java:962)
at org.apache.hadoop.yarn.server.nodemanager.NodeManager.main(NodeManager.java:1042)
2023-03-15 22:04:04,324 INFO service.AbstractService: Service org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService failed in state STOPPED
java.lang.NullPointerException
at java.util.concurrent.ConcurrentHashMap.replaceNode(ConcurrentHashMap.java:1106)
at java.util.concurrent.ConcurrentHashMap.remove(ConcurrentHashMap.java:1097)
at java.util.Collections$SetFromMap.remove(Collections.java:5460)
at org.apache.hadoop.yarn.server.nodemanager.DirectoryCollection.deregisterDirsChangeListener(DirectoryCollection.java:269)
at org.apache.hadoop.yarn.server.nodemanager.LocalDirsHandlerService.deregisterLocalDirsChangeListener(LocalDirsHandlerService.java:290)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService.serviceStop(ResourceLocalizationService.java:425)
at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:220)
at org.apache.hadoop.service.ServiceOperations.stop(ServiceOperations.java:54)
at org.apache.hadoop.service.ServiceOperations.stopQuietly(ServiceOperations.java:102)
at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:159)
at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:133)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.ContainerManagerImpl.serviceStop(ContainerManagerImpl.java:716)
at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:220)
at org.apache.hadoop.service.ServiceOperations.stop(ServiceOperations.java:54)
at org.apache.hadoop.service.ServiceOperations.stopQuietly(ServiceOperations.java:102)
at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:159)
at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:133)
at org.apache.hadoop.yarn.server.nodemanager.NodeManager.serviceStop(NodeManager.java:504)
at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:220)
at org.apache.hadoop.service.ServiceOperations.stop(ServiceOperations.java:54)
at org.apache.hadoop.service.ServiceOperations.stopQuietly(ServiceOperations.java:102)
at org.apache.hadoop.service.ServiceOperations.stopQuietly(ServiceOperations.java:67)
at org.apache.hadoop.service.CompositeService$CompositeServiceShutdownHook.run(CompositeService.java:185)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
2023-03-15 22:04:04,326 WARN service.CompositeService: When stopping the service org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService
java.lang.NullPointerException
	[same stack trace as above]
2023-03-15 22:04:04,328 INFO service.AbstractService: Service org.apache.hadoop.yarn.server.nodemanager.containermanager.ContainerManagerImpl failed in state STOPPED
java.lang.NullPointerException
	[same stack trace as above]
2023-03-15 22:04:04,337 WARN service.CompositeService: When stopping the service org.apache.hadoop.yarn.server.nodemanager.containermanager.ContainerManagerImpl
java.lang.NullPointerException
	[same stack trace as above]
2023-03-15 22:04:04,343 INFO service.AbstractService: Service NodeManager failed in state STOPPED
java.lang.NullPointerException
	[same stack trace as above]
2023-03-15 22:04:04,344 WARN service.AbstractService: When stopping the service NodeManager
java.lang.NullPointerException
	[same stack trace as above]
2023-03-15 22:04:04,351 INFO nodemanager.NodeManager: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NodeManager at DESKTOP-T8AUQ59/192.168.43.26
************************************************************/
C:\hadoop-3.3.0\sbin>
Hi, how should I set JAVA_HOME?
Error: JAVA_HOME is incorrectly set.
Please update C:\hadoop-2.10.0\etc\hadoop\hadoop-env.cmd
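A common cause of this particular error (an assumption on my part; the poster did not show their value) is a JAVA_HOME containing spaces, such as `C:\Program Files\Java\...`. One sketch of a fix in hadoop-env.cmd is to use the 8.3 short name for the Program Files directory (`PROGRA~1` is the usual short name, but verify yours with `dir /x C:\`), and a hypothetical JDK folder name:

```
@rem In %HADOOP_HOME%\etc\hadoop\hadoop-env.cmd
@rem Hypothetical JDK path - adjust the version folder to match your install.
set JAVA_HOME=C:\PROGRA~1\Java\jdk1.8.0_202
```

Alternatively, installing the JDK to a path with no spaces (e.g. `C:\Java\jdk1.8.0_202`) avoids the issue entirely.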
Hi Sir,
Thanks for your tutorial, it is the best!
When I get to the testing stage, two of the windows (apps) show this message:
Error: loading: c:\java\bin\server\jvm.dll
Some advice about it, please?
I just followed your instructions, but I get this error again and again:
Error: Could not find or load main class ul
Hi,
thanks for the help with the installation 👍.
There are a few points, though:
(1) Perhaps fs.defaultFS should be configured in core-site.xml? I found it impossible to run without this configuration. Not sure if this can be replicated.
(2) Under Windows 10 1803 (17134.523), Hadoop appears to have issues accessing file:/// when the datanode and namenode dirs are under C:\hadoop-2.8.0\data@@@.
Changing them to /hadoop-2.8.0/data/datanode (or namenode) appears to solve the issue.
(3) In the section Hadoop Configuration - 3: "Open cmd and typing command "hdfs namenode –format" . You will see..",
the "–" in the sentence appears to be an en dash rather than a hyphen "-" (a typo?).
It is still an amazing job, though :)
Hi Muhammad Bilal, thanks for your tutorial on installing Hadoop on Windows.
I have a question: if I have a clustered server setup, are the installation steps the same?
Thanks
The final step, opening localhost:50070, is not working; I even tried replacing the port number with 9870, but I still cannot get the desired result. Please help.
The error message asked me to update the hadoop-env.cmd code.
Below is the code; there is no "contrib" folder:
set JAVA_HOME=C:\Java\jdk1.8.0_181
@Rem The jsvc implementation to use. Jsvc is required to run secure datanodes.
@Rem set JSVC_HOME=%JSVC_HOME%
@Rem set HADOOP_CONF_DIR=
@Rem Extra Java CLASSPATH elements. Automatically insert capacity-scheduler.
if exist %HADOOP_HOME%\contrib\capacity-scheduler (
if not defined HADOOP_CLASSPATH (
set HADOOP_CLASSPATH=%HADOOP_HOME%\contrib\capacity-scheduler*.jar
) else (
set HADOOP_CLASSPATH=%HADOOP_CLASSPATH%;%HADOOP_HOME%\contrib\capacity-scheduler*.jar
)
)
I appreciate any assistance you can provide to help me solve this problem.
Error when running hdfs namenode -format
It appears that Hadoop has an incorrect classpath in hadoop-env.sh.
I checked all the steps three times and installed your downloads for bin and env.
I downloaded Hadoop 2.8.0.
Is HADOOP_HOME = C:\Hadoop-2.8.0\bin correct? (I tried with and without \bin and got the same results.)
Please reply, thanks.
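For what it's worth, an earlier comment in this thread notes that HADOOP_HOME should be set without the trailing \bin. A sketch of the usual environment setup (the paths are assumptions based on the 2.8.0 layout used in this guide; adjust to wherever you extracted Hadoop):

```
@rem HADOOP_HOME points at the extraction root, NOT at \bin.
set HADOOP_HOME=C:\hadoop-2.8.0
@rem The bin and sbin folders go on PATH instead.
set PATH=%PATH%;%HADOOP_HOME%\bin;%HADOOP_HOME%\sbin
```

With this layout, `%HADOOP_HOME%\etc\hadoop` resolves to the config directory, which is where the scripts expect it.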
A job is stuck:
18/08/07 21:58:40 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
18/08/07 21:58:42 INFO input.FileInputFormat: Total input files to process : 1
18/08/07 21:58:43 INFO mapreduce.JobSubmitter: number of splits:1
18/08/07 21:58:43 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1533659271999_0001
18/08/07 21:58:44 INFO impl.YarnClientImpl: Submitted application application_1533659271999_0001
18/08/07 21:58:44 INFO mapreduce.Job: The url to track the job: http://DESKTOP-HES8I7I:8088/proxy/application_1533659271999_0001/
18/08/07 21:58:44 INFO mapreduce.Job: Running job: job_1533659271999_0001
What's the problem?
I followed these steps and installed Hadoop in standalone mode. Can you please share the steps for a cluster?
Thanks
Please help, my DataNode is shutting down with this error:
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2924)
2020-08-20 17:13:14,547 INFO util.ExitUtil: Exiting with status 1: org.apache.hadoop.util.DiskChecker$DiskErrorException: Too many failed volumes - current valid volumes: 0, volumes configured: 2, volumes failed: 2, volume failures tolerated: 1.
This is my hdfs-site.xml config:
<property>
<name>dfs.replication</name>
<value>3</value>
</property>
<property>
<name>dfs.datanode.failed.volumes.tolerated</name>
<value>1</value>
</property>
<property>
<name>dfs.namenode.name.dir</name>
<value>/Users/Samson/Desktop/services/hadoop2/hadoop-3.1.3/Nodes/NameNode</value>
</property>
<property>
<name>dfs.datanode.data.dir</name>
<value>/Users/Samson/Desktop/services/hadoop2/hadoop-3.1.3/Nodes/DataNode,/D:DataNode</value>
</property>
Please, how can you help me?
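A likely cause (my assumption; not confirmed in the thread) is the second entry in dfs.datanode.data.dir: `/D:DataNode` is not a valid path or URI on Windows, so that volume fails, and with two volumes configured and only one failure tolerated the DataNode shuts down. A sketch of that property using the file:/// URI style suggested earlier in this thread:

```
<property>
  <name>dfs.datanode.data.dir</name>
  <value>/Users/Samson/Desktop/services/hadoop2/hadoop-3.1.3/Nodes/DataNode,file:///D:/DataNode</value>
</property>
```

Alternatively, drop the second volume entirely; with a single volume, dfs.datanode.failed.volumes.tolerated should be left at 0.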
After executing start-all.cmd I get 4 cmd panels; 3 of them show the mentioned error, all except the YARN Resource Manager. localhost:8088 opens, but there are 0 active nodes. Please kindly help.
When running hdfs namenode -format I am getting the error below:
Error: Could not find or load main class S
Could you please help me resolve the issue?
Hi Bilal,
I am trying to set up Hadoop on Windows 2008 R2 based on the steps you gave. Below is the error I am getting. Could you please look at the error and provide a resolution if you have any idea?
Microsoft Windows [Version 6.1.7601]
Copyright (c) 2009 Microsoft Corporation. All rights reserved.
C:\hadoop-2.8.0\bin>hdfs namenode -format
'C:\Program' is not recognized as an internal or external command,
operable program or batch file.
18/04/05 15:41:08 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG: user = User
STARTUP_MSG: host = ********************
STARTUP_MSG: args = [ûformat]
STARTUP_MSG: version = 2.8.0
STARTUP_MSG: classpath = C:\hadoop-2.8.0\etc\hadoop;C:\hadoop-2.8.0\share\hado
op\common\lib\activation-1.1.jar;C:\hadoop-2.8.0\share\hadoop\common\lib\apached
s-i18n-2.0.0-M15.jar;C:\hadoop-2.8.0\share\hadoop\common\lib\apacheds-kerberos-c
odec-2.0.0-M15.jar;C:\hadoop-2.8.0\share\hadoop\common\lib\api-asn1-api-1.0.0-M2
0.jar;C:\hadoop-2.8.0\share\hadoop\common\lib\api-util-1.0.0-M20.jar;C:\hadoop-2
.8.0\share\hadoop\common\lib\asm-3.2.jar;C:\hadoop-2.8.0\share\hadoop\common\lib
\avro-1.7.4.jar;C:\hadoop-2.8.0\share\hadoop\common\lib\commons-beanutils-1.7.0.
jar;C:\hadoop-2.8.0\share\hadoop\common\lib\commons-beanutils-core-1.8.0.jar;C:
hadoop-2.8.0\share\hadoop\common\lib\commons-cli-1.2.jar;C:\hadoop-2.8.0\share\h
adoop\common\lib\commons-codec-1.4.jar;C:\hadoop-2.8.0\share\hadoop\common\lib\c
ommons-collections-3.2.2.jar;C:\hadoop-2.8.0\share\hadoop\common\lib\commons-com
press-1.4.1.jar;C:\hadoop-2.8.0\share\hadoop\common\lib\commons-configuration-1.
6.jar;C:\hadoop-2.8.0\share\hadoop\common\lib\commons-digester-1.8.jar;C:\hadoop
-2.8.0\share\hadoop\common\lib\commons-io-2.4.jar;C:\hadoop-2.8.0\share\hadoop\c
ommon\lib\commons-lang-2.6.jar;C:\hadoop-2.8.0\share\hadoop\common\lib\commons-l
ogging-1.1.3.jar;C:\hadoop-2.8.0\share\hadoop\common\lib\commons-math3-3.1.1.jar
;C:\hadoop-2.8.0\share\hadoop\common\lib\commons-net-3.1.jar;C:\hadoop-2.8.0\sha
re\hadoop\common\lib\curator-client-2.7.1.jar;C:\hadoop-2.8.0\share\hadoop\commo
n\lib\curator-framework-2.7.1.jar;C:\hadoop-2.8.0\share\hadoop\common\lib\curato
r-recipes-2.7.1.jar;C:\hadoop-2.8.0\share\hadoop\common\lib\gson-2.2.4.jar;C:\ha
doop-2.8.0\share\hadoop\common\lib\guava-11.0.2.jar;C:\hadoop-2.8.0\share\hadoop
\common\lib\hadoop-annotations-2.8.0.jar;C:\hadoop-2.8.0\share\hadoop\common\lib
\hadoop-auth-2.8.0.jar;C:\hadoop-2.8.0\share\hadoop\common\lib\hamcrest-core-1.3
.jar;C:\hadoop-2.8.0\share\hadoop\common\lib\htrace-core4-4.0.1-incubating.jar;C
:\hadoop-2.8.0\share\hadoop\common\lib\httpclient-4.5.2.jar;C:\hadoop-2.8.0\shar
e\hadoop\common\lib\httpcore-4.4.4.jar;C:\hadoop-2.8.0\share\hadoop\common\lib\j
ackson-core-asl-1.9.13.jar;C:\hadoop-2.8.0\share\hadoop\common\lib\jackson-jaxrs
-1.9.13.jar;C:\hadoop-2.8.0\share\hadoop\common\lib\jackson-mapper-asl-1.9.13.ja
r;C:\hadoop-2.8.0\share\hadoop\common\lib\jackson-xc-1.9.13.jar;C:\hadoop-2.8.0
share\hadoop\common\lib\java-xmlbuilder-0.4.jar;C:\hadoop-2.8.0\share\hadoop\com
[... line-wrapped STARTUP_MSG classpath listing elided: jars under C:\hadoop-2.8.0\share\hadoop\common, \hdfs, \yarn and \mapreduce ...]
STARTUP_MSG: build = https://git-wip-us.apache.org/repos/asf/hadoop.git -r 91f2b7a13d1e97be65db92ddabc627cc29ac0009; compiled by 'jdu' on 2017-03-17T04:12Z
STARTUP_MSG: java = 10
************************************************************/
18/04/05 15:41:08 INFO namenode.NameNode: createNameNode [–format]
[Fatal Error] core-site.xml:22:2: The markup in the document following the root element must be well-formed.
18/04/05 15:41:09 FATAL conf.Configuration: error parsing conf core-site.xml
org.xml.sax.SAXParseException; systemId: file:/C:/hadoop-2.8.0/etc/hadoop/core-site.xml; lineNumber: 22; columnNumber: 2; The markup in the document following the root element must be well-formed.
        at org.apache.xerces.parsers.DOMParser.parse(Unknown Source)
        at org.apache.xerces.jaxp.DocumentBuilderImpl.parse(Unknown Source)
        at java.xml/javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:151)
        at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2531)
        at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2519)
        at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2590)
        at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2543)
        at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2426)
        at org.apache.hadoop.conf.Configuration.set(Configuration.java:1151)
        at org.apache.hadoop.conf.Configuration.set(Configuration.java:1123)
        at org.apache.hadoop.conf.Configuration.setBoolean(Configuration.java:1459)
        at org.apache.hadoop.util.GenericOptionsParser.processGeneralOptions(GenericOptionsParser.java:322)
        at org.apache.hadoop.util.GenericOptionsParser.parseGeneralOptions(GenericOptionsParser.java:488)
        at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:170)
        at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:153)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1532)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1671)
18/04/05 15:41:09 ERROR namenode.NameNode: Failed to start namenode.
java.lang.RuntimeException: org.xml.sax.SAXParseException; systemId: file:/C:/hadoop-2.8.0/etc/hadoop/core-site.xml; lineNumber: 22; columnNumber: 2; The markup in the document following the root element must be well-formed.
        at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2696)
        at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2543)
        at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2426)
        at org.apache.hadoop.conf.Configuration.set(Configuration.java:1151)
        at org.apache.hadoop.conf.Configuration.set(Configuration.java:1123)
        at org.apache.hadoop.conf.Configuration.setBoolean(Configuration.java:1459)
        at org.apache.hadoop.util.GenericOptionsParser.processGeneralOptions(GenericOptionsParser.java:322)
        at org.apache.hadoop.util.GenericOptionsParser.parseGeneralOptions(GenericOptionsParser.java:488)
        at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:170)
        at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:153)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1532)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1671)
Caused by: org.xml.sax.SAXParseException; systemId: file:/C:/hadoop-2.8.0/etc/hadoop/core-site.xml; lineNumber: 22; columnNumber: 2; The markup in the document following the root element must be well-formed.
        at org.apache.xerces.parsers.DOMParser.parse(Unknown Source)
        at org.apache.xerces.jaxp.DocumentBuilderImpl.parse(Unknown Source)
        at java.xml/javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:151)
        at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2531)
        at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2519)
        at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2590)
        ... 11 more
18/04/05 15:41:09 INFO util.ExitUtil: Exiting with status 1
18/04/05 15:41:09 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at **********************************
************************************************************/
C:\hadoop-2.8.0\bin>
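A parse error like the one above ("markup in the document following the root element", reported at line 22, column 2) almost always means there are stray characters after the closing </configuration> tag, often left over from a copy-paste. As a sketch (the fs.defaultFS value is an example; use whatever the guide's step for core-site.xml gives you), a well-formed core-site.xml ends exactly at the root element:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Example only: a minimal well-formed core-site.xml.
     Nothing, not even a single stray character, may follow </configuration>. -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```

Opening the file and deleting everything after the last `>` of `</configuration>` (including invisible trailing characters on line 22) should clear the SAXParseException.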
I've been struggling with this for days.
No matter what I do, Hadoop doesn't seem to recognize the HADOOP_HOME environment variable, even though I can use the variable in cmd.
I've tried:
HADOOP_HOME = C:\Hadoop
HADOOP_HOME = C:\Hadoop\bin
Nothing works. This is my console log:
`C:\Users\pipe>cd %HADOOP_HOME%
C:\Hadoop\bin>hadoop version
+================================================================+
| Error: HADOOP_HOME is not set correctly |
+----------------------------------------------------------------+
| Please set your HADOOP_HOME variable to the absolute path of |
| the directory that contains the hadoop distribution |
+================================================================+
'-Dhadoop.security.logger' is not recognized as an internal or external command,
operable program or batch file.`
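For what it's worth, two things commonly produce that banner on Windows (the paths below are assumptions about your setup, so adjust them): HADOOP_HOME must point at the distribution root, not at \bin, and JAVA_HOME must not contain a space, because hadoop.cmd mis-parses "Program Files" — that is also a known cause of the trailing "'-Dhadoop.security.logger' is not recognized" error. A sketch:

```shell
:: Example paths only -- substitute your own install locations.
:: HADOOP_HOME is the distribution root, NOT C:\Hadoop\bin.
setx HADOOP_HOME "C:\Hadoop"
:: Use the 8.3 short name to avoid the space in "Program Files"
:: (the JDK folder name here is hypothetical).
setx JAVA_HOME "C:\PROGRA~1\Java\jdk1.8.0_161"
:: setx only affects NEW consoles: open a fresh command prompt, then verify.
hadoop version
```

Note that `setx` writes the variable permanently but does not update the current console, so the check must run in a newly opened window.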
After running start-all.cmd on Windows 10, I am getting:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Failed to setup local dir /RND/hadoop-2.8.0/tmp-nm, which was marked as good.
at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService.checkAndInitializeLocalDirs(ResourceLocalizationService.java:1533)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService$1.onDirsChanged(ResourceLocalizationService.java:271)
at org.apache.hadoop.yarn.server.nodemanager.DirectoryCollection.registerDirsChangeListener(DirectoryCollection.java:197)
at org.apache.hadoop.yarn.server.nodemanager.LocalDirsHandlerService.registerLocalDirsChangeListener(LocalDirsHandlerService.java:242)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService.serviceStart(ResourceLocalizationService.java:371)
at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.ContainerManagerImpl.serviceStart(ContainerManagerImpl.java:490)
at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
at org.apache.hadoop.yarn.server.nodemanager.NodeManager.serviceStart(NodeManager.java:369)
at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
at org.apache.hadoop.yarn.server.nodemanager.NodeManager.initAndStartNodeManager(NodeManager.java:637)
at org.apache.hadoop.yarn.server.nodemanager.NodeManager.main(NodeManager.java:684)
Caused by: org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Permissions incorrectly set for dir /RND/hadoop-2.8.0/tmp-nm/filecache, should be rwxr-xr-x, actual value = rw-rw-rw-
at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService.checkLocalDir(ResourceLocalizationService.java:1560)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService.checkAndInitializeLocalDirs(ResourceLocalizationService.java:1528)
... 13 more
19/10/02 18:38:18 INFO ipc.Server: Stopping IPC Server Responder
19/10/02 18:38:18 WARN nodemanager.NodeResourceMonitorImpl: org.apache.hadoop.yarn.server.nodemanager.NodeResourceMonitorImpl is interrupted. Exiting.
NodeManager fails at startup even though the directory appears to have the correct permissions.
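One possible explanation (an assumption, not a confirmed diagnosis): the NodeManager checks POSIX-style permissions through winutils, so a directory whose Windows ACLs look fine can still report as rw-rw-rw-. A hedged workaround, using the path from the error above and assuming winutils.exe is present under %HADOOP_HOME%\bin, is to set the expected mode explicitly:

```shell
:: Sketch only -- path taken from the error message; adjust to your layout.
%HADOOP_HOME%\bin\winutils.exe chmod -R 755 C:\RND\hadoop-2.8.0\tmp-nm
```

If that doesn't help, pointing yarn.nodemanager.local-dirs in yarn-site.xml at a fresh, empty directory lets the NodeManager create it with the permissions it expects.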
First of all, thanks for the details on installing Hadoop on Windows 10. I would like to install the same in a Windows 7 environment. Also, it would be great if you provided an example to run once the installation has completed successfully.
Hi,
When I try to run hdfs namenode-format, an error shows up saying:
C:\Users\rayan>hdfs namenode-format
Error: Could not find or load main class namenode-format
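That "Could not find or load main class namenode-format" message means the subcommand and the flag were passed as a single token: there must be a space before the dash, and the dash must be a plain ASCII hyphen (an en dash pasted from a web page fails in a similar way):

```shell
hdfs namenode -format
```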