Thanks very much, Praba, for collecting all the relevant info and links for setting up HDFS and YARN, and for the very practical MapReduce example. This is one of the best and most useful articles on setting up Hadoop on a laptop/PC. Using it, I was able to set up and demo my own project on my laptop to my business partner in less than an hour. I appreciate all your effort in laying it out step by step in such great detail.
Hi,
I'm getting the errors below while installing Hadoop on Windows 7 64-bit. I don't know what the problem is.
In the datanode:
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
15/07/01 00:37:36 INFO datanode.DataNode: STARTUP_MSG:
STARTUP_MSG: java = 1.8.0_45
************************************************************/
15/07/01 00:37:36 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/07/01 00:37:37 FATAL datanode.DataNode: Exception in secureMain
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
        at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
        at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:560)
        at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:977)
        at org.apache.hadoop.util.DiskChecker.checkAccessByFileMethods(DiskChecker.java:177)
        at org.apache.hadoop.util.DiskChecker.checkDirAccess(DiskChecker.java:164)
        at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:147)
        at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:1819)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:1861)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1843)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1748)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1786)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1952)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1973)
15/07/01 00:37:37 INFO util.ExitUtil: Exiting with status 1
15/07/01 00:37:37 INFO datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at EKA-PC/192.168.0.21
************************************************************/
C:\hadoop-2.3.0\sbin
In the namenode:
STARTUP_MSG: java = 1.8.0_45
************************************************************/
15/07/01 00:37:36 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/07/01 00:37:37 FATAL datanode.DataNode: Exception in secureMain
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
        at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
        at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:560)
        at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:977)
        at org.apache.hadoop.util.DiskChecker.checkAccessByFileMethods(DiskChecker.java:177)
        at org.apache.hadoop.util.DiskChecker.checkDirAccess(DiskChecker.java:164)
        at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:147)
        at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:1819)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:1861)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1843)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1748)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1786)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1952)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1973)
15/07/01 00:37:37 INFO util.ExitUtil: Exiting with status 1
15/07/01 00:37:37 INFO datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at EKA-PC/192.168.0.21
************************************************************/
C:\hadoop-2.3.0\sbin>
Exception in thread "main" java.lang.NoClassDefFoundError: com/sun/tools/javac/Main
Caused by: java.lang.ClassNotFoundException: com.sun.tools.javac.Main
        at java.net.URLClassLoader$1.run(Unknown Source)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(Unknown Source)
        at java.lang.ClassLoader.loadClass(Unknown Source)
        at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
        at java.lang.ClassLoader.loadClass(Unknown Source)
Could not find the main class: com.sun.tools.javac.Main. Program will exit.
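There seem to be two separate problems in the logs above, assuming a standard Hadoop-on-Windows install. The `UnsatisfiedLinkError` on `NativeIO$Windows.access0` usually means the Windows native binaries (`hadoop.dll` and `winutils.exe`) are missing from `%HADOOP_HOME%\bin` or not on the `PATH`; the missing `com.sun.tools.javac.Main` usually means `JAVA_HOME` points at a JRE rather than a JDK. A rough sketch of the checks, with illustrative paths only (adjust to your layout):

```shell
:: 1) The native Windows binaries must exist in %HADOOP_HOME%\bin and be on PATH.
set HADOOP_HOME=C:\hadoop-2.3.0
dir %HADOOP_HOME%\bin\winutils.exe
dir %HADOOP_HOME%\bin\hadoop.dll
set PATH=%HADOOP_HOME%\bin;%PATH%

:: 2) JAVA_HOME must point at a JDK (which includes javac), not a JRE.
set JAVA_HOME=C:\Java\jdk1.8.0_45
"%JAVA_HOME%\bin\javac" -version
```

Note that for some 2.x releases the bundled native libraries were built 32-bit; on 64-bit Windows you may need to build `winutils.exe` and `hadoop.dll` from source for the error to go away.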
I am able to compile the Recipe project and launch it from Eclipse. However, I get the following exception. I am passing the directories in and out; /out does not exist, and /in is there with the recipes file. I am compiling with 1.6.0_31 as per the instructions. Any hints?
Regards,
Bhushan
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
/in
/out
Exception in thread "main" 0: No such file or directory
at org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmod(NativeIO.java:232)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:627)
at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:468)
at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:598)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:179)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:301)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:389)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
at Recipe.main(Recipe.java:85)
Hi,
Can anyone please guide me?
After running the start-dfs command, I get the following error:
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
Exception in thread "main" java.lang.NoClassDefFoundError: sharma
Caused by: java.lang.ClassNotFoundException: sharma
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: sharma. Program will exit.
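A class literally named `sharma` suggests a space somewhere in a configured path is being split by the cmd launch scripts, so the trailing token ends up treated as the main class. This is only a guess from the message, but common culprits are a Java install under `Program Files` or a Windows user name containing a space. An illustrative workaround is to point `JAVA_HOME` at a space-free path, e.g. the 8.3 short form:

```shell
:: Hypothetical example: use the 8.3 short name so JAVA_HOME has no spaces,
:: or install the JDK directly under a space-free directory like C:\Java.
set JAVA_HOME=C:\PROGRA~1\Java\jdk1.8.0_45
```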
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
15/01/29 01:49:06 ERROR util.Shell: Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable C:\hadoop-2.3.0\bin\winutils.exe in the Hadoop binaries.
        at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:318)
        at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:333)
        at org.apache.hadoop.util.Shell.<clinit>(Shell.java:326)
        at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:76)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1951)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1973)
15/01/29 01:49:37 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/01/29 01:49:37 FATAL datanode.DataNode: Exception in secureMain
java.lang.NullPointerException
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:1010)
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:445)
        at org.apache.hadoop.util.Shell.run(Shell.java:418)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:650)
        at org.apache.hadoop.util.Shell.execCommand(Shell.java:739)
        at org.apache.hadoop.util.Shell.execCommand(Shell.java:722)
        at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:631)
        at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:468)
        at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:130)
        at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:146)
        at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:1819)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:1861)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1843)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1748)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1786)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1952)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1973)
15/01/29 01:49:37 INFO util.ExitUtil: Exiting with status 1
15/01/29 01:49:37 INFO datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at Yashasree/192.168.124.1
************************************************************/
And I have the same problem: the datanode terminates after running start-dfs.cmd.
Because of this the application never reaches the running state; it gets stuck in the assigned state.
Any help on how to resolve this issue would be highly appreciated.
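The IOException in the log above is explicit: Hadoop is looking for C:\hadoop-2.3.0\bin\winutils.exe and not finding it, and the later NullPointerException in Shell.runCommand is a knock-on effect of the same missing binary. The Apache 2.x tarball does not ship the Windows native binaries, so they have to be built (or obtained) separately and dropped into the bin folder. A sketch, assuming HADOOP_HOME is C:\hadoop-2.3.0 and `path\to\` stands in for wherever you built or downloaded the binaries:

```shell
:: winutils.exe and hadoop.dll must exist in %HADOOP_HOME%\bin
:: before start-dfs.cmd can bring the datanode up.
set HADOOP_HOME=C:\hadoop-2.3.0
copy path\to\winutils.exe %HADOOP_HOME%\bin\
copy path\to\hadoop.dll %HADOOP_HOME%\bin\
set PATH=%HADOOP_HOME%\bin;%PATH%

:: Then retry:
%HADOOP_HOME%\sbin\start-dfs.cmd
```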
I am also not able to start the dfs.
Getting the below exception:
15/12/07 00:30:37 FATAL namenode.NameNode: Exception in namenode join
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:560)
at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:996)
at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.analyzeStorage(Storage.java:461)
at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverStorageDirs(FSImage.java:282)
at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:200)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFSImage(FSNamesystem.java:849)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:609)
at org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:446)
at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:502)
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:658)
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:643)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1259)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1325)
15/12/07 00:30:37 INFO util.ExitUtil: Exiting with status 1
15/12/07 00:30:37 INFO namenode.NameNode: SHUTDOWN_MSG:
java.lang.UnsupportedClassVersionError: Recipe : Unsupported major.minor version 51.0
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Picked up _JAVA_OPTIONS: -Xmx512M
Exception in thread "main"
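Regarding the trace above: "Unsupported major.minor version 51.0" means the Recipe class was compiled for Java 7 (class-file version 51) but is being run on an older JRE (Java 6 reads up to version 50). Either run the job on a Java 7 or newer JRE, or recompile targeting the older runtime. A sketch of the recompile, assuming a newer JDK's javac and a file named Recipe.java:

```shell
:: Cross-compile so the class file runs on a Java 6 JRE (flags per the
:: javac cross-compilation options; file name is illustrative).
javac -source 1.6 -target 1.6 Recipe.java
```

Making sure Eclipse's compiler compliance level matches the JRE you launch with avoids the same mismatch from inside the IDE.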
I have tried many times but get the same error every time.
I have already watched your video and followed the same steps,
but the problem comes when I type the command: C:\hadoop-2.3.0\bin>hadoop -namnode format
Error: Could not find or load main class
How do I resolve this?
Please help me as soon as possible.
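The command as typed is malformed, which is why the launcher cannot find a main class: with `-namnode` in first position, the hadoop script appears to treat it as a class name rather than a subcommand. On Hadoop 2.x the NameNode format command is:

```shell
:: Run from %HADOOP_HOME%\bin. Preferred form:
hdfs namenode -format
:: or the older, deprecated form:
hadoop namenode -format
```

Note the subcommand is `namenode` (not `-namnode`) and the `-format` flag comes after it.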
I've copied hadoop-eclipse-kepler-plugin-2.2.0.jar to the plugins folder, but when I click Other > Map/Reduce in the Open Perspective menu, nothing happens. Did you get past this step?