I am able to compile the Recipe project and launch it from Eclipse; however, I get the following exception. I am passing the /in and /out directories as arguments. /out does not exist; /in is there with the recipes file. I am compiling with 1.6.0_31, as per the instructions. Any hints?
Regards,
Bhushan
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
/in
/out
Exception in thread "main" 0: No such file or directory
at org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmod(NativeIO.java:232)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:627)
at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:468)
at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:598)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:179)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:301)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:389)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
at Recipe.main(Recipe.java:85)
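For anyone hitting the same trace: when a job is launched from Eclipse on Windows, a failure inside NativeIO$POSIX.chmod with "No such file or directory" is often a sign that Hadoop's native Windows helpers (winutils.exe, hadoop.dll) are not visible to the launched JVM. A rough sketch of the environment setup, assuming a hypothetical install directory C:\hadoop (adjust to your own layout):

```shell
:: Windows Command Prompt sketch -- paths are placeholders, adjust to your install
set HADOOP_HOME=C:\hadoop
set PATH=%PATH%;%HADOOP_HOME%\bin

:: Verify the native helper actually exists, then start Eclipse from this
:: same console so the launched job inherits these variables
dir %HADOOP_HOME%\bin\winutils.exe
```

This is only a sketch of one common cause, not a confirmed fix for this exact setup.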
Hi,
Can anyone please guide me?
After running the start-dfs command, I get the following error:
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
Exception in thread "main" java.lang.NoClassDefFoundError: sharma
Caused by: java.lang.ClassNotFoundException: sharma
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: sharma. Program will exit.
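A guess, since the "class" it fails to find (sharma) looks like part of a Windows user or folder name: if JAVA_HOME or another configured path contains a space (e.g. something like C:\Users\... sharma\...), the start scripts can split the path at the space and treat the trailing word as the main class. One hedged workaround is to point JAVA_HOME at a space-free path in etc\hadoop\hadoop-env.cmd (the JDK path below is a placeholder):

```shell
:: etc\hadoop\hadoop-env.cmd -- sketch; the JDK path is a placeholder
:: Avoid paths with spaces such as "C:\Program Files\Java\..."
set JAVA_HOME=C:\Java\jdk1.7.0_51
:: Alternatively use the 8.3 short name for a path that contains spaces
:: (for "C:\Program Files" this is commonly C:\PROGRA~1)
```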
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
15/01/29 01:49:06 ERROR util.Shell: Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable C:\hadoop-2.3.0\bin\winutils.exe in the Hadoop binaries.
at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:318)
at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:333)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:326)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:76)
at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1951)
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1973)
15/01/29 01:49:37 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/01/29 01:49:37 FATAL datanode.DataNode: Exception in secureMain
java.lang.NullPointerException
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1010)
at org.apache.hadoop.util.Shell.runCommand(Shell.java:445)
at org.apache.hadoop.util.Shell.run(Shell.java:418)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:650)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:739)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:722)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:631)
at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:468)
at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:130)
at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:146)
at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:1819)
at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:1861)
at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1843)
at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1748)
at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1786)
at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1952)
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1973)
15/01/29 01:49:37 INFO util.ExitUtil: Exiting with status 1
15/01/29 01:49:37 INFO datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at Yashasree/192.168.124.1
************************************************************/
I have the same problem: the DataNode terminates after running start-dfs.cmd. Because of this, the application never reaches the running state; it gets stuck in the assigned state.
Any help resolving this issue would be highly appreciated.
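For the "Failed to locate the winutils binary" error above: the Apache Hadoop 2.x tarballs do not ship the Windows native helpers, so winutils.exe and hadoop.dll have to be obtained or built separately, matched to your exact Hadoop version, and placed in the Hadoop bin directory. A sketch, using the install path shown in the log (where you get the helper binaries from is up to you; only version-matched builds will work):

```shell
:: Sketch: copy version-matched Windows native helpers into Hadoop's bin
copy winutils.exe C:\hadoop-2.3.0\bin\
copy hadoop.dll C:\hadoop-2.3.0\bin\
set HADOOP_HOME=C:\hadoop-2.3.0
set PATH=%PATH%;%HADOOP_HOME%\bin
:: Restart the daemons from a fresh console so they pick up the new PATH
```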
I am also not able to start DFS. I am getting the exception below:
15/12/07 00:30:37 FATAL namenode.NameNode: Exception in namenode join
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:560)
at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:996)
at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.analyzeStorage(Storage.java:461)
at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverStorageDirs(FSImage.java:282)
at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:200)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFSImage(FSNamesystem.java:849)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:609)
at org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:446)
at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:502)
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:658)
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:643)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1259)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1325)
15/12/07 00:30:37 INFO util.ExitUtil: Exiting with status 1
15/12/07 00:30:37 INFO namenode.NameNode: SHUTDOWN_MSG:
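A possible lead on the UnsatisfiedLinkError above: NativeIO$Windows.access0 is implemented in hadoop.dll, so this error usually means that library is missing from the JVM's library path, or that a stale copy built for a different Hadoop version is being loaded first (C:\Windows\System32 is a common place for stale copies to hide). A hedged checklist in command form:

```shell
:: Sketch: look for hadoop.dll in the expected and the "shadowing" locations
dir %HADOOP_HOME%\bin\hadoop.dll
dir C:\Windows\System32\hadoop.dll
:: Keep a hadoop.dll matching your Hadoop build in %HADOOP_HOME%\bin and put
:: that directory at the front of PATH so it wins over any stale copies
set PATH=%HADOOP_HOME%\bin;%PATH%
```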
java.lang.UnsupportedClassVersionError: Recipe : Unsupported major.minor version 51.0
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Picked up _JAVA_OPTIONS: -Xmx512M
Exception in thread "main"
I have tried many times, but the same error keeps appearing. I have already watched your video and followed the same steps shown in it, but the problem comes when I type the command: C:\hadoop-2.3.0\bin>hadoop -namnode format
Error: could not find or load main class
How do I resolve this?
Please help me as soon as possible.
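For what it's worth, the command as typed in the comment above is misspelled ("-namnode") and has the wrong shape: namenode is a subcommand, not an option, which is why the launcher reports a missing main class. The usual invocation (and the older, now-deprecated form the DEPRECATED warnings elsewhere in this thread refer to) is:

```shell
:: Format the NameNode (run once, before the first start of HDFS)
hdfs namenode -format
:: Older releases used this equivalent, now-deprecated form:
hadoop namenode -format
```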
I've copied hadoop-eclipse-kepler-plugin-2.2.0.jar to the plugins folder, but when I click Others > Map/Reduce in the Open Perspective menu, nothing happens. Did you come across this step?
I've added hadoop-eclipse-kepler-plugin-2.2.0.jar to the plugins folder, and
Map/Reduce is shown in the Open Perspective menu, but when I click OK,
nothing happens.
Does anyone have an answer to this question? I have tried both Java 1.6 and 1.7 with Eclipse Luna, Kepler, and Europa, with all of the Hadoop plugin .jar files, and still no luck.
I would really appreciate some solid advice on getting the MapReduce job to run!
I am facing the same problem when adding the Map/Reduce perspective.
I am getting this error: Plug-in org.apache.hadoop.eclipse was unable to load class org.apache.hadoop.eclipse.view.servers.ServerView.
Do you have any suggestions for this?
Excellent tutorial, the best for Hadoop beginners. I have spent five days trying to "swim" through different bugs and configurations to get this running on the Windows platform. The command-line execution went fine, with no errors; the only problem is when I try to run it through Eclipse. The error I get is as follows:
Exception in thread "main" java.lang.NullPointerException
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1010)
at org.apache.hadoop.util.Shell.runCommand(Shell.java:445)
at org.apache.hadoop.util.Shell.run(Shell.java:418)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:650)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:739)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:722)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:631)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:421)
at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:277)
at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:125)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:348)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
at Recepti.main(Recepti.java:30)
This happens when I use JDK 7 for compilation and runtime. When using JDK 6, another error shows up:
java.lang.UnsupportedClassVersionError: Recipe : Unsupported major.minor version 51.0
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Exception in thread "main"
Please suggest what the problem could be, so that I can continue with Hadoop programming.
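On the second error: "Unsupported major.minor version 51.0" means the Recipe class was compiled by JDK 7 (class-file version 51) but is being loaded by an older JVM; JDK 6 reads class files only up to version 50. Either run everything on JDK 7, or cross-compile for the older runtime. A sketch (the classpath variable is a placeholder for your Hadoop jars):

```shell
:: Check which JVM actually runs the job
java -version
:: Or, on JDK 7, compile class files a Java 6 JVM can load
javac -source 1.6 -target 1.6 -cp %HADOOP_CLASSPATH% Recipe.java
```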