Hi,
When I start the DFS, it shows the following error. Please help.
java.io.IOException: NameNode is not formatted.
        at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:210)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFSImage(FSNamesystem.java:849)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:609)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:446)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:502)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:658)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:643)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1259)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1325)
15/01/20 19:48:48 INFO mortbay.log: Stopped SelectChannelConnector@127.0.0.1:50070
15/01/20 19:48:48 INFO impl.MetricsSystemImpl: Stopping NameNode metrics system...
15/01/20 19:48:48 INFO impl.MetricsSystemImpl: NameNode metrics system stopped.
15/01/20 19:48:48 INFO impl.MetricsSystemImpl: NameNode metrics system shutdown complete.
15/01/20 19:48:48 FATAL namenode.NameNode: Exception in namenode join
java.io.IOException: NameNode is not formatted.
        (same stack trace as above)
15/01/20 19:48:48 INFO util.ExitUtil: Exiting with status 1
15/01/20 19:48:48 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at jothi-PC/192.168.1.9
************************************************************/
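The "NameNode is not formatted" IOException generally means the NameNode's metadata directory has never been initialized (or was wiped). A likely fix, assuming a fresh single-node setup whose HDFS contents can be discarded, is to format the NameNode once before starting DFS:

```shell
:: WARNING: formatting erases any existing HDFS metadata, so only do this
:: on a fresh install or when the data can be thrown away.
hadoop namenode -format

:: then start DFS again
C:\hadoop-2.3.0\sbin\start-dfs.cmd
```

If the format step asks for confirmation, answer Y; afterwards the NameNode should come up without the IOException.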
Hi Prabha,
While running the recipe.jar file, I am getting a ClassNotFoundException as below. Please help.
14/10/18 16:11:55 INFO mapreduce.Job: Task Id : attempt_1413621124083_0001_m_000000_2, Status : FAILED
Error: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class Recipe$TokenizerMapper not found
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1882)
        at org.apache.hadoop.mapreduce.task.JobContextImpl.getMapperClass(JobContextImpl.java:186)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:722)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.lang.ClassNotFoundException: Class Recipe$TokenizerMapper not found
        at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1788)
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1880)
        ... 8 more
14/10/18 16:12:01 INFO mapreduce.Job: map 100% reduce 100%
14/10/18 16:12:06 INFO mapreduce.Job: Job job_1413621124083_0001 failed with state FAILED due to: Task failed task_1413621124083_0001_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces
Hi Prabha,
I am still getting the same error while running the jar command.
Error: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class Recipe$TokenizerMapper not found
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1882)
I cross-checked, and also reinstalled the 64-bit version of JDK 6, but I still get the same error.
Also, the following three Gson jar files are kept at C:\hadoop-2.3.0\share\hadoop\common\lib:
1. gson-2.2.4.jar
2. gson-2.2.4-javadoc.jar
3. gson-2.2.4-sources.jar
while hadoop-common-2.3.0.jar is stored at C:\hadoop-2.3.0\share\hadoop\common.
Could you please help?
Copy all the files from C:\Hwork (it includes all the class files that are not being found, i.e. Recipe$TokenizerMapper.class, Recipe$IntSumReducer.class, etc.) to C:\hadoop-2.3.0\etc\hadoop.
Do this just before running your command "hadoop jar c:\Hwork\Recipe.jar Recipe /in /out".
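Copying the loose .class files into etc\hadoop works because that directory is on Hadoop's classpath. A quicker sanity check, assuming the JDK's jar tool is on your PATH, is to list the job jar's contents and confirm the nested mapper/reducer classes were actually packaged into it:

```shell
:: List the contents of the job jar. The inner classes
:: (Recipe$TokenizerMapper.class, Recipe$IntSumReducer.class) must appear
:: in this listing, otherwise the YARN task JVMs cannot load them.
jar tf c:\Hwork\Recipe.jar
```

If they are missing, rebuild the jar from the directory containing all the compiled classes (e.g. `jar cf Recipe.jar *.class`) rather than copying files around.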
Hi Prabha,
Thanks for sharing this article; I am really thankful to you.
I have installed everything as per your step-by-step guide. However, I am receiving the below error while starting DFS:
"Could not locate executable C:\hadoop-2.3.0\bin\winutils.exe in the hadoop binaries"
This exe does not exist at that path, nor did I get it from the downloads you specified.
Can you please help?
Hi Prabha, thanks, it worked. Before going ahead, I was trying some basic commands as below and getting an exception. Is my Hadoop not installed properly?
C:\hadoop-2.3.0>hadoop fs -ls
ls: `.': No such file or directory
C:\hadoop-2.3.0>hadoop hdfs dfs -ls
Exception in thread "main" java.lang.NoClassDefFoundError: hdfs
Caused by: java.lang.ClassNotFoundException: hdfs
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: hdfs. Program will exit.
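The NoClassDefFoundError here comes from mixing two command forms: in `hadoop hdfs dfs -ls`, the launcher treats `hdfs` as a Java main class name. The subcommand is either `hadoop fs` or `hdfs dfs`, never both. Also, `hadoop fs -ls` with no path lists your HDFS home directory, which does not exist until you create it, hence the `` `.': No such file or directory `` message. A sketch of the corrected commands (the home-directory path is an assumption based on your Windows user name):

```shell
:: Correct forms: either "hadoop fs ..." or "hdfs dfs ...", not "hadoop hdfs dfs ..."
hadoop fs -ls /
hdfs dfs -ls /

:: "hadoop fs -ls" with no path only works after your HDFS home directory exists:
hadoop fs -mkdir -p /user/%USERNAME%
hadoop fs -ls
```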
Every time I use hadoop namenode -format, I get this error:
Exception in thread "main" java.lang.NoClassDefFoundError: KUMAR
Caused by: java.lang.ClassNotFoundException: KUMAR
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: KUMAR. Program will exit.
Help me resolve this error. I'm using Windows 8.1 and there is no problem with the environment variables.
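When the launcher reports a stray word like KUMAR as the "main class", it is often a token from an unquoted path containing a space — for example a JAVA_HOME or user directory such as C:\Users\RAJ KUMAR\java (a hypothetical path, used here only for illustration). The batch script splits the path at the space and hands the second half to the JVM as the class name. A sketch of the usual workaround, pointing JAVA_HOME at a space-free location:

```shell
:: Install or copy the JDK to a path without spaces, then point JAVA_HOME at it
:: (the exact JDK folder name below is an example; use your actual version):
set JAVA_HOME=C:\Java\jdk1.6.0_45

:: Alternatively, use the 8.3 short name of a path that contains spaces,
:: e.g. C:\PROGRA~1 instead of "C:\Program Files".
hadoop namenode -format
```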
While running the command c:\hadoop-2.3.0\sbin>hadoop jar c:\Hwork\Recipe.jar Recipe /in /out
I am getting a DataStreamer exception, and on localhost I am unable to find "/out" from the browser documents...
Are you sure these services are running in the background?
1. namenode
2. datanode
3. resource manager
4. node manager
Check this in a command window by typing "jps".
If any of them is missing, find out which one isn't working and get that service running in the background.
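As a rough illustration of that check: on a healthy single-node setup, jps should list all four daemon processes. The process IDs below are made up and will differ on your machine; only the daemon names matter.

```shell
:: Run jps and look for the four Hadoop daemons in its output, e.g.:
jps
::   4260 NameNode
::   5132 DataNode
::   3980 ResourceManager
::   2244 NodeManager
::   6072 Jps
:: Any daemon missing from the list needs to be (re)started from sbin.
```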
While installing Hadoop on Windows 8.1 Pro, just as I was ready to run MapReduce, I got this error message. It is unable to make the directory; the further errors are mentioned below.
-mkdir: java.net.URISyntaxException: Illegal character in hostname at index 24: hdfs://localhost127.0.0.1:9000
Usage: hadoop fs [generic options] -mkdir [-p] <path> ...
For this command:
hadoop fs -copyFromLocal c:\Hwork\recipeitems-latest.json /in
I'm getting something like this:
-copyFromLocal: java.net.URISyntaxException: Illegal character in hostname at index 24: hdfs://localhost127.0.0.1:9000
Usage: hadoop fs [generic options] -copyFromLocal [-f] [-p] <localsrc> ... <dst>
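The malformed authority localhost127.0.0.1 in hdfs://localhost127.0.0.1:9000 suggests the filesystem URI in core-site.xml was edited so that a hostname and an IP address ran together. The value must name exactly one host. A likely fix, assuming the Hadoop 2.x property name fs.defaultFS is the one set in your config:

```xml
<!-- core-site.xml: use ONE host in the URI, either localhost or 127.0.0.1, not both -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```

After correcting the value, restart DFS so the new URI takes effect.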