C:\hwork>hadoop jar c:\hwork\Recipe.jar Recipe /in /out
14/11/04 20:29:56 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/11/04 20:29:56 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
14/11/04 20:29:57 WARN security.UserGroupInformation: PriviledgedActionException as:acute (auth:SIMPLE) cause:java.net.ConnectException: Call From acute-PC/192.168.1.4 to 127.0.0.1:9000 failed on connection exception: java.net.ConnectException: Connection refused: no further information; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Exception in thread "main" java.net.ConnectException: Call From acute-PC/192.168.1.4 to 127.0.0.1:9000 failed on connection exception: java.net.ConnectException: Connection refused: no further information; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
        at org.apache.hadoop.ipc.Client.call(Client.java:1410)
        at org.apache.hadoop.ipc.Client.call(Client.java:1359)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
        at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
        at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:671)
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1746)
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1112)
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1108)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1108)
        at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1399)
        at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:145)
        at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:458)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:343)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
        at Recipe.main(Recipe.java:84)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.net.ConnectException: Connection refused: no further information
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
        at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
        at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:61)
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:69)
        at org.apache.hadoop.ipc.Client$Connection.access$2700(Client.java:367)
        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1458)
        at org.apache.hadoop.ipc.Client.call(Client.java:1377)
        ... 33 more
I am also getting an error while creating the input directory. Running C:\hwork>hadoop fs -mkdir /in prints "WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable" and then fails with "mkdir: failed on connection exception: java.net.ConnectException: Connection refused: no further information".
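For what it's worth, "Connection refused" on 127.0.0.1:9000 usually just means no NameNode is listening on that port yet, so both the job submission and the `-mkdir` fail the same way. A minimal check sequence, assuming Hadoop is installed in C:\hadoop-2.3.0 as in the article (the paths are illustrative, not taken from this thread):

```shell
:: "Connection refused" almost always means the HDFS daemons are not up.
cd C:\hadoop-2.3.0

:: 1. List running Java processes; a healthy setup shows NameNode and DataNode.
jps

:: 2. If they are missing, start HDFS and check again.
sbin\start-dfs.cmd
jps

:: 3. Only once the NameNode is up, create the input directory and submit the job.
bin\hadoop fs -mkdir /in
bin\hadoop jar c:\hwork\Recipe.jar Recipe /in /out
```

The NativeCodeLoader warning is harmless by itself; it is the ConnectException that stops the command.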
I built it for a 64-bit OS and JDK; sorry for the inconvenience.
Please follow step 1 in the article to build it yourself for 32-bit (x86).
Hello sir,
I completed all the steps successfully, but when I run Recipe.jar it proceeds correctly for a while and then throws the exception below, and the program never completes.
14/10/26 03:58:23 INFO mapreduce.Job: Job job_1414275107946_0002 running in uber mode : false
14/10/26 03:58:23 INFO mapreduce.Job: map 0% reduce 0%
14/10/26 03:58:23 INFO mapreduce.Job: Job job_1414275107946_0002 failed with state FAILED due to: Application application_1414275107946_0002 failed 2 times due to AM Container for appattempt_1414275107946_0002_000002 exited with exitCode: 1 due to: Exception from container-launch: org.apache.hadoop.util.Shell$ExitCodeException:
org.apache.hadoop.util.Shell$ExitCodeException:
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:505)
        at org.apache.hadoop.util.Shell.run(Shell.java:418)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:650)
        at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:283)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:79)
        at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
        at java.util.concurrent.FutureTask.run(FutureTask.java:138)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
        at java.lang.Thread.run(Thread.java:662)
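A hedged note on debugging this: exit code 1 only tells you the container launch script failed; the container's own log usually holds the actual reason. A possible way to dig deeper, assuming log aggregation is enabled (the application id below is the one from the log; everything else is illustrative):

```shell
:: Pull the Application Master container logs for the failed application;
:: its stderr normally shows why the launch exited with code 1.
yarn logs -applicationId application_1414275107946_0002

:: On Windows, a frequently reported cause of this exact failure is a
:: JAVA_HOME containing spaces (e.g. C:\Program Files\Java\...). Verify:
echo %JAVA_HOME%
```

If log aggregation is off, the same stderr file can instead be found under the NodeManager's local log directory for that container.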
Hi,
When I start the DFS, it shows the following error. Please help me.
java.io.IOException: NameNode is not formatted.
        at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:210)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFSImage(FSNamesystem.java:849)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:609)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:446)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:502)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:658)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:643)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1259)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1325)
15/01/20 19:48:48 INFO mortbay.log: Stopped SelectChannelConnector@127.0.0.1:50070
15/01/20 19:48:48 INFO impl.MetricsSystemImpl: Stopping NameNode metrics system...
15/01/20 19:48:48 INFO impl.MetricsSystemImpl: NameNode metrics system stopped.
15/01/20 19:48:48 INFO impl.MetricsSystemImpl: NameNode metrics system shutdown complete.
15/01/20 19:48:48 FATAL namenode.NameNode: Exception in namenode join
java.io.IOException: NameNode is not formatted.
        at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:210)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFSImage(FSNamesystem.java:849)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:609)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:446)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:502)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:658)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:643)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1259)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1325)
15/01/20 19:48:48 INFO util.ExitUtil: Exiting with status 1
15/01/20 19:48:48 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at jothi-PC/192.168.1.9
************************************************************/
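In case it helps: "NameNode is not formatted" means the NameNode's metadata directory was never initialized, so the daemon refuses to start. A minimal fix sketch, assuming Hadoop is installed in C:\hadoop-2.3.0 as in the article (note that formatting wipes any existing HDFS metadata, which is fine on a fresh install):

```shell
:: Stop anything that is running, then format the NameNode once.
cd C:\hadoop-2.3.0
sbin\stop-dfs.cmd

:: One-time initialization of the NameNode metadata directory.
:: WARNING: this erases any existing HDFS filesystem metadata.
bin\hdfs namenode -format

:: Start HDFS again; the NameNode should now come up cleanly.
sbin\start-dfs.cmd
```

Formatting is only needed once per installation (or after the metadata directory is deleted), not before every start.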
Hi Prabha,
While running the Recipe.jar file, I am getting a ClassNotFoundException, shown below. Please help.
14/10/18 16:11:55 INFO mapreduce.Job: Task Id : attempt_1413621124083_0001_m_000000_2, Status : FAILED
Error: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class Recipe$TokenizerMapper not found
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1882)
        at org.apache.hadoop.mapreduce.task.JobContextImpl.getMapperClass(JobContextImpl.java:186)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:722)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.lang.ClassNotFoundException: Class Recipe$TokenizerMapper not found
        at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1788)
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1880)
        ... 8 more
14/10/18 16:12:01 INFO mapreduce.Job: map 100% reduce 100%
14/10/18 16:12:06 INFO mapreduce.Job: Job job_1413621124083_0001 failed with state FAILED due to: Task failed task_1413621124083_0001_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces
Hi Praba,
I am still getting the same error while running the jar command:
Error: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class Recipe$TokenizerMapper not found
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1882)
I cross-checked everything and even reinstalled the 64-bit version of JDK 6, but the error is the same.
The following three gson jar files are in C:\hadoop-2.3.0\share\hadoop\common\lib:
1. gson-2.2.4.jar
2. gson-2.2.4-javadoc.jar
3. gson-2.2.4-sources.jar
while hadoop-common-2.3.0.jar is in C:\hadoop-2.3.0\share\hadoop\common.
Could you please help?
Copy all the files from c:\hwork (it includes all the class files that are not being found, i.e. Recipe$TokenizerMapper.class, Recipe$IntSumReducer.class, etc.) to C:\hadoop-2.3.0\etc\hadoop.
Do this just before running your command "hadoop jar c:\hwork\Recipe.jar Recipe /in /out".
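A sketch of an alternative check before resorting to copying files around: a ClassNotFoundException for Recipe$TokenizerMapper most often means the jar itself is missing the nested classes. Assuming the class and path names used throughout this article, you can inspect and rebuild the jar like this:

```shell
:: List the jar's contents; the nested classes must be present, e.g.
:: Recipe.class, Recipe$TokenizerMapper.class, Recipe$IntSumReducer.class.
jar tf c:\hwork\Recipe.jar

:: If they are missing, rebuild the jar from the directory that holds ALL
:: of the compiled .class files (javac emits one file per nested class).
cd c:\hwork
jar cf Recipe.jar *.class
```

It is also worth confirming that the driver calls job.setJarByClass(Recipe.class), since that is how MapReduce learns which jar to ship to the task containers.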
Hi Prabha,
Thanks for sharing this article; I am really grateful to you.
I installed everything following your step-by-step guide. However, I am receiving the error below while starting the DFS:
"Could not locate executable C:\hadoop-2.3.0\bin\winutils.exe in the hadoop binaries"
This exe does not exist at that path, and I could not find it in the downloads you specified.
Can you please help?
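A hedged note on this one: winutils.exe is produced by the native Windows build (step 1 of the article) rather than shipped in the Apache binary tarball, which is why it is absent from the downloads. Assuming Hadoop lives in C:\hadoop-2.3.0 (the build-output path below is hypothetical):

```shell
:: Copy the winutils.exe produced by your own build into the bin directory
:: that Hadoop searches (%HADOOP_HOME%\bin).
copy C:\path\to\your\build\output\winutils.exe C:\hadoop-2.3.0\bin\winutils.exe

:: Hadoop resolves the path via HADOOP_HOME, so make sure it is set.
set HADOOP_HOME=C:\hadoop-2.3.0
```

Without winutils.exe in %HADOOP_HOME%\bin, most HDFS and YARN commands on Windows will fail with exactly this message.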