Exception in thread "main" 0: No such file or directory
at org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmod(NativeIO.java:232)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:627)
at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:468)
at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:598)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:179)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:301)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:389)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Unknown Source)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
at Recipe.main(Recipe.java:85)
Main class code (Java):
public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    /*
    String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();
    for (String string : otherArgs) {
        System.out.println(string);
    }
    if (otherArgs.length != 2) {
        System.err.println("Usage: recipe <in> <out>");
        System.exit(2);
    }
    */
    @SuppressWarnings("deprecation")
    Job job = new Job(conf, "Recipe");
    job.setJarByClass(Recipe.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    // FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
    // FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));
    FileInputFormat.addInputPath(job, new Path("hdfs://127.0.0.1:9000/user/hadoop/in/"));
    FileOutputFormat.setOutputPath(job, new Path("hdfs://127.0.0.1:9000/user/hadoop/out/"));
    // System.exit(job.waitForCompletion(true) ? 0 : 1);
    job.submit();
}
The paths are fine: hadoop fs -ls /user/hadoop/in returns the JSON file. Please help.
I am facing the same exception as below when I run the Java program through Eclipse:
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" 0: No such file or directory
at org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmod(NativeIO.java:232)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:627)
at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:468)
at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:598)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:179)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:301)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:389)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
at Recipe.main(Recipe.java:87)
From the command line I am able to run the program and get the output, but I get this error when running through Eclipse.
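When the same job succeeds from the shell but fails in the IDE with a NativeIO/chmod error, a common cause is that the JVM launched by Eclipse cannot see HADOOP_HOME and the Windows winutils.exe binary that the shell session can. A minimal sketch of that check, assuming a Unix-style shell such as Git Bash; check_hadoop_home is a hypothetical helper, not part of Hadoop:

```shell
# Sketch: report why the Hadoop native layer might be unusable in this JVM.
# The HADOOP_HOME/bin/winutils.exe layout is the stock Hadoop-on-Windows one.
check_hadoop_home() {
  home="$1"
  if [ -z "$home" ]; then
    echo "HADOOP_HOME not set"
  elif [ ! -f "$home/bin/winutils.exe" ]; then
    echo "winutils.exe missing"
  else
    echo "ok"
  fi
}

# Run against the live environment; in Eclipse the same value must be set in
# the Run Configuration's environment, not only in the system shell.
check_hadoop_home "$HADOOP_HOME"
```

If this prints anything other than "ok" inside the IDE's environment, setting HADOOP_HOME in the Eclipse run configuration is the usual fix.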
Hi Praba, thanks for sharing such an informative guide. I would also like to share my views:
With HDP for Windows and HDInsight Service there is unprecedented choice for Windows enterprises for their Hadoop deployments. HDP for Windows is the Microsoft recommended way to deploy Hadoop on Windows Server environments. For cloud-based deployments HDInsight Service is a 100% compatible and scalable environment for deploying your Hadoop based applications.
I would also suggest that newbies can visit https://intellipaat.com/ for more information.
Dear Prabha, while installing Hadoop I am seeing the following:
c:\hadoop-2.3.0\bin>hadoop namenode -format
The system cannot find the path specified.
Error: JAVA_HOME is incorrectly set.
Please update C:\hadoop-2.3.0\conf\hadoop-env.cmd
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
The system cannot find the path specified.
Error: JAVA_HOME is incorrectly set.
Please update C:\hadoop-2.3.0\conf\hadoop-env.cmd
'-Djava.net.preferIPv4Stack' is not recognized as an internal or external command,
operable program or batch file.
I have done everything you said above, and the Java path is also set correctly. Please help in this regard.
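For anyone hitting this: "JAVA_HOME is incorrectly set" on Windows is very often a JAVA_HOME that contains spaces (e.g. under C:\Program Files), which hadoop-env.cmd cannot handle; the usual workaround is the 8.3 short path such as C:\Progra~1. (Note that in Hadoop 2.x the file normally lives under etc\hadoop, even though the message says conf.) A sketch of the problematic-path check, assuming a Unix-style shell; java_home_ok and the JDK paths are illustrative:

```shell
# Sketch: flag a JAVA_HOME value that will break hadoop-env.cmd because it
# contains spaces; the 8.3 short form of the same directory is the workaround.
java_home_ok() {
  case "$1" in
    *" "*) echo "contains spaces: use the 8.3 short path" ;;
    *)     echo "ok" ;;
  esac
}

java_home_ok 'C:\Program Files\Java\jdk1.6.0_45'   # a path like this breaks the scripts
java_home_ok 'C:\Progra~1\Java\jdk1.6.0_45'        # the short form of the same path works
```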
C:\hadoop-2.3.0\sbin>hadoop fs -copyFromLocal C:\hwork\recipeitems-latest.json /in
14/11/06 15:31:01 WARN hdfs.DFSClient: DataStreamer Exception
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /in/recipeitems-latest.json._COPYING_ could only be replicated to 0 nodes instead of minReplication (=1). There are 0 datanode(s) running and no node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget(BlockManager.java:1406)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2596)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:563)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:407)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1962)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1958)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1956)
at org.apache.hadoop.ipc.Client.call(Client.java:1406)
at org.apache.hadoop.ipc.Client.call(Client.java:1359)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
at com.sun.proxy.$Proxy7.addBlock(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy7.addBlock(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:348)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1264)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1112)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:522)
copyFromLocal: File /in/recipeitems-latest.json._COPYING_ could only be replicated to 0 nodes instead of minReplication (=1). There are 0 datanode(s) running and no node(s) are excluded in this operation.
It's showing that the DataNode isn't running. Please follow these steps:
1. Run jps and confirm all four daemons are listed:
   DataNode
   NameNode
   ResourceManager
   NodeManager
2. If not, format the Hadoop NameNode:
   > hadoop namenode -format
3. Check your configuration files against https://github.com/prabaprakash/Hadoop-2.3-Config
4. Are you using Windows 7/8/10 64-bit with JDK 6?
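Step 1 above can also be scripted. A sketch, assuming a Unix-style shell such as Git Bash with jps on the PATH; check_daemon is a hypothetical helper, not a Hadoop tool:

```shell
# Sketch: decide whether a given Hadoop daemon appears in jps-style output.
check_daemon() {
  ps_output="$1"
  name="$2"
  echo "$ps_output" | grep -q "$name" && echo "$name running" || echo "$name missing"
}

# Feed it the live process list; if DataNode is missing, re-format the
# NameNode and restart HDFS before copying files in.
check_daemon "$(jps 2>/dev/null)" DataNode
```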
Can you please help with what the problem is here? Thanks a lot.
I am running start-yarn from the sbin folder.
I am running on Windows XP. Does the link you posted work on Windows XP? Please answer.
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:570)
at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:977)
at org.apache.hadoop.util.DiskChecker.checkAccessByFileMethods(DiskChecker.java:173)
at org.apache.hadoop.util.DiskChecker.checkDirAccess(DiskChecker.java:160)
at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:94)
at org.apache.hadoop.yarn.server.nodemanager.DirectoryCollection.checkDirs(DirectoryCollection.java:181)
at org.apache.hadoop.yarn.server.nodemanager.LocalDirsHandlerService.checkDirs(LocalDirsHandlerService.java:282)
at org.apache.hadoop.yarn.server.nodemanager.LocalDirsHandlerService.serviceInit(LocalDirsHandlerService.java:158)
Are you using the Windows XP 64-bit version with 64-bit JDK 6? I built it for x64 only. If you are on x64, follow one of these:
1. Download yarn.cmd and replace C:\Hadoop-2.3.0\bin\yarn.cmd with it:
https://raw.githubusercontent.com/prabaprakash/Hadoop-2.3-Config/master/bin/yarn.cmd
or else
2. Open C:\Hadoop-2.3.0\bin\yarn.cmd in Notepad++, choose "Edit -> EOL Conversion -> Windows Format" from the menu bar, and press Ctrl+S to save.
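If Notepad++ is not available, the same EOL conversion in step 2 can be done from a Unix-style shell such as Git Bash (a sketch; demonstrated on a scratch file rather than the real C:\Hadoop-2.3.0\bin\yarn.cmd, and assuming GNU sed):

```shell
# Sketch: convert LF-only line endings to the CRLF that cmd.exe expects,
# which is exactly what Notepad++'s "EOL Conversion -> Windows Format" does.
to_crlf() {
  sed 's/$/\r/' "$1" > "$1.tmp" && mv "$1.tmp" "$1"   # append CR before each LF
}

printf 'echo hello\necho world\n' > demo.cmd          # stand-in file with Unix endings
to_crlf demo.cmd
od -c demo.cmd | grep -q '\\r' && echo "CRLF endings present"
rm -f demo.cmd
```

Note this sketch blindly appends a CR to every line, so it should only be run on a file that still has Unix endings.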
I am using 32-bit, as my machine is 32-bit. I was able to build for 32-bit from the Hadoop source code by changing some settings in the native code (there are two folders of native code in the Hadoop source; the downloaded source is based on 64-bit). Please help further with this. Thanks for your kind help.
C:\hwork>hadoop jar c:\hwork\Recipe.jar Recipe /in /out
14/11/04 20:29:56 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/11/04 20:29:56 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
14/11/04 20:29:57 WARN security.UserGroupInformation: PriviledgedActionException as:acute (auth:SIMPLE) cause:java.net.ConnectException: Call From acute-PC/192.168.1.4 to 127.0.0.1:9000 failed on connection exception: java.net.ConnectException: Connection refused: no further information; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Exception in thread "main" java.net.ConnectException: Call From acute-PC/192.168.1.4 to 127.0.0.1:9000 failed on connection exception: java.net.ConnectException: Connection refused: no further information; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
at org.apache.hadoop.ipc.Client.call(Client.java:1410)
at org.apache.hadoop.ipc.Client.call(Client.java:1359)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:671)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1746)
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1112)
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1108)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1108)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1399)
at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:145)
at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:458)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:343)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
at Recipe.main(Recipe.java:84)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.net.ConnectException: Connection refused: no further information
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:61)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:69)
at org.apache.hadoop.ipc.Client$Connection.access$2700(Client.java:367)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1458)
at org.apache.hadoop.ipc.Client.call(Client.java:1377)
... 33 more
I also get an error while copying: c:\hwork>hadoop fs -mkdir /in prints "WARN util.NativeCodeLoader: unable to load native-hadoop library for your platform... using builtin-java classes where applicable" and then "mkdir: failed on connection exception: java.net.ConnectException: Connection refused: no further information;".
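Both failures point at the same thing: nothing is listening on 127.0.0.1:9000, i.e. the NameNode is not up. A quick pre-flight check before re-running the job (a sketch; assumes a Unix-style shell and parses netstat-style text; port_listening is a hypothetical helper):

```shell
# Sketch: scan netstat-style output for a listener on the given port.
port_listening() {
  netstat_out="$1"
  port="$2"
  echo "$netstat_out" | grep -q ":$port .*LISTEN" && echo open || echo closed
}

# Live check; if this prints "closed", start HDFS (e.g. start-dfs.cmd) and
# confirm fs.defaultFS really points at hdfs://127.0.0.1:9000 first.
port_listening "$(netstat -an 2>/dev/null)" 9000
```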
Bro, I built it for a 64-bit OS and JDK; sorry for the inconvenience.
Follow step 1 in the article to build it yourself for x32.