
Hadoop Connect HDFS Error: could only be replicated to 0 nodes instead of minReplication (=1).

When connecting to HDFS with the client code below, the write fails with the error: could only be replicated to 0 nodes instead of minReplication (=1). There are 3 datanode(s) running and 3 node(s) are excluded in this operation.

    import java.io.IOException;
    import java.net.URI;
    import java.net.URISyntaxException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.junit.Before;
    import org.junit.Test;

    public class HdfsWriteTest {

        FileSystem fileSystem = null;

        // Initialize the FileSystem before each @Test runs
        @Before
        public void init() throws IOException, InterruptedException, URISyntaxException {
            fileSystem = FileSystem.get(new URI("hdfs://master:8020"), new Configuration(), "root");
        }

        @Test
        public void write() throws IOException {
            // Create the file (overwriting if present) and write a line of text;
            // let any IOException propagate so a failure is visible in the test output
            FSDataOutputStream fdos = fileSystem.create(new Path("/testing/file01.txt"), true);
            fdos.writeBytes("Test text for the txt file");
            fdos.flush();
            fdos.close();
            fileSystem.close();
        }
    }

The full stack trace is as follows:

org.apache.hadoop.ipc.RemoteException(java.io.IOException): File  could only be replicated to 0 nodes instead of minReplication (=1).  There are 3 datanode(s) running and 3 node(s) are excluded in this operation.
	at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
	at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)

	at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1481)
	at org.apache.hadoop.ipc.Client.call(Client.java:1427)
	at org.apache.hadoop.ipc.Client.call(Client.java:1337)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:227)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
	at com.sun.proxy.$Proxy12.addBlock(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:440)
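
The trace shows that the NameNode sees three live DataNodes but excludes all three when choosing targets for the new block; this typically means the client reached the NameNode over RPC but could not open a data-transfer connection to any DataNode. A quick reachability test from the client machine is a plain socket probe. The sketch below uses hypothetical hostnames (slave1, slave2, slave3) and the default data-transfer port (9866 on Hadoop 3.x, 50010 on 2.x); replace both with your cluster's actual values:

    import java.io.IOException;
    import java.net.InetSocketAddress;
    import java.net.Socket;

    public class DataNodeReachability {
        public static void main(String[] args) {
            // Hypothetical DataNode hostnames; substitute your own
            String[] dataNodes = {"slave1", "slave2", "slave3"};
            // Default dfs.datanode.address port: 9866 (Hadoop 3.x), 50010 (2.x)
            int dataTransferPort = 9866;
            for (String host : dataNodes) {
                try (Socket socket = new Socket()) {
                    // Fail fast if the port cannot be reached from this machine
                    socket.connect(new InetSocketAddress(host, dataTransferPort), 3000);
                    System.out.println(host + ":" + dataTransferPort + " reachable");
                } catch (IOException e) {
                    System.out.println(host + ":" + dataTransferPort + " NOT reachable: " + e);
                }
            }
        }
    }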

Finally, the root cause was found: the machine running the test could reach the NameNode (hdfs://master:8020) but could not connect directly to the Hadoop DataNodes, so every node the NameNode offered as a write target ended up excluded. After the network environment was changed so that the DataNodes were directly reachable from the client, the write succeeded.
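
A related cause is a cluster whose DataNodes register with the NameNode under internal IP addresses that the client cannot route to (common behind NAT or in cloud setups). If the DataNode hostnames do resolve from the client machine (for example via /etc/hosts entries), the HDFS client can be told to dial DataNodes by hostname instead of by the reported IP. A minimal sketch of that configuration, reusing the init() setup above:

    Configuration conf = new Configuration();
    // Connect to DataNodes by hostname rather than the (possibly
    // unroutable) IP address the NameNode reports back to the client
    conf.setBoolean("dfs.client.use.datanode.hostname", true);
    fileSystem = FileSystem.get(new URI("hdfs://master:8020"), conf, "root");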