Error when the root user executes a Hadoop command:
```
[root@vmocdp125 conf]# hadoop fs -ls /user/
[INFO] 17:50:42 main [RetryInvocationHandler]Exception while invoking getFileInfo of class ClientNamenodeProtocolTranslatorPB over vmocdp127.test.com/172.16.145.127:8020. Trying to fail over immediately.
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby
    at org.apache.hadoop.hdfs.server.namenode.ha.StandbyState.checkOperation(StandbyState.java:87)
    at org.apache.hadoop.hdfs.server.namenode.NameNode$NameNodeHAContext.checkOperation(NameNode.java:1932)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkOperation(FSNamesystem.java:1313)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3861)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1076)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:843)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2151)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2147)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2145)
    at org.apache.hadoop.ipc.Client.call(Client.java:1427)
    at org.apache.hadoop.ipc.Client.call(Client.java:1358)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:252)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
    at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2116)
    at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1315)
    at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1311)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1311)
    at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:57)
    at org.apache.hadoop.fs.Globber.glob(Globber.java:252)
    at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1655)
    at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:326)
    at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:235)
    at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:218)
    at org.apache.hadoop.fs.shell.Command.processRawArguments(Command.java:201)
    at org.apache.hadoop.fs.shell.Command.run(Command.java:165)
    at org.apache.hadoop.fs.FsShell.run(FsShell.java:287)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
    at org.apache.hadoop.fs.FsShell.main(FsShell.java:340)
Found 6 items
drwxrwx---   - ambari-qa hdfs          0 2016-09-14 19:48 /user/ambari-qa
drwxr-xr-x   - hcat      hdfs          0 2016-09-14 20:09 /user/hcat
drwx------   - hdfs      hdfs          0 2016-09-20 18:51 /user/hdfs
drwxr-xr-x   - hive      hdfs          0 2016-09-14 20:11 /user/hive
drwxr-xr-x   - hdfs      hdfs          0 2016-09-20 18:49 /user/ocetl
drwxrwxr-x   - spark     hdfs          0 2016-09-14 20:00 /user/spark
```
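Note that the listing is still printed at the end: the client first contacted the standby NameNode (vmocdp127), logged the StandbyException at INFO level, and then failed over to the active NameNode, where the request succeeded. You can confirm which NameNode is currently active from the command line. A minimal sketch, assuming an HA nameservice named "mycluster" with NameNode IDs "nn1" and "nn2" (substitute the actual values of dfs.nameservices and dfs.ha.namenodes.* from your hdfs-site.xml):

```
# Look up the NameNode IDs configured for the nameservice
hdfs getconf -confKey dfs.ha.namenodes.mycluster   # e.g. prints: nn1,nn2

# Ask each NameNode for its HA state
hdfs haadmin -getServiceState nn1                  # prints: active or standby
hdfs haadmin -getServiceState nn2
```

If the client simply happens to try the standby NameNode first, this INFO message is harmless and the failover is automatic.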
Switch to the hdfs user (the HDFS superuser) to execute the command: sudo -u hdfs hadoop fs -ls /user
```
[root@vmocdp125 conf]# sudo -u hdfs hadoop fs -ls /user
Found 6 items
drwxrwx---   - ambari-qa hdfs          0 2016-09-14 19:48 /user/ambari-qa
drwxr-xr-x   - hcat      hdfs          0 2016-09-14 20:09 /user/hcat
drwx------   - hdfs      hdfs          0 2016-09-20 18:51 /user/hdfs
drwxr-xr-x   - hive      hdfs          0 2016-09-14 20:11 /user/hive
drwxr-xr-x   - hdfs      hdfs          0 2016-09-20 18:49 /user/ocetl
drwxrwxr-x   - spark     hdfs          0 2016-09-14 20:00 /user/spark
```
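If you regularly run HDFS commands as root, two common alternatives to sudo are worth knowing: on clusters with simple authentication (no Kerberos), the HADOOP_USER_NAME environment variable tells the client which user to act as, and creating an HDFS home directory for root avoids permission errors on paths under /user/root. A sketch, assuming simple authentication and that /user/root does not exist yet:

```
# Alternative 1: act as hdfs for a single command.
# Works only with simple authentication; under Kerberos the
# authenticated principal is used and this variable is ignored.
HADOOP_USER_NAME=hdfs hadoop fs -ls /user

# Alternative 2: give root its own HDFS home directory so commands
# run as root have somewhere to read and write by default.
sudo -u hdfs hadoop fs -mkdir -p /user/root
sudo -u hdfs hadoop fs -chown root:root /user/root
```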
Similar Posts:
- Hadoop Connect hdfs Error: could only be replicated to 0 nodes instead of minReplication (=1).
- [Solved] HDFS Failed to Start namenode Error: Premature EOF from inputStream; Failed to load FSImage file, see error(s) above for more info
- JAVA api Access HDFS Error: Permission denied in production environment
- [Solved] IDEA Remote Operate hdfs Hadoop Error: Caused by: java.net.ConnectException: Connection refused: no further information
- hdfs dfs -rm -r causes GC overhead limit exceeded
- [Solved] Hbase Exception: java.io.EOFException: Premature EOF: no length prefix available
- [Solved] Spark-HBase Error: java.lang.NoClassDefFoundError: org/htrace/Trace
- Namenode Initialize Error: java.lang.IllegalArgumentException: URI has an authority component
- [Solved] Call to localhost/127.0.0.1:9000 failed on connection exception:java.net.ConnectException
- [Solved] Hadoop Error: Input path does not exist: hdfs://Master:9000/user/hadoop/input