HDFS Error: org.apache.hadoop.security.AccessControlException: Permission denied
```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class TestHDFS {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://master:8020");
        FileSystem fs = FileSystem.get(conf);
        // If the directory already exists, mkdirs leaves it in place and returns true
        boolean success = fs.mkdirs(new Path("/xiaol"));
        System.out.println(success);
    }
}
```
Exception in thread "main" org.apache.hadoop.security.AccessControlException: Permission denied: user=xiaol, access=WRITE, inode="/xiaol":root:supergroup:drwxr-xr-x
Solution
1. Modify the configuration file
In the HDFS configuration file (hdfs-site.xml), set dfs.permissions.enabled to false. This disables permission checking for the whole cluster, so it is only appropriate for test environments.
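A minimal sketch of the property as it would appear in hdfs-site.xml (restart the NameNode after changing it):

```xml
<!-- hdfs-site.xml: disables HDFS permission checking entirely -->
<property>
  <name>dfs.permissions.enabled</name>
  <value>false</value>
</property>
```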
2. Change HDFS root permission
```
hadoop fs -chmod 777 /
```
3. Add access user
Hadoop performs permission authentication when accessing HDFS. The client user name is resolved in the following order:
1. Read the HADOOP_USER_NAME environment variable; if it is not empty, use it as the user name. Otherwise:
2. Read the HADOOP_USER_NAME Java system property; if that is also empty:
3. Get the user name from the JAAS Subject's com.sun.security.auth.NTUserPrincipal or com.sun.security.auth.UnixPrincipal instance.
4. If all of the above fail, a LoginException("Can't find user name") is thrown.
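The lookup order above can be sketched in plain Java. This is a simplified illustration of the resolution logic, not Hadoop's actual implementation; in step 3 Hadoop queries the JAAS Subject for an NTUserPrincipal/UnixPrincipal, which is approximated here with the `user.name` system property:

```java
public class UserNameLookup {
    // Simplified sketch of Hadoop's user-name resolution order
    static String resolveUserName() {
        // 1. Environment variable HADOOP_USER_NAME
        String user = System.getenv("HADOOP_USER_NAME");
        if (user != null && !user.isEmpty()) return user;
        // 2. Java system property HADOOP_USER_NAME
        user = System.getProperty("HADOOP_USER_NAME");
        if (user != null && !user.isEmpty()) return user;
        // 3. Hadoop would next query the JAAS Subject for an
        //    NTUserPrincipal/UnixPrincipal; the OS user stands in here
        user = System.getProperty("user.name");
        if (user != null && !user.isEmpty()) return user;
        // 4. Nothing found: give up, mirroring LoginException("Can't find user name")
        throw new RuntimeException("Can't find user name");
    }

    public static void main(String[] args) {
        // Setting the system property makes step 2 succeed
        System.setProperty("HADOOP_USER_NAME", "root");
        System.out.println(resolveUserName());
    }
}
```

This is why the fix below works: setting HADOOP_USER_NAME before the first FileSystem.get call makes Hadoop authenticate as that user.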
```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class TestHDFS {
    public static void main(String[] args) throws Exception {
        // Tell Hadoop to authenticate as "root" via the Java system property
        System.setProperty("HADOOP_USER_NAME", "root");
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://master:8020");
        FileSystem fs = FileSystem.get(conf);
        // If the directory already exists, mkdirs leaves it in place and returns true
        boolean success = fs.mkdirs(new Path("/xiaol"));
        System.out.println(success);
    }
}
```
This is effectively a security hole: any client that sets HADOOP_USER_NAME to root (or to any other privileged user) can perform arbitrary operations on the corresponding directories, so this approach is still quite dangerous.