[Solved] SparkSQL Error: org.apache.hadoop.security.HadoopKerberosName.setRuleMechanism

Background

When I connect to the Spark service from IntelliJ IDEA on my local machine, even though I copied hive-site.xml into the project and added the corresponding dependencies, the job still fails with an error referencing: org.apache.hadoop.security.HadoopKerberosName.setRuleMechanism

The project's Maven dependencies are as follows:

  <dependencies>
      <dependency>
          <groupId>org.apache.spark</groupId>
          <artifactId>spark-core_2.12</artifactId>
          <version>3.1.2</version>
      </dependency>
      <dependency>
          <groupId>org.apache.spark</groupId>
          <artifactId>spark-sql_2.12</artifactId>
          <version>3.1.2</version>
      </dependency>

      <dependency>
          <groupId>mysql</groupId>
          <artifactId>mysql-connector-java</artifactId>
          <version>8.0.25</version>
      </dependency>

      <dependency>
          <groupId>org.apache.hive</groupId>
          <artifactId>hive-exec</artifactId>
          <version>3.1.2</version>
      </dependency>

      <dependency>
          <groupId>org.apache.spark</groupId>
          <artifactId>spark-hive_2.12</artifactId>
          <version>3.1.2</version>
      </dependency>

      <dependency>
          <groupId>org.apache.hadoop</groupId>
          <artifactId>hadoop-auth</artifactId>
          <version>3.2.0</version>
      </dependency>
  </dependencies>

Analysis

When accessing Hadoop remotely, the client needs authentication support. The missing method `HadoopKerberosName.setRuleMechanism` indicates that the `hadoop-auth` classes actually loaded at runtime are from an older release that predates this method (it appears around Hadoop 3.2), so a matching `hadoop-auth` dependency has to be provided explicitly.
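To confirm this diagnosis before touching the pom, you can probe the classpath with reflection. The sketch below is my own addition (class name `HadoopAuthCheck` and the status strings are illustrative, not from the original post); it assumes the method has the signature `setRuleMechanism(String)`, matching the `(Ljava/lang/String;)V` descriptor usually shown in the full `NoSuchMethodError` message:

```java
// Diagnostic sketch: check whether the hadoop-auth classes visible at runtime
// are new enough to contain HadoopKerberosName.setRuleMechanism(String).
public class HadoopAuthCheck {

    // Returns a short status string describing what was found on the classpath.
    static String checkSetRuleMechanism() {
        try {
            Class<?> c = Class.forName("org.apache.hadoop.security.HadoopKerberosName");
            // getMethod also finds public static methods inherited from KerberosName.
            c.getMethod("setRuleMechanism", String.class);
            return "ok";
        } catch (ClassNotFoundException e) {
            return "hadoop-auth missing";   // class not on the classpath at all
        } catch (NoSuchMethodException e) {
            return "hadoop-auth too old";   // a pre-3.2 jar shadows the newer one
        }
    }

    public static void main(String[] args) {
        System.out.println("hadoop-auth status: " + checkSetRuleMechanism());
    }
}
```

Run this once inside the same project before submitting the Spark job: "hadoop-auth too old" points to a version conflict, "hadoop-auth missing" to an absent dependency.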


Solution:

Add the following dependency to the Maven pom.xml:

      <dependency>
          <groupId>org.apache.hadoop</groupId>
          <artifactId>hadoop-auth</artifactId>
          <version>3.2.0</version>
      </dependency>
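If the error persists after adding the dependency, an older `hadoop-auth` may still be pulled in transitively by another artifact; `mvn dependency:tree` shows which one. A sketch of excluding the transitive copy, assuming (for illustration) that `hive-exec` is the artifact dragging it in:

```xml
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-exec</artifactId>
    <version>3.1.2</version>
    <exclusions>
        <!-- Keep only the explicitly declared hadoop-auth 3.2.0 on the classpath -->
        <exclusion>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-auth</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```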
