
[Solved] Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses

Problem Description:

An error is reported when using MapReduce to implement data deduplication:

Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses

Related errors may also appear: java.lang.NoClassDefFoundError and java.io.IOException: Cannot initialize Cluster.

Solution:

These problems are caused by incomplete MapReduce and Hadoop dependencies. Add the following to the <dependencies> section of the project's pom.xml:

<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-common</artifactId>
  <version>3.2.0</version>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-hdfs</artifactId>
  <version>3.2.0</version>
</dependency>
<dependency>
  <groupId>log4j</groupId>
  <artifactId>log4j</artifactId>
  <version>1.2.17</version>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>3.2.0</version>
</dependency>

<!-- mapreduce -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-mapreduce-client-core</artifactId>
  <version>3.2.0</version>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-mapreduce-client-common</artifactId>
  <version>3.2.0</version>
</dependency>

With these dependencies in place, the problem is resolved.
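
For reference, below is a minimal sketch of the kind of deduplication driver that hits this exception at submission time when the MapReduce client dependencies are missing. The class names and input/output paths are hypothetical; only the Hadoop MapReduce API calls are standard.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class DedupDriver {

    // Emit each input line as the key; duplicate lines collapse onto one key.
    public static class DedupMapper extends Mapper<Object, Text, Text, NullWritable> {
        @Override
        protected void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            context.write(value, NullWritable.get());
        }
    }

    // Write each distinct key once, discarding the duplicate values.
    public static class DedupReducer extends Reducer<Text, NullWritable, Text, NullWritable> {
        @Override
        protected void reduce(Text key, Iterable<NullWritable> values, Context context)
                throws IOException, InterruptedException {
            context.write(key, NullWritable.get());
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "dedup");
        job.setJarByClass(DedupDriver.class);
        job.setMapperClass(DedupMapper.class);
        job.setReducerClass(DedupReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(NullWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        // Without hadoop-mapreduce-client-common on the classpath, this call
        // fails with "Cannot initialize Cluster ..." during job submission.
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}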

[Solved] Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses.

1. The following error is reported when running the serialization test on the local cluster:

[INFO] 
[INFO] --- exec-maven-plugin:3.0.0:exec (default-cli) @ HadoopWritable ---
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" java.io.IOException: Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses.
    at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:116)
    at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:109)
    at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:102)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1540)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1536)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
    at org.apache.hadoop.mapreduce.Job.connect(Job.java:1536)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1564)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1588)
    at com.alex.FlowDriver.main(FlowDriver.java:45)
[ERROR] Command execution failed.
Command execution failed.


org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit value: 1)
    at org.apache.commons.exec.DefaultExecutor.executeInternal (DefaultExecutor.java:404)
    at org.apache.commons.exec.DefaultExecutor.execute (DefaultExecutor.java:166)
    at org.codehaus.mojo.exec.ExecMojo.executeCommandLine (ExecMojo.java:982)
    at org.codehaus.mojo.exec.ExecMojo.executeCommandLine (ExecMojo.java:929)
    at org.codehaus.mojo.exec.ExecMojo.execute (ExecMojo.java:457)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:137)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:208)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:56)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:954)
    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
    at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
    at org.codehaus.classworlds.Launcher.main (Launcher.java:47)

2. Cause: a missing JAR dependency.

Solution: add the hadoop-mapreduce-client-common package.

Add the dependency with Maven:

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-common</artifactId>
            <version>3.1.3</version>
            <scope>compile</scope>
        </dependency>
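
If the dependency is in place and the job is meant to run purely locally, a further check is to pin the framework explicitly. This is a minimal sketch, not part of the original fix: mapreduce.framework.name and fs.defaultFS are standard Hadoop configuration keys, but the values below are assumptions for a local test run.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class LocalSubmitCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Assumed values for a purely local run: use the local job runner
        // and the local file system instead of a remote cluster.
        conf.set("mapreduce.framework.name", "local");
        conf.set("fs.defaultFS", "file:///");

        // Creating the Job succeeds either way; the "Cannot initialize Cluster"
        // error only surfaces later, when the job is actually submitted.
        Job job = Job.getInstance(conf, "local-submit-check");
        System.out.println("mapreduce.framework.name = "
                + job.getConfiguration().get("mapreduce.framework.name"));
    }
}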
