
ERROR Shell: Failed to locate the winutils binary in the hadoop binary path


15/06/11 15:35:50 ERROR Shell: Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
    at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:356)
    at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:371)
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:364)
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:80)
    at org.apache.hadoop.security.SecurityUtil.getAuthenticationMethod(SecurityUtil.java:611)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:272)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:260)
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:790)
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:760)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:633)
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2001)
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2001)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2001)
    at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:207)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:218)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:163)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:269)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:272)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:154)
    at SparkFromHbase$.main(SparkFromHbase.scala:15)
    at SparkFromHbase.main(SparkFromHbase.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)

The problem is obviously HADOOP_HOME: if HADOOP_HOME is empty, the fullExeName resolved by Hadoop inevitably becomes null\bin\winutils.exe. The solution is simple. Configure the HADOOP_HOME environment variable, or, if you don't want to restart the computer, add this to your program:

System.setProperty("hadoop.home.dir", "E:\\Program Files\\hadoop-2.7.0");

Note: E:\\Program Files\\hadoop-2.7.0 is the path where Hadoop is extracted on my machine; replace it with your own.
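
The property has to be set before the SparkContext is constructed, because the winutils lookup happens when Hadoop's Shell class is initialized during SparkContext startup, as the stack trace above shows. A minimal sketch of that ordering (the object name and path here are illustrative, not from the original project):

import org.apache.spark.{SparkConf, SparkContext}

object WinutilsFixExample {
  def main(args: Array[String]): Unit = {
    // Must run before the SparkContext is created, otherwise Hadoop's Shell
    // class has already tried (and failed) to locate winutils.exe
    System.setProperty("hadoop.home.dir", "E:\\Program Files\\hadoop-2.7.0")

    val conf = new SparkConf().setAppName("WinutilsFixExample").setMaster("local[*]")
    val sc = new SparkContext(conf)
    // ... your job here ...
    sc.stop()
  }
}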

Later, you may still get the same error, and at that point you may want to blame me. But if you go into your hadoop-x.x.x/bin directory, you will find that there is no winutils.exe there at all.

So let me tell you: you can download it from GitHub, at an address everybody knows:

Address: https://github.com/srccodes/hadoop-common-2.2.0-bin

Don't worry about its version; I use it with the newer hadoop-2.7.0 without any problem. After downloading, copy winutils.exe into your hadoop-x.x.x/bin directory.
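
As a quick sanity check before starting Spark again, you can verify that the binary sits where Hadoop will look for it (a small sketch; the install path is my assumption, adjust it to yours):

import java.io.File

// Hadoop resolves the binary as %HADOOP_HOME%\bin\winutils.exe
val winutils = new File("E:\\Program Files\\hadoop-2.7.0\\bin\\winutils.exe")
if (!winutils.exists())
  println(s"winutils.exe still missing at ${winutils.getAbsolutePath}")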

[Solved] ERROR Shell: Failed to locate the winutils binary in the hadoop binary path

 

When learning Spark on Windows, running a job reports an error that is mostly harmless but noisy: ERROR Shell: Failed to locate the winutils binary in the hadoop binary path

14/12/17 19:18:53 ERROR Shell: Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
    at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:318)
    at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:333)
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:326)
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:76)
    at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:93)
    at org.apache.hadoop.security.Groups.<init>(Groups.java:77)
    at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:240)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:255)
    at org.apache.hadoop.security.UserGroupInformation.setConfiguration(UserGroupInformation.java:283)
    at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:36)
    at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:109)
    at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:228)
    at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:555)
    at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:75)
    at com.hark.Test$.main(Test.scala:28)
    at com.hark.Test.main(Test.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
14/12/17 19:18:54 INFO AkkaUtils: Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@WIN7-20140319ZQ:60477/user/HeartbeatReceiver

This exception appears when running Spark from IDEA on Windows.

Just add the following at the beginning of your code:

import java.io.File

// Point hadoop.home.dir at the current working directory
val path = new File(".").getCanonicalPath
System.getProperties().put("hadoop.home.dir", path)
// Create an empty ./bin/winutils.exe so the winutils lookup succeeds
new File("./bin").mkdirs()
new File("./bin/winutils.exe").createNewFile()