[Solved] ERROR Shell: Failed to locate the winutils binary in the hadoop binary path
15/06/11 15:35:50 ERROR Shell: Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:356)
at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:371)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:364)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:80)
at org.apache.hadoop.security.SecurityUtil.getAuthenticationMethod(SecurityUtil.java:611)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:272)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:260)
at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:790)
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:760)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:633)
at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2001)
at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2001)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2001)
at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:207)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:218)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:163)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:269)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:272)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:154)
at SparkFromHbase$.main(SparkFromHbase.scala:15)
at SparkFromHbase.main(SparkFromHbase.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
The culprit is obviously HADOOP_HOME: when HADOOP_HOME is empty, fullExeName resolves to null\bin\winutils.exe. The solution is simple: configure the HADOOP_HOME environment variable. If you don't want to restart the computer for the new variable to take effect, you can set hadoop.home.dir directly in code, as sketched below.
Note: E:\Program Files\hadoop-2.7.0 is the path where Hadoop is extracted on my machine.
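A minimal sketch of that in-code workaround, assuming Scala and the path from the note above (the object name and app name are placeholders, not from the original post):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object WinutilsFix {
  def main(args: Array[String]): Unit = {
    // Point Hadoop at the local extraction directory so Shell can resolve
    // %HADOOP_HOME%\bin\winutils.exe. Adjust the path to your own machine.
    System.setProperty("hadoop.home.dir", "E:\\Program Files\\hadoop-2.7.0")

    // Create the SparkContext only AFTER the property is set; Hadoop's Shell
    // class reads it in a static initializer the first time it is loaded.
    val sc = new SparkContext(new SparkConf().setAppName("WinutilsFix").setMaster("local[*]"))
    sc.stop()
  }
}
```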
You may still hit the same error after doing this, and at that point you might blame me. The real reason is that if you look inside your hadoop-x.x.x/bin directory, you will find there is no winutils.exe in it at all. So you can download one from GitHub; here is a well-known address:
Address: https://github.com/srccodes/hadoop-common-2.2.0-bin
Don't worry about its version; I use it with the newer hadoop-2.7.0 without any problem. After downloading, put winutils.exe into your hadoop-x.x.x/bin directory.
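If the error still shows up, one quick sanity check (a hypothetical helper, not from the original post) is to verify that the binary actually sits where Hadoop's Shell will look for it, i.e. under %HADOOP_HOME%\bin:

```scala
import java.io.File

object CheckWinutils {
  def main(args: Array[String]): Unit = {
    // Resolve HADOOP_HOME (or the hadoop.home.dir property) and check that
    // bin\winutils.exe really exists at that location.
    val home = sys.env.getOrElse("HADOOP_HOME", sys.props.getOrElse("hadoop.home.dir", ""))
    val winutils = new File(home, "bin" + File.separator + "winutils.exe")
    println(s"Looking for: ${winutils.getAbsolutePath} -> exists = ${winutils.exists()}")
  }
}
```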