First, the code that produces the error:
import org.apache.spark.sql.SparkSession

def main(args: Array[String]): Unit = {
  // Build a SparkSession and set the application name
  SparkSession.builder()
    .appName("Spark Sql basic example")
    .config("spark.some.config.option", "some-value")
    .getOrCreate()
  // Import the implicit DataFrame/Dataset conversions
  import spark.implicits._ // `spark` shows up in red here and the import cannot be resolved
}
Solution: bind the result of SparkSession.builder()...getOrCreate() to a variable, e.g. spark.
The spark in `import spark.implicits._` is not something under a package; it is the variable that holds our SparkSession instance built by SparkSession.builder(). Here is the correct way to write it:
import org.apache.spark.sql.SparkSession

def main(args: Array[String]): Unit = {
  // Build a SparkSession and set the application name
  val spark = SparkSession.builder()
    .appName("Spark Sql basic example")
    .config("spark.some.config.option", "some-value")
    .getOrCreate()
  // Import the implicit DataFrame/Dataset conversions
  import spark.implicits._ // resolves now: `spark` is a stable val in scope
}
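For completeness, here is a minimal runnable sketch of what the fix enables. The object name, the `.master("local[*]")` setting, and the sample data are assumptions added so the example runs locally; they are not part of the original post. Note also that the import requires a stable identifier, which is why `spark` must be a val. Once `spark.implicits._` is in scope, plain Scala collections gain toDF/toDS conversions:

import org.apache.spark.sql.SparkSession

object SparkImplicitsDemo { // hypothetical object name for this sketch
  def main(args: Array[String]): Unit = {
    // The builder result must be bound to a val: `import spark.implicits._`
    // needs a stable identifier, so a var or a bare expression will not compile.
    val spark = SparkSession.builder()
      .appName("Spark Sql basic example")
      .master("local[*]") // assumed so the sketch runs locally (avoids "A master URL must be set")
      .getOrCreate()

    import spark.implicits._

    // With the implicits in scope, ordinary collections convert directly:
    val ds = Seq(1, 2, 3).toDS()
    val df = Seq(("a", 1), ("b", 2)).toDF("key", "value")
    ds.show()
    df.show()

    spark.stop()
  }
}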