Spark program compilation error:
[INFO] Compiling 2 source files to E:\Develop\IDEAWorkspace\spark\target\classes at 1567004370534
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:3: error: object apache is not a member of package org
[ERROR] import org.apache.spark.rdd.RDD
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:4: error: object apache is not a member of package org
[ERROR] import org.apache.spark.{SparkConf, SparkContext}
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:12: error: not found: type SparkConf
[ERROR] val sparkConf: SparkConf = new SparkConf().setAppName("WordCount").setMaster("local[2]")
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:12: error: not found: type SparkConf
[ERROR] val sparkConf: SparkConf = new SparkConf().setAppName("WordCount").setMaster("local[2]")
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:14: error: not found: type SparkContext
[ERROR] val sc: SparkContext = new SparkContext(sparkConf)
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:14: error: not found: type SparkContext
[ERROR] val sc: SparkContext = new SparkContext(sparkConf)
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:18: error: not found: type RDD
[ERROR] val data: RDD[String] = sc.textFile("E:\\Study\\BigData\\heima\\stage5\\2spark����\\words.txt")
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:20: error: not found: type RDD
[ERROR] val words: RDD[String] = data.flatMap(_.split(" "))
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:22: error: not found: type RDD
[ERROR] val wordToOne: RDD[(String, Int)] = words.map((_,1))
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:24: error: not found: type RDD
[ERROR] val result: RDD[(String, Int)] = wordToOne.reduceByKey(_+_)
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:27: error: not found: type RDD
[ERROR] val ascResult: RDD[(String, Int)] = result.sortBy(_._2,false) // descending
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:3: error: object apache is not a member of package org
[ERROR] import org.apache.spark.{SparkConf, SparkContext}
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:4: error: object apache is not a member of package org
[ERROR] import org.apache.spark.rdd.RDD
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:12: error: not found: type SparkConf
[ERROR] val sparkConf: SparkConf = new SparkConf().setAppName("WordCountCluster")
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:12: error: not found: type SparkConf
[ERROR] val sparkConf: SparkConf = new SparkConf().setAppName("WordCountCluster")
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:14: error: not found: type SparkContext
[ERROR] val sc: SparkContext = new SparkContext(sparkConf)
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:14: error: not found: type SparkContext
[ERROR] val sc: SparkContext = new SparkContext(sparkConf)
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:18: error: not found: type RDD
[ERROR] val data: RDD[String] = sc.textFile(args(0))
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:20: error: not found: type RDD
[ERROR] val words: RDD[String] = data.flatMap(_.split(" "))
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:22: error: not found: type RDD
[ERROR] val wordToOne: RDD[(String, Int)] = words.map((_,1))
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:24: error: not found: type RDD
[ERROR] val result: RDD[(String, Int)] = wordToOne.reduceByKey(_+_)
[ERROR] ^
[ERROR] 21 errors found
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
Reason: the problem is with the local Maven repository path, not the repository contents. The original path is most likely too long and too deeply nested; the repository itself is fine, because copying it to the E:\Study\BigData\ directory makes the build succeed.
Solution:
The Spark project's Maven local repository was originally: E:\Study\BigData\heima\stage5\1scala\scala3\spark course needs maven repository\SparkRepository
After changing it to E:\Study\BigData\repository\ the build compiled without errors.
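To point Maven at the shorter repository location, the standard `localRepository` element in `settings.xml` can be set; the path below is this post's example location, substitute your own:

```xml
<!-- conf/settings.xml in the Maven installation, or ~/.m2/settings.xml -->
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
  <!-- Use a short, shallow path without spaces or non-ASCII characters -->
  <localRepository>E:\Study\BigData\repository</localRepository>
</settings>
```

For a one-off build, the same location can be overridden from the command line with Maven's standard `-Dmaven.repo.local=E:\Study\BigData\repository` property, without editing `settings.xml`. If the project is built from IDEA, also update the local repository path under IDEA's Maven settings so the IDE and the command line resolve dependencies from the same place.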
Similar Posts:
- Spark Program Compilation error: object apache is not a member of package org
- [Solved] java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;) V sets the corresponding Scala version
- Only one SparkContext may be running in this JVM
- Idea Run Scala Error: Exception in thread “main” java.lang.NoSuchMethodError:com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
- org.apache.spark.SparkException: A master URL must be set in your configuration
- [Solved] Exception in thread “main” java.lang.NoSuchMethodError: org.apache.hadoop.security.HadoopKerberosName.setRuleMechanism(Ljava/lang/String;)V
- ERROR Shell: Failed to locate the winutils binary in the hadoop binary path
- [Solved] Spark Install Error: ERROR SparkContext: Error initializing SparkContext. java.lang.reflect.InvocationTargetException
- [Solved] class java.lang.RuntimeException/error reading Scala signature of package.class: Scala signature package has wrong version