When Spark Streaming and Spark SQL are used together, note that only one SparkContext may exist per JVM; creating a second one raises the following error:
Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
com.ecp.data_cleaning.Main.main(Main.java:72)
at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$2.apply(SparkContext.scala:2223)
at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$2.apply(SparkContext.scala:2219)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:2219)
at org.apache.spark.SparkContext$.markPartiallyConstructed(SparkContext.scala:2292)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:86)
at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:836)
at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:84)
at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:138)
at com.ecp.data_cleaning.Main.main(Main.java:74)
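Reading the trace from the bottom up shows the typical cause: main() first constructs a JavaSparkContext directly (Main.java:72) and then builds a JavaStreamingContext from a SparkConf (Main.java:74), and that constructor creates a second SparkContext internally. A minimal sketch of this anti-pattern follows; the app name, master URL, and batch interval are placeholders for illustration, not taken from the original code:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class TwoContexts {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("data-cleaning").setMaster("local[2]");
        // First SparkContext in this JVM.
        JavaSparkContext sc = new JavaSparkContext(conf);
        // This constructor creates a SECOND SparkContext from the conf,
        // triggering "Only one SparkContext may be running in this JVM".
        JavaStreamingContext jsc = new JavaStreamingContext(conf, new Duration(5000));
    }
}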
Instead of creating a second SparkContext, we can obtain the existing one from the StreamingContext:
// Let the StreamingContext create (and own) the one SparkContext.
JavaStreamingContext jsc = new JavaStreamingContext(sparkconf, new Duration(co.getDuration()));
// Reuse that SparkContext for Spark SQL instead of constructing a new one.
JavaSparkContext sc = jsc.sparkContext();
SQLContext sqlcontext = new SQLContext(sc);
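For reference, here is a self-contained sketch of the whole working pattern, assuming a local master and a fixed 5-second batch interval (both placeholders), with the actual stream definitions elided:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SQLContext;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class Main {
    public static void main(String[] args) throws InterruptedException {
        SparkConf sparkConf = new SparkConf().setAppName("data-cleaning").setMaster("local[2]");
        // Only the StreamingContext creates a SparkContext.
        JavaStreamingContext jsc = new JavaStreamingContext(sparkConf, new Duration(5000));
        // Share its SparkContext with Spark SQL.
        JavaSparkContext sc = jsc.sparkContext();
        SQLContext sqlContext = new SQLContext(sc);
        // ... define DStream transformations that use sqlContext here ...
        jsc.start();
        jsc.awaitTermination();
    }
}

Note that the spark.driver.allowMultipleContexts flag mentioned in the error message only suppresses the check; reusing the single SparkContext as above is the cleaner fix.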