Only one SparkContext may be running in this JVM

When Spark Streaming and Spark SQL are used together, note that only one SparkContext may exist per JVM; creating a second one causes the following error:

Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
com.ecp.data_cleaning.Main.main(Main.java:72)
	at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$2.apply(SparkContext.scala:2223)
	at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$2.apply(SparkContext.scala:2219)
	at scala.Option.foreach(Option.scala:257)
	at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:2219)
	at org.apache.spark.SparkContext$.markPartiallyConstructed(SparkContext.scala:2292)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:86)
	at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:836)
	at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:84)
	at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:138)
	at com.ecp.data_cleaning.Main.main(Main.java:74)
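Judging from the stack trace, the error comes from constructing a JavaSparkContext directly and then letting the JavaStreamingContext constructor create a second SparkContext internally. A minimal sketch of that anti-pattern (class name, app name, and the 5000 ms batch duration are illustrative, not from the original code):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class Main {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("data-cleaning");

        // First SparkContext, created explicitly for Spark SQL work
        JavaSparkContext sc = new JavaSparkContext(conf);

        // This constructor tries to create a second SparkContext internally
        // and throws "Only one SparkContext may be running in this JVM"
        JavaStreamingContext jsc = new JavaStreamingContext(conf, new Duration(5000));
    }
}
```

The fix is to let one of the two own the SparkContext and have the other reuse it, as shown next.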

Instead of constructing a separate JavaSparkContext, we can obtain the one already created by the JavaStreamingContext:

// Create the StreamingContext first; it creates the single SparkContext internally
JavaStreamingContext jsc = new JavaStreamingContext(sparkConf, new Duration(co.getDuration()));
// Reuse that SparkContext for Spark SQL instead of constructing a new one
JavaSparkContext sc = jsc.sparkContext();
SQLContext sqlContext = new SQLContext(sc);
