Spark shell cannot start normally due to a Scala compiler error



Recently I started learning Spark. When I ran the spark-shell command on Windows, following the official instructions, to enter the Scala version of the Spark shell, the following error appeared:

Failed to initialize compiler: object scala.runtime in compiler mirror not found.

** Note that as of 2.8 scala does not assume use of the java classpath.

** For the old behavior pass -usejavacp to scala, or if using a Settings

** object programatically, settings.usejavacp.value = true.

The reason for this problem is that Scala 2.8 and later no longer uses the Java classpath by default. The fix, then, is obviously to add the option that tells Scala to use the Java classpath to the launch script.
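The error message itself points at the two ways to restore the old behavior: pass -usejavacp to scala on the command line, or set it on a Settings object when embedding the compiler. Below is a minimal sketch of the programmatic route (my own illustration, not from the original post; the object name UseJavaCpDemo is made up, while Settings, IMain, and usejavacp come from the Scala 2.10/2.11 compiler API):

import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter.IMain

object UseJavaCpDemo {
  def main(args: Array[String]): Unit = {
    val settings = new Settings
    // Equivalent to passing -usejavacp on the command line: lets the
    // compiler mirror find scala.runtime on the Java classpath.
    settings.usejavacp.value = true

    val repl = new IMain(settings)
    repl.interpret("""println("compiler mirror initialized OK")""")
    repl.close()
  }
}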

After some Googling, I finally found a complete and workable solution.

Modify the contents of the file bin/spark-class2.cmd (the original post marked the addition in red in a screenshot; the addition is the -Dscala.usejavacp=true flag shown below).

After saving, running the bin/spark-shell command in CMD switches into the Scala shell normally.

The lines to modify are:

rem Set JAVA_OPTS to be able to load native libraries and to set heap size
set JAVA_OPTS=%OUR_JAVA_OPTS% -Djava.library.path=%SPARK_LIBRARY_PATH% -Xms%SPARK_MEM% -Xmx%SPARK_MEM% -Dscala.usejavacp=true
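Once the shell starts, a quick smoke test confirms that the REPL and its compiler are healthy (my own sketch, not from the original post; sc is the SparkContext that spark-shell creates automatically):

// Run inside the spark-shell REPL; sc is provided by the shell.
val rdd = sc.parallelize(1 to 100)
println(rdd.sum())   // prints 5050.0 if everything is working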

It is worth mentioning that the bin directory of the Spark 1.1.0 distribution contains several class launch scripts, namely spark-class, spark-class.cmd, and spark-class2.cmd, whose contents all differ. I tried applying the fix in the other two files, but neither worked. Why the change has to go into spark-class2.cmd, and exactly how the three files divide up their work, deserves further study; for what it's worth, spark-class is the Unix launcher, and spark-class.cmd appears to do nothing but delegate to spark-class2.cmd in a fresh cmd process, which would explain why only the edit there takes effect.

Note: the solution described above comes from

http://my.oschina.net/sulliy/blog/217212
