An error is reported when running the ./spark-shell command:
Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@476fde05, see the next exception for details.
        at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
        at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
        ... 153 more
Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /usr/local/development/spark-2.1.1-bin-hadoop2.7/bin/metastore_db.
        at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
        at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
        at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
        at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
--------------------
Caused by: java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@476fde05, see the next exception for details.
        at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
        at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
-------------------
Caused by: org.apache.hadoop.ipc.RemoteException: Cannot create directory /tmp/hive/root/5436b1aa-85e3-4512-b505-b0bdc7444e46. Name node is in safe mode. The reported blocks 0 needs additional 9 blocks to reach the threshold 0.9990 of total blocks 9. The number of live datanodes 0 has reached the minimum number 0. Safe mode will be turned off automatically once the thresholds have been reached.
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1327)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3895)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:984)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenode
Many errors are thrown, and each time I exit spark-shell and start it again the error is different. The last cause in the chain is "Name node is in safe mode", so solve that problem first.
According to what I found online, this is the safe mode of the HDFS distributed file system. While safe mode is active, the contents of the file system cannot be modified or deleted; safe mode ends once the namenode has finished checking the validity of the data blocks on each datanode.
bin/hadoop dfsadmin -safemode leave //leave the safe mode
Users can control safe mode with hdfs dfsadmin -safemode <value>. The possible values are:
// enter - put the namenode into safe mode
// leave - force the namenode to leave safe mode
[root@node1 sbin]# hdfs dfsadmin -safemode leave
Safe mode is OFF
// get - return whether safe mode is on
[root@node1 sbin]# hdfs dfsadmin -safemode get
Safe mode is ON
// wait - block until safe mode ends
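For scripted restarts, the leave and get commands above can be combined into one small shell function. This is only a sketch: it assumes hdfs is on the PATH, and HDFS_BIN is a hypothetical override hook, not a standard Hadoop variable:

```shell
# Sketch: force the namenode out of safe mode and verify the result.
# HDFS_BIN is a hypothetical override hook; it defaults to the normal binary.
HDFS_BIN="${HDFS_BIN:-hdfs}"

leave_safemode() {
  "$HDFS_BIN" dfsadmin -safemode leave
  # Confirm the state change with "get" rather than trusting "leave" alone.
  if "$HDFS_BIN" dfsadmin -safemode get | grep -q "Safe mode is OFF"; then
    echo "namenode left safe mode"
  else
    echo "namenode is still in safe mode" >&2
    return 1
  fi
}
```

Note that hdfs dfsadmin -safemode wait already blocks until safe mode ends on its own, so the function above is only useful when you want to force the namenode out rather than wait.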
After leaving safe mode and starting spark-shell again, another exception is thrown:
Caused by: org.apache.derby.iapi.error.StandardException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@476fde05, see the next exception for details.
        at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
        at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
        ... 153 more
Caused by: org.apache.derby.iapi.error.StandardException: Another instance of Derby may have already booted the database /usr/local/development/spark-2.1.1-bin-hadoop2.7/bin/metastore_db.
        at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
        at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
        at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
        at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
As you can see, this exception means that another Derby database instance has already booted in /usr/local/development/spark-2.1.1-bin-hadoop2.7/bin/. I have always used MySQL and have never used Derby. I eventually realized that when I installed Spark I connected it to Hive, and although my Hive is configured to use MySQL, Hive falls back to the embedded Derby database by default, so when Spark started and connected to Hive it automatically created a Derby instance. Checking the contents of the metastore_db folder confirmed that they are Derby database files.
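Since Hive only falls back to embedded Derby when it cannot find a metastore configuration, making sure Spark can see the MySQL-backed hive-site.xml (usually by copying it into Spark's conf/ directory) avoids the Derby instance altogether. A sketch of such a file; the host, database name, user, and password below are placeholders:

```xml
<!-- Hypothetical hive-site.xml fragment; adjust host and credentials. -->
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://node1:3306/hive_metastore?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hive_password</value>
  </property>
</configuration>
```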
solution:
Delete the /usr/local/development/spark-2.1.1-bin-hadoop2.7/bin/metastore_db folder
Restart
./spark-shell
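The two steps above can be sketched as a short script. SPARK_HOME defaults here to the installation path from this post (adjust it for your machine), and remove_stale_metastore is a hypothetical helper name:

```shell
# Sketch of the fix: remove the stale Derby metastore, then restart spark-shell.
SPARK_HOME="${SPARK_HOME:-/usr/local/development/spark-2.1.1-bin-hadoop2.7}"

remove_stale_metastore() {
  # Derby keeps a lock inside metastore_db; deleting the directory clears
  # the "Another instance of Derby may have already booted" error.
  rm -rf "$SPARK_HOME/bin/metastore_db"
}

# Usage:
#   remove_stale_metastore
#   "$SPARK_HOME/bin/spark-shell"
```

Note that spark-shell recreates metastore_db in whatever directory it is started from, so the script assumes the shell was last launched from $SPARK_HOME/bin.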
[root@node1 bin]# ./spark-shell
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/07/05 00:13:04 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/07/05 00:13:29 WARN metastore.ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
17/07/05 00:13:29 WARN metastore.ObjectStore: Failed to get database default, returning NoSuchObjectException
17/07/05 00:13:40 WARN metastore.ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
Spark context Web UI available at http://192.168.177.120:4040
Spark context available as 'sc' (master = local[*], app id = local-1499184787668).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.1
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_131)
Type in expressions to have them evaluated.
Type :help for more information.

scala> sc
res0: org.apache.spark.SparkContext = org.apache.spark.SparkContext@178c4480