1. Symptom
The HiveServer2 service reports an error and becomes unusable.
2. Error information
Exception in thread "HiveServer2-Handler-Pool: Thread-556" java.lang.OutOfMemoryError: GC overhead limit exceeded
3. Cause
The JVM started by the Hive startup script has too small a heap. "GC overhead limit exceeded" is thrown when the JVM spends nearly all of its time in garbage collection while reclaiming almost no memory, which means HiveServer2 is effectively out of heap under its current workload.
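To confirm what heap the HiveServer2 JVM is actually running with, you can inspect the process with standard JDK tools (a minimal sketch; the grep pattern assumes the default HiveServer2 main class name appears in the process listing):

jps -lv | grep -i hiveserver2    # lists the JVM with its startup flags, including the -Xmx heap limit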
4. Solution
Copy hive-env.sh.template to hive-env.sh, then edit hive-env.sh and uncomment the export HADOOP_HEAPSIZE=1024 line. If the error still occurs, increase the value further.
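A minimal sketch of the commands, assuming a standard tarball install where the configuration templates live under $HIVE_HOME/conf (adjust the path to your layout); HADOOP_HEAPSIZE is given in megabytes:

cd $HIVE_HOME/conf
cp hive-env.sh.template hive-env.sh
vi hive-env.sh

# In hive-env.sh, uncomment and adjust this line:
export HADOOP_HEAPSIZE=1024    # try 2048 or 4096 if the OOM persists

Restart the HiveServer2 service afterwards so the new heap size takes effect.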