Tag Archives: Hive

java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument


The error message is:

SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
        at org.apache.hadoop.conf.Configuration.set(Configuration.java:1357)
        at org.apache.hadoop.conf.Configuration.set(Configuration.java:1338)
        at org.apache.hadoop.mapred.JobConf.setJar(JobConf.java:536)
        at org.apache.hadoop.mapred.JobConf.setJarByClass(JobConf.java:554)
        at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:448)
        at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5141)
        at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:5099)
        at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:97)
        at org.apache.hadoop.hive.common.LogUtils.initHiveLog4j(LogUtils.java:81)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:699)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:236)

The key point is com.google.common.base.Preconditions.checkArgument: the guava jar shipped with Hive is a different version from the one shipped with Hadoop. To check:

Check the guava jar version under share/hadoop/common/lib in the Hadoop installation directory.

Check the guava jar version under lib in the Hive installation directory. If the two versions differ, delete the older jar and copy the newer one in its place.
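A minimal shell sketch of this check-and-swap, assuming HADOOP_HOME and HIVE_HOME point at the two installations:

# compare the bundled guava versions (paths assume the default layouts)
ls $HADOOP_HOME/share/hadoop/common/lib/guava-*.jar
ls $HIVE_HOME/lib/guava-*.jar

# if they differ, remove the older copy and replace it with the newer one,
# e.g. when Hadoop ships the newer guava:
rm $HIVE_HOME/lib/guava-*.jar
cp $HADOOP_HOME/share/hadoop/common/lib/guava-*.jar $HIVE_HOME/lib/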

However, I later ran into other problems caused by this.

Other guava versions can be downloaded from the Maven repository.

Reference link: https://blog.csdn.net/GQB1226/article/details/102555820

Hive Error: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask

An error is reported during Hive execution; note the key part (highlighted in yellow in the original post):

2019-02-01 09:56:54,623 ERROR [pool-7-thread-4] dao.IHiveDaoImpl - java.sql.SQLException: org.apache.hive.service.cli.HiveSQLException: 
Error while processing statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
    at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:380)
    at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:257)
    at org.apache.hive.service.cli.operation.SQLOperation.access$800(SQLOperation.java:91)
    at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:348)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1758)
    at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:362)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)

The error most likely occurred while the MapReduce job was executing:

Check the MapReduce job that was actually run.

Pull the MR error log:
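One way to pull the full task log, assuming YARN log aggregation is enabled; the application id below is derived from the attempt id attempt_1537175606568_162793_m_000000_3 in the diagnostics:

# fetch the aggregated logs for the failed application
yarn logs -applicationId application_1537175606568_162793 | less

The pulled log shows: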

2019-02-01 10:28:35,832 INFO [IPC Server handler 4 on 38091] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report from attempt_1537175606568_162793_m_000000_3: Error: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row {"log":"5\u001aNEWEHIREWEB17.2019012911\u001a1\u001a3\u001a1548730807629\u001a43\u001a14\u001a2223123\u001a2577551\u001a8e56221be35a44f8845064b8cc8f21f9\u001a61.170.197.152\u001a","webname":"ehireLog","mon":"201901","dt":"20190129"}
    at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:169)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1758)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row {"log":"5\u001aNEWEHIREWEB17.2019012911\u001a1\u001a3\u001a1548730807629\u001a43\u001a14\u001a2223123\u001a2577551\u001a8e56221be35a44f8845064b8cc8f21f9\u001a61.170.197.152\u001a","webname":"ehireLog","mon":"201901","dt":"20190129"}
    at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:562)
    at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:160)
    ... 8 more
Caused by: com.tracker.common.db.simplehbase.exception.SimpleHBaseException: convert result exception. cells=[003\x111/data:id/1538028988105/Put/vlen=4/seqid=0, 003\x111/data:isInSelector/1538028988105/Put/vlen=4/seqid=0, 003\x111/data:isStats/1538028988105/Put/vlen=4/seqid=0, 003\x111/data:pageDesc/1538028988105/Put/vlen=6/seqid=0, 003\x111/data:pageType/1548918298621/Put/vlen=1/seqid=0, 003\x111/data:webId/1538028988105/Put/vlen=4/seqid=0] type=class com.tracker.common.data.model.dict.website.Page
    at com.tracker.common.db.simplehbase.HbaseClient.convertToHbaseObjectResult(HbaseClient.java:337)
    at com.tracker.common.db.simplehbase.HbaseClientImpl$6.handleData(HbaseClientImpl.java:177)
    at com.tracker.common.db.simplehbase.HbaseClientImpl.handData_internal(HbaseClientImpl.java:733)
    at com.tracker.common.db.simplehbase.HbaseClientImpl.handDataByRowPrefixList(HbaseClientImpl.java:651)
    at com.tracker.common.db.simplehbase.HbaseClientImpl.findObjectByRowPrefixList(HbaseClientImpl.java:174)
    at com.tracker.common.db.simplehbase.HbaseClientImpl.findObjectByRowPrefix(HbaseClientImpl.java:167)
    at com.tracker.common.data.dao.dict.WebDictDataDao$6.apply(WebDictDataDao.java:154)
    at com.tracker.common.data.dao.dict.WebDictDataDao$6.apply(WebDictDataDao.java:151)
    at com.tracker.common.cache.LocalMapCache.getOrElse(LocalMapCache.java:66)
    at com.tracker.common.data.dao.dict.WebDictDataDao.getPageList(WebDictDataDao.java:151)
    at com.tracker.common.data.dao.dict.WebDictDataDao.loadDictToCache(WebDictDataDao.java:36)
    at com.tracker.common.data.query.DictDataQuery.loadLogPaserDict(DictDataQuery.java:84)
    at com.tracker.hive.func.udf.parse.ParseLog.initialize(ParseLog.java:64)
    at org.apache.hadoop.hive.ql.udf.generic.GenericUDF.initializeAndFoldConstants(GenericUDF.java:141)
    at org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator.initialize(ExprNodeGenericFuncEvaluator.java:146)
    at org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator.initialize(ExprNodeGenericFuncEvaluator.java:140)
    at org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator.initialize(ExprNodeGenericFuncEvaluator.java:140)
    at org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator.initialize(ExprNodeGenericFuncEvaluator.java:140)
    at org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator.initialize(ExprNodeGenericFuncEvaluator.java:140)
    at org.apache.hadoop.hive.ql.exec.ExprNodeEvaluatorHead.initialize(ExprNodeEvaluatorHead.java:39)
    at org.apache.hadoop.hive.ql.exec.FilterOperator.process(FilterOperator.java:80)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:897)
    at org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:130)
    at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:148)
    at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:547)
    ... 9 more
Caused by: com.tracker.common.db.simplehbase.exception.SimpleHBaseException: java.lang.IllegalArgumentException: offset (0) + length (4) exceed the capacity of the array: 1
    at com.tracker.common.db.simplehbase.HbaseClient.convertBytesToPOJOField(HbaseClient.java:374)
    at com.tracker.common.db.simplehbase.HbaseClient.convertToHbaseObjectResult(HbaseClient.java:332)
    ... 33 more
Caused by: java.lang.IllegalArgumentException: offset (0) + length (4) exceed the capacity of the array: 1
    at org.apache.hadoop.hbase.util.Bytes.explainWrongLengthOrOffset(Bytes.java:632)
    at org.apache.hadoop.hbase.util.Bytes.toInt(Bytes.java:802)
    at org.apache.hadoop.hbase.util.Bytes.toInt(Bytes.java:778)
    at com.tracker.coprocessor.utils.TypeHandlerHolder$IntegerHandler.toObject(TypeHandlerHolder.java:311)
    at com.tracker.common.db.simplehbase.HbaseClient.convertBytesToPOJOField(HbaseClient.java:371)
    ... 34 more

The highlighted part shows that the failure happens while converting an HBase row into its corresponding entity class.

Reason: a field type in the HBase data dictionary table had been modified, but the jar used by Hive was not updated to match.
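A sketch of the corresponding fix in the Hive CLI, with a hypothetical path to the UDF jar rebuilt against the new table definition (the actual jar name depends on your build):

hive> ADD JAR /path/to/tracker-hive-udf.jar;   -- hypothetical path to the rebuilt jar

Alternatively, drop the rebuilt jar into Hive's lib or aux-jars path so it is picked up for every session.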

“Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask” error occurred when Hive imported local data

Phenomenon

Importing a local file via load data local fails with the following error:

hive> load data local inpath '/home/hadoop/out/mid_test.txt' overwrite into table my_mid.mid_test partition (etl_date=20190101);
Loading data to table my_mid.mid_test partition (etl_date=20190101)
Failed with exception Unable to move source file:/home/hadoop/out/mid_test.txt to destination hdfs://namenode01.my.com/my_etl/dw/mid/mid_test/etl_date=20190101/mid_test.txt
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask

Related operations

The first import succeeded. Later, a problem was found with the line count of the source file, so a new text file was copied into the local import directory. The error above appeared on the subsequent import.

Solutions

Check the Hive log for more detailed information. The log is commonly located at /tmp/${user}/hive.log.
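For example, to view the most recent entries (a quick check, assuming the default log location):

tail -n 100 /tmp/$(whoami)/hive.log

The log shows: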

2019-02-19 09:38:21,503 ERROR [main]: exec.Task (SessionState.java:printError(960)) - Failed with exception Unable to move source file:/home/hadoop/out/mid_test.txt to destination hdfs://namenode01.my.com/my_etl/dw/mid/mid_test/etl_date=20190101/mid_test.txt
org.apache.hadoop.hive.ql.metadata.HiveException: Unable to move source file:/home/hadoop/out/mid_test.txt to destination hdfs://namenode01.my.com/my_etl/dw/mid/mid_test/etl_date=20190101/mid_test.txt
	at org.apache.hadoop.hive.ql.metadata.Hive.moveFile(Hive.java:2644)
	at org.apache.hadoop.hive.ql.metadata.Hive.replaceFiles(Hive.java:2911)
	at org.apache.hadoop.hive.ql.metadata.Hive.loadPartition(Hive.java:1398)
	at org.apache.hadoop.hive.ql.metadata.Hive.loadPartition(Hive.java:1324)
	at org.apache.hadoop.hive.ql.exec.MoveTask.execute(MoveTask.java:438)
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1653)
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1412)
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1195)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1059)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:213)
	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:165)
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:376)
	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:736)
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: org.apache.hadoop.fs.ChecksumException: Checksum error: file:/home/hadoop/out/mid_test.txt at 0 exp: -827044509 got: 624370567
	at org.apache.hadoop.fs.FSInputChecker.verifySums(FSInputChecker.java:323)
	at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:279)
	at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:228)
	at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:196)
	at java.io.DataInputStream.read(DataInputStream.java:100)
	at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:85)
	at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:59)
	at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:119)
	at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366)
	at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:338)
	at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1965)
	at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1933)
	at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1898)
	at org.apache.hadoop.hive.ql.metadata.Hive.moveFile(Hive.java:2637)
	... 23 more
This is a checksum failure: the checksum recorded in the old .crc file does not match the newly copied file.

Check whether a .crc file exists:

[hadoop@my-17 out]$ ls -al
total 112720
drwxrwxr-x   2 hadoop hadoop     4096 Feb 19 09:31 .
drwx------. 39 hadoop hadoop     4096 Feb 19 09:57 ..
-rw-r--r--   1 hadoop hadoop  1595242 Feb 19 09:12 mid_test.txt
-rw-r--r--   1 hadoop hadoop     3128 Feb 19 08:22 .mid_test.txt.crc

If such a file exists, simply delete the .crc file.

The subsequent import then succeeds.
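A minimal sketch of the fix, reusing the paths from the error above:

# remove the stale checksum file, then re-run the load
rm /home/hadoop/out/.mid_test.txt.crc
hive -e "load data local inpath '/home/hadoop/out/mid_test.txt' overwrite into table my_mid.mid_test partition (etl_date=20190101);"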

Hive Error: Connection refused (ConnectionRefused) and how to resolve it

Caused by: java.net.ConnectException: Call From nn1.hadoop/192.168.10.6 to nn2.hadoop:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused

If Hive fails to connect and reports this error, refer to the official documentation: https://cwiki.apache.org/confluence/display/HADOOP2/ConnectionRefused

First check whether Hadoop itself is healthy: open the HDFS web UI and verify that both nodes are running normally. If a node has gone down, simply restart it.

Check that the NameNode is running normally and that its active/standby state is stable; if not, restart the ZKFC service.

The MySQL metastore database may also have gone down; just bring it back up.
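A few quick checks from the shell, as a sketch; the hosts come from the error above, and the HA service ids nn1/nn2 are assumptions that may differ in your cluster:

# is the NameNode RPC port reachable at all?
nc -zv nn2.hadoop 9000

# which NameNode is active? (assumes HA service ids nn1 and nn2)
hdfs haadmin -getServiceState nn1
hdfs haadmin -getServiceState nn2

# restart ZKFC on the NameNode host if the failover state is stale
hadoop-daemon.sh start zkfc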

Hive Error: Can’t get the locations when creating an HBase external table


In Hive, create an HBase external table by executing the following script:

hive> CREATE EXTERNAL TABLE hbase_userFace(id string, mobile string, name string)
    > STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    > WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,faces:mobile,faces:name")
    > TBLPROPERTIES ("hbase.table.name" = "userFace");

The error is as follows:

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the locations
    at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:312)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:153)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:61)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
    at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:320)
    at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:295)
    at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:160)
    at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:155)
    at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:811)
    at org.apache.hadoop.hbase.MetaTableAccessor.fullScan(MetaTableAccessor.java:602)
    at org.apache.hadoop.hbase.MetaTableAccessor.tableExists(MetaTableAccessor.java:366)
    at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:303)
    at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:313)
    at org.apache.hadoop.hive.hbase.HBaseStorageHandler.preCreateTable(HBaseStorageHandler.java:200)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:664)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:657)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:156)
    at com.sun.proxy.$Proxy8.createTable(Unknown Source)
    at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:714)
    at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4135)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:306)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1653)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1412)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1195)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1059)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:213)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:165)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:376)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:736)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
)

The error indicates that HBase cannot be reached. Since HBase is coordinated through ZooKeeper, test as follows:

1. Test connecting to a single-node HBase master

$ hive -hiveconf hbase.master=master:60000

After entering the Hive CLI, executing the create-table script still reports the error.

2. Test connecting to HBase through the ZooKeeper cluster

$ hive -hiveconf hbase.zookeeper.quorum=slave1,slave2,master,slave4,slave5,slave6,slave7

After entering the Hive CLI, executing the create-table script succeeds.

This shows that the error occurred when Hive tried to read HBase’s ZooKeeper quorum. In hive-site.xml there is a property named hive.zookeeper.quorum; copy it and rename the copy to hbase.zookeeper.quorum, as follows:

<property>
  <name>hbase.zookeeper.quorum</name>
  <value>slave1,slave2,master,slave4,slave5,slave6,slave7</value>
  <description></description>
</property>

With that, the problem is solved and the HBase external table is created successfully.
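To confirm the property is being picked up, you can echo it from the Hive CLI; set <property>; prints the current value if it is set:

hive> set hbase.zookeeper.quorum;
hbase.zookeeper.quorum=slave1,slave2,master,slave4,slave5,slave6,slave7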

Hive Error: Error while compiling statement: FAILED: ParseException line 1:7 Failed to recognize

Executing the Hive SQL statement “select out from XXX” produces the following error:

Error: Error while compiling statement: FAILED: ParseException line 1:7 Failed to recognize predicate ‘out’. Failed rule: ‘identifier’ in table or column identifier (state=42000,code=40000)

The intent was to select the value of the out column from the table. The error means that out is a reserved word: even though a column in the table is named out, it cannot be selected directly. Modify the statement as follows:

select `out` from XXX

That is, wrap out in backticks on both sides to solve the problem.
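As an alternative sketch (not from the original post): some Hive releases expose a switch that relaxes reserved-word handling, so the column could be referenced without backticks; verify that the property exists in your version before relying on it:

-- version-dependent: allow reserved words as identifiers
set hive.support.sql11.reserved.keywords=false;
select out from XXX;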