Reason: When Hive data is imported into MySQL, the content of a field exceeds the length defined for the corresponding MySQL column, so an error is reported.
Solution:
yarn logs -applicationId application_1621320191765_0037
Check the log for the corresponding applicationId, locate the field whose value exceeds the column length, and fix that field's content.
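Before re-running the import, it can help to scan the offending field yourself. The sketch below is a minimal, hypothetical illustration (the column name, the VARCHAR limit of 50, and the sample rows are all assumptions, not from the original error): it finds values longer than the MySQL column allows and truncates them to fit.

```python
# Hypothetical sketch: assumed VARCHAR(50) target column in MySQL.
MYSQL_COL_MAX = 50

# Assumed sample rows standing in for the exported Hive data.
rows = [
    {"id": 1, "name": "ok value"},
    {"id": 2, "name": "x" * 60},  # exceeds the assumed column length
]

def find_oversized(rows, field, limit):
    """Return the rows whose `field` value exceeds `limit` characters."""
    return [r for r in rows if len(r[field]) > limit]

def truncate_field(rows, field, limit):
    """Truncate `field` in place so every value fits the column."""
    for r in rows:
        r[field] = r[field][:limit]

bad = find_oversized(rows, "name", MYSQL_COL_MAX)
print([r["id"] for r in bad])  # ids of the offending rows
truncate_field(rows, "name", MYSQL_COL_MAX)
```

Alternatively, widening the MySQL column (for example with `ALTER TABLE ... MODIFY`) avoids touching the data, if truncation is not acceptable.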
Similar Posts:
- [Solved] Hadoop Error: The directory item limit is exceeded: limit=1048576 items=1048576
- The Difference Between Hadoop job-kill and Yarn application-kill
- [Solved] Oracle Import Error:field in data file exceeds maximum length
- Error in importing excel file from SQL Server
- [Solved] CDP7.1.7 Install hive on tez Error: Can’t create directory /mnt/ssd/yarn/nm/usercache/urika/appcache/application_1424508393097_0004 – Permission denied
- Hive1.1.0 startup error reporting Missing Hive Execution Jar: lib/hive-exec-*.jar
- Sqoop Import MYSQL Datas to Hive Error: Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly
- [Solved] Execution failed for task ‘:app:mergeDebugResources’
- SparkSQL Use DataSet to Operate createOrReplaceGlobalTempView Error
- Error: Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster