Sqoop Reports an Error When Exporting HDFS Data to a MySQL Database
Job job_1566707990804_0002 failed with state FAILED due to: Task failed task_1566707990804_0002_m_0
I hit this problem because the MySQL table was created with a VARCHAR(10) column. When a field in the HDFS data is longer than 10 characters, the export task fails; increasing the VARCHAR length solves it.
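As a minimal sketch of the fix, assuming the target table is `staff` and the failing column is `name` (both names are hypothetical; substitute your own), widen the column and optionally check the longest value already stored:

```sql
-- Hypothetical table/column: the original VARCHAR(10) is too narrow
-- for some field values coming from HDFS. Widen it to a length that
-- covers the longest expected value.
ALTER TABLE staff MODIFY COLUMN name VARCHAR(255);

-- Optional sanity check: longest value currently in the column.
-- Rows exported from HDFS must also fit within the new length.
SELECT MAX(CHAR_LENGTH(name)) FROM staff;
```

After widening the column, re-run the same sqoop export job; the task should no longer fail on oversized values.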
Similar Posts:
- An error is reported when sqoop imports data into MySQL database: ERROR tool.ExportTool: Error during export: Export job failed!
- [Solved] Sqoop error: Could not load db driver class: com.mysql.jdbc.Driver
- Incorrect key file for table [How to Solve]
- In Oracle, an error is reported: ORA-00904
- ERROR: syntax error at end of input & Database Error: invalid input syntax for type numeric: “Not reviewed”
- Sqoop Import MYSQL Datas to Hive Error: Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly
- Solution of data truncated for column ‘xxx’ in MySQL
- When creating a table in MySQL with timestamp DEFAULT NULL: error 1067 - Invalid default value for 'updated_at'
- HDFS problem set (1), use the command to report an error: com.google.protobuf.ServiceException: java.lang.OutOfMemoryError: Java heap space
- Mysql Error when creating table: Tablespace for table `tablexx` exists. Please DISCARD the tablespace before IMPORT.