Error Messages:

```
java.lang.Exception: java.lang.ArrayIndexOutOfBoundsException: 1
	at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:492)
	at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:552)
Caused by: java.lang.ArrayIndexOutOfBoundsException: 1
	at exper.Filter$Map.map(Filter.java:25)
	at exper.Filter$Map.map(Filter.java:19)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:799)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
	at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:271)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
```
Cause of the error: the source code splits each line on a space, while the fields in the dataset are actually separated by tabs. Splitting on the wrong separator leaves the whole line in a one-element array, so accessing index 1 throws `ArrayIndexOutOfBoundsException: 1`.
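The failure mode is easy to reproduce outside Hadoop. A minimal sketch (the record values here are hypothetical, not from the actual dataset): splitting a tab-separated line on a space finds no separator, so the result has length 1 and `arr[1]` throws.

```java
public class SplitDemo {
    public static void main(String[] args) {
        String line = "10181\t1000481";       // a tab-separated record (made-up values)

        String[] wrong = line.split(" ");     // space separator never matches
        System.out.println(wrong.length);     // prints 1 -> wrong[1] would throw
                                              // ArrayIndexOutOfBoundsException: 1

        String[] right = line.split("\t");    // tab separator matches the data
        System.out.println(right[1]);         // prints 1000481
    }
}
```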
Modify the `split` statement so it splits on a tab:
```java
public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
    String line = value.toString();
    System.out.println(line);
    String[] arr = line.split("\t");   // split on tab, not space
    newKey.set(arr[1]);
    context.write(newKey, NullWritable.get());
    System.out.println(newKey);
}
```
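Beyond fixing the separator, a length check keeps a single malformed record from killing the whole job. A minimal sketch of that guard as a plain helper (class name, method name, and sample values are hypothetical, not from the original post):

```java
// Defensive field extraction: return the second tab-separated field,
// or null when the record has fewer than two fields, so the caller
// can skip bad lines instead of throwing ArrayIndexOutOfBoundsException.
public class FieldExtractor {
    public static String secondField(String line) {
        String[] arr = line.split("\t");
        return arr.length >= 2 ? arr[1] : null;  // guard against short records
    }

    public static void main(String[] args) {
        System.out.println(secondField("10181\t1000481")); // prints 1000481
        System.out.println(secondField("malformed line"));  // prints null
    }
}
```

In the mapper itself, the equivalent move is to `return` early (optionally bumping a counter) when the guard fails, rather than writing `arr[1]` unconditionally.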
Then run the jar file on Linux. From the folder where the jar package is stored, enter:
```
hadoop jar mapreducedemo-1.0-snapshot.jar exp.filter /mymapreduce2/in/buyer_Myfavorite1 /user/root/mymapreduce2/out
```
The job runs successfully!