
Exception: Application failed 2 times due to AM Container

WBOY (Original)
2016-06-07 16:34:45


After installing Hadoop 2 and running the example job, the following exception is reported:

14/04/28 15:29:16 INFO mapreduce.Job: Job job_1398669840354_0003 failed with state FAILED due to: Application application_1398669840354_0003 failed 2 times due to AM Container for appattempt_1398669840354_0003_000002 exited with  exitCode: 1 due to: Exception from container-launch: org.apache.hadoop.util.Shell$ExitCodeException:
org.apache.hadoop.util.Shell$ExitCodeException:
	at org.apache.hadoop.util.Shell.runCommand(Shell.java:505)
	at org.apache.hadoop.util.Shell.run(Shell.java:418)
	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:650)
	at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:283)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:79)
	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
Container exited with a non-zero exit code 1
.Failing this attempt.. Failing the application.
14/04/28 15:29:16 INFO mapreduce.Job: Counters: 0
Job Finished in 5.2 seconds
java.io.FileNotFoundException: File does not exist: hdfs://192.168.1.23:8020/user/hadoop/QuasiMonteCarlo_1398670150085_932896136/out/reduce-out
	at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1128)
	at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1120)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1120)
	at org.apache.hadoop.io.SequenceFile$Reader.&lt;init&gt;(SequenceFile.java:1749)
	at org.apache.hadoop.io.SequenceFile$Reader.&lt;init&gt;(SequenceFile.java:1773)
	at org.apache.hadoop.examples.QuasiMonteCarlo.estimatePi(QuasiMonteCarlo.java:314)
	at org.apache.hadoop.examples.QuasiMonteCarlo.run(QuasiMonteCarlo.java:354)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.hadoop.examples.QuasiMonteCarlo.main(QuasiMonteCarlo.java:363)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:72)
	at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:144)
	at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:74)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
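
The stack trace above only says the AM container exited with code 1 (the later FileNotFoundException is just a downstream symptom of the job never producing output); the real cause is in the container's own stdout/stderr. A quick way to pull those logs, assuming log aggregation is enabled (otherwise look under the NodeManager's local userlogs directory), is the yarn CLI:

# application id is taken from the job output above
yarn logs -applicationId application_1398669840354_0003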

Solution: edit mapred-site.xml and add the following property so the MapReduce framework jars are on the AM container's classpath:

<property>
  <name>mapreduce.application.classpath</name>
  <value>
    $HADOOP_CONF_DIR,
    $HADOOP_COMMON_HOME/*,
    $HADOOP_COMMON_HOME/lib/*,
    $HADOOP_HDFS_HOME/*,
    $HADOOP_HDFS_HOME/lib/*,
    $HADOOP_MAPRED_HOME/*,
    $HADOOP_MAPRED_HOME/lib/*,
    $HADOOP_YARN_HOME/*,
    $HADOOP_YARN_HOME/lib/*
  </value>
</property>
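
To verify the fix, restart YARN so the updated mapred-site.xml is picked up and re-run the same Pi example that failed above. This sketch assumes a standard Hadoop 2.x tarball layout; the examples jar path and version are illustrative and vary with your installation:

# restart YARN so the NodeManagers and clients see the new classpath setting
sbin/stop-yarn.sh && sbin/start-yarn.sh

# re-run the Pi (QuasiMonteCarlo) example: 10 maps, 100 samples per map
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.4.0.jar pi 10 100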

References: http://blog.csdn.net/fansy1990/article/details/22896249

http://hadoop.apache.org/docs/r2.4.0/hadoop-mapreduce-client/hadoop-mapreduce-client-core/mapred-default.xml

http://www.yanbit.com/?p=42
