spark-issues mailing list archives

From "Joshua Caplan (JIRA)" <j...@apache.org>
Subject [jira] [Created] (SPARK-19649) Spark YARN client throws exception if job succeeds and max-completed-applications=0
Date Fri, 17 Feb 2017 22:57:44 GMT
Joshua Caplan created SPARK-19649:
-------------------------------------

             Summary: Spark YARN client throws exception if job succeeds and max-completed-applications=0
                 Key: SPARK-19649
                 URL: https://issues.apache.org/jira/browse/SPARK-19649
             Project: Spark
          Issue Type: Bug
          Components: YARN
    Affects Versions: 1.6.3
         Environment: EMR release label 4.8.x
            Reporter: Joshua Caplan
            Priority: Minor


I have configured YARN not to keep *any* completed applications in the ResourceManager's memory, as some of my jobs get pretty large.

{code}
yarn-site	yarn.resourcemanager.max-completed-applications	0
{code}
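
For reference, on EMR that setting would be applied through a configuration classification; the JSON below is a minimal illustrative sketch (the surrounding shape is my assumption, only the classification and property names/value come from the line above):

{code}
[
  {
    "Classification": "yarn-site",
    "Properties": {
      "yarn.resourcemanager.max-completed-applications": "0"
    }
  }
]
{code}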

Because the ResourceManager forgets an application the moment it completes, the client's once-per-second call to getApplicationReport may thus see the application RUNNING on one poll and "not found" on the next, and report the successful run as a failure.
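
Below is a minimal Scala sketch of the pattern I believe is at fault, assuming a polling loop along the lines of the YARN client's monitoring code; the Hadoop YARN client API types are real, but the method and loop structure shown here are illustrative, not the exact Spark source:

{code}
import org.apache.hadoop.yarn.api.records.{ApplicationId, FinalApplicationStatus, YarnApplicationState}
import org.apache.hadoop.yarn.client.api.YarnClient
import org.apache.hadoop.yarn.exceptions.ApplicationNotFoundException

// Illustrative client-side polling loop: ask the RM for an ApplicationReport
// once per second until the application reaches a final state.
def monitor(yarnClient: YarnClient, appId: ApplicationId): (YarnApplicationState, FinalApplicationStatus) = {
  while (true) {
    Thread.sleep(1000)
    val report =
      try {
        yarnClient.getApplicationReport(appId)
      } catch {
        // With max-completed-applications=0 the RM drops the record as soon as
        // the application finishes, so a SUCCEEDED application can surface here
        // as "not found" and get reported as if it had been killed.
        case _: ApplicationNotFoundException =>
          return (YarnApplicationState.KILLED, FinalApplicationStatus.KILLED)
      }
    val state = report.getYarnApplicationState
    if (state == YarnApplicationState.FINISHED ||
        state == YarnApplicationState.FAILED ||
        state == YarnApplicationState.KILLED) {
      return (state, report.getFinalApplicationStatus)
    }
  }
  throw new IllegalStateException("unreachable")
}
{code}

A KILLED result is then surfaced to the caller as a failure, which matches the "Application ... is killed" exception in the client log below.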

(typical) ApplicationMaster log:
{code}
17/01/09 19:31:23 INFO ApplicationMaster: Final app status: SUCCEEDED, exitCode: 0
17/01/09 19:31:23 INFO SparkContext: Invoking stop() from shutdown hook
17/01/09 19:31:24 INFO SparkUI: Stopped Spark web UI at http://10.0.0.168:37046
17/01/09 19:31:24 INFO YarnClusterSchedulerBackend: Shutting down all executors
17/01/09 19:31:24 INFO YarnClusterSchedulerBackend: Asking each executor to shut down
17/01/09 19:31:24 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/01/09 19:31:24 INFO MemoryStore: MemoryStore cleared
17/01/09 19:31:24 INFO BlockManager: BlockManager stopped
17/01/09 19:31:24 INFO BlockManagerMaster: BlockManagerMaster stopped
17/01/09 19:31:24 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/01/09 19:31:24 INFO SparkContext: Successfully stopped SparkContext
17/01/09 19:31:24 INFO ApplicationMaster: Unregistering ApplicationMaster with SUCCEEDED
17/01/09 19:31:24 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
17/01/09 19:31:24 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
17/01/09 19:31:24 INFO AMRMClientImpl: Waiting for application to be successfully unregistered.
17/01/09 19:31:24 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
{code}

Client log:
{code}
17/01/09 19:31:23 INFO Client: Application report for application_1483983939941_0056 (state: RUNNING)
17/01/09 19:31:24 ERROR Client: Application application_1483983939941_0056 not found.
Exception in thread "main" org.apache.spark.SparkException: Application application_1483983939941_0056 is killed
	at org.apache.spark.deploy.yarn.Client.run(Client.scala:1038)
	at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1081)
	at org.apache.spark.deploy.yarn.Client.main(Client.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
{code}


