spark-user mailing list archives

From Liangzhao Zeng <liangzhao.z...@gmail.com>
Subject Re: Job can not terminated in Spark 2.0 on Yarn
Date Tue, 02 Aug 2016 19:13:10 GMT
It is 2.6, and the code is very simple: I load a data file from HDFS to create an RDD and then take some samples.
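
Roughly, it looks like this sketch (the HDFS path and sample fraction here are placeholders, not the actual values):

  import org.apache.spark.{SparkConf, SparkContext}

  object SampleJob {
    def main(args: Array[String]): Unit = {
      val conf = new SparkConf().setAppName("SampleJob")
      val sc = new SparkContext(conf)

      // Load a text file from HDFS into an RDD (path is a placeholder).
      val rdd = sc.textFile("hdfs:///data/input.txt")

      // Take a small random sample of the records (fraction is a placeholder).
      val sampled = rdd.sample(withReplacement = false, fraction = 0.01)
      sampled.take(10).foreach(println)

      sc.stop()
    }
  }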


Thanks 

Sent from my iPhone

> On Aug 2, 2016, at 11:01 AM, Ted Yu <yuzhihong@gmail.com> wrote:
> 
> Which hadoop version are you using ?
> 
> Can you show snippet of your code ?
> 
> Thanks
> 
>> On Tue, Aug 2, 2016 at 10:06 AM, Liangzhao Zeng <liangzhao.zeng@gmail.com> wrote:
>> Hi, 
>> 
>> I migrated my code from Spark 1.6 to 2.0. The job finishes the last stage (and the result is correct), but then it logs the following errors and starts over.
>> 
>> Any idea what is happening?
>> 
>> 16/08/02 16:59:33 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerExecutorMetricsUpdate(2,WrappedArray())
>> 16/08/02 16:59:33 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerExecutorMetricsUpdate(115,WrappedArray())
>> 16/08/02 16:59:33 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerExecutorMetricsUpdate(70,WrappedArray())
>> 16/08/02 16:59:33 WARN yarn.YarnAllocator: Expected to find pending requests, but found none.
>> 16/08/02 16:59:33 WARN netty.Dispatcher: Message RemoteProcessDisconnected(17.138.53.26:55338) dropped. Could not find MapOutputTracker.
>> 
>> 
>> Cheers,
>> 
>> LZ
> 
