spark-user mailing list archives

From Andre Kuhnen <andrekuh...@gmail.com>
Subject Re: Lease Exception hadoop 2.4
Date Mon, 05 May 2014 02:10:38 GMT
I think I forgot to rsync the newly compiled jar to the slaves; I will
give it a try as soon as possible.
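The fix described above (rebuild the assembly, then sync the jar to the workers) can be sketched as a dry-run shell script. The Spark version, paths, and worker hostnames below are assumptions for illustration, not taken from the thread:

```shell
# Dry-run sketch: print the rebuild and sync steps instead of running them.
# SPARK_HOME, the jar name, and the worker hostnames are all assumptions.
SPARK_HOME=${SPARK_HOME:-/opt/spark}
JAR="$SPARK_HOME/assembly/target/scala-2.10/spark-assembly-1.0.0-hadoop2.4.0.jar"

echo "SPARK_HADOOP_VERSION=2.4.0 sbt/sbt assembly"   # rebuild against Hadoop 2.4
for host in worker1 worker2; do                      # hypothetical worker hosts
  echo "rsync -avz $JAR $host:$JAR"                  # push the new jar to each worker
done
```

Running the printed rsync line against every host in conf/slaves keeps the workers on the same assembly the driver uses.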
On 04/05/2014 21:35, "Andre Kuhnen" <andrekuhnen@gmail.com> wrote:

> I compiled Spark with SPARK_HADOOP_VERSION=2.4.0 sbt/sbt assembly and fixed
> the s3 dependencies, but I am still getting the same error...
> 14/05/05 00:32:33 WARN TaskSetManager: Loss was due to
> org.apache.hadoop.ipc.RemoteException
> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException):
> No lease on
>
> Any ideas?
>
> thanks
>
>
>
> 2014-05-04 11:53 GMT-03:00 Andre Kuhnen <andrekuhnen@gmail.com>:
>
>> Thanks Mayur, the only thing my code does is:
>>
>> read from s3 and saveAsTextFile on HDFS. Like I said, everything is
>> written correctly, but at the end of the job there is this warning.
>> I will try to compile with Hadoop 2.4.
>> thanks
>>
>>
>>
>>
>> 2014-05-04 11:17 GMT-03:00 Mayur Rustagi <mayur.rustagi@gmail.com>:
>>
>> You should compile Spark against every Hadoop version you use. I am
>>> surprised it's working otherwise, as HDFS breaks compatibility quite often.
>>> As for this error: it occurs when your code writes to or reads from a file
>>> that has already been deleted. Are you trying to update a single file from
>>> multiple mappers/reducers?
>>>
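A LeaseExpiredException like the one above typically means two writers raced on one HDFS path: HDFS grants a single-writer lease per file. saveAsTextFile sidesteps this by giving every task its own part file. A minimal sketch of that output layout on a local filesystem (the file names follow Hadoop's part-file convention, but this is an illustration, not Spark code):

```shell
# Sketch: one output file per task, so no two writers ever share a lease.
out="$(mktemp -d)/output"
mkdir -p "$out"
for i in 0 1 2; do                                 # pretend three tasks ran
  printf 'record-%d\n' "$i" > "$out/part-0000$i"   # each task writes its own part file
done
touch "$out/_SUCCESS"                              # marker written once, after all tasks
ls "$out"
```

Updating one shared file from several tasks breaks this model, which is when the lease errors appear.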
>>>
>>> Mayur Rustagi
>>> Ph: +1 (760) 203 3257
>>> http://www.sigmoidanalytics.com
>>>  @mayur_rustagi <https://twitter.com/mayur_rustagi>
>>>
>>>
>>>
>>> On Sun, May 4, 2014 at 5:30 PM, Andre Kuhnen <andrekuhnen@gmail.com> wrote:
>>>
>>>> Please, can anyone give some feedback? Thanks.
>>>>
>>>> Hello, I am getting this warning after upgrading to Hadoop 2.4, when I
>>>> try to write something to HDFS. The content is written correctly, but I
>>>> do not like this warning.
>>>>
>>>> Do I have to compile Spark with Hadoop 2.4?
>>>>
>>>> WARN TaskSetManager: Loss was due to
>>>> org.apache.hadoop.ipc.RemoteException
>>>>
>>>> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException)
>>>>
>>>> thanks
>>>>
>>>>
>>>> 2014-05-03 13:09 GMT-03:00 Andre Kuhnen <andrekuhnen@gmail.com>:
>>>>
>>>
>>
>
