spark-dev mailing list archives

From Gurvinder Singh <gurvinder.si...@uninett.no>
Subject Re: Spark on Mesos 0.20
Date Mon, 06 Oct 2014 07:50:19 GMT
The issue does not occur if the job at hand has a small number of map
tasks. I have a job with 978 map tasks, and I see this error:

14/10/06 09:34:40 ERROR BlockManagerMasterActor: Got two different block
manager registrations on 20140711-081617-711206558-5050-2543-5

Here is the log from the mesos-slave where this container was running.

http://pastebin.com/Q1Cuzm6Q

If you look at the Spark code where this error is produced, you will
see that it simply exits, with a comment saying "this should never
happen, lets just quit" :-)
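For illustration, here is a minimal Python sketch (not Spark's actual Scala code; the class and method names are hypothetical) of the kind of duplicate-registration check being described: the master keeps a registry keyed by executor ID, and a second registration under the same ID from a different block manager is treated as fatal.

```python
# Hypothetical sketch of the duplicate block-manager-registration
# check described above. In the real system the driver aborts at
# this point; here we raise an exception instead.

class DuplicateRegistrationError(Exception):
    pass

class BlockManagerMaster:
    def __init__(self):
        # executor_id -> block manager identity
        self._registry = {}

    def register(self, executor_id, block_manager_id):
        existing = self._registry.get(executor_id)
        if existing is not None and existing != block_manager_id:
            # "This should never happen, lets just quit"
            raise DuplicateRegistrationError(
                "Got two different block manager registrations on "
                + executor_id
            )
        self._registry[executor_id] = block_manager_id

master = BlockManagerMaster()
master.register("20140711-081617-711206558-5050-2543-5", "bm-A")
# A second, different registration under the same Mesos executor ID
# reproduces the error from the log above:
try:
    master.register("20140711-081617-711206558-5050-2543-5", "bm-B")
except DuplicateRegistrationError as e:
    print("ERROR:", e)
```

In fine-grained mode, many short-lived Mesos tasks can end up presenting the same executor ID to the master, which is consistent with the conflict only appearing on jobs with many map tasks.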

- Gurvinder
On 10/06/2014 09:30 AM, Timothy Chen wrote:
> (Hit enter too soon...)
> 
> What is your setup and steps to repro this?
> 
> Tim
> 
> On Mon, Oct 6, 2014 at 12:30 AM, Timothy Chen <tnachen@gmail.com> wrote:
>> Hi Gurvinder,
>>
>> I tried fine-grained mode before and didn't run into that problem.
>>
>>
>> On Sun, Oct 5, 2014 at 11:44 PM, Gurvinder Singh
>> <gurvinder.singh@uninett.no> wrote:
>>> On 10/06/2014 08:19 AM, Fairiz Azizi wrote:
>>>> The Spark online docs indicate that Spark is compatible with Mesos 0.18.1
>>>>
>>>> I've gotten it to work just fine on 0.18.1 and 0.18.2
>>>>
>>>> Has anyone tried Spark on a newer version of Mesos, e.g. Mesos v0.20.0?
>>>>
>>>> -Fi
>>>>
>>> Yeah, we are using Spark 1.1.0 with Mesos 0.20.1. It runs fine in
>>> coarse-grained mode, but in fine-grained mode there is an issue with
>>> block manager name conflicts. I have been waiting for it to be fixed,
>>> but it is still there.
>>>
>>> -Gurvinder
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>> For additional commands, e-mail: dev-help@spark.apache.org
>>>


