spark-user mailing list archives

From Kürşat Kurt <kur...@kursatkurt.com>
Subject RE: Out of memory at 60GB free memory.
Date Mon, 07 Nov 2016 11:51:49 GMT
I understand that I should set the executor memory. I tried with the parameters below, but the OOM
still occurs...

./spark-submit --class main.scala.Test1 --master local[8] --driver-memory 20g --executor-memory 20g
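One thing worth noting: with --master local[8], the "executor" runs inside the driver JVM, so --executor-memory has no effect there and heap size is governed by --driver-memory alone. A sketch of what the submit might look like under that assumption (the jar name Test1.jar is hypothetical, and spark.driver.maxResultSize is raised because the failing stage is a collect to the driver; sizes are illustrative):

```shell
# In local mode everything runs in one JVM, so only driver memory matters.
# Test1.jar is a placeholder for the actual application jar.
./spark-submit \
  --class main.scala.Test1 \
  --master local[8] \
  --driver-memory 40g \
  --conf spark.driver.maxResultSize=8g \
  Test1.jar
```

If the job later moves to a real cluster (standalone, YARN, etc.), --executor-memory becomes meaningful and must be set at submit time, as noted below.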

 

From: Sean Owen [mailto:sowen@cloudera.com] 
Sent: Monday, November 7, 2016 12:21 PM
To: Kürşat Kurt <kursat@kursatkurt.com>; user@spark.apache.org
Subject: Re: Out of memory at 60GB free memory.

 

You say "out of memory", and you allocate a huge amount of driver memory, but, it's your executor
that's running out of memory. You want --executor-memory. You can't set it after the driver
has run.

On Mon, Nov 7, 2016 at 5:35 AM Kürşat Kurt <kursat@kursatkurt.com> wrote:

Hi;

I am trying to use Naive Bayes for multi-class classification.

I am getting OOM at the “pipeline.fit(train)” line. When I submit the code, everything is
OK up to the stage “collect at NaiveBayes.scala:400”.

At this stage, the first ~375 tasks start very quickly, then progress slows down. The task count
never reaches 500; I get the OOM around tasks 380-390.

 

