spark-user mailing list archives

From Takeshi Yamamuro <linguin....@gmail.com>
Subject Re: Broadcast big dataset
Date Wed, 28 Sep 2016 15:09:40 GMT
Hi,

# I dropped dev and added user because this is more suitable for the
user mailing list.

I think you need to describe your environment in more detail,
e.g. Spark version, executor memory, and so on.
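
For example, those settings are typically passed via spark-submit. The
invocation below is only an illustrative sketch; the memory values and
application jar name are placeholders, not recommendations:

```shell
# Illustrative spark-submit invocation. All values are placeholders.
# A large broadcast is first materialized on the driver, and each
# executor must also deserialize the full broadcast into its heap,
# so both --driver-memory and --executor-memory matter here.
spark-submit \
  --master yarn \
  --driver-memory 8g \
  --executor-memory 4g \
  --num-executors 100 \
  --conf spark.driver.maxResultSize=2g \
  your-app.jar
```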

// maropu


On Wed, Sep 28, 2016 at 11:03 PM, WangJianfei <
wangjianfei15@otcaix.iscas.ac.cn> wrote:

> Hi Devs
>  In my application, I just broadcast a dataset (about 500 MB) to the
> executors (100+), and I got a Java heap error:
> Jmartad-7219.hadoop.jd.local:53591 (size: 4.0 MB, free: 3.3 GB)
> 16/09/28 15:56:48 INFO BlockManagerInfo: Added broadcast_9_piece19 in memory on BJHC-Jmartad-9012.hadoop.jd.local:53197 (size: 4.0 MB, free: 3.3 GB)
> 16/09/28 15:56:49 INFO BlockManagerInfo: Added broadcast_9_piece8 in memory on BJHC-Jmartad-84101.hadoop.jd.local:52044 (size: 4.0 MB, free: 3.3 GB)
> 16/09/28 15:56:58 INFO BlockManagerInfo: Removed broadcast_8_piece0 on 172.22.176.114:37438 in memory (size: 2.7 KB, free: 3.1 GB)
> 16/09/28 15:56:58 WARN TaskSetManager: Lost task 125.0 in stage 7.0 (TID 130, BJHC-Jmartad-9376.hadoop.jd.local): java.lang.OutOfMemoryError: Java heap space
>         at java.io.ObjectInputStream$HandleTable.grow(ObjectInputStream.java:3465)
>         at java.io.ObjectInputStream$HandleTable.assign(ObjectInputStream.java:3271)
>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1789)
>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>         at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
>         at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>         at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
>         at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>         at java.io.ObjectInputStream.readArray(ObjectInputStream.java:1706)
>
> My driver is configured with 4 GB of memory. Any advice is appreciated.
> Thank you!
>
>
>
> --
> View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/Broadcast-big-dataset-tp19127.html
> Sent from the Apache Spark Developers List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>
>


-- 
---
Takeshi Yamamuro
