spark-user mailing list archives

From Ted Yu <yuzhih...@gmail.com>
Subject Re: work around Size exceeds Integer.MAX_VALUE
Date Thu, 09 Jul 2015 22:11:19 GMT
Which release of Spark are you using?

Can you show the complete stack trace?

getBytes() could be called from:
    getBytes(file, 0, file.length)
or:
    getBytes(segment.file, segment.offset, segment.length)
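
Either way it ends up at FileChannel.map(), which your stack trace
shows and which returns a MappedByteBuffer indexed by Int, so any
single mapping over Integer.MAX_VALUE (~2 GB) fails. A minimal
sketch (not Spark code; the file path is made up) that reproduces
the same IllegalArgumentException:

    import java.io.RandomAccessFile
    import java.nio.channels.FileChannel.MapMode

    // Any single map() call over Integer.MAX_VALUE bytes throws,
    // regardless of JVM heap settings, because MappedByteBuffer
    // is indexed by Int.
    val channel = new RandomAccessFile("/tmp/big-block", "r").getChannel
    try {
      channel.map(MapMode.READ_ONLY, 0, channel.size) // > 2 GB => throws
    } finally {
      channel.close()
    }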

Cheers

On Thu, Jul 9, 2015 at 2:50 PM, Michal Čizmazia <micizma@gmail.com> wrote:

> Please could anyone give me pointers for appropriate SparkConf to work
> around "Size exceeds Integer.MAX_VALUE"?
>
> Stacktrace:
>
> 2015-07-09 20:12:02 INFO  (sparkDriver-akka.actor.default-dispatcher-3)
> BlockManagerInfo:59 - Added rdd_0_0 on disk on localhost:51132 (size: 29.8
> GB)
> 2015-07-09 20:12:02 ERROR (Executor task launch worker-0) Executor:96 -
> Exception in task 0.0 in stage 0.0 (TID 0)
> java.lang.IllegalArgumentException: Size exceeds Integer.MAX_VALUE
>         at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:836)
>         at
> org.apache.spark.storage.DiskStore$$anonfun$getBytes$2.apply(DiskStore.scala:125)
> ...
>
>
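
Regarding the SparkConf question above: as far as I know no conf
setting lifts the 2 GB limit, since it comes from Java's
MappedByteBuffer. The usual workaround is to split the data into
more partitions so that no single cached block (like the 29.8 GB
rdd_0_0 in your log) exceeds 2 GB. A hedged sketch; the input path
and partition count are placeholders, not a tested recipe:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.storage.StorageLevel

    val sc = new SparkContext(new SparkConf().setAppName("under-2gb-blocks"))
    // Hypothetical input that would otherwise cache as one huge block.
    val rdd = sc.textFile("hdfs:///path/to/large/input")
    // More partitions => smaller on-disk blocks. Aim for a few
    // hundred MB per partition, e.g. ~29.8 GB / 256 ≈ 120 MB each.
    rdd.repartition(256).persist(StorageLevel.DISK_ONLY)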
