spark-user mailing list archives

From "Dai, Kevin" <yun...@paypal.com.INVALID>
Subject Re: java.lang.OutOfMemoryError: Direct buffer memory when using broadcast join
Date Mon, 21 Mar 2016 08:34:34 GMT

Spark 1.6.0

________________________________
From: Tamas Szuromi <tamas.szuromi@odigeo.com.INVALID>
Sent: 21 March 2016 16:30
To: Dai, Kevin
Cc: user@spark.apache.org
Subject: Re: java.lang.OutOfMemoryError: Direct buffer memory when using broadcast join

What version of Spark do you use?

Tamas



On 21 March 2016 at 09:25, Dai, Kevin <yundai@paypal.com.invalid> wrote:

Hi all,


I'm joining a small table (about 200 MB) with a huge table using a broadcast join; however, Spark
throws the following exception:


16/03/20 22:32:06 WARN TransportChannelHandler: Exception in connection from
java.lang.OutOfMemoryError: Direct buffer memory
        at java.nio.Bits.reserveMemory(Bits.java:658)
        at java.nio.DirectByteBuffer.<init>(DirectByteBuffer.java:123)
        at java.nio.ByteBuffer.allocateDirect(ByteBuffer.java:306)
        at io.netty.buffer.PoolArena$DirectArena.newUnpooledChunk(PoolArena.java:651)
        at io.netty.buffer.PoolArena.allocateHuge(PoolArena.java:237)
        at io.netty.buffer.PoolArena.allocate(PoolArena.java:215)
        at io.netty.buffer.PoolArena.allocate(PoolArena.java:132)
        at io.netty.buffer.PooledByteBufAllocator.newDirectBuffer(PooledByteBufAllocator.java:271)
        at io.netty.buffer.AbstractByteBufAllocator.directBuffer(AbstractByteBufAllocator.java:155)
        at io.netty.buffer.AbstractByteBufAllocator.directBuffer(AbstractByteBufAllocator.java:146)
        at io.netty.buffer.CompositeByteBuf.allocBuffer(CompositeByteBuf.java:1345)
        at io.netty.buffer.CompositeByteBuf.consolidateIfNeeded(CompositeByteBuf.java:276)
        at io.netty.buffer.CompositeByteBuf.addComponent(CompositeByteBuf.java:116)
        at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
        at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:82)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
        at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
        at java.lang.Thread.run(Thread.java:745)
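
For reference, the join is set up roughly like this (a minimal sketch; the table and column names are placeholders, not the actual job):

```scala
import org.apache.spark.sql.functions.broadcast

// smallDf is the ~200 MB dimension table, bigDf the large fact table.
// broadcast() hints Spark SQL to ship smallDf to every executor
// instead of shuffling both sides of the join.
val joined = bigDf.join(broadcast(smallDf), Seq("id"))
joined.count()
```

The broadcast side is serialized and transferred over Netty, which allocates direct (off-heap) buffers for the frames, so that is where the `Direct buffer memory` error surfaces.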

Can anyone tell me what's wrong and how to fix it?

Best Regards,
Kevin.

