spark-user mailing list archives

From Sean Owen <so...@cloudera.com>
Subject Re: Do Spark executors restrict native heap vs JVM heap?
Date Mon, 03 Nov 2014 06:16:24 GMT
Yes, that's correct to my understanding, and it's the probable explanation of
your issue. Spark imposes no additional limits here; there is no difference
from how the JVM normally works.
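To make the distinction concrete, here is a minimal sketch (plain Java, not Spark-specific; the class name is illustrative) of what the `Runtime` heap counters mentioned later in the thread do and do not cover:

```java
// Minimal sketch: Runtime's counters describe only the JVM-managed heap
// (bounded by -Xmx). Allocations made by native code live outside it.
public class HeapVsNative {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long max = rt.maxMemory();     // heap ceiling, set by -Xmx
        long total = rt.totalMemory(); // heap currently reserved from the OS
        long free = rt.freeMemory();   // unused portion of 'total'
        System.out.printf("max=%dMB total=%dMB free=%dMB%n",
                max >> 20, total >> 20, free >> 20);
        // malloc() calls made from JNI/native code are invisible to these
        // numbers; native memory competes with the heap only for the
        // process's address space and the machine's physical memory.
    }
}
```

So a large `freeMemory()` says nothing about whether native `malloc()` can succeed on a box that is otherwise out of memory.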
On Nov 3, 2014 4:40 AM, "Paul Wais" <pwais@yelp.com> wrote:

> Thanks Sean! My novice understanding is that the 'native heap' is the
> address space not allocated to the JVM heap, but I wanted to check in case
> I was missing something. It turned out my issue was actual memory pressure
> on the executor machine: there was room for the JVM heap but not much more.
>
> On Thu, Oct 30, 2014 at 12:49 PM, Sean Owen <sowen@cloudera.com> wrote:
> > No, but the JVM also does not allocate memory for native code on the
> > heap. I don't think the heap has any bearing on whether your native code
> > can allocate more memory, except that the heap is of course also taking
> > memory.
> >
> > On Oct 30, 2014 6:43 PM, "Paul Wais" <pwais@yelp.com> wrote:
> >>
> >> Dear Spark List,
> >>
> >> I have a Spark app that runs native code inside map functions.  I've
> >> noticed that the native code sometimes sets errno to ENOMEM, indicating
> >> a lack of available memory.  However, I've verified that the /JVM/ has
> >> plenty of heap space available -- Runtime.getRuntime().freeMemory()
> >> shows gigabytes free, and the native code needs only megabytes.  Does
> >> Spark limit the /native/ heap size somehow?  I'm poking through the
> >> executor code now but don't see anything obvious.
> >>
> >> Best Regards,
> >> -Paul Wais
> >>
> >>
> >
>
>
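Paul's resolution above (memory pressure on the executor host rather than any Spark-imposed limit) can be checked from the shell. A rough Linux sketch; the comparison values are examples, not figures from the thread:

```shell
# Rough check (Linux) of whether the executor host has headroom
# beyond the JVM heap for native allocations.
free -m                          # machine-wide memory usage in MB
grep MemAvailable /proc/meminfo  # memory the kernel can still hand out
# Compare MemAvailable against the executor's -Xmx plus the native
# memory your map functions need; if their sum exceeds it, native
# allocations can fail with ENOMEM even while the JVM heap shows
# gigabytes free.
```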
