spark-user mailing list archives

From Eduardo Mello <eedu.me...@gmail.com>
Subject Re: Crash in Unit Tests
Date Fri, 29 Sep 2017 20:22:35 GMT
I had this problem at my work.

I solved it by increasing the Unix ulimit, because Spark was trying to open
too many files.
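For reference, a minimal sketch of inspecting and raising those limits in a
shell session. The specific values are examples only, and in my experience
the crash below ("unable to create new native thread") is usually governed
by the process limit (ulimit -u), with open files (ulimit -n) a secondary
suspect:

```shell
# Sketch: inspect and raise the per-user soft limits that Spark's test
# suite can exhaust. Values below are examples, not recommendations.

ulimit -n    # soft limit on open file descriptors
ulimit -u    # soft limit on user processes (each JVM thread counts here)

# Raise the soft limits for this shell and its children. The hard limit
# may cap these (hence the fallback); edit /etc/security/limits.conf
# for a permanent, system-wide change.
ulimit -S -n 65536 2>/dev/null || true
ulimit -S -u 32768 2>/dev/null || true
```

Note that ulimit changes apply only to the current shell and processes
started from it, so run the build from the same session after raising them.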

On Sep 29, 2017 at 5:05 PM, "Anthony Thomas" <ahthomas@eng.ucsd.edu>
wrote:

> Hi Spark Users,
>
> I recently compiled spark 2.2.0 from source on an EC2 m4.2xlarge instance
> (8 cores, 32G RAM) running Ubuntu 14.04. I'm using Oracle Java 1.8. I
> compiled using the command:
>
> export MAVEN_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=512m"
> ./build/mvn -DskipTests -Pnetlib-lgpl clean package
>
> Spark compiles fine, but when running the tests (./build/mvn test), at
> least one test in "StandaloneDynamicAllocationSuite" of "Spark Project
> Core" consistently causes the JVM to crash with errors like:
>
>  Exception in thread "ExecutorRunner for app-20170929185545-0000/0"
> java.lang.OutOfMemoryError: unable to create new native thread
>
> According to "cat /proc/sys/kernel/threads-max", the system can support
> up to 120,000 threads, which seems like it should be more than enough.
> The limits set by ulimit also seem reasonable. Looking at "top," the JVM
> doesn't seem to be using anywhere close to 32GB of RAM.
>
> Has anyone else encountered similar issues or have any suggestions about
> where to go in diagnosing the cause of this problem? Alternatively, is
> this a problem I can safely ignore? Running some short code segments on
> Spark seems to work just fine, but I'm wondering if this will become a
> problem at heavy loads. Please let me know if there's any other info that
> would be helpful.
>
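For anyone hitting the same crash: "unable to create new native thread"
means the JVM could not get a new OS thread from the kernel. That is
governed by the per-user process limit and native stack memory, not the
heap, so seeing plenty of free heap in "top" is expected. A sketch of the
figures worth comparing on Linux (assumes a ps that supports -L for
per-thread listing):

```shell
# Sketch: limits to compare when the JVM fails to create a native thread.
# Each thread also needs native stack memory (JVM -Xss, ~1 MB by default),
# which is allocated outside the heap shown in "top".

cat /proc/sys/kernel/threads-max   # kernel-wide thread cap
ulimit -u                          # per-user process/thread soft limit
ulimit -s                          # stack size per thread (KB)

# Threads currently in use by the current user:
ps -eLf | awk -v u="$USER" 'NR > 1 && $1 == u' | wc -l
```

If the per-user count is near "ulimit -u" during the test run, raising
that limit (or lowering the test parallelism) is the likely fix.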
