spark-user mailing list archives

From 诺铁 <noty...@gmail.com>
Subject Re: SparkContext creation slow down unit tests
Date Tue, 16 Sep 2014 07:55:22 GMT
sorry for the disturbance, please ignore this mail....
in the end, I found it was slow because of a lack of memory on my machine.

sorry again.

On Tue, Sep 16, 2014 at 3:26 PM, 诺铁 <notyycn@gmail.com> wrote:

> I connected my sample project to a hosted CI service, and it only takes 3
> seconds to run there... while the same tests take 2 minutes on my MacBook
> Pro. So maybe this is a Mac OS specific problem?
>
> On Tue, Sep 16, 2014 at 3:06 PM, 诺铁 <notyycn@gmail.com> wrote:
>
>> hi,
>>
>> I am trying to write some unit tests, following the spark programming guide
>> <http://spark.apache.org/docs/latest/programming-guide.html#unit-testing>.
>> But I observed that the unit tests run very slowly (the code is just a
>> SparkPi), so I turned the log level to trace and looked through the log
>> output, and found that creation of the SparkContext seems to take most of
>> the time.
>>
>> There are some actions that take a lot of time:
>> 1. Starting Jetty, it seems:
>> 14:04:55.038 [ScalaTest-run-running-MySparkPiSpec] DEBUG
>> o.e.jetty.servlet.ServletHandler -
>> servletNameMap={org.apache.spark.ui.JettyUtils$$anon$1-672f11c2=org.apache.spark.ui.JettyUtils$$anon$1-672f11c2}
>> 14:05:25.121 [ScalaTest-run-running-MySparkPiSpec] DEBUG
>> o.e.jetty.util.component.Container - Container
>> org.eclipse.jetty.server.Server@e3cee7b +
>> SelectChannelConnector@0.0.0.0:4040 as connector
>>
>> 2. I don't know what this is:
>> 14:05:25.202 [ScalaTest-run-running-MySparkPiSpec] DEBUG
>> org.apache.hadoop.security.Groups - Group mapping
>> impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping;
>> cacheTimeout=300000
>> 14:05:54.594 [spark-akka.actor.default-dispatcher-4] TRACE
>> o.a.s.s.BlockManagerMasterActor - Checking for hosts with no recent heart
>> beats in BlockManagerMaster.
>>
>>
>> Is there any way to make this faster for unit tests?
>> I also noticed that Spark's own unit tests use a
>> SharedSparkContext
>> <https://github.com/apache/spark/blob/master/core/src/test/scala/org/apache/spark/SharedSparkContext.scala>.
>> Is this the suggested way to work around this problem? If so, I would
>> suggest documenting it in the programming guide.
>>
>
>
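[Editorial note on the Jetty startup in the log above: that 30-second gap is the Spark web UI coming up. Spark has a `spark.ui.enabled` configuration (documented in the Spark configuration guide; check that your version supports it) that, when set to `false`, skips starting the Jetty UI entirely, which is a common way to shave SparkContext startup time in tests. A minimal sketch of the relevant settings, shown as a plain key/value map so it runs without a Spark dependency; with Spark on the classpath you would pass them via `new SparkConf().setMaster("local[2]").set("spark.ui.enabled", "false")`:]

```scala
// Settings typically applied to a SparkConf for faster test startup.
// Shown as a plain Map so this sketch compiles without Spark itself.
object TestConf {
  val fastTestSettings: Map[String, String] = Map(
    "spark.master"     -> "local[2]", // in-process scheduler, 2 worker threads
    "spark.ui.enabled" -> "false"     // don't start the Jetty web UI at all
  )
}

object ConfDemo {
  def main(args: Array[String]): Unit =
    // Print the settings in sorted order, one key=value per line.
    TestConf.fastTestSettings.toSeq.sorted.foreach { case (k, v) =>
      println(s"$k=$v")
    }
}
```

With the UI disabled, the `SelectChannelConnector@0.0.0.0:4040` startup step from the log simply never happens.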
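[Editorial note on the SharedSparkContext question: that trait applies the share-one-fixture-per-suite pattern — create the SparkContext once in `beforeAll`, reuse it across every test in the suite, and stop it in `afterAll`, so the expensive startup is paid once instead of per test. A dependency-free sketch of the pattern, where `ExpensiveContext` is a hypothetical stand-in for SparkContext and the `beforeAll`/`afterAll` hooks stand in for ScalaTest's `BeforeAndAfterAll`, so it runs without Spark or ScalaTest on the classpath:]

```scala
// Stand-in for SparkContext: costly to build, must be stopped when done.
class ExpensiveContext {
  var stopped = false
  def runJob(xs: Seq[Int]): Int = xs.sum   // stand-in for running an RDD job
  def stop(): Unit = { stopped = true }
}

// The SharedSparkContext idea: the suite mixes this in, and every test
// goes through ctx instead of building its own context.
trait SharedContext {
  private var _ctx: ExpensiveContext = _
  def ctx: ExpensiveContext = _ctx
  def beforeAll(): Unit = { _ctx = new ExpensiveContext } // once per suite
  def afterAll(): Unit = { _ctx.stop() }                  // once per suite
}

object SharedContextDemo extends SharedContext {
  def main(args: Array[String]): Unit = {
    beforeAll()
    val a = ctx.runJob(1 to 3)  // both "tests" reuse the same context
    val b = ctx.runJob(4 to 6)
    afterAll()
    println(s"sums=$a,$b stopped=${ctx.stopped}")
  }
}
```

The trade-off is the usual one for shared fixtures: tests must not depend on context state left behind by other tests, since they no longer get a fresh one each.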
