spark-user mailing list archives

From hequn8128 <>
Subject Spark properties setting doesn't take effect
Date Sat, 01 Mar 2014 09:30:43 GMT
I wrote a standalone-cluster app in Scala, and I set some properties before creating the SparkContext:

System.setProperty("spark.akka.frameSize", "100") 
System.setProperty("spark.executor.memory", "3g") 
val sc = new SparkContext(...) 
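For what it's worth, I understand the SparkConf API (added in Spark 0.9) is now the recommended way to set these properties, though I have not confirmed whether it changes the behavior here. A minimal sketch (the master URL and app name below are placeholders, not my real values):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Build the configuration explicitly instead of relying on
// JVM system properties being read at SparkContext creation time.
val conf = new SparkConf()
  .setMaster("spark://Master.Hadoop:7077") // placeholder master URL
  .setAppName("MyApp")                     // placeholder app name
  .set("spark.akka.frameSize", "100")
  .set("spark.executor.memory", "3g")

val sc = new SparkContext(conf)
```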

Strangely, "spark.executor.memory" takes effect, but "spark.akka.frameSize" does not.
Here is the executor launch command shown in the Spark UI:
Spark Executor Command: "/usr/java/jdk1.6.0_31/bin/java" "-cp"
"-Xms3072M" "-Xmx3072M"
"akka.tcp://spark@Master.Hadoop:55755/user/CoarseGrainedScheduler" "8"
"Salve2.Hadoop" "8" "akka.tcp://sparkWorker@Salve2.Hadoop:56145/user/Worker"

Because "spark.akka.frameSize" defaults to 10, my app keeps showing "Lost
I build and run the app with sbt package and sbt run.
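One thing I have not tried yet is setting the option through SPARK_JAVA_OPTS in conf/spark-env.sh on each worker, which I understand (possibly wrongly) passes JVM options on to the executor processes. Something like:

```shell
# conf/spark-env.sh on each worker node (sketch, untested)
# Raise the Akka frame size for executor JVMs as well as the driver.
export SPARK_JAVA_OPTS="-Dspark.akka.frameSize=100"
```

I am not sure whether this is needed on top of the driver-side setting, or instead of it.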

So, what is the reason, and how can I make "spark.akka.frameSize" take effect?
Thank you very much!
