spark-user mailing list archives

From "majian" <maj...@nq.com>
Subject Shark Tasks in parallel execution
Date Wed, 18 Jun 2014 09:36:47 GMT
Hi all,

I'm confused: why does a Shark window that isn't executing any queries still hold cluster resources? And while that Shark window stays open, how can I execute multiple queries in parallel?

For example:
Spark config:
  total nodes: 3
  total cores: 9
Shark config:
  SPARK_JAVA_OPTS+="-Dspark.scheduler.allocation.file=/opt/spark-0.9.1-bin-hadoop1/conf/fairscheduler.xml "
  SPARK_JAVA_OPTS+="-Dspark.cores.max=9 "
  SPARK_JAVA_OPTS+="-Dspark.scheduler.mode=FAIR "
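For reference, the allocation file pointed to by spark.scheduler.allocation.file defines the pools that FAIR mode schedules between. A minimal sketch might look like this (the pool names "adhoc" and "reports" are hypothetical examples, not from my setup):

```xml
<?xml version="1.0"?>
<!-- Sketch of a fair-scheduler allocation file; pool names are
     hypothetical. Each pool gets a scheduling mode, a relative
     weight, and a minimum share of cores. -->
<allocations>
  <pool name="adhoc">
    <schedulingMode>FAIR</schedulingMode>
    <weight>1</weight>
    <minShare>2</minShare>
  </pool>
  <pool name="reports">
    <schedulingMode>FAIR</schedulingMode>
    <weight>2</weight>
    <minShare>3</minShare>
  </pool>
</allocations>
```

As I understand it, a query is then assigned to a pool by setting the spark.scheduler.pool property before running it; queries in different pools should be scheduled fairly against each other within the same application.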

I can only open one window to execute queries, and if I set spark.cores.max to less than 9, I think the cluster is not being fully utilized.


Thank you for your help !!!

2014-06-18 



majian 
