spark-user mailing list archives

From "majian" <>
Subject Shark Tasks in parallel execution
Date Wed, 18 Jun 2014 09:36:47 GMT

I am confused: why does a Shark window that is not executing any queries still hold on to cluster resources? And while that window stays open, how can I execute multiple queries in parallel?

For example:

Spark config:
        total nodes: 3
        total cores: 9
Shark config:
        SPARK_JAVA_OPTS+="-Dspark.cores.max=9 "
        SPARK_JAVA_OPTS+="-Dspark.scheduler.mode=FAIR "
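For context: in Spark standalone mode, spark.cores.max caps the cores one application (one SparkContext) may claim, and FAIR mode schedules jobs fairly only *within* a single application. With the settings above, the first Shark window's SparkContext holds all 9 cores, so a second window (a separate application) queues until they are released. A sketch of one possible split, giving each of two concurrent Shark instances its own share (the value 4 is illustrative, not a recommendation):

```shell
# Hypothetical per-instance config: cap each Shark application at 4 of
# the cluster's 9 cores so two instances can hold executors at once.
SPARK_JAVA_OPTS+="-Dspark.cores.max=4 "         # per-application core cap (assumed value)
SPARK_JAVA_OPTS+="-Dspark.scheduler.mode=FAIR " # fair sharing among jobs *inside* this application
export SPARK_JAVA_OPTS
```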

As things stand I can only open one window to execute queries, and if I set spark.cores.max to less than 9 I think I would not be taking full advantage of the cluster.
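One alternative worth noting (an assumption about the setup, not something confirmed in this thread): keep spark.cores.max=9 on a single long-running Shark instance and submit all queries through it, relying on Spark's fair scheduler pools (configured via spark.scheduler.allocation.file) so that concurrent queries inside that one application share the cores instead of running FIFO. A sketch of such an allocation file; the pool names, weights, and minShare values are purely illustrative:

```xml
<!-- fairscheduler.xml: hypothetical pool layout for one shared Shark instance -->
<allocations>
  <pool name="adhoc">
    <schedulingMode>FAIR</schedulingMode>
    <weight>1</weight>
    <minShare>2</minShare>
  </pool>
  <pool name="reports">
    <schedulingMode>FAIR</schedulingMode>
    <weight>2</weight>
    <minShare>2</minShare>
  </pool>
</allocations>
```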

Thank you for your help !!!
