spark-user mailing list archives

From lihu <>
Subject the spark worker assignment Question?
Date Thu, 02 Jan 2014 12:53:57 GMT
   I run Spark on a cluster with 20 machines, but when I start an
application from the spark-shell, only 4 machines do any work; the
others just sit idle, with no memory or CPU used. I observed this myself.

   I wondered whether the other machines might simply be busy, so I
checked them with the "top" and "free" commands, but that is not the case.

  *So I just wonder: why doesn't Spark assign work to all 20
machines? This is not good resource usage.*
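For context, one common cause of this behaviour (an assumption on my part, not something stated in this post) is that the application's core cap, `spark.cores.max`, is set low enough that the standalone master only needs a few workers to satisfy it. A minimal sketch of raising that cap when launching spark-shell, assuming a Spark 0.8/0.9-era standalone cluster with a hypothetical master at `spark://master:7077`:

```shell
# Sketch only; the master URL and the core count are assumptions.
# In this era of Spark, spark-shell reads the master from the MASTER
# environment variable and JVM system properties from SPARK_JAVA_OPTS.
export MASTER=spark://master:7077
# Allow this application to claim up to 80 cores across the cluster,
# instead of a smaller default that only a few workers can satisfy.
export SPARK_JAVA_OPTS="-Dspark.cores.max=80"
./spark-shell
```

Once inside the shell, `sc.getExecutorMemoryStatus` can be used to see which executors have actually registered with the driver.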
