spark-user mailing list archives

From rajnish <>
Subject How does Spark calculate partition size automatically?
Date Mon, 12 Jan 2015 22:33:23 GMT

When I run a job that loads data from Cassandra, Spark creates almost 9 million
partitions. How does Spark decide the partition count? I have read in one of
the presentations that it is good to have 1,000 to 10,000 partitions.
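For context, a sketch of how one might inspect and reduce the partition count when reading from Cassandra. This assumes the DataStax spark-cassandra-connector; the keyspace and table names ("ks", "tab") are placeholders, and the `spark.cassandra.input.split.size` property name should be verified against your connector version:

```scala
// Sketch only; assumes the DataStax spark-cassandra-connector is on the classpath.
import com.datastax.spark.connector._
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("PartitionCount")
  // The connector groups Cassandra token ranges into Spark partitions;
  // raising the rows-per-split target should yield fewer, larger partitions.
  // (Property name is an assumption; check your connector's documentation.)
  .set("spark.cassandra.input.split.size", "1000000")
val sc = new SparkContext(conf)

// "ks" and "tab" are hypothetical keyspace/table names.
val rdd = sc.cassandraTable("ks", "tab")
println(rdd.partitions.size)  // inspect how many partitions were created

// Alternatively, collapse an already-loaded RDD without a full shuffle:
val smaller = rdd.coalesce(5000)
```

`coalesce` merges existing partitions without shuffling, so it is a cheap way to bring a huge partition count down into the recommended 1,000 to 10,000 range after the load.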

