From: manasdebashiskar <>
Subject: Re: Problem using limit clause in spark sql
Date: Fri, 25 Dec 2015 15:10:41 GMT
It can be easily done using an RDD.

rdd.zipWithIndex.partitionBy(new YourCustomPartitioner) should give you your
desired result. Here YourCustomPartitioner will know how to pick sample items
from each partition.
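
A minimal sketch of that approach (the IndexPartitioner class, the
round-robin placement, and the toy data are illustrative assumptions, not
from the original message). One detail worth noting: zipWithIndex puts the
index in the value slot of each pair, while partitionBy partitions by key,
so the pair has to be swapped first:

import org.apache.spark.{Partitioner, SparkConf, SparkContext}

object ZipWithIndexDemo {

  // Hypothetical custom partitioner: routes each row to a partition by its
  // global index, round-robin, so every partition receives a spread of rows.
  class IndexPartitioner(override val numPartitions: Int) extends Partitioner {
    def getPartition(key: Any): Int =
      (key.asInstanceOf[Long] % numPartitions).toInt
  }

  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("zip-with-index-demo").setMaster("local[*]"))

    val rdd = sc.parallelize(1 to 100)

    // zipWithIndex yields (element, index); swap so the index becomes the
    // key, because partitionBy partitions a pair RDD by its keys.
    val byIndex = rdd.zipWithIndex().map(_.swap)

    val spread = byIndex.partitionBy(new IndexPartitioner(4)).values

    // Print each partition's contents to verify the distribution.
    spread.glom().collect().foreach(p => println(p.mkString(", ")))

    sc.stop()
  }
}

Any sampling policy can live inside getPartition, as long as it maps an
index deterministically to a partition id.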

If you want to stick to DataFrames, you can always repartition the data after
you apply the limit.
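
A sketch of that route as well, against the Spark 1.x SQLContext API that was
current when this was written (the column names, row counts, and partition
counts are made-up illustration):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object LimitRepartitionDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("limit-repartition-demo").setMaster("local[*]"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._

    val df = sc.parallelize(1 to 1000).map(i => (i, s"row$i")).toDF("id", "value")

    // limit tends to collapse the result into very few partitions, so
    // repartition afterwards to restore parallelism for downstream stages.
    val limited = df.limit(100).repartition(8)

    println(limited.rdd.partitions.length)  // expect 8

    sc.stop()
  }
}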

