spark-user mailing list archives

From Rishi Mishra <rmis...@snappydata.io>
Subject Re: SparkSQL API to insert DataFrame into a static partition?
Date Wed, 02 Dec 2015 08:59:58 GMT
As long as all your data is inserted through Spark, and hence uses the same
partitioner, what Fengdong suggested should work.
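
If you need a truly static partition (every row going into one fixed
partition value) rather than the dynamic partitioning shown below, one
workaround at the time is to route the insert through HiveQL. A minimal
sketch, assuming a Hive-backed table `my_table` partitioned by `dt`; the
table and DataFrame names here are illustrative, not from the thread:

```scala
// Sketch only: assumes a HiveContext and a Hive table `my_table`
// partitioned by column `dt`. Names are hypothetical.
import org.apache.spark.sql.hive.HiveContext

val sqlContext = new HiveContext(sc)
val df = sqlContext.table("staging_data")  // hypothetical source DataFrame

// Register the DataFrame as a temp table, then target one static
// partition explicitly in the INSERT statement:
df.registerTempTable("tmp_df")
sqlContext.sql(
  "INSERT OVERWRITE TABLE my_table PARTITION (dt='2012') SELECT * FROM tmp_df")
```

Because the partition value is fixed in the `PARTITION (dt='2012')` clause,
the DataFrame itself does not need to carry a `dt` column.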

On Wed, Dec 2, 2015 at 9:32 AM, Fengdong Yu <fengdongy@everstring.com>
wrote:

> Hi
> you can try:
>
> if your table is under location "/test/table/" on HDFS
> and has partitions:
>
>  "/test/table/dt=2012"
>  "/test/table/dt=2013"
>
> df.write.mode(SaveMode.Append).partitionBy("dt").save("/test/table")
>
>
>
> On Dec 2, 2015, at 10:50 AM, Isabelle Phan <nliphan@gmail.com> wrote:
>
> df.write.partitionBy("date").insertInto("my_table")
>
>
>


-- 
Regards,
Rishitesh Mishra,
SnappyData . (http://www.snappydata.io/)

https://in.linkedin.com/in/rishiteshmishra
