spark-user mailing list archives

From Isabelle Phan <>
Subject SparkSQL API to insert DataFrame into a static partition?
Date Wed, 02 Dec 2015 02:50:25 GMT

Is there any API to insert data into a single partition of a table?

Let's say I have a table with 2 columns (col_a, col_b), partitioned by date.
After doing some computation for a specific date, I have a DataFrame with 2
columns (col_a, col_b) which I would like to insert into that specific date
partition. What is the best way to achieve this?

It seems that if I add a date column to my DataFrame and turn on dynamic
partitioning, I can do:
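Something like the following sketch (table name "my_table" is hypothetical; it assumes a Hive-backed table whose last column is the `date` partition column, and it needs a running Spark/Hive deployment):

```scala
import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.sql.functions.lit

val sqlContext = new HiveContext(sc)

// Enable Hive dynamic partitioning
sqlContext.setConf("hive.exec.dynamic.partition", "true")
sqlContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")

// df has columns (col_a, col_b); append a constant date column so the
// DataFrame's column order matches the table's (partition column last)
val withDate = df.withColumn("date", lit("2015-12-01"))

withDate.write.mode("append").insertInto("my_table")
```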
But it seems overkill to use dynamic partitioning function for such a case.
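The static-partition alternative I have in mind would be plain HiveQL through sqlContext.sql, along these lines (the table and temp-table names are hypothetical):

```scala
// Expose the computed DataFrame to SQL, then target the partition directly
df.registerTempTable("tmp_results")

sqlContext.sql(
  """INSERT OVERWRITE TABLE my_table PARTITION (date = '2015-12-01')
    |SELECT col_a, col_b FROM tmp_results""".stripMargin)
```

Is there a DataFrame-level API equivalent to this, rather than going through a SQL string?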

Thanks for any pointers!
