spark-dev mailing list archives

From Patrick Woody <patrick.woo...@gmail.com>
Subject DataSourceV2 write input requirements
Date Mon, 26 Mar 2018 15:40:40 GMT
Hey all,

I saw in some of the discussions around DataSourceV2 writes that we might
have the data source inform Spark of requirements for the input data's
ordering and partitioning. Has there been a proposed API for that yet?
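For concreteness, here is a minimal sketch of what such a write-side hook could look like: an interface the data source implements so the planner can insert a shuffle and sort before the write. Every name here (`RequiresDistributionAndOrdering`, `requiredClustering`, `requiredOrdering`, `ExampleSink`) is a hypothetical illustration for discussion, not a proposed or existing Spark API.

```java
import java.util.Arrays;
import java.util.List;

public class WriteRequirementsSketch {

    // Hypothetical hook: a write-side data source declares the clustering
    // (distribution across tasks) and per-partition ordering it needs
    // from its input rows. Names are illustrative assumptions only.
    interface RequiresDistributionAndOrdering {
        List<String> requiredClustering(); // columns Spark would shuffle by
        List<String> requiredOrdering();   // columns Spark would sort by within each task
    }

    // A toy sink that wants rows clustered by "date" and sorted by "id".
    static class ExampleSink implements RequiresDistributionAndOrdering {
        public List<String> requiredClustering() { return Arrays.asList("date"); }
        public List<String> requiredOrdering()   { return Arrays.asList("id"); }
    }

    public static void main(String[] args) {
        RequiresDistributionAndOrdering sink = new ExampleSink();
        // A planner could read these requirements and add an exchange
        // (shuffle) plus a sort operator in front of the write node.
        System.out.println("cluster by: " + sink.requiredClustering());
        System.out.println("sort by: " + sink.requiredOrdering());
    }
}
```

Under this kind of design, the source only *declares* its requirements and Spark's planner, not the source, decides how to satisfy them.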

Even one level up, it would be helpful to understand how I should think about
the responsibility of the data source writer, when I should insert a custom
Catalyst rule, and how I should handle validation of, and assumptions about,
the table before attempting the write.

Thanks!
Pat
