spark-user mailing list archives

From "deenar.toraskar" <>
Subject DataFrame saveAsTable - partitioned tables
Date Sun, 22 Mar 2015 07:19:38 GMT

I wanted to store DataFrames as partitioned Hive tables. Is there a way to
do this via the saveAsTable call? The set of options does not seem to be
sufficient:

saveAsTable(tableName: String, source: String, mode: SaveMode, options:
Map[String, String]): Unit
(Scala-specific) Creates a table from the contents of this DataFrame,
based on a given data source, a SaveMode specified by mode, and a set of
options.

Alternatively, is there a way to just create external Hive tables for data
that is already present on HDFS? Something similar to:

sc.sql("alter table results add partition (date = '20141111');")
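For the external-table part of the question, one workaround is to run the HiveQL DDL directly through a HiveContext rather than saveAsTable. A minimal sketch, assuming an existing SparkContext `sc`; the table name, columns, and HDFS paths below are hypothetical:

```scala
import org.apache.spark.sql.hive.HiveContext

// Assumes `sc` is an already-created SparkContext.
val hiveContext = new HiveContext(sc)

// Create an external, partitioned table over data already on HDFS.
// Schema and location are placeholders for illustration only.
hiveContext.sql("""
  CREATE EXTERNAL TABLE IF NOT EXISTS results (
    id STRING,
    value DOUBLE
  )
  PARTITIONED BY (date STRING)
  STORED AS PARQUET
  LOCATION 'hdfs:///data/results'
""")

// Register one partition, pointing it at its existing directory.
hiveContext.sql(
  "ALTER TABLE results ADD IF NOT EXISTS PARTITION (date = '20141111') " +
  "LOCATION 'hdfs:///data/results/date=20141111'")
```

As a side note, later Spark releases (1.4 and up) added a DataFrameWriter API with a partitionBy method, e.g. df.write.partitionBy("date").saveAsTable("results"), which covers the first part of the question without manual DDL.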


