spark-user mailing list archives

From Cheng Lian <>
Subject Re: Can Spark benefit from Hive-like partitions?
Date Mon, 26 Jan 2015 18:55:54 GMT
Currently, no, if you don't want to use Spark SQL's HiveContext. But we're
working on adding partitioning support to the external data sources API, 
with which you can create, for example, partitioned Parquet tables 
without using Hive.
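The partitioning scheme referred to here is Hive's directory-based layout: each value of a partition column becomes a `key=value` subdirectory, so a query that filters on that column only needs to touch the matching directories (partition pruning). Below is a minimal, self-contained sketch of that layout in plain Python; it is an illustration of the idea, not Spark's actual implementation, and the column and file names are hypothetical.

```python
import pathlib
import tempfile

def write_partitioned(base, records, part_col):
    """Write each record under a key=value subdirectory keyed by part_col."""
    for i, rec in enumerate(records):
        part_dir = pathlib.Path(base) / f"{part_col}={rec[part_col]}"
        part_dir.mkdir(parents=True, exist_ok=True)
        (part_dir / f"part-{i:05d}.txt").write_text(repr(rec))

def read_partition(base, part_col, value):
    """Partition pruning: list only the directory for the requested value."""
    part_dir = pathlib.Path(base) / f"{part_col}={value}"
    if not part_dir.is_dir():
        return []
    return sorted(p.name for p in part_dir.iterdir())

base = tempfile.mkdtemp()
write_partitioned(base, [
    {"date": "2015-01-25", "event": "click"},
    {"date": "2015-01-26", "event": "view"},
], "date")

# Only the date=2015-01-26 directory is scanned; the other partition is skipped.
print(read_partition(base, "date", "2015-01-26"))
```

A reader on a later Spark version would get this behaviour directly from the data sources API (e.g. writing partitioned Parquet via the DataFrame writer), which is the support described above as in progress at the time of this message.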


On 1/26/15 8:47 AM, Danny Yates wrote:
> Thanks Michael.
> I'm not actually using Hive at the moment - in fact, I'm trying to 
> avoid it if I can. I'm just wondering whether Spark has anything 
> similar I can leverage?
> Thanks
