spark-user mailing list archives

From Cheng Lian <lian.cs....@gmail.com>
Subject Re: Can Spark benefit from Hive-like partitions?
Date Mon, 26 Jan 2015 18:55:54 GMT
Currently, no, if you don't want to use Spark SQL's HiveContext. But we're
working on adding partitioning support to the external data sources API,
with which you'll be able to create, for example, partitioned Parquet tables
without using Hive.
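
To give a flavour of what that should enable, here is a minimal sketch
against a plain SQLContext (no HiveContext), assuming a partitionBy-style
Parquet writer; the paths and column names are hypothetical:

// Assumes a spark-shell style environment where `sc` is the existing
// SparkContext, and that the JSON records carry `year` and `month` fields.
import org.apache.spark.sql.SQLContext

val sqlContext = new SQLContext(sc)

// Load some source data (JSON here, purely for illustration).
val events = sqlContext.read.json("/data/events.json")

// Write a Hive-style partitioned directory layout
// (/data/events_parquet/year=2015/month=1/...) in Parquet, without Hive.
events.write
  .partitionBy("year", "month")
  .parquet("/data/events_parquet")

// Partition discovery turns the directory names back into columns, so the
// filter below only needs to read the matching partitions.
val jan = sqlContext.read
  .parquet("/data/events_parquet")
  .filter("year = 2015 AND month = 1")
jan.show()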

Cheng

On 1/26/15 8:47 AM, Danny Yates wrote:
> Thanks Michael.
>
> I'm not actually using Hive at the moment - in fact, I'm trying to 
> avoid it if I can. I'm just wondering whether Spark has anything 
> similar I can leverage?
>
> Thanks


