One way is to use Spark SQL. ORC is a Hive storage format, so these statements need a HiveContext rather than a plain SQLContext:
scala> sqlContext.sql("create table orc_table (key INT, value STRING) stored as orc")
scala> sqlContext.sql("insert into table orc_table select * from schema_rdd_temp_table")
scala> sqlContext.sql("select * from orc_table")
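Putting the pieces together, a sketch of the whole workaround on Spark 1.2 might look like the following. It assumes a spark-shell built with Hive support, with `sc` as the SparkContext; the case class, table names, and sample data are illustrative, not from the original thread.

```scala
import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)
// Brings in the implicit conversion from an RDD of case classes to a SchemaRDD.
import hiveContext._

case class Record(key: Int, value: String)

// Build a SchemaRDD and expose it to SQL as a temporary table.
val rdd = sc.parallelize(Seq(Record(1, "a"), Record(2, "b")))
rdd.registerTempTable("schema_rdd_temp_table")

// Write: route the data through a Hive table stored as ORC.
hiveContext.sql("create table orc_table (key INT, value STRING) stored as orc")
hiveContext.sql("insert into table orc_table select * from schema_rdd_temp_table")

// Read back: the result of the query is a SchemaRDD like any other.
val orcData = hiveContext.sql("select * from orc_table")
orcData.collect().foreach(println)
```

The data lands wherever the Hive warehouse directory points, as ORC files managed by the Hive metastore, so it can also be read by Hive itself or any other ORC-aware tool.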

On 4 January 2015 at 00:57, SamyaMaiti <> wrote:
Hi Experts,

Like saveAsParquetFile on SchemaRDD, is there an equivalent to store in ORC format?

I am using spark 1.2.0.

As per the link below, it looks like it's not part of 1.2.0, so any update on the latest status would be great.

Until the next release, is there a workaround to read/write ORC files?


Sent from the Apache Spark User List mailing list archive.