One way is to use Spark SQL. Note that ORC support comes through Hive, so sqlContext needs to be a HiveContext (which it is by default in a spark-shell built with Hive support):
scala> sqlContext.sql("CREATE TABLE orc_table (key INT, value STRING) STORED AS ORC")
scala> sqlContext.sql("INSERT INTO TABLE orc_table SELECT * FROM schema_rdd_temp_table")
scala> sqlContext.sql("SELECT * FROM orc_table")
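For completeness, a minimal end-to-end spark-shell sketch for Spark 1.2 might look like the following. The table name `schema_rdd_temp_table` comes from the snippet above; the `Person` case class and sample data are purely illustrative, and this assumes a Hive-enabled build so that `sqlContext` is a HiveContext:

```scala
scala> // Implicit conversion RDD[case class] -> SchemaRDD (Spark 1.2 API)
scala> import sqlContext.createSchemaRDD

scala> case class Person(key: Int, value: String)   // illustrative schema

scala> val rdd = sc.parallelize(Seq(Person(1, "a"), Person(2, "b")))

scala> // Register the RDD so it can be referenced from SQL
scala> rdd.registerTempTable("schema_rdd_temp_table")

scala> sqlContext.sql("CREATE TABLE orc_table (key INT, value STRING) STORED AS ORC")

scala> sqlContext.sql("INSERT INTO TABLE orc_table SELECT key, value FROM schema_rdd_temp_table")

scala> // Read the ORC-backed table back
scala> sqlContext.sql("SELECT * FROM orc_table").collect().foreach(println)
```

The data ends up in ORC files under the Hive warehouse directory for `orc_table`, so it can also be read by Hive itself or any other ORC-aware tool.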


On 4 January 2015 at 00:57, SamyaMaiti <samya.maiti2012@gmail.com> wrote:
Hi Experts,

Like saveAsParquetFile on SchemaRDD, is there an equivalent way to store data
in an ORC file?

I am using spark 1.2.0.

As per the JIRA below, it looks like this is not part of 1.2.0, so any update
on the latest status would be great.
https://issues.apache.org/jira/browse/SPARK-2883

Until the next release, is there a workaround to read/write ORC files?

Regards,
Sam



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/save-rdd-to-ORC-file-tp20956.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
