spark-user mailing list archives

From Hyukjin Kwon <gurwls...@gmail.com>
Subject Re: can spark-csv package accept strings instead of files?
Date Fri, 15 Apr 2016 15:02:34 GMT
I hope it was not too late :).

It is possible.

Please check the csvRdd API here:
https://github.com/databricks/spark-csv/blob/master/src/main/scala/com/databricks/spark/csv/CsvParser.scala#L150
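
For illustration, here is a minimal sketch of that approach: building an RDD of CSV lines in memory and handing it to `CsvParser.csvRdd`, which parses the strings into a DataFrame without ever touching a file. The builder options shown (`withUseHeader`, `withDelimiter`) are taken from the linked source, but treat the exact names and defaults as assumptions for whichever spark-csv version you run.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import com.databricks.spark.csv.CsvParser

object CsvRddExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setMaster("local[*]").setAppName("csvRdd-example"))
    val sqlContext = new SQLContext(sc)

    // An RDD whose elements are CSV strings, the first being the header row.
    val lines = sc.parallelize(Seq(
      "id,name",
      "1,Alice",
      "2,Bob"))

    // csvRdd(sqlContext, rdd) parses the string RDD into a DataFrame.
    val df = new CsvParser()
      .withUseHeader(true)   // treat the first row as column names
      .withDelimiter(',')
      .csvRdd(sqlContext, lines)

    df.show()
    sc.stop()
  }
}
```

From there the DataFrame can be registered or written out toward Hive/HBase as the original question describes.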

Thanks!
On 2 Apr 2016 2:47 a.m., "Benjamin Kim" <bbuild11@gmail.com> wrote:

> Does anyone know if this is possible? I have an RDD loaded with rows of
> CSV data strings. Each string contains a header row and multiple rows of
> data, along with delimiters. I would like to feed each through a CSV parser
> to convert the data into a DataFrame and, ultimately, UPSERT a Hive/HBase
> table with this data.
>
> Please let me know if you have any ideas.
>
> Thanks,
> Ben
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> For additional commands, e-mail: user-help@spark.apache.org
>
>
