spark-user mailing list archives

From Gourav Sengupta <gourav.sengu...@gmail.com>
Subject Re: Splitting columns from a text file
Date Mon, 05 Sep 2016 22:21:54 GMT
Just use Spark CSV; all other ways of splitting and parsing are just reinventing the wheel and a monumental waste of time.
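For what it's worth, the error in the quoted question comes from calling `.split(",")` on the RDD itself rather than on each line inside `map`. A minimal sketch of the per-element split, using a plain Scala collection so it runs without a Spark cluster (the same `.map(_.split(","))` call applies unchanged to an `RDD[String]`; the sample lines are taken from the question):

```scala
object SplitExample {
  // Sample lines from the question's /tmp/mytextfile.txt
  val lines: Seq[String] = Seq(
    "74,20160905-133143,98.11218069128827594148",
    "75,20160905-133143,49.52776998815916807742"
  )

  // split(",") must run per element, inside map, not on the collection
  // (or RDD) itself -- that is what the compile error was complaining about.
  val parts: Seq[Array[String]] = lines.map(_.split(","))

  def main(args: Array[String]): Unit = {
    parts.foreach(p => println(p.mkString(" | ")))
  }
}
```

With Spark CSV (built into Spark 2.x as `spark.read.csv(path)`), this per-line splitting is done for you and each line becomes a three-column row directly.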


Regards,
Gourav

On Mon, Sep 5, 2016 at 1:48 PM, Ashok Kumar <ashok34668@yahoo.com.invalid>
wrote:

> Hi,
>
> I have a text file as below that I read in
>
> 74,20160905-133143,98.11218069128827594148
> 75,20160905-133143,49.52776998815916807742
> 76,20160905-133143,56.08029957123980984556
> 77,20160905-133143,46.63689526544407522777
> 78,20160905-133143,84.88227141164402181551
> 79,20160905-133143,68.72408602520662115000
>
> val textFile = sc.textFile("/tmp/mytextfile.txt")
>
> Now I want to split each row on ","
>
> scala> textFile.map(x=>x.toString).split(",")
> <console>:27: error: value split is not a member of
> org.apache.spark.rdd.RDD[String]
>        textFile.map(x=>x.toString).split(",")
>
> However, the above throws an error.
>
> Any ideas what is wrong, or how I can do this while avoiding the conversion
> to String?
>
> Thanks
>
>
