spark-user mailing list archives

From <jishnu.prat...@wipro.com>
Subject RE: Persist streams to text files
Date Fri, 21 Nov 2014 06:42:07 GMT
Hi Akhil,
            Thanks for the reply.
But that creates a different directory for each batch. I also tried using a FileWriter, but it fails with a non-serializable error.
val stream = TwitterUtils.createStream(ssc, None) //, filters)

    val statuses = stream.map(status =>
      sentimentAnalyzer.findSentiment(
        status.getText().replaceAll("[^A-Za-z0-9 \\#]", "")))

    statuses.foreachRDD { rdd =>
      rdd.foreach { tweetWithSentiment =>
        // Currently I print to the console, but I need to write this to a file on the local machine.
        if (!tweetWithSentiment.getLine().isEmpty())
          println(tweetWithSentiment.getCssClass() + " for line :=>  " + tweetWithSentiment.getLine())
      }
    }
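
One way to get those lines into a single file on the local machine is to do the writing inside foreachRDD, which runs on the driver, so no writer object ever has to be serialized. A minimal sketch, assuming each batch is small enough to collect to the driver (the /tmp path is only a placeholder):

    statuses.foreachRDD { rdd =>
      // foreachRDD runs this block on the driver for every batch.
      val batch = rdd.collect() // only safe if a batch comfortably fits in driver memory
      if (batch.nonEmpty) {
        // Open the writer inside the block so it is never captured in a closure
        // that Spark ships to the executors; append so batches accumulate in one file.
        val writer = new java.io.PrintWriter(new java.io.FileWriter("/tmp/tweet-sentiments.txt", true))
        try {
          batch.foreach { tweetWithSentiment =>
            if (!tweetWithSentiment.getLine().isEmpty())
              writer.println(tweetWithSentiment.getCssClass() + " for line :=> " + tweetWithSentiment.getLine())
          }
        } finally {
          writer.close()
        }
      }
    }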

Thanks & Regards
Jishnu Menath Prathap
From: Akhil Das [mailto:akhil@sigmoidanalytics.com]
Sent: Friday, November 21, 2014 11:48 AM
To: Jishnu Menath Prathap (WT01 - BAS)
Cc: user@spark.incubator.apache.org
Subject: Re: Persist streams to text files


To get a single text file output for each batch, you can repartition the stream to 1 partition and then call saveAsTextFiles:

stream.repartition(1).saveAsTextFiles(location)
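
For reference, saveAsTextFiles takes a prefix and an optional suffix, and writes one output directory per batch, named from the prefix, the batch time, and the suffix; with repartition(1) each of those directories holds a single part file. A small example (the /tmp prefix is just illustrative):

    // Each batch interval produces a directory like /tmp/tweets-<batch time in ms>.txt
    // containing one part-00000 file, because the stream was repartitioned to 1.
    stream.repartition(1).saveAsTextFiles("/tmp/tweets", "txt")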
On 21 Nov 2014 11:28, <jishnu.prathap@wipro.com> wrote:
Hi, I am also having a similar problem. Any fix suggested?

Originally Posted by GaganBM
Hi,

I am trying to persist the DStreams to text files. When I use the built-in API 'saveAsTextFiles' as:

stream.saveAsTextFiles(resultDirectory)

this creates a number of subdirectories, one for each batch, and within each subdirectory it creates a bunch of text files for each RDD (I assume).

I am wondering if I can have a single text file for each batch. Is there an API for that?
Or else, a single output file for the entire stream?

I tried to manually write each RDD of the stream to a text file as:

stream.foreachRDD(rdd => {
  rdd.foreach(element => {
    fileWriter.write(element)
  })
})

where 'fileWriter' simply uses a Java BufferedWriter to write strings to a file. However, this fails with the exception:

DStreamCheckpointData.writeObject used
java.io.BufferedWriter
java.io.NotSerializableException: java.io.BufferedWriter
        at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1183)
        at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1547)
        .....

Any help on how to proceed with this?
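
The exception happens because fileWriter is created on the driver but referenced inside rdd.foreach, so Spark tries to serialize it when shipping that closure to the executors, and BufferedWriter is not serializable. One way around it is to create the writer inside the closure itself, for example once per partition. A rough sketch, noting that this writes one file per partition on each executor's local disk rather than a single file, and that the /tmp path is just an example:

stream.foreachRDD { rdd =>
  rdd.foreachPartition { elements =>
    // The writer is constructed here, on the executor, so nothing
    // non-serializable is captured from the driver.
    val file = java.io.File.createTempFile("stream-output-", ".txt", new java.io.File("/tmp"))
    val writer = new java.io.PrintWriter(file)
    try {
      elements.foreach(element => writer.println(element))
    } finally {
      writer.close()
    }
  }
}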

