spark-user mailing list archives

From Nishant Patel <nishant.k.pa...@gmail.com>
Subject java.io.NotSerializableException: org.apache.spark.streaming.StreamingContext
Date Fri, 23 Jan 2015 10:59:41 GMT
Below is the code I have written. I am getting a NotSerializableException. How
can I handle this scenario?

kafkaStream.foreachRDD(rdd => {
  println("<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<")
  rdd.foreachPartition(partitionOfRecords => {
    partitionOfRecords.foreach(record => {

      // Write for CSV.
      if (true == true) {

        val structType = table.schema
        // ssc (the StreamingContext) is referenced inside this closure.
        val csvFile = ssc.sparkContext.textFile(record.toString())

        val rowRDD = csvFile.map(x =>
          getMappedRowFromCsvRecord(structType, x))

      }
    })
  })
})
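For context: the closure passed to `rdd.foreachPartition` is serialized and shipped to the executors, and because it references `ssc`, Spark tries to serialize the StreamingContext itself, which is not serializable. A common restructuring is to keep all SparkContext usage on the driver. The sketch below assumes the Kafka records are file paths and that there are few enough per batch to `collect()` to the driver; `table` and `getMappedRowFromCsvRecord` are the same names used above:

```scala
kafkaStream.foreachRDD(rdd => {
  // foreachRDD's body runs on the driver, so ssc is safe to use here.
  // collect() pulls the (assumed small) set of file paths back to the driver;
  // no closure containing ssc is ever shipped to an executor.
  rdd.collect().foreach { record =>
    val structType = table.schema
    val csvFile = ssc.sparkContext.textFile(record.toString())
    val rowRDD = csvFile.map(x => getMappedRowFromCsvRecord(structType, x))
    // ... use rowRDD here, e.g. build a DataFrame with structType
  }
})
```

The key distinction is that `foreachRDD` itself executes on the driver, while `foreachPartition`/`foreach` on an RDD execute on the executors; only the latter require their closures (and everything they capture) to be serializable.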

-- 
Regards,
Nishant
