spark-user mailing list archives

From Andrew A <andrew.a...@gmail.com>
Subject NotSerializableException in DStream.transform
Date Tue, 04 Oct 2016 10:29:07 GMT
Hi All!

I'm using Spark 1.6.1 and I'm trying to transform my DStream as follows:

myStream.transform { rdd =>
  val sqlContext = SQLContext.getOrCreate(rdd.sparkContext)
  import sqlContext.implicits._

  val j = rdd.toDS()
  j.map {
    case a => Some(...)
    case _ => None
  }.rdd
}

and I'm getting the following exception:

java.io.NotSerializableException: DStream checkpointing has been enabled
but the DStreams with their functions are not serializable
org.scalatest.Assertions$AssertionsHelper
Serialization stack:
    - object not serializable (class:
org.scalatest.Assertions$AssertionsHelper, value:
org.scalatest.Assertions$AssertionsHelper@522fc67)
    - field (class: org.scalatest.FlatSpec, name: assertionsHelper, type:
class org.scalatest.Assertions$AssertionsHelper)
......
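The serialization stack above suggests the transform closure is dragging in the enclosing test class (the FlatSpec with its non-serializable AssertionsHelper). The mechanism can be shown without Spark at all: a lambda that references a field of its enclosing instance captures the whole instance, and Java serialization then fails on it. This is a minimal plain-Scala sketch; MySpec, capturingFn, and cleanFn are hypothetical names standing in for the test suite and the closures inside transform.

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

// Hypothetical stand-in for a test class such as a ScalaTest FlatSpec:
// it is not Serializable, just like Assertions$AssertionsHelper in the stack trace.
class MySpec {
  val helper = new Object // non-serializable field, like assertionsHelper

  // References `helper`, so it captures `this` (the whole MySpec instance).
  def capturingFn: Int => Int = x => x + helper.hashCode

  // References nothing from the enclosing instance, so it captures nothing.
  def cleanFn: Int => Int = x => x + 1
}

object ClosureCaptureDemo {
  // Attempt Java serialization, the same check Spark performs on closures.
  def serializes(obj: AnyRef): Boolean =
    try {
      new ObjectOutputStream(new ByteArrayOutputStream).writeObject(obj)
      true
    } catch { case _: NotSerializableException => false }

  def main(args: Array[String]): Unit = {
    val spec = new MySpec
    println(serializes(spec.cleanFn))     // true: no outer reference captured
    println(serializes(spec.capturingFn)) // false: drags in the non-serializable MySpec
  }
}
```

If `.toDS()` (via `import sqlContext.implicits._`) causes an encoder or implicit defined in the enclosing class to be referenced, the closure behaves like `capturingFn` above and checkpointing's serializability check rejects it.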

But after removing .toDS() the code works:

myStream.transform { rdd =>
  rdd.map {
    case a => Some(...)
    case _ => None
  }
}

Can I use Datasets inside a DStream's transform?

Regards
