spark-user mailing list archives

From Andrew Ash <and...@andrewash.com>
Subject Re: Exception in serialization hangs saving-to-disk
Date Tue, 28 Jan 2014 17:55:24 GMT
Are you able to get a copy of the exception you refer to?


On Tue, Jan 28, 2014 at 2:26 AM, Ionized <ionized@gmail.com> wrote:

> I noticed that running the following code causes the process to hang
> forever waiting for the job to complete; the exception thrown during
> serialization never seems to propagate to the caller.
>
> Should a bug be filed on this?
>
> - Paul
>
>
>
> import java.io.IOException;
> import java.io.ObjectInputStream;
> import java.io.ObjectOutputStream;
> import java.io.Serializable;
> import java.util.ArrayList;
> import java.util.List;
>
> import org.apache.spark.api.java.JavaRDD;
> import org.apache.spark.api.java.JavaSparkContext;
>
> public class SparkSerializationTest {
>
>     public static void main(String[] args) {
>         JavaSparkContext context = new JavaSparkContext("local[3]", "test");
>         List<MyObject> list = new ArrayList<>();
>         list.add(new MyObject());
>         JavaRDD<MyObject> rdd = context.parallelize(list);
>         rdd.saveAsObjectFile("/tmp/sparkserializationtest");
>     }
>
>     private static final class MyObject implements Serializable {
>
>         private static final long serialVersionUID = 1L;
>
>         private void readObject(ObjectInputStream in)
>                 throws IOException, ClassNotFoundException {
>         }
>
>         private void writeObject(ObjectOutputStream out) throws IOException {
>             throw new RuntimeException();
>         }
>     }
> }
>
>
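The repro above relies on a custom `writeObject` that always throws. Outside Spark, plain Java serialization surfaces such an exception to the caller immediately, which suggests the hang is in how Spark's write path handles the failure rather than in the serialization itself. A minimal sketch of that check (the class name `PlainSerializationCheck` is hypothetical, not from the thread):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class PlainSerializationCheck {

    // Same shape as MyObject in the thread: writeObject always throws.
    private static final class MyObject implements Serializable {
        private static final long serialVersionUID = 1L;

        private void writeObject(ObjectOutputStream out) throws IOException {
            throw new RuntimeException("serialization failure");
        }
    }

    // Returns true if serializing MyObject throws, i.e. the exception
    // reaches the caller under plain java.io serialization.
    static boolean serializationThrows() {
        try (ObjectOutputStream out =
                 new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(new MyObject());
            return false;
        } catch (RuntimeException | IOException e) {
            return true;
        }
    }

    public static void main(String[] args) {
        System.out.println("exception propagated: " + serializationThrows());
    }
}
```

With plain `ObjectOutputStream`, the `RuntimeException` from `writeObject` is rethrown to the caller, so the check prints `exception propagated: true`; in the Spark job above, the same exception is raised inside a task and evidently never reaches the driver.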
