spark-user mailing list archives

From: Silvio Fiorito
Subject: Re: Reading custom inputformat from hadoop dfs
Date: Mon, 28 Oct 2013 18:26:55 GMT
I was having the same problem trying to read from HCatalog with the Scala API. My way around it
was to create a wrapper InputFormat in Java that uses Spark's SerializableWritable.

I hacked this up Friday afternoon, tested it a few times, and it seemed to work well.

Here's an example:
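The example itself didn't survive in the archive. A minimal sketch of the wrapper idea might look like the following; note that `Writable`, `InputFormat`, and `SerializableWritable` below are illustrative stand-ins, not the real `org.apache.hadoop.mapred` / `org.apache.spark` classes, and the record-reading interface is heavily simplified:

```scala
// Stand-ins for the real Hadoop/Spark types -- illustrative substitutes only.
trait Writable { def value: String }
case class Text(value: String) extends Writable

// Stand-in for org.apache.hadoop.mapred.InputFormat, reduced to record access.
trait InputFormat[K, V] { def records: Seq[(K, V)] }

// Stand-in for Spark's SerializableWritable: a Serializable box for a Writable.
class SerializableWritable[T <: Writable](val t: T) extends java.io.Serializable

// The wrapper format: delegates to the underlying format but emits values
// already boxed in SerializableWritable, so the records Spark sees carry
// concrete, serializable type parameters instead of raw Writable types.
class SerializableInputFormat[K, V <: Writable](underlying: InputFormat[K, V])
    extends InputFormat[K, SerializableWritable[V]] {
  def records: Seq[(K, SerializableWritable[V])] =
    underlying.records.map { case (k, v) => (k, new SerializableWritable(v)) }
}

// Usage: wrap a raw format, then read the boxed values back out.
val raw = new InputFormat[String, Text] {
  def records = Seq(("key1", Text("hello")))
}
val wrapped = new SerializableInputFormat(raw)
val firstValue = wrapped.records.head._2.t.value
```

The real wrapper would of course implement `getSplits` and `getRecordReader` by delegation rather than a `records` method, but the boxing step is the same.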

I'm new to Spark and Scala so this may not be the "right way", but it worked for me! :)

From: Arun Kumar
Date: Monday, October 28, 2013 4:52 AM
Subject: Reading custom inputformat from hadoop dfs


I am trying to read a custom sequence file from the Hadoop file system. The CustomInputFormat
class implements InputFormat<WritableComparable, Writable>. I am able to read the file
into a JavaRDD as follows:

JobConf job = new JobConf();

FileInputFormat.setInputPaths(job, new Path(input));

JavaPairRDD<WritableComparable, Writable> rdd = spark.hadoopRDD(job, CustomInputFormat.class,
WritableComparable.class, Writable.class);

But I want to read it directly from the Scala API. I am trying the following:

val job = new JobConf();

FileInputFormat.setInputPaths(job, new Path(input));

spark.hadoopRDD(job, classOf[CustomInputFormat], classOf[WritableComparable[Object]], classOf[Writable]);

I am getting the following error:

[error] argument expression's type is not compatible with formal parameter type;

[error]  found   : java.lang.Class[CustomInputFormat]

[error]  required: java.lang.Class[_ <: org.apache.hadoop.mapred.InputFormat[?K,?V]]

But my CustomInputFormat class implements InputFormat<WritableComparable, Writable>.
Are the generics causing the compilation problem? WritableComparable expects a type parameter.
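The error arises because hadoopRDD must unify the type parameters K and V across all three Class arguments, and a raw-typed Java class gives scalac nothing to unify against. A self-contained analogue (these are stand-in traits with a simplified signature, not the real Spark/Hadoop API) shows that supplying the type arguments explicitly lines things up:

```scala
// Stand-ins for the real types -- illustrative only, not the Spark/Hadoop API.
trait Writable
trait WritableComparable[T]
trait InputFormat[K, V]

// Implementing the interface with concrete type arguments (rather than the
// raw type InputFormat) gives the compiler something to unify against.
class CustomInputFormat extends InputFormat[WritableComparable[Object], Writable]

// Same shape as the hadoopRDD class parameters in the error message: K and V
// must agree across the format class, the key class, and the value class.
def hadoopRDD[K, V](fmtClass: Class[_ <: InputFormat[K, V]],
                    keyClass: Class[K],
                    valueClass: Class[V]): (Class[_], Class[_], Class[_]) =
  (fmtClass, keyClass, valueClass)

// Pinning K and V explicitly lets the Class arguments conform to
// Class[_ <: InputFormat[K, V]]:
val classes = hadoopRDD[WritableComparable[Object], Writable](
  classOf[CustomInputFormat],
  classOf[WritableComparable[Object]],
  classOf[Writable])
```

If the real CustomInputFormat implements the raw type InputFormat on the Java side, the parameters are erased from Scala's view entirely; common workarounds are parameterizing the implements clause (InputFormat<WritableComparable<Object>, Writable>) or, less cleanly, casting the class object with asInstanceOf[Class[InputFormat[WritableComparable[Object], Writable]]]. Both are assumptions here, since the original class isn't shown.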
