spark-user mailing list archives

From Matei Zaharia <>
Subject Re: Configuring custom input format
Date Tue, 25 Nov 2014 22:31:22 GMT
How are you creating the object in your Scala shell? Maybe you can write a function that directly
returns the RDD, without assigning the object to a temporary variable.
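
For example, something along these lines should avoid the problem (an illustrative sketch, not from the original thread; it assumes a Spark shell with Hadoop 2.x on the classpath, and TextInputFormat stands in for the custom input format):

```scala
import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.hadoop.mapreduce.Job
import org.apache.hadoop.mapreduce.lib.input.{FileInputFormat, TextInputFormat}
import org.apache.spark.rdd.RDD

// Build the RDD inside a function so the REPL never assigns (and thus never
// calls toString on) the Job object -- Job.toString throws an
// IllegalStateException while the job is still in the DEFINE state.
def newApiRdd(path: String): RDD[(LongWritable, Text)] = {
  val job = Job.getInstance(sc.hadoopConfiguration)
  FileInputFormat.setInputPaths(job, path)  // static configuration calls go here
  sc.newAPIHadoopRDD(job.getConfiguration,
    classOf[TextInputFormat], classOf[LongWritable], classOf[Text])
}

val rdd = newApiRdd("hdfs:///data/input")   // the returned RDD prints fine
```

The Job object stays local to the function, so the shell only ever prints the returned RDD.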


> On Nov 5, 2014, at 2:54 PM, Corey Nolet <> wrote:
> The closer I look at the stack trace in the Scala shell, the more it appears to be the call to
> toString() that is causing the construction of the Job object to fail. Is there a way to
> suppress this output, since it appears to be hindering my ability to new up this object?
> On Wed, Nov 5, 2014 at 5:49 PM, Corey Nolet <> wrote:
> I'm trying to use a custom input format with SparkContext.newAPIHadoopRDD. Creating the
> new RDD works fine, but setting up the configuration via the static methods on input formats
> that require a Hadoop Job object is proving to be difficult.
> Trying to new up my own Job object with the SparkContext.hadoopConfiguration is throwing
> the exception on line 283 of this grepcode:
> Looking in the SparkContext code, I'm seeing that it's newing up Job objects just fine
> using nothing but the configuration. Using SparkContext.textFile() appears to be working for
> me. Any ideas? Has anyone else run into this as well? Is it possible to have a method like
> SparkContext.getJob() or something similar?
> Thanks.
