Can you not create the SparkContext inside the scheduler code? If you just want to access HDFS, you can use the following FileSystem object; with it you can create/read/write files.

import java.net.URI
import org.apache.hadoop.conf.Configuration

// hadoopConf needs to be defined; a plain Configuration picks up core-site.xml from the classpath
val hadoopConf = new Configuration()
val hdfs = org.apache.hadoop.fs.FileSystem.get(new URI("hdfs://localhost:9000"), hadoopConf)
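
For example, to write a small file and read it back with that handle (a minimal sketch; the output.txt path is made up for illustration):

import org.apache.hadoop.fs.Path
import scala.io.Source

// Write a small file through the FileSystem handle
val out = hdfs.create(new Path("/sigmoid/output.txt"))
out.write("hello from the scheduler\n".getBytes("UTF-8"))
out.close()

// Read it back line by line
val in = hdfs.open(new Path("/sigmoid/output.txt"))
Source.fromInputStream(in).getLines().foreach(println)
in.close()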

Thanks
Best Regards

On Fri, Nov 14, 2014 at 9:12 PM, rapelly kartheek <kartheek.mbms@gmail.com> wrote:
No. I am not accessing HDFS from the shell or from a Spark application. I want to access it from Spark's scheduler code.

I get an error when I use sc.textFile() because the SparkContext has not been created yet at that point in the scheduler. So the error says: "sc not found".

On Fri, Nov 14, 2014 at 9:07 PM, Akhil Das <akhil@sigmoidanalytics.com> wrote:
like this?

val file = sc.textFile("hdfs://localhost:9000/sigmoid/input.txt")
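
In the shell, sc is the SparkContext that gets created for you; in a standalone app you would create it first. A minimal sketch (the app name and master are placeholders):

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName("hdfs-read").setMaster("local[*]")
val sc = new SparkContext(conf)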

Thanks
Best Regards

On Fri, Nov 14, 2014 at 9:02 PM, rapelly kartheek <kartheek.mbms@gmail.com> wrote:
Hi,
I am trying to read an HDFS file from Spark's scheduler code. I could find how to do HDFS reads/writes in Java.

But I need to access HDFS from Spark using Scala. Can someone please help me in this regard?