spark-dev mailing list archives

From Marcelo Vanzin <van...@cloudera.com>
Subject SparkContext.hadoopConfiguration vs. SparkHadoopUtil.newConfiguration()
Date Fri, 01 Aug 2014 23:48:04 GMT
Hi all,

While working on some seemingly unrelated code, I ran into an issue
where "spark.hadoop.*" configs were not making it to the Configuration
objects in some parts of the code. I was relying on that mechanism to
avoid having to do dirty tricks with the classpath while running
tests, but that's a little beside the point.
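For context, the behavior in question is that SparkContext.hadoopConfiguration copies every Spark conf entry whose key starts with "spark.hadoop." into the Hadoop Configuration, with the prefix stripped. The sketch below illustrates that copy step only; it is not the actual Spark source, and a plain Map stands in for org.apache.hadoop.conf.Configuration so it is self-contained:

```java
import java.util.HashMap;
import java.util.Map;

public class HadoopConfPropagation {
    // Copy "spark.hadoop.*" entries into a Hadoop-style config map,
    // stripping the "spark.hadoop." prefix. Illustrative only: the real
    // code sets keys on a Hadoop Configuration object instead.
    static Map<String, String> propagate(Map<String, String> sparkConf) {
        String prefix = "spark.hadoop.";
        Map<String, String> hadoopConf = new HashMap<>();
        for (Map.Entry<String, String> e : sparkConf.entrySet()) {
            if (e.getKey().startsWith(prefix)) {
                hadoopConf.put(e.getKey().substring(prefix.length()),
                               e.getValue());
            }
        }
        return hadoopConf;
    }

    public static void main(String[] args) {
        Map<String, String> conf = new HashMap<>();
        conf.put("spark.hadoop.fs.defaultFS", "hdfs://namenode:8020");
        conf.put("spark.app.name", "example");  // not propagated
        // Only the prefixed entry survives, with the prefix removed.
        System.out.println(propagate(conf));
    }
}
```

Code that builds its Configuration via "new Configuration()" never runs this copy step, which is why those configs go missing in some code paths.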

Since I don't know the history of that code in SparkContext, does
anybody see any issue with moving it up a layer so that all code that
uses SparkHadoopUtil.newConfiguration() does the same thing?

This would also include some code (e.g. in the yarn module) that does
"new Configuration()" directly instead of going through the wrapper.


-- 
Marcelo
