spark-dev mailing list archives

From Nan Zhu <zhunanmcg...@gmail.com>
Subject Discussion on SPARK-1139
Date Wed, 26 Feb 2014 13:23:29 GMT
Hi, all  

I just created a JIRA: https://spark-project.atlassian.net/browse/SPARK-1139 . The issue is
that:

the Spark APIs based on the new Hadoop API are actually a mixture of the old and new Hadoop APIs.

These Spark APIs still take a JobConf (or Configuration) as one of their parameters, but in
the new Hadoop API, Configuration has been replaced by mapreduce.Job

for example: http://codesfusion.blogspot.ca/2013/10/hadoop-wordcount-with-new-map-reduce-api.html

& http://www.slideshare.net/sh1mmer/upgrading-to-the-new-map-reduce-api (p. 10)
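To make the mismatch concrete, here is a minimal sketch with stand-in classes (these are NOT the real Hadoop or Spark types, and the method names `readOldStyle`/`readNewStyle` are hypothetical) of the two calling conventions: the old-API style hands a raw Configuration to the read method, while the new-API style wraps the Configuration in a Job first:

```java
// Stand-ins for the Hadoop types; the real ones live in
// org.apache.hadoop.conf and org.apache.hadoop.mapreduce.
class Configuration { }

class Job {
    final Configuration conf;
    private Job(Configuration conf) { this.conf = conf; }
    // new-API entry point: the configuration is wrapped in a Job
    static Job getInstance(Configuration conf) { return new Job(conf); }
}

public class ApiMixtureSketch {
    // old-API style parameter: the caller passes a raw Configuration,
    // which is what the current Spark methods for the new Hadoop API do
    static String readOldStyle(String path, Configuration conf) {
        return "old-style read of " + path;
    }

    // new-API style: the caller configures a Job and passes that instead
    static String readNewStyle(String path, Job job) {
        return "new-style read of " + path;
    }

    public static void main(String[] args) {
        Configuration conf = new Configuration();
        System.out.println(readOldStyle("input.txt", conf));
        System.out.println(readNewStyle("input.txt", Job.getInstance(conf)));
    }
}
```

Changing the Spark signatures from the first shape to the second is the fix discussed above, and is also where the compatibility concern comes from, since existing callers pass a Configuration.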

Personally I think it’s better to fix this design, but it will introduce some compatibility
issues  

Just bringing it here for your advice

Best,  

--  
Nan Zhu

