spark-user mailing list archives

From Wei Tan <w...@us.ibm.com>
Subject reuse hadoop code in Spark
Date Wed, 04 Jun 2014 20:08:19 GMT
Hello,

  I am trying to use Spark in the following scenario:

  I have code written for Hadoop and am now trying to migrate it to Spark. The 
mappers and reducers are fairly complex, so I wonder whether I can reuse the 
map() functions I already wrote in Hadoop (Java) and have Spark chain 
them, mixing the Java map() functions with Spark operators.
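In case a sketch helps clarify what I mean: the idea would be to factor the body of an existing Hadoop map() out of the Mapper subclass into a plain static method (so it no longer depends on Mapper.Context), and then pass that same method to Spark's map operator. The `normalize` method below is a hypothetical stand-in for my real mapper logic; the Spark call is shown in a comment, and the example simulates it with a plain Java stream so it runs standalone.

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class MapperReuse {
    // Hypothetical mapper logic, factored into a pure static function so it
    // can be called both from the old Hadoop Mapper and from Spark.
    public static String normalize(String record) {
        return record.trim().toLowerCase();
    }

    public static void main(String[] args) {
        // With Spark, the same function would be passed to a JavaRDD:
        //   JavaRDD<String> result = lines.map(MapperReuse::normalize);
        // Here we simulate that call with a plain stream to show the reuse.
        List<String> out = Arrays.asList("  Foo ", "BAR").stream()
                .map(MapperReuse::normalize)
                .collect(Collectors.toList());
        System.out.println(out);
    }
}
```

The point is that once the logic is a plain function, chaining it with other Spark operators (filter, reduceByKey, etc.) is just ordinary composition.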

  A related question: can I use a binary as an operator, the way Hadoop 
streaming does?
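From what I have read, Spark's RDD.pipe(command) seems to be the closest analogue to Hadoop streaming: each partition's records are written to the external process's stdin, and its stdout lines become the output records. Below is a standalone sketch of that per-partition contract, using `tr a-z A-Z` as a stand-in for an arbitrary binary (this is my own illustration of the pattern, not Spark code).

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class PipeSketch {
    // Mimics what rdd.pipe(command) does for one partition: feed each
    // record to the command's stdin, collect its stdout as new records.
    public static List<String> pipe(List<String> records, List<String> command)
            throws IOException, InterruptedException {
        Process p = new ProcessBuilder(command).start();
        try (BufferedWriter w = new BufferedWriter(
                new OutputStreamWriter(p.getOutputStream()))) {
            for (String r : records) {
                w.write(r);
                w.newLine();
            }
        } // closing stdin signals end-of-input to the external binary
        List<String> out = new ArrayList<>();
        try (BufferedReader rd = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = rd.readLine()) != null) {
                out.add(line);
            }
        }
        p.waitFor();
        return out;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(pipe(Arrays.asList("foo", "bar"),
                Arrays.asList("tr", "a-z", "A-Z")));
    }
}
```

In Spark itself this would just be `rdd.pipe("myBinary")`, with the binary shipped to the workers.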

  Thanks!
Wei
