spark-user mailing list archives

From AlexG <>
Subject how to write pyspark interface to scala code?
Date Tue, 12 Apr 2016 23:30:31 GMT
I have Scala Spark code for computing a matrix factorization. I'd like to
make it possible to use this code from PySpark, so users can pass in a
Python RDD and receive a Python RDD back, without knowing or caring that
Scala code is being called.

Please point me to an example of code (e.g. somewhere in the Spark codebase,
if it's clean enough) from which I can learn how to do this.
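For reference, the pattern PySpark's own MLlib wrappers use (see
python/pyspark/mllib/common.py in the Spark source) is: convert the Python
RDD to a JavaRDD, reach your Scala object through the Py4J gateway at
sc._jvm, and convert the result back. A minimal sketch of that pattern is
below; the Scala object name com.example.Factorizer and its factorize
signature are placeholders for your own code, and the jar must be on the
driver/executor classpath (e.g. via --jars).

```python
# Sketch of the Py4J bridge pattern used by PySpark's MLlib wrappers.
# "com.example.Factorizer" is an assumed name for your Scala object.

def factorize(py_rdd, rank):
    """Call a (hypothetical) Scala factorization routine on a Python RDD
    and return a Python RDD, hiding the JVM round trip from the caller."""
    # Lazy imports so this module can be imported without a live Spark env.
    from pyspark.mllib.common import _py2java, _java2py

    sc = py_rdd.ctx  # the SparkContext that owns the RDD
    # Pickle-serialized Python RDD -> JavaRDD of plain Java objects
    java_rdd = _py2java(sc, py_rdd)
    # Any JVM class on the driver classpath is reachable through sc._jvm
    result = sc._jvm.com.example.Factorizer.factorize(java_rdd, rank)
    # Convert the returned JavaRDD (or other Java object) back to Python
    return _java2py(sc, result)
```

On the Scala side, factorize would take a JavaRDD, call .rdd to get the
underlying Scala RDD, and return a JavaRDD of the result so that _java2py
can translate it.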
