spark-user mailing list archives

From: Mohit Jaggi <mohitja...@gmail.com>
Subject: Re: Pyspark access to scala/java libraries
Date: Sun, 15 Jul 2018 06:01:18 GMT
Trying again… does anyone know how to make this work? A sketch of one possible approach follows the quoted message below.

> On Jul 9, 2018, at 3:45 PM, Mohit Jaggi <mohitjaggi@gmail.com> wrote:
> 
> Folks,
> I am writing some Scala/Java code and want it to be usable from pyspark.
> 
> For example:
> class MyStuff(addend: Int) {
>   def myMapFunction(x: Int): Int = x + addend
> }
> 
> I want to call it from pyspark as:
> 
> df = ...
> mystuff = sc._jvm.MyStuff(5)
> df['x'].map(lambda x: mystuff.myMapFunction(x))
> 
> How can I do this?
> 
> Mohit.
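
One way to make this work, sketched under stated assumptions: the compiled Scala classes ship with the job (e.g. spark-submit --jars mystuff.jar, plus --driver-class-path mystuff.jar so the driver's Py4J gateway can load them), and the package name com.example stands in for whatever the real one is.

Driver-side calls through sc._jvm already work:

    # Driver-side only: mystuff is a Py4J proxy to an object in the JVM.
    mystuff = sc._jvm.com.example.MyStuff(5)
    mystuff.myMapFunction(3)  # returns 8, computed in the JVM

That proxy cannot be used inside the lambda, though: Spark pickles the closure and ships it to the executors, and Py4J handles are not picklable, so the map call fails at serialization time. (Note also that a DataFrame column has no .map; row-level work goes through df.rdd or a UDF.) An approach that does run on executors is to wrap the Scala logic as a Java UDF and register it from PySpark. On the Scala side (MyStuffUDF is a hypothetical name; the zero-argument constructor is required because Spark instantiates the class by name):

    import org.apache.spark.sql.api.java.UDF1

    // Hypothetical wrapper exposing myMapFunction as a Spark SQL UDF.
    class MyStuffUDF extends UDF1[Int, Int] {
      // @transient lazy: rebuilt on each executor rather than
      // Java-serialized, so MyStuff itself need not be Serializable.
      @transient private lazy val stuff = new MyStuff(5)
      override def call(x: Int): Int = stuff.myMapFunction(x)
    }

Then from PySpark (Spark 2.3+):

    from pyspark.sql.types import IntegerType

    # Register the JVM class under a SQL function name (names assumed).
    spark.udf.registerJavaFunction("my_map", "com.example.MyStuffUDF",
                                   IntegerType())

    df.selectExpr("x", "my_map(x) AS x_plus_addend").show()

The registered function is also usable from the DataFrame API via pyspark.sql.functions.expr("my_map(x)").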


