spark-user mailing list archives

From Mohit Jaggi <mohitja...@gmail.com>
Subject Pyspark access to scala/java libraries
Date Mon, 09 Jul 2018 22:45:45 GMT
Folks,
I am writing some Scala/Java code and want it to be callable from PySpark.

For example:
class MyStuff(addend: Int) {
  def myMapFunction(x: Int) = x + addend
}

I want to call it from PySpark like this:

df = ...
mystuff = sc._jvm.MyStuff(5)
df['x'].map(lambda x: mystuff.myMapFunction(x))

How can I do this?
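
To make the question concrete, here is a minimal sketch of the two call paths I can imagine. The names are placeholders (com.example.MyStuff for my class, and a hypothetical com.example.MyMapUDF wrapper), and the compiled jar is assumed to be shipped to the cluster with --jars:

from pyspark.sql import SparkSession
from pyspark.sql.types import IntegerType

spark = SparkSession.builder.getOrCreate()
sc = spark.sparkContext

# Path 1, driver side only: Py4J can instantiate the JVM class
# directly, but the resulting proxy lives in the driver's gateway,
# so (as I understand it) it cannot be captured in a map() lambda
# that runs on executors.
mystuff = sc._jvm.com.example.MyStuff(5)
print(mystuff.myMapFunction(3))  # executes in the driver JVM

# Path 2, per row: expose the function as a Java UDF (a class
# implementing org.apache.spark.sql.api.java.UDF1), register it by
# class name, and invoke it through the SQL engine on executors.
spark.udf.registerJavaFunction("myMapFunction", "com.example.MyMapUDF", IntegerType())
df = spark.createDataFrame([(1,), (2,)], ["x"])
df.selectExpr("myMapFunction(x)").show()

Is one of these the intended route, or is there a better pattern?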

Mohit.


