spark-user mailing list archives

From Achilleus 003 <achilleus...@gmail.com>
Subject UDFs in Spark
Date Thu, 28 Mar 2019 04:45:25 GMT
A couple of questions regarding UDFs:

1) Is there a way to get all the registered UDFs in Spark Scala?
I couldn't find a straightforward API for this, but I found a pattern that returns all the registered UDFs:

spark.catalog.listFunctions.filter(_.className == null).collect

This does the trick, but I'm not sure it will hold true in all cases. Is there a better way to get all the registered UDFs?
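For reference, here is a minimal, self-contained sketch of that pattern (the appName, master setting, and the sample "plusOne" UDF are just illustrative assumptions; whether className is null for every kind of registered function may depend on the Spark version):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("ListRegisteredUdfs")
  .master("local[*]")
  .getOrCreate()

// Register a sample UDF so there is something to list.
spark.udf.register("plusOne", (x: Int) => x + 1)

// Catalog.listFunctions returns a Dataset[Function]; temporary user-defined
// functions appear to come back without an implementing class, so filtering
// on className == null is what surfaces them here.
val registeredUdfs = spark.catalog.listFunctions()
  .filter(f => f.className == null)
  .collect()

registeredUdfs.foreach(f => println(s"${f.name} (temporary: ${f.isTemporary})"))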

2) Is there a way I can share my UDFs across sessions when not using a Databricks notebook?
 

Sent from my iPhone

