spark-user mailing list archives

From Nicolas Paris <nicolas.pa...@riseup.net>
Subject Best approach to write UDF
Date Tue, 21 Jan 2020 17:29:17 GMT
Hi

I have written Spark UDFs and I am able to use them from Spark Scala /
PySpark through the org.apache.spark.sql.api.java.UDFx API.
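
For reference, a minimal sketch of the kind of UDF I mean (class and
package names are illustrative, not my actual code):

    package org.my

    import org.apache.spark.sql.api.java.UDF1
    import org.apache.spark.sql.types.StringType

    // Example UDF1: upper-cases its string argument
    class MyUdf extends UDF1[String, String] {
      override def call(s: String): String =
        if (s == null) null else s.toUpperCase
    }

    // Registration from Scala, after which the function is usable in spark.sql(...):
    // spark.udf.register("my_udf", new MyUdf(), StringType)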

I'd like to use them in Spark SQL through the Thrift server. I tried to
create the functions with "create function ... as 'org.my.MyUdf'",
however I get the error below when using them:

> org.apache.spark.sql.AnalysisException: No handler for UDF/UDAF/UDTF 'org.my.MyUdf';
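
For completeness, the statements I run through beeline against the
Thrift server look roughly like this (jar path and function name are
just placeholders):

    ADD JAR /path/to/my-udfs.jar;
    CREATE FUNCTION my_udf AS 'org.my.MyUdf';
    SELECT my_udf(name) FROM people;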


I have read here (https://stackoverflow.com/a/56970800/3865083) that
only the org.apache.hadoop.hive.ql.exec.UDF API works through the
Thrift server.
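
If that is the case, I guess the class would have to look something
like this instead (again, just a sketch):

    package org.my

    import org.apache.hadoop.hive.ql.exec.UDF

    // Hive-style UDF: "CREATE FUNCTION ... AS 'org.my.MyHiveUdf'" resolves
    // its evaluate() method by reflection
    class MyHiveUdf extends UDF {
      def evaluate(s: String): String =
        if (s == null) null else s.toUpperCase
    }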

What is the right way to write a UDF so that it works in all of these cases?

Thanks

-- 
nicolas


