spark-user mailing list archives

From Nirav Patel <>
Subject Spark SQL UDF - passing map as a UDF parameter
Date Tue, 15 Nov 2016 00:33:35 GMT
I am trying to use the following API from `functions` to convert a map into a
column so I can pass it to a UDF.

map(cols: Column*): Column

"Creates a new map column. The input columns must be grouped as key-value
pairs, e.g. (key1, value1, key2, value2, ...). The key columns must all
have the same data type, and can't be null. The value columns must all have
the same data type."
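To illustrate that key-value grouping, a minimal sketch (the keys and values here are made up):

```scala
import org.apache.spark.sql.Column
import org.apache.spark.sql.functions.{lit, map}

// Arguments must alternate key, value, key, value, ...
val m: Column = map(lit("k1"), lit(1), lit("k2"), lit(2))
```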

final val idxMap = idxMapRdd.collectAsMap
val colmap = map(idxMap.map(lit _): _*)

But I get the following error:

<console>:139: error: type mismatch;
 found   : Iterable[org.apache.spark.sql.Column]
 required: Seq[org.apache.spark.sql.Column]
       val colmap = map(idxMap.map(lit _): _*)

If I try:
val colmap = map(idxMap.map(lit _).toSeq: _*)

It says:

java.lang.RuntimeException: Unsupported literal type class scala.Tuple2
at org.apache.spark.sql.functions$.lit(functions.scala:101)
at $anonfun$1.apply(<console>:153)

What is the correct usage of the `map` API to convert a hashmap into a column?
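One approach that should avoid both errors, sketched under the assumption that `idxMap` is an ordinary Scala `Map` with primitive keys and values: `lit` cannot turn a `Tuple2` into a literal, so each (key, value) pair has to be split into two separate `lit` columns, and the resulting `Iterable` converted to a `Seq` before splatting into `map`.

```scala
import org.apache.spark.sql.Column
import org.apache.spark.sql.functions.{lit, map}

// Hypothetical driver-side lookup map standing in for collectAsMap's result
val idxMap: Map[String, Int] = Map("a" -> 1, "b" -> 2)

// Flatten each pair into two literal columns (key, then value),
// then convert to Seq so it can be passed as varargs
val colmap: Column = map(
  idxMap.flatMap { case (k, v) => Seq(lit(k), lit(v)) }.toSeq: _*
)
```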


