spark-user mailing list archives

From Divya Gehlot <>
Subject convert row to map of key as int and values as arrays
Date Wed, 16 Mar 2016 04:11:09 GMT
As I can't add columns from another DataFrame,
I am planning to convert my row columns to a map of keys and arrays.
As I am new to Scala and Spark,
I am trying the following:

// create an empty map
import scala.collection.mutable.{ArrayBuffer => mArrayBuffer}
var map = Map[Int, mArrayBuffer[Any]]()

def addNode(key: String, value: ArrayBuffer[Any]) = {
  nodes += (key -> (value :: (nodes get key getOrElse Nil)))
}

var rows = dfLnItmMappng.collect()
rows.foreach(r => addNode(r.getInt(2),

for ((k, v) <- rows)
  printf("key: %s, value: %s\n", k, v)

But I am getting the error below:
import scala.collection.mutable.{ArrayBuffer=>mArrayBuffer}
= Map()
<console>:28: error: not found: value nodes
        nodes += (key -> (value :: (nodes get key getOrElse Nil)))
<console>:27: error: not found: type ArrayBuffer
       def addNode(key: String, value:ArrayBuffer[Any] ) ={
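For reference, the two errors come from the map being declared under the name `map` while `addNode` refers to `nodes`, and from `ArrayBuffer` being imported only under the alias `mArrayBuffer`, so the plain name `ArrayBuffer` is not in scope. A minimal sketch that compiles (plain Scala, no Spark; the sample rows here are invented stand-ins for `dfLnItmMappng.collect()`, and the key is assumed to be an `Int` as in the map's type):

```scala
// map from Int key to the list of values seen for that key
var nodes = Map[Int, List[Any]]()

// prepend `value` to the list already stored under `key` (or Nil if absent)
def addNode(key: Int, value: Any): Unit =
  nodes += (key -> (value :: nodes.getOrElse(key, Nil)))

// hypothetical sample data standing in for dfLnItmMappng.collect()
val rows = Seq((1, "a"), (1, "b"), (2, "c"))
rows.foreach { case (k, v) => addNode(k, v) }

// iterate over the map, not over the raw rows
for ((k, v) <- nodes)
  printf("key: %s, value: %s\n", k, v)
```

With real Spark rows you would call something like `addNode(r.getInt(2), r)` inside the `foreach`, depending on which columns you want as values.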

If anybody knows a better method to add columns from another
DataFrame, please help by letting me know.

