spark-user mailing list archives

From Ajay Chander <>
Subject Spark_sql
Date Thu, 22 Oct 2015 01:32:59 GMT
Hi Everyone,

I have a use case where I have to create a DataFrame inside a map()
function. Creating a DataFrame needs a SQLContext or HiveContext, so how
do I pass the context into my map function? I am doing this in Java. I
tried creating a class "TestClass" which implements "Function<Row, String>",
and inside its call() method I want to create the DataFrame, so I added a
parameterized constructor to pass the context from the driver program to
TestClass and used that context to create the DataFrame. But it seems like
the wrong way of doing it. Can anyone help me with this? Thanks in advance.
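For reference, the failure mode of the approach described above can be reproduced with plain Java serialization, independent of Spark. Spark Java-serializes a Function before shipping it to executors, and SQLContext/HiveContext do not implement Serializable, so a function object that captures the context in a constructor field cannot be serialized. The sketch below uses a hypothetical FakeContext as a stand-in for SQLContext; class and method names here are illustrative, not Spark APIs.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class SerializationSketch {
    // Hypothetical stand-in for SQLContext: like SQLContext, it is NOT serializable.
    static class FakeContext {}

    // Mirrors the TestClass described above: a serializable function object
    // that captures the context through its constructor.
    static class TestClass implements Serializable {
        private final FakeContext ctx;  // non-serializable field -> breaks serialization
        TestClass(FakeContext ctx) { this.ctx = ctx; }
    }

    // Simulates what Spark does before shipping a function to executors:
    // Java-serialize it. Returns true only if serialization succeeds.
    static boolean canShip(Object fn) {
        try (ObjectOutputStream out = new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(fn);
            return true;
        } catch (NotSerializableException e) {
            return false;  // the captured context cannot cross the wire
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        boolean ok = canShip(new TestClass(new FakeContext()));
        System.out.println(ok
            ? "shipped"
            : "NotSerializableException: context cannot be sent to executors");
    }
}
```

Running this prints the NotSerializableException branch, which is the same reason passing the real SQLContext through a constructor fails at runtime.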

