spark-dev mailing list archives

From Dirceu Semighini Filho <dirceu.semigh...@gmail.com>
Subject Re: How to create a Row from a List or Array in Spark using Scala
Date Mon, 02 Mar 2015 13:16:50 GMT
You can use the parallelize method:

val data = List(
  Row(1, 5, "vlr1", 10.5),
  Row(2, 1, "vl3", 0.1),
  Row(3, 8, "vl3", 10.0),
  Row(4, 1, "vl4", 1.0))
val rdd = sc.parallelize(data)

Here I'm using a list of Rows, but you could use it with a list of any
other kind of object, for example:

val x = sc.parallelize(List("a","b","c"))

Here x is an RDD[String] and sc is the Spark context.
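
To tie this to the original question quoted below: instead of pulling
fields out of a java.util.ArrayList with get(index), you can convert it
to a Scala Seq and hand that to Row.fromSeq. Here is a minimal sketch;
it runs with plain Scala plus the JDK, and the Spark-specific call is
shown in a comment since it needs spark-sql on the classpath:

```scala
import java.util.{ArrayList => JavaArrayList}
import scala.collection.JavaConverters._

object RowFromList {
  def main(args: Array[String]): Unit = {
    // A Java list of mixed values, as in the question below
    val values: JavaArrayList[Any] = new JavaArrayList()
    values.add(1)
    values.add(5)
    values.add("vlr1")
    values.add(10.5)

    // Convert once to a Scala Seq instead of indexing with get(i)
    val seq: Seq[Any] = values.asScala.toList
    println(seq)

    // With spark-sql available you could then build the Row directly:
    //   import org.apache.spark.sql.Row
    //   val row = Row.fromSeq(seq)
  }
}
```

This avoids hard-coding get(0), get(1), ... and works for a list of
any length.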


Regards,

Dirceu


2015-02-28 5:37 GMT-03:00 DEVAN M.S. <msdevanms@gmail.com>:

>   In the Scala API it's there: Row.fromSeq(ARRAY). I don't know much
> more about the Java API.
>
>
>
> Devan M.S. | Research Associate | Cyber Security | AMRITA VISHWA
> VIDYAPEETHAM | Amritapuri | Cell +919946535290 |
>
>
> On Sat, Feb 28, 2015 at 1:28 PM, r7raul1984@163.com <r7raul1984@163.com>
> wrote:
>
> > import java.util.{ArrayList => JavaArrayList}
> > import org.apache.spark.sql.catalyst.expressions._
> >
> > val values: JavaArrayList[Any] = new JavaArrayList()
> > val computedValues = Row(values.get(0), values.get(1))
> > // Using get(index) like this is not good. How to create a Row from a
> > List or Array in Spark using Scala?
> >
> >
> >
> > r7raul1984@163.com
> >
>
