spark-dev mailing list archives

From "Vasili I. Galchin" <vigalc...@gmail.com>
Subject Re: how can I write a language "wrapper"?
Date Mon, 29 Jun 2015 08:33:37 GMT
Shivaram,

    Vis-à-vis Haskell support, I am reading DataFrame.R,
SparkRBackend*, context.R, et al. Am I headed in the correct
direction? Either way, please give more guidance. Thank you.
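
[Archive note: a rough sketch of the frontend/backend pattern discussed here. SparkR launches a JVM-side backend and the R frontend talks to it over a local socket; the sketch below shows only the general shape (length-prefixed messages to a backend process) with hypothetical message contents, and is NOT the actual SparkR wire protocol.]

```python
# Sketch of a guest-language frontend talking to a "JVM backend" over a
# local socket. The toy_backend thread stands in for the JVM process;
# the framing (4-byte big-endian length + payload) and the message text
# are illustrative assumptions, not SparkR's real protocol.
import socket
import struct
import threading

def send_msg(sock, payload: bytes) -> None:
    # Length-prefixed framing: 4-byte big-endian length, then payload.
    sock.sendall(struct.pack(">i", len(payload)) + payload)

def recv_msg(sock) -> bytes:
    header = sock.recv(4)
    (length,) = struct.unpack(">i", header)
    data = b""
    while len(data) < length:
        data += sock.recv(length - len(data))
    return data

def toy_backend(server_sock):
    # Stands in for the JVM-side backend: acknowledge one request.
    conn, _ = server_sock.accept()
    with conn:
        request = recv_msg(conn)
        send_msg(conn, b"ack:" + request)

server = socket.socket()
server.bind(("127.0.0.1", 0))  # ephemeral port, as a launched backend would use
server.listen(1)
threading.Thread(target=toy_backend, args=(server,), daemon=True).start()

client = socket.create_connection(server.getsockname())
send_msg(client, b"callMethod:sparkContext.textFile")  # hypothetical request
reply = recv_msg(client)
client.close()
server.close()
print(reply.decode())
```

A real wrapper layers a serialization format and method-dispatch convention on top of this kind of channel; the framing is the easy part.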

Kind regards,

Vasili



On Tue, Jun 23, 2015 at 1:46 PM, Shivaram Venkataraman
<shivaram@eecs.berkeley.edu> wrote:
> Every language has its own quirks / features -- so I don't think there
> exists a document on how to go about doing this for a new language. The most
> closely related write-up I know of is the wiki page on PySpark internals,
> https://cwiki.apache.org/confluence/display/SPARK/PySpark+Internals, written
> by Josh Rosen. It covers some of the issues, such as closure capture,
> serialization, and JVM communication, that you'll need to handle for a new
> language.
>
> Thanks
> Shivaram
>
> On Tue, Jun 23, 2015 at 1:35 PM, Vasili I. Galchin <vigalchin@gmail.com>
> wrote:
>>
>> Hello,
>>
>>       I want to add language support for another language (other than
>> Scala, Java, et al.). Where is the documentation that explains how to
>> provide support for a new language?
>>
>> Thank you,
>>
>> Vasili
>
>
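
[Archive note: the closure-capture issue mentioned above can be seen with plain Python. The stdlib `pickle` module cannot serialize a closure defined inside a function, which is why PySpark ships its own function serializer (cloudpickle). A minimal illustration:]

```python
# Demonstrates why shipping user functions to workers needs a custom
# serializer: stdlib pickle refuses to serialize a nested closure.
import pickle

def make_adder(n):
    # Returns a closure that captures n from the enclosing scope.
    return lambda x: x + n

add5 = make_adder(5)
print(add5(3))  # 8

try:
    pickle.dumps(add5)
    outcome = "pickled ok"
except Exception:
    # Raised because pickle serializes functions by reference
    # (module + qualified name), which fails for local objects.
    outcome = "stdlib pickle failed for the closure"
print(outcome)
```

Any new language wrapper has to solve the equivalent problem in its own language: capture the function plus its free variables in a form the executors can reconstruct.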

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org

