Unfortunately, afaik that project is long dead.
It'd be an interesting project to create an intermediary protocol, perhaps using something
that nearly everything these days understands (unfortunately [!] that might be JavaScript).
For example, instead of pickling language constructs, it might be interesting to translate
RDD operations into some JSON structure, and have a single server-side component process the
"instructions".
There's also MBrace (http://www.m-brace.net/)... MBrace-Spark integration would be quite interesting
indeed, though the difference in approach might be quite a challenge.
Another approach could be using IKVM to host the JVM, much like how PySpark executes.
Microsoft Research published some very early work on OneNet: http://research.microsoft.com/en-us/um/people/jinl/redesign/research/onenet_executive_summary.pdf
- their careers page seems to be recruiting for the project.
Again, these are all future things, most of which would need to be community driven. If you
need something right now, then there really isn't good integration between Spark and .NET.
However, given your requirements, MBrace might be something you find useful.
-Ashic.
Date: Sun, 5 Jul 2015 11:05:30 -0600
Subject: Re: .NET on Apache Spark?
From: dautkhanov@gmail.com
To: ski.rodriguez@gmail.com
CC: user@spark.apache.org
Scala used to run on .NET: http://www.scala-lang.org/old/node/10299
--
Ruslan Dautkhanov
On Thu, Jul 2, 2015 at 1:26 PM, pedro <ski.rodriguez@gmail.com> wrote:
You might try using .pipe() and installing your .NET program as a binary
across the cluster (or using addFile). It's not ideal to pipe things in/out,
and there is some overhead, but it would work.
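Something along these lines (an untested sketch; the binary name Transform.exe, its path, and the
mono launcher are just placeholders):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.SparkFiles

object PipeDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("pipe-demo").setMaster("local[*]"))

    // Ship the .NET executable to every executor's working directory.
    sc.addFile("/path/to/Transform.exe")

    val input = sc.parallelize(Seq("1", "2", "3"))

    // Each record is written to the external process's stdin as a line; every
    // line the process writes to stdout becomes a record of the new RDD.
    val piped = input.pipe(Seq("mono", SparkFiles.get("Transform.exe")))

    piped.collect().foreach(println)
    sc.stop()
  }
}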
I don't know much about IronPython, but perhaps changing the default Python
interpreter by changing your PATH might work?
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/NET-on-Apache-Spark-tp23578p23594.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org