spark-user mailing list archives

From Steve Loughran <>
Subject Re: Third party library
Date Sun, 27 Nov 2016 22:15:34 GMT

On 27 Nov 2016, at 02:55, kant kodali <<>>

I would say instead of LD_LIBRARY_PATH you might want to use java.library.path,

in the following way:

java -Djava.library.path=/path/to/my/library

or pass java.library.path along with spark-submit.

This is only going to set up paths on the submitting system; to load JNI code in the executors,
the binary needs to be sent to the far end and then put on the Java library path there.

Copy the relevant binary to somewhere on the library path of the destination machine. Do that
and you shouldn't have to worry about other JVM options (though it's been a few years since I
did any JNI).

One trick: write a simple main() object/entry point that calls the JNI method and doesn't
attempt to use any Spark libraries; have it log any exception and return an error code if
the call fails. This lets you use it as a link test after deployment: if you can't run
that class, things are broken before you go anywhere near Spark.
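A minimal sketch of that link test (the library name "foolib" comes from earlier in this thread; this version only checks that the library loads via java.library.path, and the actual @native call would be added once the load succeeds):

```scala
// Minimal JNI link test: no Spark dependencies, just try to load the
// native library and report the result via a log line and an exit code.
object JniLinkTest {
  // Returns 0 when the native library can be loaded, 1 otherwise.
  def check(libName: String): Int =
    try {
      System.loadLibrary(libName) // searches java.library.path
      println(s"OK: loaded $libName")
      0
    } catch {
      case e: UnsatisfiedLinkError =>
        System.err.println(s"FAILED to load $libName: ${e.getMessage}")
        1
    }

  def main(args: Array[String]): Unit =
    sys.exit(check(if (args.nonEmpty) args(0) else "foolib"))
}
```

Running this class on each worker host after deployment tells you immediately whether the library is visible to the JVM there.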

On Sat, Nov 26, 2016 at 6:44 PM, Gmail <<>>
Maybe you've already checked these out. Some basic questions that come to my mind are:
1) is this library "foolib" or "foo-C-library" available on the worker node?
2) if yes, is it accessible by the user/program (rwx)?


On Nov 26, 2016, at 5:08 PM, kant kodali <<>>

If it is working for a standalone program, I would think you can apply the same settings across
all the Spark worker and client machines and give that a try. Let's start with that.

On Sat, Nov 26, 2016 at 11:59 AM, vineet chadha <<>>
Just subscribed to Spark User, so forwarding the message again.

On Sat, Nov 26, 2016 at 11:50 AM, vineet chadha <<>>
Thanks Kant. Can you give me a sample program that allows me to call JNI from an executor
task? I have JNI working in a standalone program in Scala/Java.


On Sat, Nov 26, 2016 at 11:43 AM, kant kodali <<>>
Yes this is a Java JNI question. Nothing to do with Spark really.

A java.lang.UnsatisfiedLinkError typically means the way you set up LD_LIBRARY_PATH is
wrong, unless you tell us that it is working for other cases but not this one.

On Sat, Nov 26, 2016 at 11:23 AM, Reynold Xin <<>>
That's just standard JNI and has nothing to do with Spark, does it?

On Sat, Nov 26, 2016 at 11:19 AM, vineet chadha <<>>
Thanks Reynold for quick reply.

I have tried the following:

class MySimpleApp {
  // ---Native methods
  @native def fooMethod(foo: String): String
}

object MySimpleApp {
  var flag = false

  def loadResources() {
    flag = true
  }

  def main() {
    sc.parallelize(1 to 10).mapPartitions { iter =>
      if (flag == false) {
        val SimpleInstance = new MySimpleApp
        SimpleInstance.fooMethod("fooString")
      }
      iter
    }
  }
}
I don't see a way to invoke fooMethod, which is implemented in foo-C-library. Am I missing
something? If possible, can you point me to an existing implementation that I can refer to.

Thanks again.


On Fri, Nov 25, 2016 at 3:32 PM, Reynold Xin <<>>
bcc dev@ and add user@

This is more a user@ list question rather than a dev@ list question. You can do something
like this:

object MySimpleApp {
  def loadResources(): Unit = {
    // define some idempotent way to load resources, e.g. with a flag or lazy val
  }

  def main() = {
    sc.parallelize(1 to 10).mapPartitions { iter =>
      loadResources()
      // do whatever you want with the iterator
      iter
    }
  }
}
On Fri, Nov 25, 2016 at 2:33 PM, vineet chadha <<>>

I am trying to invoke a C library from the Spark stack using the JNI interface (here is sample
application code):

class SimpleApp {
  // ---Native methods
  @native def foo(Top: String): String
}

object SimpleApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("SimpleApplication").set("SPARK_LIBRARY_PATH", "lib")
    val sc = new SparkContext(conf)
    // instantiate the class
    val SimpleAppInstance = new SimpleApp
    // String passing - working
    val ret ="fooString")
  }
}

The above code works fine.

I have set up LD_LIBRARY_PATH, spark.executor.extraClassPath, and spark.executor.extraLibraryPath
on the worker node.

How can I invoke the JNI library from the worker node? Where should I load it in the executor?
Calling System.loadLibrary("foolib") inside the worker node gives me the following error:

Exception in thread "main" java.lang.UnsatisfiedLinkError:

Any help would be really appreciated.
