spark-dev mailing list archives

From Lochana Menikarachchi <locha...@gmail.com>
Subject Re: packaging spark run time with osgi service
Date Thu, 04 Dec 2014 02:50:44 GMT
I think the problem has to do with Akka not picking up the
reference.conf file in the assembly jar.
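A quick way to confirm that is to ask Typesafe Config directly whether akka.version
resolves from the two class loaders involved (plain Typesafe Config and Spark APIs,
nothing OSGi-specific; rough sketch):

import com.typesafe.config.ConfigFactory;
import org.apache.spark.api.java.JavaSparkContext;

// true if Akka's reference.conf is visible from the bundle's context class loader
boolean fromContextCl = ConfigFactory.load(
        Thread.currentThread().getContextClassLoader()).hasPath("akka.version");

// true if it is visible from the class loader that loaded the Spark classes
boolean fromSparkCl = ConfigFactory.load(
        JavaSparkContext.class.getClassLoader()).hasPath("akka.version");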

We managed to make Akka pick up the conf file by temporarily switching
the class loaders:

Thread.currentThread().setContextClassLoader(JavaSparkContext.class.getClassLoader());
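Roughly, the swap is done around creating the context and the original loader is
restored afterwards (sketch only; app name and master are placeholders):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

ClassLoader original = Thread.currentThread().getContextClassLoader();
try {
    // let Akka resolve reference.conf from the loader that sees the Spark/assembly classes
    Thread.currentThread().setContextClassLoader(JavaSparkContext.class.getClassLoader());
    SparkConf conf = new SparkConf().setAppName("osgi-spark").setMaster("local[*]");
    JavaSparkContext sc = new JavaSparkContext(conf);
    // ... build the model here ...
} finally {
    // restore the bundle's original context class loader
    Thread.currentThread().setContextClassLoader(original);
}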

The model gets built, but execution fails at a later stage with a Snappy error:

14/12/04 08:07:44 ERROR Executor: Exception in task 0.0 in stage 105.0 (TID 104)
java.lang.UnsatisfiedLinkError: org.xerial.snappy.SnappyNative.maxCompressedLength(I)I
	at org.xerial.snappy.SnappyNative.maxCompressedLength(Native Method)
	at org.xerial.snappy.Snappy.maxCompressedLength(Snappy.java:320)
	at org.xerial.snappy.SnappyOutputStream.<init>(SnappyOutputStream.java:79)
	at org.apache.spark.io.SnappyCompressionCodec.compressedOutputStream(CompressionCodec.scala:125)
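A possible workaround (untested) might be to move off the Snappy codec entirely so
the native library is never loaded, e.g. by switching to the pure-Java LZF codec:

SparkConf conf = new SparkConf()
    .setAppName("osgi-spark")  // placeholder
    // avoid loading the native snappy library inside OSGi; LZF is pure JVM
    .set("spark.io.compression.codec", "org.apache.spark.io.LZFCompressionCodec");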

According to the Akka documentation, a conf file can be passed with
-Dconfig.file= but we couldn't get it to work.
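Setting the property programmatically from the bundle looks roughly like this (the
path is a placeholder), though as far as I can tell config.file only replaces
application.conf in ConfigFactory.load() and does not affect the reference.conf
defaults that define akka.version, which may be why it has no effect:

// placeholder path to a conf file containing the missing Akka settings
System.setProperty("config.file", "/opt/myservice/conf/akka-override.conf");
// make Typesafe Config re-read system properties / config.file
com.typesafe.config.ConfigFactory.invalidateCaches();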

Any ideas how to do this?

Lochana




On 12/2/14 8:17 AM, Dinesh J. Weerakkody wrote:
> Hi Lochana,
>
> Can you please go through this mail thread [1]? I haven't tried it, but
> it may be useful.
>
> [1] http://apache-spark-user-list.1001560.n3.nabble.com/Packaging-a-spark-job-using-maven-td5615.html
>
>
> On Mon, Dec 1, 2014 at 4:28 PM, Lochana Menikarachchi
> <lochanac@gmail.com> wrote:
>
>     I have Spark core and MLlib as dependencies for a Spark-based OSGi
>     service. When I call the model-building method through a unit test
>     (without OSGi) it works OK. When I call it through the OSGi
>     service, nothing happens. I tried adding the Spark assembly jar.
>     Now it throws the following error:
>
>     An error occurred while building supervised machine learning
>     model: No configuration setting found for key 'akka.version'
>     com.typesafe.config.ConfigException$Missing: No configuration
>     setting found for key 'akka.version'
>         at com.typesafe.config.impl.SimpleConfig.findKey(SimpleConfig.java:115)
>         at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:136)
>         at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:142)
>         at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:150)
>         at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:155)
>         at com.typesafe.config.impl.SimpleConfig.getString(SimpleConfig.java:197)
>
>     What is the correct way to include Spark runtime dependencies in an
>     OSGi service? Thanks.
>
>     Lochana
>
>
>
>
>
> --
> Thanks & Best Regards,
>
> Dinesh J. Weerakkody
> www.dineshjweerakkody.com

