spark-user mailing list archives

From Night Wolf <nightwolf...@gmail.com>
Subject Re: Unable to find org.apache.spark.sql.catalyst.ScalaReflection class
Date Mon, 23 Mar 2015 08:39:15 GMT
Was a solution ever found for this? I'm trying to run some test cases with sbt
test that use Spark SQL, and with the Spark 1.3.0 release and Scala 2.11.6 I get
this error. Setting fork := true in sbt seems to work, but it's a less than
ideal workaround.
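
For reference, the workaround is the standard sbt forking setting; a minimal
build.sbt sketch (sbt 0.13 syntax; fork and Test are real sbt keys, the rest
is illustrative):

  // Fork a fresh JVM so the Spark jars sit on a plain application
  // classpath instead of inside sbt's layered classloaders.
  fork := true

  // Or scope the setting to tests only:
  fork in Test := true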

On Tue, Mar 17, 2015 at 9:37 PM, Eric Charles <eric@apache.org> wrote:

> Launching from Eclipse (Scala IDE) as a Scala process gives this error,
> but launching as a Java process (a Java main class) works fine.
>
> Launching as a Scala process from IntelliJ works fine.
>
> There is something wrong on the Eclipse side, not in Spark.
>
>
> On 03/13/2015 11:47 AM, Jianshi Huang wrote:
> > Liancheng also found out that the Spark jars are not included in the
> > URLClassLoader's classpath.
> >
> > Hmm... we're very close to the truth now.
> >
> > Jianshi
> >
> > On Fri, Mar 13, 2015 at 6:03 PM, Jianshi Huang <jianshi.huang@gmail.com
> > <mailto:jianshi.huang@gmail.com>> wrote:
> >
> >     I'm almost certain the problem is the ClassLoader.
> >
> >     So adding
> >
> >       fork := true
> >
> >     solves problems for test and run.
> >
> >     The problem is: how can I fork a JVM for the sbt console? fork in
> >     console := true doesn't seem to work...
> >
> >     Jianshi
> >
> >
> >     On Fri, Mar 13, 2015 at 4:35 PM, Jianshi Huang
> >     <jianshi.huang@gmail.com <mailto:jianshi.huang@gmail.com>> wrote:
> >
> >         I guess it's a ClassLoader issue, but I have no idea how to
> >         debug it. Any hints?
> >
> >         Jianshi
> >
> >         On Fri, Mar 13, 2015 at 3:00 PM, Eric Charles <eric@apache.org
> >         <mailto:eric@apache.org>> wrote:
> >
> >             I have the same issue running Spark SQL code from an
> >             Eclipse workspace. If you run your code from the command
> >             line (with a packaged jar) or from IntelliJ, I bet it
> >             will work.
> >
> >             IMHO this is somehow related to the Eclipse environment,
> >             but I would love to know how to fix it (whether via
> >             Eclipse configuration or via a patch in Spark).
> >
> >
> >
> >             On 03/01/2015 02:32 AM, Michael Armbrust wrote:
> >>             I think it's possible that the problem is that the Scala
> >>             compiler is not being loaded by the primordial classloader
> >>             (but instead by some child classloader), and thus the
> >>             Scala reflection mirror fails to initialize when it
> >>             can't find it. Unfortunately, the only solution I know
> >>             of is to load all required jars when the JVM starts.
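> >>
> >>             As a rough sketch of that failure mode (illustrative code,
> >>             not Spark's actual implementation): a reflection mirror
> >>             only sees classes visible to the classloader it was built
> >>             from, so a mirror over the wrong loader cannot resolve
> >>             the catalyst class:
> >>
> >>               import scala.reflect.runtime.universe
> >>               // If spark-catalyst lives in a child classloader (as
> >>               // under sbt or Eclipse), a mirror built over the
> >>               // system loader fails with exactly this kind of
> >>               // ScalaReflectionException:
> >>               val mirror =
> >>                 universe.runtimeMirror(ClassLoader.getSystemClassLoader)
> >>               mirror.staticClass(
> >>                 "org.apache.spark.sql.catalyst.ScalaReflection")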
> >>
> >>             On Sat, Feb 28, 2015 at 5:26 PM, Ashish Nigam
> >>             <ashnigamtech@gmail.com <mailto:ashnigamtech@gmail.com>>
> >>             wrote:
> >>
> >>                 Also, can the Scala version play any role here?
> >>                 I am using Scala 2.11.5, but all the Spark packages
> >>                 depend on Scala 2.11.2.
> >>                 Just wanted to make sure that the Scala version is
> >>                 not an issue here.
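> >>
> >>                 For what it's worth, Scala 2.11.x releases are binary
> >>                 compatible with one another, so 2.11.5 against jars
> >>                 built with 2.11.2 should be fine. A quick sbt sketch
> >>                 for keeping a build on one Scala version (versions
> >>                 taken from this thread, the rest illustrative):
> >>
> >>                   scalaVersion := "2.11.5"
> >>                   // %% selects the matching _2.11 Spark artifacts:
> >>                   libraryDependencies +=
> >>                     "org.apache.spark" %% "spark-sql" % "1.2.1"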
> >>
> >>                 On Sat, Feb 28, 2015 at 9:18 AM, Ashish Nigam
> >>                 <ashnigamtech@gmail.com
> >>                 <mailto:ashnigamtech@gmail.com>> wrote:
> >>
> >>                     Hi,
> >>                     I wrote a very simple program in Scala to convert
> >>                     an existing RDD to a SchemaRDD, but the
> >>                     createSchemaRDD function throws an exception:
> >>
> >>                     Exception in thread "main"
> >>                     scala.ScalaReflectionException: class
> >>                     org.apache.spark.sql.catalyst.ScalaReflection in
> >>                     JavaMirror with primordial classloader with boot
> >>                     classpath [.....] not found
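> >>
> >>                     The program is essentially the standard
> >>                     createSchemaRDD usage; a sketch with illustrative
> >>                     names:
> >>
> >>                       import org.apache.spark.{SparkConf, SparkContext}
> >>                       import org.apache.spark.sql.{SQLContext, SchemaRDD}
> >>
> >>                       val sc = new SparkContext(
> >>                         new SparkConf().setMaster("local").setAppName("repro"))
> >>                       val sqlContext = new SQLContext(sc)
> >>                       // In Spark 1.2.x, createSchemaRDD is an implicit
> >>                       // RDD[A <: Product] => SchemaRDD conversion
> >>                       // defined on SQLContext:
> >>                       import sqlContext.createSchemaRDD
> >>                       case class Person(name: String, age: Int)
> >>                       val people = sc.parallelize(Seq(Person("ashish", 30)))
> >>                       // The implicit conversion fires here; its
> >>                       // reflection lookup is what throws the
> >>                       // ScalaReflectionException above:
> >>                       val schemaRDD: SchemaRDD = people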
> >>
> >>
> >>                     Here's more info on the versions I am using -
> >>
> >>                     <scala.binary.version>2.11</scala.binary.version>
> >>                     <spark.version>1.2.1</spark.version>
> >>                     <scala.version>2.11.5</scala.version>
> >>
> >>                     Please let me know how I can resolve this problem.
> >>
> >>                     Thanks
> >>                     Ashish
> >>
> >>
> >>
> >
> >
> >
> >         --
> >         Jianshi Huang
> >
> >         LinkedIn: jianshi
> >         Twitter: @jshuang
> >         Github & Blog: http://huangjs.github.com/
> >
> >
> >
> >
> >     --
> >     Jianshi Huang
> >
> >     LinkedIn: jianshi
> >     Twitter: @jshuang
> >     Github & Blog: http://huangjs.github.com/
> >
> >
> >
> >
> > --
> > Jianshi Huang
> >
> > LinkedIn: jianshi
> > Twitter: @jshuang
> > Github & Blog: http://huangjs.github.com/
>
