I have the same issue running Spark SQL code from an Eclipse workspace. If you run your code from the command line (with a packaged jar) or from IntelliJ, I bet it will work.
IMHO this is somehow related to the Eclipse environment, but I would love to know how to fix it (whether via Eclipse configuration or via a patch in Spark).
On 03/01/2015 02:32 AM, Michael Armbrust wrote:
I think it's possible that the problem is that the Scala compiler is not being loaded by the primordial classloader (but instead by some child classloader), and thus the Scala reflection mirror fails to initialize when it can't find it. Unfortunately, the only solution that I know of is to load all required jars when the JVM starts.
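Michael's hypothesis can be illustrated without Spark at all. The sketch below (the `MirrorDemo` name and the empty `URLClassLoader` are purely illustrative) shows that a Scala reflection mirror can only resolve classes visible to the classloader it was built from; a lookup through a loader that cannot see the class throws the same `ScalaReflectionException` reported in this thread.

```scala
import java.net.URLClassLoader
import scala.reflect.runtime.{universe => ru}

object MirrorDemo {
  // Resolve a class through a mirror built from the given classloader.
  // Returns Some(fullName) on success, or None when the mirror's
  // classloader cannot see the class and reflection throws
  // ScalaReflectionException.
  def resolve(name: String, cl: ClassLoader): Option[String] =
    try Some(ru.runtimeMirror(cl).staticClass(name).fullName)
    catch { case _: ScalaReflectionException => None }

  def main(args: Array[String]): Unit = {
    // Visible through the application classloader:
    println(resolve("scala.Option", getClass.getClassLoader))
    // An empty loader with a null parent delegates only to the boot
    // classpath, so even scala.Option cannot be found:
    println(resolve("scala.Option",
      new URLClassLoader(Array.empty[java.net.URL], null)))
  }
}
```

If the Scala reflection jars end up reachable only through a child classloader that the mirror does not consult (which the Eclipse launcher may arrange), the lookup of `org.apache.spark.sql.catalyst.ScalaReflection` would fail the same way.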
On Sat, Feb 28, 2015 at 5:26 PM, Ashish Nigam <email@example.com> wrote:
Just wanted to make sure that the Scala version is not an issue here. Can the Scala version play any role? I am using Scala 2.11.5, but all the Spark packages depend on Scala 2.11.2.
On Sat, Feb 28, 2015 at 9:18 AM, Ashish Nigam <firstname.lastname@example.org> wrote:
Hi,
I wrote a very simple program in Scala to convert an existing RDD to a SchemaRDD, but the createSchemaRDD function is throwing an exception:
Exception in thread "main" scala.ScalaReflectionException: class org.apache.spark.sql.catalyst.ScalaReflection in JavaMirror with primordial classloader with boot classpath [.....] not found
Here's more info on the versions I am using -
Please let me know how I can resolve this problem.
Thanks
Ashish