spark-user mailing list archives

From Jianshi Huang <jianshi.hu...@gmail.com>
Subject Re: Unable to find org.apache.spark.sql.catalyst.ScalaReflection class
Date Fri, 13 Mar 2015 06:52:33 GMT
Forget about my last message; I was confused. Spark 1.2.1 + Scala 2.10.4
started via the SBT console command also fails with this error, while running
from a standard spark-shell works.

Jianshi

On Fri, Mar 13, 2015 at 2:46 PM, Jianshi Huang <jianshi.huang@gmail.com>
wrote:

> Hmm... looks like the console command still starts Spark 1.3.0 with Scala
> 2.11.6 even though I changed them in build.sbt.
>
> So the test with 1.2.1 is not valid.
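(For reference, a minimal build.sbt sketch that pins both versions so the console task picks them up; the coordinates and versions below are assumptions based on this thread, not a verified fix. A `reload` is needed after editing build.sbt, and a stale ivy cache can still leave an old Spark on the classpath.)

```scala
// build.sbt -- sketch only; versions taken from the thread, not verified.
scalaVersion := "2.10.4"

// %% appends the Scala binary version (_2.10) to the artifact name,
// so the Spark build and the project's Scala version must agree.
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.2.1"
```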
>
> Jianshi
>
> On Fri, Mar 13, 2015 at 2:34 PM, Jianshi Huang <jianshi.huang@gmail.com>
> wrote:
>
>> I've confirmed it only failed in console started by SBT.
>>
>> I'm using the sbt-spark-package plugin, and the initialCommands setting
>> looks like this (I added an implicit sqlContext to it):
>>
>> > show console::initialCommands
>> [info]  println("Welcome to\n" +
>> [info] "      ____              __\n" +
>> [info] "     / __/__  ___ _____/ /__\n" +
>> [info] "    _\\ \\/ _ \\/ _ `/ __/  '_/\n" +
>> [info] "   /___/ .__/\\_,_/_/ /_/\\_\\   version \"1.3.0-rc2\"\n" +
>> [info] "      /_/\n" +
>> [info] "Using Scala \"2.11.6\"\n")
>> [info]
>> [info] import org.apache.spark.SparkContext._
>> [info]
>> [info] val sc = {
>> [info]   val conf = new  org.apache.spark.SparkConf()
>> [info]     .setMaster("local")
>> [info]     .setAppName("Sbt console + Spark!")
>> [info]   new org.apache.spark.SparkContext(conf)
>> [info] }
>> [info] println("Created spark context as sc.")
>> [info]
>> [info] def time[T](f: => T): T = {
>> [info]   import System.{currentTimeMillis => now}
>> [info]   val start = now
>> [info]   try { f } finally { println("Elapsed: " + (now - start)/1000.0 +
>> " s") }
>> [info] }
>> [info]
>> [info] @transient val sqlc = new org.apache.spark.sql.SQLContext(sc)
>> [info] implicit def sqlContext = sqlc
>> [info] import sqlc._
>>
>>
>> Jianshi
>>
>>
>> On Fri, Mar 13, 2015 at 3:10 AM, Jianshi Huang <jianshi.huang@gmail.com>
>> wrote:
>>
>>> BTW, I was running tests from SBT when I got the errors. One test turns a
>>> Seq of case classes into a DataFrame.
>>>
>>> I also tried running similar code in the console, but it failed with the
>>> same error.
>>>
>>> I tested both Spark 1.3.0-rc2 and 1.2.1, with Scala 2.11.6 and 2.10.4.
>>>
>>> Any idea?
>>>
>>> Jianshi
>>>
>>> On Fri, Mar 13, 2015 at 2:23 AM, Jianshi Huang <jianshi.huang@gmail.com>
>>> wrote:
>>>
>>>> Same issue here. But the classloader in my exception is somehow
>>>> different.
>>>>
>>>> scala.ScalaReflectionException: class
>>>> org.apache.spark.sql.catalyst.ScalaReflection in JavaMirror with
>>>> java.net.URLClassLoader@53298398 of type class java.net.URLClassLoader
>>>> with classpath
>>>>
>>>>
>>>> Jianshi
>>>>
>>>> On Sun, Mar 1, 2015 at 9:32 AM, Michael Armbrust <
>>>> michael@databricks.com> wrote:
>>>>
>>>>> I think it's possible that the problem is that the Scala compiler is
>>>>> not being loaded by the primordial classloader (but instead by some child
>>>>> classloader) and thus the Scala reflection mirror fails to initialize
>>>>> when it can't find it. Unfortunately, the only solution that I know of is
>>>>> to load all required jars when the JVM starts.
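(A quick way to check this hypothesis from inside the failing REPL: walk the classloader chain of a class and see whether it bottoms out at the boot/application loader or at one of SBT's layered loaders. This is a generic, untested sketch, not Spark-specific; a class loaded by the JVM bootstrap loader reports getClassLoader == null, so its chain is empty.)

```scala
// Sketch: list the classloader chain for a class. Bootstrap-loaded classes
// (e.g. java.lang.String) have a null ClassLoader and an empty chain;
// application classes show the app loader plus its parents. Under SBT's
// in-process console, scala-reflect can arrive via a child loader, which is
// what breaks the runtime mirror.
object LoaderChain {
  def chain(cls: Class[_]): List[String] =
    Iterator
      .iterate(cls.getClassLoader)(_.getParent) // walk parent links upward
      .takeWhile(_ != null)                     // stop at the bootstrap loader
      .map(_.getClass.getName)
      .toList

  def main(args: Array[String]): Unit = {
    println(chain(classOf[String]))      // bootstrap-loaded: List()
    println(chain(LoaderChain.getClass)) // app class: app loader and parents
  }
}
```

If SBT's child loader is indeed the culprit, forking a fresh JVM (`fork := true` for `run`/`test` in build.sbt) is the usual way to "load all required jars when the JVM starts"; the `console` task itself runs in-process and cannot fork.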
>>>>>
>>>>> On Sat, Feb 28, 2015 at 5:26 PM, Ashish Nigam <ashnigamtech@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> Also, can the Scala version play any role here?
>>>>>> I am using Scala 2.11.5, but all Spark packages depend on
>>>>>> Scala 2.11.2.
>>>>>> Just wanted to make sure that the Scala version is not an issue here.
>>>>>>
>>>>>> On Sat, Feb 28, 2015 at 9:18 AM, Ashish Nigam <ashnigamtech@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>> Hi,
>>>>>>> I wrote a very simple program in Scala to convert an existing RDD to a
>>>>>>> SchemaRDD, but the createSchemaRDD function throws an exception:
>>>>>>>
>>>>>>> Exception in thread "main" scala.ScalaReflectionException: class
>>>>>>> org.apache.spark.sql.catalyst.ScalaReflection in JavaMirror with
>>>>>>> primordial classloader with boot classpath [.....] not found
>>>>>>>
>>>>>>>
>>>>>>> Here's more info on the versions I am using -
>>>>>>>
>>>>>>> <scala.binary.version>2.11</scala.binary.version>
>>>>>>> <spark.version>1.2.1</spark.version>
>>>>>>> <scala.version>2.11.5</scala.version>
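(One thing worth double-checking on the Maven side, sketched below with assumed artifact names rather than the original pom: the Spark artifactId carries a Scala binary suffix, and it must match scala.binary.version. Differing 2.11.x patch versions, such as 2.11.5 against Spark's 2.11.2, are binary compatible; a 2.10/2.11 suffix mismatch is not.)

```xml
<!-- Sketch, not from the original pom: the _${scala.binary.version} suffix
     on the artifactId must agree with the project's own Scala version. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_${scala.binary.version}</artifactId>
  <version>${spark.version}</version>
</dependency>
```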
>>>>>>>
>>>>>>> Please let me know how I can resolve this problem.
>>>>>>>
>>>>>>> Thanks
>>>>>>> Ashish
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>
>>>
>>>
>>
>>
>>
>
>
>



-- 
Jianshi Huang

LinkedIn: jianshi
Twitter: @jshuang
Github & Blog: http://huangjs.github.com/
