Also, could the Scala version play any role here?
I am using Scala 2.11.5, but all the Spark packages depend on Scala 2.11.2.
I just want to make sure the Scala version is not an issue here.
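For what it's worth, my understanding is that Scala keeps binary compatibility within a minor series, so mixing 2.11.2 and 2.11.5 on the classpath should be fine as long as every Spark artifact uses the same _2.11 suffix. My dependency block looks roughly like this (illustrative; the key point is the _${scala.binary.version} suffix on the artifactId):

```xml
<!-- Illustrative fragment: all Spark artifacts must share the same
     Scala binary suffix, here _2.11 via ${scala.binary.version} -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_${scala.binary.version}</artifactId>
  <version>${spark.version}</version>
</dependency>
```

Note that the pre-built Spark downloads are compiled against Scala 2.10 by default, so for 2.11 the _2.11 artifacts are required throughout.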

On Sat, Feb 28, 2015 at 9:18 AM, Ashish Nigam <ashnigamtech@gmail.com> wrote:
Hi,
I wrote a very simple program in Scala to convert an existing RDD to a SchemaRDD,
but the createSchemaRDD function throws an exception:

Exception in thread "main" scala.ScalaReflectionException: class org.apache.spark.sql.catalyst.ScalaReflection in JavaMirror with primordial classloader with boot classpath [.....] not found
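For reference, the code is roughly the following (Person is a stand-in for my actual record type; the implicit createSchemaRDD conversion is what triggers the reflection at runtime):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Hypothetical record type standing in for the real one
case class Person(name: String, age: Int)

object SchemaRDDDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("schema-rdd-demo").setMaster("local[*]"))
    val sqlContext = new SQLContext(sc)

    // Implicit conversion RDD[Person] => SchemaRDD (Spark 1.2 API);
    // this is where ScalaReflection gets exercised
    import sqlContext.createSchemaRDD

    val people = sc.parallelize(Seq(Person("Ann", 30), Person("Bob", 25)))
    people.registerTempTable("people") // fails here with ScalaReflectionException
  }
}
```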


Here's more info on the versions I am using -

    <scala.binary.version>2.11</scala.binary.version>
    <spark.version>1.2.1</spark.version>
    <scala.version>2.11.5</scala.version>
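One way to rule out a mixed classpath is to inspect the resolved dependency tree and look for two different scala-library versions or mismatched _2.10/_2.11 suffixes (illustrative maven-dependency-plugin invocation):

```shell
# Restrict the tree to Scala and Spark artifacts to spot version mixing
mvn dependency:tree -Dincludes=org.scala-lang,org.apache.spark
```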

Please let me know how I can resolve this problem.

Thanks
Ashish