Here are the jar files on my classpath after doing a grep for spark jars.

org.apache.spark/spark-core_2.11/2.0.0/c4d04336c142f10eb7e172155f022f86b6d11dd3/spark-core_2.11-2.0.0.jar

org.apache.spark/spark-streaming_2.11/2.0.0/7227cbd39f5952b0ed3579bc78463bcc318ecd2b/spark-streaming_2.11-2.0.0.jar

com.datastax.spark/spark-cassandra-connector_2.11/2.0.0-M3/d38ac36dde076e3364f1024985754bce84bd39d/spark-cassandra-connector_2.11-2.0.0-M3.jar

org.apache.spark/spark-launcher_2.11/2.0.0/9c3e1bd84ccb099e86ea232f5acd8fec1a61e291/spark-launcher_2.11-2.0.0.jar

org.apache.spark/spark-network-common_2.11/2.0.0/b451dae899ee8138e96319528eed64f7e849dbe2/spark-network-common_2.11-2.0.0.jar

org.apache.spark/spark-network-shuffle_2.11/2.0.0/233c036e88761424212508b2a6a55633a3cf4ec8/spark-network-shuffle_2.11-2.0.0.jar

org.apache.spark/spark-unsafe_2.11/2.0.0/9f8682d4c83ce32f08fea067c2e22aaabca27d86/spark-unsafe_2.11-2.0.0.jar

org.apache.spark/spark-tags_2.11/2.0.0/7f84a46b1e60c1981e47cae05c462fed65217eff/spark-tags_2.11-2.0.0.jar

org.spark-project.spark/unused/1.0.0/205fe37a2fade6ce6dfcf8eff57ed21a4a1c22af/unused-1.0.0.jar  
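As a quick sanity check on a listing like the one above, you can pull the artifact name, Scala binary version, and release out of each jar filename to spot mismatches at a glance. This is just a sketch: `jars.txt` is a hypothetical file standing in for the classpath listing (two sample lines are inlined here, with the hash path segments shortened).

```shell
# Build a sample jars.txt from two of the classpath lines above
# (hash directories abbreviated for readability).
cat > jars.txt <<'EOF'
org.apache.spark/spark-core_2.11/2.0.0/c4d0433/spark-core_2.11-2.0.0.jar
com.datastax.spark/spark-cassandra-connector_2.11/2.0.0-M3/d38ac36/spark-cassandra-connector_2.11-2.0.0-M3.jar
EOF

# For each jar filename of the form <artifact>_<scala>-<version>.jar,
# print "artifact scala=X version=Y" so mismatched rows stand out.
sed -n 's#.*/\([^/]*\)_\([0-9.]*\)-\([0-9][^/]*\)\.jar#\1 scala=\2 version=\3#p' jars.txt
```

Every row should show the same `scala=` value; the Spark artifacts should all show the same `version=`, while the Cassandra connector is versioned independently (here `2.0.0-M3`).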



On Wed, Oct 5, 2016 4:03 PM, Jakob Odersky jakob@odersky.com wrote:
Ok, that rules out a whole class of errors. Let's continue the diagnostic:
- How are you submitting the application to Spark?
- Which version of Spark are you using within your build tool, and how is it specified?
- Could you have dirty Ivy or Maven caches that use some locally built version of Spark?
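On the cache question, a minimal sketch for inspecting the caches follows. It assumes the default Ivy and Maven cache locations (`~/.ivy2/cache` and `~/.m2/repository`); adjust the paths if your build tool is configured differently.

```shell
# List any Spark jars sitting in the default Ivy and Maven caches so you can
# spot stale or locally built artifacts (e.g. SNAPSHOT versions).
check_spark_caches() {
  for cache in "$HOME/.ivy2/cache/org.apache.spark" \
               "$HOME/.m2/repository/org/apache/spark"; do
    if [ -d "$cache" ]; then
      echo "Spark artifacts cached under $cache:"
      find "$cache" -name '*.jar'
    else
      echo "no Spark cache at $cache"
    fi
  done
}
check_spark_caches
```

If anything looks locally built or has an unexpected version, remove that cache directory and let the build tool re-resolve the dependency from the remote repository.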

On Wed, Oct 5, 2016 at 3:35 PM, kant kodali <kanth909@gmail.com> wrote:
I am running locally, so they are all on one host.



On Wed, Oct 5, 2016 3:12 PM, Jakob Odersky jakob@odersky.com wrote:
Are all Spark and Scala versions the same? By "all" I mean the master, worker, and driver instances.
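One way to compare versions across machines: `spark-submit --version` prints both the Spark version and the Scala version it was built against. A sketch, assuming `spark-submit` is on the PATH of each machine you run it on:

```shell
# Print the Spark and Scala versions of the local installation.
# Run this on the master, each worker, and the driver machine, then compare.
if command -v spark-submit >/dev/null 2>&1; then
  spark-submit --version 2>&1
else
  echo "spark-submit not found on PATH; set SPARK_HOME or adjust PATH"
fi
```

All machines should report the same Spark version and the same Scala binary version (e.g. 2.11) as the jars on the driver's classpath.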