spark-user mailing list archives

From bluejoe2008 <bluejoe2...@gmail.com>
Subject Re: Re: mismatched hdfs protocol
Date Thu, 05 Jun 2014 07:52:40 GMT
OK, I see.
I imported the wrong jar files, which only work with the default Hadoop version.

2014-06-05 


bluejoe2008

From: prabeesh k
Date: 2014-06-05 16:14
To: user
Subject: Re: Re: mismatched hdfs protocol
If you do not set the Hadoop version, Spark is built against the default Hadoop version (1.0.4).



Before importing the Spark 1.0.0 libraries, build Spark with the command SPARK_HADOOP_VERSION=2.4.0 sbt/sbt assembly.
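The build step above can be sketched as a short shell sequence. The environment variable and sbt command are as stated in the thread; the source directory and the resulting assembly jar name are assumptions based on Spark 1.0.0-era conventions:

```shell
# From the top of a Spark 1.0.0 source checkout
# (directory name assumed):
cd spark-1.0.0

# Build the Spark assembly against Hadoop 2.4.0 instead of the
# default 1.0.4, so the bundled HDFS client speaks the Hadoop 2
# RPC protocol:
SPARK_HADOOP_VERSION=2.4.0 sbt/sbt assembly

# The resulting assembly jar (path and name assumed from the
# 1.0.0-era layout) is what goes on the project's classpath:
ls assembly/target/scala-2.10/spark-assembly-1.0.0-hadoop2.4.0.jar
```

The key point is that the Hadoop version is fixed at Spark build time, not at connection time.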





On Thu, Jun 5, 2014 at 12:28 PM, bluejoe2008 <bluejoe2008@gmail.com> wrote:

Thank you!

I am developing a Java project in the Eclipse IDE on Windows,
into which the Spark 1.0.0 libraries are imported.
Now I want to open HDFS files as input;
the Hadoop version of the HDFS cluster is 2.4.0.

2014-06-05 


bluejoe2008
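For a setup like the one described above (Spark 1.0.0 libraries in an Eclipse project against a 2.4.0 HDFS cluster), an alternative to hand-importing prebuilt jars is to let a build tool resolve dependencies and pin the Hadoop client version there. A hypothetical build.sbt fragment, assuming the standard Maven Central coordinates for Spark 1.0.0 and Hadoop 2.4.0 (an equivalent Maven pom.xml would work the same way for an Eclipse Java project):

```scala
// build.sbt (sketch): depend on Spark 1.0.0 but force the
// Hadoop 2.4.0 client, so the HDFS RPC versions match the cluster.
scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark"  %% "spark-core"    % "1.0.0",
  "org.apache.hadoop" %  "hadoop-client" % "2.4.0"
)
```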

From: prabeesh k
Date: 2014-06-05 13:23
To: user
Subject: Re: mismatched hdfs protocol
To build Spark for a particular version of Hadoop,
refer to http://spark.apache.org/docs/latest/hadoop-third-party-distributions.html



On Thu, Jun 5, 2014 at 8:14 AM, Koert Kuipers <koert@tresata.com> wrote:

You have to build Spark against the version of Hadoop you are using.




On Wed, Jun 4, 2014 at 10:25 PM, bluejoe2008 <bluejoe2008@gmail.com> wrote:

Hi all,
When my Spark program accessed HDFS files,
the following error occurred:

Exception in thread "main" org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot
communicate with client version 4 

It seems the client was trying to talk to a Hadoop 2 cluster via the old Hadoop 1 RPC protocol: IPC version 9 is spoken by Hadoop 2.x servers, while client version 4 corresponds to the Hadoop 1.x client libraries.

So my question is:
how do I specify the version of Hadoop for the connection?

thank you!

bluejoe

2014-06-05 