spark-user mailing list archives

From Ted Yu <yuzhih...@gmail.com>
Subject Re: unit tests with "java.io.IOException: Could not create FileClient"
Date Mon, 19 Jan 2015 17:36:08 GMT
Your classpath has some MapR jar.

Is that intentional?
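
If it is not, one way to keep the MapR client off the test classpath is to
exclude it in build.sbt. A minimal sketch, assuming the jar arrives as a
transitive dependency of spark-core and that the MapR artifacts use their
usual "com.mapr.hadoop" coordinates (the Spark version is illustrative;
check what your dependency graph actually reports before copying this):

  // build.sbt (sketch): drop MapR artifacts pulled in transitively,
  // so Spark falls back to the stock Hadoop file system classes.
  libraryDependencies += ("org.apache.spark" %% "spark-core" % "1.2.0")
    .excludeAll(ExclusionRule(organization = "com.mapr.hadoop"))

It is also worth checking whether a MapR core-site.xml is on the test
classpath setting fs.defaultFS (or fs.default.name) to a maprfs:// URI,
since that can be enough to route textFile() through MapRFileSystem.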

Cheers

On Mon, Jan 19, 2015 at 6:58 AM, Jianguo Li <flyingfromchina@gmail.com>
wrote:

> Hi,
>
> I created some unit tests for some of the functions in my project that
> use Spark. However, after building with sbt and running "sbt test", I ran
> into "java.io.IOException: Could not create FileClient":
>
> 2015-01-19 08:50:38,1894 ERROR Client fs/client/fileclient/cc/client.cc:385 Thread: -2 Failed to initialize client for cluster 127.0.0.1:7222, error Unknown error(108)
> num lines: 21
> [info] TextFileAdapterTestSuite:
> [info] - Checking the RDD Vector Length *** FAILED ***
> [info]   java.io.IOException: Could not create FileClient
> [info]   at com.mapr.fs.MapRFileSystem.lookupClient(MapRFileSystem.java:351)
> [info]   at com.mapr.fs.MapRFileSystem.lookupClient(MapRFileSystem.java:363)
> [info]   at com.mapr.fs.MapRFileSystem.getMapRFileStatus(MapRFileSystem.java:795)
> [info]   at com.mapr.fs.MapRFileSystem.getFileStatus(MapRFileSystem.java:822)
> [info]   at org.apache.hadoop.fs.FileSystem.getFileStatus(FileSystem.java:1419)
> [info]   at org.apache.hadoop.fs.FileSystem.globStatusInternal(FileSystem.java:1092)
> [info]   at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1031)
> [info]   at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:231)
> [info]   at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:277)
> [info]   at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:199)
> [info]   ...
>
> The only tests that failed, which I believe led to this exception, are the
> ones where my functions call SparkContext's textFile() method. I tried to
> debug this and found that the exception seems to occur inside the
> textFile() call. Does anybody know what the issue is and how to fix it?
> I used the local host for the SparkContext; does that have anything to do
> with this exception?
>
>
> Thanks,
>
> Jianguo
>
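
For reference, the kind of test described above can usually be wired up
against a purely local SparkContext. A minimal sketch follows; the suite
and test names are taken from the log above, but the setup code, file
path, and assertion are illustrative, not the original code:

  import org.apache.spark.{SparkConf, SparkContext}
  import org.scalatest.{BeforeAndAfterAll, FunSuite}

  class TextFileAdapterTestSuite extends FunSuite with BeforeAndAfterAll {
    private var sc: SparkContext = _

    override def beforeAll(): Unit = {
      // Local master, no cluster required for the test run.
      sc = new SparkContext(
        new SparkConf().setMaster("local[2]").setAppName("TextFileAdapterTest"))
    }

    override def afterAll(): Unit = {
      sc.stop()
    }

    test("Checking the RDD Vector Length") {
      // Reads a small fixture file from the local file system.
      val lines = sc.textFile("src/test/resources/sample.txt")
      assert(lines.count() > 0)
    }
  }

With a clean classpath (no MapR jars and no MapR core-site.xml), a test
like this should resolve the path through the local file system rather
than com.mapr.fs.MapRFileSystem.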
