spark-user mailing list archives

From hakanilter <>
Subject Re: Problem with loading files: Loss was due to
Date Wed, 21 May 2014 22:19:35 GMT
The problem was solved after adding the hadoop-core dependency. But I think there
is a misunderstanding about local files. I found this one:

"Note that if you've connected to a Spark master, it's possible that it will
attempt to load the file on one of the different machines in the cluster, so
make sure it's available on all the cluster machines. In general, in future
you will want to put your data in HDFS, S3, or similar file systems to avoid
this problem."

This means that you can't reliably use local files with Spark. I don't understand
why, because after calling addFile() or textFile(), the file can be
downloaded by every node in the cluster and becomes accessible.
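For what it's worth, the pattern I had in mind looks roughly like this (a minimal Scala sketch; the file path and name are made up, and it assumes an already-constructed SparkContext `sc`):

```scala
import org.apache.spark.SparkFiles
import scala.io.Source

// Ship a driver-local file to the working directory of every executor.
sc.addFile("/tmp/lookup.txt")

// Inside a task, resolve the node-local copy via SparkFiles.get,
// so each executor reads the file from its own disk.
val counts = sc.parallelize(Seq(1, 2, 3)).map { _ =>
  val localPath = SparkFiles.get("lookup.txt")
  val src = Source.fromFile(localPath)
  try src.getLines().size finally src.close()
}
```

Note that textFile() by itself does not distribute anything: the path is resolved on whichever node runs the task, which is why the docs recommend HDFS or S3 for input data.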

Anyway, if you get a "Loss was due to" error, make sure
that the Hadoop libraries are available on the classpath.
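In case it helps anyone else, the fix amounts to a dependency entry like the following (a Maven sketch; the version number here is an example for the Hadoop 1.x line, so match it to whatever your cluster actually runs):

```xml
<!-- hadoop-core: version is illustrative, use your cluster's Hadoop release -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-core</artifactId>
  <version>1.2.1</version>
</dependency>
```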


