spark-user mailing list archives

From Akhil Das <ak...@sigmoidanalytics.com>
Subject Re: Spark throws rsync: change_dir errors on startup
Date Thu, 02 Apr 2015 05:51:58 GMT
Error 23 is defined as a "partial transfer" and might be caused by
filesystem incompatibilities, such as different character sets or access
control lists. In this case it could be caused by the double slashes (//
at the end of sbin). You could try editing your sbin/spark-daemon.sh file:
look for the rsync command inside the file and add -v to it to see what
exactly is going wrong.
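The double-slash symptom can be reproduced in isolation; a minimal sketch (the variable names are hypothetical, not Spark's actual script contents):

```shell
# Minimal sketch (hypothetical names): joining a directory that already ends
# in "/" with a path segment that starts with "/" yields the "sbin//right"
# form seen in the rsync error message.
sbin_dir="/usr/local/spark130/sbin/"   # trailing slash already present
src="${sbin_dir}/right"                # appending "/right" doubles the slash
echo "$src"                            # prints /usr/local/spark130/sbin//right
```

To see the actual transfer details, locate the rsync invocation inside sbin/spark-daemon.sh (e.g. with `grep -n rsync sbin/spark-daemon.sh`) and add -v to that command before restarting the cluster.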

Thanks
Best Regards

On Wed, Apr 1, 2015 at 7:25 PM, Horsmann, Tobias <tobias.horsmann@uni-due.de
> wrote:

>  Hi,
>
>  I try to set up a minimal 2-node spark cluster for testing purposes.
> When I start the cluster with start-all.sh I get a rsync error message:
>
>  rsync: change_dir "/usr/local/spark130/sbin//right" failed: No such file
> or directory (2)
> rsync error: some files/attrs were not transferred (see previous errors)
> (code 23) at main.c(1183) [sender=3.1.0]
>
>  (For clarification, my 2 nodes are called ‚right‘ and ‚left‘, referring
> to the physical machines standing in front of me.)
> It seems that a file named after my master node ‚right‘ is expected to
> exist and the synchronisation with it fails as it does not exist.
> I don’t understand what Spark is trying to do here. Why does it expect
> this file to exist and what content should it have?
>  I assume I did something wrong in my configuration setup – can someone
> interpret this error message and tell me where this error is coming from?
>
>  Regards,
> Tobias
>
