spark-user mailing list archives

From Akhil Das <ak...@sigmoidanalytics.com>
Subject Re: pySpark saveAsSequenceFile append overwrite
Date Wed, 03 Dec 2014 08:46:53 GMT
You can't append to existing output with Spark's native saveAs* calls; they
always check whether the output directory already exists and, if it does, throw
an error. People usually use Hadoop's getmerge utility to combine the
output files.

Thanks
Best Regards
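
A minimal PySpark sketch of the usual workaround (not from this thread): remove
the output directory before saving to get "overwrite" behaviour by hand, then
combine the part files with Hadoop's getmerge as the reply suggests. The output
path, app name, sample RDD, and local merge target below are illustrative
assumptions.

import subprocess

from pyspark import SparkContext

sc = SparkContext(appName="sequencefile-overwrite-example")

# Hypothetical output path; saveAsSequenceFile() refuses to write into an
# existing directory, so delete it first to emulate "overwrite".
output_dir = "hdfs:///tmp/example_seqfile"
subprocess.call(["hadoop", "fs", "-rm", "-r", "-f", output_dir])

# Save an RDD of key/value pairs as a SequenceFile.
rdd = sc.parallelize([(1, "a"), (2, "b"), (3, "c")])
rdd.saveAsSequenceFile(output_dir)

# To combine the resulting part-* files, the reply points to Hadoop's
# getmerge utility (hypothetical local target path).
subprocess.call(["hadoop", "fs", "-getmerge", output_dir, "/tmp/merged_seqfile"])

sc.stop()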

On Tue, Dec 2, 2014 at 8:10 PM, Csaba Ragany <ragesz@gmail.com> wrote:

> Dear Spark community,
>
> Does the pySpark saveAsSequenceFile(<folder>) method have the ability to append
> the new sequence file to another one, or to overwrite an existing
> sequence file? If the <folder> already exists, I get an error message...
>
> Thank You!
> Csaba
>
