spark-user mailing list archives

From "Maiti, Samya" <>
Subject Re: Writing to a single file from multiple executors
Date Thu, 12 Mar 2015 16:47:18 GMT
Hi TD,

I want to append my records to an Avro file which will later be used for querying.

Having a single file is not mandatory for us, but then how can we make the executors append
the Avro data to multiple files?
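One way to get each executor writing its own Avro file, rather than all contending for one, is to save each micro-batch with Hadoop's `AvroKeyOutputFormat`: every partition then becomes a separate `part-r-NNNNN` file, so no two tasks touch the same file. A minimal sketch, assuming a `DStream[GenericRecord]` named `stream`, a schema `mySchema`, and the `avro-mapred` artifact on the classpath (the output path is hypothetical):

```scala
import org.apache.avro.generic.GenericRecord
import org.apache.avro.mapred.AvroKey
import org.apache.avro.mapreduce.{AvroJob, AvroKeyOutputFormat}
import org.apache.hadoop.io.NullWritable
import org.apache.hadoop.mapreduce.Job

// Each batch goes to its own directory; inside it, each partition
// writes its own Avro part file, so there is no concurrent access.
stream.foreachRDD { (rdd, time) =>
  val job = Job.getInstance()
  AvroJob.setOutputKeySchema(job, mySchema)   // mySchema: your Avro record schema
  rdd.map(rec => (new AvroKey[GenericRecord](rec), NullWritable.get()))
     .saveAsNewAPIHadoopFile(
       s"hdfs:///data/events/batch-${time.milliseconds}",  // hypothetical path
       classOf[AvroKey[GenericRecord]],
       classOf[NullWritable],
       classOf[AvroKeyOutputFormat[GenericRecord]],
       job.getConfiguration)
}
```

The per-batch directories can later be read back together with a glob path (e.g. `hdfs:///data/events/*`), or compacted offline if small files become a problem.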

On Mar 12, 2015, at 4:09 AM, Tathagata Das <<>> wrote:

Why do you have to write a single file?

On Wed, Mar 11, 2015 at 1:00 PM, SamyaMaiti <<>> wrote:
Hi Experts,

I have a scenario where I want to write to an Avro file from a streaming
job that reads data from Kafka.

But the issue is that, as there are multiple executors, when all of them try to write
to a given file I get a concurrency exception.

One way to mitigate the issue is to repartition and have a single writer task,
but as my data volume is huge that is not a feasible option.
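The single-writer workaround mentioned above can be sketched as follows; this is an illustration of the bottleneck, not a recommendation (the `writeAvro` helper is hypothetical):

```scala
// Funnel every batch through one partition so only one task writes the
// file. This avoids the concurrency exception, but all data for a batch
// flows through a single executor, which does not scale to large volumes.
stream.foreachRDD { rdd =>
  rdd.repartition(1).foreachPartition { records =>
    // writeAvro is a hypothetical helper that opens the Avro file,
    // appends the records with a DataFileWriter, and closes it.
    writeAvro("hdfs:///data/events.avro", records)
  }
}
```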

Any suggestions are welcome.


Sent from the Apache Spark User List mailing list archive at<>.


