spark-user mailing list archives

From Rishi Shah <rishishah.s...@gmail.com>
Subject Re: [pyspark 2.3.0] Task was denied committing errors
Date Thu, 07 Nov 2019 01:27:10 GMT
Any suggestions?

On Wed, Nov 6, 2019 at 7:30 AM Rishi Shah <rishishah.star@gmail.com> wrote:

> Hi All,
>
> I have two relatively big tables, and joining them keeps throwing
> TaskCommitDenied errors. The job eventually succeeds, but I was wondering
> what these errors mean and whether there's a solution?
>
> --
> Regards,
>
> Rishi Shah
>


-- 
Regards,

Rishi Shah
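
For context on the question above: Spark raises TaskCommitDenied when the OutputCommitCoordinator refuses a task attempt's request to commit its output, typically because another attempt of the same task (a speculative duplicate, or one from a retried stage attempt) was granted the commit first. The job can still succeed, as here, because the winning attempt's output is kept. A minimal mitigation sketch, assuming speculative execution is the source of the duplicate attempts; the script name and launch command are assumptions, not taken from this thread:

```shell
# Sketch: resubmit the join job with speculative execution disabled,
# so duplicate task attempts do not race the OutputCommitCoordinator.
# "job.py" is a placeholder for the actual PySpark script.
spark-submit \
  --conf spark.speculation=false \
  job.py
```

If speculation is already off, repeated stage retries (e.g. from executor loss or fetch failures during the large shuffle join) can produce the same denied-commit messages, in which case the retries themselves are the symptom worth investigating.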
