spark-user mailing list archives

From Rishi Shah <rishishah.s...@gmail.com>
Subject Re: [pyspark 2.3.0] Task was denied committing errors
Date Sun, 10 Nov 2019 19:24:34 GMT
Hi Team,

I could really use your insight here; any help is appreciated!

Thanks,
Rishi


On Wed, Nov 6, 2019 at 8:27 PM Rishi Shah <rishishah.star@gmail.com> wrote:

> Any suggestions?
>
> On Wed, Nov 6, 2019 at 7:30 AM Rishi Shah <rishishah.star@gmail.com>
> wrote:
>
>> Hi All,
>>
>> I have two relatively big tables, and a join on them keeps throwing
>> TaskCommitErrors. The job eventually succeeds, but I was wondering what
>> these errors are and whether there's any solution?
>>
>> --
>> Regards,
>>
>> Rishi Shah
>>
>
>
> --
> Regards,
>
> Rishi Shah
>


-- 
Regards,

Rishi Shah
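
[Archive note] For context, a task-commit denial is raised when Spark's
OutputCommitCoordinator refuses a task attempt's request to commit its output,
typically because another attempt of the same task (a speculative duplicate, or
a retry after a stage failure) was already authorized to commit; the job can
still succeed once a winning attempt commits. A minimal mitigation sketch,
assuming the job is submitted via spark-submit (the script name below is a
placeholder, not from this thread):

```shell
# Illustrative spark-submit invocation; join_job.py is a placeholder name.
# Disabling speculation avoids duplicate task attempts racing to commit the
# same output partition (speculation is off by default, but cluster-wide
# defaults may have enabled it).
spark-submit \
  --conf spark.speculation=false \
  join_job.py
```

Whether this helps depends on why attempts are being duplicated; if the
denials stem from stage retries after fetch failures rather than speculation,
the shuffle/executor failures should be investigated instead.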
