spark-user mailing list archives

From Georg Heiler <georg.kf.hei...@gmail.com>
Subject Re: custom column types for JDBC datasource writer
Date Thu, 06 Jul 2017 04:49:51 GMT
Great, thanks!
But for the current release, is there any way to catch the exception and
handle it, i.e. not have Spark merely log it to the console?
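
[Editor's note: one workaround often cited for released Spark versions, not mentioned in this thread, is to change the type mapping itself rather than catch the exception: register a custom `JdbcDialect` whose `getJDBCType` overrides the default `VARCHAR(255)` for strings. The Oracle URL prefix and the CLOB mapping below are illustrative assumptions; this is a sketch, not the thread author's solution.]

```scala
import java.sql.Types
import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects, JdbcType}
import org.apache.spark.sql.types.{DataType, StringType}

// Hypothetical dialect mapping StringType to CLOB instead of the
// default VARCHAR(255); shown for an Oracle-style JDBC URL.
object ClobDialect extends JdbcDialect {
  override def canHandle(url: String): Boolean =
    url.startsWith("jdbc:oracle")

  override def getJDBCType(dt: DataType): Option[JdbcType] = dt match {
    case StringType => Some(JdbcType("CLOB", Types.CLOB))
    case _          => None // fall back to Spark's default mapping
  }
}

// Register once, before calling df.write.jdbc(...)
JdbcDialects.registerDialect(ClobDialect)
```

With the dialect registered, subsequent `df.write.jdbc(...)` calls against a matching URL create string columns as CLOB, so the length exception never arises.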

Takeshi Yamamuro <linguin.m.s@gmail.com> wrote on Thu., 6 Jul 2017 at
06:44:

> -dev +user
>
> You can in master; see
> https://github.com/apache/spark/commit/c7911807050227fcd13161ce090330d9d8daa533
> This option will be available in the next release.
>
> // maropu
>
> On Thu, Jul 6, 2017 at 1:25 PM, Georg Heiler <georg.kf.heiler@gmail.com>
> wrote:
>
>> Hi,
>> is it possible to somehow make Spark use something bigger than
>> VARCHAR(255), e.g. CLOB, for strings?
>>
>> If not, is it at least possible to catch the exception that is thrown?
>> To me it seems that Spark is catching and logging it, so I can no longer
>> intervene and handle it:
>>
>>
>> https://stackoverflow.com/questions/44927764/spark-jdbc-oracle-long-string-fields
>>
>> Regards,
>> Georg
>>
>
>
>
> --
> ---
> Takeshi Yamamuro
>
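
[Editor's note: the commit linked in the reply above adds a write-side option, named `createTableColumnTypes` in that commit, which takes a DDL-style list of per-column type overrides used when Spark creates the target table; the specified types must be valid Spark SQL data types. A minimal sketch of its use; the URL, table name, and column below are illustrative assumptions:]

```scala
import java.util.Properties
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("jdbc-types").getOrCreate()
val df = spark.range(5).selectExpr("CAST(id AS STRING) AS name")

// Illustrative connection details; replace with your own.
val props = new Properties()

df.write
  // Override the default VARCHAR(255) for the listed columns
  // when Spark issues the CREATE TABLE statement.
  .option("createTableColumnTypes", "name VARCHAR(1024)")
  .jdbc("jdbc:oracle:thin:@//host:1521/service", "people", props)
```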
