spark-dev mailing list archives

From Denise Mauldin <denise.maul...@gmail.com>
Subject Adding uuid support
Date Mon, 09 Nov 2020 22:29:58 GMT
Hello,

When I use PySpark to save to a PostgreSQL database, I run into an error
where UUID insert statements are not constructed properly. There are a number
of Stack Overflow questions about the same issue, for example:

https://stackoverflow.com/questions/64671739/pyspark-nullable-uuid-type-uuid-but-expression-is-of-type-character-varying
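For context, one workaround I've seen suggested (I haven't confirmed it is the
right long-term fix) is adding the PostgreSQL JDBC driver's
stringtype=unspecified parameter to the connection URL, so the server infers
the target column type instead of rejecting the varchar parameter. A minimal
sketch, with a hypothetical table and credentials:

```python
# Sketch of the commonly suggested workaround: pass
# "stringtype=unspecified" in the JDBC URL so PostgreSQL will coerce
# string parameters into uuid columns server-side.
# Table name, host, and credentials below are placeholders.
jdbc_options = {
    "url": "jdbc:postgresql://localhost:5432/mydb?stringtype=unspecified",
    "dbtable": "events",          # hypothetical table with a uuid column
    "user": "spark",
    "driver": "org.postgresql.Driver",
}

# With a live SparkSession and database, the write would look like:
# df.write.format("jdbc").options(**jdbc_options).mode("append").save()
```

This avoids the "expression is of type character varying" error in my testing
elsewhere, but it feels like a driver-level band-aid rather than real UUID
support in Spark's JDBC dialect, which is why I'm asking here.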

I would like to add support for saving UUIDs to PostgreSQL in PySpark.

How do I identify what is causing this error? Is this something that needs
to be fixed in the PySpark code, the Apache Spark code, or the PostgreSQL
JDBC driver? Does anyone have advice on how I should approach fixing this
issue?

Thanks,
Denise
