spark-dev mailing list archives

From Jungtaek Lim <kabhwan.opensou...@gmail.com>
Subject SQL DDL statements when replacing the default catalog with a custom catalog
Date Wed, 07 Oct 2020 00:54:54 GMT
Hi devs,

I'm not sure whether it's addressed in Spark 3.1, but at least as of Spark
3.0.1, many SQL DDL statements don't seem to go through the custom catalog
when I replace the default catalog with a custom catalog and provide only
'dbName.tableName' as the table identifier.
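For concreteness, here's a minimal sketch of the setup I mean
(com.example.MyTableCatalog is a placeholder for any custom TableCatalog
implementation on the classpath):

import org.apache.spark.sql.SparkSession

// Register a custom v2 catalog and make it the default catalog.
val spark = SparkSession.builder()
  .master("local[*]")
  .config("spark.sql.catalog.my_catalog", "com.example.MyTableCatalog")
  .config("spark.sql.defaultCatalog", "my_catalog")
  .getOrCreate()

// With only a two-part identifier, DDL statements like these seem to be
// routed to the built-in session (v1) catalog instead of my_catalog:
spark.sql("ALTER TABLE dbName.tableName SET TBLPROPERTIES ('k' = 'v')")
spark.sql("DROP TABLE dbName.tableName")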

I'm not an expert in this area, but after skimming the code I feel the
TempViewOrV1Table extractor looks to be broken for this case, as the
identifier it matches can still be a V2 table. Classifying the table
identifier as either a V2 table or a "temp view or v1 table" looks to be
mandatory, as the former and the latter go through different code paths and
different catalog interfaces.
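To illustrate the ambiguity with a sketch (continuing the session above):
a fully qualified three-part name pins the v2 path, while the two-part
form is what appears to get mis-classified:

// Two-part identifier: can match TempViewOrV1Table and take the v1 path,
// even though the default catalog is my_catalog.
spark.sql("ALTER TABLE dbName.tableName SET TBLPROPERTIES ('k' = 'v')")

// Three-part identifier: unambiguously resolves through my_catalog's
// v2 TableCatalog interface.
spark.sql("ALTER TABLE my_catalog.dbName.tableName SET TBLPROPERTIES ('k' = 'v')")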

That sounds to me like we're stuck, and the only "clear" approach seems to
be disallowing replacement of the default catalog with a custom one. Am I
missing something?

Thanks,
Jungtaek Lim (HeartSaVioR)
