spark-dev mailing list archives

From Jacek Laskowski <>
Subject How to resolve UnresolvedRelations (to explore FindDataSourceTable)?
Date Tue, 16 Jan 2018 10:43:44 GMT

I've been exploring Spark Analyzer's FindDataSourceTable rule and found it
hard to explain why, in the following example, only one of the two
UnresolvedRelations gets resolved.

Could you help me with the places marked as FIXME?

scala> spark.version
res0: String = 2.4.0-SNAPSHOT

// Create tables
// FIXME Is there a more idiomatic way of creating tables for demos?

import org.apache.spark.sql.catalyst.dsl.plans._
val plan = table("t1").insertInto(tableName = "t2", overwrite = true)
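For reference, here is how the tables could be set up beforehand (a sketch, and the reason for the FIXME above; saveAsTable with the default data source would be consistent with the Relation[id#10L] parquet node in the output further down):

```scala
// Assumed setup, not necessarily the idiomatic way:
// spark.range gives a single bigint column "id", and saveAsTable
// writes with the default data source (parquet).
spark.range(1).write.saveAsTable("t1")
spark.range(1).write.saveAsTable("t2")
```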

// Transform the logical plan with ResolveRelations logical rule first
// so UnresolvedRelations become UnresolvedCatalogRelations
import spark.sessionState.analyzer.ResolveRelations
val planWithUnresolvedCatalogRelations = ResolveRelations(plan)
scala> println(planWithUnresolvedCatalogRelations.numberedTreeString)
00 'InsertIntoTable 'UnresolvedRelation `t2`, true, false
01 +- 'SubqueryAlias t1
02    +- 'UnresolvedCatalogRelation `default`.`t1`,

// Let's resolve UnresolvedCatalogRelations then
import org.apache.spark.sql.execution.datasources.FindDataSourceTable
val r = new FindDataSourceTable(spark)
val tablesResolvedPlan = r(planWithUnresolvedCatalogRelations)
// FIXME Why is t2 not resolved?!
scala> println(tablesResolvedPlan.numberedTreeString)
00 'InsertIntoTable 'UnresolvedRelation `t2`, true, false
01 +- SubqueryAlias t1
02    +- Relation[id#10L] parquet

Why is t2 not resolved?! Have I missed a rule that should be applied to the
logical plan? Which one? Or is this simply not supposed to work, given
"Inserting into an RDD-based table is not allowed." [1]?
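For comparison, running the entire Analyzer rather than the two rules in isolation might show whether t2 is resolvable at all. Just a sketch, assuming Analyzer.execute (inherited from RuleExecutor[LogicalPlan]) is accessible in spark-shell the same way ResolveRelations was above:

```scala
// Sketch: apply the full rule-based analyzer to the original plan
// instead of invoking ResolveRelations/FindDataSourceTable one by one.
val analyzed = spark.sessionState.analyzer.execute(plan)
println(analyzed.numberedTreeString)
```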


Jacek Laskowski
Mastering Spark SQL
Spark Structured Streaming
Mastering Kafka Streams
