spark-user mailing list archives

From Mohamed Nadjib MAMI <>
Subject Spark SQL - java.lang.StackOverflowError after caching table
Date Thu, 24 Mar 2016 09:43:40 GMT
Hi all,
I'm running SQL queries (sqlContext.sql()) on Parquet tables and facing 
a problem with table caching (sqlContext.cacheTable()) in the 
spark-shell of Spark 1.5.1.

After I run sqlContext.cacheTable(table), the first sqlContext.sql(query) 
takes longer (as expected, since caching is lazy and the first query 
materialises the cache), but it finishes and returns results. The weird 
thing is that when I run the same query again, it fails with 
"java.lang.StackOverflowError".
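For reference, the sequence I run looks roughly like this (the path, table name, and query are placeholders):

```scala
// spark-shell, Spark 1.5.1 -- sqlContext is provided by the shell
val df = sqlContext.read.parquet("/path/to/data.parquet")
df.registerTempTable("myTable")       // expose the Parquet data as a SQL table

sqlContext.cacheTable("myTable")      // mark the table for in-memory caching (lazy)

sqlContext.sql("SELECT col, COUNT(*) FROM myTable GROUP BY col").show()
// first run: slower, materialises the cache, returns results

sqlContext.sql("SELECT col, COUNT(*) FROM myTable GROUP BY col").show()
// second run: throws java.lang.StackOverflowError
```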

I Googled the error but found no reports of it occurring with table 
caching. Any hint is appreciated.
