spark-issues mailing list archives

From "Andrew Or (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (SPARK-8628) Race condition in AbstractSparkSQLParser.parse
Date Thu, 02 Jul 2015 18:49:04 GMT

     [ https://issues.apache.org/jira/browse/SPARK-8628?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Andrew Or updated SPARK-8628:
-----------------------------
    Fix Version/s:     (was: 1.4.2)
                   1.4.1

> Race condition in AbstractSparkSQLParser.parse
> ----------------------------------------------
>
>                 Key: SPARK-8628
>                 URL: https://issues.apache.org/jira/browse/SPARK-8628
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.3.0, 1.3.1, 1.4.0
>            Reporter: Santiago M. Mola
>            Assignee: Vinod KC
>            Priority: Critical
>              Labels: regression
>             Fix For: 1.4.1, 1.5.0
>
>
> SPARK-5009 introduced the following code in AbstractSparkSQLParser:
> {code}
> def parse(input: String): LogicalPlan = {
>   // Initialize the keywords.
>   lexical.initialize(reservedWords)
>   phrase(start)(new lexical.Scanner(input)) match {
>     case Success(plan, _) => plan
>     case failureOrError => sys.error(failureOrError.toString)
>   }
> }
> {code}
> The corresponding initialize method in SqlLexical is not thread-safe:
> {code}
> /* This is a workaround to support the lazy setting */
> def initialize(keywords: Seq[String]): Unit = {
>   reserved.clear()
>   reserved ++= keywords
> }
> {code}
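>
> To make the unsafe window concrete, here is a standalone sketch (the names are illustrative, standing in for SqlLexical's shared mutable reserved set rather than reproducing it exactly):
> {code}
> import scala.collection.mutable
>
> // Stand-in for SqlLexical's shared mutable keyword set.
> val reserved = mutable.HashSet("SELECT", "FROM", "WHERE")
>
> def initialize(keywords: Seq[String]): Unit = {
>   reserved.clear()        // window opens: the set is briefly empty
>   reserved ++= keywords   // window closes
> }
>
> // A scanner thread interleaved inside that window sees no keywords:
> reserved.contains("SELECT")  // may return false mid-initialize, so
>                              // "SELECT" is tokenized as an identifier
> {code}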
> I'm hitting this when parsing multiple SQL queries concurrently. When parsing of one query starts, it clears the reserved keyword list; any other query being parsed in that window then fails, because keywords are tokenized as plain identifiers while the list is empty.
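>
> A minimal way to reproduce (a sketch; it assumes a SQLContext named sqlContext is in scope, and the query is illustrative):
> {code}
> import scala.concurrent.{Await, Future}
> import scala.concurrent.duration._
> import scala.concurrent.ExecutionContext.Implicits.global
>
> // Parse the same query from many threads at once. With the racy
> // initialize(), a fraction of the calls fail intermittently with
> // errors reporting a keyword where an identifier was expected.
> val attempts = (1 to 100).map { _ =>
>   Future(sqlContext.sql("SELECT 1 AS x"))
> }
> attempts.foreach(a => Await.result(a, 1.minute))
> {code}
> One possible direction for a fix (a sketch only; the committed patch may differ) is to run the keyword initialization exactly once through a lazy val inside AbstractSparkSQLParser. Scala initializes a lazy val under a monitor, so the one-time setup is thread-safe and later parse calls never clear the shared set again:
> {code}
> // Initialize keywords at most once; lazy val initialization is
> // synchronized, so concurrent first callers are safe.
> protected lazy val initLexical: Unit = lexical.initialize(reservedWords)
>
> def parse(input: String): LogicalPlan = {
>   initLexical
>   phrase(start)(new lexical.Scanner(input)) match {
>     case Success(plan, _) => plan
>     case failureOrError => sys.error(failureOrError.toString)
>   }
> }
> {code}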



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

