spark-issues mailing list archives

From "Dongjoon Hyun (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (SPARK-15462) Checking `resolved === false` is enough for testcases.
Date Sat, 21 May 2016 07:17:12 GMT

     [ https://issues.apache.org/jira/browse/SPARK-15462?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun updated SPARK-15462:
----------------------------------
    Description: 
In the `catalyst` module alone, there are 7 evaluation test cases on unresolved expressions.
In real-world situations, those cases never happen, because such expressions raise exceptions before
evaluation.

{code}
scala> sql("select format_number(null, 3)")
res0: org.apache.spark.sql.DataFrame = [format_number(CAST(NULL AS DOUBLE), 3): string]

scala> sql("select format_number(cast(null as NULL), 3)")
org.apache.spark.sql.catalyst.parser.ParseException:
DataType null() is not supported.(line 1, pos 34)
{code}

This PR makes those test cases more realistic.
{code}
-    checkEvaluation(FormatNumber(Literal.create(null, NullType), Literal(3)), null)
+    assert(FormatNumber(Literal.create(null, NullType), Literal(3)).resolved === false)
{code}
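
As a minimal sketch of what the rewritten assertion checks (assuming a spark-shell or test with the catalyst classes on the classpath; not taken from the actual test suite): a `FormatNumber` built on a `NullType` literal fails input type checking, so it never becomes resolved and would never reach evaluation.
{code}
// Minimal sketch, assuming the catalyst classes are importable.
import org.apache.spark.sql.catalyst.expressions.{FormatNumber, Literal}
import org.apache.spark.sql.types.NullType

val expr = FormatNumber(Literal.create(null, NullType), Literal(3))

// FormatNumber expects a numeric first argument, so the NullType literal fails
// input type checking and the expression stays unresolved.
assert(!expr.resolved)
{code}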

This PR also removes a redundant `resolved` check in the `FoldablePropagation` optimizer.
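
For context, a hedged illustration of why such a check is redundant (this is not the actual `FoldablePropagation` source, just the general pattern): the optimizer only runs on analyzed plans, so any `Alias` over a foldable, fully typed child is already resolved and an extra `resolved` guard never changes the outcome.
{code}
// Hypothetical sketch, not the actual optimizer rule: by optimization time the plan
// is analyzed, so `resolved` is already true for the expressions the rule inspects.
import org.apache.spark.sql.catalyst.expressions.{Alias, Literal}

val a = Alias(Literal(1), "one")()

// Guarding on `a.resolved` here is redundant; the condition reduces to the
// foldability test alone.
assert(a.resolved && a.child.foldable)
assert(a.child.foldable)
{code}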

  was:
In the `catalyst` module alone, there are 7 evaluation test cases on unresolved expressions.
In real-world situations, those cases never happen, because exceptions are raised before
evaluation.

{code}
scala> sql("select format_number(null, 3)")
res0: org.apache.spark.sql.DataFrame = [format_number(CAST(NULL AS DOUBLE), 3): string]

scala> sql("select format_number(cast(null as NULL), 3)")
org.apache.spark.sql.catalyst.parser.ParseException:
DataType null() is not supported.(line 1, pos 34)
{code}

This PR makes those test cases more realistic.
{code}
-    checkEvaluation(FormatNumber(Literal.create(null, NullType), Literal(3)), null)
+    assert(FormatNumber(Literal.create(null, NullType), Literal(3)).resolved === false)
{code}

This PR also removes a redundant `resolved` check in the `FoldablePropagation` optimizer.


> Checking `resolved === false` is enough for testcases.
> ------------------------------------------------------
>
>                 Key: SPARK-15462
>                 URL: https://issues.apache.org/jira/browse/SPARK-15462
>             Project: Spark
>          Issue Type: Test
>          Components: SQL, Tests
>            Reporter: Dongjoon Hyun
>            Priority: Minor
>
> In the `catalyst` module alone, there are 7 evaluation test cases on unresolved expressions.
> In real-world situations, those cases never happen, because such expressions raise exceptions before
> evaluation.
> {code}
> scala> sql("select format_number(null, 3)")
> res0: org.apache.spark.sql.DataFrame = [format_number(CAST(NULL AS DOUBLE), 3): string]
> scala> sql("select format_number(cast(null as NULL), 3)")
> org.apache.spark.sql.catalyst.parser.ParseException:
> DataType null() is not supported.(line 1, pos 34)
> {code}
> This PR makes those test cases more realistic.
> {code}
> -    checkEvaluation(FormatNumber(Literal.create(null, NullType), Literal(3)), null)
> +    assert(FormatNumber(Literal.create(null, NullType), Literal(3)).resolved === false)
> {code}
> This PR also removes a redundant `resolved` check in the `FoldablePropagation` optimizer.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
