spark-issues mailing list archives

From "Ankit Raj Boudh (Jira)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-32306) `approx_percentile` in Spark SQL gives incorrect results
Date Thu, 16 Jul 2020 03:30:00 GMT

    [ https://issues.apache.org/jira/browse/SPARK-32306?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17158863#comment-17158863
] 

Ankit Raj Boudh commented on SPARK-32306:
-----------------------------------------

[~seanmalory], I will raise the PR for this soon.

> `approx_percentile` in Spark SQL gives incorrect results
> --------------------------------------------------------
>
>                 Key: SPARK-32306
>                 URL: https://issues.apache.org/jira/browse/SPARK-32306
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark, SQL
>    Affects Versions: 2.4.4
>            Reporter: Sean Malory
>            Priority: Major
>
> The `approx_percentile` function in Spark SQL does not give the correct result. I'm not
sure how incorrect it is; it may just be a boundary issue. From the docs:
> {quote}The accuracy parameter (default: 10000) is a positive numeric literal which controls
approximation accuracy at the cost of memory. Higher value of accuracy yields better accuracy,
1.0/accuracy is the relative error of the approximation.
> {quote}
> This is not true. Here is a minimal example in `pyspark` where, essentially, the median
of 5 and 8 is being calculated as 5:
> {code:python}
> import pyspark.sql.functions as psf
> df = spark.createDataFrame(
>     [('bar', 5), ('bar', 8)], ['name', 'val']
> )
> median = psf.expr('percentile_approx(val, 0.5, 2147483647)')
> df.groupBy('name').agg(median.alias('median')).show()    # shows the median as 5
> {code}
> I've tested this with Spark v2.4.4 and pyspark v2.4.5, although I suspect this is an issue
with the underlying algorithm.
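As a quick sanity check of the numbers in the report (plain Python, not Spark code): the docs quoted above say 1.0/accuracy is the relative error of the approximation, so with accuracy = 2147483647 the bound is about 4.7e-10 and the result should be virtually exact, while the exact interpolated median of 5 and 8 is 6.5.

```python
import statistics

# Accuracy literal from the example above.
accuracy = 2147483647

# Per the quoted docs, 1.0/accuracy is the relative error of the
# approximation -- here roughly 4.7e-10, i.e. nearly exact.
relative_error = 1.0 / accuracy

# Exact (interpolated) median of the two values in the example.
exact_median = statistics.median([5, 8])

print(relative_error, exact_median)  # ~4.66e-10 and 6.5
```

Note that `percentile_approx` may legitimately return one of the input values rather than an interpolated midpoint; whether returning 5 here violates the documented error bound is the question this issue raises.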



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

