Date: Sun, 1 Dec 2019 23:47:00 +0000 (UTC)
From: "Xiao Li (Jira)"
To: issues@spark.apache.org
Subject: [jira] [Updated] (SPARK-29699) Different answers in nested aggregates with window functions

     [ https://issues.apache.org/jira/browse/SPARK-29699?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xiao Li updated SPARK-29699:
----------------------------
    Labels: correctness  (was: )

> Different answers in nested aggregates with window functions
> -------------------------------------------------------------
>
>                 Key: SPARK-29699
>                 URL: https://issues.apache.org/jira/browse/SPARK-29699
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Takeshi Yamamuro
>            Priority: Major
>              Labels: correctness
>
> A nested aggregate with a window function, shown below, seems to return different answers in the `rsum` column between PgSQL and Spark:
> {code:java}
> postgres=# create table gstest2 (a integer, b integer, c integer, d integer, e integer, f integer, g integer, h integer);
> postgres=# insert into gstest2 values
> postgres-#   (1, 1, 1, 1, 1, 1, 1, 1),
> postgres-#   (1, 1, 1, 1, 1, 1, 1, 2),
> postgres-#   (1, 1, 1, 1, 1, 1, 2, 2),
> postgres-#   (1, 1, 1, 1, 1, 2, 2, 2),
> postgres-#   (1, 1, 1, 1, 2, 2, 2, 2),
> postgres-#   (1, 1, 1, 2, 2, 2, 2, 2),
> postgres-#   (1, 1, 2, 2, 2, 2, 2, 2),
> postgres-#   (1, 2, 2, 2, 2, 2, 2, 2),
> postgres-#   (2, 2, 2, 2, 2, 2, 2, 2);
> INSERT 0 9
> postgres=#
> postgres=# select a, b, sum(c), sum(sum(c)) over (order by a,b) as rsum
> postgres-#   from gstest2 group by rollup (a,b) order by rsum, a, b;
>  a | b | sum | rsum
> ---+---+-----+------
>  1 | 1 |  16 |   16
>  1 | 2 |   4 |   20
>  1 |   |  20 |   40
>  2 | 2 |   4 |   44
>  2 |   |   4 |   48
>    |   |  24 |   72
> (6 rows)
> {code}
> {code:java}
> scala> sql("""
>      | select a, b, sum(c), sum(sum(c)) over (order by a,b) as rsum
>      | from gstest2 group by rollup (a,b) order by rsum, a, b
>      | """).show()
> +----+----+------+----+
> |   a|   b|sum(c)|rsum|
> +----+----+------+----+
> |null|null|    12|  12|
> |   1|null|    10|  22|
> |   1|   1|     8|  30|
> |   1|   2|     2|  32|
> |   2|null|     2|  34|
> |   2|   2|     2|  36|
> +----+----+------+----+
> {code}
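To rerun the Spark side of the report, the gstest2 data has to be loaded first. Below is a minimal sketch for a spark-shell session (where the spark implicits are already in scope); the temporary view name and the toDF column names simply mirror the PostgreSQL table from the report and are assumptions, not part of the original reproduction:

{code:java}
// Sketch: load the same nine rows that the PostgreSQL session inserted,
// so the query from the report can be rerun verbatim in spark-shell.
val gstest2 = Seq(
  (1, 1, 1, 1, 1, 1, 1, 1),
  (1, 1, 1, 1, 1, 1, 1, 2),
  (1, 1, 1, 1, 1, 1, 2, 2),
  (1, 1, 1, 1, 1, 2, 2, 2),
  (1, 1, 1, 1, 2, 2, 2, 2),
  (1, 1, 1, 2, 2, 2, 2, 2),
  (1, 1, 2, 2, 2, 2, 2, 2),
  (1, 2, 2, 2, 2, 2, 2, 2),
  (2, 2, 2, 2, 2, 2, 2, 2)
).toDF("a", "b", "c", "d", "e", "f", "g", "h")  // column names mirror the PG table
gstest2.createOrReplaceTempView("gstest2")

// Same query as in the report; PostgreSQL's rsum column ends at 72,
// while the Spark output quoted above ends at 36.
spark.sql("""
  select a, b, sum(c), sum(sum(c)) over (order by a, b) as rsum
  from gstest2 group by rollup (a, b) order by rsum, a, b
""").show()
{code}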
--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org