spark-dev mailing list archives

From "563280193@qq.com" <563280...@qq.com>
Subject spark sql error
Date Fri, 22 Mar 2019 07:39:19 GMT
Hi,
I ran a Spark SQL query like this:

    select imei, tag, product_id,
      sum(case when succ1 >= 1 then 1 else 0 end) as succ,
      sum(case when fail1 >= 1 and succ1 = 0 then 1 else 0 end) as fail,
      count(*) as cnt
    from t_tbl
    where sum(case when succ1 >= 1 then 1 else 0 end) = 0
      and sum(case when fail1 >= 1 and succ1 = 0 then 1 else 0 end) > 0
    group by tag, product_id, app_version

It produced the error below:

 execute, tree:
Exchange hashpartitioning(imei#0, tag#1, product_id#2, 100)
+- *(1) HashAggregate(keys=[imei#0, tag#1, product_id#2], functions=[partial_sum(cast(CASE WHEN (succ1#3L >= 1) THEN 1 ELSE 0 END as bigint)), partial_sum(cast(CASE WHEN ((fail1#4L >= 1) && (succ1#3L = 0)) THEN 1 ELSE 0 END as bigint)), partial_count(1)], output=[imei#0, tag#1, product_id#2, sum#49L, sum#50L, count#51L])
   +- *(1) Filter ((sum(cast(CASE WHEN (succ1#3L >= 1) THEN 1 ELSE 0 END as bigint)) = 0) && (sum(cast(CASE WHEN ((fail1#4L >= 1) && (succ1#3L = 0)) THEN 1 ELSE 0 END as bigint)) > 0))
      +- *(1) FileScan json [imei#0,tag#1,product_id#2,succ1#3L,fail1#4L] Batched: false, Format: JSON, Location: InMemoryFileIndex[hdfs://xxxxxx], PartitionFilters: [], PushedFilters: [], ReadSchema: struct<imei:string,tag:string,product_id:string,succ1:bigint,fail1:bigint>


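I wonder whether the cause is that the WHERE clause contains aggregate functions, which standard SQL only allows after grouping. If so, would moving those conditions into a HAVING clause be the right fix? A sketch of what I mean (using imei, tag, product_id as the grouping keys, since that is what the plan above shows; the column names are just taken from my query):

    select imei, tag, product_id,
      sum(case when succ1 >= 1 then 1 else 0 end) as succ,
      sum(case when fail1 >= 1 and succ1 = 0 then 1 else 0 end) as fail,
      count(*) as cnt
    from t_tbl
    group by imei, tag, product_id
    having sum(case when succ1 >= 1 then 1 else 0 end) = 0
       and sum(case when fail1 >= 1 and succ1 = 0 then 1 else 0 end) > 0
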
Could anyone help me solve this problem?
My Spark version is 2.3.1.
Thank you.



563280193@qq.com