spark-issues mailing list archives

From "Apache Spark (JIRA)" <>
Subject [jira] [Assigned] (SPARK-15392) The default value of size estimation is not good
Date Wed, 18 May 2016 21:38:13 GMT


Apache Spark reassigned SPARK-15392:

    Assignee: Apache Spark  (was: Davies Liu)

> The default value of size estimation is not good
> ------------------------------------------------
>                 Key: SPARK-15392
>                 URL:
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 2.0.0
>            Reporter: Davies Liu
>            Assignee: Apache Spark
> We use autoBroadcastJoinThreshold + 1L as the default value of the size estimation. That
is not good in 2.0, because we now recalculate the size based on the size of the schema, so the estimate
can fall below autoBroadcastJoinThreshold if there is a SELECT on top of a DataFrame
created from an RDD.
> We should use an even bigger default value, for example, Long.MaxValue.
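The arithmetic behind the bug can be sketched without Spark itself. The snippet below is a hypothetical model of size-only statistics propagation: a leaf of unknown size defaults to `autoBroadcastJoinThreshold + 1` (just large enough to suppress broadcast joins), but a SELECT (Project) node re-estimates its size by scaling the child's size by the ratio of the projected row width to the child row width. The names (`AUTO_BROADCAST_THRESHOLD`, `estimate_project_size`) are illustrative, not actual Spark identifiers.

```python
# Hypothetical sketch of Spark 2.0-style schema-based size re-estimation.
# Not Spark code; it only models the arithmetic described in the issue.

AUTO_BROADCAST_THRESHOLD = 10 * 1024 * 1024  # spark.sql.autoBroadcastJoinThreshold default (10 MB)

# Default estimate for a leaf of unknown size (e.g. a DataFrame created from an RDD):
# threshold + 1, intended to be "just big enough" to never trigger a broadcast join.
default_leaf_size = AUTO_BROADCAST_THRESHOLD + 1

def estimate_project_size(child_size, child_row_bytes, output_row_bytes):
    """Scale the child's size estimate by the ratio of projected row width
    to child row width, mimicking schema-based re-estimation under a Project."""
    return int(child_size * output_row_bytes / child_row_bytes)

# A SELECT keeping 16 of 80 bytes per row shrinks the estimate 5x...
projected = estimate_project_size(default_leaf_size,
                                  child_row_bytes=80,
                                  output_row_bytes=16)

# ...so the estimate now falls below the threshold, and the planner would
# wrongly pick a broadcast join for a table of unknown (possibly huge) size.
assert projected < AUTO_BROADCAST_THRESHOLD
```

Using `Long.MaxValue` as the default would keep the estimate above the threshold even after any such scaling, which is the fix the issue proposes.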

This message was sent by Atlassian JIRA

