spark-issues mailing list archives

From "Srinivasarao Daruna (JIRA)" <>
Subject [jira] [Created] (SPARK-12596) spark.sql.autoBroadcastJoinThreshold should be int
Date Fri, 01 Jan 2016 13:38:39 GMT
Srinivasarao Daruna created SPARK-12596:

             Summary: spark.sql.autoBroadcastJoinThreshold should be int
                 Key: SPARK-12596
             Project: Spark
          Issue Type: Bug
          Components: SQL
            Reporter: Srinivasarao Daruna


I tried to set 4 GB as the broadcast join threshold, but received an error message.

The code used is:
sqlContext.setConf("spark.sql.autoBroadcastJoinThreshold", "4294967296")
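For reference, a minimal standalone driver reproducing the reported calls might look like the sketch below. It assumes a local Spark 1.x application; the object name BroadcastThresholdRepro and the local[*] master are illustrative and not part of the original report.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object BroadcastThresholdRepro {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("repro").setMaster("local[*]"))
    val sqlContext = new SQLContext(sc)

    // Largest value that fits in a 32-bit signed Int; this value is accepted.
    sqlContext.setConf("spark.sql.autoBroadcastJoinThreshold", "2147483647")

    // 4 GB, as above; this is the call that was reported to fail.
    sqlContext.setConf("spark.sql.autoBroadcastJoinThreshold", "4294967296")

    sc.stop()
  }
}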

However, the broadcast join threshold appears to be handled as a 4-byte (32-bit signed) integer, so the maximum value that can be set is 2147483647. Anything above that fails.
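To make the 4-byte limit concrete, the following standalone Scala sketch (names are illustrative, and it assumes the config string is ultimately parsed into a JVM Int) shows that 2147483647 is the largest representable value, while 4294967296 cannot be parsed into an Int at all:

object IntLimitDemo {
  def main(args: Array[String]): Unit = {
    println(Int.MaxValue)         // 2147483647, the largest 32-bit signed integer
    println("2147483647".toInt)   // parses fine
    try {
      "4294967296".toInt          // 4 GB does not fit in a 32-bit signed Int
    } catch {
      case e: NumberFormatException =>
        println("4294967296 does not fit in an Int: " + e.getMessage)
    }
  }
}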

I think the limit should be determined by available memory, not by the size of the variable type used to hold the setting.
