spark-user mailing list archives

From Sunil Tripathy <sunil.tripa...@gmail.com>
Subject Spark Memory Allocation Exception
Date Fri, 09 Sep 2016 22:42:04 GMT
Hi,
  I am using Spark 1.6 to load a historical activity dataset covering the last 3-4 years and to write it out as a parquet file partitioned by day. I am getting the following exception while the insert command is running to load the data into the parquet partitions.

org.apache.hadoop.hive.ql.metadata.HiveException:
parquet.hadoop.MemoryManager$1: New Memory allocation 1047552 bytes is
smaller than the minimum allocation size of 1048576 bytes.
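For context, the job is shaped roughly like the sketch below (the table names and the "day" partition column are placeholders, not the real ones, and it assumes an existing SparkContext sc as in spark-shell):

import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)

// Dynamic partitioning so each day's data lands in its own partition.
hiveContext.setConf("hive.exec.dynamic.partition", "true")
hiveContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")

// Insert the staged history into the parquet-backed, day-partitioned table.
hiveContext.sql(
  """INSERT OVERWRITE TABLE activity_parquet PARTITION (day)
    |SELECT * FROM activity_staging
  """.stripMargin)

The exception appears to be thrown while the parquet writers for the individual day partitions are being opened.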

The input data is around 350 GB, and the cluster has around 145 nodes with 384 GB of memory on each node.
Any pointers to resolve the issue will be appreciated.

Thanks
