sqoop-user mailing list archives

From Harpreet Singh <hs.kund...@gmail.com>
Subject Sqoop job to import data failing due to physical memory breach
Date Wed, 02 Aug 2017 11:00:45 GMT
Hi All,
I have a Sqoop job running in production that fails intermittently; restarting
the job completes successfully.
The logs show the container being killed for breaching its physical memory
limit: "Current usage: 2.3 GB of 2 GB physical memory used; 4.0 GB of 4.2 GB
virtual memory used. Killing container."
Environment:
CDH 5.8.3
Sqoop 1 client
mapreduce.map.java.opts=-Djava.net.preferIPv4Stack=true -Xmx1717986918
mapreduce.map.memory.mb=2048 (i.e. 2 GB)
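If I am reading the settings right, the heap alone nearly fills the container:

  1717986918 bytes / 1024^3 ≈ 1.6 GiB heap (-Xmx)
  2048 MB container - 1.6 GiB heap ≈ 0.4 GiB left for off-heap

so anything off-heap (Parquet write buffers, the Netezza JDBC driver,
metaspace, thread stacks) beyond roughly 0.4 GiB would push the process past
the 2 GB limit, which matches the 2.3 GB usage in the kill message.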

Sqoop job details: pulling data from Netezza with 6 mappers and writing it as
Parquet files on HDFS. The data processed is about 14 GB, and the splits
appear to be even.
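One workaround I am considering is raising the container and heap sizes for
this job only, via Hadoop's generic -D options (which Sqoop accepts before the
tool-specific arguments). This is only a rough sketch: the connection string,
user, table, and target directory below are placeholders, and the heap is kept
to roughly 80% of the container to leave room for off-heap memory:

  # Placeholder host/db/user/table/path; adjust memory to taste.
  sqoop import \
    -Dmapreduce.map.memory.mb=3072 \
    -Dmapreduce.map.java.opts="-Djava.net.preferIPv4Stack=true -Xmx2400m" \
    --connect jdbc:netezza://nz-host:5480/PRODDB \
    --username etl_user -P \
    --table SOURCE_TABLE \
    --num-mappers 6 \
    --as-parquetfile \
    --target-dir /data/source_table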
Please provide your insights.

Regards
Harpreet Singh
