spark-user mailing list archives

From "癫、砜" <>
Subject Spark Pipe wrapException
Date Thu, 27 Mar 2014 12:07:45 GMT
When I use RDD.pipe("program") to analyze data, Spark throws a wrapped exception. What is odd is that the native program only does "scanf" and "printf". When the scale of the data is small, everything works, but when the scale of the data increases, we get these exceptions.
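For anyone unfamiliar with it: RDD.pipe feeds each partition's elements as lines to the external program's stdin and reads its stdout lines back as the resulting RDD. Conceptually, per partition, it behaves something like this plain-Python sketch (the function name `pipe_partition` is mine, and `cat` stands in for the scanf/printf program):

```python
import subprocess

def pipe_partition(lines, command):
    """Feed each element as one stdin line to the external command and
    return its stdout lines -- roughly what RDD.pipe does per partition."""
    proc = subprocess.Popen(
        command, stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True
    )
    # communicate() writes all input, closes stdin, and drains stdout;
    # a real worker streams instead, so a slow or buffering child program
    # can stall the reader on large partitions.
    out, _ = proc.communicate("\n".join(lines) + "\n")
    return out.splitlines()

# `cat` simply echoes its input, so the output matches the input lines:
result = pipe_partition(["1", "2", "3"], ["cat"])
```

If the child program buffers its printf output and only flushes on exit, the reading side can sit idle long enough to hit a timeout, which matches "small data works, large data fails".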
We tried to analyze the cause: the stack trace reports a socket timeout, and the timeout is about 60 seconds, so we added "dfs.socket.timeout" to "dfs.xml", but it did not help.
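For reference, that property is normally set in hdfs-site.xml and takes a value in milliseconds (the 60-second default matches the observed delay). A sketch of the entry, with 180000 as an arbitrary example value:

```xml
<property>
  <name>dfs.socket.timeout</name>
  <!-- milliseconds; the default of 60000 matches the ~60s timeout seen -->
  <value>180000</value>
</property>
```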
Here is the error stack; maybe someone has hit the same problem as me. Looking forward to a reply.