spark-user mailing list archives

From Krishnaprasad <krishnaprasad.naraya...@conduent.com>
Subject py4j.protocol.Py4JNetworkError: Error while receiving Socket.timeout: timed out
Date Sat, 30 Sep 2017 10:22:28 GMT
Hi all,

I am developing an application that runs on Apache Spark (set up on a single
node), and as part of the implementation I am using PySpark version 2.2.0.

Environment - OS is Ubuntu 14.04 and the Python version is 3.4.

I am getting the error shown below. It would be helpful if
somebody could suggest a quick resolution or workaround for this problem:

Traceback (most recent call last):
  File "/usr/local/spark-2.2.0-bin-hadoop2.7/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1028, in send_command
    answer = smart_decode(self.stream.readline()[:-1])
  File "/usr/lib/python3.4/socket.py", line 374, in readinto
    return self._sock.recv_into(b)
socket.timeout: timed out

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/spark-2.2.0-bin-hadoop2.7/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 883, in send_command
    response = connection.send_command(command)
  File "/usr/local/spark-2.2.0-bin-hadoop2.7/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1040, in send_command
    "Error while receiving", e, proto.ERROR_ON_RECEIVE)
py4j.protocol.Py4JNetworkError: Error while receiving

Process Process-1:
Traceback (most recent call last):
  File "/usr/lib/python3.4/multiprocessing/process.py", line 254, in _bootstrap
    self.run()
  File "/usr/lib/python3.4/multiprocessing/process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/local/spark-2.2.0-bin-hadoop2.7/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1133, in __call__
    answer, self.gateway_client, self.target_id, self.name)
  File "/usr/local/spark-2.2.0-bin-hadoop2.7/python/pyspark/sql/utils.py", line 63, in deco
    return f(*a, **kw)
  File "/usr/local/spark-2.2.0-bin-hadoop2.7/python/lib/py4j-0.10.4-src.zip/py4j/protocol.py", line 327, in get_return_value
    format(target_id, ".", name))
py4j.protocol.Py4JError: An error occurred while calling o180.fit
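Would raising Spark's network timeout, executor heartbeat interval, and driver
memory at submit time be a reasonable workaround? Something along the lines
below is what I have in mind (the values and the script name `your_app.py` are
illustrative, not settings I have verified):

```shell
# Illustrative sketch: raise Py4J/network timeouts and driver memory
# when launching the job. Values are assumptions, not confirmed fixes.
spark-submit \
  --master local[*] \
  --driver-memory 4g \
  --conf spark.network.timeout=600s \
  --conf spark.executor.heartbeatInterval=60s \
  your_app.py
```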

Thanks,
Krishnaprasad





