spark-user mailing list archives

From zenglong chen <>
Subject spark python script importError problem
Date Tue, 16 Jul 2019 11:15:24 GMT
      When I run a Python script via spark-submit, it works fine in
local[*] mode, but not in standalone mode or YARN mode. The error is below:

Caused by: org.apache.spark.api.python.PythonException: Traceback (most
recent call last):
  File "/usr/local/lib/python2.7/dist-packages/pyspark/", line
364, in main
    func, profiler, deserializer, serializer = read_command(pickleSer,
  File "/usr/local/lib/python2.7/dist-packages/pyspark/", line 69,
in read_command
    command = serializer._read_with_length(file)
  File "/usr/local/lib/python2.7/dist-packages/pyspark/",
line 172, in _read_with_length
    return self.loads(obj)
  File "/usr/local/lib/python2.7/dist-packages/pyspark/",
line 583, in loads
    return pickle.loads(obj)
ImportError: No module named feature.user.user_feature

The script also runs fine with "sbin/ sbin/", but
it hits the same ImportError with "sbin/
sbin/". The conf/slaves file contains only 'localhost'.

What should I do to solve this import problem? Thanks!
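For reference, the traceback suggests the `feature.user.user_feature` module exists only on the driver machine, so executors on other nodes (or in separate worker processes) cannot import it. One common remedy, offered here as a hedged sketch rather than a confirmed fix, is to zip the package and ship it to executors via `spark-submit --py-files` or `SparkContext.addPyFile`. The helper below only builds such a zip; the directory layout is an assumption based on the import name in the traceback:

```python
import os
import zipfile

def zip_package(pkg_dir, zip_path):
    """Zip a Python package directory so the top-level package name
    (e.g. 'feature') is preserved inside the archive. Executors can
    then import feature.user.user_feature once the zip is shipped
    with --py-files or sc.addPyFile."""
    # Archive paths are made relative to the package's parent, so
    # 'feature/__init__.py' sits at the top of the zip.
    root = os.path.dirname(os.path.abspath(pkg_dir))
    with zipfile.ZipFile(zip_path, "w") as zf:
        for dirpath, _, filenames in os.walk(pkg_dir):
            for name in filenames:
                full = os.path.join(dirpath, name)
                zf.write(full, os.path.relpath(full, root))
    return zip_path
```

The resulting archive would then be passed along, e.g. `spark-submit --py-files feature.zip your_script.py`, or registered at runtime with `sc.addPyFile("feature.zip")` before any transformations that use the module are executed (paths and script names here are illustrative, not from the original post).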
