spark-user mailing list archives

From "李奇平" <>
Subject Can't find pyspark when using PySpark on YARN
Date Tue, 10 Jun 2014 13:35:17 GMT
Dear all,

When I submit a pyspark application using this command:
./bin/spark-submit --master yarn-client examples/src/main/python/ "hdfs://..."
I get the following exception:
Error from python worker:
Traceback (most recent call last):
File "/usr/ali/lib/python2.5/", line 85, in run_module
loader = get_loader(mod_name)
File "/usr/ali/lib/python2.5/", line 456, in get_loader
return find_loader(fullname)
File "/usr/ali/lib/python2.5/", line 466, in find_loader
for importer in iter_importers(fullname):
File "/usr/ali/lib/python2.5/", line 422, in iter_importers
ImportError: No module named pyspark
Maybe `pyspark/python` and `` are not included on the YARN workers. How can
I distribute these files with my application? Can I use `--py-files`? Or
how can I package the pyspark modules into a .egg file?
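A common workaround (a sketch, not from the original thread) is to zip the
`pyspark` package from the Spark distribution yourself and ship it with
`--py-files`. The paths below are assumptions: `SPARK_HOME` must point at
your Spark install, and `my_app.py` stands in for your actual script.

```shell
# Assumption: SPARK_HOME points at your local Spark distribution
cd "$SPARK_HOME/python"

# Bundle the pyspark package into a zip that Python can import from
zip -rq /tmp/pyspark.zip pyspark

# Ship the zip to the YARN workers alongside the application
# (my_app.py is a placeholder for your own script)
"$SPARK_HOME/bin/spark-submit" --master yarn-client \
  --py-files /tmp/pyspark.zip \
  my_app.py "hdfs://..."
```

Files passed via `--py-files` are placed on each executor's `PYTHONPATH`,
so the workers should then be able to resolve `import pyspark`.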
