spark-user mailing list archives

From sdeb <>
Subject hadoop + yarn + spark
Date Sat, 28 Jun 2014 00:00:32 GMT

I have installed Spark on top of Hadoop + YARN.
When I launch the pyspark shell and try to compute something, I get this error:

Error from python worker:
  /usr/bin/python: No module named pyspark

The pyspark module should be there; do I have to add a link to it somewhere (e.g. on the Python path)?
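A common cause of this error is that the worker's Python cannot find the pyspark sources bundled with Spark, so they must be put on PYTHONPATH. A minimal sketch of the usual workaround, assuming Spark is installed at /opt/spark (a hypothetical path; the py4j zip version also varies by Spark release, so check the actual filename under $SPARK_HOME/python/lib):

```shell
# Hypothetical install location; adjust to where Spark actually lives.
export SPARK_HOME=/opt/spark

# Put the bundled pyspark sources and the py4j zip on PYTHONPATH so that
# the Python workers spawned by Spark can import the pyspark module.
# The py4j version string below is an assumption; verify it against
# the file actually shipped in $SPARK_HOME/python/lib.
export PYTHONPATH="$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.8.1-src.zip:$PYTHONPATH"
```

These exports can go in conf/spark-env.sh (or the shell profile) on every node so the YARN containers inherit them as well.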

