spark-user mailing list archives

From Christian <>
Subject pyspark and Python virtual environments
Date Wed, 05 Mar 2014 12:54:31 GMT

I usually create a separate Python virtual environment for each project, both
to avoid version conflicts and to avoid needing root privileges to install
libraries.

How can I tell pyspark to activate a virtual environment before executing
its tasks?
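For context, the setup I have in mind looks roughly like the following sketch. It relies on Spark's `PYSPARK_PYTHON` environment variable, which selects the Python interpreter used to launch workers; the environment path and the installed package are just placeholders for illustration.

```shell
# Create an isolated environment (no root needed); path is hypothetical
virtualenv ~/envs/sparkjob

# Install project dependencies into the environment, not system-wide
~/envs/sparkjob/bin/pip install numpy

# Point Spark at the environment's interpreter. PYSPARK_PYTHON is read
# when Python workers are started, so they pick up the venv's packages.
export PYSPARK_PYTHON=~/envs/sparkjob/bin/python
pyspark
```

What I am unsure about is whether this is the intended mechanism, and how it behaves on a cluster where the environment path may not exist on every worker node.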

Further info on virtual envs:

Thanks in advance,
