spark-user mailing list archives

From Hossein Vatani <vhp1...@yahoo.com>
Subject Python in worker has different version 2.7 than that in driver 3.5, PySpark cannot run with different minor versions
Date Mon, 29 Feb 2016 06:05:07 GMT
Hi,
Affects Version/s: 1.6.0
Component/s: PySpark

I ran into the exception below when I tried to run the samples from
http://spark.apache.org/docs/latest/api/python/pyspark.sql.html?highlight=filter#pyspark.sql.SQLContext.jsonRDD
:
Exception: Python in worker has different version 2.7 than that in driver
3.5, PySpark cannot run with different minor versions
My OS is CentOS 7 and I installed Anaconda3; I also have to keep Python 2.7
for another application. I run Spark with:
PYSPARK_DRIVER_PYTHON=ipython3 pyspark
I have no configuration referring to "python" or "ipython" in my profile or in
spark-defaults.conf.
Could you please assist me?
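For reference, the error arises because only the driver was pointed at Python 3, while the workers fell back to the system default (Python 2.7). A minimal sketch of a launch that keeps both sides on the same interpreter, assuming Anaconda3 is installed under /opt/anaconda3 (adjust the paths to the actual install location):

```shell
# Workers use PYSPARK_PYTHON; the driver uses PYSPARK_DRIVER_PYTHON.
# Both must resolve to the same minor Python version (here, 3.x).
export PYSPARK_PYTHON=/opt/anaconda3/bin/python3
export PYSPARK_DRIVER_PYTHON=/opt/anaconda3/bin/ipython3

pyspark
```

These variables can also be set cluster-wide in conf/spark-env.sh so every worker picks them up, leaving the system Python 2.7 untouched for the other application.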




--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Python-in-worker-has-different-version-2-7-than-that-in-driver-3-5-PySpark-cannot-run-with-differents-tp26356.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org

