spark-issues mailing list archives

From "Hyukjin Kwon (JIRA)" <j...@apache.org>
Subject [jira] [Assigned] (SPARK-26831) bin/pyspark: avoid hardcoded `python` command and improve version checks
Date Fri, 08 Feb 2019 02:49:00 GMT

     [ https://issues.apache.org/jira/browse/SPARK-26831?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon reassigned SPARK-26831:
------------------------------------

    Assignee: Stefaan Lippens

> bin/pyspark: avoid hardcoded `python` command and improve version checks
> ------------------------------------------------------------------------
>
>                 Key: SPARK-26831
>                 URL: https://issues.apache.org/jira/browse/SPARK-26831
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>    Affects Versions: 2.4.0
>            Reporter: Stefaan Lippens
>            Assignee: Stefaan Lippens
>            Priority: Major
>             Fix For: 3.0.0
>
>
> (this originally started at https://github.com/apache/spark/pull/23736)
> I was trying out pyspark on a system with only a {{python3}} command but no {{python}} command and got this error:
> {code}
> /opt/spark/bin/pyspark: line 45: python: command not found
> {code}
> While the pyspark script defines plenty of variables for referring to a Python interpreter, there is still a hardcoded {{python}} used for
> {code}
> WORKS_WITH_IPYTHON=$(python -c 'import sys; print(sys.version_info >= (2, 7, 0))')
> {code}
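> One way to avoid the hardcoded command would be to run the check through an interpreter variable the script already works with (a sketch, assuming {{PYSPARK_DRIVER_PYTHON}} is populated by this point, with a {{python3}} fallback otherwise):
> {code}
> # Reuse the resolved driver interpreter instead of a bare `python`
> WORKS_WITH_IPYTHON=$("${PYSPARK_DRIVER_PYTHON:-python3}" -c 'import sys; print(sys.version_info >= (2, 7, 0))')
> {code}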
> While looking into this, I also noticed that the bash syntax for the IPython version check is wrong:
> {code}
> if [[ ! $WORKS_WITH_IPYTHON ]]
> {code}
> always evaluates to false when {{$WORKS_WITH_IPYTHON}} is non-empty, which it is in both the "True" and "False" cases.




