spark-issues mailing list archives

From "Hyukjin Kwon (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-26807) Confusing documentation regarding installation from PyPi
Date Sat, 02 Feb 2019 03:49:00 GMT

    [ https://issues.apache.org/jira/browse/SPARK-26807?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16758866#comment-16758866 ]

Hyukjin Kwon commented on SPARK-26807:
--------------------------------------

Can you post a PR?

> Confusing documentation regarding installation from PyPi
> --------------------------------------------------------
>
>                 Key: SPARK-26807
>                 URL: https://issues.apache.org/jira/browse/SPARK-26807
>             Project: Spark
>          Issue Type: Documentation
>          Components: Documentation
>    Affects Versions: 2.4.0
>            Reporter: Emmanuel Arias
>            Priority: Minor
>
> Hello!
> I am new to Spark. Reading the documentation, I found the Downloading section a little confusing.
> [https://spark.apache.org/docs/latest/#downloading] says: "Scala and Java users can include Spark in their projects using its Maven coordinates
> and in the future Python users can also install Spark from PyPI.", which I read as meaning that Spark is not yet available on PyPI. But
> [https://spark.apache.org/downloads.html] says: "[PySpark|https://pypi.python.org/pypi/pyspark]
> is now available in pypi. To install just run {{pip install pyspark}}."
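
For what it's worth, a minimal sketch of checking a PyPI-based install (assuming {{pip install pyspark}} has already been run in a local Python environment; the app name and local master below are illustrative only, not from the docs being quoted):

{code:python}
# Minimal sketch (assumption: PySpark was installed via `pip install pyspark`
# into a local Python environment; the app name is illustrative only).
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .master("local[1]")
         .appName("pypi-install-check")
         .getOrCreate())

print(spark.version)           # e.g. "2.4.0" for the release mentioned above
print(spark.range(5).count())  # trivial job to confirm the local install works

spark.stop()
{code}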



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

