Thanks for the specific mention of the new PySpark packaging, Shivaram.
For *nix (Linux, Unix, OS X, etc.) Python users interested in helping test the new artifacts, you can do the following:
Set up PySpark with pip:
1. Download the PySpark packaging artifact (pyspark-2.1.0+hadoop2.7.tar.gz)
2. (Optional): Create a virtual env (e.g. virtualenv /tmp/pysparktest; source /tmp/pysparktest/bin/activate)
3. (Possibly required depending on pip version): Upgrade pip to a recent version (e.g. pip install --upgrade pip)
4. Install the package with pip install pyspark-2.1.0+hadoop2.7.tar.gz
5. If you have SPARK_HOME set to any specific path, unset it to force the pip-installed PySpark to run with its provided jars
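Putting the setup steps together, a typical session might look something like the sketch below (the virtualenv path is just an example, and the tarball is assumed to be in the current directory):

```shell
# (Optional) create and activate a throwaway virtual env
virtualenv /tmp/pysparktest
source /tmp/pysparktest/bin/activate

# Upgrade pip in case the installed version is too old for the packaging
pip install --upgrade pip

# Install the downloaded RC artifact
pip install pyspark-2.1.0+hadoop2.7.tar.gz

# Unset SPARK_HOME so the pip-installed PySpark runs with its provided jars
unset SPARK_HOME
```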
In the future we hope to publish to PyPI, allowing you to skip the download step, but there just wasn't a chance to get that part included for this release. If everything goes smoothly, hopefully we can add that soon (see SPARK-18128).
Some things to verify:
1) Verify you can start the PySpark shell (e.g. run pyspark)
2) Verify you can start PySpark from python (e.g. run python, verify you can import pyspark and construct a SparkContext).
3) Verify your PySpark programs work with pip-installed PySpark as well as with a regular Spark distribution (e.g. spark-submit my-workload.py)
4) Have a different version of Spark downloaded locally as well? Verify that it launches and runs correctly and that the pip-installed PySpark does not take precedence (make sure to use the fully qualified path when executing).
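A rough sketch of those checks as shell commands (my-workload.py and the second Spark path are placeholders for whatever you have locally):

```shell
# 1) The PySpark shell starts
pyspark

# 2) pyspark is importable from a plain python session and can build a SparkContext
python -c "import pyspark; sc = pyspark.SparkContext('local[1]', 'pip-test'); print(sc.version); sc.stop()"

# 3) spark-submit runs a PySpark program against the pip-installed package
spark-submit my-workload.py

# 4) A separately downloaded Spark still runs via its fully qualified path
/path/to/other/spark/bin/spark-submit my-workload.py
```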
Some things that are explicitly not supported in pip installed PySpark:
1) Starting a new standalone cluster with pip installed PySpark (connecting to an existing standalone cluster is expected to work)
2) non-Python Spark interfaces (e.g. don't pip install pyspark for SparkR; use the SparkR packaging instead :))
3) Python versions prior to 2.7
Post verification cleanup:
1. Uninstall the pip installed PySpark since it is just an RC and you don't want it getting in the way later (e.g. pip uninstall pyspark)
2. (Optional): Deactivate your virtual env (e.g. deactivate)
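For reference, the cleanup boils down to two commands (assuming you installed into a virtualenv as above):

```shell
# Remove the RC package so it doesn't shadow a real release later (-y skips the prompt)
pip uninstall -y pyspark

# (Optional) leave the virtual env you created for testing
deactivate
```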
If anyone has any questions about the new PySpark packaging I'm more than happy to chat :)