From Davies Liu <dav...@databricks.com>
Subject Re: Iterative pyspark / scala codebase development
Date Fri, 27 Mar 2015 17:21:30 GMT
Put these lines in your ~/.bash_profile:

export SPARK_PREPEND_CLASSES=true
export SPARK_HOME=path_to_spark
export PYTHONPATH="${SPARK_HOME}/python/lib/py4j-0.8.2.1-src.zip:${SPARK_HOME}/python:${PYTHONPATH}"

$ source ~/.bash_profile
$ build/sbt assembly      # one-time full build
$ build/sbt ~compile      # keep this running; it recompiles on every Scala change
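
With SPARK_PREPEND_CLASSES=true, the launch scripts put each module's freshly
compiled classes ahead of the assembly jar on the classpath, so the continuously
running build/sbt ~compile is all it takes for Scala-side changes to become
visible; the assembly only needs rebuilding when dependencies change. To
sanity-check that the Python side is wired up (a hypothetical one-liner,
assuming the paths above resolve):

$ python -c "import pyspark; print(pyspark.__file__)"

This should print a path under ${SPARK_HOME}/python.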

Then, in another terminal, you can run the Python tests directly:
$ cd python/pyspark/
$ python rdd.py
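
For a quicker end-to-end check than the full rdd.py suite, a minimal smoke
test along these lines also works (a sketch; the file name and its contents
are illustrative, assuming the env vars above are set and ~compile has
finished recompiling your change):

# smoke_test.py -- illustrative sanity check, not part of the Spark tree.
from pyspark import SparkContext

sc = SparkContext("local", "smoke-test")
# Any RDD action here runs through the freshly compiled Scala classes, since
# SPARK_PREPEND_CLASSES puts the module target/ dirs first on the classpath.
print(sc.parallelize(range(10)).map(lambda x: x * 2).sum())  # expect 90
sc.stop()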


cc to dev list


On Fri, Mar 27, 2015 at 10:15 AM, Stephen Boesch <javadba@gmail.com> wrote:
> Which aspect of that page are you suggesting provides a more optimized
> alternative?
>
> 2015-03-27 10:13 GMT-07:00 Davies Liu <davies@databricks.com>:
>
>> see
>> https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools
>>
>> On Fri, Mar 27, 2015 at 10:02 AM, Stephen Boesch <javadba@gmail.com>
>> wrote:
>> > I am iteratively making changes to the Scala side of some new PySpark
>> > code and re-testing from the python/pyspark side.
>> >
>> > Presently my only solution is a complete rebuild,
>> >
>> >       sbt assembly
>> >
>> > after any Scala-side change, no matter how small.
>> >
>> > Is there a better / faster way for PySpark to pick up small Scala-side
>> > updates?
>
>

