I just flailed on this for a bit before finding this email. Can someone please update https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-IDESetup

On Mon, Apr 4, 2016 at 10:01 PM, Reynold Xin <rxin@databricks.com> wrote:
pyspark and R

On Mon, Apr 4, 2016 at 9:59 PM, Marcelo Vanzin <vanzin@cloudera.com> wrote:
No, tests (except pyspark) should work without having to package anything first.

On Mon, Apr 4, 2016 at 9:58 PM, Koert Kuipers <koert@tresata.com> wrote:
> Do I need to run "sbt package" before running tests?
>
> On Mon, Apr 4, 2016 at 11:00 PM, Marcelo Vanzin <vanzin@cloudera.com> wrote:
>>
>> Hey all,
>>
>> We merged SPARK-13579 today, and if you're like me and your
>> hands automatically type "sbt assembly" whenever you build Spark,
>> that won't work anymore.
>>
>> You should now use "sbt package"; you'll still need "sbt assembly" if
>> you require one of the remaining assemblies (streaming connectors,
>> yarn shuffle service).
>>
>>
>> --
>> Marcelo
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>> For additional commands, e-mail: dev-help@spark.apache.org
>>
>
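For reference, the build commands described above would look roughly like this. This is a sketch based on the thread, not official documentation; the `build/sbt` launcher path and the `network-yarn` project name are assumptions:

```shell
# After SPARK-13579, "sbt assembly" no longer works for a regular build;
# use package instead (build/sbt is the launcher script in the Spark repo):
./build/sbt package

# Tests run without packaging first; per the thread, pyspark and R
# are the exceptions and need a build beforehand:
./build/sbt "core/test"

# Assemblies still exist for the streaming connectors and the YARN
# shuffle service; the exact project name here is an assumption:
./build/sbt network-yarn/assembly
```

These commands assume they are run from the root of a Spark source checkout.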
--
Marcelo

--
Michael Gummelt
Software Engineer
Mesosphere