spark-dev mailing list archives

From Nicholas Chammas <nicholas.cham...@gmail.com>
Subject Re: Contributing to Spark needs PySpark build/test instructions
Date Tue, 22 Jul 2014 04:20:10 GMT
Looks good. Does sbt/sbt test cover the same tests as dev/run-tests?

I’m looking at step 5 under “Contributing Code”. Someone contributing to
PySpark will want to be directed to run something in addition to (or
instead of) sbt/sbt test, I believe.

Nick
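[Editor's note: a minimal sketch of the test entry points discussed in this thread, assuming the Spark repository layout of mid-2014 (sbt/sbt, python/run-tests, and dev/run-tests at the repo root); the echoed wording is illustrative, not Spark's actual output.]

```shell
#!/bin/sh
# Hedged sketch of the Spark test entry points this thread compares,
# assuming the mid-2014 repo layout:
#
#   sbt/sbt test       - Scala/Java tests only
#   python/run-tests   - PySpark tests only
#   dev/run-tests      - the full pre-merge suite Jenkins runs,
#                        which includes the PySpark tests
#
# Prefer the full suite when run from a Spark checkout; otherwise
# report that no suite is available.
if [ -x dev/run-tests ]; then
  suite="dev/run-tests"
else
  suite="none (not inside a Spark checkout)"
fi
echo "selected test suite: $suite"
```

The point of the thread is that the last entry is the one a PySpark contributor actually needs: sbt/sbt test alone skips the Python tests.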


On Mon, Jul 21, 2014 at 11:43 PM, Reynold Xin <rxin@databricks.com> wrote:

> I added an automated testing section:
>
> https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark#ContributingtoSpark-AutomatedTesting
>
> Can you take a look to see if it is what you had in mind?
>
>
>
> On Mon, Jul 21, 2014 at 3:54 PM, Nicholas Chammas <
> nicholas.chammas@gmail.com> wrote:
>
> > For the record, the triggering discussion is here
> > <https://github.com/apache/spark/pull/1505#issuecomment-49671550>. I
> > assumed that sbt/sbt test covers all the tests required before
> submitting a
> > patch, and it appears that it doesn’t.
> >
> >
> >
> > On Mon, Jul 21, 2014 at 6:42 PM, Nicholas Chammas <
> > nicholas.chammas@gmail.com> wrote:
> >
> > > Contributing to Spark
> > > <
> https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark
> > >
> > > needs a line or two about building and testing PySpark. A call out of
> > > run-tests, for example, would be helpful for new contributors to
> PySpark.
> > >
> > > Nick
> > >
> > >
> >
>
