spark-user mailing list archives

From R Nair <ravishankar.n...@gmail.com>
Subject Re: Testing Apache Spark applications
Date Thu, 15 Nov 2018 18:42:50 GMT
Sparklens from Qubole is a good tool for profiling Spark applications. The
other kinds of tests have to be handled by the developer.
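For reference, Sparklens is attached to a job as a Spark listener at submit time. A minimal sketch (the package coordinates, version, and the job class/jar names here are illustrative assumptions; check the Sparklens README for the current coordinates):

```
# Illustrative only: adjust the Sparklens version and Scala suffix
# to your environment; com.example.CustomerJob and the jar are placeholders.
spark-submit \
  --packages qubole:sparklens:0.3.2-s_2.11 \
  --conf spark.extraListeners=com.qubole.sparklens.QuboleJobListener \
  --class com.example.CustomerJob \
  customer-job.jar
```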

Best,
Ravi

On Thu, Nov 15, 2018, 12:45 PM <Omer.Ozsakarya@sony.com> wrote:

> Hi all,
>
>
>
> How are you testing your Spark applications?
>
> We are writing feature files using Cucumber, which tests the behaviour.
> Is this called a functional test or an integration test?
>
>
>
> We are also planning to write unit tests.
>
>
>
> For instance, we have a class like the one below. It has one method. This
> method implements several things: DataFrame operations, saving a DataFrame
> into a database table, and insert, update, and delete statements.
>
>
>
> Our classes generally contain two or three methods. These methods cover a
> lot of tasks in the same function definition (like the function below).
>
> So I am not sure how I can write unit tests for these classes and methods.
>
> Do you have any suggestion?
>
>
>
> class CustomerOperations {
>
>    def doJob(inputDataFrame: DataFrame): Unit = {
>
>            // definitions (value/variable)
>
>            // spark context, session etc. definition
>
>            // filtering, cleansing on inputDataFrame; save the results in a
>            // new DataFrame
>
>            // insert the new DataFrame into a database table
>
>            // several insert/update/delete statements on the database tables
>
>     }
> }
>
>
>
>
>
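One common way to make a method like doJob unit-testable is to separate the pure transformation logic from the SparkSession and database I/O. A sketch under assumed names (Customer, isValid, cleanse, and the cleansing rule itself are all hypothetical; in the real job the same predicate could be applied to a Dataset[Customer] with .filter):

```scala
// Hypothetical refactoring sketch: the transformation logic lives in a plain
// object, so it can be unit tested without a SparkSession or a database.
case class Customer(id: Long, email: String)

object CustomerTransforms {
  // Example cleansing rule (an assumption): keep rows with a non-empty email.
  def isValid(c: Customer): Boolean =
    c.email != null && c.email.trim.nonEmpty

  // Pure function over ordinary collections: trivial to unit test.
  def cleanse(customers: Seq[Customer]): Seq[Customer] =
    customers.filter(isValid).map(c => c.copy(email = c.email.trim))
}

object CustomerTransformsDemo extends App {
  val input = Seq(Customer(1, " a@b.com "), Customer(2, "   "), Customer(3, null))
  println(CustomerTransforms.cleanse(input)) // List(Customer(1,a@b.com))
}
```

With that split, doJob shrinks to wiring: read the input DataFrame, apply the pure transformations, and hand the result to a small repository class for the insert/update/delete statements, which can then be covered separately by an integration test against a test database.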
