spark-user mailing list archives

From: <Omer.Ozsaka...@sony.com>
Subject: Testing Apache Spark applications
Date: Thu, 15 Nov 2018 17:44:49 GMT
Hi all,

How are you testing your Spark applications?
We are writing feature files with Cucumber, which test the application's behaviour. Is this called a functional test or an integration test?

We are also planning to write unit tests.

For instance, we have a class like the one below. It has a single method, and that method does several things: DataFrame operations, saving a DataFrame into a database table, and insert/update/delete statements.

Our classes generally contain 2 or 3 methods, and each method covers a lot of tasks within the same function definition (like the function below).
So I am not sure how I can write unit tests for these classes and methods.
Do you have any suggestions?


import org.apache.spark.sql.DataFrame

class CustomerOperations {

  def doJob(inputDataFrame: DataFrame): Unit = {
    // definitions (values/variables)
    // SparkContext, SparkSession etc. definitions

    // filtering and cleansing on inputDataFrame; save the results in a new DataFrame
    // insert the new DataFrame into a database table
    // several insert/update/delete statements on the database tables
  }
}
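
To illustrate what we are aiming for, here is a rough sketch of a unit test for just the filtering/cleansing step, assuming that logic could be extracted from doJob into its own function. The names (CustomerTransformations, keepValidCustomers, the column names) and the ScalaTest dependency are only hypothetical examples, not our actual code:

import org.apache.spark.sql.{DataFrame, SparkSession}
import org.scalatest.funsuite.AnyFunSuite

// Hypothetical example: the filtering/cleansing step extracted from doJob into a pure function
object CustomerTransformations {
  def keepValidCustomers(input: DataFrame): DataFrame =
    input.filter("customerId IS NOT NULL AND age >= 18")
}

class CustomerTransformationsSpec extends AnyFunSuite {

  // a small local SparkSession is enough; no cluster or database is needed
  private val spark: SparkSession = SparkSession.builder()
    .master("local[2]")
    .appName("CustomerTransformationsSpec")
    .getOrCreate()

  import spark.implicits._

  test("keepValidCustomers drops rows with a null id or age under 18") {
    val input = Seq(
      (Some(1L), 25),            // valid row, should be kept
      (Option.empty[Long], 30),  // null customerId, should be dropped
      (Some(2L), 15)             // under 18, should be dropped
    ).toDF("customerId", "age")

    val result = CustomerTransformations.keepValidCustomers(input)

    assert(result.count() === 1L)
    assert(result.select("customerId").as[Long].collect().toList === List(1L))
  }
}

Something like this would cover the transformation logic without touching the database, but we are not sure how to handle the insert/update/delete parts.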

