I am using Spark to ingest data from a file into an Oracle database table. For one of the columns, the value to be populated is generated by a function defined in the database, and the input to that function is one of the fields of the DataFrame.
I wanted to use Spark's JDBC writer (df.write.jdbc) to perform the operation, which generates the INSERT statement behind the scenes.
For example, I would want it to generate an INSERT like:

INSERT INTO table VALUES (?, ?, getfunctionvalue(?))
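To make the target statement concrete, here is a minimal sketch of how such an INSERT string could be built for use in a custom writer (e.g. inside a per-partition JDBC loop), since Spark's built-in JDBC writer only emits plain positional placeholders. The table name, column names, and the function name getfunctionvalue below are placeholders, not my real schema:

```python
# Spark's JDBC writer generates "INSERT INTO t VALUES (?, ?, ?)" with no way
# to wrap a placeholder in a DB function call, so one workaround is to build
# the statement manually and execute it over a plain JDBC/Oracle connection
# inside df.foreachPartition. Names here are hypothetical.

def build_insert_sql(table, columns, computed_col, db_function):
    """Build an INSERT where one column's placeholder is wrapped
    in a database function call, e.g. getfunctionvalue(?)."""
    placeholders = [
        f"{db_function}(?)" if col == computed_col else "?"
        for col in columns
    ]
    return (
        f"INSERT INTO {table} ({', '.join(columns)}) "
        f"VALUES ({', '.join(placeholders)})"
    )

sql = build_insert_sql(
    "my_table",                  # placeholder table name
    ["col1", "col2", "col3"],    # placeholder columns; col3 is DB-computed
    "col3",
    "getfunctionvalue",
)
print(sql)
# INSERT INTO my_table (col1, col2, col3) VALUES (?, ?, getfunctionvalue(?))
```

The idea would then be to open a connection per partition and bind each row's values to this statement, letting Oracle evaluate the function at insert time.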
Please advise whether this is possible in Spark and, if so, how it can be done. This is somewhat urgent for me, so any help is appreciated.