spark-user mailing list archives

From anbutech <anbutec...@outlook.com>
Subject Spark Write method not ignoring double quotes in the csv file
Date Fri, 12 Jul 2019 03:45:33 GMT
Hello All, could you please help me with the two questions below?

Question 1:

I have tried the options below while writing the final data to a CSV file, to keep double quotes out of the output file, but none of them worked. I'm using Spark version 2.2 and Scala version 2.11.

option("quote", "\"")

.option("escape", ":")

.option("escape", "")

.option("quote", "\u0000")

Code:

// Write the dataset as pipe-delimited CSV with a header row.
finaldataset
  .repartition(numberofpartitions)
  .write
  .mode(SaveMode.Overwrite)
  .option("delimiter", "|")
  .option("header", "true")
  .csv("path")

output_data.csv

field|field2|""|field4|field5|""|field6|""|field7

I want to remove the double quotes from the CSV file while writing it with Spark. Are there any options available for this?
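A minimal sketch of one possible workaround, assuming the empty "" fields come from empty-string columns (Spark's CSV writer quotes empty strings to tell them apart from nulls); the finaldataset and numberofpartitions names are reused from the code above:

import org.apache.spark.sql.SaveMode
import org.apache.spark.sql.functions.{col, when}
import org.apache.spark.sql.types.StringType

// Turn empty strings into null in every string column; the CSV writer
// emits nothing (rather than "") for null values.
val stringCols = finaldataset.schema.fields.collect {
  case f if f.dataType == StringType => f.name
}
val cleaned = stringCols.foldLeft(finaldataset) { (df, c) =>
  df.withColumn(c, when(col(c) === "", null).otherwise(col(c)))
}

cleaned
  .repartition(numberofpartitions)
  .write
  .mode(SaveMode.Overwrite)
  .option("delimiter", "|")
  .option("header", "true")
  .csv("path")

Newer Spark versions (2.4 and later) also provide an emptyValue option on the CSV writer to control what is written for empty strings, but that option is not available in 2.2.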

Question 2: Is there any way to remove the trailing whitespace from fields while reading a Parquet file?
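As far as I know the Parquet reader has no trim option, so a minimal sketch would apply rtrim right after the read; the spark session name and the "path" argument are placeholders:

import org.apache.spark.sql.functions.{col, rtrim}
import org.apache.spark.sql.types.StringType

val raw = spark.read.parquet("path")  // placeholder path

// Apply rtrim to every string column to drop trailing whitespace.
val trimmed = raw.schema.fields.collect {
  case f if f.dataType == StringType => f.name
}.foldLeft(raw) { (df, c) =>
  df.withColumn(c, rtrim(col(c)))
}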

Thanks Anbu



--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/


