spark-user mailing list archives

From "Jain, Nishit" <nja...@underarmour.com>
Subject Re: CSV escaping not working
Date Thu, 27 Oct 2016 17:24:53 GMT
Interesting finding: Escaping works if data is quoted but not otherwise.
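To reproduce the difference, a minimal sketch along these lines can be used (the file paths, column names and sample rows are made up for illustration, and spark is an existing SparkSession on 2.0.x; the output comments reflect the behaviour I am seeing, not documented guarantees):

import java.nio.file.{Files, Paths}

// One row where the escaped comma is NOT quoted, one where it is.
Files.write(Paths.get("/tmp/unquoted.csv"), "city,country\nnorth rocks\\,au,AU\n".getBytes)
Files.write(Paths.get("/tmp/quoted.csv"),   "city,country\n\"north rocks\\,au\",AU\n".getBytes)

def readCsv(path: String) =
  spark.read
    .option("header", "true")
    .option("escape", "\\")
    .csv(path)

readCsv("/tmp/unquoted.csv").show(false) // city splits at the comma and keeps the backslash
readCsv("/tmp/quoted.csv").show(false)   // city comes back as a single field, as expected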

From: "Jain, Nishit" <njain1@underarmour.com<mailto:njain1@underarmour.com>>
Date: Thursday, October 27, 2016 at 10:54 AM
To: "user@spark.apache.org<mailto:user@spark.apache.org>" <user@spark.apache.org<mailto:user@spark.apache.org>>
Subject: CSV escaping not working


I am using spark-core version 2.0.1 with Scala 2.11. I have a simple piece of code to read a CSV file
which contains \ escapes.

val myDA = spark.read
  .option("quote", null)   // quote set to null to disable quote handling
  .schema(mySchema)
  .csv(filePath)


As per the documentation, \ is the default escape character for the CSV reader, but it does not
work: Spark reads \ as part of my data. For example, the City column in the CSV file is north rocks\,au .
I expect the city column to be read as northrocks,au, but instead Spark reads it as northrocks\
and moves au to the next column.

I have tried the following, but none of it worked:

  * Explicitly defining the escape: .option("escape", "\\") (see the sketch after this list)
  * Changing the escape character to | or : both in the file and in the code
  * Using the spark-csv library
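For reference, the first attempt written out in full looks roughly like this (mySchema and filePath are the same placeholders as in the snippet above; this just shows how I am wiring the options, not a confirmed fix):

val withExplicitEscape = spark.read
  .option("quote", null)     // quoting disabled, as in the original snippet
  .option("escape", "\\")    // backslash declared explicitly as the escape character
  .schema(mySchema)
  .csv(filePath)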

Is anyone facing the same issue? Am I missing something?

Thanks