spark-user mailing list archives

From jeff saremi <jeffsar...@hotmail.com>
Subject Continue reading dataframe from file despite errors
Date Tue, 12 Sep 2017 21:32:03 GMT
I'm using a statement like the following to load my dataframe from a text file:

spark.read.schema(SomeSchema).option("sep", "\t").format("csv").load("somepath")

Upon encountering the first malformed row, the whole thing throws an exception and processing stops.

I'd like loading to continue even if that results in zero rows in my dataframe. How can I do that?

Thanks
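For context, a minimal sketch of one way this is commonly handled, assuming Spark's DataFrameReader "mode" option for CSV sources (values PERMISSIVE, DROPMALFORMED, FAILFAST) and reusing the SomeSchema and path from the statement above:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("csv-load").getOrCreate()

// "DROPMALFORMED" silently skips rows that do not match the schema,
// so a file full of bad rows yields an empty dataframe instead of an
// exception. "PERMISSIVE" (the default) keeps malformed rows and nulls
// the fields it cannot parse; "FAILFAST" throws on the first bad row.
val df = spark.read
  .schema(SomeSchema)               // schema from the original statement
  .option("sep", "\t")
  .option("mode", "DROPMALFORMED")
  .format("csv")
  .load("somepath")                 // path from the original statement
```

With DROPMALFORMED, counting the rows of df afterwards shows how many records survived the load.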


