spark-user mailing list archives

From SNEHASISH DUTTA <info.snehas...@gmail.com>
Subject Re: Serialize a DataFrame with Vector values into text/csv file
Date Tue, 20 Feb 2018 22:08:12 GMT
 Hi Mina,
This might work then

df.coalesce(1).write.option("header","true").mode("overwrite").text("output")

Regards,
Snehasish

On Wed, Feb 21, 2018 at 3:21 AM, Mina Aslani <aslanimina@gmail.com> wrote:

> Hi Snehasish,
>
> Using df.coalesce(1).write.option("header","true").mode("overwrite").csv("output")
> throws
>
> java.lang.UnsupportedOperationException: CSV data source does not support
> struct<...> data type.
>
>
> Regards,
> Mina
>
>
>
>
> On Tue, Feb 20, 2018 at 4:36 PM, SNEHASISH DUTTA <info.snehasish@gmail.com
> > wrote:
>
>> Hi Mina,
>> This might help
>> df.coalesce(1).write.option("header","true").mode("overwrite").csv("output")
>>
>> Regards,
>> Snehasish
>>
>> On Wed, Feb 21, 2018 at 1:53 AM, Mina Aslani <aslanimina@gmail.com>
>> wrote:
>>
>>> Hi,
>>>
>>> I would like to serialize a dataframe with Vector values into a text/csv
>>> file in pyspark.
>>>
>>> Using the line below, I can write the dataframe (e.g. df) as parquet;
>>> however, I cannot open it in Excel or as text.
>>> df.coalesce(1).write.option("header","true").mode("overwrite").save("output")
>>>
>>> Best regards,
>>> Mina
>>>
>>>
>>
>
