spark-user mailing list archives

From Holden Karau <hol...@pigscanfly.ca>
Subject Re: How to avoid use snappy compression when saveAsSequenceFile?
Date Mon, 27 Oct 2014 16:29:35 GMT
Can you post the error message you get when trying to save the sequence
file? If you call first() on the RDD, does it result in the same error?
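
A minimal sketch of that check, assuming a Scala shell where "result" is
the RDD in question (the name and output path here are hypothetical):

    // Force evaluation of a single element; if the snappy failure happens
    // during reads or shuffles, this should reproduce it before any save.
    result.first()

    // Then retry the save that triggers the reported error.
    result.saveAsSequenceFile("hdfs:///tmp/result-seq")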


On Mon, Oct 27, 2014 at 6:13 AM, buring <qyqbird@gmail.com> wrote:

> Hi:
>         After updating Spark to version 1.1.0, I experienced a snappy error,
> which I posted here:
>
> http://apache-spark-user-list.1001560.n3.nabble.com/Update-gcc-version-Still-snappy-error-tt15137.html
>
> I worked around that problem with
> conf.set("spark.io.compression.codec", "org.apache.spark.io.LZ4CompressionCodec").
> I run the ALS and SVD algorithms on a huge 5,500,000 x 5,000 sparse matrix,
> and I want to save some results in a binary format to save space. Then I
> found that saveAsSequenceFile hits the same problem, but I don't know how
> to avoid it this time. There is some problem with my environment that I
> just can't work out. Can anyone give me some idea about this problem, or
> how to avoid using snappy when calling saveAsSequenceFile?
>     Thanks!
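
For the saveAsSequenceFile question quoted above, a minimal sketch of two
possible workarounds, assuming Spark 1.1's Scala API (the RDD name "result"
and the output paths are hypothetical, and "mapred.output.compress" is the
pre-Hadoop-2 property name):

    import org.apache.hadoop.io.compress.GzipCodec

    // Option 1: disable Hadoop output compression so the sequence file
    // is written uncompressed, never touching snappy.
    sc.hadoopConfiguration.set("mapred.output.compress", "false")
    result.saveAsSequenceFile("hdfs:///tmp/result-seq")

    // Option 2: keep compression but name a non-snappy codec explicitly
    // through saveAsSequenceFile's optional codec argument.
    result.saveAsSequenceFile("hdfs:///tmp/result-seq-gz", Some(classOf[GzipCodec]))

Note that spark.io.compression.codec only controls Spark's internal
(shuffle/broadcast) compression, which is why setting it did not change
what saveAsSequenceFile writes; the Hadoop output settings above do.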


-- 
Cell : 425-233-8271
