spark-user mailing list archives

From "Sun, Rui" <rui....@intel.com>
Subject RE: unserialize error in sparkR
Date Mon, 27 Jul 2015 09:07:48 GMT
Hi, 

Do you mean you are running the script with https://github.com/amplab-extras/SparkR-pkg and
Spark 1.2? I am afraid the standalone SparkR-pkg is no longer developed or supported, since
it was integrated into Spark as of Spark 1.4.

Unfortunately, neither the RDD API nor the RDD-like API on DataFrames is exposed in SparkR
in Spark 1.4, for various design considerations. Although not exposed, some RDD-like DataFrame
functions are actually implemented and can be found in the SparkR source code, including
lapply/lapplyPartition/flatMap/foreach/foreachPartition. It is not recommended, but if you
really want to use them, you can access them with the SparkR::: operator as a temporary
workaround, for example as sketched below.
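
To make that concrete, here is a rough, untested sketch of what such a workaround might look
like in a Spark 1.4 SparkR session. The exact signatures and return values of the unexported
functions are not documented, so treat the details (in particular how each row is passed to
FUN and how the result is collected) as assumptions, noted in the comments:

  library(SparkR)

  # Standard Spark 1.4 SparkR setup
  sc <- sparkR.init(master = "local")
  sqlContext <- sparkRSQL.init(sc)

  # A small DataFrame to experiment with (faithful is a built-in R dataset)
  df <- createDataFrame(sqlContext, faithful)

  # Access the unexported RDD-like DataFrame functions with ":::".
  # Assumption: SparkR:::lapply(df, FUN) applies FUN to each row and
  # returns an RDD; the row representation (named list vs. plain list)
  # is not documented, so index by position here.
  doubled <- SparkR:::lapply(df, function(row) { row[[1]] * 2 })

  # Assumption: the internal collect() method accepts the resulting RDD.
  head(SparkR:::collect(doubled))

Keep in mind that anything accessed via ::: is internal and may change or disappear without
notice in later releases.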

There is ongoing investigation and discussion on whether to expose a subset of the RDD API;
you can refer to https://issues.apache.org/jira/browse/SPARK-7264 if you are interested.

-----Original Message-----
From: Jennifer15 [mailto:bsaberid@purdue.edu] 
Sent: Monday, July 27, 2015 1:47 PM
To: user@spark.apache.org
Subject: unserialize error in sparkR

Hi,
I have a newbie question; I get the following error when I increase the number of samples in
my sample script  samplescript.R <http://apache-spark-user-list.1001560.n3.nabble.com/file/n24002/samplescript.R>
, which is written for Spark 1.2 (there is no error for a small number of samples):

Error in unserialize(obj) : 
ReadItem: unknown type 0, perhaps written by later version of R
Calls: assetForecast ... convertJListToRList -> lapply -> lapply -> FUN   ->
unserialize
Execution halted

I tried using Spark 1.4, but I could not find lapply or any similar functions for DataFrames.
I am not sure whether this error is caused by using Spark 1.2; if it is, what is the equivalent
of lapply/map for working on DataFrames?




--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/unserialize-error-in-sparkR-tp24002.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.


---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org

