spark-user mailing list archives

From Xiao Li <lix...@databricks.com>
Subject Re: Fail to use SparkR of 3.0 preview 2
Date Tue, 07 Jan 2020 18:47:49 GMT
Could we use R version 3.6.1 if we have concerns about the quality of 3.6.2?
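For anyone hitting the error below, one way to confirm the mismatch locally is to compare the R version the installed SparkR binary was built under against the R version actually running (a sketch using base `utils` functions; assumes SparkR is already installed):

```r
# Which R version was SparkR built under? ("Built" field of the DESCRIPTION metadata)
packageDescription("SparkR")$Built

# Which R version is currently running?
getRversion()
```

If the "Built" version is newer than the running version (e.g. built under 3.6.2 but running 3.5.2), reinstalling SparkR built with an older R, or upgrading the local R, should resolve the load failure.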

On Thu, Dec 26, 2019 at 8:14 PM Hyukjin Kwon <gurwls223@gmail.com> wrote:

> I was randomly googling out of curiosity, and it seems that's indeed the
> problem (
> https://r.789695.n4.nabble.com/Error-in-rbind-info-getNamespaceInfo-env-quot-S3methods-quot-td4755490.html
> ).
> Yes, it seems we should make sure we build SparkR with an older R version.
> Since support for R versions prior to 3.4 is deprecated as of Spark
> 3.0.0, we could use either R 3.4 or match Jenkins's version (R 3.1 IIRC) for
> the Spark 3.0 release.
>
> Redirecting to a dev list and Yuming as well for visibility.
>
> On Fri, Dec 27, 2019 at 12:02 PM, Jeff Zhang <zjffdu@gmail.com> wrote:
>
>> Yes, I guess so. But R 3.6.2 was just released this month; I think we
>> should use an older version to build SparkR.
>>
>>> On Fri, Dec 27, 2019 at 10:43 AM, Felix Cheung <felixcheung_m@hotmail.com> wrote:
>>
>>> Maybe it’s the reverse - the package is built to run on the latest R but
>>> is not compatible with slightly older versions (3.5.2 was Dec 2018)
>>>
>>> ------------------------------
>>> *From:* Jeff Zhang <zjffdu@gmail.com>
>>> *Sent:* Thursday, December 26, 2019 5:36:50 PM
>>> *To:* Felix Cheung <felixcheung_m@hotmail.com>
>>> *Cc:* user.spark <user@spark.apache.org>
>>> *Subject:* Re: Fail to use SparkR of 3.0 preview 2
>>>
>>> I use R 3.5.2
>>>
>>> On Fri, Dec 27, 2019 at 4:32 AM, Felix Cheung <felixcheung_m@hotmail.com> wrote:
>>>
>>> It looks like a method signature change in the R base packages.
>>>
>>> Which version of R are you running on?
>>>
>>> ------------------------------
>>> *From:* Jeff Zhang <zjffdu@gmail.com>
>>> *Sent:* Thursday, December 26, 2019 12:46:12 AM
>>> *To:* user.spark <user@spark.apache.org>
>>> *Subject:* Fail to use SparkR of 3.0 preview 2
>>>
>>> I tried SparkR from Spark 3.0 preview 2, but hit the following issue.
>>>
>>> Error in rbind(info, getNamespaceInfo(env, "S3methods")) :
>>>   number of columns of matrices must match (see arg 2)
>>> Error: package or namespace load failed for ‘SparkR’ in rbind(info,
>>> getNamespaceInfo(env, "S3methods")):
>>>  number of columns of matrices must match (see arg 2)
>>> During startup - Warning messages:
>>> 1: package ‘SparkR’ was built under R version 3.6.2
>>> 2: package ‘SparkR’ in options("defaultPackages") was not found
>>>
>>> Does anyone know what might be wrong? Thanks
>>>
>>>
>>>
>>> --
>>> Best Regards
>>>
>>> Jeff Zhang
>>>
>>>
>>
>
