spark-user mailing list archives

From "Sun, Rui" <rui....@intel.com>
Subject RE: [SparkR] Float type coercion with hiveContext
Date Wed, 08 Jul 2015 13:04:28 GMT
Hi, Evgeny,

I reported a JIRA issue for your problem: https://issues.apache.org/jira/browse/SPARK-8897.
You can track it to see how it will be solved.
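
In the meantime, a possible workaround (just a sketch, assuming the failure is
limited to collecting FLOAT columns into R) is to cast those columns to DOUBLE
in the query itself, so the collected values come back as plain R numerics.
The table and column names below are taken from your example:

  # Cast the FLOAT columns to DOUBLE before pulling rows into R with head()
  result <- sql(hiveContext,
                "SELECT CAST(offset AS DOUBLE) AS offset,
                        CAST(percentage AS DOUBLE) AS percentage
                 FROM data LIMIT 100")
  head(result)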

Ray

-----Original Message-----
From: Evgeny Sinelnikov [mailto:esinelnikov@griddynamics.com] 
Sent: Monday, July 6, 2015 7:27 PM
To: huangzheng
Cc: Apache Spark User List
Subject: Re: [SparkR] Float type coercion with hiveContext

I used the Spark 1.4.0 binaries from the official site:
http://spark.apache.org/downloads.html

And I am running it on:
* Hortonworks HDP 2.2.0.0-2041
* with Hive 0.14
* with the Application Timeline Server hooks (ATSHook) disabled in hive-site.xml
  (hive.exec.failure.hooks, hive.exec.post.hooks, and hive.exec.pre.hooks commented out)


On Mon, Jul 6, 2015 at 1:33 PM, huangzheng <1106944911@qq.com> wrote:
>
> Hi, are you using SparkR with Spark 1.4? How do you build it from the
> Spark source code?
>
> ------------------ Original Message ------------------
> From: "Evgeny Sinelnikov" <esinelnikov@griddynamics.com>
> Sent: Monday, July 6, 2015, 6:31 PM
> To: "user" <user@spark.apache.org>
> Subject: [SparkR] Float type coercion with hiveContext
>
> Hello,
>
> I've run into a problem with float type coercion in SparkR with hiveContext.
>
>> result <- sql(hiveContext, "SELECT offset, percentage from data limit 100")
>
>> show(result)
> DataFrame[offset:float, percentage:float]
>
>> head(result)
> Error in as.data.frame.default(x[[i]], optional = TRUE) :
>     cannot coerce class ""jobj"" to a data.frame
>
>
> This problem looks related to an existing issue (SPARK-2863 - Emulate Hive
> type coercion in native reimplementations of Hive functions), and the cause
> seems to be the same: the native reimplementation of Hive behaviour is still
> incomplete, and not only for the functions.
>
> It looks like a bug.
> So, has anybody met this issue before?
>



---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org
