spark-dev mailing list archives

From zsampson <zsamp...@palantir.com>
Subject When to expect UTF8String?
Date Fri, 12 Jun 2015 03:08:07 GMT
I'm hoping for some clarity about when to expect String vs UTF8String when
using the Java DataFrames API.

In upgrading to Spark 1.4, I'm hitting a lot of errors where what was
once a String is now a UTF8String. The comments in the file and the related
commit message suggest that UTF8String is meant to be internal to Spark SQL's
implementation.

However, when I add a column containing a custom subclass of Expression, the
row passed to the eval method contains instances of UTF8String. Ditto for
AggregateFunction.update. Is this expected? If so, how can I know in general
where I need to handle UTF8String objects?
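For what it's worth, the workaround I've been using is a defensive conversion at the top of eval. This is only a sketch under my assumptions: the helper name asJavaString is mine, and I'm relying on UTF8String.toString() decoding the bytes back to a java.lang.String (which it appears to do in 1.4). Spark itself is stubbed out here so the snippet stands alone:

```java
// Sketch of a defensive conversion for values Catalyst hands to
// Expression.eval / AggregateFunction.update, which may be either
// java.lang.String or Spark's UTF8String depending on the code path.
// UTF8String is stubbed with a plain Object so this compiles standalone;
// asJavaString is a hypothetical helper, not a Spark API.
public class Utf8StringDemo {
    // Return a java.lang.String regardless of which representation we got.
    // For a real UTF8String, toString() decodes its bytes to a String.
    static String asJavaString(Object value) {
        if (value == null) return null;
        return (value instanceof String) ? (String) value : value.toString();
    }

    public static void main(String[] args) {
        System.out.println(asJavaString("plain"));
        // Stand-in for a UTF8String-like value:
        Object utf8Like = new Object() {
            @Override public String toString() { return "decoded"; }
        };
        System.out.println(asJavaString(utf8Like));
    }
}
```

That at least keeps one code path working whether or not the conversion to UTF8String has happened upstream, but it feels like I'm papering over something that should have a documented contract.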



--
View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/When-to-expect-UTF8String-tp12710.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org

