spark-user mailing list archives

From "Dr Mich Talebzadeh" <>
Subject When CHAR will be available in Spark
Date Mon, 01 Feb 2016 09:42:29 GMT


I am using Spark on Hive. Some tables have CHAR type columns. It is my
understanding that Spark converts VARCHAR columns to String internally;
however, the Spark version 1.5.2 that I have throws an error when the
underlying Hive table has CHAR fields.

I wanted to know when CHAR will be supported in Spark.
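Until CHAR is supported natively, one possible workaround (a sketch only; the table and column names below are made up for illustration) is to expose the Hive table through a view that casts the CHAR columns to STRING, which Spark already understands:

```sql
-- Hypothetical base table with a CHAR column (illustrative names):
--   CREATE TABLE customers (cust_id INT, country_code CHAR(2));

-- View that casts CHAR to STRING so Spark can read it
CREATE VIEW customers_str AS
SELECT cust_id,
       CAST(country_code AS STRING) AS country_code
FROM   customers;
```

Queries issued from Spark can then target customers_str instead of the base table.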

Also, Spark does not seem to understand temporary tables. For example, the
following throws an error:

         > SELECT t.calendar_month_desc, c.channel_desc,
         >        SUM(s.amount_sold) AS TotalSales
         > FROM sales s, times t, channels c
         > WHERE s.time_id = t.time_id
         > AND   s.channel_id = c.channel_id
         > GROUP BY t.calendar_month_desc, c.channel_desc
         > ;
Error in query: Unhandled clauses: TEMPORARY 1, 2,2, 7
You are likely trying to use an unsupported Hive feature.
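As a sketch only (not verified on 1.5.2), rewriting the implicit comma joins as explicit ANSI JOINs sometimes sidesteps parser limitations in Spark SQL; the semantics of the query are unchanged:

```sql
-- Same query as above with explicit JOIN syntax
SELECT t.calendar_month_desc,
       c.channel_desc,
       SUM(s.amount_sold) AS TotalSales
FROM   sales s
JOIN   times t    ON s.time_id = t.time_id
JOIN   channels c ON s.channel_id = c.channel_id
GROUP BY t.calendar_month_desc, c.channel_desc;
```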


Dr Mich Talebzadeh


Sybase ASE 15 Gold Medal Award 2008
A Winning Strategy: Running the most Critical Financial Data on ASE 15
Author of the books "A Practitioner’s Guide to Upgrading to Sybase ASE
15", ISBN 978-0-9563693-0-7.
Co-author of "Sybase Transact SQL Guidelines Best Practices", ISBN
Publications due shortly:
Complex Event Processing in Heterogeneous Environments, ISBN:
Oracle and Sybase, Concepts and Contrasts, ISBN: 978-0-9563693-1-4, volume
one out shortly

NOTE: The information in this email is proprietary and confidential. This
message is for the designated recipient only; if you are not the intended
recipient, you should destroy it immediately. Any information in this
message shall not be understood as given or endorsed by
Cloudtechnologypartners Ltd, its subsidiaries or their employees, unless
expressly so stated. It is the responsibility of the recipient to ensure
that this email is virus free; therefore neither Cloudtechnologypartners
Ltd, its subsidiaries nor their employees accept any responsibility.

