spark-user mailing list archives

From Ted Yu <yuzhih...@gmail.com>
Subject Re: spark-shell can't import the default hive-site.xml options probably.
Date Sun, 01 Feb 2015 17:36:23 GMT
Looking at common/src/java/org/apache/hadoop/hive/conf/HiveConf.java :


    METASTORE_CLIENT_CONNECT_RETRY_DELAY("hive.metastore.client.connect.retry.delay",
        "1s",
        new TimeValidator(TimeUnit.SECONDS),
        "Number of seconds for the client to wait between consecutive connection attempts"),

It seems having the 's' suffix is legitimate.
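To illustrate what a unit-suffixed time value involves, here is a minimal standalone sketch (not Hive's actual implementation; the class and method names are made up for this example) of parsing a duration string such as "5s" or "1800" into seconds, treating a bare number as seconds to match the quoted default:

```java
// DurationParse.java -- illustrative sketch only, not Hive's actual code.
// Shows one way a config value like "5s" could be normalized to seconds,
// which is roughly what a TimeValidator-style check has to do.
public class DurationParse {

    // Parse a duration such as "5s", "30m", "2h", or a bare "1800" into seconds.
    // A bare number is treated as seconds.
    static long toSeconds(String value) {
        String v = value.trim().toLowerCase();
        long multiplier = 1; // seconds by default
        if (v.endsWith("s")) {
            v = v.substring(0, v.length() - 1);
        } else if (v.endsWith("m")) {
            multiplier = 60;
            v = v.substring(0, v.length() - 1);
        } else if (v.endsWith("h")) {
            multiplier = 3600;
            v = v.substring(0, v.length() - 1);
        }
        return Long.parseLong(v.trim()) * multiplier;
    }

    public static void main(String[] args) {
        System.out.println(toSeconds("5s"));    // prints 5
        System.out.println(toSeconds("1800"));  // prints 1800
        System.out.println(toSeconds("30m"));   // prints 1800
    }
}
```

A reader of the suffixed value gets the unit for free; a reader that calls Long.parseLong directly on "5s" throws NumberFormatException, which is the mismatch described below.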

On Sun, Feb 1, 2015 at 9:14 AM, Denny Lee <denny.g.lee@gmail.com> wrote:

> I may be missing something here, but typically the hive-site.xml
> configurations do not require you to place "s" within the configuration
> itself.  Both the retry.delay and socket.timeout values are in seconds, so
> you should only need to supply the integer value.
>
>
> On Sun Feb 01 2015 at 2:28:09 AM guxiaobo1982 <guxiaobo1982@qq.com> wrote:
>
>> Hi,
>>
>> In order to let a local spark-shell connect to a remote Spark
>> standalone cluster and access the Hive tables there, I must put the
>> hive-site.xml file into the local Spark installation's conf path, but
>> spark-shell can't even import the default settings there. I found two
>> errors:
>>
>> <property>
>>
>>       <name>hive.metastore.client.connect.retry.delay</name>
>>
>>       <value>5s</value>
>>
>>     </property>
>>
>>     <property>
>>
>>       <name>hive.metastore.client.socket.timeout</name>
>>
>>       <value>1800s</value>
>>
>>     </property>
>> Spark-shell tries to read 5s and 1800s as integers; they must be changed
>> to 5 and 1800 for spark-shell to work. I suggest this be fixed in future
>> versions.
>>
>
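For reference, the workaround described in the quoted message amounts to stripping the unit suffix from the two properties in hive-site.xml (the values 5 and 1800 come from the original message and are interpreted as seconds):

```
<property>
  <name>hive.metastore.client.connect.retry.delay</name>
  <value>5</value>
</property>
<property>
  <name>hive.metastore.client.socket.timeout</name>
  <value>1800</value>
</property>
```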
