spark-user mailing list archives

From Sean Owen <so...@cloudera.com>
Subject Re: Unable to load realm info from SCDynamicStore
Date Sun, 02 Mar 2014 11:37:10 GMT
This is completely normal for Hadoop. Unless you specifically install
native libraries such as snappy, you will see this warning, but it is harmless.
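(The warnings are harmless, but if you want to silence them: the SCDynamicStore
message is a known macOS/JVM Kerberos quirk, and a commonly cited workaround is to
pass empty krb5 realm/KDC properties to the JVM. The snippet below is a sketch, not
from this thread; it assumes your Spark version honors SPARK_JAVA_OPTS, as 0.9 did.)

```shell
# Hypothetical workaround, not from the original thread:
# give the JVM empty Kerberos realm/KDC properties so it skips the
# SCDynamicStore lookup that produces the "Unable to load realm info" line.
export SPARK_JAVA_OPTS="-Djava.security.krb5.realm= -Djava.security.krb5.kdc= $SPARK_JAVA_OPTS"
echo "SPARK_JAVA_OPTS=$SPARK_JAVA_OPTS"
```

The NativeCodeLoader warning can likewise be muted by raising that logger's level
(e.g. `log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR` in
conf/log4j.properties), though again, neither message affects correctness.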
--
Sean Owen | Director, Data Science | London


On Sun, Mar 2, 2014 at 8:40 AM, xiiik <xiiik@qq.com> wrote:
> hi all,
>
> i have built spark-0.9.0-incubating-bin-hadoop2.tgz on my MacBook, and pyspark works well, but i got the message below.
> (i don’t have Hadoop installed on my MacBook)
>
>
> …...
> 14/03/02 15:31:59 INFO HttpServer: Starting HTTP Server
> 14/03/02 15:31:59 INFO SparkUI: Started Spark Web UI at http://192.168.1.106:4040
> 2014-03-02 15:32:00.151 java[4814:e903] Unable to load realm info from SCDynamicStore
> 14/03/02 15:32:00 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>
>
> how can i fix this?
>
> thanks
