spark-user mailing list archives

From Richard Hillegas <rhil...@us.ibm.com>
Subject Re: Spark scala REPL - Unable to create sqlContext
Date Mon, 26 Oct 2015 15:18:40 GMT

Note that embedded Derby supports multiple simultaneous connections, that
is, multiple simultaneous users. But a Derby database is owned by the
process which boots it, and only one process can boot a Derby database at a
given time. The creation of multiple SQL contexts must be spawning multiple
attempts to boot and own the database. If multiple processes need to
access the same Derby database simultaneously, then the database should
be booted by the Derby network server. After that, the processes which want
to access the database simultaneously can use the Derby network client
driver rather than the Derby embedded driver. For more information, see the
Derby Server and Administration Guide:
http://db.apache.org/derby/docs/10.12/adminguide/index.html
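
As an illustration (not part of the original thread), after starting the
network server (e.g. with `java -jar $DERBY_HOME/lib/derbyrun.jar server
start`; the DERBY_HOME path is an assumption about your install), the
metastore connection in hive-site.xml would point at the client driver
instead of the embedded one. Host, port, and database path below are
example values:

```xml
<!-- hive-site.xml sketch: route the metastore through the Derby    -->
<!-- network server. localhost:1527 is Derby's default listen       -->
<!-- address/port; metastore_db is an example database path.        -->
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:derby://localhost:1527/metastore_db;create=true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>org.apache.derby.jdbc.ClientDriver</value>
</property>
```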

Thanks,
Rick Hillegas



Deenar Toraskar <deenar.toraskar@gmail.com> wrote on 10/25/2015 11:29:54
PM:

> From: Deenar Toraskar <deenar.toraskar@gmail.com>
> To: "Ge, Yao (Y.)" <yge@ford.com>
> Cc: Ted Yu <yuzhihong@gmail.com>, user <user@spark.apache.org>
> Date: 10/25/2015 11:30 PM
> Subject: Re: Spark scala REPL - Unable to create sqlContext
>
> Embedded Derby, which Hive/Spark SQL uses as the default metastore,
> supports only a single user at a time. Until this issue is fixed, you
> could use another metastore that supports multiple concurrent users
> (e.g. networked Derby or MySQL) to get around it.
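>
> As an illustration (values below are placeholders, not from this
> thread), switching to a MySQL-backed metastore means replacing the
> embedded-Derby connection properties in hive-site.xml with something
> like:
>
> ```xml
> <!-- hive-site.xml sketch: MySQL-backed Hive metastore. Host,      -->
> <!-- database name, and credentials are example values only.       -->
> <property>
>   <name>javax.jdo.option.ConnectionURL</name>
>   <value>jdbc:mysql://metastore-host:3306/hive_metastore?createDatabaseIfNotExist=true</value>
> </property>
> <property>
>   <name>javax.jdo.option.ConnectionDriverName</name>
>   <value>com.mysql.jdbc.Driver</value>
> </property>
> <property>
>   <name>javax.jdo.option.ConnectionUserName</name>
>   <value>hive</value>
> </property>
> <property>
>   <name>javax.jdo.option.ConnectionPassword</name>
>   <value>hive</value>
> </property>
> ```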
>
> On 25 October 2015 at 16:15, Ge, Yao (Y.) <yge@ford.com> wrote:
> Thanks. I wonder why this is not widely reported in the user forum.
> The REPL shell is basically broken in 1.5.0 and 1.5.1
> -Yao
>
> From: Ted Yu [mailto:yuzhihong@gmail.com]
> Sent: Sunday, October 25, 2015 12:01 PM
> To: Ge, Yao (Y.)
> Cc: user
> Subject: Re: Spark scala REPL - Unable to create sqlContext
>
> Have you taken a look at the fix for SPARK-11000 which is in the
> upcoming 1.6.0 release ?
>
> Cheers
>
> On Sun, Oct 25, 2015 at 8:42 AM, Yao <yge@ford.com> wrote:
> I have not been able to start the Spark scala shell since 1.5, as it
> was not able to create the sqlContext during startup. It complains the
> metastore_db is already locked: "Another instance of Derby may have
> already booted the database". The Derby log is attached.
>
> I only have this problem when starting the shell in yarn-client mode.
> I am working with HDP2.2.6 which runs Hadoop 2.6.
>
> -Yao
>
> derby.log
> <http://apache-spark-user-list.1001560.n3.nabble.com/file/n25195/derby.log>
>
>
>
> --
> View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-scala-REPL-Unable-to-create-sqlContext-tp25195.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> For additional commands, e-mail: user-help@spark.apache.org
>