spark-user mailing list archives

From linkpatrickliu <linkpatrick...@live.com>
Subject RE: SparkSQL 1.1 hang when "DROP" or "LOAD"
Date Tue, 16 Sep 2014 06:45:42 GMT
It seems the thriftServer cannot connect to Zookeeper, so it cannot acquire the
lock.

This is how the log looks when I run the statement in SparkSQL:
"load data inpath 'kv1.txt' into table src;"
log:
14/09/16 14:40:47 INFO Driver: <PERFLOG method=acquireReadWriteLocks>
14/09/16 14:40:47 INFO ClientCnxn: Opening socket connection to server
SVR4044HW2285.hadoop.lpt.qa.nt.ctripcorp.com/10.2.4.191:2181. Will not
attempt to authenticate using SASL (unknown error)
14/09/16 14:40:47 INFO ClientCnxn: Socket connection established to
SVR4044HW2285.hadoop.lpt.qa.nt.ctripcorp.com/10.2.4.191:2181, initiating
session
14/09/16 14:40:47 INFO ClientCnxn: Session establishment complete on server
SVR4044HW2285.hadoop.lpt.qa.nt.ctripcorp.com/10.2.4.191:2181, sessionid =
0x347c1b1f78d495e, negotiated timeout = 180000
14/09/16 14:40:47 INFO Driver: </PERFLOG method=acquireReadWriteLocks
start=1410849647447 end=1410849647457 duration=10>

You can see that between the <PERFLOG> and </PERFLOG> entries for
acquireReadWriteLocks, ClientCnxn connects to Zookeeper. Only after the
connection has been successfully established can the acquireReadWriteLocks
phase finish.
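For reference, this lock acquisition is governed by Hive's concurrency settings in hive-site.xml. The property names below are standard Hive settings; the quorum value is a placeholder, not taken from my cluster:

```xml
<!-- Standard Hive concurrency settings (hive-site.xml).
     The quorum value is a placeholder; use your own Zookeeper ensemble. -->
<property>
  <name>hive.support.concurrency</name>
  <value>true</value>
</property>
<property>
  <name>hive.zookeeper.quorum</name>
  <value>zk1:2181,zk2:2181,zk3:2181</value>
</property>
```

With hive.support.concurrency enabled, DDL/DML statements such as DROP and LOAD must obtain a Zookeeper-backed lock before proceeding, which is why the PERFLOG block above waits on the ClientCnxn connection.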

But when I run the same SQL through the ThriftServer, here is the log:
14/09/16 14:40:09 INFO Driver: <PERFLOG method=acquireReadWriteLocks>

It waits here indefinitely.
So I suspect the reason "DROP" or "LOAD" fails in thriftServer mode is that
the thriftServer cannot connect to Zookeeper.
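One way to narrow this down is to check whether the host running the thriftServer can open a TCP connection to the Zookeeper quorum at all. The sketch below is just a basic connectivity probe (not part of Spark or Hive); the host and port are the quorum member seen in the working log above:

```python
# Minimal TCP connectivity probe for the Zookeeper ensemble that Hive's
# lock manager points at. Run this on the thriftServer host.
import socket

def can_reach_zookeeper(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

For example, `can_reach_zookeeper("10.2.4.191", 2181)` should return True from a host where spark-sql works; if it returns False on the thriftServer host, the hang is a plain network/firewall problem rather than a Spark issue.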

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/SparkSQL-1-1-hang-when-DROP-or-LOAD-tp14222p14336.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org

