spark-user mailing list archives

From Aakash Basu <aakash.spark....@gmail.com>
Subject Re: PySpark 2.1 Not instantiating properly
Date Fri, 20 Oct 2017 16:57:15 GMT
Hey Marco/Jagat,

As I mentioned earlier, I've already done those basic checks and
permission changes.

e.g. D:\winutils\bin\winutils.exe chmod 777 D:\tmp\hive, but to no avail; it
still throws the same error. In the first place, I don't understand how the
permissions could have changed on their own, without any manual change.
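
In case it helps, this is roughly how I've been verifying it: listing the
directory through winutils before and after the chmod (paths are from my
setup; as far as I know winutils has an ls subcommand, and adding -R in case
sub-entries matter is a guess on my part):

    D:\winutils\bin\winutils.exe ls D:\tmp\hive
    D:\winutils\bin\winutils.exe chmod -R 777 D:\tmp\hive
    D:\winutils\bin\winutils.exe ls D:\tmp\hive

If the chmod took effect, the second ls should show the directory as
drwxrwxrwx (or similar), rather than the rw-rw-rw- the error reports.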

To Jagat's question - "Do you have the winutils build relevant for your
system?" - how do I check that? I could not find winutils builds that are
specific to an OS or bitness.

Any other solutions? Should I download a fresh Spark zip and redo all the
configuration steps? The chmod simply isn't taking effect (the command above
completes without any errors).
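
If it's of use, the config-override route Marco mentions below would look
roughly like this in PySpark. This is only a sketch: hive.exec.scratchdir is
a standard Hive property, but whether Spark 2.1 honours it when set on the
builder (rather than in hive-site.xml) is an assumption on my part, and
D:/tmp/hive2 is just a made-up fresh directory:

    from pyspark.sql import SparkSession

    # Point Hive's scratch dir at a fresh, writable location instead of the
    # default /tmp/hive (propagation via the builder is assumed, not verified)
    spark = (SparkSession.builder
             .appName("scratch-dir-override")  # hypothetical app name
             .config("hive.exec.scratchdir", "D:/tmp/hive2")  # made-up dir
             .enableHiveSupport()
             .getOrCreate())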


Thanks,
Aakash.

On Fri, Oct 20, 2017 at 9:53 PM, Jagat Singh <jagatsingh@gmail.com> wrote:

> Do you have the winutils build relevant for your system?
>
> This SO post has related information: https://stackoverflow.com/questions/34196302/the-root-scratch-dir-tmp-hive-on-hdfs-should-be-writable-current-permissions
>
>
>
> On 21 October 2017 at 03:16, Marco Mistroni <mmistroni@gmail.com> wrote:
>
>> Did you build Spark or download the zip?
>> I remember having a similar issue... either you have to give write
>> permission to your /tmp directory, or there's a Spark config you need to
>> override. This error is not 2.1-specific... let me get home and check my
>> configs. I think I amended my /tmp permissions via xterm instead of the
>> Control Panel.
>>
>> Hth
>>  Marco
>>
>>
>> On Oct 20, 2017 8:31 AM, "Aakash Basu" <aakash.spark.raj@gmail.com>
>> wrote:
>>
>> Hi all,
>>
>> I have Spark 2.1 installed on my laptop, where I used to run all my
>> programs. I hadn't used PySpark for around a month, and on starting it now
>> I'm getting this exception (I've tried the solutions I could find on
>> Google, but to no avail).
>>
>> Specs: Spark 2.1.1, Python 3.6, Hadoop 2.7, Windows 10 Pro, 64-bit.
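>>
>> For reference, the failure appears as soon as a Hive-enabled session is
>> built, so a minimal repro is just the following (a sketch; the app name is
>> made up, and I'm assuming enableHiveSupport() matches what the pyspark
>> shell enables by default here):
>>
>>     from pyspark.sql import SparkSession
>>
>>     # Any Hive-enabled session trips the /tmp/hive writability check below
>>     spark = (SparkSession.builder
>>              .appName("hive-session-repro")  # hypothetical name
>>              .enableHiveSupport()
>>              .getOrCreate())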
>>
>>
>> py4j.protocol.Py4JJavaError: An error occurred while calling o27.sessionState.
>> : java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
>>         at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)
>>         at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
>>         at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>         at java.lang.reflect.Method.invoke(Method.java:498)
>>         at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
>>         at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
>>         at py4j.Gateway.invoke(Gateway.java:280)
>>         at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
>>         at py4j.commands.CallCommand.execute(CallCommand.java:79)
>>         at py4j.GatewayConnection.run(GatewayConnection.java:214)
>>         at java.lang.Thread.run(Thread.java:748)
>> Caused by: java.lang.reflect.InvocationTargetException
>>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>>         at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
>>         ... 13 more
>> Caused by: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
>>         at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:169)
>>         at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
>>         at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
>>         at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
>>         at scala.Option.getOrElse(Option.scala:121)
>>         at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
>>         at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
>>         at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
>>         at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
>>         ... 18 more
>> Caused by: java.lang.reflect.InvocationTargetException
>>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>>         at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
>>         ... 26 more
>> Caused by: java.lang.reflect.InvocationTargetException
>>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>>         at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
>>         at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:358)
>>         at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:262)
>>         at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:66)
>>         ... 31 more
>> Caused by: java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
>>         at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
>>         at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:188)
>>         ... 39 more
>> Caused by: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
>>         at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:612)
>>         at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
>>         at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
>>         ... 40 more
>>
>>
>> During handling of the above exception, another exception occurred:
>>
>> Traceback (most recent call last):
>>   File "C:\opt\spark\spark-2.1.1-bin-hadoop2.7\bin\..\python\pyspark\shell.py", line 43, in <module>
>>     spark = SparkSession.builder\
>>   File "C:\opt\spark\spark-2.1.1-bin-hadoop2.7\python\pyspark\sql\session.py", line 179, in getOrCreate
>>     session._jsparkSession.sessionState().conf().setConfString(key, value)
>>   File "C:\opt\spark\spark-2.1.1-bin-hadoop2.7\python\lib\py4j-0.10.4-src.zip\py4j\java_gateway.py", line 1133, in __call__
>>   File "C:\opt\spark\spark-2.1.1-bin-hadoop2.7\python\pyspark\sql\utils.py", line 79, in deco
>>     raise IllegalArgumentException(s.split(': ', 1)[1], stackTrace)
>> pyspark.sql.utils.IllegalArgumentException: "Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':"
>>
>>
>>
>> Please help!
>>
>> Thanks,
>> Aakash.
>>
>>
>>
>
