spark-issues mailing list archives

From "Hyukjin Kwon (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-24475) Nested JSON count() Exception
Date Thu, 07 Jun 2018 04:47:00 GMT

    [ https://issues.apache.org/jira/browse/SPARK-24475?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16504235#comment-16504235 ]

Hyukjin Kwon commented on SPARK-24475:
--------------------------------------

I don't think Spark supports Java 9 and 10 yet. Let's leave this as a duplicate of SPARK-24417.
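
A quick way to confirm which JVM PySpark is actually running on (a minimal sketch; {{_jvm}} is a py4j-internal handle, so treat this as a debugging aid rather than a stable API):

{code:python}
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("CheckJvm").getOrCreate()

# Ask the JVM backing this session for its version string. Spark 2.3.x
# targets Java 8 (1.8.x); on Java 9/10 the ASM5-based ClosureCleaner
# cannot read the newer class files, which matches the
# IllegalArgumentException in the quoted stack trace.
print(spark.sparkContext._jvm.java.lang.System.getProperty("java.version"))
{code}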

> Nested JSON count() Exception
> -----------------------------
>
>                 Key: SPARK-24475
>                 URL: https://issues.apache.org/jira/browse/SPARK-24475
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 2.3.0
>            Reporter: Joseph Toth
>            Priority: Major
>
> I have a nested-structure JSON file with only 2 rows.
>  
> {{spark = SparkSession.builder.appName("JSONRead").getOrCreate()}}
> {{jsonData = spark.read.json(file)}}
> {{jsonData.count()}} will crash with the following exception; {{jsonData.head(10)}} works.
>  
> Traceback (most recent call last):
>  File "/usr/lib/python3/dist-packages/IPython/core/interactiveshell.py", line 2882, in run_code
>  exec(code_obj, self.user_global_ns, self.user_ns)
>  File "<ipython-input-46-ef7220990d92>", line 1, in <module>
>  jsonData.count()
>  File "/usr/local/lib/python3.6/dist-packages/pyspark/sql/dataframe.py", line 455, in count
>  return int(self._jdf.count())
>  File "/usr/local/lib/python3.6/dist-packages/py4j/java_gateway.py", line 1160, in __call__
>  answer, self.gateway_client, self.target_id, self.name)
>  File "/usr/local/lib/python3.6/dist-packages/pyspark/sql/utils.py", line 63, in deco
>  return f(*a, **kw)
>  File "/usr/local/lib/python3.6/dist-packages/py4j/protocol.py", line 320, in get_return_value
>  format(target_id, ".", name), value)
> py4j.protocol.Py4JJavaError: An error occurred while calling o411.count.
> : java.lang.IllegalArgumentException
>  at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
>  at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
>  at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
>  at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:46)
>  at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:449)
>  at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:432)
>  at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
>  at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:103)
>  at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:103)
>  at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
>  at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
>  at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:103)
>  at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
>  at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:432)
>  at org.apache.xbean.asm5.ClassReader.a(Unknown Source)
>  at org.apache.xbean.asm5.ClassReader.b(Unknown Source)
>  at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
>  at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
>  at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:262)
>  at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:261)
>  at scala.collection.immutable.List.foreach(List.scala:381)
>  at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:261)
>  at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:159)
>  at org.apache.spark.SparkContext.clean(SparkContext.scala:2292)
>  at org.apache.spark.SparkContext.runJob(SparkContext.scala:2066)
>  at org.apache.spark.SparkContext.runJob(SparkContext.scala:2092)
>  at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:939)
>  at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>  at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
>  at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
>  at org.apache.spark.rdd.RDD.collect(RDD.scala:938)
>  at org.apache.spark.sql.execution.SparkPlan.executeCollect(SparkPlan.scala:297)
>  at org.apache.spark.sql.Dataset$$anonfun$count$1.apply(Dataset.scala:2770)
>  at org.apache.spark.sql.Dataset$$anonfun$count$1.apply(Dataset.scala:2769)
>  at org.apache.spark.sql.Dataset$$anonfun$52.apply(Dataset.scala:3253)
>  at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:77)
>  at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3252)
>  at org.apache.spark.sql.Dataset.count(Dataset.scala:2769)
>  at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>  at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>  at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>  at java.base/java.lang.reflect.Method.invoke(Method.java:564)
>  at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
>  at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
>  at py4j.Gateway.invoke(Gateway.java:282)
>  at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
>  at py4j.commands.CallCommand.execute(CallCommand.java:79)
>  at py4j.GatewayConnection.run(GatewayConnection.java:214)
>  at java.base/java.lang.Thread.run(Thread.java:844)
>  
> {"insertId":"1x3kn8pg3lweeql","jsonPayload":\{"channel":"ORDER-BITF--NEO--USD","ordertype":"Sell","price":30.007999999999999,"quantity":5.2375409399999997,"timestamp":"2017-10-18
03:59:59","total":"157.16812853"},"logName":"projects/m/logs/coinigy-dev","receiveTimestamp":"2017-10-18T03:59:59.911829261Z","resource":\{"labels":{"project_id":"m"},"type":"global"},"timestamp":"2017-10-18T03:59:59.911829261Z"}
> {"insertId":"2shvsbg3lt5jpc","jsonPayload":\{"channel":"ORDER-BITF--NEO--USD","ordertype":"Sell","price":30,"quantity":353.83487022999998,"timestamp":"2017-10-18
03:59:59","total":"10615.04610690"},"logName":"projects/m/logs/coinigy-dev","receiveTimestamp":"2017-10-18T03:59:59.994692698Z","resource":\{"labels":{"project_id":"m"},"type":"global"},"timestamp":"2017-10-18T03:59:59.994692698Z"}



