spark-user mailing list archives

From "m3.sharma" <sharm...@umn.edu>
Subject Re: spark github source build error
Date Wed, 23 Jul 2014 20:54:57 GMT
Thanks, it works now :)
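For anyone who lands here with the same assertion error: the fix was simply to clear sbt's stale build output and rerun the build. A minimal sketch of the sequence (run from the Spark source root; the Hadoop/YARN flags are copied from the original command and may differ for your setup):

```shell
# Run from the Spark source root after pulling new commits.
# 1. Remove stale classfiles from the previous build; sbt's incremental
#    compiler can trip over them with assertion errors like the one below.
./sbt/sbt clean
# 2. Rebuild the assembly with the same Hadoop/YARN settings as before.
./sbt/sbt -Dhadoop.version=2.4.0 -Pyarn assembly
```

A full clean rebuild is slower than an incremental one, so this is worth trying only after an incremental build fails in the compiler itself rather than in your code.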


On Wed, Jul 23, 2014 at 11:45 AM, Xiangrui Meng [via Apache Spark User
List] <ml-node+s1001560n10537h15@n3.nabble.com> wrote:

> try `sbt/sbt clean` first? -Xiangrui
>
> On Wed, Jul 23, 2014 at 11:23 AM, m3.sharma <[hidden email]> wrote:
>
> > I am trying to build Spark after cloning it from the GitHub repo.
> >
> > I am executing:
> > ./sbt/sbt -Dhadoop.version=2.4.0 -Pyarn assembly
> >
> > I am getting the following error:
> > [warn]                                 ^
> > [error]
> > [error]      while compiling: /home/m3.sharma/installSrc/spark/spark/sql/core/src/main/scala/org/apache/spark/sql/test/TestSQLContext.scala
> > [error]         during phase: jvm
> > [error]      library version: version 2.10.4
> > [error]     compiler version: version 2.10.4
> > [error]   reconstructed args: -classpath /home/ .........
> > ....
> > [error]
> > [error]   last tree to typer: Literal(Constant(org.apache.spark.sql.catalyst.types.PrimitiveType))
> > [error]               symbol: null
> > [error]    symbol definition: null
> > [error]                  tpe: Class(classOf[org.apache.spark.sql.catalyst.types.PrimitiveType])
> > [error]        symbol owners:
> > [error]       context owners: object TestSQLContext -> package test
> > [error]
> > [error] == Enclosing template or block ==
> > [error]
> > [error] Template( // val <local TestSQLContext>: <notype> in object TestSQLContext, tree.tpe=org.apache.spark.sql.test.TestSQLContext.type
> > [error]   "org.apache.spark.sql.SQLContext" // parents
> > [error]   ValDef(
> > [error]     private
> > [error]     "_"
> > [error]     <tpt>
> > [error]     <empty>
> > [error]   )
> > [error]   // 2 statements
> > [error]   DefDef( // private def readResolve(): Object in object TestSQLContext
> > [error]     <method> private <synthetic>
> > [error]     "readResolve"
> > [error]     []
> > [error]     List(Nil)
> > [error]     <tpt> // tree.tpe=Object
> > [error]     test.this."TestSQLContext" // object TestSQLContext in package test, tree.tpe=org.apache.spark.sql.test.TestSQLContext.type
> > [error]   )
> > [error]   DefDef( // def <init>(): org.apache.spark.sql.test.TestSQLContext.type in object TestSQLContext
> > [error]     <method>
> > [error]     "<init>"
> > [error]     []
> > [error]     List(Nil)
> > [error]     <tpt> // tree.tpe=org.apache.spark.sql.test.TestSQLContext.type
> > [error]     Block( // tree.tpe=Unit
> > [error]       Apply( // def <init>(sparkContext: org.apache.spark.SparkContext): org.apache.spark.sql.SQLContext in class SQLContext, tree.tpe=org.apache.spark.sql.SQLContext
> > [error]         TestSQLContext.super."<init>" // def <init>(sparkContext: org.apache.spark.SparkContext): org.apache.spark.sql.SQLContext in class SQLContext, tree.tpe=(sparkContext: org.apache.spark.SparkContext)org.apache.spark.sql.SQLContext
> > [error]         Apply( // def <init>(master: String,appName: String,conf: org.apache.spark.SparkConf): org.apache.spark.SparkContext in class SparkContext, tree.tpe=org.apache.spark.SparkContext
> > [error]           new org.apache.spark.SparkContext."<init>" // def <init>(master: String,appName: String,conf: org.apache.spark.SparkConf): org.apache.spark.SparkContext in class SparkContext, tree.tpe=(master: String, appName: String, conf: org.apache.spark.SparkConf)org.apache.spark.SparkContext
> > [error]           // 3 arguments
> > [error]           "local"
> > [error]           "TestSQLContext"
> > [error]           Apply( // def <init>(): org.apache.spark.SparkConf in class SparkConf, tree.tpe=org.apache.spark.SparkConf
> > [error]             new org.apache.spark.SparkConf."<init>" // def <init>(): org.apache.spark.SparkConf in class SparkConf, tree.tpe=()org.apache.spark.SparkConf
> > [error]             Nil
> > [error]           )
> > [error]         )
> > [error]       )
> > [error]       ()
> > [error]     )
> > [error]   )
> > [error] )
> > [error]
> > [error] == Expanded type of tree ==
> > [error]
> > [error] ConstantType(
> > [error]   value = Constant(org.apache.spark.sql.catalyst.types.PrimitiveType)
> > [error] )
> > [error]
> > [error] uncaught exception during compilation: java.lang.AssertionError
> > java.lang.AssertionError: assertion failed: List(object package$DebugNode, object package$DebugNode)
> >         at scala.reflect.internal.Symbols$Symbol.suchThat(Symbols.scala:1678)
> >         at scala.reflect.internal.Symbols$ClassSymbol.companionModule0(Symbols.scala:2988)
> >         at scala.reflect.internal.Symbols$ClassSymbol.companionModule(Symbols.scala:2991)
> >         at scala.tools.nsc.backend.jvm.GenASM$JPlainBuilder.genClass(GenASM.scala:1371)
> >         at scala.tools.nsc.backend.jvm.GenASM$AsmPhase.run(GenASM.scala:120)
> >         at scala.tools.nsc.Global$Run.compileUnitsInternal(Global.scala:1583)
> >         at scala.tools.nsc.Global$Run.compileUnits(Global.scala:1557)
> >         at scala.tools.nsc.Global$Run.compileSources(Global.scala:1553)
> >         at scala.tools.nsc.Global$Run.compile(Global.scala:1662)
> >         at xsbt.CachedCompiler0.run(CompilerInterface.scala:123)
> >         at xsbt.CachedCompiler0.run(CompilerInterface.scala:99)
> >         at xsbt.CompilerInterface.run(CompilerInterface.scala:27)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >         at java.lang.reflect.Method.invoke(Method.java:606)
> >         at sbt.compiler.AnalyzingCompiler.call(AnalyzingCompiler.scala:102)
> >         at sbt.compiler.AnalyzingCompiler.compile(AnalyzingCompiler.scala:48)
> >         at sbt.compiler.AnalyzingCompiler.compile(AnalyzingCompiler.scala:41)
> >         at sbt.compiler.AggressiveCompile$$anonfun$3$$anonfun$compileScala$1$1.apply$mcV$sp(AggressiveCompile.scala:99)
> >         at sbt.compiler.AggressiveCompile$$anonfun$3$$anonfun$compileScala$1$1.apply(AggressiveCompile.scala:99)
> >         at sbt.compiler.AggressiveCompile$$anonfun$3$$anonfun$compileScala$1$1.apply(AggressiveCompile.scala:99)
> >         at sbt.compiler.AggressiveCompile.sbt$compiler$AggressiveCompile$$timed(AggressiveCompile.scala:166)
> >         at sbt.compiler.AggressiveCompile$$anonfun$3.compileScala$1(AggressiveCompile.scala:98)
> >         at sbt.compiler.AggressiveCompile$$anonfun$3.apply(AggressiveCompile.scala:143)
> >         at sbt.compiler.AggressiveCompile$$anonfun$3.apply(AggressiveCompile.scala:87)
> >         at sbt.inc.IncrementalCompile$$anonfun$doCompile$1.apply(Compile.scala:39)
> >         at sbt.inc.IncrementalCompile$$anonfun$doCompile$1.apply(Compile.scala:37)
> >         at sbt.inc.IncrementalCommon.cycle(Incremental.scala:99)
> >         at sbt.inc.Incremental$$anonfun$1.apply(Incremental.scala:38)
> >         at sbt.inc.Incremental$$anonfun$1.apply(Incremental.scala:37)
> >         at sbt.inc.Incremental$.manageClassfiles(Incremental.scala:65)
> >         at sbt.inc.Incremental$.compile(Incremental.scala:37)
> >         at sbt.inc.IncrementalCompile$.apply(Compile.scala:27)
> >         at sbt.compiler.AggressiveCompile.compile2(AggressiveCompile.scala:157)
> >         at sbt.compiler.AggressiveCompile.compile1(AggressiveCompile.scala:71)
> >         at sbt.compiler.AggressiveCompile.apply(AggressiveCompile.scala:46)
> >         at sbt.Compiler$.apply(Compiler.scala:75)
> >         at sbt.Compiler$.apply(Compiler.scala:66)
> >         at sbt.Defaults$.sbt$Defaults$$compileTaskImpl(Defaults.scala:770)
> >         at sbt.Defaults$$anonfun$compileTask$1.apply(Defaults.scala:762)
> >         at sbt.Defaults$$anonfun$compileTask$1.apply(Defaults.scala:762)
> >         at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
> >         at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:42)
> >         at sbt.std.Transform$$anon$4.work(System.scala:64)
> >         at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:237)
> >         at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:237)
> >         at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:18)
> >         at sbt.Execute.work(Execute.scala:244)
> >         at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
> >         at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
> >         at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:160)
> >         at sbt.CompletionService$$anon$2.call(CompletionService.scala:30)
> >         at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> >         at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> >         at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> >         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> >         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> >         at java.lang.Thread.run(Thread.java:744)
> > [error] (sql/compile:compile) java.lang.AssertionError: assertion failed: List(object package$DebugNode, object package$DebugNode)
> > [error] Total time: 126 s, completed Jul 23, 2014 11:19:27 AM
> >
> > I don't need Spark SQL; I can do without it.
> >
> >
> >
> >
> >
> > --
> > View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/spark-github-source-build-error-tp10532.html
> > Sent from the Apache Spark User List mailing list archive at Nabble.com.



