spark-dev mailing list archives

From Larry Xiao <>
Subject Compiling Spark master (6ba6c3eb) with sbt/sbt assembly
Date Mon, 04 Aug 2014 03:48:52 GMT
On the latest pull today (6ba6c3ebfe9a47351a50e45271e241140b09bf10), I hit an 
assembly problem.

$ ./sbt/sbt assembly
Using /usr/lib/jvm/java-7-oracle as default JAVA_HOME.
Note, this will be overridden by -java-home if it is set.
[info] Loading project definition from ~/spark/project/project
[info] Loading project definition from 
[warn] Multiple resolvers having different access mechanism configured with same name 'sbt-plugin-releases'. To avoid conflict, Remove duplicate project resolvers (`resolvers`) or rename publishing resolver (`publishTo`).
[info] Loading project definition from ~/spark/project
[info] Set current project to spark-parent (in build file:~/spark/)
[info] Compiling 372 Scala sources and 35 Java sources to 
type mismatch;
[error]  found   :
[error]  required:
[error]       stageData.taskData.put(taskInfo.taskId, new 
[error]                                               ^
type mismatch;
[error]  found   :
[error]  required:
[error]       val execSummary = execSummaryMap.getOrElseUpdate(info.executorId, new ExecutorSummary)
[error] ^
type mismatch;
[error]  found   :
[error]  required:
[error]       val taskData = stageData.taskData.getOrElseUpdate(info.taskId, new TaskUIData(info))
[error] ^
type mismatch;
[error]  found   :
[error]  required:
[error]     val execSummary = stageData.executorSummary.getOrElseUpdate(execId, new ExecutorSummary)
[error] ^
~/spark/core/src/main/scala/org/apache/spark/ui/jobs/StagePage.scala:109: type mismatch;
[error]  found   : => 
[error]  required: => 
[error] Error occurred in an application involving default arguments.
[error]         taskHeaders, taskRow(hasInput, hasShuffleRead, hasShuffleWrite, hasBytesSpilled), tasks)
[error]                             ^
~/spark/core/src/main/scala/org/apache/spark/ui/jobs/StagePage.scala:119: constructor cannot be instantiated to expected type;
[error]  found   :
[error]  required:
[error]           val serializationTimes = { case TaskUIData(_, metrics, _) =>
[error]                                                          ^
~/spark/core/src/main/scala/org/apache/spark/ui/jobs/StagePage.scala:120: not found: value metrics
[error]             metrics.get.resultSerializationTime.toDouble
[error]             ^

I think these call sites don't correctly reference the updated data structures.

"core/src/main/scala/org/apache/spark/ui/jobs/UIData.scala" was 
introduced in commit 72e9021eaf26f31a82120505f8b764b18fbe8d48
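For illustration, here is a minimal sketch of the class of error above. The types and names below are hypothetical stand-ins, not the actual Spark code: assume UIData.scala now wraps a TaskInfo in a TaskUIData, so an old call site that still inserts a bare TaskInfo into the map no longer type-checks, while an updated call site does.

```scala
import scala.collection.mutable

// Hypothetical stand-ins for the real classes in UIData.scala:
case class TaskInfo(taskId: Long)
case class TaskUIData(info: TaskInfo)

object Sketch {
  // The map's value type is now TaskUIData, not TaskInfo.
  val taskData = mutable.HashMap[Long, TaskUIData]()

  def record(info: TaskInfo): Unit = {
    // taskData.put(info.taskId, info)   // old call site: type mismatch after the change
    taskData.getOrElseUpdate(info.taskId, TaskUIData(info)) // compiles against the new type
  }
}
```

Uncommenting the old `put` line reproduces a "type mismatch; found: TaskInfo, required: TaskUIData" error of the same shape as the ones in the log.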


