spark-user mailing list archives

From Todd <>
Subject Re: Re: EventBatch and SparkFlumeProtocol not found in spark codebase?
Date Fri, 09 Jan 2015 23:37:12 GMT
Thanks, Sean.
I followed the guide and imported the codebase into IntelliJ IDEA as a Maven project, with the hadoop-2.4
and yarn profiles enabled.
In the Maven Projects view, I ran Maven Install against the module Spark Project Parent POM (root). After
a fairly long time, all the modules built successfully.
But when I run the LocalPi example, compile errors appear:
1. EventBatch and SparkFlumeProtocol don't exist
2. There are a bunch of errors complaining that q is not a member of StringContext in CodeGenerator.scala
I then tried clicking "Generate Sources and Update Folders For All Projects" and repeated
Maven Install... the build still succeeds, but the compile errors remain.
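For reference, the first error usually means the Avro-generated sources for the Flume sink were never produced: EventBatch and SparkFlumeProtocol are generated from an Avro IDL file at build time, not checked into the repository. A minimal command-line sketch to regenerate them, assuming the standard Spark source tree layout (the module path and profile names are taken from the build described above and may differ in your checkout):

```shell
# Sketch: regenerate the Avro-generated classes for the Flume sink module.
# EventBatch and SparkFlumeProtocol are produced by the Avro plugin during
# generate-sources; they do not exist in the repo itself.
cd /path/to/spark                                # assumed checkout location
mvn -Phadoop-2.4 -Pyarn -pl external/flume-sink generate-sources
```

After this runs, the generated source folder still has to be marked as a source root in IntelliJ, which is what the "Generate Sources and Update Folders For All Projects" button is meant to do.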

Sean, any guidance on this? Thanks

At 2015-01-09 18:08:11, "Sean Owen" <> wrote:
>What's up with the IJ questions all of a sudden?
>This PR from yesterday contains a summary of the answer to your question:
> :
>"Rebuild Project" can fail the first time the project is compiled,
>because generated source files are not automatically generated. Try
>clicking the "Generate Sources and Update Folders For All Projects"
>button in the "Maven Projects" tool window to manually generate these
>On Fri, Jan 9, 2015 at 10:03 AM, <> wrote:
>> Hi,
>> When I fetched the Spark codebase and imported it into IntelliJ IDEA as an SBT
>> project, then built it with SBT, there were compile errors in the
>> examples module complaining about EventBatch and SparkFlumeProtocol; it looks like
>> they should be in the
>> org.apache.spark.streaming.flume.sink package.
>> Not sure what happened.
>> Thanks.