spark-user mailing list archives

From Soumya Simanta <soumya.sima...@gmail.com>
Subject Re: scalac crash when compiling DataTypeConversions.scala
Date Tue, 28 Oct 2014 01:12:09 GMT
You need to change the Scala compiler from IntelliJ to “sbt incremental
compiler” (see the screenshot below).

You can access this by going to “Preferences” -> “Scala”.

NOTE: This is supported only for certain versions of the IntelliJ Scala
plugin. See this link for details.

http://blog.jetbrains.com/scala/2014/01/30/try-faster-scala-compiler-in-intellij-idea-13-0-2/
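For reference, the .iml cleanup that Yana describes in the quoted message below can be done from a Unix shell. This is only a sketch; it assumes you run it from the root of the Spark checkout, and the re-import step afterwards is a manual assumption:

```shell
# From the root of the project checkout: delete every IntelliJ
# module file (*.iml) anywhere under the project directory.
find . -type f -name '*.iml' -delete

# Then re-import the project in IDEA (File > Open -> pom.xml)
# so the module files are regenerated from the Maven build.
```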

On Mon, Oct 27, 2014 at 9:04 PM, Yana Kadiyska <yana.kadiyska@gmail.com>
wrote:

> guoxu1231, I struggled with the IDEA problem for a full week. Same thing
> -- clean builds under Maven/sbt, but no luck with IDEA. What worked for me
> was the solution posted higher up in this thread -- a Stack Overflow post
> that basically says to delete all .iml files anywhere under the project
> directory.
>
> Let me know if you can't see this mail and I'll locate the exact SO post
>
> On Mon, Oct 27, 2014 at 5:15 AM, guoxu1231 <guoxu1231@gmail.com> wrote:
>
>> Hi Stephen,
>> I tried it again.
>> To avoid any profile impact, I executed "mvn -DskipTests clean package"
>> with Hadoop 1.0.4 by default, then opened IDEA and imported it as a Maven
>> project without choosing any profile in the import wizard.
>> Then I ran "Make Project" / "Rebuild Project" in IDEA; unfortunately
>> DataTypeConversions.scala failed to compile again.
>>
>>
>> Is there an updated guide for using IntelliJ IDEA? I'm following
>> "Building Spark with Maven" on the website.
>>
>>
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/scalac-crash-when-compiling-DataTypeConversions-scala-tp17083p17333.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
>> For additional commands, e-mail: user-help@spark.apache.org
>>
>>
>
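The sequence guoxu1231 describes can be sketched as the following shell session. The `mvn` flags come from the message above; the IDEA steps are manual and are noted only as comments:

```shell
# Build Spark against the default Hadoop 1.0.4 profile, skipping tests.
mvn -DskipTests clean package

# Afterwards (manual): import the project into IntelliJ IDEA as a Maven
# project with no profile selected, then run Build -> Rebuild Project.
```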
