spark-dev mailing list archives

From Matei Zaharia <matei.zaha...@gmail.com>
Subject Re: "log" overloaded in SparkContext/ Spark 1.0.x
Date Tue, 05 Aug 2014 00:41:54 GMT
Hah, weird. "log" should actually be protected (look at trait Logging). Is your class extending
SparkContext, or somehow placed in the org.apache.spark package? Or maybe the Scala compiler
picks it up anyway; in that case we can rename it. Please open a JIRA if that's the
case.

On August 4, 2014 at 5:02:27 PM, Dmitriy Lyubimov (dlieu.7@gmail.com) wrote:

It would seem that code like

import o.a.spark.SparkContext._ 
import math._ 

.... 

val a = log(b)

no longer compiles with Spark 1.0.x, since SparkContext._ also
exposes a `log` member. Which happens a lot to a guy like me.

The obvious workaround is something like

import o.a.spark.SparkContext.{log => sparkLog, _} 

but wouldn't it be easier to avoid such a predictable clash in the first
place?
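For reference, here is a self-contained sketch of how the rename-on-import workaround resolves the ambiguity. FakeSparkContext is a hypothetical stand-in for the real SparkContext companion object, just to reproduce the clash without a Spark dependency:

```scala
// Hypothetical stand-in for org.apache.spark.SparkContext's companion:
// a bulk import of its members would shadow scala.math.log.
object FakeSparkContext {
  def log(msg: String): Unit = println(msg) // clashes with math.log(Double)
}

object Demo extends App {
  // Renaming on import binds the clashing member to a new name
  // while `_` still brings in everything else; math.log stays usable.
  import FakeSparkContext.{log => sparkLog, _}
  import math._

  sparkLog("starting computation")
  val a = log(math.E) // unambiguously scala.math.log now
  assert(math.abs(a - 1.0) < 1e-9)
  println(a)
}
```

Without the rename, `log(math.E)` would be ambiguous between the two imported `log`s and fail to compile; the `{log => sparkLog, _}` selector is the standard Scala idiom for this situation.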

thank you. 
-d 
