spark-dev mailing list archives

From Dmitriy Lyubimov <dlie...@gmail.com>
Subject "log" overloaded in SparkContext/ Spark 1.0.x
Date Tue, 05 Aug 2014 00:01:58 GMT
It would seem that code like

import o.a.spark.SparkContext._
import math._

....

a = log(b)

no longer compiles with Spark 1.0.x, since SparkContext._ also
exposes a `log` function, and that clash bites a guy like me a lot.

The obvious workaround is to use something like

import o.a.spark.SparkContext.{log => sparkLog, _}

but wouldn't it be easier to just avoid such a predictable clash in the
first place?
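
To make the rename trick concrete, here is a minimal self-contained sketch.
FakeContext is just a hypothetical stand-in for whatever SparkContext._ pulls
into scope, not actual Spark code:

object FakeContext {
  // `log` here stands in for the member that SparkContext._ exposes in 1.0.x;
  // the other member is just filler to show the wildcard still works.
  def log(msg: String): Unit = println(msg)
  def otherHelper(x: Int): Int = x + 1
}

object LogClashExample extends App {
  // Rename the clashing member on import and keep everything else via the wildcard.
  import FakeContext.{log => ctxLog, _}
  import math._

  val a = log(10.0)          // now unambiguously scala.math.log
  ctxLog(s"log(10.0) = $a")  // the renamed member is still reachable
  println(otherHelper(1))    // the wildcard still brings in the rest
}

Renaming keeps the member available under a new name; using {log => _, _}
instead would simply hide it.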

thank you.
-d
