spark-user mailing list archives

From Mich Talebzadeh <>
Subject Reading configuration file in Spark Scala throws error
Date Sat, 03 Aug 2019 19:28:57 GMT

I have a config file application.conf that I am trying to read.

The skeleton code is as follows:

import com.typesafe.config.ConfigFactory
import scala.collection.JavaConverters

  def main(args: Array[String]): Unit = {
    val globalConfig = ConfigFactory.load()  // pass in a filename (without extension) to load an additional config file from src/main/resources or the CLASSPATH
    val conf         = globalConfig.getConfig("database")  // extract the top-level key from the top-level namespace
    conf.entrySet().iterator().forEachRemaining { entry =>
      val key:    String = entry.getKey
      val value:  Any    = entry.getValue.unwrapped()  // access via the entry
      val value2: Any    = conf.getAnyRef(key)         // access via hash lookup from the config
      println(s"$key : $value | $value2")
    }
  }
But I am getting the following error:

[info] Compiling 1 Scala source to
missing parameter type
[error]     conf.entrySet().iterator().forEachRemaining { entry =>
[error]                                                   ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
The application.conf file has the following layout:

database = {
  dbDatabase = "trading"
  dbPassword = "mongodb"
  dbUsername = "trading_user_RW"
  bootstrapServers = "rhes75:9092"
}
I would appreciate any hints.
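For context, the "missing parameter type" error typically arises because forEachRemaining expects a java.util.function.Consumer, so older Scala compilers (pre-2.12, without SAM conversion) cannot infer the lambda's parameter type. One possible workaround is to convert the Java entry set to a Scala collection first. A minimal sketch, assuming the Typesafe Config library is on the classpath (it uses parseString here purely so the snippet is self-contained; ConfigFactory.load() would read application.conf as in the original code):

```scala
import com.typesafe.config.{Config, ConfigFactory}
import scala.collection.JavaConverters._

object ReadConfigSketch {
  def main(args: Array[String]): Unit = {
    // Parse an inline config so this sketch runs standalone;
    // in the real application ConfigFactory.load() reads application.conf.
    val globalConfig: Config = ConfigFactory.parseString(
      """database = {
        |  dbDatabase = "trading"
        |  dbPassword = "mongodb"
        |}""".stripMargin)
    val conf = globalConfig.getConfig("database")
    // asScala yields a Scala Set, so the compiler infers the entry type
    // and no java.util.function.Consumer is ever needed.
    for (entry <- conf.entrySet().asScala) {
      val key:   String = entry.getKey
      val value: Any    = entry.getValue.unwrapped()
      println(s"$key : $value")
    }
  }
}
```

Alternatively, upgrading to Scala 2.12+, where function literals convert to Java SAM types, should make the original forEachRemaining call compile as written.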


Dr Mich Talebzadeh

Disclaimer: Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.
