spark-user mailing list archives

From Jesús Vásquez <>
Subject Re: How to add more imports at the start of REPL
Date Tue, 05 Mar 2019 11:46:55 GMT
Hi Nuthan, I have had the same issue before.
As a shortcut, I created a text file called "imports" and then loaded its
contents with the :load command of the Scala REPL.
Create an "imports" text file with the import statements you need:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.Path

Then load it with :load:

:load imports

Once you have the text file with the import statements, you just have to
load its contents each time you start the REPL.
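A related sketch, assuming spark-shell is on your PATH: the Scala REPL (and therefore spark-shell) also accepts a `-i` flag that runs a script before dropping into the prompt, which saves typing `:load` each session. The filename `imports.scala` here is just an illustration.

```shell
# Put the imports you always need into a file (the name is arbitrary):
cat > imports.scala <<'EOF'
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.Path
EOF

# Launch the REPL with the file pre-loaded via -i,
# so the imports are available at the first prompt:
spark-shell -i imports.scala
```

This is equivalent to running `:load imports.scala` manually at the start of every session.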

On Tue, Mar 5, 2019 at 12:14, Nuthan Reddy ( wrote:

> Hi,
> When launching the REPL using spark-submit, the following are loaded
> automatically.
> scala> :imports
>  1) import org.apache.spark.SparkContext._ (70 terms, 1 are implicit)
>  2) import spark.implicits._       (1 types, 67 terms, 37 are implicit)
>  3) import spark.sql               (1 terms)
>  4) import org.apache.spark.sql.functions._ (385 terms)
> And I would like to add more imports that I frequently use, to reduce the
> typing I do for these imports.
> Can anyone suggest a way to do this?
> Nuthan Reddy
> Sigmoid Analytics
> *Disclaimer*: This is not a mass e-mail, and my intention here is purely
> business-related, not to spam or encroach on your privacy. I am writing
> with a specific agenda to build a personal business connection. As a
> reputed and genuine organization, Sigmoid respects the digital security of
> every prospect and tries to comply with GDPR and other regional laws.
> Please let us know if you feel otherwise and we will rectify the
> misunderstanding and comply in the future. If we have missed any
> compliance requirement, it is completely unintentional.
