spark-user mailing list archives

From Russell Jurney <>
Subject Automating lengthy command to pyspark with configuration?
Date Sun, 28 Aug 2016 22:30:20 GMT
In order to use PySpark with MongoDB and ElasticSearch, I currently run
rather long commands such as:

1) pyspark --executor-memory 10g --jars

2) pyspark --jars ../lib/elasticsearch-hadoop-2.3.4.jar --driver-class-path
Can all of these options be set in my configuration, so that I don't
have to pass these lengthy arguments to pyspark every time?
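I assume the equivalent entries in conf/spark-defaults.conf would look
something like the following (spark.jars and spark.driver.extraClassPath
being the properties behind --jars and --driver-class-path), but I
haven't confirmed this:

    # conf/spark-defaults.conf -- sketch only; property names are from
    # Spark's configuration docs, values taken from the commands above
    spark.executor.memory        10g
    spark.jars                   ../lib/elasticsearch-hadoop-2.3.4.jar
    # --driver-class-path would presumably map to:
    # spark.driver.extraClassPath  <path as in the command above>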

Russell Jurney
