spark-user mailing list archives

From charles li <>
Subject confusing about start ipython notebook with spark between 1.3.x and 1.6.x
Date Mon, 01 Feb 2016 02:43:37 GMT
I used to use Spark 1.3.x to explore my data in an IPython [3.2]
notebook, which was very stable, but with 1.6.x I came across this error:

 " Java gateway process exited before sending the driver its port number "

My code is as below:

import pyspark
from pyspark import SparkConf

sc_conf = SparkConf()             ### error occurs here

Then I asked Google for help; an answer on Stack Overflow says:


One solution is adding pyspark-shell to the shell environment variable PYSPARK_SUBMIT_ARGS:

export PYSPARK_SUBMIT_ARGS="--master local[2] pyspark-shell"

There is a change in python/pyspark/
which requires that PYSPARK_SUBMIT_ARGS include pyspark-shell if a
PYSPARK_SUBMIT_ARGS variable is set by a user.
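
If I read the 1.6 source right, the relevant logic (my paraphrase of python/pyspark/java_gateway.py, not the exact code; the function name here is mine) boils down to: `pyspark-shell` is only the *default* when PYSPARK_SUBMIT_ARGS is unset, so a user-supplied value is passed to spark-submit verbatim and must carry the token itself:

```python
import os

def resolve_submit_args(env=os.environ):
    # Sketch of the Spark 1.4+ behavior as I understand it:
    # "pyspark-shell" is only the fallback default. A user-supplied
    # PYSPARK_SUBMIT_ARGS goes to spark-submit as-is, so if it does
    # not end with "pyspark-shell", the Java gateway never starts and
    # we get the "exited before sending the driver its port number"
    # error.
    return env.get("PYSPARK_SUBMIT_ARGS", "pyspark-shell")

# Unset: the default still launches the shell backend.
print(resolve_submit_args({}))
# Set without the trailing token: the shell backend is lost.
print(resolve_submit_args({"PYSPARK_SUBMIT_ARGS": "--master local[2]"}))
```

That would explain why my old 1.3.x setup, which set PYSPARK_SUBMIT_ARGS without the token, broke after the upgrade.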


Then I changed my PYSPARK_SUBMIT_ARGS from `--master spark:// --deploy-mode client` to `--master spark:// --deploy-mode client pyspark-shell`, and that does work, but
it raises another question: each time I create sc in a different
notebook, the Spark app name is `pyspark-shell`, even though I explicitly
set the app name using SparkConf. That has really confused me these days.
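
For reference, this is roughly how I launch the notebook now (the master URL is elided as above; if I understand the 1.4 release notes correctly, PYSPARK_DRIVER_PYTHON / PYSPARK_DRIVER_PYTHON_OPTS replaced the old IPYTHON / IPYTHON_OPTS variables):

```shell
# Tell bin/pyspark to run IPython Notebook as the driver front end
export PYSPARK_DRIVER_PYTHON=ipython
export PYSPARK_DRIVER_PYTHON_OPTS="notebook"

# Note the trailing pyspark-shell -- without it, SparkConf() fails
# with the "Java gateway process exited" error
export PYSPARK_SUBMIT_ARGS="--master spark:// --deploy-mode client pyspark-shell"

# then launch with:  ./bin/pyspark
```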

So my questions are:

   - How do I start an IPython notebook with Spark integrated in Spark 1.6.0?
   - Why does it work when I add `pyspark-shell` to PYSPARK_SUBMIT_ARGS when
   starting an IPython notebook with Spark 1.6.0?
   - Why does it not work when I explicitly set the app name using SparkConf?

great thanks.

a spark lover, a quant, a developer and a good man.
