spark-user mailing list archives

From jamborta <>
Subject create SparkContext dynamically
Date Wed, 18 Jun 2014 23:10:29 GMT
Hi all,

I am setting up a system where Spark contexts would be created by a web server
that handles the computation and returns the results. I have the following
code (in Python):

import os
from pyspark import SparkContext

os.environ['SPARK_HOME'] = "/home/spark/spark-1.0.0-bin-hadoop2/"
sc = SparkContext(master="spark://ip-xx-xx-xx-xx:7077", appName="Simple")
l = sc.parallelize([1, 2, 3, 4])
c = l.count()

but the last line throws a seemingly unrelated error: 'TypeError: an integer
is required'.

I assume I did not set up the environment properly. I have added SPARK_HOME
and the py4j source to the classpath; not sure what is missing.
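For reference, a minimal sketch of the path setup described above, done before importing pyspark. The Spark install path is taken from the post; the py4j zip filename (0.8.1 is the version bundled with Spark 1.0.0) is an assumption and would need to match the actual file under SPARK_HOME/python/lib:

```python
import os
import sys

# Spark install location (from the post above)
spark_home = "/home/spark/spark-1.0.0-bin-hadoop2/"
os.environ["SPARK_HOME"] = spark_home

# Make the pyspark package and the bundled py4j sources importable
# before 'from pyspark import SparkContext' runs.
# NOTE: the py4j zip name is an assumption; check SPARK_HOME/python/lib.
sys.path.insert(0, os.path.join(spark_home, "python"))
sys.path.insert(0, os.path.join(spark_home, "python", "lib",
                                "py4j-0.8.1-src.zip"))
```

A mismatch here (or a different Python version on the driver than on the workers) is one common source of otherwise puzzling errors from PySpark.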

