spark-user mailing list archives

From jamborta <>
Subject stack map functions in a loop (pyspark)
Date Thu, 19 Feb 2015 15:57:49 GMT
Hi all,

I think I have run into an issue with the lazy evaluation of variables in
pyspark. I have the following:

functions = [func1, func2, func3]

for counter in range(len(functions)):
    data = data.map(lambda value: [functions[counter](value)])

It looks like the counter is only evaluated when the RDD is computed, so
all three mappers end up using its last value. Is there any way to force it
to be evaluated at definition time? (I am aware that I could persist the
RDD after each step, but that sounds like a bit of a waste.)
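
For context, the usual Python fix for this kind of late binding is to
capture the loop variable as a default argument of the lambda, so it is
evaluated when the lambda is defined rather than when the RDD is computed.
A minimal, untested sketch (data and func1..func3 as above):

functions = [func1, func2, func3]

for counter in range(len(functions)):
    # f=functions[counter] is evaluated here, at definition time, so each
    # mapper keeps its own function instead of sharing the final counter.
    data = data.map(lambda value, f=functions[counter]: [f(value)])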

