spark-user mailing list archives

From Sai Prasanna <>
Subject [No Subject]
Date Fri, 18 Apr 2014 07:06:27 GMT
Hi All,

In the interactive shell, the SparkContext remains the same across queries. So if I run a query
multiple times, will the RDDs created by previous runs be reused in the
subsequent runs rather than recomputed, until I exit and restart the shell?

Or is there a way to programmatically force the reuse or recomputation of
RDDs, depending on whether they are already present?
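[Editor's note: for context, the explicit mechanism Spark provides for this is `cache()`/`persist()` and `unpersist()`. A minimal spark-shell sketch, assuming `sc` is the shell's SparkContext and `data.txt` is a hypothetical input file:]

```scala
// Runs inside spark-shell, where `sc: SparkContext` is predefined.
val lines = sc.textFile("data.txt")   // hypothetical input path

// Without cache(), every action recomputes the RDD lineage from the file.
// cache() asks Spark to keep the partitions in memory after the first action.
val cached = lines.filter(_.nonEmpty).cache()

cached.count()   // first action: computes and materializes the RDD
cached.count()   // later actions reuse the in-memory copy, not the file

// To force recomputation on the next action, drop the cached copy:
cached.unpersist()
```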

Thanks !
