Hi all, 

When I call sc.stop() in a standalone program, does that mean all resources used by sc, such as memory, the processes it created, and CPU, will be freed? And is it possible to restart a SparkContext in a standalone program?

I want to use Spark to run a job on files batch by batch. Let's say there are 100 files in one batch. What I tried is as follows:

  For each batch, create a new sc with new SparkContext(conf), run the job on that batch, then call sc.stop().
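
Concretely, the pattern looks roughly like the sketch below (Scala; the batch list, file paths, and per-batch logic are placeholders, not my real job):

  import org.apache.spark.{SparkConf, SparkContext}

  object BatchRunner {
    def main(args: Array[String]): Unit = {
      // Placeholder batches; in the real job each batch holds 100 files.
      val batches: Seq[Seq[String]] = Seq(
        Seq("hdfs:///data/batch1/part-*"),
        Seq("hdfs:///data/batch2/part-*")
      )

      for ((batch, i) <- batches.zipWithIndex) {
        // Fresh context per batch ("local[*]" here is just for testing).
        val conf = new SparkConf().setAppName(s"batch-$i").setMaster("local[*]")
        val sc = new SparkContext(conf)
        try {
          // Placeholder work: count the lines in each file set.
          val counts = batch.map(path => sc.textFile(path).count())
          println(s"batch $i line counts: ${counts.mkString(", ")}")
        } finally {
          // Stop the context so its executors, threads, and memory are
          // released before the next SparkContext is created.
          sc.stop()
        }
      }
    }
  }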

But it doesn't work. So is there any way I can make this approach work?



Xiang Huo
Department of Computer Science
University of Illinois at Chicago (UIC)
Chicago, Illinois
Email: huoxiang5659@gmail.com
           or xhuo4@uic.edu