spark-user mailing list archives

From Xiang Huo <huoxiang5659@gmail.com>
Subject Question about SparkContext.stop()
Date Mon, 30 Sep 2013 22:45:39 GMT
Hi all,

When I run sc.stop() in a standalone program, does that mean all resources
used by sc, such as memory, CPU, and any processes it created, are freed? Is
it possible to restart a SparkContext in a standalone program?

I want to use Spark to run a job on files batch by batch. Let's say there
are 100 files in one batch. What I tried is the following:

while (moreBatches) {                             // placeholder loop condition
  val sc = new SparkContext("local", "BatchJob")  // create a new SparkContext
  // ... run the job on this batch of files ...
  sc.stop()                                       // stop it before the next iteration
}

But it doesn't work. Is there any way that would allow me to do it this
way?
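
For concreteness, here is a minimal self-contained sketch of what I am
trying to do (the "local" master, the batch paths, and the count() job are
just placeholders I made up):

import org.apache.spark.SparkContext

object BatchRunner {
  def main(args: Array[String]) {
    // Hypothetical input: each glob stands in for one batch of ~100 files
    val batches = Seq("hdfs:///data/batch1/*", "hdfs:///data/batch2/*")

    for (batch <- batches) {
      // A fresh context per batch; master URL and app name are placeholders
      val sc = new SparkContext("local", "BatchJob")
      val count = sc.textFile(batch).count()  // placeholder job on this batch
      println("Processed " + batch + ": " + count + " records")
      sc.stop()  // I expect this to release the context's resources
    }
  }
}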

Thanks,

Xiang

-- 
Xiang Huo
Department of Computer Science
University of Illinois at Chicago(UIC)
Chicago, Illinois
US
Email: huoxiang5659@gmail.com
           or xhuo4@uic.edu
