My system is Windows 64-bit. I looked in the Resource Monitor: Java is the only process using CPU, at about 13%; there is no disk activity related to Java, and only about 6 GB of memory is in use out of 56 GB in total.
My system responds very well. I don't think it is a system issue.
I think you'd have to say more about "stopped working". Is the GC
thrashing? Does the UI respond? Is the CPU busy or not?
On Mon, Mar 16, 2015 at 4:25 AM, Xi Shen <firstname.lastname@example.org> wrote:
> I am running k-means using Spark in local mode. My data set is about 30k
> records, and I set the k = 1000.
> The algorithm started and finished 13 jobs according to the UI monitor, then
> it stopped working.
> The last log I saw was:
> [Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned
> broadcast 16
> There are many similar log lines repeated, but it seems it always stops at the 16th.
> If I lower the k value, the algorithm terminates. So I just
> want to know what's wrong with k=1000.
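One thing worth keeping in mind when raising k: the per-iteration cost of Lloyd's k-means grows linearly with k, since every point is compared against every center. With 30k records and k=1000 that is roughly 30 million distance computations per iteration, so a run that finishes quickly for small k can look stalled at k=1000 even when it is still making progress. A minimal, Spark-independent sketch of one Lloyd iteration (plain Python, toy data; the actual job uses Spark MLlib's KMeans) shows where the n * k * dim work comes from:

```python
import random

def lloyd_iteration(points, centers):
    """One Lloyd iteration: assign each point to its nearest center,
    then move each center to the mean of its assigned points."""
    k, dim = len(centers), len(points[0])
    sums = [[0.0] * dim for _ in range(k)]
    counts = [0] * k
    for p in points:
        # nearest-center search: O(k * dim) work per point,
        # so each full iteration costs O(n * k * dim)
        best = min(range(k),
                   key=lambda j: sum((a - b) ** 2
                                     for a, b in zip(p, centers[j])))
        counts[best] += 1
        for d in range(dim):
            sums[best][d] += p[d]
    # a cluster that received no points keeps its old center
    return [[s / counts[j] for s in sums[j]] if counts[j] else centers[j]
            for j in range(k)]

# Toy data to exercise the sketch (sizes are arbitrary, not the 30k/1000 case)
random.seed(0)
points = [[random.random(), random.random()] for _ in range(300)]
centers = random.sample(points, 10)
centers = lloyd_iteration(points, centers)
```

This doesn't prove the job isn't actually hung, but it does mean "no new log output for a while at k=1000" is consistent with a long iteration rather than a deadlock; checking whether the driver's CPU stays busy during the quiet period would help distinguish the two.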