spark-user mailing list archives

From James <>
Subject Re: How to correctly estimate the number of partitions of a graph in GraphX
Date Sun, 02 Nov 2014 06:28:58 GMT

We have a graph with 100 billion edges, nearly 800 GB in gzip format.
We have 80 machines, each with 60 GB of memory.
I have never seen the program run to completion.
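A quick back-of-envelope sketch of why this graph strains the cluster. The edge count and cluster size are from the thread; the bytes-per-edge figure is an assumption (two 8-byte vertex IDs plus overhead), and the real in-memory cost varies with the partition strategy and edge attributes:

```scala
// Back-of-envelope: edge data vs. cluster memory.
// Edge count and cluster size come from the thread above;
// bytesPerEdge is an ASSUMPTION, not a measured GraphX figure.
object SizingSketch {
  def main(args: Array[String]): Unit = {
    val numEdges = 100e9                          // 100B edges
    val machines = 80
    val memPerMachineGB = 60.0
    val clusterMemGB = machines * memPerMachineGB // 4800 GB total

    // Assumed ~20 bytes/edge: two 8-byte vertex IDs plus overhead.
    val bytesPerEdge = 20.0
    val edgesGB = numEdges * bytesPerEdge / math.pow(1024, 3)

    println(f"cluster memory: $clusterMemGB%.0f GB, edges in memory: ~$edgesGB%.0f GB")
  }
}
```

Under that assumption the edges alone occupy roughly 1.8 TB, over a third of the cluster's 4.8 TB of memory, before counting vertices, shuffle buffers, or JVM overhead, which is why spilling to disk becomes attractive.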


2014-11-02 14:06 GMT+08:00 Ankur Dave <>:

> How large is your graph, and how much memory does your cluster have?
> We don't have a good way to determine the *optimal* number of partitions
> aside from trial and error, but to get the job to at least run to
> completion, it might help to use the MEMORY_AND_DISK storage level and a
> large number of partitions.
> Ankur <>
> On Sat, Nov 1, 2014 at 10:57 PM, James <> wrote:
>> Hello,
>> I am trying to run the connected components algorithm on a very large
>> graph. In practice I found that a small number of partitions leads to
>> OOM, while a large number causes various timeout exceptions. So I wonder:
>> how should I estimate the number of partitions for a graph in GraphX?
>> Alcaid
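
The suggestion above (MEMORY_AND_DISK storage plus a large partition count) can be sketched as follows. The input path, output path, and partition count are placeholders; the partition count in particular has to be found by the trial and error the thread describes:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.graphx.{GraphLoader, PartitionStrategy}
import org.apache.spark.storage.StorageLevel

// Sketch of connected components with disk spill enabled,
// per the advice in this thread. Paths and the partition
// count are placeholders for illustration only.
object ConnectedComponentsSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("connected-components"))

    // Load edges with many partitions, and let both edge and vertex
    // partitions spill to disk instead of failing with OOM.
    val graph = GraphLoader.edgeListFile(
      sc,
      "hdfs:///path/to/edges",                        // placeholder
      numEdgePartitions = 8000,                       // placeholder; tune by trial
      edgeStorageLevel = StorageLevel.MEMORY_AND_DISK,
      vertexStorageLevel = StorageLevel.MEMORY_AND_DISK
    ).partitionBy(PartitionStrategy.EdgePartition2D)

    val cc = graph.connectedComponents().vertices
    cc.saveAsTextFile("hdfs:///path/to/output")       // placeholder
    sc.stop()
  }
}
```

Raising the partition count shrinks each task's working set (reducing OOM risk) at the cost of more tasks and shuffle overhead, which is the trade-off behind the timeouts mentioned above.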
