spark-dev mailing list archives

From Isca Harmatz <>
Subject Random Forest driver memory
Date Tue, 16 Jun 2015 04:45:35 GMT

I have noticed that the random forest implementation crashes when
too many trees / too large a maxDepth is used.

I'm guessing this has something to do with the number of nodes that need
to be kept in the driver's memory during the run.

But when I examined the node structure, it seemed rather small.

Does anyone know where the memory issue comes from?
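A rough way to see how numTrees and maxDepth interact is to count the worst-case number of tree nodes: a fully grown binary tree of depth d has 2^(d+1) - 1 nodes, so node count grows exponentially with maxDepth and linearly with numTrees. This is only a back-of-the-envelope sketch; Spark's actual per-node bookkeeping (and when nodes are held on the driver) may differ.

```python
# Illustrative worst-case node count for a forest; NOT Spark's internal
# accounting, just the combinatorics of fully grown binary trees.
def max_forest_nodes(num_trees: int, max_depth: int) -> int:
    # A complete binary tree of depth d has 2**(d+1) - 1 nodes.
    nodes_per_tree = 2 ** (max_depth + 1) - 1
    return num_trees * nodes_per_tree

# A modest forest stays small...
print(max_forest_nodes(num_trees=10, max_depth=5))    # 630 nodes
# ...but deepening the trees blows the count up exponentially.
print(max_forest_nodes(num_trees=100, max_depth=20))  # 209715100 nodes
```

Even if each node's structure looks small in isolation, hundreds of millions of them (plus any per-node statistics kept during training) can easily exhaust driver memory.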
