Not sure about dropout, but if you change the solver from Breeze BFGS to Breeze OWLQN or breeze.proximal.NonlinearMinimizer, you can solve the ANN loss with L1 regularization, which will yield elastic-net-style sparse solutions... using that, you can clean up edges which have 0.0 as weight...
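As a minimal sketch of why L1 regularization yields pruneable zero-weight edges (plain Scala, not the actual Breeze OWLQN/NonlinearMinimizer API; the object and function names here are my own, purely illustrative): proximal gradient descent on a least-squares loss with an L1 penalty drives uninformative weights to exactly 0.0, the same qualitative effect one would expect from solving the ANN loss with L1.

```scala
object L1SparsityDemo {
  // Soft-thresholding: the proximal operator of lambda * ||w||_1
  def softThreshold(x: Double, lambda: Double): Double =
    if (math.abs(x) <= lambda) 0.0 else x - math.signum(x) * lambda

  // Proximal gradient descent (ISTA) on 0.5*||Xw - y||^2 + lambda*||w||_1
  def fit(xs: Array[Array[Double]], ys: Array[Double],
          lambda: Double, step: Double, iters: Int): Array[Double] = {
    var w = Array.fill(xs(0).length)(0.0)
    for (_ <- 1 to iters) {
      // Gradient of the smooth least-squares part
      val grad = Array.fill(w.length)(0.0)
      for ((x, y) <- xs.zip(ys)) {
        val err = x.zip(w).map { case (xi, wi) => xi * wi }.sum - y
        for (j <- w.indices) grad(j) += err * x(j)
      }
      // Gradient step, then the L1 proximal (soft-threshold) step
      w = w.indices.map(j => softThreshold(w(j) - step * grad(j), step * lambda)).toArray
    }
    w
  }

  def main(args: Array[String]): Unit = {
    // Toy data: y = 2 * x1; the second feature is pure noise
    val xs = Array(Array(1.0, 0.1), Array(2.0, -0.1), Array(3.0, 0.05), Array(4.0, -0.05))
    val ys = Array(2.0, 4.0, 6.0, 8.0)
    val w = fit(xs, ys, lambda = 0.1, step = 0.01, iters = 2000)
    // The noise feature's weight lands at exactly 0.0, so its edge can be pruned
    println(w.mkString("w = [", ", ", "]"))
  }
}
```

With an L2 penalty the noise weight would only shrink toward zero; the L1 proximal step zeroes it out exactly, which is what makes pruning by `weight == 0.0` meaningful.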
BTW thanks for pointing out the typos, I've included them in my MLP cleanup PR.

On Mon, Sep 7, 2015 at 7:34 PM, Feynman Liang <firstname.lastname@example.org> wrote:

Unfortunately, not yet... Deep learning support (autoencoders, RBMs) is on the roadmap for 1.6, though, and there is a Spark package for dropout-regularized logistic regression.

On Mon, Sep 7, 2015 at 3:15 PM, Ruslan Dautkhanov <email@example.com> wrote:

Thanks!

It does not look like Spark ANN supports dropout/dropconnect or any other techniques that help avoid overfitting yet?

p.s. There is a small copy-paste typo in
https://github.com/apache/spark/blob/master/mllib/src/main/scala/org/apache/spark/ml/ann/BreezeUtil.scala#L43
should read B&C :)
Ruslan Dautkhanov

On Mon, Sep 7, 2015 at 12:47 PM, Feynman Liang <firstname.lastname@example.org> wrote:

On Mon, Sep 7, 2015 at 11:24 AM, Nick Pentreath <email@example.com> wrote:

Haven't checked the actual code, but that doc says "MLPC employs backpropagation for learning the model..."?
Sent from Mailbox
On Mon, Sep 7, 2015 at 8:18 PM, Ruslan Dautkhanov <firstname.lastname@example.org> wrote:

http://people.apache.org/~pwendell/spark-releases/latest/ml-ann.html

The implementation seems to be missing backpropagation?
Was there a good reason to omit BP?
What are the drawbacks of a pure feedforward-only ANN?

Thanks!