I'm not sure about dropout, but if you change the solver from Breeze LBFGS to Breeze OWLQN or breeze.proximal.NonlinearMinimizer, you can solve the ANN loss with L1 regularization, which will yield elastic-net-style sparse solutions. Using that, you can clean up edges whose weight is 0.0.
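
To make that concrete, here is a minimal sketch of the OWLQN piece at the Breeze level. A toy least-squares loss stands in for the actual ANN objective, and the lambda value is arbitrary, so read it as the shape of the API rather than the real Spark wiring:

    import breeze.linalg.{DenseMatrix, DenseVector}
    import breeze.optimize.{DiffFunction, OWLQN}

    // Toy least-squares loss standing in for the ANN loss:
    //   f(w) = 0.5 * ||A*w - b||^2,  gradient = A.t * (A*w - b)
    val A = DenseMatrix((1.0, 2.0), (3.0, 4.0), (5.0, 6.0))
    val b = DenseVector(1.0, 2.0, 3.0)

    val loss = new DiffFunction[DenseVector[Double]] {
      def calculate(w: DenseVector[Double]): (Double, DenseVector[Double]) = {
        val r = A * w - b
        (0.5 * (r dot r), A.t * r)
      }
    }

    // OWLQN is LBFGS plus an L1 penalty (lambda = 0.1 here, chosen arbitrarily).
    // The L1 term drives some weights exactly to 0.0, which is what makes
    // pruning zero-weight edges possible afterwards.
    val owlqn = new OWLQN[Int, DenseVector[Double]](100, 10, 0.1)
    val wSparse = owlqn.minimize(loss, DenseVector.zeros[Double](2))
    println(s"sparse solution: $wSparse")

(breeze.proximal.NonlinearMinimizer follows the same idea, taking the penalty as a proximal operator instead, if I remember its API right.)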

On Sep 7, 2015 7:35 PM, "Feynman Liang" <fliang@databricks.com> wrote:
BTW, thanks for pointing out the typos; I've included fixes for them in my MLP cleanup PR.

On Mon, Sep 7, 2015 at 7:34 PM, Feynman Liang <fliang@databricks.com> wrote:
Unfortunately, not yet... Deep learning support (autoencoders, RBMs) is on the roadmap for 1.6 though, and there is a Spark package for dropout-regularized logistic regression.


On Mon, Sep 7, 2015 at 3:15 PM, Ruslan Dautkhanov <dautkhanov@gmail.com> wrote:
Thanks!

It doesn't look like Spark ANN supports dropout/DropConnect or any other techniques that help avoid overfitting yet?


--
Ruslan Dautkhanov

On Mon, Sep 7, 2015 at 12:47 PM, Feynman Liang <fliang@databricks.com> wrote:
Backprop is used to compute the gradient here, and the loss is then minimized by SGD or LBFGS here.
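
For completeness, the user-facing entry point to all of that looks roughly like this (a minimal sketch against the 1.5 spark.ml API; `train` is an assumed DataFrame with "features" and "label" columns, and the layer sizes are made up):

    import org.apache.spark.ml.classification.MultilayerPerceptronClassifier

    // Layer sizes: input features, two hidden layers, output classes (arbitrary here).
    val layers = Array[Int](4, 5, 4, 3)

    val trainer = new MultilayerPerceptronClassifier()
      .setLayers(layers)
      .setBlockSize(128) // rows are stacked into blocks for faster BLAS-level math
      .setSeed(1234L)
      .setMaxIter(100)   // iterations of the underlying optimizer

    // `train` is assumed: a DataFrame with "features" (Vector) and "label" columns.
    val model = trainer.fit(train)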

On Mon, Sep 7, 2015 at 11:24 AM, Nick Pentreath <nick.pentreath@gmail.com> wrote:
Haven't checked the actual code, but that doc says "MLPC employes backpropagation for learning the model..."?

On Mon, Sep 7, 2015 at 8:18 PM, Ruslan Dautkhanov <dautkhanov@gmail.com> wrote:

http://people.apache.org/~pwendell/spark-releases/latest/ml-ann.html 

The implementation seems to be missing backpropagation?
Was there a good reason to omit BP?
What are the drawbacks of a pure feedforward-only ANN?

Thanks!


--
Ruslan Dautkhanov