spark-user mailing list archives

From "Ulanov, Alexander" <>
Subject RE: Spark ANN
Date Wed, 09 Sep 2015 00:46:51 GMT
That is an option too. Implementing convolutions with FFTs should be considered as well.
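[Editor's note: a minimal sketch of the FFT approach Alexander mentions. By the convolution theorem, circular convolution in the time domain is pointwise multiplication in the frequency domain. A naive O(n^2) DFT is used here to stay dependency-free; a real implementation would use a proper FFT (e.g. breeze.signal.fourierTr) for O(n log n). All names are illustrative, not from the Spark codebase.]

```scala
object FftConv {
  case class Complex(re: Double, im: Double) {
    def +(o: Complex) = Complex(re + o.re, im + o.im)
    def *(o: Complex) = Complex(re * o.re - im * o.im, re * o.im + im * o.re)
  }

  // Naive discrete Fourier transform; sign = -1 forward, +1 inverse (unscaled).
  def dft(x: Array[Complex], sign: Int): Array[Complex] = {
    val n = x.length
    Array.tabulate(n) { k =>
      var acc = Complex(0, 0)
      for (t <- 0 until n) {
        val ang = sign * 2 * math.Pi * k * t / n
        acc = acc + x(t) * Complex(math.cos(ang), math.sin(ang))
      }
      acc
    }
  }

  // Circular convolution of two equal-length real signals via the
  // convolution theorem: IDFT(DFT(a) .* DFT(b)).
  def circularConvolve(a: Array[Double], b: Array[Double]): Array[Double] = {
    val n = a.length
    val fa = dft(a.map(Complex(_, 0)), -1)
    val fb = dft(b.map(Complex(_, 0)), -1)
    val prod = fa.zip(fb).map { case (x, y) => x * y }
    dft(prod, 1).map(_.re / n) // the inverse transform carries the 1/n scaling
  }
}
```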

From: Feynman Liang []
Sent: Tuesday, September 08, 2015 12:07 PM
To: Ulanov, Alexander
Cc: Ruslan Dautkhanov; Nick Pentreath; user;
Subject: Re: Spark ANN

Just wondering, why do we need tensors? Is the implementation of convnets using im2col (see
here<>) insufficient?
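[Editor's note: for readers unfamiliar with the im2col lowering Feynman refers to, here is a minimal sketch: each k x k patch of the input is unrolled into a column, so convolution reduces to a single matrix multiply of the flattened kernel against the patch matrix. Illustrative code, not from any Spark implementation.]

```scala
object Im2Col {
  // image: h x w, row-major; returns a (kh*kw) x (outH*outW) patch matrix,
  // "valid" mode, stride 1: column p holds the unrolled patch at position p.
  def im2col(image: Array[Array[Double]], kh: Int, kw: Int): Array[Array[Double]] = {
    val h = image.length; val w = image(0).length
    val outH = h - kh + 1; val outW = w - kw + 1
    val cols = Array.ofDim[Double](kh * kw, outH * outW)
    for (i <- 0 until outH; j <- 0 until outW; di <- 0 until kh; dj <- 0 until kw)
      cols(di * kw + dj)(i * outW + j) = image(i + di)(j + dj)
    cols
  }

  // Convolution (as cross-correlation) expressed as kernel-vector
  // dotted against each column of the patch matrix.
  def conv(image: Array[Array[Double]], kernel: Array[Array[Double]]): Array[Double] = {
    val k = kernel.flatten
    im2col(image, kernel.length, kernel(0).length).transpose.map(col =>
      col.zip(k).map { case (a, b) => a * b }.sum)
  }
}
```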

On Tue, Sep 8, 2015 at 11:55 AM, Ulanov, Alexander <<>> wrote:
Ruslan, thanks for including me in the discussion!

Dropout and other features such as the Autoencoder have been implemented, but not merged yet,
in order to leave room for improving the internal Layer API. For example, there is ongoing work
on a convolutional layer that consumes/outputs 2D arrays. We’ll probably need to change the Layer’s
input/output type to tensors. This will affect dropout, which will need some refactoring
to handle tensors too. Also, all new components should have a public ML pipeline interface.
There is an umbrella issue for deep learning in Spark
which includes various features of Autoencoder, in particular
You are very welcome to join and contribute since there is a lot of work to be done.
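[Editor's note: a hypothetical sketch, not the actual Spark internal API, of the Layer refactoring Alexander describes: a Layer whose eval consumes and produces a minimal tensor (shape plus flat data), so vector layers and 2-D convolutional layers can share one interface. The names `Tensor`, `Layer`, and `AffineLayer` are all assumptions for illustration.]

```scala
// Minimal tensor: a shape and row-major flat storage.
case class Tensor(shape: Array[Int], data: Array[Double]) {
  require(shape.product == data.length, "shape must match data length")
}

// A layer contract over tensors rather than fixed-rank vectors.
trait Layer {
  def eval(input: Tensor): Tensor
}

// A dense affine layer on rank-1 tensors, as one example instance.
class AffineLayer(weights: Array[Array[Double]], bias: Array[Double]) extends Layer {
  def eval(input: Tensor): Tensor = {
    val out = weights.zip(bias).map { case (row, b) =>
      row.zip(input.data).map { case (w, x) => w * x }.sum + b
    }
    Tensor(Array(out.length), out)
  }
}
```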

Best regards, Alexander
From: Ruslan Dautkhanov [<>]
Sent: Monday, September 07, 2015 10:09 PM
To: Feynman Liang
Cc: Nick Pentreath; user;<>
Subject: Re: Spark ANN

Found a dropout commit from avulanov:

It probably hasn't made its way into MLlib (yet?).

Ruslan Dautkhanov

On Mon, Sep 7, 2015 at 8:34 PM, Feynman Liang <<>> wrote:
Unfortunately, not yet... Deep learning support (autoencoders, RBMs) is on the roadmap for
1.6<> though, and there is a spark
package<> for dropout
regularized logistic regression.

On Mon, Sep 7, 2015 at 3:15 PM, Ruslan Dautkhanov <<>> wrote:

It does not look like Spark ANN yet supports dropout/dropconnect or any other techniques that
help avoid overfitting?
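[Editor's note: for context, a minimal sketch of the inverted-dropout regularizer being asked about: during training each activation is zeroed with probability p and the survivors are scaled by 1/(1-p), so no rescaling is needed at test time. Illustrative only; avulanov's actual commit may differ.]

```scala
object Dropout {
  // Inverted dropout on a layer's activations; valid for 0 <= p < 1.
  def apply(activations: Array[Double], p: Double, rng: scala.util.Random): Array[Double] =
    activations.map(a => if (rng.nextDouble() < p) 0.0 else a / (1.0 - p))
}
```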

ps. There is a small copy-paste typo in
should read B&C :)

Ruslan Dautkhanov

On Mon, Sep 7, 2015 at 12:47 PM, Feynman Liang <<>> wrote:
Backprop is used to compute the gradient here<>,
which is then optimized by SGD or LBFGS here<>
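[Editor's note: a toy sketch of the training loop Feynman describes: backpropagation computes the gradient of the loss, and plain SGD (one of the two optimizers he links) applies it. A single-hidden-layer MLP with sigmoid units and squared-error loss; a stand-in for illustration, not mllib's actual ANN code.]

```scala
object BackpropSgd {
  def sigmoid(x: Double): Double = 1.0 / (1.0 + math.exp(-x))

  // One SGD step on one example; mutates (w1, w2) in place and returns the loss.
  // w1: hidden x input weights, w2: output weights, lr: learning rate.
  def step(w1: Array[Array[Double]], w2: Array[Double],
           x: Array[Double], y: Double, lr: Double): Double = {
    // Forward pass.
    val h = w1.map(row => sigmoid(row.zip(x).map { case (w, xi) => w * xi }.sum))
    val out = sigmoid(h.zip(w2).map { case (hi, wi) => hi * wi }.sum)
    val loss = 0.5 * (out - y) * (out - y)
    // Backward pass: chain rule through the output sigmoid, then the hidden layer.
    val dOut = (out - y) * out * (1 - out)
    for (j <- w2.indices) {
      val dHidden = dOut * w2(j) * h(j) * (1 - h(j)) // uses w2(j) before its update
      w2(j) -= lr * dOut * h(j)
      for (i <- x.indices) w1(j)(i) -= lr * dHidden * x(i)
    }
    loss
  }
}
```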

On Mon, Sep 7, 2015 at 11:24 AM, Nick Pentreath <<>> wrote:
Haven't checked the actual code, but that doc says "MLPC employs backpropagation for learning
the model..."?

Sent from Mailbox<>

On Mon, Sep 7, 2015 at 8:18 PM, Ruslan Dautkhanov <<>> wrote:

The implementation seems to be missing backpropagation?
Was there a good reason to omit BP?
What are the drawbacks of a pure feedforward-only ANN?


Ruslan Dautkhanov
