Hi,
It is my understanding that little research has been done so far on distributed computation (without
access to shared memory) for RNNs. I also look forward to contributing in this respect.
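
For what it's worth, the pattern Spark lends itself to is synchronous data-parallel training: each worker computes a gradient on its own data shard, and a driver averages the results, with no shared memory between workers. The following is a minimal illustrative sketch of that idea in plain Python (not actual Spark/MLlib API; the loss, data, and function names are hypothetical, chosen only to show the map/reduce structure):

```python
# Hypothetical sketch, NOT Spark code: synchronous data-parallel
# parameter averaging. Each "worker" holds a data shard and computes
# a local gradient; the "driver" averages them. Workers never share
# memory, only the averaged update.

def local_gradient(w, shard):
    # gradient of mean squared error for the model y = w * x on this shard
    return sum(2 * (w * x - y) * x for x, y in shard) / len(shard)

def train_step(w, shards, lr=0.01):
    grads = [local_gradient(w, s) for s in shards]   # map: one per worker
    avg = sum(grads) / len(grads)                    # reduce: on the driver
    return w - lr * avg

# toy data for y = 3x, split across two simulated workers
shards = [[(1.0, 3.0), (2.0, 6.0)], [(3.0, 9.0), (4.0, 12.0)]]
w = 0.0
for _ in range(200):
    w = train_step(w, shards)
print(round(w, 2))  # converges toward 3.0
```

In Spark the map step would run over RDD partitions and the reduce step via an aggregate on the driver; the hard part for RNN/LSTM is that backpropagation through time is sequential within each sequence, so only the data dimension parallelizes this easily.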
> El 03/11/2015, a las 16:00, Disha Shrivastava <dishu.905@gmail.com> escribió:
>
> I would love to work on this and would appreciate ideas on how it can be done, or suggestions
> for papers as a starting point. Also, I wanted to know if Spark would be an ideal platform for
> a distributed implementation of RNN/LSTM.
>
>> On Mon, Nov 2, 2015 at 10:52 AM, Sasaki Kai <lewuathe@me.com> wrote:
>> Hi, Disha
>>
>> There seems to be no JIRA ticket on RNN/LSTM directly, but there were several tickets about
>> other types of networks related to deep learning.
>>
>> Stacked Auto Encoder
>> https://issues.apache.org/jira/browse/SPARK-2623
>> CNN
>> https://issues.apache.org/jira/browse/SPARK-9129
>> https://issues.apache.org/jira/browse/SPARK-9273
>>
>> Roadmap of MLlib deep learning
>> https://issues.apache.org/jira/browse/SPARK-5575
>>
>> I think it may be good to join the discussion on SPARK-5575.
>> Best
>>
>> Kai Sasaki
>>
>>
>>> On Nov 2, 2015, at 1:59 PM, Disha Shrivastava <dishu.905@gmail.com> wrote:
>>>
>>> Hi,
>>>
>>> I wanted to know if someone is working on implementing RNN/LSTM in Spark or has
>>> already done so. I would also like to contribute to it and get some guidance on how to go
>>> about it.
>>>
>>> Thanks and Regards
>>> Disha
>>> Masters Student, IIT Delhi
>