ignite-dev mailing list archives

From Alexey Zinoviev <zaleslaw....@gmail.com>
Subject Re: ML stable and performance
Date Fri, 13 Sep 2019 19:12:48 GMT
The reason is that over the last year there have been no significant releases of
Ignite between 2.7 and 2.8, only minor releases with a long story of renaming.
The other ML guys and I are ready to prepare the ML module within 1-2 months for
2.8, or for a minor release 2.7.7 = 2.7.6 + updated ML + new bug fixes.

Let's discuss it in a separate thread next week.



On Fri, Sep 13, 2019 at 9:55 PM Denis Magda <dmagda@apache.org> wrote:

> Alexey, I'm wondering,
>
> Are there any dependencies on Ignite Core that force us to put off releasing
> the ML changes until 2.8? I know that you do not support the idea of ML as
> a separate Ignite module, but this concept would allow us to release ML as
> frequently as we want without being blocked by Ignite core releases.
>
>
> -
> Denis
>
>
> On Fri, Sep 13, 2019 at 11:45 AM Alexey Zinoviev <zaleslaw.sin@gmail.com>
> wrote:
>
> > I can answer as one of the developers of the ML module.
> > ML is currently available in version 2.7.5; it supports a lot of
> > algorithms and can be used in production, but the API is not stable and
> > will change in 2.8.
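
To make that concrete, here is a minimal training sketch in the spirit of the
examples bundled with Ignite, assuming the 2.8-style Vectorizer API (2.7.x took
separate feature and label extractor lambdas instead, which is part of the API
churn mentioned above); the cache name and toy data are purely illustrative:

import java.util.UUID;

import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.cache.affinity.rendezvous.RendezvousAffinityFunction;
import org.apache.ignite.configuration.CacheConfiguration;
import org.apache.ignite.ml.clustering.kmeans.KMeansModel;
import org.apache.ignite.ml.clustering.kmeans.KMeansTrainer;
import org.apache.ignite.ml.dataset.feature.extractor.Vectorizer;
import org.apache.ignite.ml.dataset.feature.extractor.impl.DummyVectorizer;
import org.apache.ignite.ml.math.primitives.vector.Vector;
import org.apache.ignite.ml.math.primitives.vector.VectorUtils;

public class KMeansSketch {
    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start()) {
            // Training data lives in a partitioned cache, so fit() is executed
            // on every node that holds a partition (the "distributed" part).
            CacheConfiguration<Integer, Vector> cfg =
                new CacheConfiguration<>("ML_DATA_" + UUID.randomUUID());
            cfg.setAffinity(new RendezvousAffinityFunction(false, 10));
            IgniteCache<Integer, Vector> data = ignite.createCache(cfg);

            // Toy rows: label in the first coordinate, two features after it.
            data.put(1, VectorUtils.of(0, 1.0, 1.1));
            data.put(2, VectorUtils.of(0, 0.9, 1.0));
            data.put(3, VectorUtils.of(1, 10.0, 10.2));
            data.put(4, VectorUtils.of(1, 10.1, 9.9));

            KMeansTrainer trainer = new KMeansTrainer().withAmountOfClusters(2);

            // 2.8-style call: a Vectorizer tells the trainer how to split each
            // cache value into features and label.
            KMeansModel mdl = trainer.fit(
                ignite,
                data,
                new DummyVectorizer<Integer>().labeled(Vectorizer.LabelCoordinate.FIRST));

            System.out.println("Cluster of (0.95, 1.05): "
                + mdl.predict(VectorUtils.of(0.95, 1.05)));
        }
    }
}

KMeans ignores the label, but the Vectorizer still has to say which coordinate
holds it; other trainers such as DecisionTreeClassificationTrainer follow the
same fit() pattern.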
> >
> > The ML module will be stable as of the next release, 2.8. We also have no
> > performance report to compare against, for example, Spark ML.
> > Based on my exploration, the performance in terms of Big O notation is
> > the same as in Spark ML (in real numbers, Ignite ML is faster than
> > Spark ML due to Ignite's in-memory nature and so on).
> >
> > Since 2.8 it will have good integration with TensorFlow, Spark ML, and
> > XGBoost via model inference.
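
As a rough illustration of what the XGBoost side of that integration looks
like: the sketch below is based on my memory of the inference examples, so the
class names (ModelReader, FileSystemModelReader, XGModelParser, Model,
IgniteDistributedModelBuilder), the constructor arguments, and the model path
should all be treated as assumptions to verify against the 2.8 release:

import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.Future;

import org.apache.ignite.Ignite;
import org.apache.ignite.Ignition;
import org.apache.ignite.ml.inference.Model;
import org.apache.ignite.ml.inference.builder.IgniteDistributedModelBuilder;
import org.apache.ignite.ml.inference.reader.FileSystemModelReader;
import org.apache.ignite.ml.inference.reader.ModelReader;
import org.apache.ignite.ml.math.primitives.vector.NamedVector;
import org.apache.ignite.ml.math.primitives.vector.VectorUtils;
import org.apache.ignite.ml.xgboost.parser.XGModelParser;

public class XgbInferenceSketch {
    public static void main(String[] args) throws Exception {
        try (Ignite ignite = Ignition.start()) {
            // A model previously dumped from XGBoost; the path is hypothetical.
            ModelReader reader = new FileSystemModelReader("/path/to/xgboost-model.txt");
            XGModelParser parser = new XGModelParser();

            // Spread inference over the cluster; the instance count and queue
            // size below are illustrative, not recommendations.
            try (Model<NamedVector, Future<Double>> mdl =
                     new IgniteDistributedModelBuilder(ignite, 4, 8).build(reader, parser)) {

                Map<String, Double> row = new HashMap<>();
                row.put("f_0", 5.1);
                row.put("f_1", 3.5);

                System.out.println("Prediction: " + mdl.predict(VectorUtils.of(row)).get());
            }
        }
    }
}

As far as I recall, TensorFlow models go through the same reader/parser/builder
pipeline with their own parser, while Spark ML models are parsed directly into
native Ignite ML models, but please double-check that against the release notes.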
> >
> > As a user you have no ability to run, for example, scikit-learn or R
> > packages in distributed mode over Ignite, but you can run TensorFlow
> > using Ignite as a distributed back-end instead of Horovod.
> >
> > If you have any questions, please let me know
> >
> >
> >
> > On Fri, Sep 13, 2019 at 9:28 PM Denis Magda <dmagda@apache.org> wrote:
> >
> >> David,
> >>
> >> Let me loop in the Ignite dev list, which has Ignite ML experts subscribed.
> >> Could you please share more details about your performance testing
> >> and your objectives for Ignite ML overall?
> >>
> >> The module is ready for production, and we're ready to help clear any
> >> roadblocks.
> >>
> >> -
> >> Denis
> >>
> >>
> >> On Fri, Sep 6, 2019 at 4:50 AM David Williams <leeon2013@gmail.com>
> >> wrote:
> >>
> >> > I read online that Python is 25 times slower than Java for ML at
> >> > runtime, but I don't know whether that statement is true or not. I need
> >> > insiders' opinion. Which other ML packages are the best options for Ignite?
> >> >
> >> > On Fri, Sep 6, 2019 at 7:28 PM Mikael <mikael-aronsson@telia.com>
> >> wrote:
> >> >
> >> >> Hi!
> >> >>
> >> >> I have never used it myself, but it has been there for a long time and I
> >> >> would expect it to be stable. Yes, it will run distributed; I can't
> >> >> say anything about performance as I have never used it.
> >> >>
> >> >> You will find a lot of more information at:
> >> >>
> >> >> https://apacheignite.readme.io/docs/machine-learning
> >> >>
> >> >> Mikael
> >> >>
> >> >>
> >> >> On 2019-09-06 at 11:50, David Williams wrote:
> >> >> >
> >> >> >
> >> >> > I am evaluating ML frameworks for the Java platform. I know Ignite has
> >> >> > an ML package, but I would like to know its stability and performance
> >> >> > for production. Can Ignite ML code run in a distributed way?
> >> >> >
> >> >> > Besides its own ML package, which ML packages are the best options for
> >> >> > Ignite?
> >> >>
> >> >
> >>
> >
>
