spark-user mailing list archives

From Steve Rowe <sar...@gmail.com>
Subject Re: Writing custom Transformers and Estimators like Tokenizer in spark ML
Date Wed, 27 Jul 2016 18:35:26 GMT
You can see the source for my transformer, a configurable bridge to Lucene analysis components,
in my company Lucidworks' spark-solr project: <https://github.com/lucidworks/spark-solr/blob/master/src/main/scala/com/lucidworks/spark/ml/feature/LuceneTextAnalyzerTransformer.scala>.

Here’s a blog post I wrote about using this transformer, as well as about using the underlying
analysis component in Spark outside of an ML context: <https://lucidworks.com/blog/2016/04/13/spark-solr-lucenetextanalyzer/>.
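To sketch the general pattern (this is an illustrative example, not code from spark-solr; the
class name SimpleTokenizer and the whitespace-splitting logic are made up): the simplest route is
to extend UnaryTransformer, the same base class Spark's built-in Tokenizer uses, and implement
createTransformFunc, validateInputType, and outputDataType:

import org.apache.spark.ml.UnaryTransformer
import org.apache.spark.ml.param.ParamMap
import org.apache.spark.ml.util.Identifiable
import org.apache.spark.sql.types.{ArrayType, DataType, StringType}

// Illustrative custom Transformer modeled on Spark's Tokenizer:
// UnaryTransformer maps a single input column to a single output column.
class SimpleTokenizer(override val uid: String)
  extends UnaryTransformer[String, Seq[String], SimpleTokenizer] {

  def this() = this(Identifiable.randomUID("simpleTok"))

  // Per-row transformation applied to the input column.
  override protected def createTransformFunc: String => Seq[String] = {
    _.toLowerCase.split("\\s+").toSeq
  }

  // Reject input columns that are not strings.
  override protected def validateInputType(inputType: DataType): Unit = {
    require(inputType == StringType, s"Input type must be StringType but got $inputType.")
  }

  // Output column type: array<string>.
  override protected def outputDataType: DataType = new ArrayType(StringType, true)

  override def copy(extra: ParamMap): SimpleTokenizer = defaultCopy(extra)
}

// Usage (assumes a DataFrame df with a string column "text"):
//   val tok = new SimpleTokenizer().setInputCol("text").setOutputCol("words")
//   val tokenized = tok.transform(df)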

--
Steve
www.lucidworks.com

> On Jul 27, 2016, at 1:31 PM, janardhan shetty <janardhanp22@gmail.com> wrote:
> 
> 1.  Any links or blogs to develop custom transformers ? ex: Tokenizer
> 
> 2. Any links or blogs to develop custom estimators ? ex: any ml algorithm


