spark-user mailing list archives

From Xiangrui Meng <men...@gmail.com>
Subject Re: MLLib beginner question
Date Tue, 30 Dec 2014 00:56:46 GMT
b0c1, did you apply model.predict to a DStream? It would help us understand
your question better if you could post your code. -Xiangrui
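
For reference, applying a trained model inside a DStream transformation generally looks like the sketch below (Scala, Spark 1.x MLlib API). The socket source and the featurization are placeholders standing in for the Kafka input and sentiment features described later in the thread, not code from the original poster. Note that a streaming app blocks in awaitTermination() by design, which is easy to mistake for a hang:

```scala
import org.apache.spark.mllib.classification.NaiveBayesModel
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.streaming.StreamingContext

// `model` is a NaiveBayesModel trained beforehand on the driver.
// The model object is serializable, so it can be referenced inside
// DStream closures and shipped to the executors.
def classify(ssc: StreamingContext, model: NaiveBayesModel): Unit = {
  // Stand-in for the Kafka source mentioned in the thread.
  val lines = ssc.socketTextStream("localhost", 9999)

  val predictions = lines.map { text =>
    // Placeholder featurizer: replace with real feature extraction.
    val features = Vectors.dense(Array(0.0))
    (text, model.predict(features))
  }
  predictions.print()

  ssc.start()            // without start(), no streaming computation runs
  ssc.awaitTermination() // blocks forever; the app will not exit on its own
}
```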

On Tue, Dec 23, 2014 at 11:54 AM, boci <boci.boci@gmail.com> wrote:
> Xiangrui: Hi, I want to use this both with streaming and with a batch job. I
> use Kafka (streaming) and Elasticsearch (batch job) as sources and want to
> calculate a sentiment value from the input text.
> Simon: great, do you have any doc on how I can embed it into my application
> without using the HTTP interface? (How can I call the service directly?)
>
> b0c1
>
> ----------------------------------------------------------------------------------------------------------------------------------
> Skype: boci13, Hangout: boci.boci@gmail.com
>
> On Tue, Dec 23, 2014 at 1:35 AM, Xiangrui Meng <mengxr@gmail.com> wrote:
>>
>> How big is the dataset you want to use in prediction? -Xiangrui
>>
>> On Mon, Dec 22, 2014 at 1:47 PM, boci <boci.boci@gmail.com> wrote:
>> > Hi!
>> >
>> > I want to try out Spark MLlib in my Spark project, but I ran into a small
>> > problem. I have training data (an external file), but the real data come
>> > from another RDD. How can I do that?
>> > I tried simply using the same SparkContext for both RDDs (first I created
>> > an RDD using sc.textFile() and then called NaiveBayes.train...). After
>> > that I wanted to fetch the real data using the same context and call
>> > predict inside a map. But my application never exits (I think it is stuck
>> > or something). Why doesn't this solution work?
>> >
>> > Thanks
>> >
>> > b0c1
>> >
>> >
>> >
>> > ----------------------------------------------------------------------------------------------------------------------------------
>> > Skype: boci13, Hangout: boci.boci@gmail.com
>
>
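
For completeness, the batch pattern the original question describes (train from an external file, then predict over a second RDD built from the same SparkContext) is normally written like the following sketch (Scala, Spark 1.x MLlib). The file paths, LIBSVM training format, and CSV feature parsing are assumptions for illustration:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.classification.NaiveBayes
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.util.MLUtils

val sc = new SparkContext(new SparkConf().setAppName("nb-example"))

// Train from an external file (assumed LIBSVM format: "label idx:value ...").
val training = MLUtils.loadLibSVMFile(sc, "training.libsvm") // placeholder path
val model = NaiveBayes.train(training, lambda = 1.0)

// Predict over a second RDD created from the same SparkContext.
// The model lives on the driver and is serialized into the map closure.
val realData = sc.textFile("real-data.txt") // placeholder path
val predicted = realData.map { line =>
  val features = Vectors.dense(line.split(',').map(_.toDouble)) // assumes CSV
  (line, model.predict(features))
}

// Transformations are lazy: without an action, nothing actually executes.
predicted.saveAsTextFile("predictions")
sc.stop() // stop the context so the driver JVM can exit cleanly
```

If the application "never exits", the usual suspects are a missing action on the prediction RDD, a missing sc.stop() at the end of the job, or (in the streaming case) the expected blocking of awaitTermination().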

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org

