lucene-general mailing list archives

From ayyanar <>
Subject RE: Tokenizer Question
Date Mon, 05 Jan 2009 20:07:44 GMT

My objective is to retain the entire input stream as a single token, as a
KeywordTokenizer does, and at the same time split the input on whitespace and
keep those pieces as separate tokens, as a WhitespaceTokenizer does.
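A minimal sketch of the token set such a combined tokenizer would produce, written as plain Java rather than against Lucene's TokenStream API (the class and method names here are illustrative, not part of Lucene): emit the whole trimmed input first, then its whitespace-split parts.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Hypothetical helper combining KeywordTokenizer-style and
// WhitespaceTokenizer-style output for the same input.
public class CombinedTokenizerSketch {

    public static List<String> tokenize(String input) {
        List<String> tokens = new ArrayList<>();
        String trimmed = input.trim();
        if (trimmed.isEmpty()) {
            return tokens;
        }
        // The keyword-style token: the full input as one term.
        tokens.add(trimmed);
        // The whitespace-style tokens: split on runs of whitespace.
        String[] parts = trimmed.split("\\s+");
        if (parts.length > 1) {
            tokens.addAll(Arrays.asList(parts));
        }
        return tokens;
    }

    public static void main(String[] args) {
        // Prints the combined token list for a multi-word input.
        System.out.println(tokenize("New York City"));
    }
}
```

In a real Lucene TokenStream implementation you would likely give the split tokens a position increment of 0 relative to the keyword token, so phrase queries still behave sensibly; that detail is omitted from this sketch.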
