lucene-solr-user mailing list archives

From Jan Høydahl / Cominvent <>
Subject Re: Basic conceptual questions about solr
Date Fri, 20 Aug 2010 00:02:02 GMT

You can place Solr wherever you want, but if your data is very large, you'd want dedicated hardware for it.

Have a look at DIH (the DataImportHandler). It can both crawl a file share periodically,
indexing only files changed since a timestamp (which can be e.g. NOW-1HOUR), and extract
the resulting text using Tika.
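For illustration, a minimal data-config.xml along those lines might look like this (the base directory, file pattern, and field names are assumptions you'd adapt to your schema):

```xml
<dataConfig>
  <!-- Binary data source so Tika can read the raw file contents -->
  <dataSource type="BinFileDataSource" />
  <document>
    <!-- Walk the share recursively; only pick up files newer than the last run -->
    <entity name="files" processor="FileListEntityProcessor"
            baseDir="/mnt/fileshare"
            fileName=".*\.(pdf|doc|docx|txt)"
            recursive="true"
            newerThan="${dih.last_index_time}"
            rootEntity="false" dataSource="null">
      <field column="fileAbsolutePath" name="id" />
      <!-- Hand each file to Tika for text extraction -->
      <entity name="tika" processor="TikaEntityProcessor"
              url="${files.fileAbsolutePath}" format="text">
        <field column="text" name="content" />
      </entity>
    </entity>
  </document>
</dataConfig>
```

The import can then be triggered on a schedule (e.g. from cron) by requesting the DIH handler, along the lines of http://localhost:8983/solr/dataimport?command=full-import&clean=false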

However, if you require security, have a look at LCF (the Lucene Connector Framework),
which adds document-level security but may lack as powerful a file crawler...

You choose how the results are presented back to the user, but normally it's a traditional
web page with links that, when clicked, point to that resource in some way.

Wrt. users' local content - what is that? It sounds like you want to hook into a local search
on the laptop, like Google Desktop does. To do that you'd have to develop a local service sitting
in the system tray on each computer, exposing some API on some port. Then, when the user searches
your search portal, the GUI uses some AJAX to reach out to the local search service and merges
those results in...
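The merge step of that idea could be sketched like this (the function name, doc shape, and de-duplication key are hypothetical; the two input lists would come from AJAX calls to the Solr portal and to the local tray service):

```javascript
// Merge portal (Solr) hits with hits from a hypothetical local search
// service, de-duplicating by document id; portal results rank first.
function mergeResults(portalDocs, localDocs) {
  const seen = new Set();
  const merged = [];
  for (const doc of [...portalDocs, ...localDocs]) {
    if (!seen.has(doc.id)) {
      seen.add(doc.id);
      merged.push(doc);
    }
  }
  return merged;
}
```

In practice you'd fire both requests in parallel and treat the local service as optional, since not every machine will be running it.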

Jan Høydahl, search solution architect
Cominvent AS -
Training in Europe -

On 19. aug. 2010, at 21.31, Shaun McArthur wrote:

> I'm looking for a Google Search Appliance look-alike. We have a file share with thousands
> of documents in a hierarchy that makes it ridiculously difficult to locate documents.
> Here are some basic questions:
> Is the idea to install Solr on separate hardware and have it crawl the file system?
> Can crawls be scheduled?
> If installed on a remote server, can it be configured to insert users' local content
> in search results?
> I assumed that once it's functioning, users surf to a web page for results?
> Appreciate any input, and I have started to RTFJavadocs :)
> Shaun
> Shaun McArthur
> Dir. Technical Operations
> Autodata Solutions
> Mobile : (226) 268-6458
> Skype: shaun-mcarthur
