lucene-dev mailing list archives

From "Vivek Narang (JIRA)" <>
Subject [jira] [Commented] (SOLR-10317) Solr Nightly Benchmarks
Date Tue, 06 Jun 2017 02:24:18 GMT


Vivek Narang commented on SOLR-10317:

Hi [~ichattopadhyaya], I think there is some confusion, so let me explain.

- I am not building a new framework; I am extending the benchmarks framework that you
created [], as mentioned in the proposal.
- The reason I am extending your framework is that it already has many flexible,
ready-to-use resources and is written in a single language. I am more comfortable working
in one language than in two together.
- For the remaining items, I am already in the process of pulling the required resources
from Shalin's work into the framework you created.
- As for tagging/adding significant events: the current logic in Shalin's code base for
listing significant events is hard-coded []. The closest I have come to making it dynamic
and self-contained is showing the relevant commit message with each metric point []
(hover over any point to see the commit message).
- I will try to add a feature that lets you view all the graphs together.
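For illustration, associating each metric point with its commit metadata (so a charting front end can show the commit message on hover) could be sketched as below. The class and field names here are assumptions for the sketch, not the actual SolrNightlyBenchmarks code:

```java
// Minimal sketch: a benchmark data point that carries commit metadata,
// so a chart's tooltip can display the commit message on hover.
// All names are illustrative, not from the real codebase.
public class MetricPoint {
    private final String commitHash;
    private final String commitMessage;
    private final long timestampMillis;
    private final double value; // e.g. indexing throughput in docs/sec

    public MetricPoint(String commitHash, String commitMessage,
                       long timestampMillis, double value) {
        this.commitHash = commitHash;
        this.commitMessage = commitMessage;
        this.timestampMillis = timestampMillis;
        this.value = value;
    }

    // Serialize to a JSON string the charting front end can consume.
    public String toJson() {
        return "{\"commit\":\"" + commitHash
             + "\",\"message\":\"" + commitMessage
             + "\",\"time\":" + timestampMillis
             + ",\"value\":" + value + "}";
    }
}
```

Each nightly run would emit one such point per metric, and the front end reads the `message` field for the hover tooltip.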

Please access the latest codebase [].
As agreed in the proposal, the code for the benchmarks suite is under dev-tools,
on the SolrNightlyBenchmarks branch.


> Solr Nightly Benchmarks
> -----------------------
>                 Key: SOLR-10317
>                 URL:
>             Project: Solr
>          Issue Type: Task
>            Reporter: Ishan Chattopadhyaya
>              Labels: gsoc2017, mentor
>         Attachments: changes-lucene-20160907.json, changes-solr-20160907.json, managed-schema,
Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf,
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be found here,
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr nodes, both in
SolrCloud and standalone mode, and record timing information of various operations like indexing,
querying, faceting, grouping, replication etc.
> # It should be possible to run them either as an independent suite or as a Jenkins job,
and we should be able to report timings as graphs (Jenkins has some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it never goes
out of date.
> There is some prior work / discussion:
> # (Shalin)
> # (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # (Mike McCandless)
> # (Tim Potter)
> Some of the frameworks above support building, starting, indexing/querying, and stopping
Solr. However, the benchmarks they run are very limited. Any of them could serve as a
starting point, or a new framework could be used instead. The motivation is to cover every
piece of Solr functionality with a corresponding benchmark that runs every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure [~shalinmangar]
and [] would help here.
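The timing information the issue description asks for (indexing, querying, faceting, and so on) could be recorded with a small harness along these lines. This is a sketch under assumptions; the class name and the operations passed to it are placeholders, not an existing API:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of a harness that records wall-clock durations of named
// benchmark operations, suitable for feeding a nightly timing graph.
public class BenchmarkTimer {
    // Preserves insertion order so results print in run order.
    private final Map<String, Long> timingsMillis = new LinkedHashMap<>();

    // Runs the operation and records its elapsed time under the given name.
    public void time(String name, Runnable operation) {
        long start = System.nanoTime();
        operation.run();
        timingsMillis.put(name, (System.nanoTime() - start) / 1_000_000);
    }

    public Map<String, Long> results() {
        return timingsMillis;
    }
}
```

A nightly job would wrap each operation, e.g. `timer.time("indexing", () -> indexDocs())`, then append `timer.results()` to the data series behind the graphs.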

This message was sent by Atlassian JIRA

