qpid-dev mailing list archives

From Jiri Daněk (Jira) <j...@apache.org>
Subject [jira] [Commented] (PROTON-220) Create a set of "glass box" tests to quantify the performance of the proton codebase
Date Mon, 27 Apr 2020 08:12:00 GMT

    [ https://issues.apache.org/jira/browse/PROTON-220?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17093161#comment-17093161 ]

Jiri Daněk commented on PROTON-220:
-----------------------------------

I think there are two pieces that are still missing: 1) a microbenchmark that involves the
proactor, and 2) a good place to store previous results for easy and meaningful comparison.
TravisCI logs are not great for that, but they might do in the short term.
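As a starting point for 1), here is a minimal sketch of a proactor microbenchmark, assuming
proton-c's pn_proactor API. It times only interrupt round trips through pn_proactor_wait(),
so it measures event-dispatch overhead rather than real I/O; the iteration count is arbitrary:

#include <proton/proactor.h>
#include <proton/event.h>
#include <stdio.h>
#include <time.h>

int main(void) {
  const int iterations = 100000;          /* arbitrary */
  pn_proactor_t *proactor = pn_proactor();

  struct timespec t0, t1;
  clock_gettime(CLOCK_MONOTONIC, &t0);
  for (int i = 0; i < iterations; i++) {
    pn_proactor_interrupt(proactor);      /* queue a PN_PROACTOR_INTERRUPT */
    pn_event_batch_t *batch = pn_proactor_wait(proactor);
    pn_event_t *e;
    while ((e = pn_event_batch_next(batch))) {
      (void)e;  /* a real benchmark would assert PN_PROACTOR_INTERRUPT here */
    }
    pn_proactor_done(proactor, batch);
  }
  clock_gettime(CLOCK_MONOTONIC, &t1);

  double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
  printf("%d interrupt round trips in %.3fs (%.0f/s)\n",
         iterations, secs, iterations / secs);
  pn_proactor_free(proactor);
  return 0;
}

(Link against the proton-c core and proactor libraries; the exact library names depend on the
proton version.)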

Further ideas for benchmarks:

* http://nowlab.cse.ohio-state.edu/projects/26/ ("Design and Evaluation of Advanced Message Queuing Protocol (AMQP) over InfiniBand", 2008)
* https://github.com/grpc/grpc/tree/6b676440f5a0f2b6c9532bf74476678ca475b3b6/test/cpp/microbenchmarks

> Create a set of "glass box" tests to quantify the performance of the proton codebase
> ------------------------------------------------------------------------------------
>
>                 Key: PROTON-220
>                 URL: https://issues.apache.org/jira/browse/PROTON-220
>             Project: Qpid Proton
>          Issue Type: Test
>          Components: proton-c, proton-j
>            Reporter: Ken Giusti
>            Assignee: Jiri Daněk
>            Priority: Major
>              Labels: perf, testing
>             Fix For: proton-c-future
>
>
> The goal of these tests would be to detect any performance degradation inadvertently
> introduced during development. These tests would not be intended to provide any metrics
> regarding the "real world" behavior of proton-based applications. Rather, these tests are
> targeted for use by the proton developers to help gauge the effect their code changes may
> have on performance.
> These tests should require no special configuration or setup in order to run. It should
> be easy to run these tests as part of the development process. The intent would be to have
> developers run the tests prior to making any code changes, and record the metrics for
> comparison against the results obtained after making changes to the code base.
> As described by Rafi:
> "I think it would be good to include some performance metrics that isolate
> the various components of proton. For example having a metric that simply
> repeatedly encodes/decodes a message would be quite useful in isolating the
> message implementation. Setting up two engines in memory and using them to
> blast zero sized messages back and forth as fast as possible would tell us
> how much protocol overhead the engine is adding. Using the codec directly
> to encode/decode data would also be a useful measure. Each of these would
> probably want to have multiple profiles, different message content,
> different acknowledgement/flow control patterns, and different kinds of
> data.
> I think breaking out the different dimensions of the implementation as
> above would provide a very useful tool to run before/after any performance
> sensitive changes to detect and isolate regressions, or to test potential
> improvements."
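For the encode/decode metric Rafi describes, a minimal sketch assuming proton-c's pn_message
API; the payload and iteration count are placeholders:

#include <proton/message.h>
#include <proton/codec.h>
#include <stdio.h>
#include <time.h>

int main(void) {
  const int iterations = 100000;          /* arbitrary */
  char buf[1024];

  pn_message_t *out = pn_message();       /* one message, reused throughout */
  pn_data_put_string(pn_message_body(out), pn_bytes(11, "hello world"));
  pn_message_t *in = pn_message();

  struct timespec t0, t1;
  clock_gettime(CLOCK_MONOTONIC, &t0);
  for (int i = 0; i < iterations; i++) {
    size_t size = sizeof(buf);
    if (pn_message_encode(out, buf, &size)) return 1;  /* wire-encode */
    pn_message_clear(in);
    if (pn_message_decode(in, buf, size)) return 1;    /* decode back */
  }
  clock_gettime(CLOCK_MONOTONIC, &t1);

  double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
  printf("%d encode/decode round trips in %.3fs (%.0f/s)\n",
         iterations, secs, iterations / secs);
  pn_message_free(out);
  pn_message_free(in);
  return 0;
}

Varying the body (string vs. map, small vs. large) would give the "multiple profiles" the
description asks for.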
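Driving the codec directly, as also suggested above, could look like the following sketch,
assuming the pn_data API; it encodes and decodes one small list value per iteration:

#include <proton/codec.h>
#include <stdio.h>
#include <time.h>

int main(void) {
  const int iterations = 100000;          /* arbitrary */
  char buf[256];

  pn_data_t *out = pn_data(0);            /* a small list value to encode */
  pn_data_put_list(out);
  pn_data_enter(out);
  pn_data_put_int(out, 42);
  pn_data_put_string(out, pn_bytes(5, "hello"));
  pn_data_exit(out);
  pn_data_t *in = pn_data(0);

  struct timespec t0, t1;
  clock_gettime(CLOCK_MONOTONIC, &t0);
  for (int i = 0; i < iterations; i++) {
    ssize_t n = pn_data_encode(out, buf, sizeof(buf)); /* AMQP-encode */
    if (n < 0) return 1;
    pn_data_clear(in);
    if (pn_data_decode(in, buf, n) < 0) return 1;      /* and decode back */
  }
  clock_gettime(CLOCK_MONOTONIC, &t1);

  double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
  printf("%d codec round trips in %.3fs (%.0f/s)\n",
         iterations, secs, iterations / secs);
  pn_data_free(out);
  pn_data_free(in);
  return 0;
}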



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


