jmeter-dev mailing list archives

From Philippe Mouawad <philippe.moua...@gmail.com>
Subject Re: Graphite Listener
Date Wed, 01 Jan 2014 20:40:38 GMT
Regarding the vote, I propose the following.
-----------------------------------------------------------------------------------------------------------------------------------------------------
The JMeter team is asking for your opinion regarding the inclusion of a
Graphite Listener in JMeter.
An overview of Graphite can be found here:
- http://graphite.readthedocs.org/en/latest/overview.html

This listener will allow JMeter to send a summary of response times and
statuses every second:

1) Per second:
- min response time
- max response time
- 90th percentile of response time, rolling over the last 100 values
- 95th percentile of response time, rolling over the last 100 values
- successful requests
- failed requests
- total requests

2) The listener will also send the following metrics, which are not
per-second values:
- Active Threads
- Stopped Threads
- Started Threads

More details can be found here:
- https://issues.apache.org/bugzilla/show_bug.cgi?id=55932


To give your opinion, select one of the 3 options:
[+1] I support this inclusion in JMeter core
[0] It should be available within a third-party library
[-1] I vote against any inclusion


The vote is open for 1 week.

Regards
Apache JMeter Team
-----------------------------------------------------------------------------------------------------------------------------------------------------
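
For readers unfamiliar with Graphite: metrics reach it over a simple
plaintext protocol, one "metric.path value timestamp" line per metric. The
metric paths below are illustrative only, not the names the listener will
actually use:

    jmeter.test.ok.count 150 1388608838
    jmeter.test.ok.max 872 1388608838
    jmeter.test.ok.pct90 431 1388608838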


On Wed, Jan 1, 2014 at 9:28 PM, Philippe Mouawad <philippe.mouawad@gmail.com
> wrote:

>
>
>
> On Wed, Jan 1, 2014 at 8:04 PM, sebb <sebbaz@gmail.com> wrote:
>
>> On 1 January 2014 15:08, Philippe Mouawad <philippe.mouawad@gmail.com>
>> wrote:
>> > On Tue, Dec 31, 2013 at 9:00 PM, Vladimir Sitnikov <
>> > sitnikov.vladimir@gmail.com> wrote:
>> >
>> >> >> I believe a JMeter plugin-like interface between injector and
>> >> >> performance repository would enable live monitoring of the system
>> >> >> under test.
>> >> sebb> Exactly. The JMeter architecture is designed to allow for easy
>> >> integration of 3rd-party plugins.
>> >>
>> >> I need to double-check; however, I did not manage to specify a "path to
>> >> plugins.jar folder" via a command-line option.
>> >> Ultimately I would love to have the JMeter installation and plugins in
>> >> completely separate folders. That simplifies the "jmeter upgrade",
>> >> "check what plugins are installed", and "compose test harness from
>> >> maven" use cases.
>> >>
>> > Add this to user.properties:
>> > plugin_dependency_paths, which will contain the path to the third-party
>> > jar
>> >
>> > user.classpath, which will contain the path to the dependencies of the
>> > third-party jar
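>> >
>> > For example (the paths below are purely illustrative):
>> >
>> >   plugin_dependency_paths=/opt/jmeter-plugins/lib
>> >   user.classpath=/opt/jmeter-plugins/deps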
>> >
>> >
>> >
>> >> Are there "asynchronous output" interfaces from JMeter?
>> >>
>> >
>> > In remote testing you are in async mode with:
>> > mode=StrippedAsynch
>> > or
>> > mode=Asynch
>> >
>> >>
>> >> Is there a way to send listener results via regular samplers?
>> >>
>> >
>> > Why would you like to do that?
>> >
>> >> sebb> Proper analysis should take place offline after the test has
>> >> completed.
>> >> Very true.
>> >> However, it is quite important to perform online analysis to be able
>> >> to adjust the test.
>> >> Say, adjust the load, fix bugs in the script, correct the system
>> >> configuration, etc.
>> >>
>> >> >> One can parse raw csv/xml results and upload them for analysis;
>> >> >> however, it is likely to create a big latency gap between collection
>> >> >> and visualization.
>> >> >
>> >> sebb> Is that really a problem for most users?
>> >> How do we measure that?
>> >>
>> >
>> > A vote would be a way to sort it out.
>>
>> Perhaps, though it's very difficult to know whether a vote is
>> representative.
>>
>> >>
>> >> Here are the most relevant scenarios for our company:
>> >> 1) Durability testing. Say you launch a 1..7-day-long test script.
>> >> It is crucial to know whether the system is still stable.
>> >> That includes multiple KPIs (throughput, latency, failrate%, memory
>> >> consumption, gc, cpu%), and the request-processing KPIs are not the
>> >> least important ones.
>> >> If JMeter sampler info is not available until the whole test is
>> >> finished, it is a major drawback.
>> >>
>> >
>> > Agree
>>
>> There are ways round this.
>> The Generate Summary Results Listener was designed for just this sort
>> of scenario.
>> I was involved in testing a large system where the test ran for
>> several days, and that was ideal for checking that the test was still
>> running well.
>>
>> Obviously it only provides basic info, but it is sufficient to check that
>> the test is generally behaving (or not).
>> Its big advantage is its low overhead and simplicity.
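>>
>> For reference, the summariser is configured via properties; a minimal
>> example (the values shown are illustrative, check jmeter.properties for
>> the actual defaults):
>>
>>   summariser.name=summary
>>   summariser.interval=30
>>   summariser.out=true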
>>
>> >
>> >>
>> >> 2) Automated scalability testing. Say you want to identify the maximum
>> >> load the system will sustain. One way to identify it is to gradually
>> >> increase the load and see if the system is stable (e.g. queues do not
>> >> build up, failrate is 0, response times are stable, etc.).
>> >> Having data displayed in near-realtime helps a lot.
>> >> Especially when you run the test suite in a different environment
>> >> (e.g. acceptance of new software/hardware).
>> >>
>> >
>> > Agree
>>
>> We used the Generate Summary Results Listener for that as well.
>>
>> >>
>> >> 3) Tight-schedule testing. When performing load testing in a customer
>> >> environment (e.g. acceptance of a production environment), it is
>> >> important to make informed decisions. It is good to see whether your
>> >> test works as expected while the test is running, not after you've
>> >> done 4 hours of testing and analysis.
>> >>
>> >
>> > Agree
>> >
>> >>
>> >> 4) Red-green detection during regular 30-60min testing. Our scenarios
>> >> involve multiple jmx scripts and lots of samples.
>> >> We use the JMeter GUI only for script development, and just the console
>> >> for load testing, to avoid injector slowness, out-of-memory errors, etc.
>> >> Currently it is hard to tell whether the test is going as expected:
>> >> failrate%, number of samples, response times, etc.
>>
>> Most of that is available from the Generate Summary Results Listener
>> already.
>>
>> >>
>> > Agree
>> >
>> >
>> >> sebb> That would be a sensible addition to JMeter to allow performance
>> >> > data to be readily saved to an arbitrary repo.
>> >> True.
>> >> OOB samplers (e.g. the http sampler) might work for the "output
>> >> interfaces".
>> >> For instance, an "http sampler" under a "stats listener" would post the
>> >> result of the "stats listener" to the configured end-point.
>> >>
>> >> However, I believe this kind of integration should be asynchronous, to
>> >> avoid the result collection impacting the test scenario. If we tried
>> >> synchronous result posting, we could end up with multi-second test
>> >> hangs due to lag in the performance repository receiver.
>> >>
>> > Currently the Graphite Listener patch is implemented this way:
>> > 1) Synchronously: successes, failures, min and max are updated (very
>> > fast computations); percentile-related data is also computed (looking at
>> > the code of commons-math3, I don't see why it would not perform fine).
>> > If further tests show the percentile computation is costly, then we
>> > could delay it to the background thread described in 2). But the tests
>> > I did showed no problem.
>> > 2) In the background, a task runs every minute to send the currently
>> > computed results to the Graphite server, resetting the data except for
>> > the percentile-related values. (A sketch of the pattern follows below.)
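>> >
>> > To illustrate the pattern (a simplified sketch only, not the actual
>> > patch code; class and method names are invented), using a commons-math3
>> > DescriptiveStatistics window of 100 values for the rolling percentile:
>> >
>> > import java.util.concurrent.Executors;
>> > import java.util.concurrent.ScheduledExecutorService;
>> > import java.util.concurrent.TimeUnit;
>> > import java.util.concurrent.atomic.AtomicLong;
>> >
>> > import org.apache.commons.math3.stat.descriptive.DescriptiveStatistics;
>> >
>> > public class MetricsSketch {
>> >     private final AtomicLong successes = new AtomicLong();
>> >     private final AtomicLong failures = new AtomicLong();
>> >     // Rolling window over the last 100 response times, as proposed.
>> >     private final DescriptiveStatistics responseTimes =
>> >             new DescriptiveStatistics(100);
>> >     private final ScheduledExecutorService scheduler =
>> >             Executors.newSingleThreadScheduledExecutor();
>> >
>> >     public void start() {
>> >         // Background task, as in 2): flush once per minute.
>> >         scheduler.scheduleAtFixedRate(new Runnable() {
>> >             public void run() {
>> >                 flush();
>> >             }
>> >         }, 60, 60, TimeUnit.SECONDS);
>> >     }
>> >
>> >     // Called synchronously from sampler threads, as in 1):
>> >     // only cheap counter/window updates happen here.
>> >     public void record(boolean success, long elapsedMillis) {
>> >         (success ? successes : failures).incrementAndGet();
>> >         synchronized (responseTimes) {
>> >             responseTimes.addValue(elapsedMillis);
>> >         }
>> >     }
>> >
>> >     // Runs in the background thread: read and reset the counters,
>> >     // keeping the percentile window intact.
>> >     private void flush() {
>> >         long ok = successes.getAndSet(0);
>> >         long ko = failures.getAndSet(0);
>> >         double p90;
>> >         synchronized (responseTimes) {
>> >             p90 = responseTimes.getPercentile(90);
>> >         }
>> >         // The real listener would write "path value timestamp" lines
>> >         // to the Graphite server here.
>> >         System.out.printf("ok=%d ko=%d p90=%.1f%n", ok, ko, p90);
>> >     }
>> >
>> >     public void stop() {
>> >         scheduler.shutdown();
>> >     }
>> > }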
>>
>> Seems to me all this could be done by having a generic interface to
>> enable the raw results to be saved to whatever backend is required.
>>
>
> We could refactor the code in the following way:
> - GraphiteListener => BackendListener
> - Introduce a Backend interface (a rough sketch follows below)
> - The Backend would be called in testStarted/testEnded and within the
> existing run() to send metrics
> - There would be a select box to choose the backend. Backend
> implementations would be searched for within the classpath
> - The Graphite part and PickleMetricsManager would be moved to a Graphite
> Backend.
>
> After that we could add a JDBC Backend.
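>
> A rough sketch of what the Backend interface could look like (all names
> are illustrative, not a committed API; the Map stands in for whatever
> aggregated-metrics holder the real code would use):
>
> import java.util.Map;
>
> public interface Backend {
>     void testStarted();
>     void testEnded();
>     // Invoked from the existing run() task to push the current values.
>     void sendMetrics(Map<String, Double> metrics);
> }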
>
>>
>> What I don't agree with is yet more code that is very specific to a
>> single 3rd party software solution.
>>
>> Whatever is added needs to:
>> - be generally useful to the JMeter population. Niche requirements should
>> be met by 3rd-party add-ons developed for the purpose.
>> - use a generic API, for example JSR223, JDBC, etc.
>>
> I understand your wish, but take the history of JMeter. At some point
> there was a need for custom scripting but no standard like BSF or JSR223.
> So you selected BeanShell. Once a standard was introduced, we moved to it.
> I think we cannot wait for all standards to exist. For example, if we take
> NoSQL databases, it is hard to tell when a JDBC equivalent will be
> available. Does that mean we should wait until they are available? I don't
> think so.
>
>
>
>> >
>> >
>> >> Regards,
>> >> Vladimir Sitnikov
>> >>
>> >
>> >
>> >
>> > --
>> > Cordialement.
>> > Philippe Mouawad.
>>
>
>
>
> --
> Cordialement.
> Philippe Mouawad.
>
>
>


-- 
Cordialement.
Philippe Mouawad.
