Overhead Comparison of Instrumentation Frameworks

Reichelt, David Georg and Bulej, Lubomír and Jung, Reiner and Hoorn, Andre van (2024) Overhead Comparison of Instrumentation Frameworks. In: ICPE '24 Companion: Companion of the 15th ACM/SPEC International Conference on Performance Engineering. ACM, New York, pp. 249-256. ISBN 9798400704451

Full text: instrumentationTechnologiesBenchmark.pdf (Published Version, 525 kB), available under a Creative Commons Attribution license.

Abstract

Application Performance Monitoring (APM) tools are used in industry to gain insights, identify bottlenecks, and alert to issues related to software performance. The available APM tools differ in functionality and licensing, but also in monitoring overhead, which should be minimized because the tools are used in production deployments. One notable source of monitoring overhead is the instrumentation technology, which adds code to the system under test to obtain monitoring data. Because there are many ways to instrument applications, we study the overhead of five different instrumentation technologies (AspectJ, ByteBuddy, DiSL, Javassist, and pure source code instrumentation) in the context of the Kieker open-source monitoring framework, using the MooBench benchmark as the system under test. Our experiments reveal that ByteBuddy, DiSL, Javassist, and source instrumentation achieve low monitoring overhead, and are therefore most suitable for achieving generally low overhead in the monitoring of production systems. However, the lowest overhead may be achieved by different technologies, depending on the configuration and the execution environment (e.g., the JVM implementation or the processor architecture). The overhead may also change due to modifications of the instrumentation technology. Consequently, if achieving the lowest possible overhead is crucial, it is best to analyze the overhead in concrete scenarios, with specific fractions of monitored methods and in an execution environment that accurately reflects the deployment environment. To this end, our extensions of the Kieker framework and the MooBench benchmark enable repeated assessment of monitoring overhead in different scenarios.
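For illustration, the sketch below shows how a bytecode-level instrumentation technology such as Byte Buddy can weave a timing probe into application methods, which is conceptually what the evaluated probes do. This is a minimal, hedged example and not the actual Kieker or MooBench probe code: the package prefix "moobench.", the class and method names, and the println output are assumptions made for illustration only.

    // MonitoringAgent.java -- minimal Byte Buddy java-agent sketch (illustrative only).
    // Package as a jar with "Premain-Class: MonitoringAgent" in the manifest and
    // attach it with -javaagent:agent.jar.
    import java.lang.instrument.Instrumentation;

    import net.bytebuddy.agent.builder.AgentBuilder;
    import net.bytebuddy.asm.Advice;
    import net.bytebuddy.matcher.ElementMatchers;

    public class MonitoringAgent {

        public static void premain(String arguments, Instrumentation instrumentation) {
            new AgentBuilder.Default()
                    // The package prefix is an assumption; a real deployment would
                    // match exactly the classes selected for monitoring.
                    .type(ElementMatchers.nameStartsWith("moobench."))
                    .transform(new AgentBuilder.Transformer.ForAdvice()
                            .include(MonitoringAgent.class.getClassLoader())
                            .advice(ElementMatchers.isMethod(), "MonitoringAgent$TimingAdvice"))
                    .installOn(instrumentation);
        }

        // Advice code that Byte Buddy inlines around every instrumented method.
        public static class TimingAdvice {

            @Advice.OnMethodEnter
            public static long enter() {
                return System.nanoTime();
            }

            @Advice.OnMethodExit
            public static void exit(@Advice.Enter long start, @Advice.Origin String method) {
                long durationNs = System.nanoTime() - start;
                // A monitoring framework would write a record here; the println
                // is only a placeholder for the measurement sink.
                System.out.println(method + " took " + durationNs + " ns");
            }
        }
    }

Because the advice code is inlined into the target method rather than called through an extra delegation layer, the probe itself adds only a few instructions per invocation; implementation details of this kind are among the factors that can shift the measured overhead between instrumentation technologies.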

Item Type: Contribution in Book/Report/Proceedings
ID Code: 216420
Deposited By:
Deposited On: 31 May 2024 13:20
Refereed?: Yes
Published?: Published
Last Modified: 07 Oct 2024 00:46