Benchmarking microservices: effects of tracing and service mesh

Microservices have become the current standard in software architecture. As the number of microservices grows, so does the need for better visualization, debugging, and configuration management. Developers currently adopt various tools to achieve these goals, two of which are tracing tools and service meshes. Despite the advantages they bring, the overhead they add is also significant. In this thesis, we try to understand these overheads in latency and throughput by conducting experiments on known benchmarks with different tracing tools and service meshes. We introduce a new tool called Unified Benchmark Runner (UBR) that allows easy benchmark setup, enabling a more systematic way to run multiple benchmark experiments under different scenarios. UBR supports Jaeger, TCP Dump, Istio, and three popular microservice benchmarks, namely, Social Network, Hotel Reservation, and Online Boutique. Using UBR, we conduct experiments with all three benchmarks and report performance for different deployments and configurations.
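
The abstract does not show UBR's actual interface or configuration format. As a rough illustration only, the Python sketch below enumerates the kind of scenario matrix the abstract describes (benchmark x tracing tool x service mesh); every identifier, scenario name, and the run_scenario placeholder are assumptions for illustration, not UBR's real API.

import itertools

# Hypothetical scenario matrix: the three benchmarks named in the abstract,
# crossed with optional tracing (Jaeger or TCP Dump) and an optional Istio mesh.
BENCHMARKS = ["social-network", "hotel-reservation", "online-boutique"]
TRACING = [None, "jaeger", "tcpdump"]
MESH = [None, "istio"]

def run_scenario(benchmark: str, tracing: str | None, mesh: str | None) -> None:
    """Placeholder for one experiment: deploy the benchmark, enable the chosen
    tracing tool and/or service mesh, drive load, and collect latency and
    throughput measurements."""
    print(f"run: {benchmark} | tracing={tracing or 'off'} | mesh={mesh or 'off'}")

if __name__ == "__main__":
    for bench, trace, mesh in itertools.product(BENCHMARKS, TRACING, MESH):
        run_scenario(bench, trace, mesh)

A runner structured this way makes the overhead comparison systematic: each benchmark is measured with and without tracing and with and without the mesh, so the cost of each component can be isolated.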

Identifier: oai:union.ndltd.org:bu.edu/oai:open.bu.edu:2144/47469
Date: 04 November 2023
Creators: Unnikrishnan, Vivek
Contributors: Liagouris, Ioannis
Source Sets: Boston University
Language: en_US
Detected Language: English
Type: Thesis/Dissertation
Rights: Attribution 4.0 International, http://creativecommons.org/licenses/by/4.0/