CLIENT-SIDE CACHING: REDUCING SERVER LOAD AND LATENCY IN A NETWORK TRAFFIC ANALYSIS TOOL
Södermark, Oskar. January 2023.
Caching is a fundamental technique widely used in computing to reduce network traffic, server load, and latency. Storing frequently accessed data in a high-speed cache layer can make future requests complete faster, because fewer system components are involved in generating and serving the response. Kalix is a software product that would benefit from a caching solution, since it suffers from latency and frequently processes partially repeated queries. However, a cache does not guarantee improved performance, so the main problems of caching are: determining what content to cache, deciding when to insert or remove cache content, implementing the caching logic, and deciding where the cache can be stored efficiently. Therefore, this paper theoretically investigates where a cache should be implemented within the Kalix system architecture to decrease latency and server load, and evaluates the resulting cache implementation experimentally. As a result, a client-side cache is implemented which decreases the latency of Kalix by up to 74%, while reducing the I/O load and memory utilization on the server by 98%. The reason for the decrease is that the cache in the client can directly serve the majority of the content, allowing the servers of Kalix to perform substantially fewer computations. The evaluation acts as a recommendation for the company behind Kalix, Polystar, as to whether a cache is beneficial and where it can efficiently be deployed, and this paper gives valuable insights into the decision-making around cache placement. In conclusion, implementing the cache positively impacts the Kalix user experience.
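To make the client-side idea concrete, the following is a minimal sketch (not the thesis's actual implementation) of a client-side query cache with TTL expiry and LRU eviction; the names ClientQueryCache, fetch, query_server, max_entries, and ttl_seconds are illustrative assumptions, not identifiers from Kalix.

```python
import time
from collections import OrderedDict


class ClientQueryCache:
    """Minimal client-side cache keyed by query string.

    Entries are evicted when the cache grows beyond max_entries
    (least recently used first) or when they exceed ttl_seconds in age.
    """

    def __init__(self, max_entries=1024, ttl_seconds=300):
        self.max_entries = max_entries
        self.ttl_seconds = ttl_seconds
        self._store = OrderedDict()  # query -> (timestamp, result)

    def get(self, query):
        entry = self._store.get(query)
        if entry is None:
            return None  # cache miss
        timestamp, result = entry
        if time.time() - timestamp > self.ttl_seconds:
            del self._store[query]  # expired entry
            return None
        self._store.move_to_end(query)  # mark as recently used
        return result

    def put(self, query, result):
        self._store[query] = (time.time(), result)
        self._store.move_to_end(query)
        if len(self._store) > self.max_entries:
            self._store.popitem(last=False)  # evict least recently used


def fetch(query, cache, query_server):
    """Serve a query from the client cache when possible; otherwise ask the server."""
    result = cache.get(query)
    if result is not None:
        return result  # served locally, no server round trip
    result = query_server(query)  # only reaches the server on a miss
    cache.put(query, result)
    return result
```

Under this sketch, repeated or partially repeated queries are answered directly from the client's memory, which is the mechanism the abstract credits for the reported reductions in latency and server load.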