11

Uso de técnicas e informações em algoritmos adaptativos para substituição de páginas. / Use of techniques and information on adaptive page replacement algorithms.

Ricardo Leandro Piantola da Silva, 19 March 2010
The performance of a virtual memory system depends directly on the quality of the memory management policy. Strategies can be developed to improve this performance: one is to create new memory management policies that combine simplicity with good performance; another is to develop techniques and incorporate information that aid existing policies. This work presents a strategy to aid replacement policies, with the goal of achieving good performance in a memory management system without changing the behavior of the replacement policy itself. To this end, a page prefetching technique is combined with access-frequency information obtained through a method used in statistical natural language processing. The results show good performance and that the same strategy can be adopted with any replacement algorithm.
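As a minimal sketch of the general idea described in this abstract (a prefetching aid driven by access-frequency statistics, layered on an unmodified replacement policy), the snippet below counts page-to-page transitions as a bigram model, in the spirit of statistical NLP methods, and prefetches the most frequent successor. The class name, the `access`/`load` cache interface, and the choice of a bigram model are assumptions made purely for illustration; they are not the thesis's actual implementation.

```python
from collections import defaultdict

class BigramPrefetcher:
    """Counts page-to-page transitions (a bigram model, as in statistical
    NLP) and asks the underlying cache to prefetch the most frequent
    successor of the page just accessed."""

    def __init__(self, cache):
        # `cache` is any object exposing access(page) and load(page);
        # its replacement policy is left completely unchanged.
        self.cache = cache
        self.bigram = defaultdict(lambda: defaultdict(int))
        self.prev = None

    def access(self, page):
        if self.prev is not None:
            self.bigram[self.prev][page] += 1      # update transition frequency
        self.cache.access(page)                    # base policy handles hit/miss
        successors = self.bigram[page]
        if successors:
            likely = max(successors, key=successors.get)
            self.cache.load(likely)                # prefetch the likely successor
        self.prev = page
```

Because the wrapper only observes accesses and issues extra loads, it can sit in front of any replacement algorithm, which is the property the abstract emphasizes.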
12

An Energy Efficient Data Cache Implementing 2-way LRC Architecture

Musalappa, Saibhushan, 09 December 2006
Conventional level-one data caches are widely used in high-performance microprocessors. Shrinking process parameters in chip fabrication technology allow a much larger number of devices on a chip with every new generation. This reduction in device size has led to an increase in the magnitude of a type of energy dissipation hitherto ignored: leakage energy. Transistor-level leakage energy research for sub-micron processes has shown that leakage can be as much as or greater than the dynamic energy for advanced circuit designs. Researchers have devised techniques to reduce leakage energy at the fabrication and circuit levels. Transitioning idle circuits from the operating voltage to a reduced voltage is one such circuit-level technique. The ELRU-SEQ replacement policy exploits this technique to control cache bank transitions. This thesis proposes a new cache architecture called the 2-way Leakage Reduction Cache (LRC) that uses this replacement policy. The architecture employs an XOR-mapping function to reduce conflict misses.
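To illustrate the XOR-mapping idea mentioned at the end of the abstract, the sketch below derives a set index by XORing the conventional index bits with a same-width slice of the tag. The function name, default parameters, and bit widths are assumptions for illustration only, not the 2-way LRC's actual mapping.

```python
def xor_set_index(address, num_sets=256, block_bits=5):
    """Derive a cache set index by XORing the conventional index bits
    with a same-width slice of the tag, so that addresses that collide
    under plain modulo indexing are spread across different sets."""
    index_bits = num_sets.bit_length() - 1          # 8 index bits for 256 sets
    block = address >> block_bits                   # drop the block offset
    low = block & (num_sets - 1)                    # conventional index bits
    high = (block >> index_bits) & (num_sets - 1)   # slice of the tag bits
    return low ^ high

# Two addresses that map to the same set under modulo indexing
# (they differ by num_sets * block_size = 256 * 32 bytes) land in
# different sets under the XOR mapping:
print(xor_set_index(0x1000), xor_set_index(0x3000))  # 128 and 129
```

Spreading such conflicting addresses across sets is precisely how an XOR mapping reduces conflict misses in a low-associativity cache.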
13

A Design of Buffer Scheme by Using Data Filter for Solid State Disk

Yang, Jing pei, 09 August 2010
No description available.
14

Dynamic Level-2 Cache Memory Locking by Utilizing Multiple Miss Tables

Mocniak, Andrew Louis, 01 June 2016
No description available.
15

Cache Design for Massive Heterogeneous Data of Mobile Social Media

Zhang, Ruiyang, January 2014
As social media gains ever-increasing popularity, Online Social Networks (OSNs) have become important repositories for information retrieval. The concept of social search is therefore gradually being recognized as the next breakthrough in this field and is expected to become a dominant topic in industry. However, retrieving information from OSNs with a high Quality of Experience is non-trivial, given the prevalence of mobile applications for social networking services. To shorten user-perceived latency, Web caching was introduced and has been studied extensively for years. Nevertheless, previous works seldom focus on Web caching solutions for social search. In this master's thesis project, emphasis is given to the design of a Web caching system that caches public data from social media, with the objective of improving the user experience in terms of both data freshness and perceived service latency. More specifically, a Web caching strategy named the Staleness-Bounded LRU (SB-LRU) algorithm is proposed to limit the term of validity of cached data. In addition, a Two-Level Web Caching System that adopts the SB-LRU algorithm is proposed to shorten user-perceived latency. Results of trace-driven simulations and performance evaluations demonstrate that serving clients with stale data is avoided and user-perceived latencies are significantly shortened when the proposed Web caching system is used in the case of unauthenticated social search. Moreover, the design idea in this project is believed to be helpful for the design of a Web caching system for social search that is capable of caching user-specific data for different clients.
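The staleness-bounding idea can be sketched as an ordinary LRU cache whose entries also carry an insertion timestamp: a hit on an entry older than the bound is treated as a miss and refetched. The class name, the `fetch` callback, and the eviction details below are assumptions for illustration, not the thesis's SB-LRU implementation.

```python
import time
from collections import OrderedDict

class SBLRUCache:
    """Illustrative staleness-bounded LRU cache: entries are evicted in
    LRU order when the cache is full, and any entry older than
    `max_staleness` seconds is treated as expired and refetched."""

    def __init__(self, capacity, max_staleness):
        self.capacity = capacity
        self.max_staleness = max_staleness
        self.entries = OrderedDict()            # key -> (value, insert_time)

    def get(self, key, fetch):
        now = time.time()
        if key in self.entries:
            value, inserted = self.entries[key]
            if now - inserted <= self.max_staleness:
                self.entries.move_to_end(key)   # refresh LRU position on a hit
                return value
            del self.entries[key]               # too stale: drop and refetch
        value = fetch(key)                      # miss (or expired entry)
        self.entries[key] = (value, now)
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)    # evict the least recently used
        return value
```

Bounding staleness in this way is what lets the system trade a slightly lower hit rate for the guarantee that clients are never served data older than the configured limit.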
16

Sur des modèles pour l’évaluation de performance des caches dans un réseau cœur et de la consommation d’énergie dans un réseau d’accès sans-fil / On models for performance analysis of a core cache network and power save of a wireless access network

Choungmo Fofack, Nicaise Éric, 21 February 2014
The Internet is a real ecosystem. It grows, evolves, and adapts to the needs of its users in terms of communication, connectivity, and ubiquity. In the last decade, the communication paradigm has shifted from traditional host-to-host interactions to the recent host-to-content model, while various wireless and networking technologies (such as 3/4G smartphones and networks, online media streaming, social networks, clouds, Big Data, and information-centric networks) have emerged to enhance content distribution. This development has shed light on scalability and energy-efficiency issues, which can be formulated as follows: how can we design or optimize such large-scale distributed systems to achieve and maintain high-speed access to content while (i) reducing congestion and energy consumption in the network and (ii) adapting to the temporal locality of user demand in a continuous-connectivity paradigm? In this thesis we focus on two solutions proposed to answer this question: in-network caching and power-save protocols, addressing the scalability and energy-efficiency issues respectively. Precisely, we propose analytic models for designing core cache networks and for modeling energy consumption in wireless access networks. Our studies show that the performance of general core cache networks in real application cases can be predicted with absolute relative errors on the order of 1%–5%; meanwhile, dramatic energy savings can be achieved by mobile devices and base stations, as much as 70%–90% of the energy cost in cells under realistic traffic load and the considered parameter settings.
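The abstract refers to analytic models that predict cache performance within a few percent. Purely as an illustration of the kind of prediction involved, and not of the thesis's own models, the sketch below computes per-object hit probabilities for a single LRU cache under the independent reference model using the classic Che approximation; the Zipf popularity parameters and cache size are arbitrary assumptions.

```python
import math
from scipy.optimize import brentq

def che_hit_probabilities(popularities, cache_size):
    """Che's approximation for a single LRU cache under the independent
    reference model: solve for the characteristic time T such that the
    expected number of distinct cached objects equals the cache size,
    then h_i = 1 - exp(-q_i * T) for each object's request rate q_i."""
    def cached_objects(t):
        return sum(1 - math.exp(-q * t) for q in popularities) - cache_size

    t_char = brentq(cached_objects, 1e-12, 1e12)   # root of the fixed-point equation
    return [1 - math.exp(-q * t_char) for q in popularities]

# Example: Zipf(0.8) popularity over 10,000 objects, cache holding 100 of them
alpha, n, size = 0.8, 10_000, 100
weights = [1 / (i ** alpha) for i in range(1, n + 1)]
total = sum(weights)
rates = [w / total for w in weights]
hits = che_hit_probabilities(rates, size)
overall_hit_rate = sum(q * h for q, h in zip(rates, hits))
print(f"predicted overall hit rate: {overall_hit_rate:.3f}")
```

Approximations of this flavor for single caches are the usual building blocks that network-of-caches models extend, which is the setting the thesis addresses.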
