11

RTP Compatible: Two Models of Video Streaming Over VANETs

Fang, Zhifei January 2014 (has links)
Because Vehicular Ad Hoc Networks (VANETs) often have a high packet loss rate, the protocol commonly used for video streaming, the Real-time Transport Protocol (RTP), is not well suited to this environment. Previous research has proposed many new protocols to solve this problem; however, most of them cannot make full use of existing Internet video streaming resources such as RTP servers. Our work proposes two models to solve this compatibility issue. The first model is called the converter model. Under this model, we first modify RTP using an Erasure Coding (EC) technique to adapt it to the high packet loss rate of VANETs; we call the resulting protocol EC-RTP. We then develop two converters. The first converter sits on the boundary between the Internet and the VANET: it receives RTP packets sent from the Internet, translates them into EC-RTP packets, and transports them over the VANET. The second converter receives these EC-RTP packets, translates them back into RTP packets, and sends them to an RTP player so that the player can play them. To allow EC-RTP to carry video streams other than RTP, we propose a second model, called the redundancy tunnel. Under this model, the protocol between the two converters carries RTP as its payload, using the same technique we used to modify RTP. Finally, we ran experiments with Android tablets. The results show that our solution can use the same player to play the same video resources as RTP does and, unlike RTP, can reduce the packet loss rate.
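As a rough illustration of the converter idea, the sketch below groups k payloads, adds one XOR parity packet on the Internet-to-VANET side, and rebuilds a single lost payload on the far side. The framing, the choice of a trivial (k+1, k) parity code, and the omission of length headers are all simplifying assumptions for illustration, not the thesis's actual EC-RTP design.

```python
# Minimal converter-model sketch: k payloads in, k+1 erasure-coded packets out,
# and recovery of at most one loss at the receiving converter.
from typing import List, Optional

def encode_group(payloads: List[bytes]) -> List[bytes]:
    """Internet->VANET converter: pad to equal size and append an XOR parity packet."""
    size = max(len(p) for p in payloads)
    padded = [p.ljust(size, b"\x00") for p in payloads]
    parity = bytearray(size)
    for p in padded:
        for i, b in enumerate(p):
            parity[i] ^= b
    return padded + [bytes(parity)]

def decode_group(received: List[Optional[bytes]], k: int) -> List[bytes]:
    """VANET->Internet converter: tolerate at most one missing packet per group."""
    missing = [i for i, p in enumerate(received) if p is None]
    if not missing:
        return received[:k]
    if len(missing) > 1:
        raise ValueError("more than one loss in this group: unrecoverable")
    size = len(next(p for p in received if p is not None))
    rebuilt = bytearray(size)
    for p in received:
        if p is not None:
            for i, b in enumerate(p):
                rebuilt[i] ^= b                 # XOR of survivors equals the lost packet
    received[missing[0]] = bytes(rebuilt)
    return received[:k]

# Example: one of three payloads is dropped over the VANET but is recovered.
group = encode_group([b"frame-1", b"frame-2", b"frame-3"])
group[1] = None                      # simulate a dropped packet
print(decode_group(group, k=3))      # original three (zero-padded) payloads
```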
12

State, democracy and development: An exploration of the scholarship of professor (Archie) Monwabisi Mafeje

Funani, Luthando Sinethemba January 2016 (has links)
Magister Artium - MA / The point of departure of this thesis is that the neglect of Africans' intellectual heritage within South African universities and in public discourse undermines the ability of the post-apartheid government to set its developmental agenda and maximize its democratic potential. The thesis highlights the neglect of Professor Mafeje's scholarly contribution as an example and argues that an engagement with his scholarly output might have shaped the debate on the thematic issues covered in this study differently. Against this backdrop, the study explores Mafeje's scholarly works in the areas of the state, development and democracy, focusing specifically on the insights we can garner from his works that allow us to re-examine the challenges of development. In this context Mafeje's work is examined and situated within the social history of his milieus. The study employs social constructionism to explore the scholarship of Professor Mafeje. An important aspect of this theoretical framework is social embeddedness. Brunner (1990:30) has argued that it is culture, not biology, that shapes human life and mind. A key feature of this approach is its acknowledgement that the way we commonly understand the world, and the categories and concepts we use, are historically and culturally specific. Mafeje's ideas make sense when located within the complex social contexts in which they were produced. Because he was not producing knowledge in a vacuum, an understanding and appreciation of his ideas must be located within the social history that produced them.
13

Reasoning About Multi-stage Programs

Inoue, Jun 24 July 2013 (has links)
Multi-stage programming (MSP) is a style of writing program generators---programs which generate programs---supported by special annotations that direct the construction, combination, and execution of object programs. Various researchers have shown MSP to be effective in writing efficient programs without sacrificing genericity. However, correctness proofs of such programs have so far received limited attention, and approaches and challenges for that task have been largely unexplored. In this thesis, I establish formal equational properties of the multi-stage lambda calculus and related proof techniques, as well as results that delineate the intricacies of multi-stage languages that one must be aware of. In particular, I settle three basic questions that naturally arise when verifying multi-stage functional programs. Firstly, can adding staging to a language compromise the interchangeability of terms that held in the original language? Unfortunately it can, and more care is needed to reason about terms with free variables. Secondly, staging annotations, as the term "annotations" suggests, are often thought to be orthogonal to the behavior of a program, but when is this formally guaranteed to be the case? I give termination conditions that characterize when this guarantee holds. Finally, do multi-stage languages satisfy extensional facts, for example that functions agreeing on all arguments are equivalent? I develop a sound and complete notion of applicative bisimulation, which can establish not only extensionality but, in principle, any other valid program equivalence as well. These results improve our general understanding of staging and enable us to prove the correctness of complicated multi-stage programs.
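For readers unfamiliar with staging, the sketch below mimics the classic staged power example in plain Python: a first-stage generator emits a specialized object program, which is then run. Real MSP languages such as MetaOCaml express this with quote/escape/run annotations rather than source strings; the string-based encoding here is only an assumption made to keep the example self-contained.

```python
# Stage one: a generator that unrolls the exponent loop into a specialized function.
def gen_power(n: int) -> str:
    expr = "1"
    for _ in range(n):
        expr = f"x * ({expr})"          # build the object program's body
    return f"def power_{n}(x):\n    return {expr}\n"

source = gen_power(3)                   # object program: x * (x * (x * (1)))
namespace: dict = {}
exec(source, namespace)                 # "run" the generated program
power_3 = namespace["power_3"]
print(power_3(2))                       # 8: the generator's loop is gone at stage two
```

The example also hints at the "annotations are orthogonal" question from the abstract: erasing the staging (computing x**3 directly) gives the same answer here, but the thesis shows that such agreement only holds under suitable termination conditions.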
14

Landauer Erasure For Quantum Systems

Aksak, Cagan 01 September 2009 (has links) (PDF)
Maxwell's thought experiment on a demon performing microscopic actions and violating the second law of thermodynamics remained a challenging paradox for a long time. It was finally resolved in the seventies and eighties by using Landauer's principle, which states that erasing information is necessarily accompanied by heat dumped into the environment. The purpose of this study is to describe the heat dumped into the environment associated with erasure operations on quantum systems. To achieve this, a brief introduction to the necessary tools, such as the density matrix formalism, quantum operators and entropy, is given first. Second, Maxwell's demon and the Szilard model are described, and the connection between information theory and physics is discussed via this model. Finally, heat transfer operators associated with quantum erasure operations are defined and all of their properties are obtained.
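A small numeric sketch of the principle the abstract invokes: erasing one bit of information at temperature T dumps at least k_B T ln 2 of heat into the environment. The qubit below starts maximally mixed (one bit of von Neumann entropy) and is reset to |0>, so exactly one bit is erased; the bath temperature is an assumed value for illustration.

```python
import numpy as np

k_B = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                   # assumed bath temperature, K

def von_neumann_entropy_bits(rho: np.ndarray) -> float:
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues."""
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]          # drop zero eigenvalues
    return float(-np.sum(eigvals * np.log2(eigvals)))

rho_mixed = np.eye(2) / 2                       # maximally mixed qubit: 1 bit
rho_reset = np.array([[1.0, 0.0], [0.0, 0.0]])  # erased state |0><0|: 0 bits

erased_bits = von_neumann_entropy_bits(rho_mixed) - von_neumann_entropy_bits(rho_reset)
q_min = erased_bits * k_B * T * np.log(2)       # Landauer bound on dissipated heat
print(f"erased {erased_bits:.3f} bit(s), minimum heat ~ {q_min:.2e} J")
```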
15

Enhancing the Performance of Relay Networks with Network Coding

Melvin, Scott Harold 02 August 2012 (has links)
This dissertation examines the design and application of network coding (NC) strategies to enhance the performance of communication networks. With its ability to combine information packets from different, previously independent data flows, NC has the potential to improve the throughput, reduce delay and increase the power efficiency of communication systems in ways that have not yet been fully exploited, given the limited processing power currently available at relay nodes. With these motivations in mind, this dissertation presents three main contributions that employ NC to improve the efficiency of practical communication systems. First, the integration of NC and erasure coding (EC) is presented in the context of wired networks. While the throughput gains from utilizing NC have been demonstrated, and EC has been shown to be an efficient means of reducing packet loss, these have generally been studied independently. This dissertation presents innovative methods to combine these two techniques through cross-layer design methodologies. Second, three methods to reduce or limit the delay introduced by NC when deployed in networks with asynchronous traffic are developed. Also, a novel opportunistic approach of applying EC for improved data reliability is designed to take advantage of unused opportunities introduced by the proposed delay reduction methods. Finally, computationally efficient methods for the selection of relay nodes and the assignment of transmit power values to minimize the total transmit power consumed in cooperative relay networks with NC are developed. Adaptive power allocation is utilized to control the formation of the network topology to maximize the efficiency of the NC algorithm. This dissertation advances the efficient deployment of NC through its integration with other algorithms and techniques in cooperative communication systems within the framework of cross-layer protocol design. The motivation is that, to improve the performance of communication systems, relay nodes will need to perform more intelligent processing of data units than traditional routing. The results presented in this work are applicable to both wireless and wired networks carrying real-time traffic, in systems ranging from cellular and ad-hoc networks to fixed optical networks.
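To make the core idea of combining flows at a relay concrete, the toy example below shows the standard two-way relay exchange: the relay broadcasts a single XOR of both packets instead of forwarding each one separately (three transmissions instead of four), and each endpoint decodes using the packet it already holds. This is a generic textbook illustration, not one of the specific NC/EC schemes developed in the dissertation.

```python
def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two packets, zero-padding the shorter one."""
    size = max(len(a), len(b))
    a, b = a.ljust(size, b"\x00"), b.ljust(size, b"\x00")
    return bytes(x ^ y for x, y in zip(a, b))

pkt_from_alice = b"hello-bob"
pkt_from_bob   = b"hello-alice"

# The relay combines the two flows into one coded broadcast.
coded = xor_bytes(pkt_from_alice, pkt_from_bob)

# Each destination decodes with the packet it already knows (its own).
at_bob   = xor_bytes(coded, pkt_from_bob)      # recovers Alice's packet
at_alice = xor_bytes(coded, pkt_from_alice)    # recovers Bob's packet

print(at_bob.rstrip(b"\x00"), at_alice.rstrip(b"\x00"))
```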
16

Secure Cloud Storage

Luo, Jeff Yucong 23 May 2014 (has links)
The rapid growth of Cloud-based services on the Internet has invited many critical security attacks. Consumers and corporations who use the Cloud to store their data face a difficult trade-off: accepting and bearing the security, reliability, and privacy risks as well as the costs in order to reap the benefits of Cloud storage. The primary goal of this thesis is to resolve this trade-off while minimizing total costs. This thesis presents a system framework that solves this problem by using erasure codes to add redundancy and security to users' data, and by optimally choosing Cloud storage providers to minimize risks and total storage costs. A detailed comparative analysis of the security and algorithmic properties of 7 different erasure codes is presented, showing that codes with better data security come at a higher cost in computational time complexity. The codes that granted the highest configuration flexibility bested their peers, as that flexibility directly corresponds to the level of customizability for data security and storage costs. An in-depth analysis of the risks, benefits, and costs of Cloud storage is presented and used to derive cost-based and security-based selection criteria for choosing appropriate Cloud storage providers. A brief historical introduction to Cloud Computing and security principles is provided as well for those unfamiliar with the field. The analysis results show that the framework can resolve the trade-off problem by mitigating and eliminating the risks while preserving and enhancing the benefits of using Cloud storage. However, it requires more total storage space due to the redundancy added by the erasure codes. The storage provider selection criteria will minimize the total storage costs even with the added redundancies, and minimize risks.
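The sketch below works through the cost side of that trade-off: with an (n, k) erasure code the stored volume grows by n/k, but the data stays recoverable as long as any k of the n providers keep their share. The provider names, prices, failure probability, and the cheapest-first selection rule are made-up illustrative assumptions, not the thesis's actual selection criteria or data.

```python
from itertools import combinations
from math import prod

providers = {            # hypothetical $/GB-month prices
    "A": 0.023, "B": 0.020, "C": 0.026, "D": 0.019, "E": 0.025,
}
p_fail = 0.01            # assumed independent chance a provider loses its share
n, k = 4, 3              # any 3 of the 4 shares reconstruct the data
data_gb = 100.0

# Cost-based selection: place the n shares on the n cheapest providers.
chosen = sorted(providers, key=providers.get)[:n]
share_gb = data_gb / k
monthly_cost = sum(providers[p] * share_gb for p in chosen)

# Probability that at least k shares survive (data remains recoverable).
p_ok = sum(
    prod(1 - p_fail for _ in alive) * prod(p_fail for _ in range(n - m))
    for m in range(k, n + 1)
    for alive in combinations(chosen, m)
)

print(f"shares on {chosen}, {n/k:.2f}x storage overhead, "
      f"${monthly_cost:.2f}/month, P(recoverable) = {p_ok:.6f}")
```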
17

On Message Fragmentation, Coding and Social Networking in Intermittently Connected Networks

Altamimi, Ahmed B. 23 October 2014 (has links)
An intermittently connected network (ICN) is defined as a mobile network that uses cooperation between nodes to facilitate communication. This cooperation consists of nodes carrying messages from other nodes to help deliver them to their destinations. An ICN does not require an infrastructure, and routing information is not retained by the nodes. While this may be a useful environment for message dissemination, it creates routing challenges. In particular, providing satisfactory delivery performance while keeping the overhead low is difficult with no network infrastructure or routing information. This dissertation explores solutions that lead to a high delivery probability while maintaining a low overhead ratio. The efficiency of message fragmentation in ICNs is first examined. Next, the performance of routing is investigated when erasure coding and network coding are employed in ICNs. Finally, the use of social networking in ICNs to achieve high routing performance is considered. The aim of this work is to improve the delivery probability while maintaining a low overhead ratio. Message fragmentation is shown to improve the CDF of the message delivery probability compared to existing methods. The use of erasure coding in an ICN further improves this CDF. Finally, the use of network coding is examined, and its advantage over message replication is quantified in terms of the message delivery probability. Results are presented which show that network coding can improve the delivery probability compared to using message replication alone. / Graduate / 0544 / 0984 / ahmedbdr@engr.uvic.ca
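A back-of-the-envelope sketch of why erasure coding helps here: if a message is split into n coded fragments, any k of which suffice, delivery succeeds whenever at least k fragments independently reach the destination. The per-fragment delivery probability and the (10, 4) code below are assumed numbers for illustration, not values measured in the dissertation.

```python
from math import comb

def delivery_probability(n: int, k: int, p: float) -> float:
    """P(at least k of n independently delivered fragments arrive)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

p = 0.5                                            # assumed chance one fragment gets through
single_copy = delivery_probability(1, 1, p)        # no coding: one whole copy
coded       = delivery_probability(10, 4, p)       # (10, 4) erasure code

print(f"single copy: {single_copy:.3f}, (10,4)-coded: {coded:.3f}")
# ~0.500 versus ~0.828, at the cost of the code's 10/4 transmission overhead
```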
18

Haudenosaunee Good Mind: Tribalographies Recognizing American Indian Genocide and Restoring Balance in Literature Classrooms by Shifting Literary Criticism and Educational Curricula

January 2017 (has links)
abstract: The question of whether there has been an American Indian genocide, when genocide is defined according to the 1948 Convention on the Prevention and Punishment of the Crime of Genocide, has been contested. Yet I argue that both the social and cultural genocide of American Indians has had volatile consequences for both Native and non-Native peoples. Because of the contested nature of this genocide, American Indian Studies scholars contend that Indigenous people's experiences often get marginalized and reconstructed, relegating stories to the category of oppression rather than proof of genocide, which has created intellectual and social absences (Vizenor 2009). Other American Indian Studies scholars argue for reform within American Indian educational settings, where Indigenous nations use their values and traditions within curricula to combat national absences. Despite excellent work on American Indian education, scholars have not addressed the central questions of how such absences affect both Native and non-Native students, why those absences exist, and why the U.S. dialogue around genocide becomes a rhetoric of avoidance and erasure once any comparison begins with other genocide victims. Without adequate analysis of both American Indian genocide and the absences within curricula, particularly in humanities courses such as literature, where stories about American Indians can have a prominent space, we undervalue their impact on America's past and present histories, as well as on current knowledges and values. Erasure of American Indian presence affects both Native and non-Native youth. Many American Indians are traumatized and believe their tribe's stories are not worthy of inclusion. As well, many non-Natives are unaware of Indigenous experiences and are often left with stereotypes rather than realities. A Haudenosaunee paradigm of the Good Mind can re-situate how we think about the canon, literature, and the classroom. The Good Mind allows for a two-way path where ideas pass back and forth, respecting differences rather than replacing those differences with one ideology. This path is meant to open minds to connections with others which are kind and loving and lead to peaceful relationships. Theorizing literary erasure and genocide of the mind through the experiences of Native and non-Native students and teachers embodies the Good Mind. / Dissertation/Thesis / Doctoral Dissertation English 2017
19

Conception et optimisation de codes AL-FEC : les codes GLDPC-Staircase / Design and Optimization of Forward Erasure Correction (FEC) codes : the GLDPC-Staircase AL-FEC codes

Mattoussi, Ferdaouss 13 February 2014 (has links)
This work is dedicated to the design, analysis and optimization of Application-Level Forward Erasure Correction (AL-FEC) codes. In particular, we explore a class of Generalized LDPC (GLDPC) codes, named GLDPC-Staircase codes, involving the LDPC-Staircase code (base code) as well as Reed-Solomon (RS) codes (outer codes). In the first part of this thesis, we start by showing that RS codes having a "quasi" Hankel matrix-based construction are the most suitable MDS codes to obtain the structure of GLDPC-Staircase codes. Then, we propose a new decoding type, so-called hybrid (IT/RS/ML) decoding, for these codes to achieve Maximum Likelihood (ML) correction capabilities with a lower complexity. To investigate the impact of the structure of GLDPC-Staircase codes on decoding, we propose another construction: the two constructions differ in the nature of the generated LDPC repair symbols.
Afterwards, to predict the capacity-approaching behavior of GLDPC-Staircase codes, we derive an asymptotic analysis based on density evolution (DE), EXIT functions, and the area theorem. Eventually, based on finite-length analysis and asymptotic analysis, we tune important internal parameters of GLDPC-Staircase codes to obtain the best configuration under hybrid (IT/RS/ML) decoding. The second part of the thesis benchmarks GLDPC-Staircase codes in various situations. First, we show that these codes are asymptotically quite close to Shannon-limit performance and, at finite length, achieve excellent erasure correction capabilities very close to those of ideal MDS codes no matter the object size: very small decoding overhead, low error floor, and a steep waterfall region. Second, we show that these codes outperform Raptor codes, LDPC-Staircase codes, and another construction of GLDPC codes, and have correction capabilities close to those of RaptorQ codes. Last but not least, we propose a general methodology to address the problem of the impact of packet scheduling on GLDPC-Staircase codes over a large set of erasure channels (with or without burst losses). This study identifies the best packet scheduling. All the aforementioned results make GLDPC-Staircase codes a ubiquitous Application-Level FEC (AL-FEC) solution.
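For orientation, the sketch below shows the encoding structure of the LDPC-Staircase base code referred to above: each repair symbol is the XOR of a few source symbols (a sparse parity-check row) and the previous repair symbol (the "staircase" double diagonal). It is a simplified illustration under assumed parameters; the RS outer codes, the precise matrix construction, and the hybrid IT/RS/ML decoder of the thesis are not shown.

```python
import random

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def staircase_encode(source: list, n_repair: int, degree: int = 3, seed: int = 1):
    """Build n_repair repair symbols from equal-length source symbols."""
    rng = random.Random(seed)            # encoder and decoder must share the seed
    size = len(source[0])
    repair = []
    prev = bytes(size)                   # p_0 is the all-zero symbol
    for _ in range(n_repair):
        row = rng.sample(range(len(source)), degree)   # sparse row over source symbols
        acc = prev
        for j in row:
            acc = xor(acc, source[j])
        repair.append(acc)
        prev = acc                       # staircase: p_i depends on p_{i-1}
    return repair

source_symbols = [bytes([i]) * 8 for i in range(10)]    # ten 8-byte source symbols
repair_symbols = staircase_encode(source_symbols, n_repair=5)
print(len(repair_symbols), "repair symbols of", len(repair_symbols[0]), "bytes")
```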
20

Power-benefit analysis of erasure encoding with redundant routing in sensor networks.

Vishwanathan, Roopa 12 1900 (has links)
One of the problems sensor networks face is adversaries corrupting nodes along the path to the base station. One way to reduce the effect of these attacks is multipath routing, which introduces some intrusion tolerance into the network by way of redundancy, but at the cost of higher power consumption by the sensor nodes. Erasure coding can be applied to this scenario, in which the base station can receive a subset of the total data sent and reconstruct the entire message packet at its end. This thesis uses two commonly used encodings and compares their performance, in terms of power consumed, against unencoded data in multipath routing. It is found that using encoding with multipath routing reduces the power consumption while enabling the user to send reasonably large data sizes. The experiments in this thesis were performed on the TinyOS platform, with the simulations done in TOSSIM and the power measurements taken in PowerTOSSIM. They were performed on the simple radio model and the lossy radio model provided by TinyOS. The lossy radio model was simulated with distances of 10 feet, 15 feet and 20 feet between nodes. It was found that by using erasure encoding, double or triple the data size can be sent at the same power consumption rate as unencoded data. All the experiments were performed with the radio set at normal transmit power, and later at high transmit power.
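A rough model of the power-benefit argument: radio transmission dominates a sensor node's energy budget, so total bytes sent over the multipath route is a first-order proxy for power. Replicating a D-byte message on every path costs m*D bytes, while an (n, k) erasure code spread over the same m = n paths costs only (n/k)*D. The message size, path count, and energy-per-byte below are assumed numbers, not measurements from the PowerTOSSIM experiments.

```python
def replication_bytes(data_bytes: float, paths: int) -> float:
    return data_bytes * paths            # a full copy of the message on every path

def erasure_bytes(data_bytes: float, n: int, k: int) -> float:
    return data_bytes * n / k            # one coded fragment per path; any k suffice

D = 1200.0                               # message size in bytes (assumed)
paths, n, k = 4, 4, 2                    # 4 paths; any 2 fragments reconstruct the message
energy_per_byte = 1.6e-6                 # joules per transmitted byte (assumed radio cost)

for name, sent in [("replication", replication_bytes(D, paths)),
                   ("erasure (4,2)", erasure_bytes(D, n, k))]:
    print(f"{name:>14}: {sent:8.0f} bytes, ~{sent * energy_per_byte * 1e3:.2f} mJ")
```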
