  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
461

A linguistic approach to concurrent, distributed, and adaptive programming across heterogeneous platforms

Harvey, Paul January 2015 (has links)
Two major trends in computing hardware over the last decade have been an increase in the number of processing cores found in individual computer hardware platforms and the growing ubiquity of distributed, heterogeneous systems. Together, these changes can not only improve the performance of a range of applications, but also expand the types of applications that can be created. Despite the advances in hardware technology, advances in programming such systems have not kept pace. Traditional concurrent programming has always been challenging, and is only set to become more so as the level of hardware concurrency increases. The different hardware platforms which make up heterogeneous systems come with domain-specific programming models that are not designed to interact with one another, nor to take into account the different resource constraints present across different hardware devices, motivating a need for runtime reconfiguration or adaptation. This dissertation investigates the actor model of computation as an appropriate abstraction for addressing the issues present in programming concurrent, distributed, and adaptive applications across different scales and types of computing hardware. Given the limitations of other approaches, this dissertation describes a new actor-based programming language (Ensemble) and its runtime to address these challenges. The goal of this language is to enable non-specialist programmers to take advantage of parallel, distributed, and adaptive programming without requiring in-depth knowledge of hardware architectures or software frameworks. The dissertation also describes the design and implementation of the runtime system which executes Ensemble applications across a range of heterogeneous platforms. To show the suitability of the actor-based abstraction for creating applications for such systems, the language and runtime were evaluated in terms of linguistic complexity and performance. These evaluations covered programming embedded, concurrent, distributed, and adaptable applications, as well as combinations thereof. The results show that the actor model provides an objectively simple way to program such systems without sacrificing performance.
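The abstract does not show Ensemble's concrete syntax; as a rough, hypothetical illustration of the actor abstraction it builds on, the following Haskell sketch models an actor as an isolated thread with private state that communicates only through its mailbox (a channel).

```haskell
-- Minimal actor-style sketch (illustrative only; not Ensemble syntax):
-- an actor holds private state and reacts to messages in its mailbox.
import Control.Concurrent (forkIO)
import Control.Concurrent.Chan (Chan, newChan, readChan, writeChan)

data Msg = Add Int | Stop

-- The actor loop: read a message, update private state, reply, repeat.
counterActor :: Chan Msg -> Chan Int -> IO ()
counterActor inbox replies = loop 0
  where
    loop acc = do
      msg <- readChan inbox
      case msg of
        Add n -> do
          let acc' = acc + n
          writeChan replies acc'
          loop acc'
        Stop  -> return ()

main :: IO ()
main = do
  inbox   <- newChan
  replies <- newChan
  _ <- forkIO (counterActor inbox replies)
  mapM_ (writeChan inbox . Add) [1, 2, 3]
  mapM (\_ -> readChan replies) [1, 2, 3] >>= print   -- prints [1,3,6]
  writeChan inbox Stop
```

Because actors share nothing and interact only by message passing, the same program structure can in principle be mapped onto threads, distributed nodes, or embedded devices, which is the property the dissertation exploits.
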
462

GUMSMP : a scalable parallel Haskell implementation

Aljabri, Malak Saleh January 2015 (has links)
The most widely available high-performance platforms today are hierarchical, with shared-memory leaves, e.g. clusters of multi-cores, or NUMA machines with multiple regions. The Glasgow Haskell Compiler (GHC) provides a number of parallel Haskell implementations targeting different parallel architectures. In particular, GHC-SMP supports shared-memory architectures, and GHC-GUM supports distributed-memory machines. Both implementations use different, but related, runtime system (RTS) mechanisms and achieve good performance. A specialised RTS for the ubiquitous hierarchical architectures is lacking. This thesis presents the design, implementation, and evaluation of a new parallel Haskell RTS, GUMSMP, that combines shared- and distributed-memory mechanisms to exploit hierarchical architectures more effectively. The design evaluates a variety of choices and aims to efficiently combine scalable distributed-memory parallelism, using a virtual shared heap over a hierarchical architecture, with low-overhead shared-memory parallelism on shared-memory nodes. Key design objectives in realising this system are to prefer local work, and to exploit mostly passive load distribution with pre-fetching. Systematic performance evaluation shows that the automatic hierarchical load distribution policies must be carefully tuned to obtain good performance. We investigate the impact of several policies, including work pre-fetching, favouring inter-node work distribution, and spark segregation with different export and select policies. We present performance results for GUMSMP, demonstrating good scalability for a set of benchmarks on up to 300 cores. Moreover, our policies provide performance improvements of up to a factor of 1.5 compared to GHC-GUM. The thesis also provides a performance evaluation of distributed and shared heap implementations of parallel Haskell on a state-of-the-art physical shared-memory NUMA machine. The evaluation exposes bottlenecks in memory management, which limit scalability beyond 25 cores. We demonstrate that GUMSMP, which combines both distributed and shared heap abstractions, consistently outperforms the shared-memory GHC-SMP on seven benchmarks, by a factor of 3.3 on average. Specifically, we show that the best results are obtained when sharing memory only within a single NUMA region, and using distributed-memory system abstractions across the regions.
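GUMSMP is a runtime-system design rather than a user-facing API, so the abstract gives no GUMSMP-specific code; as a generic illustration, the following is the kind of semi-explicit parallel Haskell program such an RTS executes, written with the standard Control.Parallel.Strategies module (compile with -threaded and run with +RTS -N to use multiple cores).

```haskell
-- Generic parallel Haskell: parMap creates one spark per list element;
-- the runtime system (GHC-SMP, GHC-GUM or, per the thesis, GUMSMP)
-- decides where and whether each spark is actually evaluated.
import Control.Parallel.Strategies (parMap, rdeepseq)

fib :: Integer -> Integer
fib n | n < 2     = n
      | otherwise = fib (n - 1) + fib (n - 2)

main :: IO ()
main = print (sum (parMap rdeepseq fib [28 .. 34]))
```

The same source program runs unchanged on shared-memory, distributed-memory, or hierarchical machines; only the RTS load-distribution policies discussed above differ.
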
463

Pressure as a non-dominant hand input modality for bimanual interaction techniques on touchscreen tablets

McLachlan, Ross David January 2015 (has links)
Touchscreen tablet devices present an interesting challenge to interaction designers: they are not quite handheld like their smartphone cousins, yet their form factor affords usage away from the desktop and other surfaces while requiring the user to support a larger weight and navigate more screen space. Thus, the repertoire of touch input techniques is often reduced to those performable with one hand. Previous studies have suggested that there are bimanual interaction techniques that offer both manual and cognitive benefits over equivalent unimanual techniques, and that pressure is useful as a primary input modality on mobile devices and as an augmentation to finger/stylus input on touchscreens. However, there has been no research on the use of pressure as a modality to expand the range of bimanual input techniques on tablet devices. The first two experiments investigated bimanual scrolling on tablet devices, based on the premise that the control of scrolling speed and vertical scrolling direction can be thought of as separate tasks, and that the current status quo of combining both into a single one-handed (unimanual) gesture on a touchscreen or on a physical dial can be improved upon. Four bimanual scrolling techniques were compared to two status quo unimanual scrolling techniques in a controlled linear targeting task. The Dial and Slider bimanual technique was superior to the others in terms of Movement Time, and the Dial and Pressure bimanual technique was superior in terms of Subjective Workload, suggesting that the bimanual scrolling techniques are better than the status quo unimanual techniques in terms of both performance and preference. The same interaction techniques were then evaluated using a photo browsing task, chosen to resemble the way people browse their music collections when they are unsure about what they are looking for. These studies demonstrated that pressure is a more effective auxiliary modality than a touch slider in the context of bimanual scrolling techniques. They also demonstrated that the bimanual techniques did not provide any concrete benefits over the Unimanual touch scrolling technique, which is the status quo scrolling technique on commercially available touchscreen tablets and smartphones, in the context of an image browsing task. A novel investigation of pressure input is presented in which pressure is characterised as a transient modality: one that has a natural inverse, bounce-back, and a state that only persists during interaction. Two studies were carried out investigating the precision of applied pressure as part of a bimanual interaction, where the selection event is triggered by the dominant hand on the touchscreen (using existing touchscreen input gestures), with the goal of studying pressure as a functional primitive, without implying any particular application. Two aspects of pressure input were studied: pressure targeting and maintaining pressure over time. The results demonstrated that, using a combination of non-dominant hand pressure and dominant-hand touchscreen taps, overall pressure targeting accuracy was high (93.07%). For more complicated dominant-hand input techniques (swipe, pinch and rotate gestures), pressure targeting accuracy was still high (86%). The results also demonstrated that participants were able to achieve high levels of pressure accuracy (90.3%) using dominant-hand swipe gestures (the simplest gesture in the study), suggesting that the ability to perform a simultaneous combination of pressure and touchscreen gesture input depends on the complexity of the dominant hand action involved. This thesis provides the first detailed study of the use of non-dominant hand pressure input to enable bimanual interaction techniques for tablet devices. It explores the use of pressure as a modality that can expand the range of available bimanual input techniques while the user is seated and comfortably holding the device, and offers designers guidelines for including pressure as a non-dominant hand input modality for bimanual interaction techniques, in a way that supplements existing dominant-hand action.
464

On the complexities of polymorphic stream equation systems, isomorphism of finitary inductive types, and higher homotopies in univalent universes

Sattler, Christian January 2015 (has links)
This thesis is composed of three separate parts. The first part deals with definability and productivity issues of equational systems defining polymorphic stream functions. The main result shows that such systems composed of only unary stream functions are complete with respect to specifying computable unary polymorphic stream functions. The second part deals with syntactic and semantic notions of isomorphism of finitary inductive types and associated decidability issues. We show that isomorphism of so-called guarded types is decidable in both the set model and the syntactic model, verifying that the answers coincide. The third part deals with homotopy levels of hierarchical univalent universes in homotopy type theory, showing that the n-th universe of n-types has truncation level strictly n+1.
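For readers unfamiliar with the objects of the first part, the following Haskell sketch (an illustration only, not the thesis's formal calculus) shows what a unary polymorphic stream function defined by a simple productive equation looks like.

```haskell
-- Streams and a unary polymorphic stream function, defined by a
-- guarded (productive) equation: each recursive call sits under a Cons.
data Stream a = Cons a (Stream a)

-- Keep every second element of the input stream; polymorphic in `a`.
evens :: Stream a -> Stream a
evens (Cons x (Cons _ xs)) = Cons x (evens xs)

-- An example input: the stream of naturals starting from n.
from :: Integer -> Stream Integer
from n = Cons n (from (n + 1))
```
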
465

Quotient types in type theory

Li, Nuo January 2015 (has links)
Martin-Löf's intuitionistic type theory (Type Theory) is a formal system that serves not only as a foundation of constructive mathematics but also as a dependently typed programming language. Dependent types are types that depend on values of other types. Type Theory is based on the Curry-Howard isomorphism, which relates computer programs with mathematical proofs so that we can do computer-aided formal reasoning and write certified programs in programming languages like Agda, Epigram, etc. Martin-Löf proposed two variants of Type Theory which are differentiated by the treatment of equality. In Intensional Type Theory, propositional equality, defined by identity types, does not imply definitional equality, and type checking is decidable. In Extensional Type Theory, propositional equality is identified with definitional equality, which makes type checking undecidable. Because of its good computational properties, Intensional Type Theory is more popular; however, it lacks some important extensional concepts such as functional extensionality and quotient types. This thesis is about quotient types. A quotient type is a new type whose equality is redefined by a given equivalence relation. However, in the usual formulation of Intensional Type Theory, there is no type former to create a quotient, and we lose canonicity if we add quotient types into Intensional Type Theory as axioms. In this thesis, we first investigate the expected syntax of quotient types and explain it with categorical notions. For quotients which can be represented as a setoid as well as defined as a set without a quotient type former, we propose an algebraic structure of quotients called definable quotients. It relates the setoid interpretation and the set definition via a normalisation function which returns a normal form (canonical choice) for each equivalence class. It can be seen as a simulation of quotient types, and it helps theorem proving because we can benefit from both representations. However, this approach cannot be used for all quotients: it seems that we cannot define a normalisation function for some quotients in Type Theory, e.g. the Cauchy reals and finite multisets. Quotient types are thus essential for the formalisation of mathematics and reasoning about programs. We then consider some models of Type Theory where types are interpreted as structured objects such as setoids, groupoids or weak omega-groupoids. In these models equalities are internalised into types, which means that it is possible to redefine equalities. We present an implementation of Altenkirch's setoid model and show that quotient types can be defined within this model. We also describe a new extension of Martin-Löf type theory called Homotopy Type Theory, where types are interpreted as weak omega-groupoids. It can be seen as a generalisation of the groupoid model which makes extensional concepts, including quotient types, available. We also introduce a syntactic encoding of weak omega-groupoids, which can be seen as a first step towards building a weak omega-groupoid model in Intensional Type Theory. All of these implementations were performed in the dependently typed programming language Agda, which is based on intensional Martin-Löf type theory.
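As a rough sketch, reconstructed here as an assumption rather than the thesis's exact definition, the "definable quotient" structure described above for a setoid (A, ~) and a set Q can be pictured as a quotient map, a choice of representatives, and compatibility laws:

```latex
% Hypothetical reconstruction of the "definable quotient" structure
% (field names and phrasing are an assumption, not the thesis's own).
\begin{align*}
  [\,\cdot\,] &: A \to Q                      && \text{quotient map}\\
  \mathrm{emb} &: Q \to A                     && \text{choice of representative}\\
  & a \sim b \;\Rightarrow\; [a] = [b]        && \text{soundness}\\
  & \mathrm{emb}\,[a] \sim a                  && \text{completeness}\\
  \mathrm{nf} &:= \mathrm{emb} \circ [\,\cdot\,] : A \to A
      && \text{normalisation: canonical representative of each class}
\end{align*}
```

A standard example fitting this pattern is the integers presented as pairs of naturals, with (a,b) ~ (c,d) iff a+d = b+c and the normal form reducing one component to zero; the Cauchy reals and finite multisets mentioned above are quotients for which no such normalisation function seems definable.
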
466

Truncation levels in homotopy type theory

Kraus, Nicolai January 2015 (has links)
Homotopy type theory (HoTT) is a branch of mathematics that combines and benefits from a variety of fields, most importantly homotopy theory, higher-dimensional category theory, and, of course, type theory. We present several original results in homotopy type theory related to the truncation level of types, a concept due to Voevodsky. To begin, we give a few simple criteria for determining whether a type is 0-truncated (a set), inspired by a well-known theorem by Hedberg, and these criteria are then generalised to arbitrary n. This naturally leads to a discussion of functions that are weakly constant, i.e. that map any two inputs to equal outputs. A weakly constant function does not in general factor through the propositional truncation of its domain, something one might expect if the function really did not depend on its input. However, the factorisation is always possible for weakly constant endofunctions, which makes it possible to define a propositional notion of anonymous existence. We additionally find a few other non-trivial special cases in which the factorisation works. Further, we present a couple of constructions which are only possible with the judgmental computation rule for the truncation. Among these is an invertibility puzzle that seemingly inverts the canonical map from Nat to the truncation of Nat, which is perhaps surprising as the latter type is equivalent to the unit type. A further result is the construction of strict n-types in Martin-Löf type theory with a hierarchy of univalent universes (and without higher inductive types), and a proof that the universe U(n) is not n-truncated. This solves a hitherto open problem of the 2012/13 special year program on Univalent Foundations at the Institute for Advanced Study (Princeton). The main result of this thesis is a generalised universal property of the propositional truncation, using a construction of coherently constant functions. We show that the type of such coherently constant functions between types A and B, which can be seen as the type of natural transformations between two diagrams over the simplex category without degeneracies (i.e. finite non-empty sets and strictly increasing functions), is equivalent to the type of functions with the truncation of A as domain and B as codomain. In the general case, the definition of natural transformations between such diagrams requires an infinite tower of conditions, which exists if the type theory has Reedy limits of diagrams over the ordinal omega. If B is an n-type for some given finite n, (non-trivial) Reedy limits are unnecessary, allowing us to construct functions from the truncation of A to B in homotopy type theory without further assumptions. To obtain these results, we develop some theory on equality diagrams, especially equality semi-simplicial types. In particular, we show that the semi-simplicial equality type over any type satisfies the Kan condition, which can be seen as the simplicial version of the fundamental result, by Lumsdaine and by van den Berg and Garner, that types are weak omega-groupoids. Finally, we present some results related to formalisations of infinite structures that seem to be impossible to express internally. To give an example, we show how the simplex category can be implemented so that the categorical laws hold strictly. We speculate that, in the presence of very dependent types, this makes the Reedy approach work for the famous open problem of defining semi-simplicial types.
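For context, and stated informally as standard background rather than as the thesis's own contributions: the "well-known theorem by Hedberg" referenced above, and the universe statement the abstract mentions, can be written as follows.

```latex
% Hedberg's theorem: decidable propositional equality implies being
% a set, i.e. a 0-truncated type.
\[
  \Big(\prod_{x,y:A} \big((x = y) + \neg(x = y)\big)\Big)
    \;\longrightarrow\; \mathsf{isSet}(A)
\]

% The universe result mentioned above: the n-th univalent universe is
% not n-truncated (for n = 0: a univalent universe is not a set).
\[
  \neg\, \mathsf{is}\text{-}n\text{-}\mathsf{type}(\mathcal{U}_n)
\]
```
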
467

Efficient hand orientation and pose estimation for uncalibrated cameras

Asad, M. January 2017 (has links)
We propose a staged probabilistic regression method that is capable of learning well from a number of variations within a dataset. The proposed method is based on a multi-layered Random Forest, where the first layer consists of a single marginalization-weights regressor and the second layer contains an ensemble of expert learners. The expert learners are trained in stages, where each stage involves training an expert learner and adding it to the intermediate model. After every stage, the intermediate model is evaluated to reveal a latent variable space defining a subset of the data that the model has difficulty learning from; this subset is used to train the next expert regressor. The posterior probabilities for each training sample are extracted from each expert regressor. These posterior probabilities are then used, along with a Kullback-Leibler divergence-based optimization method, to estimate the marginalization weights for each regressor. A marginalization-weights regressor is trained using CDF and the estimated marginalization weights. We show how our work extends to simultaneous hand orientation and pose inference. The proposed method outperforms the state-of-the-art for marginalization of multi-layered Random Forests and for hand orientation inference. Furthermore, we show that a method which simultaneously learns from hand orientation and pose outperforms pose classification alone, as it is better able to account for the variations in pose induced by viewpoint changes.
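As a small illustration of one ingredient of the pipeline described above (a sketch under assumptions, not the authors' implementation), the marginalization step can be read as forming a weighted mixture of the expert posteriors, with the weights fitted against a Kullback-Leibler divergence objective:

```haskell
-- Discrete KL divergence D(p || q), the kind of objective the abstract
-- describes for fitting marginalization weights over expert posteriors.
klDivergence :: [Double] -> [Double] -> Double
klDivergence p q = sum [ px * log (px / qx) | (px, qx) <- zip p q, px > 0 ]

-- Marginalization read as a weighted mixture of expert posterior
-- distributions (weights assumed non-negative and summing to one).
marginalize :: [Double] -> [[Double]] -> [Double]
marginalize weights posteriors =
  foldr1 (zipWith (+)) [ map (w *) post | (w, post) <- zip weights posteriors ]
```
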
468

Energy efficient and secure wireless communications for wireless sensor networks

Gong, P. January 2017 (has links)
This dissertation considers wireless sensor networks (WSNs) operating in severe environments where energy efficiency and security are important factors. The main aim of this research is to improve routing protocols in WSNs to ensure efficient energy usage and to protect against attacks (especially energy-draining attacks) targeting such networks. An enhancement of the existing AODV (Ad hoc On-Demand Distance Vector) routing protocol for energy efficiency, called AODV-Energy Harvesting Aware (AODVEHA), is proposed and evaluated. It not only inherits the advantages of AODV, which are well suited to ad hoc networks, but also makes use of the energy harvesting capability of sensor nodes in the network. In addition to the investigation of energy efficiency, another routing protocol called the Secure and Energy Aware Routing Protocol (ETARP), designed for energy efficiency and security in WSNs, is presented. The key part of ETARP is route selection based on utility theory, which is a novel approach to simultaneously factoring the energy efficiency and trustworthiness of routes into the routing protocol. Finally, this dissertation proposes a routing protocol to protect against a specific type of resource depletion attack called the Vampire attack. The proposed Resource-Conserving Protection against Energy Draining (RCPED) protocol is independent of cryptographic methods, which brings the advantages of lower energy cost and reduced hardware requirements. RCPED collaborates with existing routing protocols, detects abnormal signs of Vampire attacks, and determines the possible attackers. Routes are then discovered and selected on the basis of maximum priority, where the priority, which reflects the energy efficiency and safety level of a route, is calculated by means of the Analytic Hierarchy Process (AHP). The proposed analytic models for the aforementioned routing solutions are verified by simulations. Simulation results validate the improvements of the proposed routing approaches in terms of better energy efficiency and guaranteed security.
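The abstract does not give ETARP's actual utility function; purely as a generic illustration of utility-theoretic route selection (an assumption, not the thesis's formula), a route r might be scored by an expected utility that trades off trustworthiness against energy cost:

```latex
% Generic expected-utility route score (illustrative only, not ETARP's
% actual formula): T(r) is the estimated probability that route r
% delivers the packet (its trustworthiness), E(r) its estimated energy
% cost, and G the gain from a successful delivery.
\[
  EU(r) \;=\; T(r)\cdot G \;-\; E(r),
  \qquad
  r^{*} \;=\; \arg\max_{r \in \mathcal{R}} EU(r)
\]
```
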
469

Using directional change for information extraction in financial market data

Tao, Ran January 2018 (has links)
Directional change (DC) is a new concept for summarizing market dynamics. Instead of sampling the financial market at fixed intervals, as in traditional time series analysis, DC is data-driven: the price change itself dictates when a price is recorded. DC therefore provides a complementary way to extract information from data. The data sampled at irregular time intervals under DC allows us to observe features that may not be recognized under time series analysis. In this thesis we propose a new method for summarizing financial markets through the DC framework. Firstly, we define the vocabulary needed for a DC market summary. The vocabulary includes DC indicators and metrics: DC indicators are used to build a DC market summary for a single market, and DC metrics help us quantitatively measure the differences between two markets under the directional change approach. We demonstrate how such metrics can quantitatively measure the differences between different DC market summaries. Then, by studying real financial market data under DC, we demonstrate the practicability of DC market analysis as a complementary method to time series analysis in the analysis of financial markets.
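The standard directional-change definition underlying this framework confirms a new DC event whenever the price has reversed by at least a fixed threshold from the last extreme; the following Haskell sketch (threshold and data representation are illustrative assumptions, not the thesis's code) detects such events in a price series:

```haskell
-- Directional-change event detection: in an uptrend we track the highest
-- price seen so far; a fall of at least `theta` (e.g. 0.01 = 1%) from that
-- high confirms a downward DC event, and symmetrically for downtrends.
data Trend = Up | Down deriving (Eq, Show)

dcEvents :: Double -> [Double] -> [(Int, Trend)]
dcEvents _     []        = []
dcEvents theta (p0 : ps) = go Up p0 (zip [1 ..] ps)
  where
    go _ _ [] = []
    go trend extreme ((i, p) : rest)
      | trend == Up,   p <= extreme * (1 - theta) = (i, Down) : go Down p rest
      | trend == Down, p >= extreme * (1 + theta) = (i, Up)   : go Up   p rest
      | trend == Up,   p > extreme                = go Up   p rest
      | trend == Down, p < extreme                = go Down p rest
      | otherwise                                 = go trend extreme rest
```

For example, dcEvents 0.01 prices returns the indices at which 1% directional-change events are confirmed; DC indicators such as the time or price change between consecutive events can then be computed from this irregularly sampled summary.
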
470

QoS-aware joint power and subchannel allocation algorithms for wireless network virtualization

Wei, Junyi January 2017 (has links)
Wireless network virtualization (WNV) is a promising technology which aims to overcome the network redundancy problems of the current Internet. WNV involves the abstraction and sharing of resources among different parties, and it has been considered a long-term solution for the future Internet due to its flexibility and feasibility. WNV separates the traditional Internet service provider's role into the infrastructure provider (InP) and the service provider (SP). The InP owns all physical resources, while SPs borrow such resources to create their own virtual networks in order to provide services to end users. Because radio resources are finite, it is sensible to introduce WNV to improve resource efficiency. This thesis proposes three resource allocation algorithms for an orthogonal frequency division multiple access (OFDMA)-based WNV transmission system, aiming to improve resource utilization. The first algorithm aims to maximize the total throughput of the InP and the virtual network operators (VNOs) by means of subchannel allocation. The second is a power allocation algorithm which aims to improve the VNOs' energy efficiency; this algorithm also balances the competition across VNOs. Finally, a joint power and subchannel allocation algorithm is proposed, which seeks to maximize the overall transmission rate. Moreover, all of the above algorithms consider the InP's quality of service (QoS) requirement in terms of data rate. The evaluation results indicate that the joint resource allocation algorithm performs better than the others. Furthermore, the results can also serve as a guideline for WNV performance guarantees.
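As a generic sketch of the kind of optimisation these algorithms address (a textbook OFDMA formulation stated under assumptions, not the thesis's exact system model), the joint power and subchannel allocation problem can be written as a sum-rate maximisation subject to a QoS (minimum-rate) constraint:

```latex
% x_{k,n} in {0,1}: subchannel n assigned to virtual user k;
% p_{k,n}: transmit power; g_{k,n}: channel gain; B: subchannel bandwidth;
% N_0: noise power spectral density; R_k^{min}: minimum-rate (QoS) target.
\[
  \max_{\{x_{k,n}\},\,\{p_{k,n}\}}
    \sum_{k}\sum_{n} x_{k,n}\, B \log_2\!\Big(1 + \frac{p_{k,n}\, g_{k,n}}{N_0 B}\Big)
\]
\[
  \text{s.t.}\quad
  \sum_{k} x_{k,n} \le 1 \;\;\forall n, \qquad
  \sum_{n} x_{k,n}\, p_{k,n} \le P^{\max}_{k} \;\;\forall k, \qquad
  \sum_{n} x_{k,n}\, B \log_2\!\Big(1 + \frac{p_{k,n}\, g_{k,n}}{N_0 B}\Big) \ge R^{\min}_{k} \;\;\forall k.
\]
```
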
