411

Population based spatio-temporal probabilistic modelling of fMRI data

Alowadi, Nahed January 2018 (has links)
High-dimensional functional magnetic resonance imaging (fMRI) data is characterized by complex spatial and temporal patterns related to neural activation. Mixture-based Bayesian spatio-temporal modelling is able to extract spatio-temporal components representing distinct haemodynamic response and activation patterns. A recent development of such an approach to fMRI data analysis is the so-called spatially regularized mixture model of hidden process models (SMM-HPM). SMM-HPM can be used to reduce the four-dimensional fMRI data of a pre-determined region of interest (ROI) to a small number of spatio-temporal prototypes that sufficiently represent the spatio-temporal features of the underlying neural activation. Summary statistics derived from these features can be interpreted as quantifications of (1) the spatial extent of sub-ROI activation patterns, (2) how fast the brain responds to external stimuli, and (3) the heterogeneity within single ROIs. This thesis aims to extend the single-subject SMM-HPM to a multi-subject SMM-HPM so that such features can be extracted at group level, enabling more robust conclusions to be drawn.
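For intuition only, a minimal sketch (not the SMM-HPM developed in the thesis) of reducing ROI voxel time courses to a few prototypes with an off-the-shelf Gaussian mixture, then reading off summaries of the kind described above; the data, component count and summary definitions are hypothetical stand-ins:

    import numpy as np
    from sklearn.mixture import GaussianMixture

    # Hypothetical ROI: 500 voxel time courses of 40 scans each,
    # standing in for the pre-processed 4D fMRI data of one subject.
    rng = np.random.default_rng(0)
    timecourses = rng.standard_normal((500, 40))

    # Cluster voxel time courses into a small number of prototypes.
    k = 3
    gmm = GaussianMixture(n_components=k, random_state=0).fit(timecourses)
    labels = gmm.predict(timecourses)

    for c in range(k):
        prototype = gmm.means_[c]                 # prototype response shape
        extent = np.mean(labels == c)             # fraction of ROI voxels in this component
        time_to_peak = int(np.argmax(prototype))  # crude speed-of-response proxy
        print(f"component {c}: extent={extent:.2f}, time-to-peak scan={time_to_peak}")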
412

On some multivariate control charts

Alfarag, Fadhil January 2016 (has links)
To maintain the quality of a product or to improve the reliability of a process, all industries need to monitor several parameters of their production process. Control charts are visualization tools for monitoring processes statistically. In this work, we propose a few control charting schemes to monitor several characteristics of a process simultaneously and to detect when it goes out of control. Our objective is to reduce false alarms (where the scheme detects a problem when there is none) as well as to quickly detect genuine out-of-control situations. The novelty of the proposed schemes is that they do not depend on the commonly assumed normal distribution of the process variables and are applicable to a much wider range of data distributions. First, we give a detailed literature review of some univariate and multivariate control charts. We perform a comparison study of commonly used multivariate control charts when the underlying distribution is not normal and show that they perform poorly, giving a very high false alarm rate. Next, we propose some nonparametric multivariate control charts based on the lengths of multivariate rank vectors. The ideas are similar to those proposed by Liu (1995); however, we show that our proposed methods are computationally simpler in any dimension. We also propose further multivariate versions of Shewhart-type, CUSUM and EWMA control charts based on spatial sign vectors and signed-rank vectors, and discuss several design parameters in the construction of these charts. None of the proposed charts depends on an assumed underlying distribution or on estimation of distributional parameters.
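As background only, a minimal sketch of the classical Hotelling T^2 chart, the normal-theory multivariate chart whose non-normal performance the thesis critiques (not the proposed sign- and rank-based charts); the data, function name and empirical control limit are illustrative assumptions:

    import numpy as np

    def hotelling_t2(phase1, phase2):
        # T^2 statistic for each Phase II observation, using the
        # Phase I sample mean and covariance as in-control estimates.
        mu = phase1.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(phase1, rowvar=False))
        diff = phase2 - mu
        return np.einsum('ij,jk,ik->i', diff, cov_inv, diff)

    rng = np.random.default_rng(1)
    cov = [[1.0, 0.5], [0.5, 1.0]]
    phase1 = rng.multivariate_normal([0.0, 0.0], cov, size=200)  # in-control data
    phase2 = rng.multivariate_normal([1.5, 0.0], cov, size=50)   # mean-shifted process

    t2 = hotelling_t2(phase1, phase2)
    ucl = np.quantile(hotelling_t2(phase1, phase1), 0.995)       # empirical upper control limit
    print("out-of-control signals:", int(np.sum(t2 > ucl)))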
413

Nanomaterial sensing : integrating MEMS technology and self-assembled monolayers

Rushdi, Abduljabbar Ibrahim Rasheed January 2018 (has links)
The integration of self-assembled monolayers (SAMs) into microelectromechanical system (MEMS) devices is introduced in Chapter 1. Chapter 2 is concerned with the specific immobilization of NeutrAvidin on pure and mixed SAMs of biotinylated tri(ethylene glycol) undecanethiol (BUT, the biotin-containing sensor element for NeutrAvidin) and tetra(ethylene glycol) (TEG, the spacer), which were deposited on Au surfaces. Contact angle measurements, ellipsometry and XPS were used to characterize the composition of these SAMs, and SPR and QCM were used to study the adsorption behaviour of NeutrAvidin on the pure and mixed SAMs. Chapter 3 describes in detail the optimum conditions for obtaining monolayers of 11-amino-1-undecanethiol hydrochloride (Alk-amine) and 4-aminothiophenol (Ar-amine), deposited on an Au surface using an ethanolic solution of triethylamine (TEA), and how to reduce the contamination associated with the deposition of the two amines. Finally, ellipsometry, contact angle and XPS were used to characterise the two amine SAMs. Chapter 4 describes the optimum conditions for gold nanoparticle (G-NP) deposition, at different pHs, on the Alk-amine and Ar-amine terminated SAMs described in Chapter 3. AFM and QCM confirmed that the optimum deposition of G-NPs occurred at pH 5 for both amine SAMs, and that deposition on the Alk-amine SAM is much higher than on the Ar-amine SAM. The Alk-amine SAM was therefore chosen for chemically modifying the surface of a micro-paddle. After the modification, the paddle was used to detect the deposited mass of G-NPs, and SEM was used to confirm the dispersity of the G-NP monolayer.
414

Implementation of data segmentation in a GKS based graphics system

May, Rebecca Edwards January 2010 (has links)
Typescript (photocopy). / Digitized by Kansas Correctional Industries
415

Enhancing programmability for adaptive resource management in next generation data centre networks

Jouet, Simon January 2017 (has links)
Recently, Data Centre (DC) infrastructures have been growing rapidly to support a wide range of emerging services, and provide the underlying connectivity and compute resources that facilitate the "*-as-a-Service" model. This has led to the deployment of a multitude of services multiplexed over a few very large-scale centralised infrastructures. In order to cope with the ebb and flow of users, services and traffic, infrastructures have been provisioned for peak demand, resulting in low average resource utilisation. This overprovisioning has been further motivated by the complexity of predicting traffic demands over diverse timescales and the stringent economic impact of outages. At the same time, the emergence of Software Defined Networking (SDN) is offering new means to monitor and manage the network infrastructure to address this underutilisation. This dissertation aims to show how measurement-based resource management can improve performance and resource utilisation by adaptively tuning the infrastructure to changing operating conditions. To achieve this dynamicity, the infrastructure must be able to centrally monitor, notify and react based on the current operating state, from per-packet dynamics to long-standing traffic trends and topological changes. However, the management and orchestration abilities of current SDN realisations are too limited and must evolve for next-generation networks. The current focus has been on logically centralising the routing and forwarding decisions; however, in order to achieve the necessary fine-grained insight, the data plane of the individual devices must be programmable to collect and disseminate the metrics of interest. The results of this work demonstrate that a logically centralised controller can dynamically collect and measure network operating metrics to subsequently compute and disseminate fine-tuned, environment-specific settings. They show how this approach can prevent TCP incast throughput collapse and improve TCP performance by an order of magnitude for partition-aggregate traffic patterns. Furthermore, the paradigm is generalised to show the benefits for other services widely used in DCs, such as routing, telemetry and security.
416

The design and implementation of PRONTO processor for natural text organization

Anderson, Steven Michael January 2010 (has links)
Typescript (photocopy). / Digitized by Kansas Correctional Industries
417

Wave function calculations on small molecules

Hollis, Peter Clement January 1967 (has links)
The ab initio calculation of wave functions for small polyatomic molecules is now feasible but is time-consuming, expensive and limited in accuracy. The most frequently used approach is that of molecular orbital (MO) theory, using the self-consistent field (SCF) method with a linear combination of atomic orbitals (LCAO) approximation to the MOs. On the other hand, semi-empirical methods have been widely used and have yielded extremely interesting results in spite of the fact that they have often been based on flimsy theoretical foundations. The first and best-known calculations of this type were of course initiated by Hückel and refer to the π-electrons of conjugated molecules. Later, semi-empirical SCF LCAO MO calculations, in which electron-interaction effects are more properly taken into account, were performed on π-electron systems. Then the Hückel-type LCAO MO method, and later the approximate SCF MO scheme, were applied to more general systems. In this work a new semi-empirical SCF scheme is presented, in which an attempt is made to produce a method as close to ab initio procedures as possible. A particular basis of orthogonalised orbitals is chosen to render valid, with a reasonable degree of accuracy, the integral approximations made. The use of a particular set of integral approximations allows the simulation of the results of non-empirical calculations. The semi-empirical calculations described in this work are less empirical than any previously performed on more general systems; this allows the scheme to be built on a sounder basis than other semi-empirical schemes which include all electrons. Results are presented to show that, with a relatively simple method of estimating the larger two-electron integrals over an orthogonal basis, reasonable results can be obtained for small polyatomic molecules. As well as giving good results, the method is used as a basis for examining the foundations of more empirical calculations. Two approaches are used to obtain wave functions: the SCF LCAO MO method and the self-consistent group function (SCGF) method. It is found that the SCGF method has several advantages over the ordinary SCF LCAO MO method in the performance of semi-empirical calculations.
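For reference, the standard closed-shell SCF LCAO MO (Roothaan-Hall) equations from which both ab initio and semi-empirical schemes start, quoted here as textbook background rather than as the particular integral approximations of this thesis:

    \mathbf{F}\,\mathbf{C} = \mathbf{S}\,\mathbf{C}\,\boldsymbol{\varepsilon},
    \qquad
    F_{\mu\nu} = h_{\mu\nu} + \sum_{\lambda\sigma} P_{\lambda\sigma}
                 \left[ (\mu\nu|\lambda\sigma) - \tfrac{1}{2}(\mu\lambda|\sigma\nu) \right],
    \qquad
    P_{\lambda\sigma} = 2 \sum_{i}^{\mathrm{occ}} C_{\lambda i}\, C_{\sigma i},

where S is the atomic-orbital overlap matrix, h the one-electron (core) Hamiltonian and (μν|λσ) the two-electron repulsion integrals; semi-empirical schemes of the kind described above approximate or neglect many of these integrals, often in an orthogonalised basis in which S reduces to the identity.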
418

An ontology-driven approach to personalised mHealth application development

Campbell, Daniel George January 2018 (has links)
Mobile devices, when provisioned with intuitive mobile healthcare (mHealth) applications, provide a powerful platform that has been recognised to have made a significant impact on healthcare delivery. The popularity of mHealth applications is rapidly expanding amongst consumers, and there is a continuous demand to improve their effectiveness. Personalisation has already been acknowledged by the healthcare industry as a mechanism to improve healthcare delivery, recognising that each consumer is unique. Yet a typical mHealth application is designed to cater for the needs of large target demographics and is frequently developed without the necessary knowledge and expertise of healthcare providers. As a result, such applications often fail to meet the consumer's specific healthcare requirements. Since healthcare professionals understand the specific healthcare requirements of a consumer, they are best suited to developing personalised mobile healthcare applications. However, they do not possess the familiarity, skills and knowledge to address the challenges associated with mobile application development. Therefore, this research addresses the need for a new approach to personalised mHealth application development in the form of an extensible ontology-driven framework that enables healthcare professionals to create personalised mHealth applications for healthcare consumers. This research first explored personalisation and the challenges of personalised mobile application development, existing approaches and related work. This was followed by a detailed investigation into the various health-related functions available in mHealth applications designed for healthcare consumers, which led to the creation of the mHealth Application Function Taxonomy. The next phase presents the theoretical design and development considerations of the Personalised Mobile Application Development (PMAD) ontology. The PMAD ontology encapsulates key knowledge associated with the development of personalised mHealth applications, which can be operationalised to compensate for the missing domain expertise during the personalised mHealth application development process. The final contribution of this research describes and defines the approach and components of the PMAD ontology-driven framework, which addresses the limitations of existing end-user programming solutions and enables healthcare professionals to create personalised mHealth applications for healthcare consumers.
419

Bayesian inference for continuous time Markov chains

Alharbi, Randa January 2019 (has links)
Continuous time Markov chains (CTMCs) are a flexible class of stochastic models that have been employed in a wide range of applications, from the timing of computer protocols, through reliability analysis in engineering, to models of biochemical networks in molecular biology. These models are defined over a discrete state space with continuous-time transitions between the states. Extensive work has historically been performed to enable convenient and flexible definition, simulation and analysis of continuous time Markov chains. This thesis considers the problem of Bayesian parameter inference for these models and investigates computational methodologies to enable such inference. Bayesian inference over continuous time Markov chains is particularly challenging because the likelihood cannot be evaluated in closed form. To overcome the statistical problems associated with evaluation of the likelihood, advanced Monte Carlo algorithms have been used to enable Bayesian inference without explicit evaluation of the likelihood. An additional class of approximation methods, known as approximate Bayesian computation (ABC), has been suggested to handle such inference problems. Novel Markov chain Monte Carlo (MCMC) approaches were recently proposed to allow exact inference. The contribution of this thesis lies in discussing the techniques and challenges involved in implementing these inference methods and in performing an extensive comparison of the approaches on two case studies in systems biology. We investigate how the algorithms can be designed and tuned to work on CTMC models and to achieve an accurate estimate of the posteriors at reasonable computational cost. Through this comparison, we investigate how to avoid some practical issues with accuracy and computational cost, for example by selecting an optimal proposal distribution and introducing a resampling step within the sequential Monte Carlo method. Within the implementation of the ABC methods, we investigate using an adaptive tolerance schedule to maximise the efficiency of the algorithm and to reduce the computational cost.
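As an illustration of the simulation-based flavour of these methods, a minimal ABC rejection sketch for a toy immigration-death CTMC, simulated exactly with the Gillespie algorithm; the model, rates, prior, summary statistic and tolerance are illustrative assumptions, not the thesis's systems-biology case studies or its adaptive-tolerance scheme:

    import numpy as np

    rng = np.random.default_rng(2)

    def gillespie_immigration_death(birth, death, x0=0, t_end=10.0):
        # Exact stochastic simulation of an immigration-death CTMC;
        # returns the state (population count) at time t_end.
        t, x = 0.0, x0
        while True:
            rates = np.array([birth, death * x])
            total = rates.sum()
            if total == 0.0:
                return x
            t += rng.exponential(1.0 / total)
            if t > t_end:
                return x
            x += 1 if rng.random() < rates[0] / total else -1

    # "Observed" data simulated with a known death rate (hypothetical ground truth).
    true_death = 0.5
    observed = np.array([gillespie_immigration_death(2.0, true_death) for _ in range(20)])

    # ABC rejection: keep prior draws whose simulated summary is close to the data.
    accepted, tol = [], 1.0
    for _ in range(2000):
        d = rng.uniform(0.05, 2.0)                    # prior draw for the death rate
        sim = np.array([gillespie_immigration_death(2.0, d) for _ in range(20)])
        if abs(sim.mean() - observed.mean()) < tol:   # distance on a summary statistic
            accepted.append(d)

    print(f"approximate posterior mean of death rate: {np.mean(accepted):.2f} "
          f"(true value {true_death})")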
420

Parallel communications in ATM networks. / CUHK electronic theses & dissertations collection

January 1997 (has links)
by Ding Quan-Long. / Thesis (Ph.D.)--Chinese University of Hong Kong, 1997. / Includes bibliographical references (p. 135-141). / Electronic reproduction. Hong Kong : Chinese University of Hong Kong, [2012] System requirements: Adobe Acrobat Reader. Available via World Wide Web. / Mode of access: World Wide Web.
