61 |
Managing Telemetry Information in the New Era of Test and Evaluation / Gardner, Lee S., Hoaglund, Catharine M., Painter, Darcy S. / 10 1900 (has links)
International Telemetering Conference Proceedings / October 26-29, 1992 / Town and Country Hotel and Convention Center, San Diego, California / Terms like efficiency, quick response, and interoperability are becoming the bywords of the test and evaluation (T&E) community as the Defense Department tightens its corporate belt [1]. These changes mark the end of an era of manual processes and duplication of effort and the beginning of an era of cooperation, standards, and Total Quality Management (TQM). Managing the huge volume of telemetry information required to support flight test at the Air Force Flight Test Center (AFFTC) has required new paradigms and system development strategies. These new ideas have resulted in the Aircraft Information Management System (AIMS), a system designed to meet the challenges of a new era in T&E. This paper discusses the AIMS design and function as background for the deeper issue of effective, efficient management of telemetry setup information. The information history model used in AIMS is presented and discussed. In the process of developing standards for the AIMS, a methodology was discovered and successfully implemented for resolving information management issues within the framework of system development.
|
62 |
Antisocial Media: Information Mismanagement in Vietnam / Flaherty, Nora / 01 January 2017 (has links)
This paper examines the ineffective and unsustainable information management system employed by the Socialist Republic of Vietnam. The Vietnamese public faces an absence of information, due to strictly controlled state media and an unreliable unofficial media. Meanwhile, the current information management system does not allow the regime to meet its core interests of increasing government legitimacy and decreasing corruption. Increasing press freedom is also perceived as an existential threat, and so the government’s basic interests contradict each other. This paper examines the literature on information management in authoritarian and democratic regimes to determine the functions of the media in more free and less free contexts. It then examines the information systems of two successful case studies, China and Singapore, to locate applicable lessons for the Vietnam case. The paper finds that while Vietnam should adopt some elements of China’s media demobilization strategies, both the Vietnamese regime and the Vietnamese people are better served by moving towards a Singaporean-style social contract between the public and the regime.
|
63 |
Analysis of magnetoencephalographic data as a nonlinear dynamical system / Woon, Wei Lee / January 2002 (has links)
This thesis presents the results of an investigation into the merits of analysing magnetoencephalographic (MEG) data in the context of dynamical systems theory. MEG is the study both of the methods for measuring the minute magnetic flux variations at the scalp that result from neuro-electric activity in the neocortex, and of the techniques required to process and extract useful information from these measurements. As a result of its unique mode of action - directly measuring neuronal activity via the resulting magnetic field fluctuations - MEG possesses a number of useful qualities which could potentially make it a powerful addition to any brain researcher's arsenal. Unfortunately, MEG research has so far failed to fulfil its early promise, being hindered in its progress by a variety of factors. Conventionally, the analysis of MEG has been dominated by the search for activity in certain spectral bands - the so-called alpha, delta, and beta bands that are commonly referred to in both academic and lay publications. Other efforts have centred upon generating optimal fits of "equivalent current dipoles" that best explain the observed field distribution. Many of these approaches carry the implicit assumption that the dynamics which give rise to the observed time series are linear. This is despite a variety of reasons to suspect that nonlinearity might be present in MEG recordings. By using methods that allow for nonlinear dynamics, the research described in this thesis avoids these restrictive linearity assumptions. A crucial concept underpinning this project is the belief that MEG recordings are mere observations of the evolution of the true underlying state, which is unobservable and is assumed to reflect some abstract cognitive brain state. Further, we maintain that it is unreasonable to expect these processes to be adequately described in the traditional way: as a linear sum of a large number of frequency generators. One of the main objectives of this thesis is to show that much more effective and powerful analysis of MEG can be achieved if one assumes the presence of both linear and nonlinear characteristics from the outset. Our position is that the combined action of a relatively small number of these generators, coupled with external and dynamic noise sources, is more than sufficient to account for the complexity observed in the MEG recordings. Another problem that has plagued MEG researchers is the extremely low signal-to-noise ratio that is obtained. Because the magnetic flux variations resulting from actual cortical processes can be extremely minute, the measuring devices used in MEG are necessarily extremely sensitive. The unfortunate side-effect of this is that even commonplace phenomena such as the earth's geomagnetic field can easily swamp signals of interest. This problem is commonly addressed by averaging over a large number of recordings; however, averaging has a number of notable drawbacks. In particular, it is difficult to synchronise the high-frequency activity that might be of interest, and these signals are often cancelled out by the averaging process. Other problems that have been encountered are the high cost and low portability of state-of-the-art multichannel machines. As a result, the use of MEG has hitherto been restricted to large institutions able to afford the high costs associated with the procurement and maintenance of these machines.
In this project, we seek to address these issues by working almost exclusively with single channel, unaveraged MEG data. We demonstrate the applicability of a variety of methods originating from the fields of signal processing, dynamical systems, information theory and neural networks, to the analysis of MEG data. It is noteworthy that while modern signal processing tools such as independent component analysis, topographic maps and latent variable modelling have enjoyed extensive success in a variety of research areas from financial time series modelling to the analysis of sun spot activity, their use in MEG analysis has thus far been extremely limited. It is hoped that this work will help to remedy this oversight.
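As a rough illustration of the state-space reconstruction that the nonlinear-dynamics viewpoint implies (and not the thesis's actual analysis pipeline), the following Python sketch builds time-delay embedding vectors from a single-channel time series; the sampling rate, delay and embedding dimension are assumed values chosen only for the example.

```python
import numpy as np

def delay_embed(x, dim=5, tau=3):
    """Reconstruct state-space vectors from a scalar time series by
    time-delay embedding (the standard Takens-style construction)."""
    n = len(x) - (dim - 1) * tau
    if n <= 0:
        raise ValueError("time series too short for the chosen dim/tau")
    # Row t is the vector [x(t), x(t + tau), ..., x(t + (dim - 1) * tau)]
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# Synthetic single-channel signal standing in for an unaveraged MEG trace
rng = np.random.default_rng(0)
t = np.arange(5000) / 250.0                     # assumed 250 Hz sampling rate
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

states = delay_embed(signal, dim=5, tau=3)
print(states.shape)                             # (n_vectors, 5) reconstructed states
```

Nonlinear measures such as correlation dimension or Lyapunov exponents would then be estimated from these reconstructed vectors rather than from the raw scalar recording.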
|
64 |
Pragmatic algorithms for implementing geostatistics with large datasets / Ingram, Benjamin R. / January 2008 (has links)
With the ability to collect and store increasingly large datasets on modern computers comes the need to process the data in a way that is useful to a geostatistician or application scientist. Although the storage requirements scale only linearly with the number of observations in the dataset, the computational complexity of likelihood-based geostatistics scales quadratically in memory and cubically in time. Various methods have been proposed, and are extensively used, in an attempt to overcome these complexity issues. This thesis introduces a number of principled techniques for treating large datasets, with an emphasis on three main areas: reduced-complexity covariance matrices, sparsity in the covariance matrix, and parallel algorithms for distributed computation. These techniques are presented individually, but it is also shown how they can be combined to further improve computational efficiency.
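As a hedged sketch of where these costs arise (not code from the thesis), the Python fragment below evaluates a standard zero-mean Gaussian-process (kriging) log-likelihood with a squared-exponential covariance; the dense n x n covariance matrix and its Cholesky factorisation are the quadratic-memory and cubic-time bottlenecks that reduced-complexity, sparse and parallel methods aim to avoid. The kernel choice and parameter values are assumptions made only for illustration.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve
from scipy.spatial.distance import cdist

def gp_log_likelihood(X, y, lengthscale=1.0, variance=1.0, noise=0.1):
    """Zero-mean Gaussian-process (kriging) log-likelihood.
    Building K needs O(n^2) memory; the Cholesky factorisation needs O(n^3) time."""
    d = cdist(X, X)                                        # n x n distance matrix
    K = variance * np.exp(-0.5 * (d / lengthscale) ** 2)   # squared-exponential kernel
    K[np.diag_indices_from(K)] += noise                    # nugget / noise variance
    L = cho_factor(K, lower=True)
    alpha = cho_solve(L, y)
    log_det = 2.0 * np.sum(np.log(np.diag(L[0])))
    n = len(y)
    return -0.5 * (y @ alpha + log_det + n * np.log(2.0 * np.pi))

# Small synthetic example; real geostatistical datasets are far larger
rng = np.random.default_rng(1)
X = rng.uniform(0, 10, size=(500, 2))                      # 500 spatial locations
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(500)
print(gp_log_likelihood(X, y))
```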
|
65 |
Non-linear hierarchical visualisation / Sun, Yi / January 2002 (has links)
This thesis applies a hierarchical latent trait model system to a large quantity of data. The motivation was the lack of viable approaches for analysing high-throughput screening datasets, which may include thousands of high-dimensional data points. We believe that a latent variable model with a non-linear mapping from the latent space to the data space is a preferred choice for visualising a complex high-dimensional data set. As a type of latent variable model, the latent trait model (LTM) can deal with either continuous or discrete data, which makes it particularly useful in this domain. In addition, with the aid of differential geometry, we can visualise the distribution of the data through magnification factor and curvature plots. Rather than obtaining the useful information from just a single plot, a hierarchical LTM arranges a set of LTMs and their corresponding plots in a tree structure. We model the whole data set with an LTM at the top level, which is broken down into clusters at deeper levels of the hierarchy. In this manner, refined visualisation plots can be displayed at deeper levels and sub-clusters may be found. The hierarchy of LTMs is trained using the expectation-maximisation (EM) algorithm to maximise its likelihood with respect to the data sample. Training proceeds interactively in a recursive, top-down fashion: the user subjectively identifies interesting regions on the visualisation plot that they would like to model in greater detail. At each stage of hierarchical LTM construction, the EM algorithm alternates between the E-step and the M-step. Another problem that can occur when visualising a large data set is that there may be significant overlap between data clusters, making it very difficult for the user to judge where the centres of regions of interest should be placed. We address this problem by employing the minimum message length technique, which can help the user decide the optimal structure of the model.
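The minimal Python sketch below is not the latent trait model itself; it only illustrates the E-step/M-step alternation referred to above, using a one-dimensional Gaussian mixture as a stand-in. In the hierarchical LTM the same alternation is applied to the latent-space model trained at each node of the tree.

```python
import numpy as np

def em_gmm(x, k=2, iters=50, seed=0):
    """Minimal EM for a one-dimensional Gaussian mixture, illustrating
    the E-step / M-step alternation (not the latent trait model)."""
    rng = np.random.default_rng(seed)
    mu = rng.choice(x, k)                       # initial means drawn from the data
    var = np.full(k, x.var())
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities (posterior probability of each component)
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        resp = pi * dens
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters to increase the expected log-likelihood
        nk = resp.sum(axis=0)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
        pi = nk / len(x)
    return pi, mu, var

rng = np.random.default_rng(4)
x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 0.5, 200)])
print(em_gmm(x))
```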
|
66 |
Modelling nonlinear stochastic dynamics in financial time series / Lesch, Ragnar H. / January 2000 (has links)
For analysing financial time series, two main opposing viewpoints exist: either capital markets are completely stochastic, and therefore prices follow a random walk, or they are deterministic and consequently predictable. For each of these views a great variety of tools exists with which one can attempt to confirm the corresponding hypothesis. Unfortunately, these methods are not well suited to data characterised in part by both paradigms. This thesis investigates the two approaches in order to model the behaviour of financial time series. In the deterministic framework, methods are used to characterise the dimensionality of embedded financial data. The stochastic approach includes an estimation of the unconditional and conditional return distributions using parametric, non-parametric and semi-parametric density estimation techniques. Finally, it is shown how elements from these two approaches can be combined to achieve a more realistic model for financial time series.
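As an illustrative sketch of the density-estimation side (synthetic data, not results from the thesis), the fragment below contrasts a parametric Gaussian fit of an unconditional return distribution with a non-parametric kernel density estimate of the same distribution.

```python
import numpy as np
from scipy.stats import gaussian_kde, norm

# Synthetic daily log-returns standing in for a real price series
rng = np.random.default_rng(2)
returns = 0.01 * rng.standard_t(df=4, size=2000)   # heavy-tailed, as returns often are

# Parametric estimate: a Gaussian fitted to the unconditional return distribution
mu, sigma = returns.mean(), returns.std(ddof=1)

# Non-parametric estimate: Gaussian kernel density estimate of the same distribution
kde = gaussian_kde(returns)

print("Gaussian density at zero return:", norm.pdf(0.0, mu, sigma))
print("KDE density at zero return:     ", kde(0.0)[0])
```

The heavier tails of the kernel estimate relative to the fitted Gaussian are the kind of feature that motivates the semi-parametric treatment of return distributions.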
|
67 |
Investigating viscous fluid flow in an internal mixer using computational fluid dynamics / Harries, Alun M. / January 2000 (has links)
This thesis presents an effective methodology for generating simulations that can be used to increase understanding of viscous fluid processing equipment and to aid in its development, design and optimisation. The Hampden RAPRA Torque Rheometer internal batch twin-rotor mixer has been simulated with a view to establishing model accuracy, limitations, practicalities and uses. As this research progressed, via several 'snap-shot' analyses of different rotor configurations using the commercial code Polyflow, it became evident that the model was of some worth and that its predictions are in good agreement with the validation experiments; however, several major restrictions were identified. These included poor element form, high man-hour requirements for the construction of each geometry, and the absence of the transient term in these models. All, or at least some, of these limitations apply to the numerous attempts by other researchers to model internal mixers, and it was clear that there was no generally accepted methodology for producing a practical, adequately validated three-dimensional model. This research, unlike others, presents a fully complex three-dimensional, transient, non-isothermal, generalised non-Newtonian simulation with wall slip which overcomes these limitations using unmatched gridding and sliding-mesh technology adapted from CFX codes. This method yields good element form and, since only one geometry has to be constructed to represent the entire rotor cycle, is extremely beneficial for detailed flow-field analysis when used in conjunction with user-defined programmes and automatic geometry parameterisation (AGP), and it improves accuracy for investigating equipment design and operating conditions. Model validation has been identified as an area neglected by other researchers in this field, especially for time-dependent geometries, and it has been rigorously pursued here in terms of qualitative and quantitative velocity-vector analysis of the isothermal, full-fill mixing of generalised non-Newtonian fluids, as well as torque comparison, with a relatively high degree of success. This indicates that CFD models of this type can be accurate, and perhaps have not been validated to this extent previously because of the inherent difficulties arising from most real processes.
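As a small illustrative aside (parameter values are assumed, not taken from the thesis), a generalised non-Newtonian constitutive law of the kind used in such simulations can be evaluated directly; the sketch below uses the Carreau model to show how the apparent viscosity falls with shear rate.

```python
import numpy as np

def carreau_viscosity(shear_rate, eta0=1.0e4, eta_inf=1.0, lam=1.0, n=0.4):
    """Carreau model, a common generalised non-Newtonian viscosity law:
    eta = eta_inf + (eta0 - eta_inf) * (1 + (lam * gamma_dot)^2)^((n - 1) / 2)."""
    return eta_inf + (eta0 - eta_inf) * (1.0 + (lam * shear_rate) ** 2) ** ((n - 1.0) / 2.0)

for gamma_dot in np.logspace(-2, 3, 6):           # shear rates in 1/s
    print(f"gamma_dot = {gamma_dot:8.2f} 1/s  ->  eta = {carreau_viscosity(gamma_dot):10.2f} Pa.s")
```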
|
68 |
Digital image watermarking / Bounkong, Stephane / January 2004 (has links)
In recent years, interest in digital watermarking has grown significantly. Indeed, the use of digital watermarking techniques is seen as a promising means of protecting the intellectual property rights of digital data and of ensuring the authentication of digital data. Thus, a significant research effort has been devoted to the study of practical watermarking systems, in particular for digital images. In this thesis, a practical and principled approach to the problem is adopted. Several aspects of practical watermarking schemes are investigated. First, a power-constraint formulation of the problem is presented. Then, a new analysis of quantisation effects on the information rate of digital watermarking schemes is proposed and compared with other approaches suggested in the literature. Subsequently, a new information embedding technique, based on quantisation, is put forward and its performance evaluated. Finally, the influence of image data representation on the performance of practical schemes is studied, along with a new representation based on independent component analysis.
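A minimal sketch of a quantisation-based embedding of the general kind mentioned above is given below; it uses scalar quantisation index modulation, which is a standard illustration and not necessarily the scheme developed in the thesis, and the step size is an assumed example value.

```python
import numpy as np

def qim_embed(coeff, bit, delta=8.0):
    """Embed one bit by quantising a host coefficient (e.g. a DCT or wavelet
    coefficient) onto one of two lattices offset by delta / 2."""
    offset = 0.0 if bit == 0 else delta / 2.0
    return delta * np.round((coeff - offset) / delta) + offset

def qim_decode(coeff, delta=8.0):
    """Decode by checking which lattice the received coefficient lies closest to."""
    d0 = abs(coeff - qim_embed(coeff, 0, delta))
    d1 = abs(coeff - qim_embed(coeff, 1, delta))
    return 0 if d0 <= d1 else 1

host = 123.7                                   # host coefficient (illustrative value)
marked = qim_embed(host, bit=1)
noisy = marked + np.random.uniform(-2.0, 2.0)  # distortion smaller than delta / 4
print(marked, qim_decode(noisy))               # the embedded bit is recovered
```

Decoding simply asks which of the two interleaved quantisation lattices the received coefficient lies closest to, so the embedded bit survives distortions smaller than roughly a quarter of the step size.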
|
69 |
Determinants of Active Pursuit of Kidney Donation: Applying the Theory of Motivated Information Management / West, Stacy M / 01 January 2016 (has links)
End stage renal disease (ESRD) is a growing epidemic impacting the United States. While the optimal treatment for ESRD is renal replacement, barriers exist making this treatment difficult and sometimes impossible for patients to pursue. One potential solution to existing barriers is to encourage patients to actively seek living donors. This is an inherently communicative and social process. The Theory of Motivated Information Management (TMIM) offers a framework for understanding factors that contribute to patients’ conversations about transplantation with their social networks. It is also possible that Patient Empowerment can add to this model, and inform future patient education.
Specific variables related to the TMIM and Patient Empowerment are analyzed in bivariate and logistic regression analyses.
Variables that were significant in bivariate analysis did not rise to the level of significance when included in a full logistic regression analysis. Study results and outcomes suggest that further research is warranted.
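As a purely hypothetical sketch of the kind of full logistic regression analysis described above (the predictor names and data are invented for illustration and are not the study's measures):

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical illustration only: variables stand in for TMIM- and
# empowerment-style scores; they are not the study's actual measures.
rng = np.random.default_rng(3)
n = 200
outcome_expectancy = rng.normal(size=n)   # stand-in for an outcome expectancy score
efficacy = rng.normal(size=n)             # stand-in for an efficacy assessment score
empowerment = rng.normal(size=n)          # stand-in for a patient empowerment score
logit = 0.8 * outcome_expectancy + 0.5 * efficacy + 0.3 * empowerment
pursued = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))   # 1 = actively pursued donation

X = sm.add_constant(np.column_stack([outcome_expectancy, efficacy, empowerment]))
model = sm.Logit(pursued, X).fit(disp=0)  # full logistic regression on all predictors
print(model.summary())
```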
|
70 |
Collecting and interpreting human skulls and hair in late Nineteenth Century London : passing fables & comparative readings at The Wildgoose Memorial Library : an artist's response to the DCMS Guidance for the Care of Human Remains in Museums (2005) / Wildgoose, Jane / January 2015 (has links)
This practice-based doctoral research project is an artist’s response to the ‘unique status’ ascribed to human remains in the DCMS Guidance for the Care of Human Remains in Museums (2005): as objects, in scientific, medical/anthropological contexts, or as subjects, which may be understood in associative, symbolic and/or emotional ways. It is concerned with the circumstances in which human remains were collected and interpreted in the past, and with the legacies of historical practice regarding their presence in museum collections today. Overall, it aims to contribute to public engagement concerning these issues. Taking the form of a Comparative Study, the project focuses on the late nineteenth century, when human skulls were collected in great numbers for comparative anatomical and anthropological research, while in wider society the fashion for incorporating human hair into mourning artefacts became ubiquitous following the death of Prince Albert in 1861. William Henry Flower’s craniological work at the Hunterian Museum of the Royal College of Surgeons of England, where he amassed a vast collection of human skulls that he interpreted according to theories of racial “type” (in which hair was identified as an important distinguishing characteristic), is investigated, and its legacy reviewed. His scientific objectification of human remains is presented for comparison, in parallel, with the emotional and associative significance popularly attributed to mourning hairwork, evidenced in accompanying documentation, contemporary diaries, literature, and hairworkers’ manuals. Combining inter-related historical, archival- and object-based research with subjective and intuitive elements in my practice, a synthesis of the artistic and academic is developed in the production of a new “archive” of The Wildgoose Memorial Library - my collection of found and made objects, photographs, documents and books that takes a central place in my practice. Victorian hairworking skills are researched, and a new piece of commemorative hairwork devised and made as the focus for a site-specific presentation of this archive at the Crypt Gallery, St. Pancras, in which a new approach to public engagement is implemented and tested, concerning the legacy and special status of human remains in museum collections today.
|