  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
941

Adaptation of colour perception through dynamic ICC profile modification : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Computer Science at Massey University, Albany (Auckland), New Zealand

Kloss, Guy Kristoffer January 2010 (has links)
Digital colour cameras are dramatically falling in price, making them affordable for ubiquitous appliances in many applications. Changes in colour perception under changing light conditions induce errors that may escape a user's awareness. Colour constancy algorithms are based on inferring light properties (usually the white point) to correct colour. Other approaches that use more data for colour correction, such as (ICC based) colour management, characterise a capturing device under given conditions through an input device profile. This profile can be applied to correct for deviating colour perception, but it is only valid for the specific conditions at the time of the characterisation and fails with changes in light. This research presents a solution to the problem of long-time observations with changes in the scene's illumination for common natural (overcast or clear, blue sky) and artificial sources (incandescent or fluorescent lamps). Colour measurements for colour-based reasoning need to be represented in a robustly defined way. One such suitable and well-defined description is given by the CIE LAB colour space, a device-independent, visually linearised colour description. Colour transformations using ICC profiles are also based on CIE colour descriptions. Therefore, the corrective colour processing has also been based on ICC colour management. To verify the viability of CIE LAB based corrective colour processing, colour constancy algorithms (White Patch Retinex and Grey World Assumption) have been modified to operate on L*a*b* colour tuples. Results were compared visually and numerically (using colour indexing) against those using the same algorithms operating on RGB colour tuples. We can take advantage of the fact that we are dealing with image streams over time, adding another dimension usable for analysis. A solution to the problem of slowly changing light conditions in scenes with a static camera perspective is presented. It takes advantage of the small (frame-to-frame) changes in the appearance of colour within the scene over time. Reoccurring objects or (background) areas of the scene are tracked to gather data points for an analysis. As a result, a suitable colour space distortion model has been devised through a first-order Taylor approximation (affine transformation). By performing a multidimensional linear regression analysis on the tracked data points, parameterisations for the affine transformations were derived. Finally, the device profile is updated by amalgamating the corrections from the model into the ICC profile for a single, comprehensive transformation. Subsequent applications of the ICC colour profiles are very fast and can be used in real time at the camera's capturing frame rate (for current normal web cameras and low-spec desktop computers). As light conditions usually change on a much slower time scale than the capturing rate of a camera, the computationally expensive profile adaptation generally proved usable for many frames. The goal was to set out and find a solution for consistent colour capturing using digital cameras that is capable of coping with changing light conditions. Theoretical backgrounds and strategies for such a system have been devised and implemented successfully.
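The affine colour-space distortion model described in this abstract can be illustrated with a minimal sketch: given matched L*a*b* tuples from tracked scene regions at characterisation time and under the current illumination, a multidimensional linear regression recovers the affine transformation (a 3x3 matrix plus an offset). The function and variable names below are illustrative assumptions, not the thesis's implementation.

```python
import numpy as np

def fit_affine_colour_correction(lab_current, lab_reference):
    """Fit an affine map (first-order Taylor approximation) that takes L*a*b*
    tuples observed under the current illumination back to the reference
    values captured at characterisation time.

    lab_current, lab_reference: (N, 3) arrays of matched colour tuples from
    tracked scene regions.  Returns (A, b) with
    lab_reference ~= lab_current @ A.T + b.
    """
    n = lab_current.shape[0]
    # Augment with a constant column so the offset b is estimated jointly.
    X = np.hstack([lab_current, np.ones((n, 1))])
    # Least-squares solution of X @ M = lab_reference, one column per channel.
    M, *_ = np.linalg.lstsq(X, lab_reference, rcond=None)
    A = M[:3].T   # 3x3 linear part of the affine transformation
    b = M[3]      # offset term
    return A, b

def correct(lab_pixels, A, b):
    """Apply the fitted affine correction to an (N, 3) array of L*a*b* pixels."""
    return lab_pixels @ A.T + b
```

In the thesis the resulting correction is folded back into the ICC profile; here it is simply applied directly to illustrate the regression step.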
942

Autism Policy: State and National Legislation Analysis

Johnson, Taylon M. 01 January 2012 (has links)
This research thesis is a policy assessment of the factors that contribute to the current status of autism treatment. The policy assessment begins with a description of the key components that influence policy outcomes with regard to autism. After developing a policy model that outlines the various components of the issues and approaches relevant to autism policy, the paper examines several issues with regard to autism policy, including the lack of insurance coverage, state legislation, waiting lists, evidence-based versus non-evidence-based treatments, and the high price of treatments. The paper also examines current approaches to autism and potential solutions. An analysis of current policy alternatives is provided, and this suggests that increasing knowledge and awareness of the effects of autism on society needs further attention, along with proper funding for early treatment.
943

Designing guideline-based workflow-integrated electronic health records

Barretto, Sistine January 2005 (has links)
The recent trend in health care has been toward the development and implementation of clinical guidelines to support and comply with evidence-based care. Evidence-based care is established with a view to improving the overall quality of care for patients, reducing costs, and addressing medico-legal issues. One of the main questions addressed by this thesis is how to support guideline-based care. It is recognised that this is better achieved by taking the provider workflow into consideration. However, workflow support remains a challenging (and hence rarely seen) accomplishment in practice, particularly in the context of chronic disease management (CDM). Our view is that guidelines can be knowledge-engineered into four main artefacts: electronic health record (EHR) content, a computer-interpretable guideline (CiG), workflow and hypermedia. The next question is then how to coordinate and make use of these artefacts in a health information system (HIS). We leverage the EHR since we view this as the core component of any HIS. / PhD Doctorate
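The idea of knowledge-engineering a guideline into coordinated artefacts can be sketched in miniature: a computer-interpretable guideline rule reads EHR content and emits workflow tasks for the provider. The record fields, rule logic, and task names below are invented for illustration and are not taken from the thesis.

```python
from dataclasses import dataclass

@dataclass
class EHRRecord:
    """Minimal EHR content needed by one guideline rule (illustrative fields)."""
    patient_id: str
    hba1c_percent: float
    months_since_last_review: int

def diabetes_review_rule(record: EHRRecord):
    """A toy computer-interpretable guideline (CiG) rule: if glycaemic control
    is poor or the review is overdue, emit workflow tasks for the provider."""
    tasks = []
    if record.hba1c_percent >= 8.0:
        tasks.append({"task": "intensify-therapy-review", "patient": record.patient_id})
    if record.months_since_last_review >= 6:
        tasks.append({"task": "schedule-chronic-care-review", "patient": record.patient_id})
    return tasks

# A workflow engine would route these tasks into the provider's worklist.
print(diabetes_review_rule(EHRRecord("p-001", hba1c_percent=8.4, months_since_last_review=7)))
```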
944

Accelerating classifier training using AdaBoost within cascades of boosted ensembles : a thesis presented in partial fulfillment of the requirements for the degree of Master of Science in Computer Sciences at Massey University, Auckland, New Zealand

Susnjak, Teo January 2009 (has links)
This thesis seeks to address current problems encountered when training classifiers within the framework of cascades of boosted ensembles (CoBE). At present, a significant challenge facing this framework is inordinate classifier training runtimes. In some cases, it can take days or weeks (Viola and Jones, 2004; Verschae et al., 2008) to train a classifier. The protracted training runtimes are an obstacle to the wider use of this framework (Brubaker et al., 2006). They also hinder the process of producing effective object detection applications and make the testing of new theories and algorithms, as well as the verification of others' research, a considerable challenge (McCane and Novins, 2003). An additional shortcoming of the CoBE framework is its limited ability to train classifiers incrementally. Presently, the most reliable method of integrating new dataset information into an existing classifier is to re-train the classifier from the beginning using the combined new and old datasets. This process is inefficient. It lacks scalability and discards valuable information learned in previous training. To deal with these challenges, this thesis extends the research by Barczak et al. (2008) and presents alternative CoBE frameworks for training classifiers. The alternative frameworks reduce training runtimes by an order of magnitude over common CoBE frameworks and introduce additional tractability to the process. They achieve this while preserving the generalization ability of their classifiers. This research also introduces a new framework for incrementally training CoBE classifiers and shows how this can be done without re-training classifiers from the beginning. However, the incremental framework for CoBEs has some limitations. Although it is able to improve the positive detection rates of existing classifiers, it is currently unable to lower their false detection rates.
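The cascade-of-boosted-ensembles framework the thesis builds on can be sketched as follows: each stage is a boosted classifier whose threshold is chosen so that nearly all positives pass, and only the negatives that survive earlier stages are used to train the next stage. The use of scikit-learn's AdaBoost, the stage count, and the hit-rate target below are assumptions for the sketch, not the thesis's training procedure.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

def train_cascade(X_pos, X_neg, n_stages=5, min_hit_rate=0.995):
    """Train a toy cascade of boosted ensembles on NumPy feature arrays.

    Each stage is an AdaBoost classifier whose decision threshold is lowered
    until at least `min_hit_rate` of the positive samples pass.  Negatives a
    stage rejects are discarded; only its false positives move on to train
    the next stage, mimicking cascade bootstrapping."""
    stages = []
    neg = X_neg
    for _ in range(n_stages):
        if len(neg) == 0:
            break
        X = np.vstack([X_pos, neg])
        y = np.concatenate([np.ones(len(X_pos)), np.zeros(len(neg))])
        clf = AdaBoostClassifier(n_estimators=50).fit(X, y)
        # Pick the stage threshold from the positive scores so that almost
        # every positive sample is accepted (high per-stage hit rate).
        thr = np.quantile(clf.decision_function(X_pos), 1.0 - min_hit_rate)
        stages.append((clf, thr))
        # Keep only the negatives this stage still accepts (false positives).
        neg = neg[clf.decision_function(neg) >= thr]
    return stages

def cascade_predict(stages, X):
    """A sample is declared positive only if every stage accepts it."""
    accepted = np.ones(len(X), dtype=bool)
    for clf, thr in stages:
        accepted &= clf.decision_function(X) >= thr
    return accepted
```

Early stages reject most negatives cheaply, which is what makes cascade training (and detection) tractable, and also why protracted per-stage boosting runtimes dominate the overall cost.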
945

Designing application-specific processors for image processing : a thesis presented in partial fulfilment of the requirements for the degree of Master of Science in Computer Science, Massey University, Palmerston North, New Zealand

Bishell, Aaron January 2008 (has links)
Implementing a real-time image-processing algorithm on a serial processor is difficult because such a processor cannot cope with the volume of data in the low-level operations. However, a parallel implementation, required to meet the timing constraints of the low-level operations, results in low resource utilisation when implementing the high-level operations. These factors suggested a combination of parallel hardware, for the low-level operations, and a serial processor, for the high-level operations, for implementing a high-level image-processing algorithm. Several types of serial processors were available. A general-purpose processor requires an extensive instruction set to be able to execute any arbitrary algorithm, resulting in a relatively complex instruction decoder and possibly extra functional units (FUs). An application-specific processor, which was considered in this research, implements enough FUs to execute a given algorithm and implements a simpler, and more efficient, instruction decoder. In addition, an algorithm's behaviour on a processor could be represented either in hardware (i.e. hardwired logic), which limits the ability to modify the algorithm behaviour of a processor, or in "software" (i.e. programmable logic), which enables external sources to specify the algorithm behaviour. This research investigated hardware- and software-controlled application-specific serial processors for the implementation of high-level image-processing algorithms and compared these against parallel hardware and general-purpose serial processors. It was found that application-specific processors are easily able to meet the timing constraints imposed by real-time high-level image processing. In addition, the software-controlled processors had additional flexibility, performance penalties of 9.9% and 36.9%, and inconclusive footprint savings (and costs) when compared to hardware-controlled processors.
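The distinction between a general-purpose processor and a software-controlled application-specific processor can be illustrated with a toy model: the datapath provides only the functional units one algorithm needs, and a small programmable decoder sequences them, so the behaviour can be changed without new hardware. The instruction set and program below are invented for illustration and do not reflect the processors designed in the thesis.

```python
# Toy software-controlled application-specific processor: the datapath offers
# only the functional units (FUs) needed by one algorithm; a programmable
# decoder sequences them, so the algorithm can change without new hardware.

FUNCTIONAL_UNITS = {
    "LOAD": lambda regs, mem, a, b: mem[a],            # read a memory cell
    "ADD":  lambda regs, mem, a, b: regs[a] + regs[b],  # integer addition
    "SHR":  lambda regs, mem, a, b: regs[a] >> b,       # divide by a power of two
}

def run(program, memory):
    """Execute a program of (opcode, dest, a, b) tuples on the toy processor."""
    regs = {}
    for opcode, dest, a, b in program:
        if opcode == "STORE":                 # write-back handled by the decoder
            memory[dest] = regs[a]
        else:
            regs[dest] = FUNCTIONAL_UNITS[opcode](regs, memory, a, b)
    return memory

# "Software" for a two-pixel box average, expressed in the processor's own ISA.
program = [
    ("LOAD",  "r0", 0, None),
    ("LOAD",  "r1", 1, None),
    ("ADD",   "r2", "r0", "r1"),
    ("SHR",   "r2", "r2", 1),
    ("STORE", 2, "r2", None),
]
print(run(program, [10, 20, 0]))   # -> [10, 20, 15]
```

A hardware-controlled variant would hardwire this sequence into the control logic, trading the flexibility of the program for a smaller decoder.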
946

Cluster analysis of object-oriented programs : a thesis presented in partial fulfilment of the requirements for the degree of Master of Science in Computer Science at Massey University, Palmerston North, New Zealand

Yakovlev, Vyacheslav January 2009 (has links)
In this thesis we present a novel approach to the analysis of dependency graphs of object-oriented programs, and we describe a tool that has been implemented for this purpose. A graph-theoretical clustering algorithm is used to compute the modular structure of programs. This can be used to assist software engineers to redraw component boundaries in software in order to improve the level of reuse and maintainability. The analysis of the dependency graph of an object-oriented program is useful for assessing the quality of software design. The dependency graph can be extracted from a program using various methods, including source code, byte code, and dynamic (behavioural) analysis. The nodes in the dependency graph are classes, members, packages and other artifacts, while the edges represent uses and extends relationships between those artifacts. Once the dependency graph has been extracted, it can be analysed in order to quantify certain characteristics of the respective program. Examples include the detection of circular dependencies and measurements of the responsibility or independence of units based on their relationships. Tools like JDepend implementing these principles have become very popular in recent years. Our work includes grouping types in dependency graphs using different clustering methods: grouping into namespaces; grouping into clusters using graph clustering algorithms; and grouping into clusters using rules. The detected mismatches are candidates for refactoring. We have developed a tool for clustering dependency graphs and producing results in which possible design violations are outlined for users.
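A minimal illustration of the graph-clustering step described above, using networkx and its greedy modularity community detection: detected clusters are compared against the declared package structure, and classes whose cluster disagrees with their package become refactoring candidates. The class names, edges, and choice of algorithm are illustrative assumptions, not the thesis's tool.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Toy dependency graph: nodes are classes, edges are uses/extends relationships.
deps = nx.Graph()
deps.add_edges_from([
    ("OrderService", "OrderRepository"),
    ("OrderService", "Invoice"),
    ("Invoice", "TaxCalculator"),
    ("ReportView", "ReportController"),
    ("ReportController", "OrderRepository"),
])

# Graph-theoretical clustering: find densely connected groups of classes.
clusters = greedy_modularity_communities(deps)

# Compare detected clusters with the declared package (namespace) structure.
packages = {
    "OrderService": "orders", "OrderRepository": "orders",
    "Invoice": "billing", "TaxCalculator": "billing",
    "ReportView": "reporting", "ReportController": "reporting",
}
for i, cluster in enumerate(clusters):
    spanned = {packages[c] for c in cluster}
    print(f"cluster {i}: {sorted(cluster)} spans packages {sorted(spanned)}")
```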
947

Development of fusion motion capture for optimisation of performance in alpine ski racing : a thesis presented in fulfilment of the requirements for the degree of Doctor of Philosophy in Science at Massey University, Wellington, New Zealand

Brodie, Matthew Andrew Dalhousie January 2009 (has links)
Fusion Motion Capture (FMC), a wearable motion capture system, was developed and applied to the optimisation of athlete performance in alpine ski racing. In what may be a world first, the three-dimensional movements of a skilled athlete (with fewer than 20 FIS points) skiing through a complete training giant slalom racecourse were analysed. FMC consists of multiple lightweight sensors attached to the athlete, including inertial measurement units (IMUs), pressure-sensitive insoles and a global positioning system (GPS) receiver. The IMUs contain accelerometers, gyroscopes, and magnetometers. Limb orientation and location are obtained by mathematically combining the most reliable data from each sensor using fusion algorithms developed by the author. FMC fuses the signals from the IMUs and GPS without the need for the post-filtering usually applied to motion capture data, and therefore maintains maximum bandwidth. The FMC results were stable and relatively independent of motion type and duration, unlike other inertial systems available in 2005, when the research was initiated. Analysis of data collected from an athlete skiing giant slalom contradicts the traditional "going straight, turning short" race strategy. The shortest path may not always be the fastest. Instead, each gate has a different optimum approach arc. Optimum turn radius increases with both increasing speed and increasing terrain slope. The results also contradict laboratory measurements of ski/snow sliding friction and suggest that snow resistance in giant slalom is of similar importance to wind drag. In addition to gravity, the athlete increased speed using the techniques of "lateral projection" and "pumping". Race performance was determined from the analysis of the athlete skiing through the entire course. FMC proved, therefore, to be more suitable than traditional optical systems, which are practically limited to capturing small sections of a racecourse. The athlete experienced high and rapidly fluctuating torques about all three axes of the lower joints. This information could be useful in designing training programmes, racecourses and equipment to reduce knee injuries. Data-driven animations and colour-coded force vector diagrams were developed to enhance athlete feedback. Inline skating data was also analysed.
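The core idea behind fusing IMU signals, combining a gyroscope's short-term accuracy with an accelerometer's drift-free but noisy gravity reference, can be illustrated with a simple complementary filter for a single tilt angle. This is a generic sensor-fusion sketch under assumed sensor inputs, not the author's FMC algorithms.

```python
import math

def complementary_filter(gyro_rates, accels, dt, alpha=0.98):
    """Estimate a tilt angle by fusing gyroscope and accelerometer data.

    gyro_rates: angular rate about one axis (rad/s), one sample per frame.
    accels:     (ax, az) accelerometer pairs giving a gravity-based tilt
                angle that does not drift but is noisy during motion.
    alpha:      weighting of the integrated gyro signal versus the
                accelerometer reference (closer to 1 trusts the gyro more).
    """
    angle = math.atan2(*accels[0])            # initialise from the gravity vector
    estimates = [angle]
    for rate, (ax, az) in zip(gyro_rates[1:], accels[1:]):
        gyro_angle = angle + rate * dt        # short term: integrate the gyro
        accel_angle = math.atan2(ax, az)      # long term: gravity reference
        angle = alpha * gyro_angle + (1 - alpha) * accel_angle
        estimates.append(angle)
    return estimates
```

Because each new estimate blends the two sources every frame, no separate post-filtering pass is required, which is the property the abstract highlights for FMC.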
948

Electronic clinical decision support (eCDS) in primary health care: a multiple case study of three New Zealand PHOs : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Information Systems at Massey University, Palmerston North, New Zealand

Engelbrecht, Judith Merrylyn January 2009 (has links)
Health care providers internationally are facing challenges surrounding the delivery of high quality, cost effective services. The use of integrated electronic information systems is seen by many people working in the health sector as a way to address some of the associated issues. In New Zealand the primary health care sector has been restructured to follow a population based care model and provides services through not-for-profit Primary Health Organisations (PHOs). PHOs, together with their District Health Boards (DHBs), contributing service providers, and local communities, are responsible for the care of their enrolled populations. The Ministry of Health (MoH) is streamlining information sharing in this environment through improvements to computer based information systems (IS). By providing health professionals with improved access to required information within an appropriate time frame, services can be targeted efficiently and effectively and patient health outcomes potentially improved. However, the adoption of IS in health care has been slower than in other industries. Therefore, a thorough knowledge of health care professionals' attitudes to, and use of, available IS is currently needed to contribute to the development of appropriate systems. This research employs a multiple case study strategy to establish the usage of IS by three New Zealand PHOs and their member primary health care providers (PHPs), with a focus on the role of IS in clinical decision support (CDS). A mixed method approach including semi-structured interviews and postal surveys was used in the study. Firstly, the research develops and applies a survey tool, based on an adaptation of an existing framework, for the study of IT sophistication in the organisations. This provides the foundation for an in-depth study of the use of computerised CDS (eCDS) in the PHO environment. Secondly, a conceptual model of eCDS utilisation is presented, illustrating the variation of eCDS use by member general practitioner (GP) practices within individual organisations. Thirdly, five areas of importance for improving eCDS utilisation within PHOs are identified, contributing information of use to organisations, practitioners, planners, and systems developers. Lastly, the research provides a structure for the study of the domain of eCDS in PHOs by presenting a research approach and information specific to the area.
949

Modelling avian influenza in bird-human systems : this thesis is presented in the partial fulfillment of the requirement for the degree of Masters of Information Science in Mathematics at Massey University, Albany, New Zealand

Zhao, Yue January 2009 (has links)
In 1997, the first human case of avian influenza infection was reported in Hong Kong. Since then, avian influenza has become more and more hazardous for both animal and human health. Scientists believe that it will not take long before the virus mutates to become contagious from human to human. In this thesis, we construct models of avian influenza in bird-human systems that account for possible mutation scenarios. Possible control measures for humans are also introduced into the systems. We compare the analytical and numerical results and try to find the most efficient control measures to prevent the disease.
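A minimal numerical sketch of the kind of bird-human compartmental model analysed in such work: SI dynamics in the bird population coupled to SIR dynamics in humans, where humans are infected through contact with infected birds, integrated with SciPy. The model structure and parameter values are illustrative assumptions, not the thesis's actual equations.

```python
import numpy as np
from scipy.integrate import solve_ivp

def bird_human_model(t, y, beta_b, beta_bh, gamma_h, mu_b):
    """Coupled avian-human influenza model (illustrative structure only).

    S_b, I_b:        susceptible/infected birds (SI dynamics, removal rate mu_b).
    S_h, I_h, R_h:   susceptible/infected/recovered humans; humans acquire
                     infection from infected birds at rate beta_bh.
    """
    S_b, I_b, S_h, I_h, R_h = y
    dS_b = -beta_b * S_b * I_b
    dI_b = beta_b * S_b * I_b - mu_b * I_b
    dS_h = -beta_bh * S_h * I_b
    dI_h = beta_bh * S_h * I_b - gamma_h * I_h
    dR_h = gamma_h * I_h
    return [dS_b, dI_b, dS_h, dI_h, dR_h]

# Example run: a small infected bird fraction introduced into susceptible populations.
y0 = [0.99, 0.01, 1.0, 0.0, 0.0]
sol = solve_ivp(bird_human_model, (0, 200), y0,
                args=(0.4, 0.05, 0.1, 0.1), dense_output=True)
print("peak human infection fraction:", sol.y[3].max())
```

Control measures such as culling or quarantine would enter as extra terms or parameter changes, and their effect can be compared by re-running the integration, which mirrors the analytical-versus-numerical comparison described in the abstract.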
950

VERTIPH : a visual environment for real-time image processing on hardware : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Computer Systems Engineering at Massey University, Palmerston North, New Zealand

Johnston, Christopher Troy January 2009 (has links)
This thesis presents VERTIPH, a visual programming language for the development of image processing algorithms on FPGA hardware. The research began with an examination of the whole design cycle, with a view to identifying requirements for implementing image processing on FPGAs. Based on this analysis, a design process was developed in which a selected software algorithm is matched to a hardware architecture tailor-made for its implementation. The algorithm and architecture are then transformed into a design suitable for an FPGA. It was found that in most cases the most efficient mapping for image processing algorithms is a streamed processing approach. This constrains how data is presented and requires most existing algorithms to be extensively modified. Therefore, the resultant designs are heavily streamed and pipelined. A visual notation was developed to complement this design process, as both streaming and pipelining can be well represented by data-flow visual languages. The notation has three views, each of which represents and supports a different part of the design process. An architecture view gives an overview of the design's main blocks and their interconnections. A computational view represents lower-level details by representing each block by a set of computational expressions and low-level controls. This includes a novel visual representation of pipelining that simplifies latency analysis, multiphase design, priming, flushing and stalling, and the detection of sequencing errors. A scheduling view adds a state machine for high-level control of processing blocks. This extends state objects to allow for the priming and flushing of pipelined operations. User evaluations of an implementation of the key parts of this language (the architecture view and the computational view) found that both were generally good visualisations and aided in design (especially the type interface, pipeline and control notations). The user evaluations provided several suggestions for the improvement of the language; in particular, the evaluators would have preferred to use the diagrams as a verification tool for a textual representation rather than as the primary data capture mechanism. A cognitive dimensions analysis showed that the language scores highly for thirteen of the twenty dimensions considered, particularly those related to making details of the design clearer to the developer.
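The streamed, pipelined processing model that such designs target can be illustrated with a small software analogue: each processing block consumes one pixel per "clock cycle" from the previous stage, so the composed pipeline has a fixed latency before the first valid output appears (priming) and needs trailing cycles after the input ends (flushing). The operations, window size, and data below are arbitrary examples, not part of VERTIPH.

```python
from collections import deque

def stream_pixels(image_row):
    """Source stage: emit one pixel per clock cycle."""
    yield from image_row

def threshold(pixels, level=128):
    """Point operation: one pixel in, one pixel out, no added latency."""
    for p in pixels:
        yield 255 if p >= level else 0

def window_sum(pixels, width=3):
    """Windowed operation: needs `width - 1` priming cycles before the first
    valid output, illustrating pipeline latency in a streamed design."""
    window = deque(maxlen=width)
    for p in pixels:
        window.append(p)
        if len(window) == width:      # output only once the window is primed
            yield sum(window)

# Compose the stages into a pipeline, much as an architecture view would connect blocks.
row = [10, 200, 90, 250, 30, 130]
pipeline = window_sum(threshold(stream_pixels(row)))
print(list(pipeline))   # the first output appears after two priming cycles
```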
