31

Updating semi-structured data

Amornsinlaphachai, Pensri January 2007 (has links)
The Web has had tremendous success with its support for the rapid and inexpensive exchange of information. A considerable body of data exchange is in the form of semi-structured data such as the eXtensible Markup Language (XML). XML, an effective standard to represent and exchange semi-structured data on the Web, is used ubiquitously in almost all areas of information technology. Most researchers in the XML area have concentrated on storing, querying and publishing XML, while not many have paid attention to updating XML; thus the XML update area is not fully developed. We propose a solution for updating XML as a representation of semi-structured data. XML is updated through an object-relational database (ORDB) to exploit the maturity of the relational engine and the newer object features of the OR technology. The engine is used to enforce constraints during the updating of the XML, whereas the object features are used to handle the XML hierarchical structure. Updating XML via an ORDB makes it easier to join XML documents in an update, and in turn joins of XML documents make it possible to keep non-redundant data in multiple XML documents. This thesis contributes a solution for the update of XML documents via an ORDB to advance our understanding of the XML update area. Rules for mapping XML structure and constraints to an ORDB schema are presented and a mechanism to handle XML cardinality constraints is provided. An XML update language, an extension to XQuery, has been designed, and this language is translated into standard SQL executed on an ORDB. To handle the recursive nature of XML, a recursive function updating XML data is translated into SQL commands equipped with a programming capability. A method is developed to reflect the changes from the ORDB to the XML documents. A prototype of the solution has been implemented to help validate our approach. An experimental study to evaluate the performance of XML update processing based on the prototype has been conducted. The experimental results show that updating multiple XML documents storing non-redundant data yields better performance than updating a single XML document storing redundant data; an ORDB can take advantage of this by caching data to a greater extent than a native XML database. The solution of updating XML documents via an ORDB can solve some problems in existing update methods as follows. Firstly, the preservation of XML constraints is handled by the ORDB engine. Secondly, non-redundant data is stored in linked XML documents; thus the problems of data inconsistency and low performance caused by data redundancy are solved. Thirdly, joins of XML documents are converted to joins of tables in SQL. Fourthly, fields or tables involved in regular path expressions can be handled quickly by using mapping data. Finally, a recursive function is translated into SQL commands equipped with a programming capability.
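To make the translation step concrete, the short Python sketch below shows how a simple value-replacement update on an XML path might be rewritten as an SQL UPDATE using path-to-table mapping data. The mapping, the helper names and the update syntax are illustrative assumptions only; they do not reproduce the thesis's actual update language or mapping rules.

# A minimal, hypothetical sketch of translating an XML update into SQL
# using path-to-table mapping data. Names and mapping are assumptions.

# Mapping data: XML element path -> (table, column, key column) in the ORDB schema.
PATH_MAP = {
    "/library/book/title": ("book", "title", "book_id"),
    "/library/book/price": ("book", "price", "book_id"),
}

def translate_replace(path: str, key_value: int, new_value) -> str:
    """Translate a 'replace value of node at <path>' update into an SQL UPDATE."""
    table, column, key = PATH_MAP[path]
    value = f"'{new_value}'" if isinstance(new_value, str) else str(new_value)
    return f"UPDATE {table} SET {column} = {value} WHERE {key} = {key_value}"

# Example: an XQuery-style update such as
#   replace value of node doc("library.xml")/library/book[@id=7]/price with 9.99
# might, under this mapping, become:
print(translate_replace("/library/book/price", 7, 9.99))
# UPDATE book SET price = 9.99 WHERE book_id = 7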
32

A pattern-based foundation for language-driven software engineering

Reichert, Tim January 2011 (has links)
This work brings together two fundamental ideas for modelling, programming and analysing software systems. The first idea is of a methodological nature: engineering software by systematically creating and relating languages. The second idea is of a technical nature: using patterns as a practical foundation for computing. The goal is to show that the systematic creation and layering of languages can be reduced to the elementary operations of pattern matching and instantiation, and that this pattern-based approach provides a formal and practical foundation for language-driven modelling, programming and analysis. The underpinning of the work is a novel formalism for recognising, deconstructing, creating, searching, transforming and generally manipulating data structures. The formalism is based on typed sequences, a generic structure for representing trees. It defines basic pattern expressions for matching and instantiating atomic values and variables. Horizontal, vertical, diagonal and hierarchical operators are different ways of combining patterns. Transformations combine matching and instantiating patterns, and they are patterns themselves. A quasiquotation mechanism allows arbitrary levels of meta-pattern functionality and forms the basis of pattern abstraction. Path-polymorphic operators are used to specify fine-grained search of structures. A range of core concepts such as layering, parsing and pattern-based computing can naturally be defined through pattern expressions. Three language-driven tools that utilise the pattern formalism showcase the applicability of the pattern approach. Concat is a self-sustaining (meta-)programming system in which all computations are expressed by matching and instantiation. This includes parsing, executing and optimising programs. By applying its language engineering tools to its own meta-language, Concat can extend itself from within. XMF (XML Modeling Framework) is a browser-based modelling and meta-modelling framework that provides flexible means to create and relate modelling languages and to query and validate models. The pattern functionality that makes this possible is partly exposed as a schema language and partly as a JavaScript library. CFR (Channel Filter Rule Language) implements a language-driven approach for the layered analysis of communication in complex networked systems. The communication on each layer is visible in the language of an “abstract protocol” that is defined by communication patterns.
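As a rough illustration of reducing computation to matching and instantiation, the following Python sketch implements a toy matcher and instantiator over nested sequences. The '?name' variable syntax and the function names are assumptions; they do not reproduce the formalism's typed sequences or its combination operators.

# A minimal, hypothetical sketch of pattern matching and instantiation over
# nested sequences (trees). Variables are strings of the form '?name'.

def match(pattern, data, bindings=None):
    """Match a pattern against a tree, binding variables of the form '?name'."""
    bindings = dict(bindings or {})
    if isinstance(pattern, str) and pattern.startswith("?"):
        if pattern in bindings and bindings[pattern] != data:
            return None
        bindings[pattern] = data
        return bindings
    if isinstance(pattern, (list, tuple)) and isinstance(data, (list, tuple)):
        if len(pattern) != len(data):
            return None
        for p, d in zip(pattern, data):
            bindings = match(p, d, bindings)
            if bindings is None:
                return None
        return bindings
    return bindings if pattern == data else None

def instantiate(template, bindings):
    """Build a tree from a template by substituting bound variables."""
    if isinstance(template, str) and template.startswith("?"):
        return bindings[template]
    if isinstance(template, (list, tuple)):
        return [instantiate(t, bindings) for t in template]
    return template

# A 'transformation' pairs a matching pattern with an instantiating pattern:
# here the rewrite (add ?x 0) -> ?x applied to (add (mul a b) 0).
b = match(["add", "?x", 0], ["add", ["mul", "a", "b"], 0])
print(instantiate("?x", b))   # ['mul', 'a', 'b']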
33

The influence of national culture on the practice of project management : a study of information and communication technology projects in Saudi Arabia

Salah, Romil January 2016 (has links)
In an ever-shrinking world with advancing technology, many organizations have expanded their operations internationally and experienced challenges in managing projects in areas with different cultural backgrounds. In a culturally unique country like the Kingdom of Saudi Arabia (KSA), the influence of national culture on project management has to be considered, and there is additional complexity in that most project teams are themselves diverse and multi-cultural. KSA has become one of the wealthiest countries in the world; however, many of its projects, especially in the Information and Communication Technology (ICT) sector, still fail dramatically for financial, managerial, political, social and cultural reasons. In KSA, culture is a crucial factor in business, and the management of projects is no exception. The aim of this research is to contribute to more successful delivery of ICT projects in KSA. The overall commonly-held belief is that there are elements of national culture in KSA that impact the implementation of project management processes on ICT projects. Using Hofstede’s cultural model as a basis, a conceptual framework has been created that explores and explains the impact of KSA national culture on ICT project management as characterised by the Project Management Body of Knowledge (PMBoK) principles. A qualitative research approach was used to collect data from four private and public sector ICT projects, in their natural settings, using a multiple case study approach. Data were collected using semi-structured interviews and examination of project documentation, and a cross-case analysis was performed. The conceptual framework is a useful planning tool for human resourcing purposes and is best used by ICT project management professionals to understand how project management practices, procedures, tools and techniques are implemented and how they are impacted by cultural factors. The findings in this study have confirmed that the dimensions of Power Distance Index (PDI), Individualism (IDV) and Uncertainty Avoidance (UAI) have a significant impact on project management in KSA, but that Long-Term Orientation (LTO), Masculinity (MAS) and Indulgence (IND) have a lesser impact.
34

Automated classification of cancer tissues using multispectral imagery

Peyret, Remy January 2017 (has links)
Automated classification of medical images for colorectal and prostate cancer diagnosis is a crucial tool for improving routine diagnosis decisions. Therefore, in the last few decades, there has been an increasing interest in refining and adapting machine learning algorithms to classify microscopic images of tumour biopsies. Recently, multispectral imagery has received significant interest from the research community due to the fast-growing development of high-performance computers. This thesis investigates novel algorithms for automatic classification of colorectal and prostate cancer using multispectral imagery in order to propose a system outperforming the state-of-the-art techniques in the field. To achieve this objective, several feature extraction methods based on image texture have been investigated, analysed and evaluated. A novel texture feature for multispectral images is also constructed as an adaptation of the local binary pattern texture feature to multispectral images by expanding the pixel neighbourhood to the spectral dimension. It has the advantage of capturing the multispectral information with a limited feature vector size. This feature has demonstrated improved classification results when compared against traditional texture features. In order to further enhance the system's performance, advanced classification schemes such as bag-of-features (to better capture local information) and stacked generalisation (to select the most discriminative texture features) are explored and evaluated. Finally, recent years have seen an accelerated and exponential rise of deep learning, boosted by advances in hardware, and more specifically graphics processing units. Such models have demonstrated excellent results for supervised learning in multiple applications. This observation has motivated the use in this thesis of deep neural network architectures, namely convolutional neural networks. Experiments were also carried out to evaluate and compare the performance obtained with features extracted using randomly initialised convolutional neural networks against features extracted with models pre-trained on the ImageNet dataset. The analysis of the classification accuracy achieved with deep learning models reveals that they outperform the previously proposed texture extraction methods. In this thesis, the algorithms are assessed using two separate multiclass datasets: the first consists of prostate tumour multispectral images, and the second contains multispectral images of colorectal tumours. The colorectal dataset was acquired over a wide domain of the light spectrum, ranging from the visible to the infrared wavelengths. This dataset was used to demonstrate the improved results produced using infrared light as well as visible light.
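The spectral extension of the local binary pattern can be pictured with the NumPy sketch below, in which each pixel is compared with its spatial neighbours in the same band and with itself in the adjacent bands. The thresholding scheme, bit ordering and neighbourhood size are assumptions rather than the thesis's exact operator.

# A minimal sketch of a local binary pattern extended to the spectral dimension.
import numpy as np

def spectral_lbp(cube: np.ndarray) -> np.ndarray:
    """cube: (bands, height, width) multispectral image; returns integer codes."""
    bands, h, w = cube.shape
    codes = np.zeros((bands, h - 2, w - 2), dtype=np.uint16)
    # spatial offsets of the 8-neighbourhood
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]
    for b in range(bands):
        centre = cube[b, 1:-1, 1:-1]
        bit = 0
        for dy, dx in offsets:                      # spatial neighbours, same band
            neigh = cube[b, 1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
            codes[b] |= (neigh >= centre).astype(np.uint16) << bit
            bit += 1
        for db in (-1, 1):                          # same pixel, adjacent bands
            if 0 <= b + db < bands:
                neigh = cube[b + db, 1:-1, 1:-1]
                codes[b] |= (neigh >= centre).astype(np.uint16) << bit
            bit += 1
    return codes

# A classifier would then use histograms of these codes as a compact feature vector.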
35

Understanding the design of energy interventions to reduce end-user demand in organisational and domestic environments

Foster, Derek January 2017 (has links)
Energy demand is on the rise globally due to unchecked factors such as population growth, lifestyle choices, and the industrialization of developing countries. Governments are investing in technologies for efficient and renewable energy in an attempt to secure energy for the future over current dependencies on fossil fuels, but the development costs are high, and the rate of developed technologies is projected to fall far short of meeting global requirements. Overshadowing this growing appetite for energy is the global issue of climate change, igniting the scientific and humanitarian debate over the use of fossil fuels and the need for renewable energy, and presenting a societal problem of generating clean, sustainable and secure energy for future generations. As part of understanding how society can make positive changes to daily practices around energy use, many governments have turned to behaviour change, or ‘nudge’, units that research how to change energy consumption behaviours. The importance of this is underlined by a focus on reducing end-user energy demand (EUED) by providing contextual energy feedback, interwoven with behaviour change strategies, in both residential and organizational sectors. EUED in large organisations and small-medium enterprises (SMEs) accounts for a significant proportion of a nation’s energy requirements. In Europe, the services sector saw a 34% growth in EUED in the period 1990-2012, with computers and other office appliances contributing substantially to this. In the UK, for example, the services sector, which covers services and business, accounted for 13% of total energy consumed in 2011-2012, while the residential sector accounted for 30%. Given the lack of academic HCI research on organisational energy interventions compared to domestic ones, the principal research undertaken in this thesis was to understand employee energy consumption practices and attitudes in the workplace, through a combination of qualitative enquiry and analysis. Additionally, alternative forms of feedback such as aversive stimuli are often ignored in the HCI literature, where attention is focused on positive feedback alone as a means for behaviour change. The work in this thesis presents findings on the design implications and considerations that inform the design of in-the-field organisational energy interventions that integrate feedback and antecedent behaviour contingencies. Additionally, research is undertaken to understand the design of aversive feedback as part of domestic energy interventions. A significant contribution is made to the HCI sustainability literature on understanding the workplace energy intervention design space, and a further contribution is made on how aversive feedback can in fact be a useful and engaging method for the domestic environment.
36

An empirical investigation into the drivers of re-subscription in massively multiplayer online games : a commitment trust theory approach

Grundy, David January 2010 (has links)
This relationship marketing PhD uses Commitment Trust Theory to examine the customer decision to continue subscribing to a massively multiplayer online game. It is not an examination of the initial purchase decision, but of ongoing, post-purchase customer retention. In keeping with the contextual nature of Commitment Trust Theory, this study examines the antecedents of the re-subscription decision and their effect on the key mediating variables of Commitment and Trust, and modifies the framework to model the subscription-based nature of the business situation and the context. The key contribution of this research to the literature is the application of the Commitment Trust framework to a customer’s ongoing relationship with a massively multiplayer online game entertainment product, a situation and context which has not been examined in the literature. An online questionnaire survey was used to collect a sample of data from 2226 massively multiplayer online game customers. This sample data was then analysed using Structural Equation Modelling to test the relationship hypotheses between the constructs proposed by Commitment Trust Theory. Furthermore, hypotheses examining the effect of relevant demographic and categorical variables upon the constructs of Commitment Trust Theory were also tested and analysed using appropriate statistical techniques. Evidence was found to support the Commitment Trust Theory framework in a massively multiplayer online game subscription situation, with the study’s model explaining 85.7% of the variance of the sample data, and with evidence presented to support the key mediating variable approach to modelling the circumstances. The study, based on examining the effect sizes of the construct relationships using standardised regression weights, then gives evidence that a more parsimonious model which reduces the number of constructs from 16 to six (a 70% reduction in complexity) would still explain 85.3% of the variance of the sample data (a 0.4% loss in explanatory power). The study concludes that the key antecedent constructs in the sample for a customer’s renewal of an online gaming subscription are current satisfaction, past satisfaction, the amount of game capital they have within the game and the metagame benefits they derive from the game. The study supports a key mediating variable structure, but provides evidence that while Commitment and Trust are both relevant and statistically significant, a more efficient explanation, examining the effect sizes of the relationships as well, would focus on the antecedents of Commitment rather than Trust, as Trust and its antecedents were not found to have a significant effect size on the overall decision to re-subscribe.
37

Adaptive fuzzy logic control for solar buildings

El-Deen, M. M. G. Naser January 2002 (has links)
Significant progress has been made on maximising passive solar heating loads through the careful selection of glazing, orientation and internal mass within building spaces. Control of space heating in buildings of this type has become a complex problem. Additionally, and in common with most building control applications, there is a need to develop control solutions that permit simple and transparent set-up and commissioning procedures. This work concerns the development and testing of an adaptive control method for space heating in buildings with significant solar input. A simulation model of a building space is developed to assess the performance of different control strategies. A lumped parameter model based on an optimisation technique has been proposed and validated. It is shown that this model gives an improvement over existing low-order modelling methods. A detailed model of a hot water heating system and related control devices is developed and evaluated for the specific purpose of control simulation. A PI-based fuzzy logic controller is developed in which the error and the change of error between the internal air temperature and the user setpoint temperature are used as the controller inputs. A conventional PD controller is also considered for comparison. The parameters of the controllers are set to values that result in the best performance under likely disturbances and changes in setpoint. In a further development of the fuzzy logic controller, the Predicted Mean Vote (PMV) is used to control the indoor temperature of a space by setting it at a point where the PMV index becomes zero and the predicted percentage of persons dissatisfied (PPD) achieves a maximum threshold of 5%. The controller then adjusts the air temperature setpoint in order to satisfy the required comfort level given the prevailing values of the other comfort variables contributing to the comfort sensation. The resulting controller is free of the set-up and tuning problems that hinder conventional HVAC controllers. The need to develop an adaptive capability in the fuzzy logic controller to account for the lagging influence of solar heat gain is established, and a new adaptive controller has therefore been proposed. The "quasi-adaptive" fuzzy logic controller is developed in two steps. A feedforward neural network is used to predict the internal air temperature, and a singular value decomposition (SVD) algorithm is used to remove highly correlated data from the inputs of the neural network in order to reduce the network structure. The fuzzy controller is then modified to have two inputs: the first is the error between the setpoint temperature and the internal air temperature, and the second is the predicted future internal air temperature. When compared with a conventional method of control, the proposed controller is shown to give good tracking of the setpoint temperature, reduced energy consumption and improved thermal comfort for the occupants by reducing solar overheating. The proposed controller is tested in real time using a test cell equipped with an oil-filled electric radiator and temperature and solar sensors. Experimental results confirm earlier findings arrived at by simulations, in that the proposed controller achieves superior tracking and reduces afternoon solar overheating when compared with a conventional method of control.
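The structure of a PI-type fuzzy controller driven by error and change of error can be sketched as follows in Python. The membership functions, rule table and output increments are illustrative assumptions, not the tuned controller described in the thesis.

# A minimal sketch of a PI-type fuzzy controller: three triangular membership
# functions per input and a Sugeno-style weighted-average defuzzification.

def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzify(x, spread):
    """Degrees of membership in (Negative, Zero, Positive) for input x."""
    return {
        "N": tri(x, -2 * spread, -spread, 0.0),
        "Z": tri(x, -spread, 0.0, spread),
        "P": tri(x, 0.0, spread, 2 * spread),
    }

# Rule table: (error term, change-of-error term) -> change in heater output (%)
RULES = {
    ("N", "N"): -30, ("N", "Z"): -20, ("N", "P"): 0,
    ("Z", "N"): -10, ("Z", "Z"): 0,   ("Z", "P"): 10,
    ("P", "N"): 0,   ("P", "Z"): 20,  ("P", "P"): 30,
}

def fuzzy_pi_step(error, d_error, e_spread=2.0, de_spread=0.5):
    """Return the incremental change in heat output for one control step."""
    mu_e, mu_de = fuzzify(error, e_spread), fuzzify(d_error, de_spread)
    num = den = 0.0
    for (te, tde), out in RULES.items():
        w = min(mu_e[te], mu_de[tde])   # rule firing strength
        num += w * out
        den += w
    return num / den if den > 0 else 0.0

# Example: room 1.5 C below setpoint (error = setpoint - temperature) and the
# error shrinking slightly -> a moderate increase in heater output.
print(fuzzy_pi_step(error=1.5, d_error=-0.1))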
38

Pedestrian detection and tracking

Suppitaksakul, Chatchai January 2006 (has links)
This report presents work on the detection and tracking of people in digital images. The employed detection technique is based on image processing and classification techniques. The work uses an object detection process to detect candidate object locations and a classification method using a Self-Organising Map neural network to identify the pedestrian head positions in an image. The proposed tracking technique, with the support of a novel prediction method, is based on the association of Cellular Automata (CA) and a Backpropagation Neural Network (BPNN). The tracking employs the CA to capture the pedestrian's movement behaviour, which in turn is learned by the BPNN in order to estimate the location of the pedestrian's movement without the need to use empirical data. The report outlines this method and describes how it detects and identifies the pedestrian head locations within an image. Details of how the proposed prediction technique is applied to support the tracking process are then provided. Assessments of each component of the system, and of the system as a whole, have been carried out. The results obtained have shown that the novel prediction technique described is able to provide an accurate forecast of the movement of a pedestrian through a video image sequence.
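A rough Python sketch of the prediction idea is given below: the recent movement history of a tracked head position is fed to a small backpropagation network that forecasts the next displacement. The Cellular Automaton encoding of movement behaviour is replaced here by plain displacement histories, and all data, network sizes and parameters are synthetic assumptions.

# A minimal sketch of learning movement behaviour with a backpropagation network.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def make_track(n=30):
    """Synthesise a simple walking trajectory: near-constant velocity plus noise."""
    v = rng.uniform(-2, 2, size=2)
    steps = v + rng.normal(0, 0.2, size=(n, 2))
    return np.cumsum(steps, axis=0)

X, y = [], []
for _ in range(200):
    track = make_track()
    d = np.diff(track, axis=0)              # frame-to-frame displacements
    for t in range(3, len(d)):
        X.append(d[t - 3:t].ravel())        # last three displacements as input
        y.append(d[t])                      # next displacement as target
X, y = np.array(X), np.array(y)

net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)

# Predict where the pedestrian moves next from the last three observed displacements.
recent = X[-1].reshape(1, -1)
print("predicted next displacement:", net.predict(recent)[0])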
39

Generic model for application driven XML data processing

Elbekai, Ali Sayeh January 2006 (has links)
XML technology has emerged during recent years as a popular choice for representing and exchanging semi-structured data on the Web. It integrates seamlessly with web-based applications. If data is stored and represented as XML documents, then it should be possible to query the contents of these documents in order to extract, synthesize and analyze their contents. This thesis presents an experimental study of a Web architecture for data processing based on semantic mapping of XML Schema. The thesis involves complex methods and tools for the specification, algorithmic transformation and online processing of semi-structured data over the Web in XML format, with persistent storage into relational databases. The main focus of the research is preserving the structure of the original data for data reconciliation during database updates, and also combining different technologies for XML data processing such as storing (SQL), transformation (XSL processors), presenting (HTML), querying (XQuery) and transporting (Web services) within a common framework which is both theoretically and technologically well grounded. The experimental implementation of the discussed architecture requires a Web server (Apache), a Java container (Tomcat) and an object-relational DBMS (Oracle 9) equipped with a Java engine and the corresponding libraries for parsing and transformation of XML data (Xerces and Xalan). Furthermore, the central idea behind the research is to use a single theoretical model of the data to be processed by the system (XML algebra), controlled by one standard metalanguage specification (XML Schema), for solving a class of problems (generic architecture). The proposed work combines theoretical novelty and technological advancement in the field of Internet computing. This thesis introduces a generic approach, since both our model (XML algebra) and our problem solver (the architecture of the integrated system) are XML Schema-driven. Starting with the XML Schema of the data, we first develop a domain-specific XML algebra suitable for processing the specific data and then use it to implement the main offline components of the system for data processing.
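One of the processing steps such an architecture combines, shredding an XML document into relational storage under a structure-derived mapping, can be pictured with the short Python sketch below. The element names, table layout and mapping are illustrative assumptions, not the thesis's generated schema.

# A minimal, hypothetical sketch of schema-driven shredding of XML into a
# relational table, using only the Python standard library.
import sqlite3
import xml.etree.ElementTree as ET

doc = ET.fromstring("""
<orders>
  <order id="1"><customer>Ada</customer><total>12.50</total></order>
  <order id="2"><customer>Alan</customer><total>8.00</total></order>
</orders>
""")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")

# Mapping derived (here, by hand) from the document structure: element -> column.
for order in doc.findall("order"):
    conn.execute(
        "INSERT INTO orders VALUES (?, ?, ?)",
        (int(order.get("id")), order.findtext("customer"), float(order.findtext("total"))),
    )

# Once stored, the content can be queried relationally and republished as XML/HTML.
print(conn.execute("SELECT customer, total FROM orders WHERE total > 10").fetchall())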
40

Modelling of tumour-induced angiogenesis

Chen, Wei January 2015 (has links)
Controlled by extracellular signals, tumour-induced angiogenesis is a crucial step in the development of tumours. Among the many cell signals already identified, the VEGF and Notch signalling pathways play a critical role in controlling endothelial cells (ECs) during angiogenesis. Although this regulatory mechanism has become a current research focus in biology, its computational modelling is still rare. We focus on developing a computational model to simulate the VEGF and Notch signalling regulatory mechanism, to examine the microscopic process of angiogenesis in silico and to fill the gap between biology and computer engineering. We first developed a mathematical model with nonlinear partial differential equations (PDEs) to describe the migration of endothelial tip cells during tumour-induced angiogenesis. The simulation results show that both chemotaxis and haptotaxis have impacts on the migration of ECs in velocity and density, and that the impacts depend on the gradient and direction of tumour angiogenic factor (TAF) and fibronectin, implying a possible malignant mechanism for some subgroups of tumour. We then developed the model further to simulate the regression, recurrence or clearance of tumours due to tumour cytotoxic factors, including the immune system and drugs delivered through the vessels formed during angiogenesis, providing a broader understanding of tumours. Based on the PDE model, which provided the parameters of the continuum mathematical model, we finally developed an enzymatically catalysed regulatory model in the form of ordinary differential equations (ODEs) with agent-based modelling (ABM), using Java and MATLAB, to visually realise the sprouting regulated by VEGF and Notch signalling during angiogenesis. The simulation describes the process of how an endothelial stalk cell becomes an endothelial tip cell and sprouts under the influence of VEGF and Notch signalling, revealing the relationship between sprouting and branching. As the simulation results are consistent with reported in vitro and in vivo assays, the study bridges angiogenesis research and computer modelling from the dynamic regulatory mechanism perspective, offering a major advance over previous studies in computationally simulating tumour-induced angiogenesis. It is hoped that the results will assist researchers in both the experimental and theoretical angiogenesis communities to improve understanding of the complexity and identify the fundamental principles of angiogenesis, whilst also using modelling approaches that will enrich knowledge for computational scientists in this field.
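A one-dimensional finite-difference sketch of the kind of PDE involved is shown below in Python: endothelial cell density moves by random motility, chemotaxis towards a TAF gradient and haptotaxis towards fibronectin. The coefficients, initial profiles and grid are illustrative assumptions, not the thesis's model or parameters.

# A minimal 1-D sketch of an endothelial-cell density equation:
#   dn/dt = D d2n/dx2 - chi * d/dx(n dc/dx) - rho * d/dx(n df/dx)
import numpy as np

nx, dx, dt, steps = 101, 0.01, 1e-5, 20000
D, chi, rho = 3.5e-4, 0.38, 0.34

x = np.linspace(0, 1, nx)
n = np.exp(-x**2 / 0.001)          # EC density concentrated at the parent vessel (x=0)
c = np.exp(-(1 - x)**2 / 0.45)     # TAF concentration increasing towards the tumour (x=1)
f = 0.75 * np.exp(-x**2 / 0.45)    # fibronectin concentration (held static here)

def ddx(u):
    """Central finite difference approximation of d/dx."""
    return np.gradient(u, dx)

for _ in range(steps):
    # total flux: -diffusion + chemotaxis up the TAF gradient + haptotaxis up fibronectin
    flux = -D * ddx(n) + chi * n * ddx(c) + rho * n * ddx(f)
    n = n - dt * ddx(flux)
    n[0], n[-1] = n[1], n[-2]      # crude zero-flux boundary conditions

print("peak EC density position:", x[np.argmax(n)])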
