91

Identifying gene regulatory networks using Bayesian networks and domain knowledge

Liu, Ziying January 2006 (has links)
Bayesian network techniques have been used to discover causal relationships among large numbers of variables in many applications. This thesis demonstrates how Bayesian techniques can be used to build gene regulatory networks. The contribution of this thesis is a novel way of incorporating prior knowledge (biological domain information) into the Bayesian network learning process for microarray data analysis. Such prior knowledge includes biological process, cellular component, and molecular function information, as well as cell cycle information. Incorporating pre-existing knowledge into the Bayesian network learning process significantly improves the accuracy and performance of learning. Another contribution of this thesis is the inference and validation of the learning results against the biological literature and established biological knowledge. The learned network structure is presented graphically to make the results easy to understand. A yeast microarray dataset is used to test the performance of the learning process.
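The abstract does not detail how the domain knowledge enters the learning step. One common approach, sketched below under that assumption, is to add a log-prior term to the structure score so that candidate edges supported by shared biological annotations are favoured during the search; the gene names, annotations, and weights here are purely illustrative and not taken from the thesis.

```python
# Hypothetical domain prior: candidate edges whose endpoints share a
# biological annotation receive a bonus; all gene names, annotations,
# and weights below are illustrative, not from the thesis.
go_annotations = {
    "CLN1": {"cell_cycle"},
    "CLN2": {"cell_cycle"},
    "HSP104": {"stress_response"},
}

def log_structure_prior(edges, bonus=1.0, penalty=0.5):
    """Log-prior over structures: reward annotation-supported edges,
    mildly penalize unsupported ones."""
    score = 0.0
    for parent, child in edges:
        shared = go_annotations.get(parent, set()) & go_annotations.get(child, set())
        score += bonus if shared else -penalty
    return score

def posterior_score(data_log_likelihood, edges):
    """Structure score = data log-likelihood + log-prior; a search over
    candidate structures would maximize this quantity."""
    return data_log_likelihood + log_structure_prior(edges)

# An annotation-supported edge outranks an unsupported one when the
# data fit is equal.
print(posterior_score(-120.0, [("CLN1", "CLN2")]))    # -119.0
print(posterior_score(-120.0, [("CLN1", "HSP104")]))  # -120.5
```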
92

Development of project manager selection tool based on project manager competency

Shao, Ming Gang January 2006 (has links)
Project management is rapidly entering every aspect of our work, but with a relatively low success rate. Based on a review of the current project management literature, this thesis concludes that three elements - a competent project manager, the right project definition, and the right project organization - are the key factors that determine project success. This study focuses on the first of these three factors: selection of a competent project manager. The research process uses a web-based questionnaire built on the 102 Project Management Institute competency elements, designed to measure project manager Knowledge, Performance, and Personal competency. This questionnaire was used to gather data on real-world experience of critical project management skills from 16 people with professional project experience. Based on this data, a new project management tool was developed to highlight effective project manager selection considerations and effective organizational use of project management analysis and problem-solving processes. This new tool attempts to combine project manager capabilities and organizational capabilities to achieve more effective project success. As a pilot survey, the 16 respondents, all with experience of a project-management approach, were invited to complete the study questionnaire. Their responses concerning the importance of the various elements named in the questionnaire were analyzed, yielding a profile of the critical skills associated with project management from the view of the expert respondents. This thesis proposes this profile as a project manager selection method. It also analyzes organizational project management problems in light of the proposed selection model as a way of helping the project manager, project, and organization move toward more sustainable development. The effectiveness of this tool in distinguishing between effective and less effective project managers will form the basis of potential future research.
93

Analysis and improvement of teaching processes and environments with Internet-related tools

Zhang, Yuxiang January 2006 (has links)
Nowadays, universities and colleges seek e-learning solutions that permit students to learn at different places and times from the usual classes, and e-teaching tools that assist teachers in handling their increasing numbers of students. Inexpensive, platform-independent tools are useful to professors who want to manage their teaching without additional work and with limited resources such as people, machinery, and Internet bandwidth. To improve work processes in the educational environment, particularly in teaching quantitative, problem-solving subjects, the development, dissemination, and use of Internet-related tools would be welcome. Several Internet-related tools and projects are ongoing under Professor John C. Nash in the School of Management at the University of Ottawa. The E-tutor project has so far shown the merit of a "watch the student work" teaching method that can be implemented, at least in part, using WWW-type tools running at low bandwidth with more than 100 students. These tools are, however, not yet perfectly organized. It is also important that their advantages and weaknesses be analyzed and placed in context with other solutions being developed and implemented.
94

Adding probability to neural network prediction of NICU mortality

Zhou, Dajie January 2006 (has links)
It has been shown that neural networks can be trained to predict clinical outcomes in a neonatal intensive care unit (NICU). This thesis expands on past research and presents a neural network approach that predicts the probability of NICU mortality. The resulting neural network models are able not only to classify the dichotomous outcome (i.e., survival or death) but also to estimate the probability of death. The general conditions necessary for a neural network to estimate probabilities are discussed in the thesis. The neural network estimation of mortality probability involves the following steps: modeling the neural network cost function with the likelihood function; deriving the gradient ascent training algorithm to perform maximum likelihood estimation; developing the neural network models with the NICU data; and evaluating performance with the Receiver Operating Characteristic curve and the Hosmer-Lemeshow test. The application of these neural-network-based probability estimation models as mortality prognostic tools is presented. For this purpose, two approaches for improving the models' sensitivity are suggested: adjustment of the cost function and adjustment of the cutoff point. Both were tested, and the results are discussed.
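As a minimal sketch of the core idea - maximizing the Bernoulli log-likelihood by gradient ascent so that the network output is a calibrated probability - the snippet below trains a single-layer logistic model on synthetic data; the thesis itself develops multilayer networks on real NICU data, and all values here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data standing in for NICU features and outcomes (1 = death).
X = rng.normal(size=(200, 5))
true_w = np.array([1.5, -2.0, 0.5, 0.0, 1.0])
y = (sigmoid(X @ true_w) > rng.uniform(size=200)).astype(float)

# Single-layer model for brevity; the thesis uses multilayer networks.
w = np.zeros(5)
lr = 0.1
for _ in range(500):
    p = sigmoid(X @ w)              # predicted probability of death
    grad = X.T @ (y - p) / len(y)   # gradient of the mean log-likelihood
    w += lr * grad                  # gradient *ascent* on the likelihood

# Outputs are probabilities, not just class labels, so the decision
# cutoff can be tuned (e.g. lowered below 0.5) to raise sensitivity.
print("estimated probabilities:", sigmoid(X[:3] @ w))
```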
95

Three-dimensional visualization of multi-layered graphs with application to communications

Zhou, Jianong January 2006 (has links)
This thesis introduces two new algorithms for 3D graph drawing and network display. The first, the Incremental Projection Algorithm, is a universal algorithm for displaying any graph of any vertex degree. It can display a graph in 3D space without edge crossings, and the number of edge bends it produces does not exceed two. The average time complexity of the Incremental Projection Algorithm is O(N√N), where N is the number of vertices in the graph; if no vertex has degree greater than M, the time complexity is O(N). The second algorithm, the Depth-Height Buffer Algorithm, is designed for displaying special multi-layered networks. It is essentially a method for hidden-object elimination and is useful for visualizing multi-layered communication networks. Its time complexity is O(N log N). Two demonstration packages implementing these algorithms were developed to verify their correctness and functionality. The thesis also discusses methods and techniques for 2D graph drawing. For bi-layered crossing reduction, it presents a new method, the minimizing-angle approach, which may reduce crossings among the edges with a time complexity of O(|S|·max(|N|,|S|)).
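The Depth-Height Buffer Algorithm itself is not reproduced in the abstract. As a generic stand-in for hidden-object elimination in a layered scene, the sketch below uses the classic painter's-algorithm idea of sorting primitives back-to-front (the O(N log N) step) before drawing; it is explicitly not the thesis's algorithm, and the node names are illustrative.

```python
# Generic hidden-object elimination by depth sort (painter's algorithm):
# draw farthest primitives first so nearer ones overwrite them.
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    layer: int       # network layer (height)
    depth: float     # distance from the viewer

def render_order(nodes):
    """Back-to-front order: farthest first, lower layers first on ties.
    The sort dominates the cost, giving O(N log N) overall."""
    return sorted(nodes, key=lambda n: (-n.depth, n.layer))

scene = [Node("router-A", 2, 5.0), Node("switch-B", 1, 9.0),
         Node("host-C", 0, 9.0), Node("link-D", 1, 2.0)]

for node in render_order(scene):
    print("draw", node.name)   # later draws occlude earlier ones
```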
96

Handling metadata for statistical and other databases

Chen, Yiwen January 2006 (has links)
Metadata, data about data, provides information on the various facets of our data. It facilitates survey design as well as data discovery, analysis, and retrieval. Given the many requirements to incorporate and handle metadata in statistical applications, metadata tools in commonly used statistical software would be very useful, and R presents an opportunity for this. However, common metadata problems such as metadata capture, together with the difficulties of adding functions to R under Windows, make incorporating metadata functions into R a significant challenge. This thesis aims to demonstrate the feasibility of handling metadata in the statistical system R. It provides an R package with functions to facilitate metadata management. Taking advantage of the metadata levels and templates defined in the thesis, the functions attempt to implement (semi-)automatic metadata capture and updating. In addition, the functions provide examples of programming with R, especially for the novice. Furthermore, this thesis supplies a guideline for future work: a road map for the development of R metadata tools.
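To make the idea of metadata levels and templates concrete, here is a hedged sketch (in Python rather than R, with entirely illustrative field names; the thesis's own levels and templates are not reproduced) of a two-level template with semi-automatic capture that infers variable names from the data and leaves descriptive fields for the user:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class VariableMeta:
    name: str
    label: str = ""   # filled in manually after capture
    units: str = ""

@dataclass
class DatasetMeta:
    title: str
    source: str = ""
    captured: date = field(default_factory=date.today)
    variables: list[VariableMeta] = field(default_factory=list)

def capture_metadata(records, title):
    """Semi-automatic capture: infer variable names from the data,
    leaving labels and units for the user to fill in later."""
    names = records[0].keys() if records else []
    return DatasetMeta(title=title,
                       variables=[VariableMeta(name=n) for n in names])

meta = capture_metadata([{"height_cm": 172, "weight_kg": 70}], "survey 1")
print([v.name for v in meta.variables])  # ['height_cm', 'weight_kg']
```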
97

An experimental study of wireless LANs

Sattari, Afsaneh January 2006 (has links)
Wireless equipment based on the IEEE 802.11 standard has been increasingly successful in recent years. The popularity of wireless LAN hardware and the increasing use of different applications have brought challenges and competition among vendors of IEEE 802.11 equipment. Traditionally, such equipment has mainly been used indoors to allow users to connect wirelessly to the Internet or to a LAN. In this thesis we evaluate wireless equipment of the 802.11b standard. We consider typical scenarios and typical transport protocols (i.e., TCP and UDP) in order to observe the performance of the system in real settings. Although the performance of a system depends mainly on throughput, packet loss, delay, and jitter, in this thesis we take a larger number of performance metrics into consideration in order to better understand how diverse factors impact performance in real wireless LANs. Indeed, we perform a large set of tests and measurements with different hardware, which lets us obtain a more complete view of the system's performance. Such parameters matter in real networks because their interrelation affects the quality of service perceived by the various services carried by the network. We observe that the real throughput of WLANs is considerably lower than the raw rates usually advertised by manufacturers. From the results we conclude that, in addition to the inefficiency of the MAC layer, this situation is due to a combination of other factors detailed in the thesis. We found that one of the most important factors affecting performance is a corner in the signal path between the sender and the receiver. We also describe different hardware and software tools that are useful for carrying out a performance study of a wireless LAN.
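Two of the metrics named above, throughput and jitter, can be computed directly from a packet trace. The sketch below uses the standard RFC 3550 interarrival-jitter estimator and a simple payload-over-time throughput; the trace format (send_time, recv_time, size_bytes) and its values are illustrative, not measurements from the thesis.

```python
trace = [
    (0.000, 0.012, 1500),
    (0.010, 0.025, 1500),
    (0.020, 0.033, 1500),
    (0.030, 0.047, 1500),
]

def throughput_mbps(trace):
    """Payload bits delivered per unit of receive-window time."""
    bits = sum(size * 8 for _, _, size in trace)
    span = trace[-1][1] - trace[0][1]
    return bits / span / 1e6

def rfc3550_jitter(trace):
    """Smoothed mean deviation of transit-time differences (RFC 3550):
    J += (|D| - J) / 16 for each consecutive packet pair."""
    j = 0.0
    prev_transit = trace[0][1] - trace[0][0]
    for send, recv, _ in trace[1:]:
        transit = recv - send
        j += (abs(transit - prev_transit) - j) / 16.0
        prev_transit = transit
    return j

print(f"throughput: {throughput_mbps(trace):.2f} Mbit/s")
print(f"jitter:     {rfc3550_jitter(trace) * 1000:.3f} ms")
```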
98

A model to mitigate the bullwhip effect by ordering policies and forecasting methods

Yuan, Xin January 2006 (has links)
This thesis considers an important phenomenon: the bullwhip effect, the amplification of demand variability as one moves up a supply chain. We apply Agent-Based Modeling (ABM) at the macro level of our system to build relationships among the trading partners, and then utilize system dynamics to model rules such as ordering policy and inventory management at the micro level. We define these rules according to Sterman's generic stock acquisition and ordering heuristic in his Beer Distribution Game model (Sterman 1989, 321-339). By including additional forecasting methods - moving average, Holt's, and double exponential smoothing (DES) - we extend the model to investigate how different ordering policies and forecasting methods affect the bullwhip effect. Through simulations in the ordering policy space, we demonstrate that, when fed a local-trend customer order pattern, the bullwhip effect can be mitigated significantly if a suitable forecasting method such as Holt's or DES is applied under the right ordering policy, one that adjusts slowly to stock discrepancies. Compared with previous research, we extend Sterman's model to investigate how other managerial behaviours - the aggregation effect, varied ordering policies, and additional forecasting methods - affect the bullwhip effect. On the smoothing side, we extend the moving average and exponential average, which appeared in Forrester's study (Forrester 1961), to Holt's and the DES method. In modeling, we differ from CDRS (Chen, Drezner, Ryan, and Simchi-Levi 2000, 436-443), CRS (Chen, Ryan, and Simchi-Levi 2000, 269-286), and Yao (2001) by replacing the order-up-to policy with the heuristic ordering policy. Our research offers another important managerial insight: with the right ordering policy and forecasting method, trading partners can alleviate the bullwhip effect without adopting information sharing, which may raise other problems such as mutual trust or additional cost. Keywords: supply chain management; bullwhip effect; system dynamics; agent-based modeling; forecasting.
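As a hedged illustration of the heuristic and the effect, the sketch below simulates a single echelon using Sterman's anchoring-and-adjustment ordering rule with an exponential-smoothing forecast, and reports the order-to-demand variance ratio; all parameter values are illustrative, not the thesis's calibration.

```python
import random

random.seed(1)
alpha_s, beta = 0.4, 0.3       # stock and supply-line adjustment fractions
theta = 0.25                   # forecast smoothing constant
desired_stock, lead_time = 20, 2

stock, forecast = 20.0, 10.0
pipeline = [10.0] * lead_time  # orders placed but not yet received
demands, orders = [], []

for t in range(200):
    demand = 10.0 + random.gauss(0, 2)        # noisy customer demand
    stock += pipeline.pop(0) - demand          # receive, then ship
    forecast += theta * (demand - forecast)    # exponential smoothing
    supply_line = sum(pipeline)
    desired_sl = forecast * lead_time
    # Sterman's heuristic: anchor on the forecast, adjust toward the
    # desired stock and (partially) toward the desired supply line.
    order = max(0.0, forecast
                + alpha_s * (desired_stock - stock)
                + alpha_s * beta * (desired_sl - supply_line))
    pipeline.append(order)
    demands.append(demand)
    orders.append(order)

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# A bullwhip ratio above 1 means this echelon amplifies variability.
print("bullwhip ratio:", var(orders) / var(demands))
```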
99

Development of "AlarmLocator" --- A computerized model for predicting the optimum number, location, and power level of acoustic warning devices in noisy work plants

Al Osman, Rida January 2007 (has links)
Acoustic warning signals are necessary to promptly alert workers to events that can compromise safety. Failure to react to warning signals can increase the risk of accidents in the workplace. Unfortunately, the use of warning signals in industry is poorly regulated and subject to intuitive installation practices, often with little regard for the many factors that contribute to their efficient use. This research presents the development of a new software tool called "AlarmLocator" that automates the process of installing auditory warning devices in a given setting, in terms of the characteristics of the devices to use and their optimal location in the plant. The software tool, when used in combination with the psychoacoustic model "Detectsound" (Zheng et al., 2006), produces a solution to two practical installation problems: (1) selecting a suitable number of warning devices and an acoustic power for a given work area, and (2) specifying the location of the devices in the plant such that the signals emitted are clearly audible to all workers at all workstations. Thus, a solution to the problem of installing warning devices is provided in a format that can be easily understood and used in the workplace. A simple hybrid method, combining the mirror image technique with classical room acoustics theory, was used to compute the sound pressure level generated by acoustic alarm devices. The former technique accounts for wall-placement directivity effects by using a relatively low order of virtual sources to simulate early reverberation; classical room acoustics theory is used for late reverberation. The method requires only the overall room dimensions and the estimated or measured reverberation time; however, it is currently limited to rectangular rooms. It was validated in a classroom at the University of Ottawa to verify that sound pressure level predictions by "AlarmLocator" would provide realistic solutions for the optimal installation of acoustic warning devices in the workplace. The validation was carried out over 125 arrangements (source, receiver, frequency), and average errors of 0.15 and 2.2 dB were noted for the estimated and measured reverberation time conditions, respectively. This is well within the design range (13 dB) for the warning sound level window under "Detectsound". [Work funded by a research grant provided by the Workplace Safety and Insurance Board (Ontario).]
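The hybrid prediction can be sketched from textbook acoustics: first-order image sources approximate early reflections, and a classical diffuse-field term covers late reverberation. The sketch below makes that idea concrete under those assumptions; room dimensions, absorption, and source power are illustrative, and the thesis's actual "AlarmLocator" implementation is not reproduced here.

```python
import math

Lx, Ly, Lz = 10.0, 8.0, 4.0   # rectangular room dimensions (m)
alpha = 0.2                    # mean absorption coefficient (illustrative)
Lw = 100.0                     # alarm sound power level (dB re 1 pW)

def images(src):
    """The source plus its first-order mirror images across all six walls."""
    out = [tuple(src)]
    for axis, L in zip(range(3), (Lx, Ly, Lz)):
        for mirrored in (-src[axis], 2 * L - src[axis]):
            p = list(src)
            p[axis] = mirrored
            out.append(tuple(p))
    return out

def spl(src, rcv):
    """SPL = Lw + 10 log10(early image-source energy + diffuse tail)."""
    refl = 1.0 - alpha                    # energy retained per reflection
    early = 0.0
    for i, img in enumerate(images(src)):
        r2 = sum((a - b) ** 2 for a, b in zip(img, rcv))
        gain = 1.0 if i == 0 else refl    # direct path is unattenuated
        early += gain / (4.0 * math.pi * r2)
    S = 2 * (Lx * Ly + Ly * Lz + Lx * Lz)
    room_const = S * alpha / (1.0 - alpha)
    late = 4.0 / room_const               # classical reverberant-field term
    return Lw + 10.0 * math.log10(early + late)

print(f"SPL at workstation: {spl((2, 2, 2), (8, 6, 1.5)):.1f} dB")
```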
100

A compliance framework for business processes based on URN

Ghanavati, Sepideh January 2007 (has links)
Compliance with institutional policies, government regulations, and applicable legislation is a major concern for any organization when defining its business processes. These regulations are usually complex and hard to understand, and they rarely come with a model or taxonomy. Moreover, both business processes and regulations are susceptible to change, with the potential of introducing non-compliance. This thesis presents a framework intended to help companies track compliance by leveraging requirements engineering models. Compliance is managed by establishing links between User Requirements Notation (URN) models of government legislation and of organizational business processes, and by tracking how those links are affected in a requirements management system. Special attention is paid to maintaining compliance as either the legislation or the business processes evolve over time. The framework is evaluated by way of a case study from the healthcare industry. The case study centres on the approval process implemented to control access to a data warehouse at a major Ontario hospital and on whether this process complies with relevant legislation and hospital guidelines. The relevant legislation in Ontario is the new provincial Personal Health Information Protection Act (PHIPA).
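The compliance-tracking idea reduces to explicit traceability links between model elements that are flagged for review when either endpoint changes. Here is a hedged sketch of that data structure; the element names are illustrative, and the URN notations themselves (GRL goal models, UCM scenarios) are not reproduced.

```python
from dataclasses import dataclass

@dataclass
class Element:
    model: str      # "legislation" or "business_process"
    name: str
    version: int

@dataclass
class Link:
    source: Element        # e.g. a PHIPA clause modeled in URN
    target: Element        # e.g. a hospital approval-process task
    source_seen: int       # versions current when the link was reviewed
    target_seen: int

    def stale(self):
        """True if either endpoint evolved since the last review,
        signalling possible non-compliance to re-check."""
        return (self.source.version > self.source_seen
                or self.target.version > self.target_seen)

clause = Element("legislation", "PHIPA consent clause", version=2)
task = Element("business_process", "approve warehouse access", version=1)
link = Link(clause, task, source_seen=1, target_seen=1)

# The legislation model evolved (v1 -> v2): the link needs re-review.
print("compliance review needed:", link.stale())  # True
```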
