391. SOME CONTRIBUTIONS TO THE CENSORED EMPIRICAL LIKELIHOOD WITH HAZARD-TYPE CONSTRAINTS
Hu, Yanling, 01 January 2011
Empirical likelihood (EL) is a recently developed nonparametric method of statistical inference. Owen's 2001 book contains many important results for EL with uncensored data, but fewer results are available for EL with right-censored data. In this dissertation, we first investigate a right-censored-data extension of Qin and Lawless (1994), who studied EL with uncensored data when the number of estimating equations is larger than the number of parameters (the over-determined case). We obtain results similar to theirs for the maximum EL estimator and the EL ratio test in the over-determined case with right-censored data, employing hazard-type constraints, which are better suited to right-censored data. We then investigate EL with right-censored data and a k-sample mixed hazard-type constraint, and show that the EL ratio test statistic has a limiting chi-square distribution when k = 2. We also study the relationship between the constrained Kaplan-Meier estimator and the corresponding Nelson-Aalen estimator, and argue that they are asymptotically equivalent under certain conditions. Finally, we present simulation studies and examples showing how to apply our theory and methodology to real data.
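As a rough numerical illustration of the two estimators named above (our own sketch, not material from the dissertation), the following snippet computes the Kaplan-Meier survival estimate and the Nelson-Aalen cumulative-hazard estimate from simulated right-censored data; the closeness of S_KM(t) and exp(-Lambda_NA(t)) is the pattern behind the asymptotic-equivalence question studied here.

```python
import numpy as np

def km_na(times, events, t0):
    """Kaplan-Meier S(t0) and Nelson-Aalen cumulative hazard L(t0) from
    right-censored data (events: 1 = observed failure, 0 = censored)."""
    order = np.argsort(times)
    times, events = np.asarray(times)[order], np.asarray(events)[order]
    n = len(times)
    km, na = 1.0, 0.0
    for i, (t, d) in enumerate(zip(times, events)):
        if t > t0:
            break
        r = n - i                     # number still at risk just before time t
        if d:                         # only observed failures move the estimates
            km *= 1.0 - 1.0 / r       # KM: product of conditional survival terms
            na += 1.0 / r             # NA: running sum of hazard increments
    return km, na

rng = np.random.default_rng(0)
t_fail = rng.exponential(1.0, size=2000)   # latent failure times, S(t) = exp(-t)
t_cens = rng.exponential(1.5, size=2000)   # independent censoring times
obs = np.minimum(t_fail, t_cens)
evt = (t_fail <= t_cens).astype(int)
km, na = km_na(obs, evt, t0=1.0)
print(km, np.exp(-na), np.exp(-1.0))       # all three values should be close
```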
392. A Fault-Based Model of Fault Localization Techniques
Hays, Mark A., 01 January 2014
Every day, ordinary people depend on software working properly. We take it for granted: from banking software to railroad switching software, flight control software, and software that controls medical devices such as pacemakers or even gas pumps, our lives are touched by software that we expect to work. It is well known that the main activity used to ensure the quality of software is testing. Often it is the only quality assurance activity undertaken, making it that much more important.
In a typical experiment studying fault localization techniques, a researcher intentionally seeds a fault (deliberately breaking the functionality of some source code) in the hope that the automated techniques under study will identify the fault's location in the source code. These faults are picked arbitrarily, so there is potential for bias in their selection. Previous researchers have established an ontology, called fault size, for understanding and expressing this bias. This research captures the fault size ontology in the form of a probabilistic model. The results of applying this model to measure fault size suggest that many faults generated through program mutation (the systematic replacement of source code operators to create faults) are very large and easily found. Secondary measures generated in the assessment of the model suggest a new static analysis method, called testability, for predicting the likelihood that code will contain a fault in the future.
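For readers unfamiliar with program mutation, the sketch below (a generic illustration in Python 3.9+, not the dissertation's tooling) seeds a fault of exactly the kind described: one arithmetic operator is systematically replaced.

```python
import ast

class AddToSub(ast.NodeTransformer):
    """A classic mutation operator: replace every '+' with '-'."""
    def visit_BinOp(self, node):
        self.generic_visit(node)
        if isinstance(node.op, ast.Add):
            node.op = ast.Sub()        # seed the fault
        return node

original = "def midpoint(lo, hi):\n    return (lo + hi) // 2\n"
mutant = ast.unparse(AddToSub().visit(ast.parse(original)))
print(mutant)   # now computes (lo - hi) // 2: a seeded fault that a
                # fault localization technique should be able to pinpoint
```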
While software testing researchers are not statisticians, they nonetheless make extensive use of statistics in their experiments to assess fault localization techniques. Researchers often select their statistical techniques without justification. This is a very worrisome situation because it can lead to incorrect conclusions about the significance of research. This research introduces an algorithm, MeansTest, which helps automate some aspects of the selection of appropriate statistical techniques. The results of an evaluation of MeansTest suggest that MeansTest performs well relative to its peers. This research then surveys recent work in software testing using MeansTest to evaluate the significance of researchers' work. The results of the survey indicate that software testing researchers are underreporting the significance of their work.
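The abstract does not spell out MeansTest's internals. Purely to illustrate what automating the choice of a statistical technique can look like, here is a minimal decision procedure of our own devising (normality check, then a parametric or rank-based two-sample test) built on scipy; it is not the published MeansTest algorithm.

```python
import numpy as np
from scipy import stats

def choose_and_run_test(a, b, alpha=0.05):
    """Pick a two-sample location test from the data's properties.
    A generic sketch, not the published MeansTest algorithm."""
    if all(stats.shapiro(x).pvalue > alpha for x in (a, b)):  # plausibly normal?
        equal_var = stats.levene(a, b).pvalue > alpha         # equal variances?
        result = stats.ttest_ind(a, b, equal_var=equal_var)
        name = "Student's t" if equal_var else "Welch's t"
    else:
        result = stats.mannwhitneyu(a, b)                     # rank-based fallback
        name = "Mann-Whitney U"
    return name, result.pvalue

rng = np.random.default_rng(2)
print(choose_and_run_test(rng.normal(0.0, 1.0, 30), rng.normal(0.8, 1.0, 30)))
```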
393. DETAILED MODELING OF MUFFLERS WITH PERFORATED TUBES USING SUBSTRUCTURE BOUNDARY ELEMENT METHOD
Datchanamourty, Balasubramanian, 01 January 2004
Perforated tubes in mufflers are generally modeled by the transfer impedance approach, since modeling the actual geometry of the perforated tubes with holes is very expensive due to the enormous number of boundary elements required. With the development of the substructuring technique, which greatly reduces the number of elements required, detailed modeling of the perforated tubes has become possible. In this thesis, mufflers with perforated tubes are analyzed by modeling the actual geometry and locations of the holes on the perforated tubes. The direct mixed-body boundary element method with substructuring is used to model the mufflers. Mufflers of various geometries containing perforated tubes with holes of different sizes and porosities are tested. The results obtained from the analyses are compared with empirical-formula results and experimental results. A preliminary investigation of the detailed modeling of flow-through catalytic converters is also conducted.
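For context, detailed muffler models are commonly checked against the classical plane-wave result for a simple expansion chamber (no perforations); a short sketch of that benchmark, with illustrative dimensions:

```python
import numpy as np

def expansion_chamber_tl(freq_hz, m=4.0, length_m=0.3, c=343.0):
    """Plane-wave transmission loss of a simple expansion chamber with
    chamber-to-pipe area ratio m and length L: the classical benchmark
    that detailed BEM muffler models are often validated against."""
    k = 2.0 * np.pi * np.asarray(freq_hz) / c        # acoustic wavenumber
    return 10.0 * np.log10(1.0 + 0.25 * (m - 1.0/m)**2 * np.sin(k * length_m)**2)

for f in (100, 300, 500, 1000):
    print(f"{f:5d} Hz : TL = {expansion_chamber_tl(f):5.2f} dB")
```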
394. Image Filtering Methods for Biomedical Applications
Niazi, M. Khalid Khan, January 2011
Filtering is a key step in digital image processing and analysis. It is mainly used for amplification or attenuation of some frequencies, depending on the nature of the application. Filtering can be performed either in the spatial domain or in a transformed domain. The selection of the filtering method, the filtering domain, and the filter parameters is often driven by the properties of the underlying image. This thesis presents three different kinds of biomedical image filtering applications in which the filter parameters are automatically determined from the underlying images.

Filtering can be used for image enhancement. We present a robust, image-dependent filtering method for intensity inhomogeneity correction of biomedical images. In the presented filtering method, the filter parameters are automatically determined from the grey-weighted distance transform of the magnitude spectrum. An evaluation shows that the filter provides an accurate estimate of intensity inhomogeneity.

Filtering can also be used for analysis. The thesis presents a filtering method for heart localization and robust signal detection from video recordings of rat embryos. It presents a strategy to decouple motion artifacts produced by the non-rigid embryonic boundary from the heart. The method also filters out noise and the trend term with the help of empirical mode decomposition. Again, all the filter parameters are determined automatically based on the underlying signal.

Transforming the geometry of one image to fit that of another, so-called image registration, can be seen as a filtering operation on the image geometry. To assess the progression of an eye disorder, registration between temporal images is often required to determine the movement and development of the blood vessels in the eye. We present a robust method for retinal image registration. The method is based on particle swarm optimization, where the swarm searches for optimal registration parameters based on the direction of its cognitive and social components. An evaluation of the proposed method shows that it is less susceptible to becoming trapped in local minima than previous methods.

With these thesis contributions, we have augmented the filter toolbox for image analysis with methods that adjust to the data at hand.
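To give a flavour of the enhancement application, here is a minimal multiplicative bias-field correction by low-pass filtering. Note the key simplification: the filter width sigma is hand-picked below, whereas the thesis's contribution is to derive the filter parameters automatically from the grey-weighted distance transform of the magnitude spectrum.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def correct_inhomogeneity(img, sigma=40.0):
    """Remove a slowly varying multiplicative bias field from an image.
    sigma is hand-picked here; the thesis instead derives the filter
    parameters automatically from the image's magnitude spectrum."""
    log_img = np.log(img.astype(float) + 1e-6)  # multiplicative -> additive bias
    bias = gaussian_filter(log_img, sigma)      # smooth component ~ bias field
    corrected = np.exp(log_img - bias)          # subtract the bias, undo the log
    return corrected * img.mean() / corrected.mean()  # restore intensity level

# toy example: a striped pattern under a smooth left-to-right shading ramp
y, x = np.mgrid[0:256, 0:256]
img = (100.0 + 50.0 * np.sin(y / 8.0)) * (0.5 + x / 255.0)
flat = correct_inhomogeneity(img)
```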
395. The Impact of Domain Knowledge on the Effectiveness of Requirements Engineering Activities
Niknafs, Ali, January 2014
One of the factors that seems to influence an individual's effectiveness in requirements engineering activities is his or her knowledge of the problem being solved, i.e., domain knowledge. While in-depth domain knowledge enables a requirements engineer to understand the problem more easily, he or she can fall prey to tacit assumptions of the domain and might overlook issues that are obvious to domain experts and thus remain unmentioned.
The purpose of this thesis is to investigate the impact of domain knowledge on different requirements engineering activities. The main research question this thesis attempts to answer is “How does one form the most effective team, consisting of some mix of domain ignorants and domain awares, for a requirements engineering activity involving knowledge about the domain of the computer-based system whose requirements are being determined by the team?”
This thesis presents two controlled experiments and an industrial case study to test a number of hypotheses. The main hypothesis states that a requirements engineering team for a computer-based system in a particular domain, consisting of a mix of requirements analysts that are ignorant of the domain and requirements analysts that are aware of the domain, is more effective at requirement idea generation than a team consisting of only requirements analysts that are aware of the domain.
The results of the controlled experiments, although not conclusive, provided some support for the positive effect of the mix on the effectiveness of a requirements engineering team. The results also showed a significant effect of other independent variables, especially educational background. The data from the case study corroborated the results of the controlled experiments.
The main conclusion that can be drawn from the findings of this thesis is that the presence in a requirements engineering team of a domain ignorant with a computer science or software engineering background improves the effectiveness of the team.
396. Mechanistic-empirical failure prediction models for spring weight restricted flexible pavements in Manitoba using Manitoba and MnROAD instrumented test sites
Kavanagh, Leonnie, 27 June 2013
Pavement damage due to heavy loads on thaw-weakened flexible pavements is a major concern for road agencies in Western Canada. To protect weaker, low-volume roads, agencies impose spring weight restrictions (SWR) during the spring thaw to reduce pavement damage. While SWR may be cost effective for highway agencies, reducing the spring weight allowances can have a major impact on truck productivity and shipping costs. Therefore, an improved process that links SWR loads to pavement damage, based on limiting failure strain, is required.

This thesis developed Local mechanistic-empirical damage models to predict fatigue and rutting failure on two spring weight restricted (SWR) flexible pavements in Manitoba. The Local damage models were used to assess the SWR loads that regulate commercial vehicle weights in Manitoba, based on a limiting strain relationship between truck loads and damage. The Local damage models and a calibrated Finite Element Model (FEM) were used to predict the equivalent single axle load (ESAL) repetitions to fatigue and rutting failure at varying B-Train axle loads at the Manitoba sites. The Local model predictions were compared to predictions from the Asphalt Institute (AI) and Mechanistic-Empirical Pavement Design Guide (MEPDG) damage models. The results of the analysis showed that for each 1% increase in load, there was a corresponding 1% increase in strain and up to a 3% decrease in ESAL repetitions to failure, depending on whether the Local, AI, or MEPDG damage model was used. The limiting failure strains, computed from the Local model for design ESALs of 100,000, were 483 μm/m and 1,008 μm/m for fatigue and rutting failure, respectively. For the Manitoba sites, the predicted FEM strains at B-Train normal and SWR loads were higher than the Local model limiting strains; therefore, the Manitoba SWR loads regulating B-Train operations on the two pavements during the spring period appeared to be reasonable. It is recommended that the research findings be verified with further calibration and validation of the Local damage model using a larger data set of low-volume flexible pavements. A strain-based concept for managing the SWR regime in Manitoba based on the limiting strains was developed and presented.
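The reported sensitivity (about 1% more strain and up to 3% fewer repetitions for each 1% of extra load) is what a power-law damage transfer function with an exponent near 3, such as the Asphalt Institute fatigue form, predicts. A back-of-envelope check with an illustrative exponent (not the thesis's calibrated coefficients):

```python
# Transfer functions of the form N = k * (1 / strain)^c are standard in
# mechanistic-empirical design; the exponent c = 3.0 is illustrative only.
c = 3.0
for d_load in (0.01, 0.02, 0.05):
    d_strain = d_load                         # ~1% more load -> ~1% more strain
    d_life = 1.0 - (1.0 + d_strain) ** (-c)   # relative drop in repetitions N
    print(f"{d_load:.0%} load increase -> {d_life:.1%} fewer repetitions to failure")
# 1% -> ~2.9%, consistent with the reported "up to 3% decrease"
```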
397. STUDYING THE IMPACT OF DEVELOPER COMMUNICATION ON THE QUALITY AND EVOLUTION OF A SOFTWARE SYSTEM
Bettenburg, Nicolas, 22 May 2014
Software development is a largely collaborative effort, of which the actual encoding of program logic in source code is a relatively small part. Software developers have to collaborate effectively and communicate with their peers in order to avoid coordination problems. To date, little is known about how developer communication during software development activities impacts the quality and evolution of a software system.
In this thesis, we present and evaluate tools and techniques to recover communication data from traces of software development activities. With this data, we study the impact of developer communication on the quality and evolution of the software through an in-depth investigation of the role of developer communication during software development activities. Through multiple case studies on a broad spectrum of open-source software projects, we find that communication between developers stands in a direct relationship to the quality of the software. Our findings demonstrate that our models based on developer communication explain software defects as well as state-of-the-art models based on technical information such as code and process metrics, and that social information metrics are orthogonal to these traditional metrics, leading to a more complete and integrated view of software defects. In addition, we find that communication between developers plays an important role in maintaining a healthy contribution management process, which is one of the key factors in the successful evolution of the software. Source code contributors who are part of the community surrounding open-source projects are available for limited times, and long communication times can lead to the loss of valuable contributions.
Our thesis illustrates that software development is an intricate and complex process that is strongly influenced by the social interactions between the stakeholders involved in the development activities. A traditional view based solely on technical aspects of software development, such as source code size and complexity, while valuable, limits our understanding of software development activities. The research presented in this thesis is a first step towards gaining a more holistic view of software development activities.
Thesis (Ph.D., Computing), Queen's University, 2014.
398. Silicon Nanoparticle Synthesis and Modeling for Thin Film Solar Cells
Albu, Zahra, 30 April 2014
Nanometer-scale silicon shows extraordinary electronic and optical properties that are not available in bulk silicon, and many investigations toward applications in optoelectronic devices are being pursued. Silicon nanoparticle films made from solution are a promising candidate for low-cost solar cells. However, controlling the properties of silicon nanoparticles, in particular their shape and size distribution, which affect device performance, is quite a challenge. At present, none of the solar cells made from silicon nanoparticle films have an efficiency exceeding that of cells based on crystalline silicon. To address the challenge of controlling silicon nanoparticle properties, both theoretical and experimental investigations are needed. In this thesis, we investigate silicon nanoparticle properties via quantum mechanical modeling of silicon nanoparticles and via synthesis of silicon nanoparticle films by colloidal grinding.

Silicon nanoparticles with cubic, rectangular, ellipsoidal, and flat-disk shapes are modeled using semi-empirical methods and configuration interaction. Their electronic properties with different surface passivations were also studied. The results showed that silicon nanoparticles with hydrogen passivation have higher HOMO-LUMO gaps, and that the HOMO-LUMO gap depends on the size and the shape of the particle. In contrast, silicon nanoparticles with oxygen passivation have a lower HOMO-LUMO gap. Raman spectroscopy calculations of silicon nanoparticles show peak shifts and asymmetric broadening similar to what has been observed in experiment.

Silicon nanoparticle synthesis via colloidal grinding was demonstrated as a straightforward and inexpensive approach for thin film solar cells. Data analysis of silicon particles via SEM images demonstrated that colloidal grinding is effective in reducing the Si particle size to sub-micron in a short grinding time. Further increases in grinding time, followed by filtration, narrowed the Si particle size and size distribution to an average size of 70 nm. Raman spectroscopy and EDS data demonstrated that the Si nanoparticles contain oxygen due to exposure to air during grinding. I-V characterization of the milled Si nanoparticles showed ohmic behaviour with low current at low biases, then Schottky-diode behaviour or a symmetric curve at large biases.
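The reported size and passivation dependence of the gap follows the usual quantum-confinement picture. For scale only, a simple effective-mass (Brus-type) estimate with illustrative textbook parameters for silicon reproduces the trend of a widening gap as the particle shrinks; the thesis itself relies on semi-empirical methods and configuration interaction rather than this crude model.

```python
import math
from scipy import constants as sc

def brus_gap_ev(d_nm, eg_bulk=1.12, me=0.26, mh=0.39, eps_r=11.7):
    """Effective-mass (Brus) gap estimate for a spherical Si nanocrystal.
    Parameter values are illustrative textbook numbers, not fitted ones."""
    d = d_nm * 1e-9
    confinement = sc.h**2 / (8.0 * d**2) * (1.0/(me*sc.m_e) + 1.0/(mh*sc.m_e)) / sc.e
    coulomb = 1.786 * sc.e / (4.0 * math.pi * eps_r * sc.epsilon_0 * d)
    return eg_bulk + confinement - coulomb

for d in (2, 3, 5, 10):
    print(f"d = {d:2d} nm -> gap ~ {brus_gap_ev(d):.2f} eV")  # widens as d shrinks
```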
399. Transición demográfica y pobreza en América Latina [Demographic transition and poverty in Latin America]
Alejo, Javier, January 2009
The empirical literature has found evidence of a trend toward population aging in Latin America. This paper analyzes the impact of demographic change on poverty, using the demographic projections of the United Nations together with different scenarios for the educational structure. The methodology used is econometric microsimulation; its main innovation is to propose the empirical maximum likelihood method as a strategy for simulating the weights. Under all the assumptions of the simulation model, the results suggest that if current population dynamics persist, poverty levels will fall. However, the quantitative effect is very weak, leaving ample room for the planning of economic policies aimed at poverty reduction.
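As an illustration of what simulating weights toward a demographic projection can look like (a minimal exponential-tilting sketch of our own, under stated assumptions, not the paper's estimator), the snippet below finds new survey weights, close to the originals, that reproduce a target old-age population share:

```python
import numpy as np
from scipy.optimize import minimize

def calibrate_weights(w0, x, target):
    """Find weights near w0 whose weighted mean of x equals `target`,
    via the convex dual of an exponential-tilting problem. A minimal
    stand-in for the paper's reweighting strategy, not its actual code."""
    def dual(lam):                        # gradient of dual = E_w[x] - target
        return np.log(w0 @ np.exp(x @ lam)) - lam @ target
    lam = minimize(dual, np.zeros(x.shape[1]), method="BFGS").x
    w = w0 * np.exp(x @ lam)
    return w / w.sum()

rng = np.random.default_rng(1)
age = rng.integers(0, 90, size=1000)              # toy survey: uniform ages
x = (age >= 65).astype(float).reshape(-1, 1)      # moment: old-age indicator
w0 = np.full(1000, 1.0 / 1000)                    # original uniform weights
w1 = calibrate_weights(w0, x, target=np.array([0.20]))
print((w1 * x[:, 0]).sum())                       # ~0.20, the projected share
```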
400. Die Psalmboek 2003 as kommunikasiemiddel in die liturgie van die erediens in die Gereformeerde Kerke in Suid-Afrika : 'n himnologiese studie / J.H. van Rooy [The Psalter 2003 as a means of communication in the liturgy of the worship service in the Reformed Churches in South Africa: a hymnological study]
Van Rooy, Jacoba Hendrika, January 2008
During services of the Reformed Churches of South Africa (RCSA), certain psalms and Biblical hymns are used frequently, while others are almost never used. The objective of this study is to determine how the Psalter 2003 can be used optimally. The model of Zerfass is used for the investigation. The final results of the study are presented in the form of a model that can enrich the optimal use of the Psalter 2003 as a means of communication in the liturgy of the RCSA.
In Chapters 2 and 3, a basic-theoretical investigation is conducted, with reference to perspectives from Scripture and history. The investigation reveals that music and song already played an important role in the church of the Old Testament, and that the Psalms had a significant influence on the liturgy and the faith. In the New Testament, new hymns are found that supplement those from the Old Testament. The form in which these hymns were composed was closely connected to the context from which they arose. The communication that a hymn effects is achieved jointly by word and music, in the church of the New Testament and subsequently. Since the Reformation, a degree of separation arose between the songs used inside and outside the church, but mutual influences can be observed.
In Chapters 4 and 5, attention is paid to metatheoretical perspectives, first from communication science, then from hymnology. It is in their singing that the congregation participates in the worship of a church service. Such singing is indeed the basic form of participation, and it promotes communication among members of the congregation. In judging a metrical version of a psalm, attention should especially be paid to the content, the style of the text, and the melody.
In the empirical investigation, qualitative and quantitative methods were used. The investigation proceeded in three phases. In the first phase, information was obtained about the hymns sung in a number of congregations in the course of a year. In the second phase, a questionnaire was compiled and sent to ministers, organists, and members of congregations. In the final phase, interviews were conducted with ministers and organists from five congregations.
The data show that the hymn occupies an important place in the liturgy. However, the investigation reveals that this point of departure is not fully realised in practice, and some shortcomings are identified. In particular, there is a need for an extension of the hymnody, especially with hymns from the New Testament.
In Chapter 7, an indication is given of the factors that constrain the optimal use of the Psalter 2003 as a means of worship in the church service. The most important problems are:
• the incomplete utilisation of the full collection of hymns in the Psalter,
• the underuse of the 2001 metrical version,
• the repeated use of a small subset of hymns,
• problems with the melodies and liturgical usefulness,
• limitations in the training of ministers and organists,
• resistance to the 2001 metrical version,
• a lack of proper programmes for practising the new hymns, and
• inadequate support and encouragement by church councils in respect of improving the skills of organists.
In view of these problems, a model is proposed that identifies the relationships among the role players and elements necessary to promote worshipping through singing:
• the users of the Psalter 2003, viz. ministers, organists, and members of the congregations, together with the role of the church council,
• the Psalter 2003 as hymn book, and
• the possible extension of the current corpus of hymns.
Thesis (Ph.D., Liturgics), North-West University, Potchefstroom Campus, 2009.