391 |
Epiphanies of finitude: a phenomenological study of existential reading. Sopcak, Paul. Unknown Date
No description available.
|
392 |
Lietuvos energetikos ūkio administravimas / Lithuanian energy network administration. Tatarnikova, Anna. 10 January 2007 (has links)
The master's thesis "Lietuvos energetikos ūkio administravimas" (Administration of the Lithuanian energy sector) was written by Anna Tatarnikova under the supervision of Prof. Dr. Vladimir Obrazcov of the Department of Public Administration.
The relevance of the chosen topic lies in the fact that the Lithuanian heating sector is still at a developmental stage, and a proper assessment of the existing experience in administering the energy sector makes it possible to change and influence the future of the heating sector.
The novelty and practical significance of the work lie in the fact that, alongside the theoretical and practical analysis, a study of the effectiveness of Lithuanian heating-sector administration is carried out.
It must be emphasized that changes in the energy sector are inevitable; by engaging qualified specialists in the field, drawing on scientific research, and performing a SWOT analysis that reflects the actual situation, it is possible to ensure the smooth operation of the heating sector and the transparent use of financial resources.
Drawing on an analysis of the scientific literature, the thesis presents alternatives for attracting capital for the modernization of the heating sector. The main alternatives are bank credits, concession or lease of the company, and partial privatization of the company. Other alternatives, a service contract and an administration contract, are also proposed, but their terms of validity are much shorter and they can be used only as temporary measures.
Historical and comparative research methods are used to examine the development of heating-sector administration in Lithuania, and the factors influencing it, after the restructuring of the heating sector. Summarizing the results and providing recommendations... [to full text] / Actuality of the chosen topic lies in the fact that the Lithuanian heating network is still under development.
Having correctly assessed the existing administration experience, one can change or influence the future of the heating sector.
The novelty and practical significance of the work consist in the research into the effectiveness of Lithuanian heating-system administration, carried out alongside the theoretical and practical analysis.
The accent is placed on the inevitability of changes in the energy sector.
These changes, if investigated in depth by researchers and qualified specialists performing a thorough SWOT analysis, can reinforce heating-system activities and ensure the transparent use of financial resources.
The scientific literature has been analyzed with respect to alternative capital investments for the modernization of heating systems. The main alternatives are bank credits, company concession rentals, and privatization of companies. The other alternatives are a service agreement and an administration agreement; however, their term of validity is far shorter, so they can serve only as temporary measures. In summarizing the results, the method of analytical criticism has been used.
AB "Vilniaus energija", the Vilnius energy company, has been analysed as of the beginning of 2002, when it signed a rental concession agreement with a foreign company. To find out the advantages and disadvantages of renting the heating system to the private sector, the... [to full text]
|
393 |
SOME CONTRIBUTIONS TO THE CENSORED EMPIRICAL LIKELIHOOD WITH HAZARD-TYPE CONSTRAINTS. Hu, Yanling. 01 January 2011 (has links)
Empirical likelihood (EL) is a recently developed nonparametric method of statistical inference. Owen’s 2001 book contains many important results for EL with uncensored data. However, fewer results are available for EL with right-censored data. In this dissertation, we first investigate a right-censored-data extension of Qin and Lawless (1994). They studied EL with uncensored data when the number of estimating equations is larger than the number of parameters (over-determined case). We obtain results similar to theirs for the maximum EL estimator and the EL ratio test, for the over-determined case, with right-censored data. We employ hazard-type constraints which are better able to handle right-censored data. Then we investigate EL with right-censored data and a k-sample mixed hazard-type constraint. We show that the EL ratio test statistic has a limiting chi-square distribution when k = 2. We also study the relationship between the constrained Kaplan-Meier estimator and the corresponding Nelson-Aalen estimator. We try to prove that they are asymptotically equivalent under certain conditions. Finally we present simulation studies and examples showing how to apply our theory and methodology with real data.
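The uncensored EL machinery that these censored, hazard-constrained extensions build on can be sketched in a few lines. The sketch below is a minimal Owen-style EL log-ratio for a single mean constraint; the function name and setup are illustrative, not the dissertation's construction.

```python
import numpy as np
from scipy.optimize import brentq

def el_log_ratio(x, mu):
    """-2 log empirical likelihood ratio for the mean (uncensored data).

    Minimal Owen-style sketch; illustrative only, not the censored,
    hazard-constrained version developed in the dissertation.
    """
    n = len(x)
    d = np.asarray(x, dtype=float) - mu
    if d.max() <= 0 or d.min() >= 0:
        return np.inf  # mu lies outside the convex hull of the data
    # Lagrange multiplier: solve sum d_i / (1 + lam*d_i) = 0 with weights > 0
    eps = 1e-10
    lo = -1.0 / d.max() + eps
    hi = -1.0 / d.min() - eps
    g = lambda lam: np.sum(d / (1.0 + lam * d))
    lam = brentq(g, lo, hi)
    w = 1.0 / (n * (1.0 + lam * d))  # EL probability weights
    return -2.0 * np.sum(np.log(n * w))
```

Comparing the returned statistic against a chi-square quantile (one degree of freedom here) gives the EL test; the dissertation's results extend this calibration to over-determined and k-sample censored settings.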
|
394 |
A Fault-Based Model of Fault Localization Techniques. Hays, Mark A. 01 January 2014 (has links)
Every day, ordinary people depend on software working properly. We take it for granted; from banking software, to railroad switching software, to flight control software, to software that controls medical devices such as pacemakers or even gas pumps, our lives are touched by software that we expect to work. It is well known that the main technique/activity used to ensure the quality of software is testing. Often it is the only quality assurance activity undertaken, making it that much more important.
In a typical experiment studying these techniques, a researcher will intentionally seed a fault (deliberately breaking the functionality of some source code) in the hope that the automated techniques under study will be able to identify the fault's location in the source code. These faults are picked arbitrarily; there is potential for bias in the selection of the faults. Previous researchers have established an ontology for understanding or expressing this bias, called fault size. This research captures the fault size ontology in the form of a probabilistic model. The results of applying this model to measure fault size suggest that many faults generated through program mutation (the systematic replacement of source code operators to create faults) are very large and easily found. Secondary measures generated in the assessment of the model suggest a new static analysis method, called testability, for predicting the likelihood that code will contain a fault in the future.
While software testing researchers are not statisticians, they nonetheless make extensive use of statistics in their experiments to assess fault localization techniques. Researchers often select their statistical techniques without justification. This is a very worrisome situation because it can lead to incorrect conclusions about the significance of research. This research introduces an algorithm, MeansTest, which helps automate some aspects of the selection of appropriate statistical techniques. The results of an evaluation of MeansTest suggest that MeansTest performs well relative to its peers. This research then surveys recent work in software testing using MeansTest to evaluate the significance of researchers' work. The results of the survey indicate that software testing researchers are underreporting the significance of their work.
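The idea of automating statistical-test selection can be illustrated with a simple decision rule: pre-check normality, then choose a parametric or nonparametric location test accordingly. This is a hypothetical sketch of the concept, not the actual MeansTest algorithm, whose decision procedure may differ.

```python
import numpy as np
from scipy import stats

def choose_means_test(a, b, alpha=0.05):
    """Pick a two-sample location test from a normality pre-check.

    Illustrative sketch of automated test selection; the thresholds
    and checks are assumptions, not those of MeansTest itself.
    """
    a, b = np.asarray(a, float), np.asarray(b, float)
    # Shapiro-Wilk as a normality gate for both samples
    normal = (stats.shapiro(a).pvalue > alpha and
              stats.shapiro(b).pvalue > alpha)
    if normal:
        name, res = "welch-t", stats.ttest_ind(a, b, equal_var=False)
    else:
        name, res = "mann-whitney", stats.mannwhitneyu(
            a, b, alternative="two-sided")
    return name, res.pvalue
```

A rule of this shape avoids applying a t-test to heavily skewed effectiveness data, which is one way an inappropriate technique can distort reported significance.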
|
395 |
DETAILED MODELING OF MUFFLERS WITH PERFORATED TUBES USING SUBSTRUCTURE BOUNDARY ELEMENT METHOD. Datchanamourty, Balasubramanian. 01 January 2004 (has links)
Perforated tubes in mufflers are generally modeled by the transfer impedance approach, since modeling the actual geometry of the perforated tubes with holes is very expensive due to the enormous number of boundary elements required. With the development of the substructuring technique, which greatly reduces the number of elements required, detailed modeling of the perforated tubes has become possible. In this thesis, mufflers with perforated tubes are analyzed by modeling the actual geometry and locations of the holes on the perforated tubes. The direct mixed-body boundary element method with substructuring is used to model the mufflers. Mufflers of various geometries containing perforated tubes with holes of different sizes and porosities are tested. The results obtained from the analyses are compared with empirical-formula results and experimental results. A preliminary investigation of the detailed modeling of flow-through catalytic converters is also conducted.
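For reference, the transfer impedance approach that the detailed hole-by-hole modeling replaces typically reduces the perforate to an empirical formula. The sketch below uses the widely cited Sullivan-Crocker form for a perforate without mean flow; the coefficients come from that published correlation, not from this thesis.

```python
import math

def perforate_impedance(f_hz, t_wall, d_hole, porosity, c=343.0):
    """Normalized transfer impedance of a perforate (no mean flow),
    Sullivan-Crocker empirical form:
        zeta = (0.006 + j*k*(t + 0.75*d)) / sigma
    t_wall: wall thickness [m]; d_hole: hole diameter [m];
    porosity: open-area ratio sigma. Published coefficients, hedged:
    not a result of this thesis."""
    k = 2.0 * math.pi * f_hz / c  # acoustic wavenumber
    return (0.006 + 1j * k * (t_wall + 0.75 * d_hole)) / porosity
```

The resistive part is frequency-independent and the reactive part grows linearly with frequency, which is exactly the kind of lumped behavior the substructured boundary element model resolves from the actual hole geometry instead.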
|
396 |
Image Filtering Methods for Biomedical Applications. Niazi, M. Khalid Khan. January 2011 (has links)
Filtering is a key step in digital image processing and analysis. It is mainly used for the amplification or attenuation of certain frequencies, depending on the nature of the application. Filtering can be performed either in the spatial domain or in a transformed domain. The selection of the filtering method, the filtering domain, and the filter parameters is often driven by the properties of the underlying image. This thesis presents three different kinds of biomedical image filtering applications, in which the filter parameters are automatically determined from the underlying images.

Filtering can be used for image enhancement. We present a robust, image-dependent filtering method for intensity inhomogeneity correction of biomedical images. In the presented method, the filter parameters are automatically determined from the grey-weighted distance transform of the magnitude spectrum. An evaluation shows that the filter provides an accurate estimate of the intensity inhomogeneity.

Filtering can also be used for analysis. The thesis presents a filtering method for heart localization and robust signal detection from video recordings of rat embryos. It presents a strategy to decouple the motion artifacts produced by the non-rigid embryonic boundary from those of the heart. The method also filters out noise and the trend term with the help of empirical mode decomposition. Again, all the filter parameters are determined automatically from the underlying signal.

Transforming the geometry of one image to fit that of another, so-called image registration, can be seen as a filtering operation on the image geometry. To assess the progression of eye disorders, registration between temporal images is often required to determine the movement and development of the blood vessels in the eye. We present a robust method for retinal image registration. The method is based on particle swarm optimization, where the swarm searches for optimal registration parameters based on the direction of its cognitive and social components. An evaluation shows that the proposed method is less susceptible to becoming trapped in local minima than previous methods. With these thesis contributions, we have augmented the filter toolbox for image analysis with methods that adjust to the data at hand.
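The swarm search described above can be sketched as a standard global-best particle swarm optimizer. The hyperparameter values below are illustrative defaults, and the cost function passed in stands in for the thesis's registration similarity measure; this is not the thesis's modified cognitive/social update.

```python
import numpy as np

def pso(f, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal global-best particle swarm optimizer (illustrative sketch).

    f: cost function over a parameter vector (e.g. registration parameters);
    bounds: list of (low, high) per dimension.
    """
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    x = rng.uniform(lo, hi, (n_particles, len(lo)))      # positions
    v = np.zeros_like(x)                                 # velocities
    pbest = x.copy()                                     # personal bests
    pval = np.array([f(p) for p in x])
    g = pbest[pval.argmin()].copy()                      # global best
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        # inertia + cognitive pull (own best) + social pull (swarm best)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        fx = np.array([f(p) for p in x])
        improved = fx < pval
        pbest[improved], pval[improved] = x[improved], fx[improved]
        g = pbest[pval.argmin()].copy()
    return g, pval.min()
```

In a registration setting, the parameter vector would encode the geometric transform (e.g. translation and rotation) and f would be the image dissimilarity after warping.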
|
397 |
The Impact of Domain Knowledge on the Effectiveness of Requirements Engineering Activities. Niknafs, Ali. January 2014 (has links)
One of the factors that seems to influence an individual's effectiveness in requirements engineering activities is his or her knowledge of the problem being solved, i.e., domain knowledge. While in-depth domain knowledge enables a requirements engineer to understand the problem more easily, he or she can also fall prey to tacit assumptions of the domain and might overlook issues that are obvious to domain experts and thus remain unmentioned.
The purpose of this thesis is to investigate the impact of domain knowledge on different requirements engineering activities. The main research question this thesis attempts to answer is “How does one form the most effective team, consisting of some mix of domain ignorants and domain awares, for a requirements engineering activity involving knowledge about the domain of the computer-based system whose requirements are being determined by the team?”
This thesis presents two controlled experiments and an industrial case study to test a number of hypotheses. The main hypothesis states that a requirements engineering team for a computer-based system in a particular domain, consisting of a mix of requirements analysts that are ignorant of the domain and requirements analysts that are aware of the domain, is more effective at requirement idea generation than a team consisting of only requirements analysts that are aware of the domain.
The results of the controlled experiments, although not conclusive, provided some support for the positive effect of the mix on effectiveness of a requirements engineering team. The results also showed a significant effect of other independent variables, especially educational background. The data of the case study corroborated the results of the controlled experiments.
The main conclusion that can be drawn from the findings of this thesis is that the presence in a requirements engineering team of a domain ignorant with a computer science or software engineering background improves the effectiveness of the team.
|
398 |
Mechanistic-empirical failure prediction models for spring weight restricted flexible pavements in Manitoba using Manitoba and MnROAD instrumented test sites. Kavanagh, Leonnie. 27 June 2013 (has links)
Pavement damage due to heavy loads on thaw-weakened flexible pavements is a major concern for road agencies in Western Canada. To protect weaker, low-volume roads, agencies impose spring weight restrictions (SWR) during the spring thaw to reduce pavement damage. While SWR may be cost-effective for highway agencies, reducing the spring weight allowances can have a major impact on truck productivity and shipping costs. Therefore, an improved process that links SWR loads to pavement damage based on limiting failure strain is required.
This thesis developed Local mechanistic-empirical damage models to predict fatigue and rutting failure on two spring weight restricted (SWR) flexible pavements in Manitoba. The Local damage models were used to assess the SWR loads that regulate commercial vehicle weights in Manitoba, based on a limiting-strain relationship between truck loads and damage. The Local damage models and a calibrated finite element model (FEM) were used to predict the equivalent single axle load (ESAL) repetitions to fatigue and rutting failure at varying B-Train axle loads at the Manitoba sites. The Local model predictions were compared with predictions from the Asphalt Institute (AI) and Mechanistic-Empirical Pavement Design Guide (MEPDG) damage models. The results of the analysis showed that for each 1% increase in load there was a corresponding 1% increase in strain and up to a 3% decrease in ESAL repetitions to failure, depending on whether the Local, AI, or MEPDG damage model was used. The limiting failure strains computed from the Local model for a design level of 100,000 ESALs were 483 μm/m and 1,008 μm/m for fatigue and rutting failure, respectively. For the Manitoba sites, the predicted FEM strains at B-Train normal and SWR loads were higher than the Local model limiting strains; therefore, the Manitoba SWR loads regulating B-Train operations on the two pavements during the spring period appeared to be reasonable. It is recommended that the research findings be verified through further calibration and validation of the Local damage model using a larger data set of low-volume flexible pavements. A strain-based concept for managing the SWR regime in Manitoba based on the limiting strains was developed and presented.
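The general form of such damage transfer functions can be illustrated with the standard published Asphalt Institute criteria. The coefficients below are the generic AI ones, not the locally calibrated Manitoba coefficients developed in the thesis.

```python
def ai_fatigue_reps(eps_t_micro, e_mod_psi):
    """ESAL repetitions to fatigue failure, standard Asphalt Institute form.
    eps_t_micro: horizontal tensile microstrain at the bottom of the asphalt;
    e_mod_psi: asphalt modulus in psi.
    Published AI coefficients, not the thesis's Local calibration."""
    eps = eps_t_micro * 1e-6
    return 0.0796 * eps ** -3.291 * e_mod_psi ** -0.854

def ai_rutting_reps(eps_v_micro):
    """ESAL repetitions to rutting failure (subgrade strain criterion), AI form.
    eps_v_micro: vertical compressive microstrain at the top of the subgrade."""
    return 1.365e-9 * (eps_v_micro * 1e-6) ** -4.477
```

Because fatigue life scales as a power law in strain, the elasticity d ln N / d ln ε equals the exponent: a 1% strain increase reduces predicted fatigue life by roughly 3.2%, consistent with the up-to-3% sensitivity reported above.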
|
399 |
STUDYING THE IMPACT OF DEVELOPER COMMUNICATION ON THE QUALITY AND EVOLUTION OF A SOFTWARE SYSTEM. Bettenburg, Nicolas. 22 May 2014 (has links)
Software development is a largely collaborative effort, of which the actual encoding of program logic in source code is a relatively small part. Software developers have to collaborate effectively and communicate with their peers in order to avoid coordination problems. To date, little is known about how developer communication during software development activities impacts the quality and evolution of a software system.
In this thesis, we present and evaluate tools and techniques to recover communication data from traces of software development activities. With this data, we study the impact of developer communication on the quality and evolution of the software through an in-depth investigation of the role of developer communication during software development activities. Through multiple case studies on a broad spectrum of open-source software projects, we find that communication between developers stands in a direct relationship to the quality of the software. Our findings demonstrate that our models based on developer communication explain software defects as well as state-of-the-art models based on technical information such as code and process metrics, and that social information metrics are orthogonal to these traditional metrics, leading to a more complete and integrated view of software defects. In addition, we find that communication between developers plays an important role in maintaining a healthy contribution management process, which is one of the key factors in the successful evolution of the software. Source code contributors who are part of the community surrounding open-source projects are available for limited times, and long communication times can lead to the loss of valuable contributions.
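The kind of defect model described here can be sketched as a logistic regression that combines a code metric with a communication metric. Everything below is illustrative: the metric names (churn, discussion response delay), the synthetic data, and the plain gradient-descent fit are assumptions for the sketch, not the thesis's actual models or data.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Plain gradient-descent logistic regression (illustrative sketch)."""
    X = np.column_stack([np.ones(len(X)), X])  # prepend intercept column
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))       # predicted defect probability
        w -= lr * X.T @ (p - y) / len(y)       # average gradient step
    return w

# Synthetic example: defect-proneness driven by BOTH a technical metric
# (hypothetical "churn") and a social metric (hypothetical "delay").
rng = np.random.default_rng(0)
n = 2000
churn = rng.normal(0, 1, n)
delay = rng.normal(0, 1, n)
logit = 1.5 * churn + 1.5 * delay
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)
w = fit_logistic(np.column_stack([churn, delay]), y)
```

When the social metric carries signal that the technical metric does not (they are orthogonal, as the thesis finds), both coefficients remain informative and the combined model outperforms either alone.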
Our thesis illustrates that software development is an intricate and complex process that is strongly influenced by the social interactions between the stakeholders involved in the development activities. A traditional view based solely on technical aspects of software development, such as source code size and complexity, while valuable, limits our understanding of software development activities. The research presented in this thesis is a first step toward gaining a more holistic view of software development activities. / Thesis (Ph.D, Computing) -- Queen's University, 2014-05-22 12:07:13.823
|
400 |
Silicon Nanoparticle Synthesis and Modeling for Thin Film Solar Cells. Albu, Zahra. 30 April 2014 (has links)
Nanometer-scale silicon shows extraordinary electronic and optical properties that are not available in bulk silicon, and many investigations toward applications in optoelectronic devices are being pursued. Silicon nanoparticle films made from solution are a promising candidate for low-cost solar cells. However, controlling the properties of silicon nanoparticles, in particular their shape and size distribution, which affect device performance, is quite a challenge. At present, none of the solar cells made from silicon nanoparticle films have an efficiency exceeding that of cells based on crystalline silicon. To address the challenge of controlling silicon nanoparticle properties, both theoretical and experimental investigations are needed. In this thesis, we investigate silicon nanoparticle properties via quantum mechanical modeling of silicon nanoparticles and via the synthesis of silicon nanoparticle films by colloidal grinding.
Silicon nanoparticles with cubic, rectangular, ellipsoidal, and flat-disk shapes are modeled using semi-empirical methods and configuration interaction. Their electronic properties under different surface passivations were also studied. The results showed that silicon nanoparticles with hydrogen passivation have higher HOMO-LUMO gaps, and that the HOMO-LUMO gap depends on the size and shape of the particle. In contrast, silicon nanoparticles with oxygen passivation have a lower HOMO-LUMO gap. Raman spectroscopy calculations for silicon nanoparticles show a peak shift and asymmetric broadening similar to what has been observed in experiment.
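The size dependence of the gap follows the textbook quantum-confinement trend, which even a simple effective-mass, particle-in-a-box estimate captures. The sketch below is that textbook estimate with assumed illustrative effective masses; it is not the semi-empirical configuration-interaction method used in the thesis.

```python
import math

HBAR = 1.0545718e-34   # reduced Planck constant, J*s
M_E = 9.1093837e-31    # electron rest mass, kg
EV = 1.6021766e-19     # J per eV

def confined_gap_ev(d_nm, e_bulk_ev=1.12, m_e_eff=0.26, m_h_eff=0.39):
    """Gap estimate for a silicon particle of diameter d_nm from an
    infinite-well effective-mass model:
        E(d) = E_bulk + (hbar^2 * pi^2 / (2 d^2)) * (1/m_e* + 1/m_h*)
    Effective masses here are assumed textbook-style values, not fitted."""
    d = d_nm * 1e-9
    conf = (HBAR ** 2 * math.pi ** 2 / (2.0 * d ** 2)) * (
        1.0 / (m_e_eff * M_E) + 1.0 / (m_h_eff * M_E)
    )
    return e_bulk_ev + conf / EV
```

The 1/d^2 confinement term shrinks rapidly with size, so large particles recover the bulk gap while few-nanometer particles show a substantially widened gap, in line with the size dependence reported above.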
Silicon nanoparticle synthesis via colloidal grinding was demonstrated as a straightforward and inexpensive approach for thin film solar cells. Data analysis of silicon particles via SEM images demonstrated that colloidal grinding is effective in reducing the Si particle size to sub-micron in a short grinding time. Further increases in grinding time, followed by filtration, narrowed the Si particle size and size distribution to an average size of 70 nm. Raman spectroscopy and EDS data demonstrated that the Si nanoparticles contain oxygen due to exposure to air during grinding. I-V characterization of the milled Si nanoparticles showed ohmic behaviour with low current at low biases, then Schottky-diode behaviour or a symmetric curve at large biases. / Graduate
|