  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
51

Quality Aspects of Maternal Health Care in Tanzania

Urassa, David Paradiso January 2004 (has links)
This thesis assesses some indicators of quality for maternity care in Tanzania, using antenatal management of anaemia and hypertension and emergency obstetric care as focal points. The care of pregnant women consecutively enrolled in antenatal care (n=379) was observed and compared with quality standard criteria. From a tertiary level labour ward 741 cases of eclampsia were identified and their antenatal care analyzed. A health systems analysis was performed for 205 cases of pregnancy complications at district level.

There was inadequate equipment and drugs, inadequate staff knowledge and motivation, and incorrect measurements for investigating anaemia and hypertension in pregnancy. Hospital incidence of eclampsia at tertiary level was 200/10,000 live births, and was not modified by antenatal care. The quality observed in the antenatal programme indicated little impact on either anaemia or hypertensive complications. Compliance with obstetric referral was only 46% and all four observed maternal deaths occurred due to transport problems. The proposed process indicators for essential obstetric care were inadequate to assess the quality of care on a district level.

There is a need to address structural weaknesses, to motivate health workers and to improve training on quality improvement. More research is warranted on indicators for obstetric needs, accessibility and referral system.
53

QoE-based Application Mapping for Resource Management

Leila, Shayanpour 11 January 2011 (has links)
Mapping between many different applications and many different underlying technologies is very complicated. Moreover, since users need service continuity and want to receive the service at a satisfactory level, their perception of the service should first be measured, and changes in the underlying technologies should be transparent to them. As a result, there should be a "virtualization layer" between the application layer and the underlying access technologies, whose job is to abstract user perception of the application in terms of network parameters and transfer these requirements to the underlying layers. In this thesis, we propose a generic mathematical expression that abstracts user perception of an application in a unified way for different applications. Since today's applications are composite applications, having a generalized expression with the same form for various applications can ease resource management calculations. We use an application service map, based on quality of experience, for resource management.
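The "generic mathematical expression" the abstract refers to is not reproduced in this listing. As a hedged illustration of the general idea — a single parametric form mapping network-level impairments to a per-application QoE score, aggregated across the components of a composite application — the following sketch uses an exponential (IQX-style) mapping; the function names, parameter values and the weighted-mean aggregation are illustrative assumptions, not the thesis' actual expression.

```python
import math

def qoe_score(impairment, alpha=4.0, beta=2.0, gamma=1.0):
    """Exponential QoE mapping: a normalized network impairment
    (e.g. loss rate or delay, scaled to ~[0, 1]) maps to a MOS-like
    score in [gamma, gamma + alpha]. Parameters are illustrative."""
    return gamma + alpha * math.exp(-beta * impairment)

def composite_qoe(impairments, weights):
    """Aggregate per-component QoE for a composite application as a
    weighted mean -- one simple aggregation choice among several."""
    scores = [qoe_score(i) for i in impairments]
    return sum(w * s for w, s in zip(weights, scores)) / sum(weights)

# Example: a composite service with a video and a signalling component.
overall = composite_qoe([0.1, 0.3], [3.0, 1.0])
```

Because the same functional form is reused for every component, a resource manager can invert it to translate a target QoE back into per-component network requirements, which is the role the abstract assigns to the virtualization layer.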
55

An Analysis and Reasoning Framework for Project Data Software Repositories

Attarian, Ioanna Maria January 2012 (has links)
As the requirements for software systems increase, their size, complexity and functionality increase as well. This has a direct impact on the complexity of numerous artifacts related to the system, such as specification, design, implementation and testing models. Furthermore, as the software market becomes more and more competitive, the need for software products that are of high quality and require the least monetary, time and human resources for their development and maintenance becomes evident. Therefore, it is important that project managers and software engineers are given the necessary tools to obtain a more holistic and accurate perspective of the status of their projects, so that they can identify early any potential risks, flaws and quality issues that may arise during each stage of the software project life cycle. In this respect, practitioners and academics alike have recognized the significance of investigating new methods for supporting software management operations with respect to large software projects. The main target of this M.A.Sc. thesis is the design of a framework in terms of, first, a reference architecture for mining and analyzing software project data repositories according to specific objectives and analytic knowledge; second, the techniques to model such analytic knowledge; and, third, a reasoning methodology for verifying or denying hypotheses related to analysis objectives. Such a framework could assist project managers, team leaders and development teams towards more accurate prediction of project traits such as quality analysis, risk assessment, cost estimation and progress evaluation. More specifically, the framework utilizes goal models to specify analysis objectives, as well as possible ways by which these objectives can be achieved. Examples of such analysis objectives for a project could be to yield high code quality, achieve low production cost or cope with tight delivery deadlines.
Such goal models are consequently transformed into collections of Markov Logic Network rules, which are then applied to the repository data in order to verify or deny, with a degree of probability, whether the particular project objectives can be met as the project evolves. The proposed framework has been applied, as a proof of concept, on a repository pertaining to three industrial projects with more than one hundred development tasks.
56

A Study of the Structural Similarity Image Quality Measure with Applications to Image Processing

Brunet, Dominique 02 August 2012 (has links)
Since its introduction in 2004, the Structural Similarity (SSIM) index has gained widespread popularity as an image quality assessment measure. SSIM is currently recognized to be one of the most powerful methods of assessing the visual closeness of images. That being said, the Mean Squared Error (MSE), which performs very poorly from a perceptual point of view, still remains the most common optimization criterion in image processing applications because of its relative simplicity along with a number of other properties that are deemed important. In this thesis, some necessary tools to assist in the design of SSIM-optimal algorithms are developed. This work combines theoretical developments with experimental research and practical algorithms. The description of the mathematical properties of the SSIM index represents the principal theoretical achievement in this thesis. Indeed, it is demonstrated how the SSIM index can be transformed into a distance metric. Local convexity, quasi-convexity, symmetries and invariance properties are also proved. The study of the SSIM index is also generalized to a family of metrics called normalized (or M-relative) metrics. Various analytical techniques for different kinds of SSIM-based optimization are then devised. For example, the best approximation according to the SSIM is described for orthogonal and redundant basis sets. SSIM-geodesic paths with arclength parameterization are also traced between images. Finally, formulas for SSIM-optimal point estimators are obtained. On the experimental side of the research, the structural self-similarity of images is studied. This leads to the confirmation of the hypothesis that the main source of self-similarity of images lies in their regions of low variance. On the practical side, an implementation of local statistical tests on the image residual is proposed for the assessment of denoised images. Also, heuristic estimations of the SSIM index and the MSE are developed. 
The research performed in this thesis should lead to the development of state-of-the-art image denoising algorithms. A better comprehension of the mathematical properties of the SSIM index represents another step toward the replacement of the MSE with SSIM in image processing applications.
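Since the abstract's central theoretical result is that the SSIM index can be transformed into a distance metric, a minimal sketch may help. The snippet below computes a single global SSIM value between two intensity sequences (the usual per-window computation is omitted for brevity) and derives sqrt(1 − SSIM) as one normalized-metric construction of the kind studied in the thesis; the exact construction there differs, and the stabilizing constants follow the common (K·L)² convention with K1=0.01, K2=0.03, L=255.

```python
import math

def ssim(x, y, c1=6.5025, c2=58.5225):
    """Global SSIM between two equal-length intensity sequences:
    luminance, contrast and structure terms combined in the standard
    two-factor form, with stabilizers c1, c2."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx * mx + my * my + c1) * (vx + vy + c2))

def ssim_distance(x, y):
    """sqrt(1 - SSIM): one way to turn the similarity index into a
    distance-like quantity (illustrative; the thesis' metric differs)."""
    return math.sqrt(max(0.0, 1.0 - ssim(x, y)))
```

Identical signals yield SSIM = 1 and distance 0, while structurally different signals yield a strictly positive distance, which is the basic behaviour a metric transformation must preserve.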
57

A Study For The Requirements Of A Quality Maturity Framework For It Facilities In Organizations: A Case For Quality Assessment In Turkish Institutions

Oray, Emre 01 May 2010 (has links) (PDF)
Nowadays, almost all organizations use information technologies (IT) to some extent and have gained many advantages from them. However, the wrong management of IT processes may cause big problems for organizations. Achieving quality is a prerequisite for successfully governing IT and its processes. At this point, it is essential to have a quality maturity framework that organizations can use to follow a logical path in achieving quality. This thesis presents a study for determining the general quality requirements of a possible quality maturity framework for IT facilities in organizations. Many quality models, awards and frameworks will be discussed to determine those quality requirements, and a possible leveling of the requirements will also be presented. Besides this, a quality assessment survey will be performed in Turkish institutions, regarding the quality requirements and the factors affecting them, to determine the quality level of IT facilities in Turkish institutions.
58

Evaluation Of Visual Quality Metrics

Olgun, Ferhat Ramazan 01 October 2011 (has links) (PDF)
The aim of this study is to examine the visual quality metrics that are widely accepted in the literature, to evaluate them on different distortion types and to compare their overall performances in terms of prediction accuracy, monotonicity, consistency and complexity. The algorithms behind the quality metrics in the literature and the parameters used for quality metric performance evaluations are studied. This thesis also includes an explanation of the Human Visual System, a classification of visual quality metrics and subjective quality assessment methods. Experimental results that show the correlation between objective scores and human perception are used to compare the eight widely accepted visual quality metrics.
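Two of the evaluation criteria named above have standard numerical proxies: prediction accuracy is commonly measured by the Pearson correlation between objective metric scores and subjective scores, and monotonicity by the Spearman rank-order correlation. A self-contained sketch of both (no ties handling in the rank computation, so it assumes distinct scores; not code from the thesis):

```python
def pearson(xs, ys):
    """Pearson linear correlation: proxy for prediction accuracy."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs)
           * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

def spearman(xs, ys):
    """Spearman rank correlation: proxy for monotonicity.
    Assumes distinct values (no tie correction)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order):
            r[i] = float(rank)
        return r
    return pearson(ranks(xs), ranks(ys))
```

A metric whose scores rise monotonically with subjective opinion scores attains a Spearman correlation of 1 even when the relationship is nonlinear, which is why both statistics are reported separately.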
59

A sampling-based decomposition algorithm with application to hydrothermal scheduling : cut formation and solution quality

Queiroz, Anderson Rodrigo de 06 February 2012 (has links)
We consider a hydrothermal scheduling problem with a mid-term horizon (HTSPM) modeled as a large-scale multistage stochastic program with stochastic monthly inflows of water to each hydro generator. In the HTSPM we seek an operating policy to minimize the sum of present and expected future costs, which include thermal generation costs and load curtailment costs. In addition to various simple bounds, problem constraints involve water balance, demand satisfaction and power interchanges. Sampling-based decomposition algorithms (SBDAs) have been used in the literature to solve HTSPM. SBDAs can be used to approximately solve problem instances with many time stages and with inflows that exhibit interstage dependence. Such dependence requires care in computing valid cuts for the decomposition algorithm. In order to help maintain tractability, we employ an aggregate reservoir representation (ARR). In an ARR all the hydro generators inside a specific region are grouped to effectively form one hydro plant with reservoir storage and generation capacity proportional to the parameters of the hydro plants used to form that aggregate reservoir. The ARR has been used in the literature with energy balance constraints, rather than water balance constraints, coupled with time series forecasts of energy inflows. Instead, we prefer as a model primitive to have the time series model forecast water inflows. This, in turn, requires that we extend existing methods to compute valid cuts for the decomposition method under the resulting form of interstage dependence. We form a sample average approximation of the original problem and then solve this approximation with these special-purpose algorithms. We then assess the quality of the resulting policy for operating the system. In our analysis, we compute a confidence interval on the optimality gap of a policy generated by solving an approximation on a sampled scenario tree.
We present computational results on test problems with 24 monthly stages in which the interstage dependency of hydro inflows is modeled using a dynamic linear model. We further develop a parallel implementation of an SBDA. We apply SBDA to solve the HTSPM for the Brazilian power system that has 150 hydro generators, 151 thermal generators and 4 regions that each characterize an aggregate reservoir. We create and solve four different HTSPM instances where we change the input parameters with respect to generation capacity, transmission capacity and load in order to analyze the difference in the total expected cost.
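The abstract's final analysis step, a confidence interval on a policy's optimality gap, has a standard sample-average-approximation construction: estimate an upper bound by simulating the policy's cost, estimate a statistical lower bound from independent SAA replications, and widen the point gap by the combined standard error. A minimal sketch of that construction follows (illustrative only; the thesis' exact estimator and replication scheme may differ, and z=1.645 corresponds to a one-sided 95% level):

```python
import math
import statistics

def gap_confidence_interval(policy_costs, saa_lower_bounds, z=1.645):
    """One-sided confidence bound on a policy's optimality gap.
    policy_costs: i.i.d. simulated costs of operating under the policy
    (upper-bound estimator); saa_lower_bounds: optimal values of
    independent SAA replications (lower-bound estimator)."""
    ub = statistics.mean(policy_costs)
    lb = statistics.mean(saa_lower_bounds)
    se = math.sqrt(statistics.variance(policy_costs) / len(policy_costs)
                   + statistics.variance(saa_lower_bounds) / len(saa_lower_bounds))
    gap = max(ub - lb, 0.0)           # point estimate, clipped at zero
    return gap, gap + z * se          # (estimate, one-sided CI bound)
```

A small CI bound certifies that the policy obtained from the sampled scenario tree is near-optimal for the true problem, which is exactly the solution-quality claim the abstract makes.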
60

Adaptive video transmission over wireless channels with optimized quality of experiences

Chen, Chao 18 February 2014 (has links)
Video traffic is growing rapidly in wireless networks. Different from ordinary data traffic, video streams have higher data rates and tighter delay constraints. The ever-varying throughput of wireless links, however, cannot support continuous video playback if the video data rate is kept at a high level. To this end, adaptive video transmission techniques are employed to reduce the risk of playback interruptions by dynamically matching the video data rate to the varying channel throughput. In this dissertation, I develop new models to capture viewers' quality of experience (QoE) and design adaptive transmission algorithms to optimize the QoE. The contributions of this dissertation are threefold. First, I develop a new model for the viewers' QoE in rate-switching systems in which the video source rate is adapted every several seconds. The model is developed to predict an important aspect of QoE, the time-varying subjective quality (TVSQ), i.e., the up-to-the-moment subjective quality of a video as it is played. I first build a video database of rate-switching videos and measure TVSQs via a subjective study. Then, I parameterize and validate the TVSQ model using the measured TVSQs. Finally, based on the TVSQ model, I design an adaptive rate-switching algorithm that optimizes the time-averaged TVSQs of wireless video users. Second, I propose an adaptive video transmission algorithm to optimize the Overall Quality (OQ) of rate-switching videos, i.e., the viewers' judgement on the quality of the whole video. Through the subjective study, I find that the OQ is strongly correlated with the empirical cumulative distribution function (eCDF) of the video quality perceived by viewers. Based on this observation, I develop an adaptive video transmission algorithm that maximizes the number of video users who satisfy given constraints on the eCDF of perceived video qualities. Third, I propose an adaptive transmission algorithm for scalable videos. 
Different from the rate-switching systems, scalable videos support rate adaptation for each video frame. The proposed adaptive transmission algorithm maximizes the time-averaged video quality while maintaining continuous video playback. When the channel throughput is high, the algorithm increases the video data rate to improve video quality. Otherwise, the algorithm decreases the video data rate to buffer more videos and to reduce the risk of playback interruption. Simulation results show that the performance of the proposed algorithm is close to a performance upper bound.
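The throughput-and-buffer trade-off described above (raise the rate when throughput allows, lower it when a stall threatens) can be sketched as a simple decision rule. This is a generic buffer-threshold heuristic for illustration, not the dissertation's QoE-optimal policy; the function name, thresholds and the 0.8 safety factor are assumptions.

```python
def choose_rate(buffer_s, throughput_kbps, rates_kbps,
                low_buf=5.0, high_buf=15.0):
    """Pick a video data rate from the available ladder: conservative
    when the playback buffer (in seconds) is short, to avoid playback
    interruptions; aggressive when it is long."""
    rates = sorted(rates_kbps)
    if buffer_s < low_buf:                 # stall risk: lowest rate
        return rates[0]
    if buffer_s > high_buf:                # ample buffer: highest sustainable
        candidates = [r for r in rates if r <= throughput_kbps]
        return candidates[-1] if candidates else rates[0]
    # in between: highest rate safely below measured throughput
    candidates = [r for r in rates if r <= 0.8 * throughput_kbps]
    return candidates[-1] if candidates else rates[0]
```

With a rate ladder of 500/1000/2000/4000 kbps, a nearly empty buffer forces 500 kbps regardless of throughput, while a full buffer with 3000 kbps of throughput selects 2000 kbps, matching the qualitative behaviour the abstract describes.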
