271 |
QoS Representation, Negotiation and Assurance in Cloud Services
Zheng, Xianrong 20 February 2014 (has links)
Cloud services are Internet-based IT services. Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS) are three representative examples. As the cloud market becomes more open and competitive, Quality of Service (QoS) will be more important. However, cloud providers and cloud consumers have different and sometimes opposite preferences. If such a conflict occurs, a Service Level Agreement (SLA) cannot be established without negotiation.
To allow service consumers to express their QoS requirements, and negotiate them with service providers, we argue for cloud service negotiation. It aims to establish and enforce SLAs for cloud services. Specifically, we study how to measure, negotiate, and enforce QoS requirements for cloud services, and so formulate three research problems, i.e., QoS measurement, QoS negotiation, and QoS assurance. In terms of its scope, the topic covers business-side automated negotiation and technical-side resource allocation techniques. As a result, it has a potential impact on cloud service adoption.
To address QoS measurement, we introduce a quality model named CLOUDQUAL for cloud services. It is a model with quality dimensions and metrics that targets general cloud services. CLOUDQUAL contains six quality dimensions, i.e., usability, availability, reliability, responsiveness, security, and elasticity, of which usability is subjective, whereas the others are objective.
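A quality model of this shape lends itself to a simple data representation. The sketch below is illustrative only: the `QualityMeasurement` type, the normalisation to [0, 1], and the equal-weight aggregation are our assumptions for exposition, not the thesis's actual metric definitions.

```python
from dataclasses import dataclass

# Hypothetical encoding of the six CLOUDQUAL dimensions; the metric
# normalisation and weighting here are illustrative assumptions.
@dataclass
class QualityMeasurement:
    dimension: str
    value: float      # normalised to [0, 1], higher is better
    subjective: bool  # usability is user-rated; the other five are measured

DIMENSIONS = ["usability", "availability", "reliability",
              "responsiveness", "security", "elasticity"]

def overall_quality(measurements, weights=None):
    """Weighted mean over dimensions; equal weights by default."""
    if weights is None:
        weights = {m.dimension: 1.0 for m in measurements}
    total = sum(weights[m.dimension] for m in measurements)
    return sum(weights[m.dimension] * m.value for m in measurements) / total
```

A consumer could compare two providers by computing `overall_quality` over measurements taken for each, weighting the dimensions that matter most to its workload.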
To address QoS negotiation, we present a mixed negotiation approach for cloud services, which is based on the “game of chicken”. In particular, if a party is uncertain about the strategy of its counterpart, it is best to mix concession and tradeoff strategies in negotiation. In fact, the mixed approach, which exhibits a certain degree of intelligence, can achieve a higher utility than a concession approach, while incurring fewer failures than a tradeoff approach.
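The mixing intuition can be made concrete with the standard 2x2 game of chicken. The payoff ordering and the helper below are textbook game theory offered as a sketch, not the thesis's actual utility model.

```python
# Illustrative "game of chicken" between two negotiators: holding firm
# (tradeoff) wins against a conceder, but two hold-firm parties fail to
# reach an SLA. Payoff structure is an assumption for illustration.

def chicken_mixing_probability(a, b, c, d):
    """Probability of holding firm in the symmetric mixed equilibrium.

    Row-player payoffs: a = hold firm vs. conceder, b = both concede,
    c = concede vs. hold-firm, d = both hold firm (negotiation failure),
    with the chicken ordering a > b > c > d.
    """
    # The opponent mixes so the row player is indifferent between moves:
    #   p*c + (1-p)*b = p*d + (1-p)*a   =>   p = (a-b) / ((a-b) + (c-d))
    return (a - b) / ((a - b) + (c - d))
```

Because 0 < p < 1 whenever a > b and c > d, a party uncertain of its counterpart's strategy randomises rather than committing to pure concession or pure tradeoff, which matches the abstract's claim of higher utility than always conceding with fewer failures than always holding firm.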
To address QoS assurance, we propose a QoS-driven resource allocation method for cloud services. It can meet users’ QoS requirements while minimizing the resources consumed. In particular, to honor the QoS specified in an SLA, we develop QoS assurance mechanisms and determine the minimum resources that should be allocated. As a result, the method makes both technical and economic sense for cloud providers. / Thesis (Ph.D, Computing) -- Queen's University, 2014-02-20 14:26:06.616
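As a hedged illustration of "the minimum resources that honor an SLA", the sketch below assumes Poisson arrivals split evenly across identical servers, M/M/1 behavior per server, and a mean response-time SLA; this queueing model and the `min_servers` helper are our assumptions, not the mechanism developed in the thesis.

```python
import math

# Minimal sketch: fewest identical servers keeping mean response time
# within the SLA, assuming arrivals split evenly and each server behaves
# as an M/M/1 queue with mean response time 1/(mu - lambda/n).

def min_servers(arrival_rate, service_rate, max_response_time):
    """Smallest n with 1/(service_rate - arrival_rate/n) <= max_response_time."""
    if service_rate <= 1.0 / max_response_time:
        # Even an unloaded server has mean response time 1/service_rate.
        raise ValueError("no number of servers can meet this SLA")
    # Rearranging the M/M/1 response-time formula gives:
    #   n >= arrival_rate / (service_rate - 1/max_response_time)
    n = arrival_rate / (service_rate - 1.0 / max_response_time)
    return max(1, math.ceil(n))
```

Allocating exactly this many servers is what makes the method economically attractive: any fewer violates the SLA, any more wastes capacity.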
|
272 |
HYDRATE NUCLEATION MEASUREMENTS USING HIGH PRESSURE DIFFERENTIAL SCANNING CALORIMETRY
Hester, Keith C., Davies, Simon R., Lachance, Jason W., Sloan, E. Dendy, Koh, Carolyn A. 07 1900 (has links)
Understanding when hydrates will nucleate has notable importance in the area of flow assurance. Attempts to model hydrate formation in subsea pipelines currently require an arbitrary assignment of a nucleation subcooling. Previous studies showed that sII hydrate containing a model water-soluble former, tetrahydrofuran, would nucleate over a narrow temperature range of a few degrees under constant cooling. It is desirable to know whether gas phase hydrate formers, which are typically more hydrophobic and hence have a very low solubility in water, also exhibit this nucleation behavior.
In this study, differential scanning calorimetry has been applied to determine the hydrate nucleation point for gas phase hydrate formers. Constant cooling ramps and isothermal approaches were combined to explore the probability of hydrate nucleation. In the temperature ramping experiments, methane and xenon were used at various pressures and cooling rates. In both systems, hydrate nucleation occurred over a narrow temperature range (2-3°C). Using methane at lower pressures, ice nucleated before hydrate; whereas at higher pressures, hydrate formed first. A subcooling driving force of around 30°C was necessary for hydrate nucleation from both guest molecules. The cooling rates (0.5-3°C/min) did not show any statistically significant effect on the nucleation temperature for a given pressure.
The isothermal method was used for a methane system with pure water and a water-in-West African crude emulsion. Two isotherms (-5 and -10°C) were used to determine nucleation time. In both systems, the time required for nucleation decreased with increased subcooling.
|
273 |
HYDRATE PLUG FORMATION PREDICTION TOOL – AN INCREASING NEED FOR FLOW ASSURANCE IN THE OIL INDUSTRY
Kinnari, Keijo, Labes-Carrier, Catherine, Lunde, Knud, Hemmingsen, Pål V., Davies, Simon R., Boxall, John A., Koh, Carolyn A., Sloan, E. Dendy 07 1900 (has links)
Hydrate plugging of hydrocarbon production conduits can cause large operational problems resulting in considerable economic losses. Modeling capabilities to predict hydrate plugging occurrences would help to improve facility design and operation in order to reduce the extent of such events. It would also contribute to a more effective and safer remediation process. This paper systematically describes different operational scenarios where hydrate plugging might occur and how a hydrate plug formation prediction tool would be beneficial.
The current understanding of the mechanisms of hydrate formation, agglomeration and plugging of a pipeline is also presented. The results from this survey, combined with the identified industrial needs, are then used as a basis for the assessment of the capabilities of an existing hydrate plug formation model, CSMHyK (the Colorado School of Mines Hydrate Kinetic Model), which has recently been implemented in the transient multiphase flow simulator OLGA as a separate module.
Finally, examples using the current model in several operational scenarios are shown to illustrate some of its important capabilities. The results from these examples and the operational scenario analysis are then used to discuss the future development needs of the CSMHyK model.
|
274 |
An Automated Quality Assurance Procedure for Archived Transit Data from APC and AVL Systems
Saavedra, Marian Ruth January 2010 (has links)
Automatic Vehicle Location (AVL) and Automatic Passenger Counting (APC) systems can be powerful tools for transit agencies to archive large quantities of detailed transit operations data. Managing data quality is an important first step in exploiting these rich datasets.
This thesis presents an automated quality assurance (QA) methodology that identifies unreliable archived AVL/APC data. The approach is based on expected travel and passenger activity patterns derived from the data. It is assumed that standard passenger balancing and schedule matching algorithms are applied to the raw AVL/APC data along with any existing automatic validation programs. The proposed QA methodology is intended to provide transit agencies with a supplementary tool to manage data quality that complements, but does not replace, conventional processing routines (that can be vendor-specific and less transparent).
The proposed QA methodology endeavours to flag invalid data as “suspect” and valid data as “non-suspect”. There are three stages: i) the first stage screens data that violate physical constraints; ii) the second stage looks for data that represent outliers; and iii) the third stage evaluates whether the outlier data can be accounted for by a valid or an invalid pattern. Stop-level tests are mathematically defined for each stage; however, data are filtered at the trip level. Data that violate no physical constraints and represent no outliers are considered valid trip data. Outlier trips that can be accounted for by a valid outlier pattern are also considered valid. The remaining trip data are considered suspect.
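The three stages can be sketched as a simple trip classifier. The record fields (`ons`, `offs`, `load`), the z-score outlier test, and the `valid_pattern` predicate below are hypothetical stand-ins for the thesis's actual stop-level tests:

```python
# Minimal sketch of the three-stage screening; field names and thresholds
# are illustrative. A trip is "suspect" unless it passes all three stages.

def violates_physics(stop):
    # Stage i: physical constraints, e.g. passenger counts and onboard
    # load can never be negative.
    return stop["ons"] < 0 or stop["offs"] < 0 or stop["load"] < 0

def is_outlier(stop, mean_load, std_load, z=3.0):
    # Stage ii: flag stops whose load lies far from the route's norm.
    return std_load > 0 and abs(stop["load"] - mean_load) > z * std_load

def classify_trip(stops, mean_load, std_load, valid_pattern):
    """Return 'suspect' or 'non-suspect' for one trip's stop records."""
    if any(violates_physics(s) for s in stops):
        return "suspect"
    outliers = [s for s in stops if is_outlier(s, mean_load, std_load)]
    if not outliers:
        return "non-suspect"
    # Stage iii: outliers explained by a known valid pattern (e.g. a
    # special-event stop) do not invalidate the trip.
    return "non-suspect" if all(valid_pattern(s) for s in outliers) else "suspect"
```

Note that, as in the methodology, the tests act on stop-level records but the verdict is issued for the whole trip.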
The methodology is applied to a sample set of AVL/APC data from Grand River Transit in the Region of Waterloo, Ontario, Canada. The sample consists of four months of data, September to December 2008, comprising 612,000 stop-level records representing 25,012 trips. The results show 14% of the trip-level data is flagged as suspect for the sample dataset. The output is further dissected by: reviewing which tests contribute most to the set of suspect trips; confirming the pattern assumptions for the valid outlier cases; and comparing the sample data by various traits before and after the QA methodology is applied. The latter task is meant to recognize characteristics that may contribute to higher or lower quality data. Analysis shows that the largest portion of suspect trips, for this sample set, suggests the need for improved passenger balancing algorithms or greater accuracy of the APC equipment. The assumptions for valid outlier case patterns were confirmed to be reasonable. It was found that poor schedule data contributes to poorer quality AVL/APC data. An examination of data distribution by vehicle showed that usage and the portion of suspect data varied substantially between vehicles. This information can be useful in the development of maintenance plans and sampling plans (when combined with information on data distribution by route).
A sensitivity analysis was conducted along with an impact analysis on downstream data uses. The model was found to be sensitive to three of the ten user-defined parameters. The impact of the QA procedure on network-level measures of performance (MOPs) was not found to be significant; however, the impact was shown to be more substantial for route-specific MOPs.
|
275 |
Gyvybės draudimo investicinio portfelio formavimo prielaidos Lietuvos gyvybės draudimo rinkoje / Life assurance investment portfolio formation assumptions of the life assurance market in Lithuania
Kučinskas, Paulius 08 September 2009 (has links)
LIFE ASSURANCE INVESTMENT PORTFOLIO FORMATION ASSUMPTIONS IN THE LITHUANIAN LIFE ASSURANCE MARKET The aim of this thesis is to examine the conditions for, and possibilities of, investing in investment funds in Lithuania. The first part examines the theoretical premises behind the emergence of investment funds, presents a classification of investment funds, and analyses investment portfolio formation. The second part is devoted to the legal aspects of investment portfolio formation by life assurance companies in Lithuania, and analyses the Lithuanian life assurance market. The third part examines the formation of the investment options offered by UAB SEB VB gyvybės draudimas and the selection of an optimal investment fund; it also presents an analysis of the author's own research. The thesis consists of a theoretical, an analytical, and a research part. / KUČINSKAS, Paulius. (2006) Life Assurance Investment Portfolio Formation Assumptions of the Life Assurance Market in Lithuania. MBA Graduation Paper. Kaunas: Kaunas Faculty of Humanities, Vilnius University, 2006. 51 p. SUMMARY The goal of this thesis is to analyze the conditions and possibilities of investing in investment funds in Lithuania. The first part of the paper analyzes life insurance investment portfolios. The second part demonstrates the peculiarities of forming international portfolio investments, their benefits and risks; it also shows the tendencies and perspectives of investing in the global market, and the problems investors face. The third part analyzes and compares life insurance investment portfolios and examines life insurance problems; the legal environment for investment is analyzed, as is research on investment units as an SEB investment fund product. The paper consists of a theoretical, an analytical, and a project part.
|
276 |
Process improvements for manufacturing excellence
Carrillo, Janice E. 05 1900 (has links)
No description available.
|
277 |
The development of an in-vivo dosimeter for the application in radiotherapy
Bose, Rajiv January 2012 (has links)
The expectation of continual improvements in the treatment of cancer has brought quality assurance in radiotherapy under scrutiny in recent years. After a cancer diagnosis, a custom treatment plan is devised to meet the particular needs of the patient's condition based on their prognosis. A cancer treatment plan typically comprises several cancer treatment technologies combining to form a comprehensive programme to fight the malignant growth. Inherent in each cancer treatment technology is a percentage error in treatment accuracy. Quality assurance is the medical practice of minimising that percentage error. Radiotherapy is one of the several cancer treatment technologies a patient might receive as part of their treatment plan, and in-vivo dosimetry is a quality assurance technology specifically designed to minimise the percentage error in the treatment accuracy of radiotherapy. This thesis outlines the work completed in the design of a next generation dosimeter for in-vivo dosimetry. The proposed dosimeter is intended to modernise the process of measuring the absorbed dose of ionising radiation received by the target volume during a radiotherapy session. To accomplish this goal, the new dosimeter will amalgamate specialist technologies from the field of particle physics and reapply them to the field of medical physics. This thesis describes the design of a new implantable in-vivo dosimeter, comprising several individual stages of electronics working together to modernise quality assurance in radiotherapy. Presented within this thesis are results demonstrating the performance of two critical stages of this new dosimeter: the floating gate metal oxide field effect transistor, a radiation-sensitive electronic component for measuring an absorbed dose of radiation; and the micro antenna, a highly specialised wireless communications device for transmitting a high frequency radio signal.
This was a collaborative project between Rutherford Appleton Laboratory and Brunel University. The presented work in this thesis was completed between March 2007 and January 2011.
|
278 |
Assuring sustainable supply through risk management / D.M. Kamau
Kamau, David Mungai January 2009 (has links)
In today's highly competitive global marketplace, organisations are under pressure to find new ways to minimize risk and maximize profits. Profits can increase by reducing the cost of inputs or increasing the price of the outputs. Hence, procurement contributes by reducing the cost of inputs by sourcing materials at lower costs. Consequently, this creates a challenge to the buyer, as supply risk issues are more likely to occur with lower cost sources that might include new suppliers or unreliable sources. Logistics planners, too, drive new initiatives intended to be cost effective, such as "Just-in-Time" (JIT) manufacturing. This adds the further challenge of ensuring uninterrupted supply while the system has actually eroded the supply buffer that would have enabled achieving the goal of low cost supply at lower risk. Hence, supply managers have to deal with ever-increasing challenges in pursuit of a balance between supply risk and lower cost of supplies while ensuring sustainable supplies. For these reasons, this study aims to develop an implementation plan for supply risk management. The plan was developed from a case study undertaken at Safripol, a large chemical manufacturing company located in Sasolburg, South Africa. The complete study of the implementation plan, from the literature to the empirical studies, was conducted by various means, including a study of journals and procurement textbooks, analysis of the supply data of Safripol, an experiment with a proposed plan, and a survey. The literature study first discusses various approaches suggested by other writers and concludes with the proposition of a seven-step process. The process starts with (1) human resource allocation, followed by (2) supply base categorisation, (3) vulnerability assessment, (4) evaluation of the implications, (5) cost/benefit analysis, and (6) risk treatment, and ends with (7) strategic alignment.
These steps address the major issues that are critical to successful implementation of supply risk management, particularly in large manufacturing companies like Safripol.
However, the empirical study went further to investigate the application of the seven-step process. This involved commissioning a multi-disciplinary team to review the process. The team applied the seven-step process in conducting a full assessment and evaluation of supply risks for four selected suppliers of a specific material. The team was composed of Safripol employees involved in the supply chain of the specific material reviewed. In addition, the analyses of the steps were captured and stored in a tool developed in Excel worksheets. That enabled the process to be systematic, repeatable and easier to apply. The populated worksheets resulted in a formal knowledge resource database that will enable Safripol to manage supply risk plans and execute action plans in case of supply interruptions. Finally, the study indicated that the seven-step process is practical and applicable to Safripol. In addition, the study found that the process adds value in managing supply risks. Thus, the study achieved its goal of developing an implementation plan for supply risk management. The study concludes by suggesting future fields of study and recommendations in an effort to contribute further to the development of supply risk management. / Thesis (M.B.A.)--North-West University, Vaal Triangle Campus, 2009.
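Steps (3) to (5) of the seven-step process amount to scoring each vulnerability and testing whether treating it is worth the cost. The sketch below is a generic probability-times-impact formulation offered as an assumption; it is not Safripol's actual worksheet logic.

```python
# Illustrative scoring for steps (3)-(5): expected loss per vulnerability,
# and a cost/benefit test for a proposed mitigation. Values and thresholds
# are hypothetical.

def risk_score(probability, impact):
    """Expected loss for one supply vulnerability (steps 3-4)."""
    return probability * impact

def worth_mitigating(probability, impact, mitigation_cost, risk_reduction):
    """Step 5: treat the risk only if the expected saving beats its cost."""
    return risk_score(probability, impact) * risk_reduction > mitigation_cost
```

Ranking suppliers by `risk_score` and filtering treatments with `worth_mitigating` is one way a worksheet tool like the one described could stay systematic and repeatable.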
|
280 |
Examining Methods and Practices of Source Data Verification in Canadian Critical Care Randomized Controlled Trials
Ward, Roxanne E. 21 March 2013 (has links)
Statement of the Problem: Source data verification (SDV) is the process of comparing data collected at the source to data recorded on a Case Report Form, either paper or electronic (1) to ensure that the data are complete, accurate and verifiable. Good Clinical Practice (GCP) Guidelines are vague and lack evidence as to the degree of SDV and whether or not SDV affects study outcomes.
Methods of Investigation: We performed systematic reviews to establish the published evidence base for methods of SDV and to examine the effect of SDV on study outcomes. We then conducted a national survey of Canadian Critical Care investigators and research coordinators regarding their attitudes and beliefs about SDV. We followed with an audit of the completed and in-progress Randomized Controlled Trials (RCTs) of the Canadian Critical Care Trials Group (CCCTG).
Results: Systematic Review of Methods of SDV: The most commonly reported or recommended frequency of source data verification (10/14 - 71%) was either based on level of risk, or that it be conducted early (i.e., after the 1st patient is enrolled). The amount of SDV recommended or reported varied from 5-100%. Systematic Review of the Impact of SDV on Study Outcomes: There was no difference in study outcomes for one trial, and the effect could not be assessed in the other. National Survey of Critical Care Investigators and Research Coordinators: Data from the survey found that 95.8% (115/120) of respondents believed that SDV was an important part of Quality Assurance; 73.3% (88/120) felt that academic studies should do more SDV; and 62.5% (75/120) felt that there is insufficient funding available for SDV. Audit of Source Data Verification Practices in CCCTG RCTs: In the national audit of in-progress and completed CCCTG RCTs, 9/15 (60%) included a plan for SDV and 8/15 (53%) actually conducted SDV. Of the 9 completed published trials, 44% (4/9) conducted SDV.
Conclusion: There is little evidence base for methods and effect of SDV on study outcomes. Based on the results of the systematic review, survey, and audit, more research is needed to support the evidence base for the methods and effect of SDV on study outcomes.
|