21

Improvement of Manufacturing Quality Burapha Wood Company in Laos

Phonetip, Khamtan, Rashidian, Zahra January 2011
This report is an initial study of the whole manufacturing process, aiming to capture the current problems, group them into categories, and prioritize problem solving according to the frequency of detected defects; root causes are identified and FMEA is used to assess how the problems occur. The research method is based on interviews with key persons representing the production units, followed by quality-improvement tools to analyse each step (Affinity diagram, Pareto analysis, Ishikawa diagram and FMEA). The study concludes that Burapha Wood Company is currently facing three categories of problems, namely Machinery, Competency, and Timber Drying, along with a list of other issues. The two-sided planer machine is the first priority for a solution; the root causes are workers ignoring the machine's instructions, such as not resetting the machine before operating, and the record form not being filled in when problems occur. The process the company has to focus on is the two-sided planer machine, which is ranked highest by a Pareto diagram based on the FMEA analysis, but considering the potential and feasibility of solutions, Burapha Wood Company should also concentrate on timber drying. Addressing these problems forms part of the platform for improving manufacturing quality in order to obtain CoC certification.
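
The abstract names FMEA and Pareto analysis but does not give the scoring it used; as a rough illustration only, the standard FMEA risk priority number (RPN = severity × occurrence × detection) combined with Pareto-style ranking might look like the following sketch. The failure modes and scores are made up for illustration, not taken from the thesis.

```python
# Minimal FMEA sketch: RPN = severity x occurrence x detection,
# then rank failure modes Pareto-style (highest risk first).
# Failure modes and scores are illustrative, not data from the thesis.
failure_modes = [
    {"name": "Planer not reset before operation", "severity": 7, "occurrence": 8, "detection": 6},
    {"name": "Record form not filled in",         "severity": 4, "occurrence": 9, "detection": 7},
    {"name": "Timber over-dried in kiln",         "severity": 8, "occurrence": 5, "detection": 5},
]

for fm in failure_modes:
    fm["rpn"] = fm["severity"] * fm["occurrence"] * fm["detection"]

# Pareto-style ordering: address the largest RPN values first.
for fm in sorted(failure_modes, key=lambda fm: fm["rpn"], reverse=True):
    print(f'{fm["name"]}: RPN = {fm["rpn"]}')
```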
22

View-Dependent Visualization for Analysis of Large Datasets

Overby, Derek Robert December 2011
Due to the impressive capabilities of human visual processing, interactive visualization methods have become essential tools for scientists to explore and analyze large, complex datasets. However, traditional approaches do not account for the increased size or latency of data retrieval when interacting with these often remote datasets. In this dissertation, I discuss two novel design paradigms, based on accepted models of the information visualization process and graphics hardware pipeline, that are appropriate for interactive visualization of large remote datasets. In particular, I discuss novel solutions aimed at improving the performance of interactive visualization systems when working with large numeric datasets and large terrain (elevation and imagery) datasets by using data reduction and asynchronous retrieval of view-prioritized data, respectively. First I present a modified version of the standard information visualization model that accounts for the challenges presented by interacting with large, remote datasets. I also provide the details of a software framework implemented using this model and discuss several different visualization applications developed within this framework. Next I present a novel technique for leveraging the hardware graphics pipeline to provide asynchronous, view-prioritized data retrieval to support interactive visualization of remote terrain data. I provide the results of statistical analysis of performance metrics to demonstrate the effectiveness of this approach. Finally I present the details of two novel visualization techniques, and the results of evaluating these systems using controlled user studies and expert evaluation. The results of these qualitative and quantitative evaluation mechanisms demonstrate improved visual analysis task performance for large numeric datasets.
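
The dissertation's retrieval logic is not spelled out in the abstract; a minimal sketch of the general idea of view-prioritized, asynchronous data retrieval for terrain tiles is given below. The tile records, camera position, and priority heuristic are assumptions for illustration, not the author's implementation.

```python
# Sketch of view-prioritized, asynchronous retrieval: tiles that matter most to
# the current view are requested first, and rendering is never blocked on I/O.
# Tile records, camera position, and the priority heuristic are illustrative only.
import heapq
import math
from concurrent.futures import ThreadPoolExecutor

def priority(tile, camera_pos):
    """Lower value = fetched sooner; nearby, large tiles dominate the view."""
    return math.dist(tile["center"], camera_pos) / tile["size"]

def fetch_tile(tile_id):
    return {"id": tile_id, "data": None}   # placeholder for a remote request

def schedule_fetches(tiles, camera_pos, executor):
    queue = [(priority(t, camera_pos), t["id"]) for t in tiles]
    heapq.heapify(queue)
    futures = []
    while queue:
        _, tile_id = heapq.heappop(queue)
        futures.append(executor.submit(fetch_tile, tile_id))   # non-blocking
    return futures   # the renderer keeps drawing; results arrive as they complete

tiles = [{"id": 1, "center": (0, 0), "size": 256}, {"id": 2, "center": (900, 0), "size": 256}]
with ThreadPoolExecutor(max_workers=4) as pool:
    for f in schedule_fetches(tiles, camera_pos=(10, 10), executor=pool):
        print(f.result()["id"])   # tile 1, closest to the camera, is requested first
```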
23

The development of a conceptual framework for a district 4-Year Pavement Management Plan

Hwang, Jea Won 30 September 2011
The Texas Department of Transportation (TxDOT) is concerned about the widening gap between preservation needs and available funding. Funding levels are not adequate to meet the preservation needs of the roadway network; therefore, projects listed in the 4-Year Pavement Management Plan must be ranked to determine which projects should be funded now and which can be postponed until a later year. Currently, each district uses locally developed methods to rank and prioritize projects. These ranking methods have relied on less formal qualitative assessments based on engineers' subjective judgment. It is important for TxDOT to have a rational 4-Year Pavement Management Plan. The objective of this study is to develop a conceptual framework that describes the development of the 4-Year Pavement Management Plan and a proposed ranking process. It can be divided into three steps: (1) a network-level preliminary project screening process, (2) a project-level project ranking process, and (3) an economic analysis. A rational pavement management procedure and a project ranking method that are accepted by the districts and the TxDOT administration will maximize efficiency in budget allocations and help improve pavement condition. As part of this study, and based on data provided by the Austin District Pavement Engineer, the Network-Level Project Screening (NLPS) tool, including the candidate project selection algorithm and the preliminary project screening matrix, was developed. The NLPS tool has been used by the Austin District Pavement Engineer (DPE) to evaluate the PMIS (Pavement Management Information System) data and to prepare a preliminary list of candidate projects for further evaluation. The automated tool will help TxDOT engineers easily incorporate the developed mathematical algorithm into their daily pavement maintenance management.
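
The abstract does not list the screening criteria; as an illustration only, a network-level preliminary screening step that flags candidate sections from PMIS-style condition scores might look like the sketch below. The field names and thresholds are hypothetical, not TxDOT's actual criteria.

```python
# Sketch of a network-level preliminary screening step: flag pavement sections
# whose condition scores fall below thresholds as candidates for the 4-year plan.
# Field names and threshold values are hypothetical, not TxDOT criteria.
sections = [
    {"id": "US-183 A", "distress_score": 62, "ride_score": 2.9},
    {"id": "SH-71 B",  "distress_score": 88, "ride_score": 3.8},
    {"id": "FM-969 C", "distress_score": 70, "ride_score": 2.4},
]

DISTRESS_THRESHOLD = 80   # candidate if distress score is below this
RIDE_THRESHOLD = 3.0      # or if ride quality is below this

candidates = [
    s for s in sections
    if s["distress_score"] < DISTRESS_THRESHOLD or s["ride_score"] < RIDE_THRESHOLD
]
print([s["id"] for s in candidates])   # sections forwarded to project-level ranking
```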
24

People with Parkinson's disease should avoid performing dual-tasks while walking: myth or reality?

Fok, Pamela Ching Kwan January 2009
Traditionally, people with Parkinson's disease (PD) are advised to avoid performing dual-tasks while walking. Rehabilitation programs also emphasise the need to train walking under single-task conditions to improve gait and reduce the risk of falls. There are findings that people with PD can walk faster and with longer strides while performing a secondary motor or cognitive task when cued by a metronome or visual floor stripes. There are also findings that people with PD can walk faster and with longer strides while performing a secondary motor task simply by prioritising their attention to taking big steps. Using attention is a convenient strategy favoured by people with PD to manage their gait difficulties.

This thesis examined the immediate and training effects of two attention-priority strategies on dual-cognitive-task walking in people with mild to moderate PD. Two groups of participants received 30 minutes of training to prioritise attention to taking big steps while performing serial three subtractions (gait-priority strategy) or to divide their attention between taking big steps and the cognitive task (equal-priority strategy). Control groups received no training. Measures of gait hypo-bradykinesia (stride length and gait velocity), stride variability (coefficient of variation [CV] of stride length and CV of stride time) and cognitive task performance (accurate enumeration rate) were assessed at baseline, during training, immediately after training and 30 minutes after training. Both attention-priority strategies improved stride length and gait velocity during training, and the improvement was retained for at least 30 minutes after training. Neither strategy had an effect on the CVs of stride length or stride time, or on the accurate enumeration rate.

Many daily routines require the ability to cope with single-, dual- and multi-task demands while walking. Rehabilitation strategies should encompass real-life demands in order to minimise functional impairments, activity limitations and participation restrictions, as recommended by the World Health Organisation. Putting together the findings of this thesis and the evidence provided by previous studies, it is concluded that the traditional recommendations need qualification. Avoiding dual-tasks during walking or gait retraining in people with mild to moderate PD may not be necessary. Gait-priority and equal-priority strategies can be used as compensatory strategies to improve gait during dual-tasks, and both strategies can also be used in training programs for walking rehabilitation.
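
The stride-variability outcomes above are coefficients of variation; as a brief illustration only (the values are made up, not data from the thesis), the CV is simply the standard deviation expressed as a percentage of the mean:

```python
# Coefficient of variation (CV) = standard deviation / mean * 100%.
# Stride-length values are illustrative, not data from the thesis.
import statistics

stride_lengths_m = [1.02, 0.98, 1.05, 0.95, 1.00, 1.03]
cv_percent = statistics.stdev(stride_lengths_m) / statistics.mean(stride_lengths_m) * 100
print(f"CV of stride length: {cv_percent:.1f}%")   # higher CV = more variable gait
```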
25

A Systematic Stakeholder Selection Model in Requirements Elicitation for Software Projects: A Systematic Mapping Study

Zhao, Junwen, Zhao, Yuemin January 2018
Context. Appropriate stakeholder selection is an essential stage of software engineering and a precondition for software requirements elicitation. However, stakeholder analysis has not received enough attention in the requirements elicitation field, as it is commonly treated as a self-evident step in practice. Objectives. In this study, we investigated the current state of this area and collated the factors that influence appropriate stakeholder selection in software engineering (SE) with respect to requirements elicitation. Building on this, we developed a systematic conceptual model intended to guide the selection of appropriate stakeholders for software projects. Finally, we evaluated the practical relevance of our model. Methods. We conducted a systematic mapping study for the first objective; the selection of influencing factors builds on its results. The practical relevance of the model was evaluated by interviewing 10 experienced software product managers. Results. Recent studies in this area were classified according to their different focuses, and we described the methods of each paper included in the systematic mapping study. We collected 12 factors used by previous studies and selected 6 of them for our model, which we generated in six steps. The interviews with the ten practitioners were used to evaluate the model. Conclusions. We extracted six factors from previous studies and then proposed a systematic stakeholder selection model for software projects based on an analysis of those factors. Overall, the practical relevance of our model was confirmed by the interviews with experienced practitioners.
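
The abstract does not name the six factors or how they are combined; as a rough illustration only, a factor-weighted scoring of candidate stakeholders could look like the sketch below. The factor names, weights, and candidates are hypothetical, not the ones selected in the thesis.

```python
# Illustrative factor-weighted stakeholder scoring; factors and weights
# are hypothetical, not the six factors selected in the thesis.
weights = {"influence": 0.3, "interest": 0.2, "domain_knowledge": 0.2,
           "availability": 0.1, "communication": 0.1, "legitimacy": 0.1}

candidates = {
    "Product owner": {"influence": 9, "interest": 9, "domain_knowledge": 8,
                      "availability": 6, "communication": 8, "legitimacy": 9},
    "End user rep":  {"influence": 5, "interest": 8, "domain_knowledge": 7,
                      "availability": 7, "communication": 6, "legitimacy": 7},
}

def score(factors):
    return sum(weights[f] * value for f, value in factors.items())

# Elicit requirements from the top-ranked stakeholders first.
for name, factors in sorted(candidates.items(), key=lambda kv: score(kv[1]), reverse=True):
    print(f"{name}: {score(factors):.1f}")
```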
26

La priorisation humaine. Étude des corrélats cognitifs de l'évaluation des priorités / Human prioritizing. A cognitive study of priorities assessment

Valéry, Benoît 01 December 2017
The viability of complex work systems depends in part on the ability of operators to manage competing tasks according to their respective priorities. This is a critical process which, when it fails, can lead to errors of neglect of, or perseveration on, certain operations. Current models of human prioritization have many limitations: they are often normative and do not adequately account for how operators actually use the information available in the environment to compute priorities. Their weak descriptive power makes it difficult to understand the cognitive mechanisms that contribute to error, especially in situations of mental overload. This thesis, adopting a cognitive ergonomics approach, aims to better understand and describe these processes. Three fundamental experiments were conducted, all of which placed participants in dual-task situations in which various attributes of the tasks (difficulty, importance, margin of error) were manipulated. These tasks reproduced the demands of complex environments, such as memorization, monitoring, and diagnosis under uncertainty. Taken together, the results underline the importance of prioritization for the proper functioning of complex systems, and suggest that operators should be made more aware of these phenomena during their training, together with longer-term reflection on the presentation of information in human-machine interfaces.
27

Generating Mixed-Level Covering Arrays of Lambda = 2 and Test Prioritization

January 2015
In software testing, components are tested individually to make sure each performs as expected. The next step is to confirm that two or more components are able to work together. This stage of testing is often difficult because there can be numerous configurations between just two components. Covering arrays are one way to ensure a set of tests will cover every possible configuration at least once. However, on systems with many settings, it is computationally intensive to run every possible test. Test prioritization methods can identify tests of greater importance, and this concept can help determine which tests can be removed with minimal impact on the overall testing of the system. This thesis presents three algorithms that generate covering arrays testing the interaction of every two components at least twice. These algorithms extend the functionality of an established greedy test prioritization method to ensure important components are selected in earlier tests. The algorithms are tested on various inputs, and the results reveal that, on average, the resulting covering arrays are two-fifths to one-half the size of a covering array generated by brute force.
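
The thesis's algorithms are not given in the abstract; as a small sketch of what the λ = 2 requirement means, the code below counts how often each pair of component settings appears in a test suite and reports pairs still needing coverage. The parameters, values, and tests are toy examples, not taken from the thesis.

```python
# Sketch of checking lambda = 2 pairwise coverage: every combination of values
# for every pair of parameters must appear in at least two tests.
# Parameters, values, and tests are toy examples, not from the thesis.
from collections import Counter
from itertools import combinations

parameters = {"os": ["linux", "windows"], "db": ["mysql", "sqlite"], "cache": ["on", "off"]}

tests = [
    {"os": "linux",   "db": "mysql",  "cache": "on"},
    {"os": "linux",   "db": "sqlite", "cache": "off"},
    {"os": "windows", "db": "mysql",  "cache": "off"},
    {"os": "windows", "db": "sqlite", "cache": "on"},
]

counts = Counter()
for test in tests:
    for p1, p2 in combinations(parameters, 2):
        counts[(p1, test[p1], p2, test[p2])] += 1

# Pairs covered fewer than twice still need additional tests.
uncovered = [(p1, v1, p2, v2)
             for p1, p2 in combinations(parameters, 2)
             for v1 in parameters[p1] for v2 in parameters[p2]
             if counts[(p1, v1, p2, v2)] < 2]
print(f"{len(uncovered)} pair combinations need more coverage for lambda = 2")
```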
28

Methods For Test Case Prioritization Based On Test Case Execution History

Ying, PuLe, Fan, LingZhi January 2017
Motivation: Test case prioritization orders test cases so as to optimize test execution and save time and cost. There are many different methods for test case prioritization; methods based on test case execution history are one kind of them. Basing prioritization on execution history makes it easier to increase the rate of fault detection, so we wanted to study test case prioritization methods based on test case execution history and to execute the feasible methods in order to compare their effectiveness. The thesis may be regarded as an example of an approach for comparing test case prioritization methods based on execution history, or as a case study for identifying suitable methods and helping improve the effectiveness of the testing process. Objectives: The aim of this thesis is to find a suitable test case prioritization method that can support risk-based testing, in which test case execution history is employed as the key evaluation criterion. There are three main objectives: first, explore and summarize methods of test case prioritization based on test case history; next, identify the differences among these methods; finally, execute the selected methods and compare their effectiveness. Methods: To achieve the first and second objectives, a systematic literature review was conducted following the Kitchenham guidelines. To achieve the third objective, an experiment was conducted following the Wohlin guidelines. Results: 1) We conducted a systematic literature review and selected 15 relevant publications. We extracted and synthesized their data and found that the methods differ in their inputs, test levels, maturity levels, validation, and whether they target automated or manual testing. 2) We selected two feasible methods from those 15 publications: Method 1 is adaptive test-case prioritization and Method 2 is a similarity-based test quality metric. We executed the methods on 17 test suites. Comparing the results of the two methods and of no prioritization, the mean Average Percentage of Defects Found (APFD) of adaptive test-case prioritization (86.9%) is significantly higher than that of no prioritization (51.5%) and of the similarity-based test quality metric (47.5%), meaning that adaptive test-case prioritization is more effective. Conclusion: Existing test case prioritization methods based on test case execution history were extracted and listed through the systematic literature review; the thesis summarizes them and describes their differences. The 15 relevant publications and the synthesized data may serve as a guideline for software researchers and testers. A statistical test on the experimental results shows that the two test case prioritization methods differ in effectiveness.
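
The effectiveness measure reported here, APFD, has a standard closed form: APFD = 1 − (TF₁ + … + TFₘ)/(n·m) + 1/(2n), where TFᵢ is the position of the first test that detects fault i, n is the number of tests and m the number of faults. A small sketch of computing it for an illustrative prioritized suite follows; the fault matrix is made up, not the thesis's data.

```python
# APFD = 1 - (TF_1 + ... + TF_m) / (n * m) + 1 / (2 * n),
# where TF_i is the 1-based position of the first test that detects fault i,
# n is the number of tests and m the number of faults.
# The fault matrix below is illustrative, not data from the thesis.

def apfd(test_order, faults_detected_by):
    n = len(test_order)
    m = len({f for faults in faults_detected_by.values() for f in faults})
    first_detection = {}
    for position, test in enumerate(test_order, start=1):
        for fault in faults_detected_by.get(test, set()):
            first_detection.setdefault(fault, position)
    return 1 - sum(first_detection.values()) / (n * m) + 1 / (2 * n)

faults = {"t1": {"f1"}, "t2": {"f2", "f3"}, "t3": set(), "t4": {"f1", "f4"}}
print(apfd(["t2", "t4", "t1", "t3"], faults))  # prioritized order: 0.75
print(apfd(["t1", "t2", "t3", "t4"], faults))  # original order: 0.5625, lower
```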
29

Impacts of Small, Surface-Release Dams on Stream Temperature and Dissolved Oxygen in Massachusetts

Zaidel, Peter 12 July 2018
Dams fragment streams and rivers, with >14,000 in New England alone, and have the potential to significantly alter the physical, chemical, and biological characteristics of lotic systems. For example, dams can alter temperature and dissolved oxygen (DO) regimes, which can, in turn, affect species distributions, whole system metabolism, and nutrient processing rates. Moreover, changes in temperature signal life history cues (e.g., emergence, egg-hatching, migration) for many species of aquatic organisms, and present another avenue for dams to alter biotic communities. Despite the prevalence of small dams in the landscape and their potential significant impacts on temperature and DO, dams have not been well-studied and published impacts vary widely across sites. Given the variation in impact, I sought to quantify the impacts of small dams to stream temperature and DO, and to determine the drivers of inter- and intra-site variation in response. To accomplish this, I deployed 160 continuous temperature data loggers at 30 small, surface-release dams in Massachusetts. The majority of sites (61%) had higher temperatures downstream of the dam compared to upstream and most (85%) experienced decreasing temperatures with increasing distance downstream of the dam, such that the warmest temperatures were located closest to the dam. At approximately half of the temperature sites, flow had a homogenizing effect on temperatures throughout the study reach, whereby impacts were more pronounced (e.g., more warming, faster decay rates) under periods of low flow than under high flow conditions. Magnitude of warming varied greatly among sites, and this variation was explained best by landscape position and reservoir volume, with dams in smaller watersheds and with larger reservoir volumes experiencing greater warming magnitudes. Forest cover, dam height, and the presence of an auxiliary spillway best predicted the downstream temperature decay rate, with temperatures cooling fastest downstream of shorter dams in forested basins that did not have an auxiliary spillway. I used continuous DO loggers upstream, within the impoundment, and downstream of 12 dams to identify dam impacts to DO. Most sites experienced lower DO (66%) within the impoundment compared to upstream; however, 58% of the sites showed no difference in diel ranges between these reaches. The effect of dams on downstream DO was mixed, with increases, no change, and decreases relative to upstream condition; however, the majority of sites (58%) experienced a suppressed downstream diel range relative to upstream. The upstream slope, basin size, and dam height drove the impoundment response, such that dams with steeper upstream reach slopes, located in smaller basins, and with shorter dam heights experienced the greatest decreases in impoundment DO relative to upstream. Differences between downstream and upstream DO were best explained by upstream slope and impoundment volume, whereby sites with steeper upstream reaches and larger volumes of water within the impoundment experienced the largest decreases in downstream DO when compared to upstream reaches. These results may help managers prioritize dam removal at sites where a dam is having larger and more negative (e.g., elevated temperatures, decreased DO) impacts, and therefore where the greatest benefits should occur following restoration.
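
The abstract does not state how the downstream temperature decay rate was modelled; as a rough illustration only, a first-order (exponential) decay of dam-induced warming with distance could be fit as in the sketch below. Both the data and the model form are assumptions, not results or methods from the thesis.

```python
# Illustrative fit of an exponential decay of dam-induced warming with distance
# downstream: delta_T(x) = delta_T0 * exp(-k * x). Data and model form are
# assumptions for illustration, not results from the thesis.
import numpy as np

distance_m = np.array([0, 50, 100, 200, 400])      # distance below the dam
warming_c = np.array([2.0, 1.6, 1.3, 0.9, 0.4])    # warming relative to upstream

# Linear regression on log-transformed warming gives the decay rate k.
k, log_dt0 = np.polyfit(distance_m, np.log(warming_c), 1)
print(f"decay rate: {-k:.4f} per metre, initial warming: {np.exp(log_dt0):.2f} degC")
```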
30

District Level Preventive Maintenance Treatment Selection Tool for Use in Virginia

Hosten, Akyiaa Makeda 06 February 2013
Preventive maintenance has the potential to improve network condition by retarding future pavement deterioration. The Virginia Department of Transportation (VDOT) uses its pavement management system to determine maintenance targets for each district. The districts then use these recommendations to select pavements that will receive maintenance and the types of treatments that will be applied. Each district has a different approach to preventive maintenance, and there was a need for more consistent preventive maintenance practices across the state. This thesis outlines guidelines for the implementation of a preventive maintenance policy. Preventive maintenance treatments currently used within Virginia include chip seal, slurry seal, microsurfacing, and thin hot-mix asphalt overlays. Historical pavement condition data for these treatments was obtained from the VDOT PMS, and treatment performance models were developed. A district-level treatment selection tool was developed to assist the district-level decision-making process. A prioritized list of pavement sections was generated, maximizing the cost-effectiveness of the selected treatments subject to budgetary constraints set by the central office. The treatment selection tool was then run for each pavement classification in each district, and the results of this analysis were presented. Although the recommended budget for each district was very close to the targets set by the central office, the recommended lane miles for each district were about half the targets set by the central office. It is believed that the unit costs used in this analysis were higher than those used in the VDOT PMS analysis. The selection tool has the potential to be a very powerful decision-support aid if the unit costs are representative of the expected treatment costs for each district.
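
The abstract describes maximizing treatment cost-effectiveness under a budget cap without giving the exact optimization; a minimal greedy sketch of that kind of selection is shown below. The sections, costs, and benefit scores are hypothetical, not VDOT data.

```python
# Greedy sketch of budget-constrained treatment selection: rank candidate
# (section, treatment) pairs by benefit per dollar and fund them until the
# district budget is exhausted. Sections, costs, and benefit scores are
# hypothetical, not VDOT data.
candidates = [
    {"section": "I-81 NB 12.4-14.0", "treatment": "chip seal",      "cost": 120_000, "benefit": 9.0},
    {"section": "US-29 SB 3.1-4.5",  "treatment": "microsurfacing", "cost": 200_000, "benefit": 12.0},
    {"section": "SR-606 1.0-2.2",    "treatment": "thin HMA",       "cost": 310_000, "benefit": 14.0},
    {"section": "US-50 EB 7.7-8.9",  "treatment": "slurry seal",    "cost": 90_000,  "benefit": 5.5},
]

budget = 500_000
selected, spent = [], 0
for c in sorted(candidates, key=lambda c: c["benefit"] / c["cost"], reverse=True):
    if spent + c["cost"] <= budget:
        selected.append(c)
        spent += c["cost"]

for c in selected:
    print(f'{c["section"]}: {c["treatment"]} (${c["cost"]:,})')
print(f"total programmed: ${spent:,}")
```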
