About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
351

Development of a Quality Assurance Procedure for Dose Volume Histogram Analysis

Davenport, David Alan 07 August 2013 (has links)
No description available.
352

Physical Security Assessment of a Regional University Computer Network

Timbs, Nathan H 01 December 2013 (has links) (PDF)
Assessing a network's physical security is an essential step in securing its data. This document describes the design, implementation, and validation of PSATool, a prototype application for assessing the physical security of a network's intermediate distribution frames, or IDFs (a.k.a. "wiring closets"). PSATool was created to address a lack of tools for IDF assessment. It implements a checklist-based protocol for assessing compliance with 52 security requirements compiled from federal and international standards. This checklist can be extended according to organizational needs. PSATool was validated by using it to assess physical security at 135 IDFs at East Tennessee State University. PSATool exposed 95 threats, hazards, and vulnerabilities in 82 IDFs. A control was recommended for each threat, hazard, and vulnerability discovered. The administrators of ETSU's network concluded that PSATool's results agreed with their informal sense of these IDFs' physical security, while providing documented support for improvements to IDF security.
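A checklist-based protocol of this kind can be sketched in a few lines of Python. This is an illustrative sketch, not PSATool's actual data model: the requirement IDs, checklist items, citations, and field names below are all invented.

```python
from dataclasses import dataclass

@dataclass
class Requirement:
    rid: str     # checklist identifier (invented for illustration)
    text: str    # the requirement itself
    source: str  # standard it was compiled from

@dataclass
class Finding:
    requirement: Requirement
    compliant: bool

def assess_idf(idf_name, checklist, answers):
    """Walk the checklist for one IDF; every non-compliant item becomes a
    finding for which a control can later be recommended."""
    findings = [Finding(req, answers.get(req.rid, False)) for req in checklist]
    threats = [f for f in findings if not f.compliant]
    return {"idf": idf_name, "assessed": len(findings),
            "threats": len(threats), "details": threats}

checklist = [
    Requirement("PHY-01", "IDF door locks automatically", "illustrative citation"),
    Requirement("PHY-02", "Access to the IDF is logged", "illustrative citation"),
]
report = assess_idf("Library-2F", checklist, {"PHY-01": True, "PHY-02": False})
print(report["threats"])  # → 1
```

Extending the checklist to match organizational needs, as the abstract describes, amounts to appending further `Requirement` entries.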
353

Non-contact Methods for Detecting Hot-mix Asphalt Nonuniformity

de León Izeppi, Edgar 06 November 2006 (has links)
Segregation, or non-uniformity, in Hot Mix Asphalt (HMA) induces accelerated pavement distress that can reduce a pavement's service life by up to 50%. Quality-assurance procedures should detect and quantify this problem in newly constructed pavements. Current practice usually relies on visual inspections that identify areas of non-uniform surface texture; an automated process that reduces subjectivity would improve the quality-assurance procedures for HMA pavements. Virginia has undertaken a focused research effort to improve the uniformity of HMA pavements. A method using dynamic (laser-based) surface macrotexture instruments showed great promise, but such instruments may miss significant segregated areas because they measure only very thin longitudinal lines. The main objective of this research is to develop a non-contact system that detects segregated HMA areas and identifies their locations along a road for HMA quality-assurance purposes. The developed system uses relatively low-cost components and innovative image processing and analysis software. It computes the gray-level co-occurrence matrix (GLCM) of images of newly constructed pavements to obtain parameters commonly used in visual texture analysis. Using principal component analysis to integrate the multivariable data into a single classifier, Hotelling's T2 statistic, the system then produces a list of the locations of possible nonuniformities that require closer inspection. Field evaluations at the Virginia Smart Road showed that the system can discriminate between different pavement surfaces. The system was then verified through a series of field tests evaluating the uniformity of newly constructed pavements: a total of 18 continuous segments of recently paved roads were tested and analyzed. Tables and plots for use by inspection personnel in the field were developed. The results of these field tests confirmed the system's ability to detect potential nonuniformities in recently completed pavements and its potential as a useful tool in the final inspection process. / Ph. D.
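The GLCM features and Hotelling's T2 outlier test described above can be sketched as follows. This is a toy reconstruction, not the study's implementation: the quantization scheme, single horizontal neighbour offset, feature set, and synthetic "pavement" patches are all chosen for illustration.

```python
import numpy as np

def glcm_features(patch, levels=8):
    """Gray-level co-occurrence matrix for horizontal neighbours, then three
    classic texture descriptors: contrast, energy, homogeneity."""
    q = (patch.astype(float) / patch.max() * (levels - 1)).astype(int)
    glcm = np.zeros((levels, levels))
    for i, j in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[i, j] += 1
    glcm /= glcm.sum()  # normalize to a joint probability
    d = np.arange(levels)[:, None] - np.arange(levels)[None, :]
    contrast = (glcm * d**2).sum()
    energy = (glcm**2).sum()
    homogeneity = (glcm / (1.0 + np.abs(d))).sum()
    return np.array([contrast, energy, homogeneity])

def hotelling_t2(X):
    """Hotelling's T^2 of each row against the sample mean and covariance;
    rows with a large T^2 are flagged for closer inspection."""
    dev = X - X.mean(axis=0)
    s_inv = np.linalg.pinv(np.cov(X, rowvar=False))
    return np.einsum('ij,jk,ik->i', dev, s_inv, dev)

rng = np.random.default_rng(0)
patches = [np.full((32, 32), 110) for _ in range(9)]  # uniform texture
patches.append(rng.integers(0, 255, (32, 32)))        # segregated-looking patch
X = np.vstack([glcm_features(p) for p in patches])
t2 = hotelling_t2(X)
print(int(np.argmax(t2)))  # → 9, the anomalous patch
```

Integrating several texture descriptors into the single T2 score is what lets one threshold flag "needs closer inspection" instead of juggling one threshold per feature.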
354

Optical accuracy assessment of robotically assisted dental implant surgery

Klass, Dmitriy, D.D.S. 11 August 2022 (has links)
BACKGROUND: Static and dynamic dental implant guidance systems have established themselves as effective choices that result in predictable and relatively accurate dental implant placement. Studies generally assess this accuracy using a postoperative CBCT, which has disadvantages such as additional radiation exposure for the patient. This pilot study proposed a scanbody-agnostic method of implant position assessment using intraoral scanning technology and applied it as an accuracy test of robotically assisted dental implant placement using the Neocis Yomi. MATERIALS AND METHODS: All of the robotically assisted dental implant surgery was performed in the Postdoctoral Periodontology clinic at Boston University Henry M. Goldman School of Dental Medicine. Completely edentulous patients were excluded. A total of eleven (11) implants were included in the study, eight (8) of which were fully guided. An optical impression of each implant position was obtained using a CEREC Omnicam (SW 5.1) intraoral scanner. Each sample used either a DESS Lab Scan Body or an Elos Accurate Scan Body to indirectly index the position of the implant. The planned implant position was compared with the executed surgical implant position for each placement using Geomagic Control X software. Global positional and angular deviations were quantified using the proposed scanbody-agnostic method. The intraoral directionality of deviation was visually qualified by the author (D.K.). RESULTS: Mean global positional deviations at the midpoint of the top of each scanbody were 1.7417 mm in the partially guided samples and 1.1300 mm in the fully guided samples. Mean global positional deviations at the midpoint of the restorative platform of each implant were 1.3142 mm in the partially guided samples and 1.27045 mm in the fully guided samples. Mean global positional deviations at the midpoint of the apex of each implant were 1.455 mm in the partially guided samples and 1.574 mm in the fully guided samples. Mean angular deviations were 3.7492 degrees in the partially guided samples and 2.6432 degrees in the fully guided samples. CONCLUSION: Within the limitations of the sample size, robotically assisted dental implant surgery offers implant placement accuracy similar to published static and dynamic implant placement guidance systems. The intraoral optical assessment of dental implant position used in this study allows analysis comparable to other methods without requiring additional radiation exposure and should be considered the default method of assessing guidance accuracy.
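The two deviation measures reported above reduce to simple vector geometry: global positional deviation is the Euclidean distance between corresponding planned and placed landmark points, and angular deviation is the angle between the planned and placed implant long axes. A minimal sketch, with all coordinates hypothetical:

```python
import numpy as np

def implant_deviation(planned_pt, placed_pt, planned_axis, placed_axis):
    """Global positional deviation: Euclidean distance (mm) between matching
    landmark points. Angular deviation: angle (degrees) between long axes."""
    positional = float(np.linalg.norm(np.subtract(placed_pt, planned_pt)))
    a, b = np.asarray(planned_axis, float), np.asarray(placed_axis, float)
    # clip guards arccos against floating-point values just outside [-1, 1]
    cos_ang = np.clip(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)), -1.0, 1.0)
    return positional, float(np.degrees(np.arccos(cos_ang)))

# hypothetical planned vs. executed positions (mm) and axis direction vectors
pos, ang = implant_deviation([0.0, 0.0, 0.0], [1.0, 0.5, 0.3],
                             [0.0, 0.0, 1.0], [0.05, 0.0, 1.0])
print(round(pos, 3), round(ang, 2))
```

The same landmark-distance computation applies at any of the three reference points the study uses (scanbody top, restorative platform, apex); only the pair of points changes.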
355

Sustainability Reports and the Missing of One Official Standard : Consequence for Audit Quality

Bruno, Saron, Shariff, Said January 2022 (has links)
Sustainability reporting has increased in popularity over the last decade, giving medium to large companies a tool to communicate with their users about their sustainability activity. As globalization and technology spread and demand increased, many violations were committed by companies, including human-rights violations, economic inequality, and pollution. These violations led different stakeholders to demand that companies be transparent and report on their sustainability activity through sustainability reporting. As sustainability reporting grew in popularity, many challenges arose, including a lack of homogeneity and comparability for users and for the firms themselves. Many standards and guidelines have been created, and are still being created, to minimize the problems that firms, stakeholders, and auditors face. However, the many standards and guidelines introduced make the job of auditors more complex and harder, as they must know and understand all the available standards. Multiple studies have examined sustainability reports and the absence of one single official standard, exploring the subject through interviews or case research to understand the consequences for auditors and/or audit quality. Following this previous research, this paper answers the question posed by the authors: "What are the consequences of the absence of one single official standard in sustainability reporting on the audit quality?". To answer this question, the authors interviewed 10 participants who work with auditing, accounting, and analyzing sustainability reports, which gave the thesis a practical contribution. Furthermore, the thesis contributes to knowledge by examining the results using Emergency Theory, Legitimacy Theory, and Complexity Theory. This thesis provides evidence on the consequences of having multiple standards with weak legal requirements for sustainability reporting and on why one single official standard is needed. The findings of this thesis show that auditors face multiple challenges when there is no single official standard, and that these challenges affect audit quality when auditing sustainability reports. The results show that sustainability is highly multidimensional, which makes the topic complex for auditors. Auditors also face the challenge that sustainability is an emerging concept in the accounting domain, so little research and theory is available to produce frameworks and standards for sustainability reports.
356

Assessment Resistance: Using Kubler-Ross to Understand and Respond

Tarnoff, Karen A., Bostwick, Eric D., Barnes, Kathleen J. 24 November 2021 (has links)
Purpose: Faculty participation in the assurance of learning (AoL) is requisite both for the effective operation of the system and for accreditation compliance, but faculty often resist engaging in AoL tasks. The purpose of this paper is to provide specific recommendations to address faculty concerns and to guide AoL systems toward maturity. Design/methodology/approach: This paper provides a comprehensive model of faculty resistance perspectives aligned to AoL maturity, provides specific responses to faculty resistance, and introduces success markers of progress toward maturity. Findings: Specifically, a three-stage model of AoL system maturity is presented and aligned with five faculty perspectives. For each faculty perspective, responses targeting causal factors are proposed and signs of progress toward the next level of faculty engagement are highlighted. Practical implications: Faculty and AoL leaders will be able to identify their current stage of AoL system maturity and implement practical solutions to move to the next stage of system maturity. Social implications: Understanding the motivations for faculty resistance will facilitate more meaningful and effective internal interactions as a school seeks to improve its AoL system. In turn, a more effective AoL system will promote better learning experiences for students, and better learning allows students to become productive in their chosen careers more quickly, thus improving society as a whole. Originality/value: To the authors' knowledge, no prior paper has organized faculty resistance along a maturity continuum, provided targeted responses based on the level of maturity, or included signs that indicate growth toward the next level of maturity.
357

Investigation of Chromosome Size Effect on the Rate of Crossovers in the Meiotic Yeast Saccharomyces cerevisiae

Galland, Lanie Maria 01 June 2014 (has links) (PDF)
Meiosis is a specialized type of cell division characterized by a single round of DNA replication and two rounds of chromosome segregation, ultimately resulting in four haploid cells. During meiosis I, chromosomes align and reciprocal recombination results in the formation of a crossover, creating the tension required to properly segregate homologs during the first round of meiosis. Two mechanisms involved in regulating the occurrence of crossing over are assurance and interference. Crossover assurance describes the phenomenon that at least one crossover will form between each pair of homologous chromosomes during prophase I. Crossover interference, on the other hand, describes the nonrandom placement of crossovers between homologs, increasing the probability that a second crossover will occur at a discrete distance away from the first one. In addition to assurance and interference, chromosome size may play a role in the rate of meiotic recombination during prophase I. As a result of crossover assurance, small chromosomes receive a minimum of one crossover, the obligate crossover. Assuming chromosome size does not influence the rate of recombination, pairs of large chromosomes should experience the same number of crossovers per base pair as small chromosomes. Previous studies have been inconsistent: Kaback et al. (1999) saw decreased rates of crossing over between large chromosomes relative to small ones, suggesting that crossover interference acts across a larger distance on large chromosomes. Turney et al. (2004), however, saw no such effect, suggesting that these findings may be site- or sequence-specific. The current study used the Cre-loxP system to create translocated chromosomes, decreasing the size of chromosome VIII from 562 kb to 125 kb. 
The rate of crossing over was evaluated using nutrient marker genes inserted on the left arm of chromosome VIII to allow phenotypic detection of crossing over between homologous translocated chromosomes, in comparison to crossing over between homologous nontranslocated chromosomes. Construction of translocated strains was attempted, but further testing suggests that the translocation itself may be lethal; in the future, we plan to investigate this potential lethality further. We also experienced difficulty in curing yeast cells of the Cre expression plasmid: as pSH47 was removed, translocated chromosomes reverted to nontranslocated chromosomes. In addition, crossing over in nontranslocated yeast, along with subsequent molecular analysis, revealed that one of the marker genes presumed to be on the left arm of chromosome VIII is, in fact, located on a different chromosome, preventing analysis of crossing over in this region. As a result, we were unable to proceed with the current experimentation.
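The size-effect question above turns on a simple piece of arithmetic: if crossovers occurred at a uniform per-kb rate regardless of chromosome size, the shortened 125 kb chromosome would be expected to receive less than one crossover, yet crossover assurance guarantees at least one. A toy calculation makes the expected inflation of the per-kb rate explicit; the 0.005 crossovers/kb figure is an illustrative assumption, not a measurement from this study.

```python
# Illustrative only: UNIFORM_RATE is an assumed genome-wide rate.
UNIFORM_RATE = 0.005            # hypothetical crossovers per kb
FULL_KB, SHORT_KB = 562, 125    # chromosome VIII before/after translocation

expected_full = UNIFORM_RATE * FULL_KB    # ~2.8 crossovers: assurance is moot
expected_short = UNIFORM_RATE * SHORT_KB  # ~0.6: below the obligate minimum
observed_short = max(1.0, expected_short) # crossover assurance enforces >= 1

# per-kb rate on the shortened chromosome is inflated above the uniform rate
print(observed_short / SHORT_KB)  # → 0.008
```

This is why comparing per-base-pair crossover rates between the 562 kb and 125 kb versions of chromosome VIII can distinguish a genuine size effect from the obligate-crossover floor.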
358

Increasing Program Effectiveness Through use of Principles of Andragogy in Tennessee Beef Quality Assurance Programs

McCormick, Lisa Ellis 07 July 2023 (has links)
Tennessee Beef Quality Assurance (BQA) programs teach beef producers the importance of quality within the beef industry. BQA programs assure consumers of the quality and safety of supplied beef, as well as the environmental orientation of farm production practices (Tsakiridis et al., 2021). Any active BQA certificate holder in Tennessee can apply for the Tennessee Agricultural Enhancement Program (TAEP), which significantly benefits both farmers and the economy. The TAEP is a cost-share system that has provided over $106 million in funding to over thirty-seven thousand programs in the agricultural community statewide (Farm Bureau, Tennessee 2019 Resolutions, 2019). The cost-share program helps farmers begin projects that would not have been financially feasible without it (Menard et al., 2019). The BQA program is an educational program taught through Cooperative Extension efforts and is aimed predominantly at adult beef cattle producers. Andragogy, also known as adult learning theory, was developed by Malcolm Knowles to teach adults effectively. In this study, qualitative and quantitative methods were used to identify how andragogy is being used in Tennessee BQA programs. The results showed that Extension agents followed the seven-step andragogical design process and that BQA participants exhibit the six andragogical principles. Recommendations for future research are to adapt the Andragogy in Practice Inventory for instructors, to conduct a study addressing counties with smaller participation, and to conduct studies with county agents in early career stages. Recommendations for the Tennessee BQA program are to hold trainings for Extension agents on the andragogical process and to reevaluate the requirement for additional programs.
/ Master of Science in Life Sciences / Since BQA was established in 1987 by the Beef Checkoff, trainings across 47 states have been implemented to provide beef producers with the tools and training necessary to assure animal health and well-being. The program is an educational program typically taught through Extension education, which was established by the Smith-Lever Act of 1914 as the educational outreach of the Land-Grant institutions for the growth of rural areas across the United States. This study aimed to identify how adult learning theory, andragogy, is used in Tennessee BQA programs and to make appropriate recommendations to ensure program effectiveness. The study is important for identifying the educational effectiveness of the BQA program and for ensuring that program participants implement program objectives, fulfilling the goals and purposes of the BQA program.
359

Computational intelligence for safety assurance of cooperative systems of systems

Kabir, Sohag, Papadopoulos, Y. 29 March 2021 (has links)
Cooperative Systems of Systems (CSoS), including autonomous systems (AS) such as autonomous cars and related smart traffic infrastructures, form a new technological frontier, given their enormous economic and societal potential in various domains. CSoS are often safety-critical systems and are therefore expected to have a high level of dependability. Due to the open and adaptive nature of CSoS, the conventional methods used to provide safety assurance for traditional systems cannot be applied directly to these systems. The potential configurations and scenarios of the evolving operation are infinite and cannot be exhaustively analysed to provide guarantees a priori. This paper presents a novel framework for dynamic safety assurance of CSoS, which integrates design-time models and runtime techniques to provide continuous assurance for a CSoS and its constituent systems during operation. / Dependability Engineering Innovation for Cyber Physical Systems (DEIS) H2020 Project under Grant 732242.
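As a rough illustration of the runtime half of such a framework (a sketch under assumed names and invariants, not the paper's actual method), safety properties fixed at design time can be re-evaluated continuously against live telemetry, since the open CSoS configurations cannot be enumerated in advance:

```python
# Minimal sketch: design-time safety invariants checked at runtime.
# The invariant names, telemetry fields, and thresholds are all hypothetical.
class RuntimeSafetyMonitor:
    def __init__(self, invariants):
        self.invariants = invariants  # name -> predicate over the joint state

    def check(self, state):
        """Return the names of invariants violated by the current state."""
        return [name for name, pred in self.invariants.items()
                if not pred(state)]

# e.g. two cooperating autonomous cars sharing headway and speed telemetry
invariants = {
    "safe_headway": lambda s: s["gap_m"] >= 0.5 * s["speed_mps"],
    "speed_limit": lambda s: s["speed_mps"] <= 33.0,
}
monitor = RuntimeSafetyMonitor(invariants)
print(monitor.check({"gap_m": 10.0, "speed_mps": 30.0}))  # → ['safe_headway']
```

In a fuller framework the violated-invariant list would feed a mitigation or reconfiguration step rather than just being reported.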
360

Combining Static Analysis and Dynamic Learning to Build Context Sensitive Models of Program Behavior

Liu, Zhen 10 December 2005 (has links)
This dissertation describes a family of models of program behavior, the Hybrid Push Down Automata (HPDA) that can be acquired using a combination of static analysis and dynamic learning in order to take advantage of the strengths of both. Static analysis is used to acquire a base model of all behavior defined in the binary source code. Dynamic learning from audit data is used to supplement the base model to provide a model that exactly follows the definition in the executable but that includes legal behavior determined at runtime. Our model is similar to the VPStatic model proposed by Feng, Giffin, et al., but with different assumptions and organization. Return address information extracted from the program call stack and system call information are used to build the model. Dynamic learning alone or a combination of static analysis and dynamic learning can be used to acquire the model. We have shown that a new dynamic learning algorithm based on the assumption of a single entry point and exit point for each function can yield models of increased generality and can help reduce the false positive rate. Previous approaches based on static analysis typically work only with statically linked programs. We have developed a new component-based model and learning algorithm that builds separate models for dynamic libraries used in a program allowing the models to be shared by different program models. Sharing of models reduces memory usage when several programs are monitored, promotes reuse of library models, and simplifies model maintenance when the system updates dynamic libraries. Experiments demonstrate that the prototype detection system built with the HPDA approach has a performance overhead of less than 6% and can be used with complex real-world applications. 
When compared to other detection systems based on analysis of operating system calls, the HPDA approach is shown to converge faster during learning, to detect attacks that escape other detection systems, and to have a lower false positive rate.
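The dynamic-learning half of the approach can be illustrated with a much-simplified sketch: record which system calls are observed from each call-stack context (the list of return addresses) during training, then flag (context, syscall) pairs never seen before. This toy model ignores the pushdown structure, the static-analysis base model, and the component-based library models of the actual HPDA; the addresses and syscall names are invented.

```python
from collections import defaultdict

class StackContextModel:
    """Context-sensitive behavior model keyed on the return-address stack."""

    def __init__(self):
        self.legal = defaultdict(set)  # call-stack context -> allowed syscalls

    def learn(self, trace):
        """Absorb (syscall, return-address stack) pairs from audit data."""
        for syscall, stack in trace:
            self.legal[tuple(stack)].add(syscall)

    def anomalies(self, trace):
        """Flag pairs whose syscall was never seen from that context."""
        return [(syscall, stack) for syscall, stack in trace
                if syscall not in self.legal[tuple(stack)]]

train = [("open", [0x400A, 0x4101]),
         ("read", [0x400A, 0x4101, 0x4210]),
         ("close", [0x400A, 0x4101])]
model = StackContextModel()
model.learn(train)
# same syscall, but issued from a call-stack context never seen in training
print(len(model.anomalies([("open", [0x400A, 0x9999])])))  # → 1
```

Keying on the full stack rather than the syscall alone is what makes the model context sensitive: an attack that issues a normally legal syscall from an abnormal call path is still detected.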
