  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
11

A high fidelity audio compressor-limiter with optoelectronic gain control

Zabel, William Peter, January 1969 (has links)
Thesis (M.S.)--University of Wisconsin--Madison, 1969. / eContent provider-neutral record in process. Description based on print version record. Includes bibliographical references.
12

Infant Effects on Experimenter Fidelity: New Data

Dixon, Wallace, Driggers-Jones, L. P., Robertson, Chelsea LeeAnn 01 July 2020 (has links)
No description available.
13

GENETIC FIDELITY IN EXTREMOPHILES

MACKWAN, REENA RUFUS 17 July 2006 (has links)
No description available.
14

Assessing the Impact of Reading First Programs on Student Achievement in K-3 Classrooms in Selected Mississippi Schools

Day-Meeks, Angel LaKease 09 December 2011 (has links)
This study investigated the implementation and impact of Reading First programs in 8 elementary schools across the state of Mississippi. The study assessed how principals, literacy coaches, and kindergarten through third grade teachers perceived the implementation of the Reading First program at their respective schools. Data from these three groups of research participants were analyzed to determine if there were differences in perceptions regarding program implementation. This study also examined whether there was a relationship between participants' judgments about implementation and second and third grade students' reading scores on the Mississippi Curriculum Test (MCT). This study employed descriptive, survey, causal-comparative, and correlational research. Descriptive data were used to describe research participants' gender, years of professional experience, highest degree held, and type of license held. Survey data were used to determine the perceptions of principals, literacy coaches, and teachers regarding the implementation fidelity of the Reading First program at their respective schools. An analysis of variance was used to determine if there were differences in the perceptions of the groups. Correlational statistics were used to analyze the possible existence of a relationship between principals', literacy coaches', and teachers' perceptions about implementation and second and third grade students' MCT reading scores. The study found that principals and literacy coaches perceived that the Reading First program was being fully implemented, but teachers believed that the program was being moderately implemented. There were no significant differences between the perceptions of principals, literacy coaches, and teachers. Moreover, the study revealed that principals, literacy coaches, and teachers had similar ratings regarding the implementation of specific Reading First program components.
There was no correlation between perceived implementation fidelity of the Reading First program and students' reading test scores on the MCT. Survey results revealed that most schools had fully implemented: (a) the uninterrupted, 90-minute reading block, (b) the 5 core elements of reading, (c) instructional strategies, and (d) support for struggling readers. Additionally, survey results indicated that schools need to strive toward fully implementing: (a) appropriate assessment strategies, (b) professional development activities that focus on reading instructional content, and (c) instructional support activities.
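As a rough illustration of the analysis design described above (a one-way ANOVA across the three respondent groups, plus a correlation between perceived implementation and MCT reading scores), the core computations might be sketched as follows. All numbers here are invented for demonstration; none come from the study itself.

```python
# Illustrative sketch of a one-way ANOVA and a Pearson correlation, the two
# statistical techniques the abstract describes. All data below are made up.
from statistics import mean

def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA: between-group mean square
    divided by within-group mean square."""
    all_vals = [x for g in groups for x in g]
    grand = mean(all_vals)
    k, n = len(groups), len(all_vals)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical implementation-fidelity ratings (1-5 scale) per school
principals = [4.5, 4.2, 4.8, 4.6, 4.4]
coaches    = [4.3, 4.4, 4.1, 4.5, 4.2]
teachers   = [3.2, 3.5, 3.1, 3.4, 3.3]

f = one_way_anova_f(principals, coaches, teachers)

# Hypothetical mean perceived implementation vs. reading scores per school
perceived  = [4.0, 3.9, 4.3, 4.1, 3.8]
mct_scores = [148, 152, 150, 149, 151]
r = pearson_r(perceived, mct_scores)

print(f"ANOVA F = {f:.2f}, Pearson r = {r:.2f}")
```

The F statistic would then be compared against an F distribution with (k-1, n-k) degrees of freedom to judge significance, which is where the study's "no significant differences" conclusion would come from.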
15

An Exploration of High-Fidelity Virtual Training Simulators on Learners' Self-Efficacy: A Mixed Methods Study

Holbrook, Heather Anne 02 May 2012 (has links)
In this world of fast-paced learning, training agencies often require their learners to acquire the knowledge and skills needed for a job at an expedited rate. Because of this rapid form of training, learners are sometimes uncertain about their abilities to execute task-based performances. This uncertainty can lead to a decrease in learners' self-efficacy on expected task performance. To support this training, trainers are using a variety of simulations and simulators to provide learners with valuable and necessary training experiences. This mixed methods study explored the influence of high-fidelity virtual training simulators on learners' self-efficacy. It used pre- and post-simulation-use surveys that combined general self-efficacy questions (Schwarzer & Jerusalem, 1995) and task-specific self-efficacy questions (Bandura, 1977, 1997, 2006; Bandura, Adams, Hardy, & Howells, 1980). This study had a sample size of 18 participants. It was assumed that providing learners with the vital experience needed to perform specific tasks in a high-fidelity virtual training simulator would increase their self-efficacy on task-specific criteria. Instead, through surveys, observations, and interviews, the research revealed a decrease in learners' self-efficacy due to heightened emotional arousal stemming from the learners' experiences with the level of realism the simulator provided, as well as with breakdowns within the simulator. The breakdowns and the realism were the aspects that most strongly influenced self-efficacy in this study. The significance of these findings shows that despite learners wanting to use high-fidelity virtual training simulators, improperly functioning simulators can negatively influence learners' self-efficacy in task-based performances. / Ph. D.
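The pre/post survey comparison described above reduces, in its simplest quantitative form, to examining paired differences in self-efficacy scores. A minimal sketch of that comparison follows; the scores are invented, and the thesis's actual instruments combined general and task-specific self-efficacy items rather than a single rating.

```python
# Illustrative paired pre/post comparison of self-efficacy ratings.
# The data are hypothetical, not from the study.
from statistics import mean, stdev

pre  = [3.8, 4.0, 3.5, 4.2, 3.9, 3.7, 4.1, 3.6]  # pre-simulation ratings
post = [3.5, 3.8, 3.2, 4.0, 3.6, 3.4, 3.9, 3.3]  # post-simulation ratings

diffs = [b - a for a, b in zip(pre, post)]
d_mean = mean(diffs)

# Paired t statistic: mean difference divided by its standard error
t = d_mean / (stdev(diffs) / len(diffs) ** 0.5)
print(f"mean change = {d_mean:.2f}, paired t = {t:.2f}")
```

A negative mean change and t statistic here would mirror the study's finding that self-efficacy decreased after simulator use.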
16

Identifying Critical Incidents That Helped or Hindered the Sustainment of Positive Behavior Interventions and Supports in Schools (PBIS) with Five Years or more of Implementation in One School Division

Dunbar, Michael Nathan 08 June 2020 (has links)
The purpose of this qualitative study was to identify critical incidents that helped or hindered the sustainment of Positive Behavior Interventions and Supports (PBIS) in schools with five years or more of implementation in one school division. This study highlighted information related to PBIS because of its comprehensive approach to school discipline. The research sought to answer the following questions: 1. What critical incidents do building-based leadership team members indicate have helped the sustainment of PBIS? 2. What critical incidents do building-based leadership team members indicate have hindered the sustainment of PBIS? Participants of this study included one representative from five different building-based PBIS leadership teams from a school division in the Commonwealth of Virginia. Data were collected and analyzed to determine building-based PBIS leadership team members' perceptions of what constituted the sustainment of PBIS, the connection between a school's Tiered Fidelity Inventory score and sustainment, key components of implementing PBIS with fidelity, the importance of implementing PBIS with fidelity, the most challenging sustainable elements of fidelity, and the least challenging sustainable elements of PBIS. This research will further contribute to the existing body of literature through a thorough description of critical incidents that have helped or hindered the sustainment of PBIS. Matthews, McIntosh, Frank, and May (2013) stated that fidelity is the degree to which a new initiative is delivered as intended in order for PBIS to be sustained. The research identified a need for a stronger understanding of the core components of fidelity, the establishment of a leadership team, staff buy-in, data-driven decision making, and capacity building. Participants also indicated the essential role of consistency in establishing and utilizing school-wide expectations.
Leadership team members emphasized the importance of administrative support from the initial stages of implementation through sustainment. In addition, the study found resources to be a vital component of the reward system of PBIS. Participants discussed struggles they endured without proper resources. Ongoing training was also designated as a critical component of sustaining PBIS. Participants similarly indicated that the consistent use of data was essential to setting goals and action planning. / Doctor of Education / The purpose of this qualitative study was to identify critical incidents that helped or hindered the sustainment of Positive Behavior Interventions and Supports (PBIS) in schools with five years or more of implementation in one school division. For purposes of this study, critical incidents are those key factors, both positive and negative, that impact the sustainment of PBIS. Educators often adopt new practices that fail because the roots of the concerns were never identified and addressed. PBIS' comprehensive approach to behavior adds value to the school climate; however, there is little research about sustaining PBIS. The researcher sought to identify these incidents as a means of helping educators sustain PBIS beyond a five-year period. The data from this study could be used to develop guidelines for sustaining PBIS. The researcher interviewed one building-based leadership team member from five different schools to gain an understanding of what constituted sustaining PBIS, key components of fidelity, the purpose of the Tiered Fidelity Inventory (TFI) score, and challenges of implementing PBIS with fidelity. This study sought to answer the following research questions: 1. What critical incidents do building-based leadership team members indicate have helped the sustainment of PBIS? 2. What critical incidents do building-based leadership team members indicate have hindered the sustainment of PBIS?
Participants of this study included one representative from five different building-based PBIS leadership teams from a school division in the Commonwealth of Virginia. Data were collected and analyzed to determine building-based PBIS leadership team members' perceptions of what constituted the sustainment of PBIS, the connection between a school's TFI score and sustainment, key components of implementing PBIS with fidelity, the importance of implementing PBIS with fidelity, the most challenging sustainable elements of fidelity, and the least challenging sustainable elements of PBIS. The TFI is an assessment tool used to determine the level of fidelity with which school personnel apply the core features of PBIS (VTSS, 2018). This research will further contribute to the existing body of literature through a thorough description of critical incidents that have helped or hindered the sustainment of PBIS. Matthews et al. (2013) stated that fidelity is the degree to which a new initiative is delivered as intended in order for PBIS to be sustained. This research found that a stronger understanding of the core components of fidelity needed to be established. Participants also indicated the essential role of consistency in establishing and utilizing school-wide expectations. Leadership team members emphasized the importance of administrative support from the initial stages of implementation through sustainment. In addition, this study found resources to be a vital component of the reward system of PBIS. Participants discussed struggles they endured without proper resources. Continuous training was also designated as a critical component of sustaining PBIS. Participants similarly indicated that the consistent use of data was essential to setting goals and action planning. These identified critical incidents could be used to create guidelines to help educators sustain PBIS beyond a five-year period.
17

Exploring the effects of different fidelities in an early design process of mobile prototyping

Danielsson, Pehr-Henric January 2010 (has links)
There is a vast body of research within the domain of user-centered design concerning the design process of an artifact, but questions are still being raised about the relatively new and successful field of mobile design. In recent years mobile technology has outpaced standardized thinking about how to prototype and evaluate such ubiquitous devices. The paper addresses this predicament by discussing aspects of fidelity differences in an early design process of a mobile design. The example presented refers to a design study of an iPhone application, in which two different types of prototypes were created and user tested: a low-fidelity paper prototype and a mixed-fidelity interactive prototype. The paper focuses on the various differences between these fidelities closely related to the case. It also reflects on how mobile design approaches can mature during the early stages of a design process.
19

Exploring the Effects of Higher-Fidelity Display and Interaction for Virtual Reality Games

McMahan, Ryan Patrick 05 January 2012 (has links)
In recent years, consumers have witnessed a technological revolution that has delivered more-realistic experiences in their own homes. Expanding technologies have provided larger displays with higher resolutions, faster refresh rates, and stereoscopic capabilities. These advances have increased the level of display fidelity—the objective degree of exactness with which real-world sensory stimuli are reproduced by a display system. Similarly, the latest generation of video game systems (e.g., Nintendo Wii and Xbox Kinect) with their natural, gesture-based interactions have delivered increased levels of interaction fidelity—the objective degree of exactness with which real-world interactions can be reproduced in an interactive system. Though this technological revolution has provided more realistic experiences, it is not completely clear how increased display fidelity and interaction fidelity impact the user experience because the effects of increasing fidelity to the real world have not been empirically established. The goal of this dissertation is to provide a better understanding of the effects of both display fidelity and interaction fidelity on the user experience. For the context of our research, we chose virtual reality (VR) games because immersive VR allows for high levels of fidelity to be achieved while games usually involve complex, performance-intensive tasks. In regard to the user experience, we were concerned with objective performance metrics and subjective responses such as presence, engagement, perceived usability, and overall preferences. We conducted five systematically controlled studies that evaluated display and interaction fidelity at contrasting levels in order to gain a better understanding of their effects. 
In our first study, which involved a 3D object manipulation game within a three-sided CAVE, we found that stereoscopy and the total size of the visual field surrounding the user (i.e., field of regard or FOR) did not have a significant effect on manipulation times but two high-fidelity interaction techniques based on six degrees-of-freedom (DOF) input outperformed a low-fidelity technique based on keyboard and mouse input. In our second study, which involved a racing game on a commercial game console, we solely investigated interaction fidelity and found that two low-fidelity steering techniques based on 2D joystick input outperformed two high-fidelity steering techniques based on 3D accelerometer data in terms of lap times and driving errors. Our final three studies involved a first-person shooter (FPS) game implemented within a six-sided CAVE. In the first of these FPS studies, we evaluated display fidelity and interaction fidelity independently, at extremely high and low levels, and found that both significantly affected strategy, performance, presence, engagement, and perceived usability. In particular, performance results were strongly in favor of two conditions: low-display, low-interaction fidelity (representative of desktop FPS games) and high-display, high-interaction fidelity (similar to the real world). In the second FPS study, we investigated the effects of FOR and pointing fidelity on the subtasks of searching, aiming, and firing. We found that increased FOR affords faster searching and that high-fidelity pointing based on 6-DOF input provided faster aiming than low-fidelity mouse pointing and a mid-fidelity mouse technique based on the heading of the user. In the third FPS study, we investigated the effects of FOR and locomotion fidelity on the subtasks of long-distance navigation and maneuvering. 
Our results indicated that increased FOR increased perceived usability but had no significant effect on actual performance, while low-fidelity keyboard-based locomotion outperformed our high-fidelity locomotion technique developed for our original FPS study. The results of our five studies show that increasing display fidelity tends to have a positive correlation to user performance, especially for some components such as FOR. Contrastingly, our results have indicated that interaction fidelity has a non-linear correlation to user performance, with users performing better with "traditional", extremely low-fidelity techniques and "natural", extremely high-fidelity techniques while performing worse with mid-fidelity interaction techniques. These correlations demonstrate that the display fidelity and interaction fidelity continua appear to have differing effects on the user experience for VR games. In addition to learning more about the effects of display fidelity and interaction fidelity, we have also developed the Framework for Interaction Fidelity Analysis (FIFA) for comparing interaction techniques to their real-world counterparts. There are three primary factors of concern within FIFA: biomechanical symmetry, control symmetry, and system appropriateness. Biomechanical symmetry involves the comparison of the kinematic, kinetic, and anthropometric aspects of two interactions. Control symmetry compares the dimensional, transfer function, and termination characteristics of two interactions.
System appropriateness is concerned with how well a VR system matches the interaction space and objects of the real-world task (e.g., a driving simulator is more appropriate than a 2D joystick for a steering task). Although consumers have witnessed a technological revolution geared towards more realistic experiences in recent years, we have demonstrated with this research that there is still much to be learned about the effects of increasing a system's fidelity to the real world. The results of our studies show that the levels of display and interaction fidelity are significant factors in determining performance, presence, engagement, and usability. / Ph. D.
20

Optical Superlenses: Quality and Fidelity in Silver-Dielectric Near-Field Imaging Systems

Moore, Ciaran Patrick January 2011 (has links)
In the year 2000 John Pendry described a new kind of lens that could focus both the propagating and evanescent components of light. This ‘super’ lens, which took the form of a thin slab of silver with a negative effective index of refraction under certain conditions, had the ability to reproduce images much smaller than the wavelength of light, seemingly in violation of the diffraction limit that governed the performance of conventional optics. Despite significant controversy regarding the purported operation of such superlenses, the first experimental samples were fabricated in 2005, with features as small as 63 nm successfully imaged with 365 nm light. These results put to rest disbelief in the feasibility of superlenses and ushered in an era of intense interest in near-field phenomena and negative index materials (NIMs). Despite sustained effort, progress on the practical implementation of superlenses was slow, with a further five years passing before improved experimental results were published. In the meantime, a proliferation of analytical and modelling studies appeared on the behaviour and properties of superlenses, as well as numerous suggestions for improved physical designs, very few of which had accompanying experimental evidence. The primary aim of this thesis arose from these many proposals, namely, to reconcile predictions made about the behaviour of superlenses with observed experimental results. The measurement of the theoretical and practical behaviour of superlenses is addressed in this thesis by the development of a set of characterisation metrics that can be used to describe the imaging performance of a number of near-field imaging systems. These metrics are initially calculated via transfer matrix modelling (TMM), which is a one-dimensional analytical technique traditionally used to find the transmission and reflection coefficients of planar structures. 
Two families of metrics are derived: one that describes imaging systems in terms of their abilities in generic situations, and another that gives the suitability of an imaging system for application to a given class of object. Transfer functions, bandwidth and peak wavenumber measurements form the first group of characterisation functions, while contrast, pseudo-contrast and correlation coefficients are used to assess the quality of imaging systems when exposed to well-defined input profiles. Both sets of metrics show that the performance of superlenses is highly application-specific, with the fidelity or otherwise of a generated image dependent more on the construction of the superlens than on the maximum spatial frequencies present in the object. The results from the characterisation metrics are also used to guide the design of hypothetical superlens structures; these suggest that sub-diffraction-limited resolution may still be available with almost a full wavelength separation between object and image. The quantitative accuracy of the TMM method is assessed by comparison to full-field vector simulations performed via finite element modelling (FEM); these reveal systematic inadequacies in the application of the TMM technique to superlensing applications. These inadequacies stem from near-field mask-lens interactions that are present in superlens experiments but are not accounted for in TMM calculations. A new technique, based on a modified transfer matrix model (M-TMM), is proposed that accounts for the effects between masks and superlenses by approximating masks as solid slabs of known thickness. Results generated via M-TMM are shown to be in better agreement with FEM models than similar TMM data, even when the duty cycle of the actual mask becomes significant and the approximation in M-TMM is at its most coarse. Finally, experiments are designed and executed that directly measure the transfer functions of superlenses and other near-field imaging techniques.
The problem of intimate contact between optics components, which normally hinders any such attempts to perform lithography in the near-field, is mitigated by including a flexible layer of poly(dimethylsiloxane) (PDMS) between various components in the mask:lens:resist stack. Furthermore, high spatial frequency data corresponding to low nanometre-scale features are retrieved from masks with periodic, micron-scale patterns, greatly easing the requirements on mask construction for these experiments. The end results show good agreement with FEM and M-TMM data and satisfy the aim of this thesis, which was to bridge the divide between the performance expected and experienced from silver superlenses.
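As a loose illustration of the transfer-matrix style of calculation the abstract describes, the TM (p-polarized) transmission of a single thin silver slab in vacuum can be evaluated across both propagating and evanescent spatial frequencies. This is a sketch under assumed parameters (the permittivity and thickness below are illustrative, not the thesis's values), using the standard single-slab Airy/transfer-matrix formula rather than the thesis's full model:

```python
# Minimal TM transmission function for a single slab in vacuum, the kind of
# 1D calculation a transfer matrix model (TMM) produces. Parameters are
# illustrative assumptions, not taken from the thesis.
import cmath
import math

def tm_transmission(kx, k0, eps_slab, d):
    """Single-slab TM transmission via the standard Airy formula:
    t = t12 * t23 * e^{i kz2 d} / (1 + r12 * r23 * e^{2 i kz2 d}),
    with vacuum (eps = 1) on both sides of the slab."""
    kz1 = cmath.sqrt(k0**2 - kx**2)             # normal wavevector, vacuum
    kz2 = cmath.sqrt(eps_slab * k0**2 - kx**2)  # normal wavevector, slab
    # TM Fresnel coefficients at the vacuum/slab and slab/vacuum interfaces
    r12 = (eps_slab * kz1 - kz2) / (eps_slab * kz1 + kz2)
    t12 = 2 * eps_slab * kz1 / (eps_slab * kz1 + kz2)
    r23 = (kz2 - eps_slab * kz1) / (kz2 + eps_slab * kz1)
    t23 = 2 * kz2 / (kz2 + eps_slab * kz1)
    phase = cmath.exp(1j * kz2 * d)
    return t12 * t23 * phase / (1 + r12 * r23 * phase**2)

wavelength = 365e-9            # near-UV illumination, as in early experiments
k0 = 2 * math.pi / wavelength
eps_ag = -1.0 + 0.4j           # illustrative silver permittivity near resonance
d = 40e-9                      # illustrative slab thickness

for kx_over_k0 in (0.5, 1.5, 3.0):  # propagating (<1) and evanescent (>1)
    t = tm_transmission(kx_over_k0 * k0, k0, eps_ag, d)
    print(f"kx/k0 = {kx_over_k0}: |t| = {abs(t):.3f}")
```

Sweeping kx yields exactly the kind of transfer function the characterisation metrics above operate on; a superlens shows |t| remaining appreciable (or even enhanced) for evanescent components with kx > k0, which an ordinary dielectric slab attenuates exponentially.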
