41 |
Multilevel Analysis of a Scale Measuring Educators’ Perceptions of Multi-Tiered Systems of Supports Practices
Marshall, Leslie Marie, 01 July 2016
This study aimed to provide evidence of reliability and validity for the 42-item Perceptions of Practices Survey. The scale was designed to assess educators’ perceptions of the extent to which their schools were implementing multi-tiered system of supports (MTSS) practices. The survey was initially given as part of a larger evaluation project of a 3-year, statewide initiative designed to evaluate MTSS implementation. Elementary educators (Level-1 n = 2,109, Level-2 n = 62) completed the survey in September/October of 2007, September/October of 2008 (Level-1 n = 1,940, Level-2 n = 61), and January/February of 2010 (Level-1 n = 2,058, Level-2 n = 60). Multilevel exploratory and confirmatory factor analysis procedures were used to examine the construct validity and reliability of the instrument. Results supported a correlated four-factor model: Tiers I & II Problem Solving, Tier III Problem Identification, Tier III Problem Analysis & Intervention Procedures, and Tier III Evaluation of Response to Intervention. Composite reliability estimates for all factors across the three years approximated or exceeded .84. Additionally, relationships were found between the Perceptions of Practices Survey factors and another measure of MTSS implementation, the Tiers I & II Critical Components Checklist. Implications for future research regarding the psychometric properties of the survey and for its use in schools are discussed.
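The composite reliability estimates reported above can be reproduced from standardized factor loadings. Below is a minimal sketch of that calculation; the loadings are hypothetical placeholders, not values from the study.

```python
# Minimal sketch: composite reliability (McDonald's omega) from standardized
# factor loadings. The loadings below are hypothetical, not from the survey.

def composite_reliability(loadings):
    # (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)
    total = sum(loadings)
    error_var = sum(1 - l ** 2 for l in loadings)  # standardized items
    return total ** 2 / (total ** 2 + error_var)

tiers_1_2_problem_solving = [0.78, 0.81, 0.74, 0.69, 0.83]  # hypothetical
print(f"omega = {composite_reliability(tiers_1_2_problem_solving):.2f}")
```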
|
42 |
Response to Intervention: EasyCBM and AIMSweb Intervention Programs: How They Relate to Student Growth
Hopson, George T., 01 August 2021
This researcher aimed to determine how data collected from computer-based assessment programs, specifically EasyCBM and AIMSweb, were used in data-driven instruction and in identifying risk levels in math and reading proficiency. Intervention program data were collected from six participating high schools, including math and reading universal screening scores and risk-indicator levels from the Tier 2 and Tier 3 levels of their response to intervention (RTI) programs. Section A included math data with a baseline score and a risk indicator level; Section B included reading data with a baseline score and a risk indicator level.
A descriptive quantitative study was conducted to determine whether significant differences exist between EasyCBM and AIMSweb in student universal screener scores over an academic year. Independent variables included math and reading universal screener scores, tier-level identifiers, and levels of risk indicators. Factors that influenced effectiveness included interventionists' use of data, students' entry tier levels, and time spent in intervention from the fall to the winter benchmarking period.
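As a rough illustration of the descriptive comparison described above, the sketch below computes fall-to-winter growth in universal screener scores by subject and tier. The file and column names are hypothetical, not from the study's data collection.

```python
# Sketch: fall-to-winter growth in universal screener scores, summarized by
# subject and RTI tier. File and column names are hypothetical.
import pandas as pd

df = pd.read_csv("screener_scores.csv")  # hypothetical EasyCBM/AIMSweb export
df["growth"] = df["winter_score"] - df["fall_score"]

summary = df.groupby(["subject", "tier"])["growth"].agg(["mean", "std", "count"])
print(summary)
```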
|
43 |
Data-Based Decision-Making in the Development of an RTI Certificate Program for Pre-Service Teachers
Hale, Kimberly D.; Hudson, Tina, 01 January 2016
No description available.
|
44 |
Modeling for Control Design of an Axisymmetric Scramjet Engine Isolator
Zinnecker, Alicia M., 18 December 2012
No description available.
|
45 |
An investigation on automatic systems for fault diagnosis in chemical processes
Monroy Chora, Isaac, 03 February 2012
Plant safety is the most important concern of chemical industries. Process faults can cause economic losses as well as human and environmental damage. Most operational faults are normally considered in the process design phase by applying methodologies such as Hazard and Operability Analysis (HAZOP). However, failures must still be expected in an operating plant. For this reason, it is of paramount importance that plant operators can promptly detect and diagnose such faults in order to take the appropriate corrective actions. In addition, preventive maintenance needs to be considered in order to increase plant safety.
Fault diagnosis has been approached with both analytical and data-based models, using several techniques and algorithms. However, there is not yet a general fault diagnosis framework that joins detection and diagnosis of faults, whether or not those faults are registered in historical records. Moreover, few efforts have focused on automating and implementing the reported approaches in real practice.
Against this background, this thesis proposes a general framework for data-driven Fault Detection and Diagnosis (FDD) that is applicable, and amenable to automation, in any industrial scenario in order to maintain plant safety. The main requirement for constructing this system is the existence of historical process data. In this sense, promising methods imported from the machine learning field are introduced as fault diagnosis methods. The learning algorithms, used as diagnosis methods, have proved capable of diagnosing not only the modeled faults but also novel ones. Furthermore, Risk-Based Maintenance (RBM) techniques, widely used in the petrochemical industry, are proposed for application as part of preventive maintenance in all industry sectors. The proposed FDD system, together with an appropriate preventive maintenance program, would represent a potential plant safety program to be implemented.
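As a loose sketch of the two-part scheme described above (a supervised diagnoser for registered faults plus an anomaly detector for novel ones), the following uses scikit-learn on synthetic data; it is an illustration under assumed data shapes, not the thesis implementation.

```python
# Illustrative sketch of the FDD idea above: a classifier diagnoses known
# (registered) faults, while an anomaly detector flags behaviour that matches
# no learned class. Data shapes and labels are assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest, RandomForestClassifier

rng = np.random.default_rng(0)
X_hist = rng.normal(size=(500, 10))    # historical process measurements
y_hist = rng.integers(0, 3, size=500)  # 0 = normal, 1/2 = registered faults

diagnoser = RandomForestClassifier(n_estimators=100).fit(X_hist, y_hist)
novelty = IsolationForest(random_state=0).fit(X_hist)  # models known behaviour

x_new = rng.normal(size=(1, 10))       # incoming on-line sample
if novelty.predict(x_new)[0] == -1:    # -1 = anomalous w.r.t. history
    print("Novel fault: no registered class matches this behaviour")
else:
    print("Diagnosis:", diagnoser.predict(x_new)[0])
```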
Chapter one presents a general introduction to the thesis topic, as well as its motivation and scope. Chapter two reviews the state of the art of the related fields: fault detection and diagnosis methods found in the literature are examined, and a taxonomy that joins the Artificial Intelligence (AI) and Process Systems Engineering (PSE) classifications is proposed. The assessment of fault diagnosis with performance indices is also reviewed. Moreover, the state of the art of Risk Analysis (RA), as a tool for taking corrective actions against faults, and of Maintenance Management, for preventive actions, is presented. Finally, the benchmark case studies against which FDD research is commonly validated are examined in this chapter.
The second part of the thesis, comprising chapters three to six, addresses the methods applied during the research. Chapter three deals with data pre-processing, chapter four with the feature processing stage, and chapter five with the diagnosis algorithms, while chapter six introduces the Risk-Based Maintenance techniques for addressing plant preventive maintenance. The third part consists of chapter seven, which constitutes the core of the thesis. In this chapter the proposed general FDD system is outlined, divided into three steps: diagnosis model construction, model validation, and on-line application. This scheme includes a fault detection module and an Anomaly Detection (AD) methodology for the detection of novel faults. Furthermore, several approaches are derived from this general scheme for continuous and batch processes. The fourth part of the thesis presents the validation of the approaches: chapter eight presents the validation in continuous processes, and chapter nine the validation of the batch process approaches. Chapter ten applies the AD methodology to real-scale batch processes: first to a laboratory heat exchanger and then to a Photo-Fenton pilot plant, which corroborates its potential and success in real practice. Finally, the fifth part, comprising chapter eleven, is dedicated to the final conclusions and the main contributions of the thesis. The scientific production achieved during the research period is also listed, and prospects for further work are outlined.
|
46 |
The Relationship Between Systems-Change Coaching and Levels of Implementation and Fidelity of Problem-Solving/Response to Intervention (PS/RtI)
March, Amanda, 01 January 2011
This study examined the extent to which coaching facilitates the successful implementation of the Problem-Solving/Response to Intervention (PS/RtI) model in schools, as well as the extent to which coaching enhances the fidelity of implementation of PS/RtI practices in those schools. Data from 34 schools in seven districts participating in three years of a statewide initiative to implement PS/RtI practices with the assistance of a PS/RtI coach were used to evaluate the relationship between coaching activities and levels of implementation and integrity outcomes. Data on various coaching-related factors (i.e., perceived coaching quality, coach continuity, and the frequency and duration of training and technical assistance), educator beliefs and perceived skills, and PS/RtI implementation and fidelity levels were collected and examined using a series of multilevel modeling (MLM) procedures. Results suggest that a number of coaching variables were related to growth in specific measures of PS/RtI implementation and fidelity over time. Specifically, shorter, more frequent training sessions were related to higher levels of staff consensus and fidelity of problem-analysis implementation over time, after controlling for the quality of the coaching delivered. Growth in PS/RtI implementation over time was predicted positively by the continuity of the coaching received (the degree to which coaching was delivered by the same individual over the three years of the study). Educators' perceptions of their own PS/RtI skills related to manipulation of data and use of technology predicted increases in fidelity of problem-identification implementation over time, after controlling for quality of coaching. Fidelity of program evaluation/RtI implementation was predicted by the quality of coaching received across time. The relationships between coaching and infrastructure development, and between coaching and fidelity of intervention development and implementation, were unclear. Potential explanations for the findings from this exploratory study and implications for future research are discussed.
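A growth model of the kind used here can be sketched with statsmodels' MixedLM: repeated fidelity measures nested within schools, with coaching predictors. The dataset and variable names below are hypothetical, not the study's actual analysis code.

```python
# Sketch of a two-level growth model: fidelity measured over time, nested
# within schools, predicted by coaching variables. Names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("coaching_fidelity.csv")  # hypothetical long-format data

model = smf.mixedlm(
    "fidelity ~ time * coach_continuity + coaching_quality",
    data=df,
    groups=df["school_id"],   # level-2 units: schools
    re_formula="~time",       # random intercept and slope for time
)
print(model.fit().summary())
```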
|
47 |
Investigating variability in student performance on DIBELS Oral Reading Fluency third grade progress monitoring probes: Possible contributing factors
Briggs, Rebecca N., 06 1900
xv, 109 p. : col. ill. / The current study investigated variability in student performance on DIBELS Oral Reading Fluency (DORF) progress monitoring passages for third grade and sought to determine to what extent the variability in weekly progress monitoring scores is related to passage-level factors (e.g., type of passage [i.e., narrative or expository], readability of the passage, reading rate for words in lists, passage-specific comprehension, background knowledge, and interest in the topic of the passage) and student-level factors (e.g., the student's initial skill and variability across benchmark passages).
In light of recent changes in IDEIA legislation allowing for the use of Response to Intervention models and formative assessment practices in the identification of specific learning disabilities, it was the intent of this study to identify factors associated with oral reading fluency that could potentially be altered or controlled during progress monitoring and decision-making, allowing for more defensible educational decisions.
The sample for analysis included 70 third-grade students from one school in Iowa. Results of two-level HLM analyses indicated significant effects for background knowledge, interest in the passage, type of passage, retell fluency, readability, and word reading, with type of passage and readability demonstrating the largest effects. Magnitude of effect was based on the proportion of reduction in level-1 residual variance. At level 2, initial risk status demonstrated a significant effect on a student's initial oral reading fluency score, while benchmark variability demonstrated a significant effect on a student's growth over time.
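The "proportion of reduction in level-1 residual variance" used as the magnitude measure above is the standard multilevel pseudo-R²; reconstructed here under that common definition (not quoted from the dissertation):

```latex
% Pseudo-R^2 at level 1: proportional reduction in residual variance
% when moving from the unconditional (null) model to the fitted model.
R^2_{\text{level-1}} = \frac{\sigma^2_{\text{null}} - \sigma^2_{\text{model}}}{\sigma^2_{\text{null}}}
```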
Results support readability as an indicator of passage difficulty as it relates to predicting students' oral reading fluency, and suggest that consideration of the type of passage may be warranted when interpreting student ORF scores. Additionally, results indicated possible student-level effects of variables such as background knowledge and word-list reading that were not investigated within the current study. Limitations of the study, considerations for future research, and implications for practice are discussed. / Committee in charge: Roland Good, Chairperson/Advisor;
Laura Lee McIntyre, Member;
Joe Stevens, Member;
Robert Davis, Outside Member;
Scott Baker, Member
|
48 |
A Case Study of the Impact of the Middle School Data Coach on Teacher Use of Educational Test Data to Change Instruction
Hill, Rachelle Phelps, 12 1900
With the advent of No Child Left Behind (NCLB) legislation in 2002 and its attendant increases in accountability pressure, many districts and schools currently embrace data analysis as an essential part of the instructional decision-making process. In their attempts to overcome low achievement on state-mandated tests, some districts have begun employing data coaches. The study reported here, set in three middle schools in a northeast Texas school district, assessed the influence of the campus data coach on middle school mathematics teachers' use of analyzed data to make instructional decisions. It also examined the extent to which the Data Coach/teacher relationship resolved teacher concerns about data-driven decision making. Phenomenological interviews with data coaches were guided by Seidman's (2006) three-interview series. Measurement of teacher use of data to make decisions was based on the concerns-based adoption model's levels-of-use interview protocol, stages-of-concern questionnaire, and innovation configuration map. By the end of one school year, two of the three teachers had never used data to make instructional decisions, although both non-users had moved closer toward employing the innovation in their classrooms. Data indicated all teachers were aware of the innovation, but all three ended the study with high personal concerns, signifying that the minimal efforts made by the data coaches to resolve concerns were not successful. This study's small sample offers an in-depth glimpse into the process of implementing data-based instructional decision making, and the Data Coach position, on three middle school campuses in one large northeast Texas district.
|
49 |
Simulations and data-based models for electrical conductivities of graphene nanolaminates
Rothe, Tom, 13 August 2021
Graphene-based conductor materials (GCMs) consist of stacked and decoupled layers of graphene flakes and could potentially transfer graphene’s outstanding material properties, like its exceptional electrical conductivity, to the macro scale, where alternatives to the heavy and expensive metallic conductors are desperately needed. To reach super-metallic conductivity, however, a systematic optimization of the electrical conductivity with respect to the structural and physical input parameters is required. A recent trend in the field of process and material optimization is data-based models, which utilize data science methods to quickly identify and abstract information and relationships from the available data. In this work, such data-based models for the conductivity of a real GCM thin-film sample are built on data generated with an especially improved and extended version of the network simulation approach by Rizzi et al. [1, 2, 3]. Appropriate methods for creating data-based models of GCMs are introduced, and typical challenges during the modelling process are addressed, so that data-based models for other properties of GCMs can easily be created as soon as sufficient data are accessible. Combined with experimental measurements by Slawig et al. [4], the created data-based models allow for a coherent and comprehensive description of the thin-film’s electrical parameters across several length scales. (A minimal sketch of the regression estimators named in the outline follows the table of contents below.)

List of Figures
List of Tables
Symbol Directory
List of Abbreviations
1 Introduction
2 Simulation approaches for graphene-based conductor materials
2.1 Traditional simulation approaches for GCMs
2.1.1 Analytical model for GCMs
2.1.2 Finite element method simulations for GCMs
2.2 A network simulation approach for GCMs
2.2.1 Geometry generation
2.2.2 Electrical network creation
2.2.3 Contact and probe setting
2.2.4 Conductivity computation
2.2.5 Results obtained with the network simulation approach
2.3 An improved implementation for the network simulation
2.3.1 Rizzi’s implementation of the network simulation approach
2.3.2 A network simulation tool for parameter studies
2.3.3 Extending the network simulation approach for anisotropy investigations and multilayer flakes
3 Data-based material modelling
3.1 Introduction to data-based modelling
3.2 Data-based modelling in material science
3.3 Interpretability of data-based models
3.4 The data-based modelling process
3.4.1 Preliminary considerations
3.4.2 Data acquisition
3.4.3 Preprocessing the data
3.4.4 Partitioning the dataset
3.4.5 Training the model
3.4.6 Model evaluation
3.4.7 Real-world applications
3.5 Regression estimators
3.5.1 Mathematical introduction to regression
3.5.2 Regularization and ridge regression
3.5.3 Support Vector Regression
3.5.4 Introducing non-linearity through kernels
4 Data-based models for a real GCM thin-film
4.1 Experimental measurements
4.2 Simulation procedure
4.3 Data generation
4.4 Creating data-based models
4.4.1 Quadlinear interpolation as benchmark model
4.4.2 KR, KRR and SVR
4.4.3 Enlarging the dataset
4.4.4 KR, KRR and SVR on the enlarged training dataset
4.5 Application to the GCM sample
5 Conclusion and Outlook
5.1 Conclusion
5.2 Outlook
Acknowledgements
Statement of Authorship
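As noted at the end of the abstract, here is a minimal sketch of the regression estimators named in the outline (kernel ridge regression and support vector regression as data-based conductivity models), fitted to synthetic stand-in data. The feature meanings are assumptions for illustration; this is not the thesis code.

```python
# Sketch: fit KRR and SVR as data-based models mapping (assumed) structural
# input parameters to conductivity. Data here are synthetic stand-ins.
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR

rng = np.random.default_rng(1)
X = rng.uniform(size=(400, 4))  # e.g. flake size, overlap, doping, thickness
y = X @ np.array([1.5, 0.8, 2.0, 0.3]) + 0.05 * rng.normal(size=400)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for name, est in [("KRR", KernelRidge(kernel="rbf", alpha=1e-2)),
                  ("SVR", SVR(kernel="rbf", C=10.0, epsilon=0.01))]:
    est.fit(X_tr, y_tr)
    print(name, "held-out R^2:", round(est.score(X_te, y_te), 3))
```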
|
50 |
Data driven support of production through increased control of information parameters: A case study
Cavallin, Petter, January 2020
The current manufacturing business environment is becoming more dependent on digital tools to increase business opportunities and customer value. An organization's ability to embrace digital tools depends on its digital maturity position. Organizational structure, information systems, and communication are variables that affect this position and enable or disable data-based decision-making (DBDM). To determine this ability, the information systems and information flow were analyzed through a case study at one of the production departments. The areas studied were the information flows for metal powder and compression tools, both common in a general production setting. The case study examined the organization's ability to connect information, traced the information flow, and assessed potential information disruption using digital maturity assessments; the result provides insight into how these factors affect DBDM abilities within the department.

The metal powder area was analyzed through an experiment in which the metal powder containers were manually weighed and the real weight compared with the depreciated weight recorded in the information system. The compression tool analysis was performed by extracting and analyzing structured and unstructured machine data from production. This analytical angle depends on reliable data, and information disruption between the production processes and the servers was noticed during the extraction; the extraction and analysis resemble what is needed when implementing machine learning and other automatic applications. The 360DMA assessment gives a general view of the organization's position and provides a method for reaching the goals required to advance through its five levels; the Acatech model is used to assess two structural areas, resources and information systems.

The metal powder container analysis shows a discrepancy between the weights stored in the information systems and the actual weights of the containers. The compression tool analysis found that the stored data about the compression tools, including the counts of the different components, was not correct. These issues, together with difficulties in manually and automatically extracting data from the servers, cause information disruptions and decrease the reliability and validity of production process information, which in turn decreases the ability to use production data for data-driven decisions and insights. The digital maturity assessment positions the organization at the connectivity level (Acatech model) for information systems and resources, meaning that data is unreliable; once it becomes reliable, the next level is within reach. The varying position within the 360DMA model calls for management to synchronize development between processes by introducing strategies, defining responsibilities, and understanding the information flow.
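A small sketch of the metal powder check described above: comparing manually measured container weights against the depreciated weights recorded in the information system. The file, column names, and tolerance are hypothetical.

```python
# Sketch: flag containers whose measured weight deviates from the weight
# recorded (depreciated) in the information system. Names are hypothetical.
import pandas as pd

df = pd.read_csv("powder_containers.csv")  # hypothetical: one row per container
df["discrepancy_kg"] = df["measured_weight_kg"] - df["system_weight_kg"]

tolerance = 1.0  # assumed acceptable deviation in kg
flagged = df[df["discrepancy_kg"].abs() > tolerance]
print(f"{len(flagged)} of {len(df)} containers deviate by more than {tolerance} kg")
```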
|