  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
51

Modelling and Control of Batch Processes

Aumi, Siam 04 1900 (has links)
<p>This thesis considers the problems of modelling and control of batch processes, a class of finite-duration chemical processes characterized by the absence of equilibrium conditions and by nonlinear, time-varying dynamics over a wide range of operating conditions. In contrast to continuous processes, the control objective in batch processes is to reach a non-equilibrium desired end-point, or product quality, by the batch termination time. These distinguishing features complicate the control problem and call for dedicated modelling and control tools. In the initial phase of this research, a predictive controller based on the novel concept of reverse-time reachability regions (RTRRs) is developed. An RTRR is defined as the set of states from which the process can be steered inside a desired end-point neighbourhood by batch termination, subject to input constraints and model uncertainty. An algorithm is developed to characterize these sets offline at each sampling instant; these characterizations subsequently play an integral role in the control design. A key feature of the resulting controller is that it requires the online computation of only the immediate control action while guaranteeing reachability to the desired end-point neighbourhood, rendering the control problem efficiently solvable even when using the nonlinear process model. Moreover, the use of RTRRs and a one-step-ahead control policy embeds important fault-tolerant characteristics in the controller. Next, we address the unavailability of reliable and computationally manageable first-principles process models by developing a new data-based modelling approach. In this approach, local linear models (identified via latent variable regression techniques) are combined with weights (arising from fuzzy c-means clustering) to describe the global nonlinear process dynamics. 
Nonlinearities are captured through the appropriate combination of the different models, while the linearity of the individual models keeps the resulting predictive controller computationally tractable. This modelling approach is also generalized to account for time-varying dynamics by incorporating online learning ability into the model, making it adaptive. This is accomplished by developing a probabilistic recursive least squares (PRLS) algorithm for updating a subset of the model parameters. The data-based modelling approach is first used to generate data-based RTRRs, which are subsequently incorporated into a new predictive controller. Next, the modelling approach is applied to a complex nylon-6,6 batch polymerization process in order to design a trajectory-tracking predictive controller for the key process outputs. Through simulations, the modelling approach is shown to capture the major process nonlinearities, and closed-loop results demonstrate the advantages of the proposed controller over existing options. Further simulation studies show that model adaptation (via the PRLS algorithm) is crucial for achieving acceptable control performance when encountering large disturbances in the initial conditions. Finally, we consider the problem of direct quality control when only limited quality-related measurements are available from the process; this situation typically calls for pursuing the control objective indirectly through trajectory tracking control. To address the unavailability of online quality measurements, an inferential quality model, which relates the process conditions over the entire batch duration to the final quality, is required. The accuracy of this type of quality model, however, is sensitive to the prediction of the future batch behaviour until batch termination. 
This "missing data" problem is handled by integrating the previously developed data-based modelling approach with the inferential model in a predictive control framework. The key feature of this approach is that the causality and nonlinear relationships between the future inputs and outputs are accounted for in predicting the final quality and computing the manipulated input trajectory. The efficacy of the proposed predictive control design is illustrated via simulations of the nylon-6,6 batch polymerization process with a different control objective from the one considered previously.</p> / Doctor of Philosophy (PhD)
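The multi-model idea above — local linear models blended by fuzzy c-means memberships to approximate nonlinear batch dynamics — can be sketched as follows. This is a minimal illustration, not the thesis's exact formulation: the function names, the two-cluster setup, and the use of the standard FCM membership formula are assumptions made for the example.

```python
import numpy as np

def fuzzy_weights(x, centers, m=2.0):
    """Fuzzy c-means style membership of state x in each cluster center.

    Uses the standard FCM membership formula; m > 1 is the fuzziness
    exponent. Returns nonnegative weights that sum to 1.
    """
    d = np.array([np.linalg.norm(x - c) for c in centers])
    if np.any(d == 0):                  # x coincides with a center
        w = (d == 0).astype(float)
        return w / w.sum()
    inv = d ** (-2.0 / (m - 1.0))
    return inv / inv.sum()

def predict_next_state(x, u, centers, models):
    """Blend local linear models x+ = A x + B u with fuzzy weights."""
    w = fuzzy_weights(x, centers)
    return sum(wi * (A @ x + B @ u) for wi, (A, B) in zip(w, models))

# Two illustrative local models identified around two operating points
centers = [np.array([1.0, 0.0]), np.array([5.0, 5.0])]
models = [(np.eye(2), np.array([[1.0], [0.0]])),        # model near centers[0]
          (2.0 * np.eye(2), np.array([[0.0], [1.0]]))]  # model near centers[1]

x, u = np.array([1.0, 0.0]), np.array([2.0])
x_next = predict_next_state(x, u, centers, models)      # x sits at centers[0]
```

Because each local model is linear, the blended one-step prediction is just a weighted sum of linear maps, which is what keeps the predictive control problem cheap relative to a full nonlinear first-principles model.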
52

A Multi-Site Case Study: Acculturating Middle Schools to Use Data-Driven Instruction for Improved Student Achievement

James, Rebecca C. 05 January 2011 (has links)
In the modern era of high-stakes accountability, test data have become much more than a simple comparison (Schmoker, 2006; Payne & Miller, 2009). The information provided in modern data reports has become an invaluable tool for driving instruction in classrooms. However, there is a lack of good training for educators in evaluating data and translating findings into solid practices that can improve student learning (Blair, 2006; Dynarski, 2008; Light, Wexler, & Heinze, 2005; Payne & Miller, 2009). Some schools are good at collecting data but often fall short in deciding what to do next. It is the role of the principal to serve as an instructional leader and guide teachers to answer the recurring question of "now what?" The purpose of this study was to investigate ways in which principals build successful data-driven instructional systems within their schools, using a qualitative multi-site case study method. This research used a triangulation approach with structured interviews, on-site visits, and document reviews involving various middle school supervisors, principals, and teachers. The findings are presented as four common themes identified as essential components administrators used to implement data-driven instructional systems to improve student achievement: 1) administrators must clearly define the vision and set the expectation of using data to improve student achievement, 2) administrators must take an active role in the data-driven process, 3) data must be easily accessible to stakeholders, and 4) stakeholders must devote time on a regular basis to the data-driven process. The four themes led to ten common steps administrators can use to acculturate their school or school division to the data-driven instruction process. / Ed. D.
53

Response to Intervention: A Case Study Documenting one Elementary School's Successful Implementation

Rowden, Elizabeth Szydlo 01 May 2020 (has links)
The use of Response to Intervention, more commonly referred to as RTI, has become more prevalent as school systems look for ways to bridge the opportunity gap and provide support to those students who are not successful in their attempts to access the general education curriculum. More research is needed to better understand not only how schools implement RTI, but also how they utilize data, monitor student progress, and help ensure fidelity of implementation. The purpose of this study was to examine and explain how one elementary school with a high-quality RTI program implemented Response to Intervention while keeping all three essential components in consideration. The findings demonstrate that the subject elementary school combined several elements of Response to Intervention and, in turn, created its own hybrid RTI model that drew on both the standard protocol model and the problem-solving model. To monitor student progress, universal screeners were administered several times throughout the year for both reading and math. Reading was also monitored through running records, PALS Quick Checks, Orton Gillingham assessments, and exit tickets, whereas math progress was tracked through formative assessments, anecdotal notes, and exit tickets. Each math and reading CLT met weekly to engage in dialogue around student data. An important finding is that the subject elementary school made RTI implementation decisions around what was best for its students, which allowed for a more flexible and adaptable approach. The system targeted individual student needs and helped ensure that all students had access to the supports necessary for success. 
/ Doctor of Education / As schools continue to face increasing demands, including how to meet the needs of students with diverse academic backgrounds, they have been charged with exploring new methods of ensuring that students are successful in their attempts to access the general education curriculum. Response to Intervention, more commonly referred to as RTI, has become more widely used as school systems continue to work to ensure success for all students. RTI is seen as a tool to help accurately identify students who have a learning disability (Ciolfi and Ryan, 2011); however, more research is needed to better understand how schools implement RTI, as well as how they utilize the data collected and monitor student progress. This qualitative case study analyzes how one subject elementary school implemented RTI, how it utilized data, and how it monitored the progress of its students.
54

Data-Based Decisions Guidelines for Teachers of Students with Severe Intellectual and Developmental Disabilities

Jimenez, Bree A., Mims, Pamela J., Browder, Diane M. 01 January 2012 (has links)
Effective practices in student data collection and implementation of data-based instructional decisions are needed for all educators, but are especially important when students have severe intellectual and developmental disabilities. Although research in the area of data-based instructional decisions for students with severe disabilities shows benefits of using data, there is limited research demonstrating that teachers in applied settings can acquire the required decision-making skills. The purpose of this research was to demonstrate how teachers from five states acquired a set of data-based decision implementation guidelines through online professional development. Recommendations for practice and future research are included.
55

Autoria de decisões pedagógicas informadas por dados sob a perspectiva de um MOOC.

PAIVA, Ranilson Oscar Araújo. 04 June 2018 (has links)
Vivemos uma mudança no paradigma educacional em que se busca prover educação de qualquer lugar, a qualquer momento e para qualquer pessoa, utilizando tecnologias digitais da informação e comunicação. Há um interesse global no ensino a distância, mas não há um aumento equivalente de suporte aos professores e tutores responsáveis por manter tais cursos. Essa deficiência acentua os problemas e desafios existentes na gestão de cursos a distância, pois com o tempo mais estudantes aderem ao ensino a distância, requerendo a criação de novos cursos e turmas e incrementando a quantidade de estudantes subservidos em termos de apoio pedagógico. Isso é evidenciado pelo grande número de desistências e falhas em tais cursos, particularmente nos MOOCs (Cursos Online Massivos e Abertos), que os estudantes justificam como falta de suporte. Esta tese propõe uma solução de autoria para guiar a tomada de decisões pedagógicas em ambientes online de aprendizagem, tendo sido aplicada em um MOOC. A proposta visa auxiliar professores e tutores a (1) descobrir situações de interesse pedagógico ocorrendo em seus cursos; (2) compreender essas situações; (3) tomar decisões para abordá-las e (4) monitorar e avaliar o impacto da decisão tomada. As interações em tais cursos geram grande quantidade e diversidade de dados, sendo possível extrair informações pedagogicamente relevantes. Entretanto, professores e tutores não dominam a tecnologia necessária para utilizar esses dados, nem é prático ou apropriado solicitar que sejam treinados para tanto. 
A proposta se fundamenta na tomada de decisões informada por dados educacionais, na visualização de dados e nos sistemas de autoria para promover a cooperação entre a inteligência artificial e a inteligência humana. Realizamos dois experimentos para: (1) avaliar se o processo auxilia professores e tutores a tomar decisões pedagógicas úteis e em tempo de curso; (2) avaliar se a solução de autoria guia a tomada de decisões pedagógicas de professores e tutores. Os resultados sugerem que o processo trouxe benefícios à tomada de decisões pedagógicas, que a solução de autoria foi capaz de guiar a tomada de decisões pedagógicas em tempo de curso e que sua utilidade e facilidade foram percebidas positivamente por professores e tutores. / We are experiencing a change in the educational paradigm: a quest to provide education from anywhere, at any time, and for anyone, using digital information and communication technologies. There is global interest in distance learning, but no equivalent increase in support for the teachers and tutors responsible for maintaining such courses. This shortfall accentuates the problems and challenges of managing distance courses, since, over time, more students join distance learning, requiring the creation of new courses and classes and increasing the number of distance students who are underserved in terms of pedagogical support. This is evidenced by the large number of dropouts and failures in such courses, particularly in MOOCs (Massive Open Online Courses), which students attribute to a lack of support. This thesis proposes an authoring solution to guide pedagogical decision making in online learning environments, applied here to a MOOC. The proposal aims to help teachers and tutors (1) discover situations of pedagogical interest occurring in their courses; (2) understand these situations; (3) make decisions to address them; and (4) monitor and evaluate the impact of the decisions made. 
The interactions in such courses generate a considerable quantity and diversity of data, allowing the extraction of pedagogically relevant information. However, teachers and tutors do not master the technology needed to use these data, nor is it practical or appropriate to ask them to be trained to do so. Our proposal builds on decision making informed by educational data, on data visualization, and on authoring systems to promote cooperation between artificial and human intelligence. We conducted two experiments to: (1) evaluate whether the process helps teachers and tutors make useful pedagogical decisions while the course is running; and (2) evaluate whether the authoring solution guides the pedagogical decision making of teachers and tutors. The results suggest that the process benefited pedagogical decision making, that the authoring solution was able to guide pedagogical decision making during the course, and that its usefulness and ease of use were perceived positively by teachers and tutors.
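As a toy illustration of step (1) above — discovering situations of pedagogical interest from course interaction data — one might flag students whose activity drops below a threshold. The function name, the threshold, and the login-count representation are assumptions made for this sketch, not part of the thesis.

```python
import numpy as np

def flag_inactive(logins, threshold=1.0):
    """Boolean mask of students whose mean weekly logins fall below
    `threshold` -- a minimal 'situation of interest' detector."""
    return logins.mean(axis=1) < threshold

# rows = students, columns = weeks of the course (illustrative counts)
logins = np.array([[5, 4, 6, 3],    # engaged
                   [1, 0, 0, 1],    # drifting away
                   [0, 0, 0, 0]])   # silent
at_risk = flag_inactive(logins)
```

A real system along the lines of the thesis would surface such flags through visualizations for the teacher to interpret, rather than act on them automatically.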
56

Estimation d'une matrice d'échelle. / Scale matrix estimation

Haddouche, Mohamed Anis 31 October 2019 (has links)
Beaucoup de résultats sur l’estimation d’une matrice d’échelle en analyse multidimensionnelle sont obtenus sous l’hypothèse de normalité (condition sous laquelle il s’agit de la matrice de covariance). Or il s’avère que, dans des domaines tels que la gestion de portefeuille en finance, cette hypothèse n’est pas très appropriée. Dans ce cas, la famille des distributions à symétrie elliptique, qui contient la distribution gaussienne, est une alternative intéressante. Nous considérons dans cette thèse le problème d’estimation de la matrice d’échelle Σ du modèle additif Y = M + E (Y de dimension p × m), d’un point de vue de la théorie de la décision. Ici, p représente le nombre de variables, m le nombre d’observations, M une matrice de paramètres inconnus de rang q < p et E un bruit aléatoire de distribution à symétrie elliptique, avec une matrice de covariance proportionnelle à I_m ⊗ Σ. Ce problème d’estimation est abordé sous la représentation canonique de ce modèle, où la matrice d’observation Y est décomposée en deux matrices, à savoir Z, de dimension q × p, qui résume l’information contenue dans M, et une matrice U, de dimension n × p, où n = m − q, qui résume l’information suffisante pour l’estimation de Σ. Comme les estimateurs naturels de la forme Σ̂_a = aS (où S = UᵀU et a est une constante positive) ne sont pas de bons estimateurs lorsque le nombre de variables p et le rapport p/n sont grands, nous proposons des estimateurs alternatifs de la forme Σ̂_{a,G} = a(S + SS⁺G(Z, S)), où S⁺ est l’inverse de Moore-Penrose de S (qui coïncide avec l’inverse S⁻¹ lorsque S est inversible). Nous fournissons des conditions sur la matrice de correction SS⁺G(Z, S) telles que Σ̂_{a,G} améliore Σ̂_a sous le coût quadratique L(Σ, Σ̂) = tr((Σ̂Σ⁻¹ − I_p)²) et sous une modification de ce dernier, à savoir le coût basé sur les données L_S(Σ, Σ̂) = tr(S⁺Σ(Σ̂Σ⁻¹ − I_p)²). Nous adoptons une approche unifiée des deux cas où S est inversible et où S est non inversible. 
À cette fin, une nouvelle identité de type Stein-Haff et un nouveau calcul sur la décomposition en valeurs propres de S sont développés. Notre théorie est illustrée par une grande classe d’estimateurs orthogonalement invariants et par un ensemble de simulations. / Numerous results on the estimation of a scale matrix in multivariate analysis are obtained under the Gaussian assumption (the condition under which it is the covariance matrix). However, in areas such as portfolio management in finance, this assumption is not well adapted. Thus, the family of elliptically symmetric distributions, which contains the Gaussian distribution, is an interesting alternative. In this thesis, we consider the problem of estimating the scale matrix Σ of the additive model Y = M + E (Y of dimension p × m) from a decision-theoretic point of view. Here, p is the number of variables, m the number of observations, M a matrix of unknown parameters with rank q < p, and E a random noise whose distribution is elliptically symmetric with covariance matrix proportional to I_m ⊗ Σ. It is more convenient to deal with the canonical form of this model, where Y is decomposed into two matrices, namely Z, of dimension q × p, which summarizes the information contained in M, and U, of dimension n × p, where n = m − q, which summarizes the information sufficient to estimate Σ. As the natural estimators of the form Σ̂_a = aS (where S = UᵀU and a is a positive constant) perform poorly when the number of variables p and the ratio p/n are large, we propose estimators of the form Σ̂_{a,G} = a(S + SS⁺G(Z, S)), where S⁺ is the Moore-Penrose inverse of S (which coincides with S⁻¹ when S is invertible). We provide conditions on the correction matrix SS⁺G(Z, S) such that Σ̂_{a,G} improves over Σ̂_a under the quadratic loss L(Σ, Σ̂) = tr((Σ̂Σ⁻¹ − I_p)²) and under the data-based loss L_S(Σ, Σ̂) = tr(S⁺Σ(Σ̂Σ⁻¹ − I_p)²). We adopt a unified approach to the two cases where S is invertible and where S is not. 
To this end, a new Stein-Haff-type identity and a calculus on the eigenstructure of S are developed. Our theory is illustrated with the large class of orthogonally invariant estimators and with simulations.
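The setup above can be sketched numerically, assuming Gaussian noise for simplicity: when p > n the matrix S = UᵀU is singular, so the Moore-Penrose inverse S⁺ stands in for S⁻¹. The dimensions, the diagonal choice of Σ, and the constant a = 1/n are illustrative assumptions, not values from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

p, n = 6, 4                                # more variables than observations
Sigma = np.diag(np.arange(1.0, p + 1.0))   # illustrative true scale matrix

# U: n x p observation matrix (Gaussian member of the elliptical family)
U = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
S = U.T @ U                                # p x p, rank <= n < p: singular

Sigma_a = (1.0 / n) * S                    # natural estimator with a = 1/n
S_plus = np.linalg.pinv(S)                 # Moore-Penrose inverse replacing S^-1
```

The improved estimators of the thesis add a correction term SS⁺G(Z, S) to S before scaling; the sketch stops at the baseline Σ̂_a, which is exactly the estimator the correction is designed to beat.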
57

Databaserade beslut i skolan : En kritisk diskursanalys av rektorers välgrundade beslutsprocess / Data-based Decisions in School : A Discourse analysis of Principals written Data-based Decision process

Linder, Susanne January 2020 (has links)
In this study, the aim is to review principals' data-based decision-making process by analysing the discourses expressed in the documentation of that process. Eighty-four decisions are reviewed, from 12 principals at 13 schools (pre-school class and years 1-9 of compulsory school). The theoretical starting point for the study is organisational theory and a model for critical discourse analysis (CDA) consisting of three interrelated processes of analysis. Four decision discourses are identified: an efficiency discourse, a knowledge discourse, a documentation discourse, and a problem-based discourse. The findings confirm previous research in the field but contradict the idea that the data-based decision-making process in schools follows fixed steps in a set order. The conclusions are that, despite guidelines and a strong focus on assessing results, the decision-making process seems to take directions other than those planned in policies. Some parts of the process seem particularly complicated, and schools would benefit from a clearly designed data-based decision process. / I den här studien är syftet att granska rektorers välgrundade eller databaserade beslutsprocess, vilket sker genom att analysera de diskurser som kommer till uttryck i rektorers dokumentation om analys av resultat och planerade åtgärder. Utifrån en flerdimensionell analysmodell granskas 84 planerade åtgärder, från 12 olika rektorer i 13 olika grundskolor. Den teoretiska och metodologiska utgångspunkten finns i kritisk diskursanalys och organisationsteori. Resultatet stämmer till viss del överens med och bekräftar tidigare forskningsresultat inom samma område, men det motsäger att den välgrundade beslutsprocessen går till i bestämda steg och i en viss ordning. 
I studien identifieras fyra beslutsdiskurser: effektivitetsdiskurs, kunskapsdiskurs, dokumentationsdiskurs och problembaserad diskurs. Slutsatserna är, att trots riktlinjer och ett starkt fokus på att bedöma resultat verkar beslutsprocessen ta andra vägar än det som planeras i direktiv och policys. Vissa delar är särskilt komplicerade och av den anledningen kan alla organisationsnivåer vinna på att ge en mer tydlig form till processen för de databaserade besluten.
58

Best way to go? Intriguing citizens to investigate what is behind smart city technologies

Tachtler, Franziska Maria January 2016 (has links)
The topic of smart cities is growing in importance. However, a field study in the city of Malmö, Sweden shows that there is a discrepancy between the ongoing activities of urban planners and companies using analytical and digital tools to interpret humans' behavior and preferences on the one hand, and the visibility of these developments in public spaces on the other. Citizens are affected by the invisible data and software not only when they use an application, but also when their living space is transformed. Through Research through Design, this thesis examines ways of triggering discussion about smart city issues that are hidden in software and code. A specific solution is developed: a public, tangible, and interactive visualization in the form of an interactive signpost. The final, partly functioning prototype is mountable in public places and points in the direction of the most beautiful walking path. The design refers to a smart city application that analyzes geo-tagged locative media and thereby predicts the beauty and security of a place. The aim is to trigger discussion about the contradictory issue of software interpreting the beauty of a place. Through its tangible, non-digital, and temporary character, the interactive representation encourages passers-by to interact with the prototype.
59

Développement des méthodes génériques d'analyses multi-variées pour la surveillance de la qualité du produit / Development of multivariate analysis methods for the product quality prediction

Melhem, Mariam 20 November 2017 (has links)
L’industrie microélectronique est un domaine compétitif, confronté de manière permanente à plusieurs défis. Pour évaluer les étapes de fabrication, des tests de qualité sont appliqués. Ces tests étant discontinus, une défaillance des équipements peut causer une dégradation de la qualité du produit. Des alarmes peuvent être déclenchées pour indiquer des problèmes. D’autre part, on dispose d’une grande quantité de données des équipements obtenues à partir de capteurs. Une gestion des alarmes, une interpolation des mesures de qualité et une réduction des données équipements sont nécessaires. Il s’agit dans notre travail de développer des méthodes génériques d’analyse multi-variée permettant d’agréger toutes les informations disponibles sur les équipements pour prédire la qualité du produit en prenant en compte la qualité des différentes étapes de fabrication. En nous basant sur le principe de reconnaissance de formes, nous avons proposé une approche pour prédire, en fonction des indices de santé des équipements, le nombre de produits restant à produire avant les pertes de performance liées aux spécifications clients. Notre approche permet aussi d’isoler les équipements responsables de la dégradation. De plus, une méthodologie à base de régression régularisée est développée pour prédire la qualité du produit tout en prenant en compte les relations de corrélation et de dépendance existant dans le processus. Un modèle pour la gestion des alarmes est construit, où des indices de criticité et de similarité sont proposés. Les données d’alarmes sont ensuite utilisées pour prédire le rejet de produits. Une application sur des données industrielles provenant de STMicroelectronics est fournie. / The microelectronics industry is a highly competitive field, constantly confronted with several challenges. To evaluate the manufacturing steps, quality tests are applied during and at the end of production. 
As these tests are discontinuous, a defect or failure of the equipment can cause a deterioration in product quality and a loss in manufacturing yield. Alarms are set off to indicate problems, but periodic alarms can be triggered, resulting in alarm floods. On the other hand, a large quantity of equipment data obtained from sensors is available. Alarm management, interpolation of quality measurements, and reduction of correlated equipment data are required. In our work, we aim to develop generic multivariate analysis methods for aggregating all the available information (equipment health indicators, alarms) to predict product quality, taking into account the quality of the various manufacturing steps. Based on the pattern-recognition principle, degradation trajectories are compared with health indices for failing equipment. The objective is to predict the remaining number of products before performance related to customer specifications is lost, and to isolate the equipment responsible for degradation. In addition, regression-based methods are used to predict product quality while taking into account the correlation and dependency relationships existing in the process. A model for alarm management is constructed, in which criticality and similarity indices are proposed. Alarm data are then used to predict product scrap. An application to industrial data from STMicroelectronics is provided.
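The regularized-regression step can be sketched with a closed-form ridge fit, which is one standard way to handle the correlated sensor channels described above. The synthetic data, the λ value, and all variable names are assumptions made for illustration; this is not the thesis's model or STMicroelectronics data.

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression: beta = (X'X + lam*I)^-1 X'y.

    The penalty keeps the fit stable when predictor columns are
    strongly correlated, as with redundant equipment sensors.
    """
    k = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(k), X.T @ y)

# Synthetic stand-in: 100 lots, 5 highly correlated health indicators
rng = np.random.default_rng(1)
base = rng.normal(size=(100, 1))
X = base + 0.1 * rng.normal(size=(100, 5))       # near-collinear columns
true_beta = np.array([0.5, -0.2, 0.1, 0.0, 0.3])
y = X @ true_beta + 0.01 * rng.normal(size=100)  # product-quality index

beta_hat = ridge_fit(X, y, lam=0.1)
mse = float(np.mean((X @ beta_hat - y) ** 2))
```

With near-collinear columns an unpenalized least-squares fit is numerically fragile; the small ridge penalty trades a little bias for a well-conditioned solve.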
60

On the design of customized risk measures in insurance, the problem of capital allocation and the theory of fluctuations for Lévy processes

Omidi Firouzi, Hassan 12 1900 (has links)
No description available.
