31 |
Statistically steady measurements of Rayleigh-Taylor mixing in a gas channel / Banerjee, Arindam. 30 October 2006 (has links)
A novel gas channel experiment was constructed to study the development of
high Atwood number Rayleigh-Taylor mixing. Two gas streams, one containing air
and the other containing helium-air mixture, flow parallel to each other separated by
a thin splitter plate. The streams meet at the end of the splitter plate, leading to the
formation of an unstable interface and buoyancy-driven mixing. This buoyancy-driven
mixing experiment allows for long data-collection times and short transients, and is
statistically steady. The facility was designed to be capable of large Atwood
number studies of A_t ~ 0.75. We describe work to measure the self-similar evolution
of mixing at density differences corresponding to 0.035 < A_t < 0.25. Diagnostics
include a constant-temperature hot-wire anemometer and high-resolution digital
image analysis. The hot-wire probe gives velocity, density, and velocity-density
statistics of the mixing layer. Two different multi-position single-wire techniques
were used to measure the velocity fluctuations in three mutually perpendicular
directions. Analysis of the measured data was used to explain the mixing as it
develops to a self-similar regime in this flow. These measurements are, to our knowledge, the first use of hot-wire anemometry in the Rayleigh-Taylor community.
Since the measurement involved extensive calibration of the probes in a binary gas
mixture of air and helium, a new convective heat transfer correlation was formulated
to account for variable-density low Reynolds number flows past a heated cylinder. In
addition to the hot-wire measurements, a digital image analysis procedure was used
to characterize various properties of the flow and also to validate the hot-wire
measurements. A test of statistical convergence was performed, and the study
revealed that statistical convergence was a direct consequence of the number of
distinct large three-dimensional structures averaged over the duration of
the run.
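The Atwood number that parameterizes this abstract is defined as A_t = (rho_heavy - rho_light) / (rho_heavy + rho_light). A minimal sketch, using standard handbook gas densities (not values from the thesis) and an ideal volume-weighted mixture rule as an assumption, shows why an air/helium facility tops out near A_t ~ 0.75:

```python
# Illustrative sketch only: densities are standard handbook numbers
# (approx. 20 C, 1 atm), not measurements from the thesis, and the
# volume-weighted mixture rule is an assumed idealization.
RHO_AIR = 1.204   # kg/m^3
RHO_HE = 0.166    # kg/m^3

def mixture_density(he_volume_fraction):
    """Density of a helium-air mixture from its helium volume fraction."""
    return he_volume_fraction * RHO_HE + (1.0 - he_volume_fraction) * RHO_AIR

def atwood(rho_heavy, rho_light):
    """Atwood number A_t = (rho_h - rho_l) / (rho_h + rho_l)."""
    return (rho_heavy - rho_light) / (rho_heavy + rho_light)

# Pure helium against air gives the facility's design limit, A_t ~ 0.75:
print(round(atwood(RHO_AIR, mixture_density(1.0)), 3))  # ~0.758
# A dilute helium stream lands in the measured range 0.035 < A_t < 0.25:
print(round(atwood(RHO_AIR, mixture_density(0.1)), 3))
```

Diluting the helium stream with air, as the abstract describes, is what moves the facility smoothly through the low-Atwood regime.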
|
32 |
A pilot study of test driven development / Aravindhan, Arasi. 21 February 2011 (has links)
Test Driven Development (TDD) is a software development technique that uses automated unit tests to drive software design and to force decoupling of dependencies. This report describes a pilot study conducted to understand the TDD process and to evaluate its pros and cons before adopting it fully across the software team. The goal of the pilot study was to use TDD principles to build part of a real-life software project, in particular to completely implement three user stories, and to evaluate the resulting software. The main questions discussed are: Is it feasible to adopt TDD in the development of a real-life system with databases and a UI? How easy is it to convert a user story into a set of unit tests? Can a set of unit tests adequately represent a user story, or are requirements lost in translation?
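The user-story-to-unit-test step the report examines can be sketched as follows. The "shopping cart" story and class names below are hypothetical stand-ins, not one of the report's three user stories; the point is the order of work: the tests exist first, and the implementation is the minimum that makes them pass.

```python
# Sketch of the TDD workflow: write failing unit tests from a user story,
# then write the minimal code to make them pass. The story here is a
# hypothetical example, not taken from the report.
import unittest

class ShoppingCart:
    """Minimal implementation written only after the tests below existed."""
    def __init__(self):
        self._items = {}

    def add(self, name, price):
        self._items[name] = price

    def total(self):
        return sum(self._items.values())

class TestUserStoryAddToCart(unittest.TestCase):
    """User story: 'As a shopper, I can add items and see a running total.'"""
    def test_empty_cart_totals_zero(self):
        self.assertEqual(ShoppingCart().total(), 0)

    def test_total_sums_item_prices(self):
        cart = ShoppingCart()
        cart.add("book", 12.50)
        cart.add("pen", 1.25)
        self.assertAlmostEqual(cart.total(), 13.75)
```

Run with `python -m unittest`; in TDD both tests are written (and fail) before `ShoppingCart` is implemented.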
|
33 |
Experimental investigation on evaporation induced convection in water using laser based measurement techniques / Song, Xudong. Unknown Date
No description available.
|
34 |
Smart Modeling of Drilling-Well in an Integrated Approach / Rahman, Shah Md Rajiur. Unknown Date
No description available.
|
35 |
Feature constraint grammars / Götz, Thilo. January 2000 (has links) (PDF)
Tübingen, University, dissertation, 1999.
|
36 |
Discrete Event Simulation of Operating Rooms Using Data-Driven Modeling / Malik, Mandvi. January 2018 (has links)
No description available.
|
37 |
An Examination of Mathematics Teachers’ Use of Student Data in Relationship to Student Academic Performance / Hartmann, Lillian Ann. 12 1900 (has links)
Among educational researchers, important questions are being asked about how to improve mathematics instruction for elementary students. This study, conducted in a north Texas public school with 294 third- through fifth-grade students, ten teachers, and three coaches, examined the relationship between students’ achievement in mathematics and the mathematics teaching and coaching instruction they received. Student achievement was measured by the Computer Adaptive Instrument (CAT), which is administered three times a year in the district and is the main criterion for students’ performance and movement in the district’s response-to-intervention program for mathematics. The response-to-intervention model employs student data to guide instruction and learning in the classroom and in supplemental sessions. The theoretical framework of the concerns-based adoption model (CBAM) was the basis for investigating the concerns that mathematics teachers and coaches had in using the CAT student data to inform their instruction. The CAT data, based on item response theory, was the innovation. Unique to this study was the empirical pairing of teachers’ and coaches’ concerns and use profiles for the data with student scores. Data were collected at three intervals through the Stages of Concerns Questionnaire, the Levels of Use interviews, and the Innovation Configuration Components Matrix, along with student CAT scaled scores at the same three intervals. Multiple regression analyses of the concerns against CAT scores and of the levels of use against CAT scores were conducted to determine whether relationships existed between the variables. The findings indicated that, overall, the teachers and coaches who scored high in personal concerns at the three data points remained at low levels of use, or non-use, of CAT data in their instruction. Only two teachers indicated movement from intense personal concerns to high concerns regarding the impact on students.
This correlated with their increased use of CAT data at the three collection points. The regression analyses indicated no correlations between the teachers’ and coaches’ concerns and the CAT, and none between their levels of data use and the CAT. At the exit interviews, patterns suggested that the presence of a change facilitator might have made a difference in their understanding and use of the CAT data, ultimately impacting student achievement. This study sets a new precedent in the use of CBAM data and offers insights into the necessity of providing support and training in a change process.
|
38 |
Data-Driven Modeling and Control of Batch and Continuous Processes using Subspace Methods / Patel, Nikesh. January 2022 (has links)
This thesis focuses on subspace-based data-driven modeling and control techniques for batch and continuous processes. Motivated by the increasing amount of process data, data-driven modeling approaches have become more popular. These approaches can improve on first-principles models through their ability to capture the true process dynamics. However, data-driven models rely solely on mathematical correlations and are subject to overfitting. As such, applying first-principles-based constraints to the subspace model can lead to better predictions and subsequently better control. This thesis demonstrates that the addition of process-gain constraints leads to a more accurate constrained model. In addition, this thesis shows that using the constrained model in a model predictive control (MPC) algorithm allows the system to reach desired setpoints faster. The novel MPC algorithm described in this thesis is specially designed as a quadratic program to include a feedthrough matrix. This matrix is traditionally ignored in industry; however, this thesis shows that its inclusion leads to more accurate process control.
Given the importance of accurate process data during model identification, the missing-data problem is another area that needs improvement. There are two main scenarios with missing data: infrequent sampling or sensor errors, and quality variables. In the infrequent-sampling case, data points are missing at set intervals, so correlating between different batches is not possible because the data are missing in the same place everywhere. The quality-variable case is different in that quality measurements require an additional expensive test, making them unavailable for over 90% of the observations at the regular sampling frequency. This thesis presents a novel subspace approach using partial least squares and principal component analysis to identify a subspace model. This algorithm is used to solve each case of missing data in both simulated (polymethyl methacrylate) and industrial (bioreactor) processes with improved performance. / Dissertation / Doctor of Philosophy (PhD) / An important consideration in chemical processes is the maximization of production and product quality. To that end, developing an accurate controller is necessary to avoid wasting resources and producing off-spec products. All advanced process control approaches rely on the accuracy of the process model; therefore, it is important to identify the best model. This thesis presents two novel subspace-based modeling approaches: the first uses first-principles-based constraints, and the second handles missing data. These models are then applied with a modified state-space model and a predictive control strategy to show that the improved models lead to improved control. The approaches in this work are tested on both simulated (polymethyl methacrylate) and industrial (bioreactor) processes.
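The feedthrough point above can be seen on a scalar toy model. The sketch below is an illustrative least-squares fit of y[k] = a*y[k-1] + b*u[k-1] + d*u[k], where the scalar d plays the role of the feedthrough matrix D that is often dropped; it is not the thesis's multivariable subspace algorithm, and the model coefficients are invented for the example.

```python
# Toy identification of y[k] = a*y[k-1] + b*u[k-1] + d*u[k] by least
# squares; d is a scalar stand-in for the feedthrough matrix D.
# System coefficients below are illustrative, not from the thesis.
import random

def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [rhs] for row, rhs in zip(A, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, 3):
            f = M[r][i] / M[i][i]
            M[r] = [x - f * y for x, y in zip(M[r], M[i])]
    x = [0.0] * 3
    for i in (2, 1, 0):
        x[i] = (M[i][3] - sum(M[i][j] * x[j] for j in range(i + 1, 3))) / M[i][i]
    return x

random.seed(0)
a_true, b_true, d_true = 0.8, 0.5, 0.2
u = [random.uniform(-1, 1) for _ in range(200)]
y = [0.0]
for k in range(1, 200):
    y.append(a_true * y[k - 1] + b_true * u[k - 1] + d_true * u[k])

# Regressors phi = [y[k-1], u[k-1], u[k]]; solve the normal equations.
rows = [[y[k - 1], u[k - 1], u[k]] for k in range(1, 200)]
targets = [y[k] for k in range(1, 200)]
AtA = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
Atb = [sum(r[i] * t for r, t in zip(rows, targets)) for i in range(3)]
a_hat, b_hat, d_hat = solve3(AtA, Atb)
print(f"a={a_hat:.3f}, b={b_hat:.3f}, d(feedthrough)={d_hat:.3f}")
```

Dropping the u[k] regressor here would fold the instantaneous input effect into the other coefficients, which mirrors the prediction error the thesis attributes to ignoring D.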
|
39 |
AN EVENT-BASED APPROACH TO DEMAND-DRIVEN DYNAMIC RECONFIGURABLE COMPUTING / LEE, TAI-CHUN. 11 October 2001 (has links)
No description available.
|
40 |
A Local Grammar of Cause and Effect : A Corpus-driven Study / Allen, Christopher. January 2005 (has links)
This thesis puts forward a specialized, functional grammar of cause and effect within the sub-genre of biomedical research articles. Building on research into the local grammars of dictionary definitions and evaluation, the thesis describes the application of a corpus-driven methodology to the description of the principal lexical grammatical patterns which underpin causation in scientific writing. The source of data is the 2-million-word Halmstad Biomedical Corpus constructed from 589 on-line research articles published since 1997. These articles were sampled in accordance with a standard library classification system across the broad spectrum of the biomedical research literature. On the basis of lexical grammatical patterns identified in the corpus, a total of five functional sub-types of causation are put forward. The local grammar itself is a description of these sub-types based on the Hallidayian notion of system along the syntagm, coupled with the identification of the paradigmatic contents of these systems as a closed set of 37 semantic categories specific to the biomedical domain. A preliminary evaluation of the grammar is then offered in terms of hand-parsing experiments using a test corpus. Finally, potential NLP applications of the grammar are described in terms of on-line information extraction, ontology building, and text summarization.
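The information-extraction application mentioned above can be sketched with a single causative frame. The pattern and sentence below are hypothetical and far simpler than the thesis's five sub-types and 37 semantic categories; they only illustrate the idea of matching a causation marker and labeling its cause and effect slots.

```python
# Toy sketch of local-grammar-style causation extraction: match a few
# causative markers and label the slots around them. Patterns and the
# example sentence are invented for illustration.
import re

CAUSATION = re.compile(
    r"(?P<cause>[\w\s-]+?)\s+"
    r"(?P<marker>leads to|results in|induces|causes)\s+"
    r"(?P<effect>[\w\s-]+)",
    re.IGNORECASE,
)

def extract_causation(sentence):
    """Return {'cause', 'marker', 'effect'} for the first match, else None."""
    m = CAUSATION.search(sentence)
    if m is None:
        return None
    return {"cause": m.group("cause").strip(),
            "marker": m.group("marker"),
            "effect": m.group("effect").strip()}

print(extract_causation("Prolonged hypoxia induces expression of VEGF."))
```

A real system of this kind would map the matched slots onto a closed set of domain-specific semantic categories rather than leaving them as raw strings.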
|