41

Toward Error-Statistical Principles of Evidence in Statistical Inference

Jinn, Nicole Mee-Hyaang 02 June 2014 (has links)
The context for this research is statistical inference, the process of making predictions or inferences about a population from observation and analysis of a sample. In this context, many researchers want to know which inferences are valid, in the sense of being defensible by argument or evidence. Another pressing question among users of statistical methods is: how can spurious relationships be distinguished from genuine ones? Underlying both of these issues is the concept of evidence. In response to these (and similar) questions, the two questions I work on in this essay are: (1) what is a genuine principle of evidence? and (2) do error probabilities have more than a long-run role? Concisely, I propose that felicitous genuine principles of evidence should provide concrete guidelines on precisely how to examine error probabilities, with respect to a test's aptitude for unmasking pertinent errors, thereby establishing sound interpretations of results from statistical techniques. The starting point for my definition of genuine principles of evidence is Allan Birnbaum's confidence concept, an attempt to control misleading interpretations. However, Birnbaum's confidence concept is inadequate for interpreting statistical evidence, because using only pre-data error probabilities does not pick up on a test's ability to detect a discrepancy of interest (e.g., "even if the discrepancy exists") with respect to the actual outcome. Instead, I argue that Deborah Mayo's severity assessment is the most suitable characterization of evidence under my definition of genuine principles of evidence. / Master of Arts
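As an illustration of the severity assessment mentioned in the abstract, the minimal sketch below (not from the thesis) computes post-data severity for Mayo's standard one-sided Normal test T+ (H0: mu <= mu0 versus H1: mu > mu0, sigma known). The sample size, sigma and observed mean are hypothetical, and only scipy is assumed.

    from scipy.stats import norm

    def severity(xbar, mu1, sigma, n):
        """Severity of the claim 'mu > mu1' given observed mean xbar:
        SEV = P(X-bar <= xbar; mu = mu1), i.e. the probability the test would have
        produced a result less indicative of 'mu > mu1' were mu only equal to mu1."""
        se = sigma / n ** 0.5
        return norm.cdf((xbar - mu1) / se)

    # Hypothetical example: n = 100, sigma = 2, observed mean 0.4 (significant against
    # mu0 = 0). Severity is high for small discrepancies and drops for larger ones.
    for mu1 in (0.1, 0.2, 0.3, 0.4, 0.5):
        print(f"SEV(mu > {mu1}) = {severity(0.4, mu1, 2.0, 100):.3f}")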
42

Image texture analysis for inferential sensing in the process industries

Kistner, Melissa 12 1900 (has links)
Thesis (MScEng)-- Stellenbosch University, 2013. / ENGLISH ABSTRACT: The measurement of key process quality variables is important for the efficient and economical operation of many chemical and mineral processing systems, as these variables can be used in process monitoring and control systems to identify and maintain optimal process conditions. However, in many engineering processes the key quality variables cannot be measured directly with standard sensors. Inferential sensing is the real-time prediction of such variables from other, measurable process variables through some form of model. In vision-based inferential sensing, visual process data in the form of images or video frames are used as input variables to the inferential sensor. This is a suitable approach when the desired process quality variable is correlated with the visual appearance of the process. The inferential sensor model is then based on analysis of the image data. Texture feature extraction is an image analysis approach by which the texture or spatial organisation of pixels in an image can be described. Two texture feature extraction methods, namely the use of grey-level co-occurrence matrices (GLCMs) and wavelet analysis, have predominated in applications of texture analysis to engineering processes. While these two baseline methods are still widely considered to be the best available texture analysis methods, several newer and more advanced methods have since been developed, with properties that should, in theory, give them advantages over the baseline methods. Specifically, three advanced texture analysis methods have received much attention in recent machine vision literature, but have not yet been applied extensively to process engineering applications: steerable pyramids, textons and local binary patterns (LBPs). The purpose of this study was to compare the use of advanced image texture analysis methods to baseline texture analysis methods for the prediction of key process quality variables in specific process engineering applications. Three case studies, in which texture is thought to play an important role, were considered: (i) the prediction of platinum grade classes from images of platinum flotation froths, (ii) the prediction of fines fraction classes from images of coal particles on a conveyor belt, and (iii) the prediction of mean particle size classes from images of hydrocyclone underflows. Each of the five texture feature sets was used as input to two different classifiers (K-nearest neighbours and discriminant analysis) to predict the output variable classes for each of the three case studies mentioned above. The quality of the features extracted with each method was assessed in a structured manner, based on their classification performance after optimisation of the hyperparameters associated with each method. In the platinum froth flotation case study, steerable pyramids and LBPs significantly outperformed the GLCM, wavelet and texton methods. In the coal fines fraction case study, the GLCM method was significantly outperformed by all four other methods. Finally, in the hydrocyclone underflow case study, steerable pyramids and LBPs significantly outperformed the GLCM and wavelet methods, while the result for textons was inconclusive.
Considering all of these results together, the overall conclusion was drawn that two of the three advanced texture feature extraction methods, namely steerable pyramids and LBPs, can extract feature sets of superior quality, when compared to the baseline GLCM and wavelet methods in these three case studies. The application of steerable pyramids and LBPs to further image analysis data sets is therefore recommended as a viable alternative to the traditional GLCM and wavelet texture analysis methods.
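To make the comparison above concrete, here is a minimal sketch (not the thesis code) of extracting a baseline GLCM feature set and an LBP feature set and scoring each with a K-nearest-neighbours classifier. It assumes scikit-image (recent versions; older releases spell the functions greycomatrix/greycoprops) and scikit-learn, and the froth images and grade-class labels below are random placeholders.

    import numpy as np
    from skimage.feature import graycomatrix, graycoprops, local_binary_pattern
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.model_selection import cross_val_score

    def glcm_features(img, distances=(1, 2), angles=(0, np.pi / 2)):
        """Baseline descriptor: contrast/homogeneity/energy/correlation of the GLCM."""
        glcm = graycomatrix(img, distances=distances, angles=angles,
                            levels=256, symmetric=True, normed=True)
        props = ("contrast", "homogeneity", "energy", "correlation")
        return np.hstack([graycoprops(glcm, p).ravel() for p in props])

    def lbp_features(img, radius=2, n_points=16):
        """Advanced descriptor: histogram of uniform local binary patterns."""
        lbp = local_binary_pattern(img, n_points, radius, method="uniform")
        hist, _ = np.histogram(lbp, bins=n_points + 2, range=(0, n_points + 2), density=True)
        return hist

    # Hypothetical data: 60 grayscale froth images (uint8) and three grade classes.
    rng = np.random.default_rng(0)
    images = rng.integers(0, 256, size=(60, 64, 64), dtype=np.uint8)
    labels = rng.integers(0, 3, size=60)

    for name, extractor in (("GLCM", glcm_features), ("LBP", lbp_features)):
        X = np.array([extractor(img) for img in images])
        score = cross_val_score(KNeighborsClassifier(n_neighbors=5), X, labels, cv=5).mean()
        print(f"{name:4s} mean CV accuracy: {score:.2f}")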
43

Preferências ecológicas e potencial bioindicador das diatomáceas para avaliação ambiental de represas do Estado de São Paulo, Brasil /

Lehmkuhl, Angela Maria da Silva January 2019 (has links)
Advisor: Denise de Campos Bicudo / Abstract: This study was based on a limnological and biological database (surface sediment and water column diatoms) of 33 reservoirs spanning a trophic gradient (ultraoligotrophic to hypereutrophic) distributed in the southeastern region of São Paulo State. The first aim was to calculate the optima and ecological tolerances (regression stage) of diatom species in order to propose a diatom index for evaluating the trophic state of reservoirs, as well as a diatom-phosphorus transfer function model (calibration step) to infer past levels of water phosphorus. In addition, the study aimed to evaluate the effect of eutrophication on the taxonomic and functional homogenization of diatom communities. Surface sediment (n = 113) and plankton (summer and winter, n = 226) samples were obtained between 2009 and 2014. The weighted averaging (WA) method was used for the regression step (species optima and tolerances), and classical and inverse regression models were tested in the calibration step for the proposed trophic diatom index and for the diatom-phosphorus transfer function model. Optima and tolerances for total phosphorus were calculated for 58 (surface sediment) and 53 (plankton) diatom species. The model based on the surface sediment diatoms showed better performance (r2 = 0.71, p < 0.001, RMSE = 49.43 μg L-1) than the planktonic diatoms for proposing the trophic diatom index for reservoirs (TDIR). The transfer function model showed high predictive ability (r2 = 0.80) and was based on 63 diatom species (su... (Complete abstract click electronic access below) / Doctorate
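For readers unfamiliar with the weighted-averaging (WA) approach named in the abstract, the sketch below shows the regression step (abundance-weighted species optima and tolerances) and a calibration step with inverse deshrinking. It is illustrative only: the data are synthetic and this is not the thesis model, which also tested classical deshrinking.

    import numpy as np

    rng = np.random.default_rng(1)
    n_sites, n_species = 40, 12
    log_tp = rng.normal(1.5, 0.4, n_sites)            # log10 total phosphorus at each site
    abundance = rng.random((n_sites, n_species))      # relative diatom abundances (site x species)

    # Regression step: species optima and tolerances as abundance-weighted moments of TP.
    totals = abundance.sum(axis=0)
    optima = abundance.T @ log_tp / totals
    sq_dev = (log_tp[:, None] - optima[None, :]) ** 2
    tolerance = np.sqrt((abundance * sq_dev).sum(axis=0) / totals)

    # Calibration step: raw WA estimate of TP at each site, then inverse deshrinking
    # (regressing observed TP on the raw WA estimates) to counter the shrinkage of WA.
    wa_raw = abundance @ optima / abundance.sum(axis=1)
    slope, intercept = np.polyfit(wa_raw, log_tp, 1)
    tp_inferred = intercept + slope * wa_raw

    rmse = np.sqrt(np.mean((tp_inferred - log_tp) ** 2))
    r2 = np.corrcoef(tp_inferred, log_tp)[0, 1] ** 2
    print(f"apparent r2 = {r2:.2f}, RMSE = {rmse:.2f} (log10 units)")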
44

O humor como estratégia de compreensão e produção de charges: um estudo inferencial das charges de Myrria

Lima, Maria Francisca Morais de 25 February 2016 (has links)
Made available in DSpace on 2016-04-28T19:34:03Z (GMT). No. of bitstreams: 1 Maria Francisca Morais de Lima.pdf: 3869227 bytes, checksum: 108564ea7f7635be117a95366c0943d8 (MD5) Previous issue date: 2016-02-25 / Understanding opinion texts such as cartoons requires the reader to develop contextual skills capable of generating meaning. This thesis therefore discusses the importance of the inferential process as a strategy for understanding humor in political cartoons, taking as its basis the textuality principles of Beaugrande and Dressler (1981) and the inferential categorization framework elaborated by Marcuschi (2012). The research problem was to analyze inferential processes and their importance for the critical analysis of humorous texts. To this end, the following objectives were set: to analyze the inferential procedures that contribute to understanding the humor present in cartoons; to trace a theoretical path, covering not only the first studies on laughter but also the perception of humor and its use as social criticism; and to identify how the inferential process can contribute to the perception of the political criticism constituted in the cartoon genre. The study of the inferential process for understanding cartoons is justified because the reader, when reading a cartoon, uses inference to fill the gaps in meaning left, sometimes deliberately, by the author in the text. Such gaps are marked by the incongruity intentionally built in by the cartoonist. This thesis is divided into four chapters: the first three present the theoretical framework that guided the analysis of the research corpus, which consists of cartoons published in the opinion section of the newspaper Acrítica from February to November 2013. As the unit of analysis, ten (10) cartoons by Myrria were selected and organized into five groups according to the similarity of the subjects presented. Methodologically, phenomenology was chosen as the research method, whose premises allow an understanding based on views of man and the world, combined with content analysis. As the standard for understanding the cartoons presented, the skills of locating and inferring explicit and implicit information in the text and of relating expressive resources to effects of meaning were used, thus enabling the reader not only to move beyond the surface structure of the text but also to perceive the relations built in the interdiscourse and intertext of the cartoons.
45

Improving process monitoring and modeling of batch-type plasma etching tools

Lu, Bo, active 21st century 01 September 2015 (has links)
Manufacturing equipment in semiconductor factories (fabs) provides abundant data and opportunities for data-driven process monitoring and modeling. In particular, virtual metrology (VM) is an active area of research. Traditional monitoring techniques using univariate statistical process control charts do not provide immediate feedback on quality excursions, hindering the implementation of fab-wide advanced process control initiatives. VM models or inferential sensors aim to bridge this gap by predicting quality measurements instantaneously from tool fault detection and classification (FDC) sensor measurements. Existing research on inferential sensors and VM has focused on comparing regression algorithms to demonstrate their feasibility in various applications. However, two important areas, data pretreatment and post-deployment model maintenance, are usually neglected in these discussions. Since it is well known that the industrial data collected are of poor quality, and that semiconductor processes undergo drifts and periodic disturbances, these two issues are the main roadblocks to wider adoption of inferential sensors and VM models. In data pretreatment, batch data collected from FDC systems usually contain inconsistent trajectories of various durations. Most analysis techniques require the data from all batches to be of the same duration with similar trajectory patterns. These inconsistencies, if unresolved, propagate into the developed model, complicate interpretation of the modeling results, and degrade model performance. To address this issue, a Constrained selective Derivative Dynamic Time Warping (CsDTW) method was developed to perform automatic alignment of trajectories. CsDTW is designed to preserve the key features that characterize each batch and can be solved efficiently in polynomial time. Variable selection after trajectory alignment is another topic that requires improvement. To this end, the proposed Moving Window Variable Importance in Projection (MW-VIP) method yields a more robust set of variables with demonstrably stronger long-term correlation with the predicted output. In model maintenance, model adaptation has been the standard solution for dealing with drifting processes. However, most case studies have preprocessed the model update data offline, which amounts to an implicit assumption that the adaptation data are free of faults and outliers; this is often not true in practical implementations. To this end, a moving window scheme using Total Projection to Latent Structures (T-PLS) decomposition screens incoming updates to separate harmless process noise from the outliers that negatively affect the model. The integrated approach was demonstrated to be more robust. In addition, model adaptation is very inefficient when there are multiplicities in the process; multiplicities can occur due to process nonlinearity, switches in product grade, or different operating conditions. A growing-structure multiple-model system using local PLS and PCA models has been proposed to improve model performance around process conditions with multiplicity. The use of local PLS and PCA models allows the method to handle a much larger set of inputs and to overcome several challenges of mixture model systems. In addition, fault detection sensitivity is also improved by using the multivariate monitoring statistics of these local PLS/PCA models. These proposed methods are tested on two plasma etch data sets provided by Texas Instruments.
In addition, a proof of concept using virtual metrology in a controller performance assessment application was also tested.
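The MW-VIP method referred to above is the author's own; as background, the following sketch computes only the standard (non-moving-window) VIP score from a fitted PLS model. It assumes scikit-learn's PLSRegression, and the FDC-style data below are synthetic.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    def vip_scores(pls):
        """Standard VIP: VIP_j = sqrt(p * sum_a (w_ja/||w_a||)^2 * SS_a / sum_a SS_a)."""
        T = pls.x_scores_            # (n_samples, n_components) latent scores
        W = pls.x_weights_           # (n_features, n_components) X weights
        Q = pls.y_loadings_          # (n_targets, n_components) y loadings
        p = W.shape[0]
        ss = np.sum(T ** 2, axis=0) * Q[0] ** 2        # y-variance explained per component
        w_norm = W / np.linalg.norm(W, axis=0, keepdims=True)
        return np.sqrt(p * (w_norm ** 2 @ ss) / ss.sum())

    # Synthetic example: 200 batches, 30 FDC summary variables, one quality output
    # driven by the first five variables only.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 30))
    y = X[:, :5] @ rng.normal(size=5) + 0.1 * rng.normal(size=200)

    pls = PLSRegression(n_components=5).fit(X, y)
    print("variables with VIP > 1:", np.where(vip_scores(pls) > 1.0)[0])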
46

The role of doubt in bulimia nervosa

Wilson, Samantha 12 1900 (has links)
No description available.
47

Sistema Híbrido de Inferência Baseado em Análise de Componentes Principais e Redes Neurais Artificiais Aplicado a Plantas de Processamento de Gás Natural

Linhares, Leandro Luttiane da Silva 19 March 2010 (has links)
Made available in DSpace on 2014-12-17T14:55:42Z (GMT). No. of bitstreams: 1 LeandroLSL_DISSERT.pdf: 1890433 bytes, checksum: 540cbd4cf39fb3515249b7cecd6d0dcc (MD5) Previous issue date: 2010-03-19 / Conselho Nacional de Desenvolvimento Científico e Tecnológico / Nowadays, when market competition demands products of better quality, constant cost savings, and better use of raw materials, the search for more efficient control strategies becomes vital. In Natural Gas Processing Units (NGPUs), as in most chemical processes, quality control is carried out through the composition of the products. However, chemical composition analysis involves long measurement times, even when performed by instruments such as gas chromatographs. This fact hinders the development of control strategies that could provide a better process yield. Natural gas processing is one of the most important activities in the petroleum industry. The main economic product of an NGPU is liquefied petroleum gas (LPG); other products commonly obtained in these units are natural gasoline and residual gas. LPG is ideally composed of propane and butane; in practice, however, its composition includes contaminants such as ethane and pentane. In this work, an inferential system using neural networks is proposed to estimate the ethane and pentane mole fractions in the LPG and the propane mole fraction in the residual gas. The goal is to provide estimates of these variables every minute using a single multilayer neural network, making it possible to apply inferential control techniques in order to monitor the LPG quality and to reduce the propane loss in the process. To develop this work, an NGPU consisting of two distillation columns, a deethanizer and a debutanizer, was simulated in the HYSYS software. The inference is performed from the process variables of the PID controllers present in the instrumentation of these columns. To reduce the complexity of the inferential neural network, the statistical technique of principal component analysis (PCA) is used to decrease the number of network inputs, thus forming a hybrid inferential system. A simple strategy is also proposed for correcting the inferential system in real time, based on measurements from any line chromatographs present in the process under study.
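A minimal sketch of a hybrid PCA-plus-neural-network inferential sensor of the kind described above is shown below. It is not the thesis model: it assumes scikit-learn, the data are synthetic stand-ins for the PID-loop process variables, and the bias-update gain in the correction step is an arbitrary assumption.

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(42)
    X = rng.normal(size=(2000, 24))                      # 24 PID-loop variables, one row per minute
    Y = X @ rng.normal(size=(24, 3)) + 0.05 * rng.normal(size=(2000, 3))
    # Columns of Y (synthetic): ethane in LPG, pentane in LPG, propane in residual gas.

    X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.25, random_state=0)

    # PCA compresses the correlated controller signals before the single multilayer network.
    sensor = make_pipeline(
        StandardScaler(),
        PCA(n_components=8),
        MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
    )
    sensor.fit(X_tr, Y_tr)
    print("R^2 on held-out data:", round(sensor.score(X_te, Y_te), 3))

    # Simple real-time correction: update an additive bias whenever a (slow) line
    # chromatograph analysis becomes available. The filter gain of 0.3 is assumed.
    bias = np.zeros(3)
    def predict_corrected(x_row, lab_value=None):
        global bias
        pred = sensor.predict(x_row.reshape(1, -1))[0] + bias
        if lab_value is not None:
            bias = bias + 0.3 * (lab_value - pred)
        return pred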
48

Aplikace fuzzy logiky při hodnocení dodavatelů firmy / The Application of Fuzzy Logic for Rating of Suppliers for the Firm

Zegzulka, Ivo January 2014 (has links)
This thesis deals with the design of a fuzzy system that can evaluate suppliers of spare parts for a car service. The result should be applicable to the company Iveta Šťastníková - car and tire service. Primarily, it should simplify the operations associated with selecting appropriate spare parts, tools and other equipment needed to operate the car service station. First, the theoretical basis for the thesis is introduced; the present state is then described and analysed. The result is a proposed solution that should correspond to the needs of the owner.
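As a toy illustration of the kind of fuzzy evaluation described above (not the thesis model: the attributes, universes and rules are invented), the sketch below rates a supplier from three crisp inputs using a small Mamdani-style rule base in plain NumPy.

    import numpy as np

    def trap(x, a, b, c, d):
        """Trapezoidal membership: rises on [a, b], flat on [b, c], falls on [c, d]."""
        x = np.asarray(x, dtype=float)
        y = np.zeros_like(x)
        y = np.where((x >= b) & (x <= c), 1.0, y)
        y = np.where((x > a) & (x < b), (x - a) / max(b - a, 1e-12), y)
        y = np.where((x > c) & (x < d), (d - x) / max(d - c, 1e-12), y)
        return y

    rating_universe = np.linspace(0, 100, 501)            # output: supplier rating, 0..100
    rating_sets = {
        "poor":       trap(rating_universe, 0, 0, 0, 50),
        "acceptable": trap(rating_universe, 25, 50, 50, 75),
        "good":       trap(rating_universe, 50, 100, 100, 100),
    }

    def rate_supplier(price_pct, delay_days, defects_per_100):
        # Fuzzify the crisp inputs (set shapes are hand-picked for the example).
        price_low   = trap(price_pct, 50, 50, 90, 110)    # price as % of the market average
        price_high  = trap(price_pct, 90, 110, 150, 150)
        delay_short = trap(delay_days, 0, 0, 2, 7)
        delay_long  = trap(delay_days, 3, 10, 30, 30)
        qual_good   = trap(defects_per_100, 0, 0, 1, 4)
        qual_bad    = trap(defects_per_100, 2, 6, 20, 20)

        # Rule base: AND = min; each rule clips its consequent set.
        rules = [
            (min(price_low, delay_short, qual_good), "good"),
            (min(price_high, qual_good),             "acceptable"),
            (max(delay_long, qual_bad),              "poor"),
        ]
        aggregated = np.zeros_like(rating_universe)
        for strength, label in rules:
            aggregated = np.maximum(aggregated, np.minimum(strength, rating_sets[label]))

        # Centroid defuzzification of the aggregated output set.
        return float((aggregated * rating_universe).sum() / (aggregated.sum() + 1e-12))

    print(round(rate_supplier(price_pct=95, delay_days=2, defects_per_100=1.0), 1))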
49

The effects of types of question on EFL learners' reading comprehension scores

Ehara, Kazuhiro January 2008 (has links)
Little empirical research has been conducted on what effect task-based reading instruction with reading questions will have on reading comprehension, particularly in the domain of second language reading comprehension. The purpose of this research is to investigate which type of questions, textually explicit (TE) or inferential (IF) questions, will best facilitate text comprehension, and which type will have the most beneficial effect on Japanese EFL learners at three proficiency levels (low, intermediate, and high). In the study, two groups of Japanese senior high school students (N = 69) were classified into three different proficiency groups. One group received instruction emphasizing TE questions while the other received instruction emphasizing IF questions. TE questions are text-bound questions whose answers are locally and explicitly stated in the text. In contrast, IF questions are more knowledge-bound questions whose answers largely depend on readers' cognitive resources, such as relevant linguistic knowledge, background knowledge, world knowledge or context. The different treatments lasted five months. The results were statistically analyzed. The study revealed a significant task effect for reading questions on Japanese EFL learners' reading. Although one type of instruction did not have a significantly better effect than the other, the large between-groups gain gap seems to imply that instruction emphasizing IF questions might facilitate text comprehension more. The study also found that the participants who received instruction emphasizing IF questions benefited from their instruction regardless of proficiency level. With regard to instruction emphasizing TE questions, the higher proficiency participants benefited significantly more from their instruction than the lower proficiency students. The study suggests that reading teachers should use a task-based teaching method with reading questions. If the use of reading questions is already a part of reading teachers' methodology, they should include not only commonly used textually explicit reading questions but also inferential ones. The study suggests that implementing these changes might help break the cycle of translation-bound reading instruction with its overemphasis on lower-level processing, and might lead students to read texts in a more meaningful, interactive way. / CITE/Language Arts
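The abstract does not specify the statistical procedure used, so the following is only a generic illustration of how gain scores from such a two-factor design (instruction type by proficiency level) could be analysed with pandas and statsmodels; all numbers are fabricated.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    rng = np.random.default_rng(7)
    n = 69
    df = pd.DataFrame({
        "instruction": rng.choice(["TE", "IF"], size=n),
        "proficiency": rng.choice(["low", "mid", "high"], size=n),
    })
    # Fabricated gain scores with a small advantage for IF instruction.
    df["gain"] = (rng.normal(5, 3, size=n)
                  + np.where(df["instruction"] == "IF", 1.5, 0.0)
                  + np.where(df["proficiency"] == "high", 1.0, 0.0))

    model = smf.ols("gain ~ C(instruction) * C(proficiency)", data=df).fit()
    print(anova_lm(model, typ=2))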
50

An intervention programme to optimise the cognitive development of grade R-learners : a bounded pilot study / Stefani-Marié Esterhuizen

Esterhuizen, Stefanie-Marié January 2012 (has links)
It is imperative to prepare South African learners to participate and function confidently within the context of a rapidly changing world. The curriculum of the South African Education System emphasises the significance of optimising learners' cognitive development as early as pre-school age to enable them to become creative and critical citizens who lead purposeful lives in a safe and prejudice-free environment. Despite continuous efforts by educators to optimise cognitive development, recent research studies indicate that cognitive development has not been adequately optimised in South African schools. This study was undertaken to establish the cognitive development level (cognitive and meta-cognitive skills and strategies, cognitive functions and non-intellective factors) of Grade R-learners and to determine the effect of an intervention programme, the Cognitive Enhancement Programme for Pre-schoolers (CEPP), on their cognitive development. By means of a literature study, I investigated whether, and to what extent, the cognitive development of Grade R-learners was taking place, and established which cognitive and meta-cognitive thinking skills and strategies, cognitive functions and non-intellective factors are required for effective cognitive development among Grade R-learners. In addition, the role of mediation in optimising cognitive development was investigated. A concurrent embedded mixed methods design was employed in the implementation of the research. Intervention research within a quasi-experimental research design was applied. Data collection by means of a quantitative strategy (quasi-experimental research) and a qualitative strategy (observation study) was executed simultaneously. By means of convenience sampling, one Grade R-class with twenty learners was subjected to a pre-test to establish their cognitive developmental level. The test results, as well as the observations conducted during the pre-test, revealed that the learners experienced problems related to their cognitive development. Ten of the twenty learners were then divided purposively, based on their test performance, into two experimental groups, Experimental Group A and Experimental Group B, each consisting of five participants. Experimental Group A and Experimental Group B took part in the CEPP intervention, based on the principles of mediation, on a rotational basis over a period of twelve weeks, during which intentional attempts were made to optimise their cognitive development. Both groups completed a post-test and a delayed post-test (retention) to determine the effect of the CEPP intervention on their cognitive development. In addition to the test results, observations in the form of structured running and anecdotal records and reflective notes were utilised to better understand the nature and quality of the learners' cognitive development. Furthermore, the effect of the intervention on their cognitive development was established. The cognitive development of the Grade R-learners who participated in this study was optimised, which is a clear indication that cognitive capacity can be optimised when instruction is based on the principles of mediation. / PhD, Teaching and Learning, North-West University, Vaal Triangle Campus, 2012
