About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.

Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
511

INVERSE SAMPLING PROCEDURES TO TEST FOR HOMOGENEITY IN A MULTIVARIATE HYPERGEOMETRIC DISTRIBUTION

Liu, Jun 04 1900 (has links)
In this thesis we study several inverse sampling procedures to test for homogeneity in a multivariate hypergeometric distribution. The procedures are finite-population analogues of the procedures introduced in Panchapakesan et al. (1998) for the multinomial distribution. In order to develop exact calculations for critical values not considered in Panchapakesan et al., we introduce terminology for target probabilities, transfer probabilities, potential target points, right intersection, and left union. Under the null and the alternative hypotheses, we give theorems to calculate the target and transfer probabilities; we then use these results to develop exact calculations for the critical values and powers of one of the procedures. We also propose a new approximate calculation. In order to speed up some of the calculations, we propose several fast algorithms for multiple summation. The computing results showed that the simulations agree closely with the exact results. For small population sizes the critical values and powers of the procedures differ from those of the corresponding multinomial procedures, but when N >= 1680000, all the results are the same as those in the multinomial distribution. / Master of Science (MSc)
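As a rough illustration of the kind of procedure studied here, the sketch below simulates inverse sampling without replacement from a finite population until one category reaches a target count, then applies a toy rejection rule. The population sizes, target, margin and stopping rule are hypothetical and much simpler than the thesis's actual procedures (which follow Panchapakesan et al. 1998).

```python
import numpy as np

def inverse_sampling_counts(population, target, rng):
    """Draw items one at a time without replacement from a finite population
    until some category reaches `target` observations; return the cell counts
    at the stopping time."""
    remaining = np.array(population, dtype=int)
    counts = np.zeros_like(remaining)
    while counts.max() < target and remaining.sum() > 0:
        p = remaining / remaining.sum()          # current composition of the urn
        k = rng.choice(len(remaining), p=p)
        counts[k] += 1
        remaining[k] -= 1
    return counts

def reject_homogeneity(counts, margin):
    """Toy decision rule: reject H0 (equal category proportions) when the gap
    between the largest and smallest cell count exceeds `margin`."""
    return counts.max() - counts.min() > margin

rng = np.random.default_rng(0)
population = [400, 400, 400]                      # homogeneous finite population
level = np.mean([reject_homogeneity(inverse_sampling_counts(population, 30, rng), 12)
                 for _ in range(2000)])
print("simulated rejection rate under H0:", level)
```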
512

Extended probabilistic symbolic execution

Uwimbabazi, Aline 12 1900 (has links)
Thesis (MSc)--Stellenbosch University, 2013. / ENGLISH ABSTRACT: Probabilistic symbolic execution is a new approach that extends normal symbolic execution with probability calculations. This approach combines symbolic execution and model counting to estimate the number of input values that would satisfy a given path condition, and is thus able to calculate the execution probability of a path. The focus has been on programs that manipulate primitive types, such as linear integer arithmetic, in object-oriented programming languages such as Java. In this thesis, we extend probabilistic symbolic execution to handle data structures, thus allowing support for reference types. Two techniques are proposed to calculate the probability of an execution when the programs have structures as inputs: an approximate approach that assumes probabilities for certain choices stay fixed during the execution, and an accurate technique based on counting valid structures. We evaluate these approaches on an example of a Binary Search Tree and compare them to the classic approach, which only takes symbolic values as input.
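The path-probability idea can be sketched with a brute-force model counter over a small, hypothetical integer domain; real probabilistic symbolic execution tools rely on constraint solvers and dedicated model counters rather than enumeration, and the path condition and domain below are illustrative only.

```python
from itertools import product

def path_probability(path_condition, domain):
    """Model counting over a bounded input domain: the probability of a path
    is the fraction of inputs that satisfy its path condition, assuming
    inputs are uniformly distributed over `domain` x `domain`."""
    satisfying = sum(1 for x, y in product(domain, repeat=2) if path_condition(x, y))
    return satisfying / len(domain) ** 2

# Path condition that symbolic execution might collect for the 'then' branch of
#   if x + y > 10 and x < y: ...
domain = range(0, 16)                      # hypothetical bounded integer inputs
print(path_probability(lambda x, y: x + y > 10 and x < y, domain))
```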
513

Community Colleges, Catalysts for Mobility or Engines for Inequality? Addressing Selection Bias in the Estimation of Their Effects on Educational and Occupational Outcomes

González Canché, Manuel Sacramento January 2012 (has links)
For the last 25 years, research on the effects of community colleges on baccalaureate degree attainment has concluded that community colleges drastically reduce the likelihood of attaining a bachelor's degree compared to four-year institutions. The thesis of this dissertation is that community colleges have been misjudged as institutions that tend to perpetuate social and economic stratification; what previous studies on the topic have found is based on systematic differences in the student populations. Community college students are consistently more at risk of failing academically than four-year students. Thus, the positive impact that four-year colleges have on their students compared to the impact of two-year colleges is to a great extent due to the fact that four-year students tend to have more resources and means to handle college requirements than two-year students. The main challenge in analyzing two- and four-year sector effects lies in identifying community college students who resemble four-year college students and then comparing their outcomes. This dissertation expands on previous research, which has only looked at the effect of community colleges on students' educational outcomes, by including labor market outcomes. The analyses conducted in this study relied primarily on propensity score matching (PSM) and the Heckman two-stage estimation procedure to reduce bias by accounting for non-random selection into the treatment. In addition, the analytic samples were disaggregated by gender and ethnicity. To estimate the effects of interest, a nationally representative sample that is longitudinal and panel in nature was used: the National Education Longitudinal Study of 1988 (NELS:88). Results revealed that neither the two- nor the four-year sector was able to help students with very low probabilities of graduating from a four-year college. A new financial aid approach that bridges merit-based and need-based perspectives is proposed. Community colleges, by welcoming a greater proportion of first-time, full-time undergraduate students, many of whom are underrepresented in higher education, and by helping their students to perform similarly to four-year college students in the outcomes analyzed, are conceptualized as engines for mobility that help overcome the economic and social stratification of opportunities in American society.
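The selection-bias correction can be illustrated with a minimal nearest-neighbour propensity score matching sketch. The covariates, treatment indicator, and outcome below are randomly generated stand-ins rather than NELS:88 variables, and the dissertation's actual specification (and its Heckman two-stage models) is considerably richer.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def psm_att(X, treated, outcome):
    """Nearest-neighbour propensity score matching: estimate the average
    effect of the 'treatment' (e.g. starting at a two-year college) on the
    treated, after matching on the estimated propensity score."""
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    t = np.where(treated == 1)[0]
    c = np.where(treated == 0)[0]
    nn = NearestNeighbors(n_neighbors=1).fit(ps[c].reshape(-1, 1))
    _, idx = nn.kneighbors(ps[t].reshape(-1, 1))
    matched = c[idx.ravel()]                 # closest control for each treated unit
    return np.mean(outcome[t] - outcome[matched])

# Hypothetical stand-in data: covariates, sector indicator, BA attainment (0/1)
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 4))
treated = rng.integers(0, 2, size=500)
outcome = rng.integers(0, 2, size=500)
print("matched ATT estimate:", psm_att(X, treated, outcome))
```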
514

Development and evaluation of techniques for estimating short duration design rainfall in South Africa.

Smithers, Jeffrey Colin. January 1998 (has links)
The objective of the study was to update and improve the reliability and accuracy of short duration (≤ 24 h) design rainfall values for South Africa. These were to be based on digitised rainfall data, whereas previous studies conducted on a national scale in South Africa were based on data that were manually extracted from autographic charts. With the longer rainfall records currently available compared to the studies conducted in the early 1980s, it was expected that by utilising the longer, digitised rainfall data in conjunction with regional approaches, which have not previously been applied in South Africa, more reliable short duration design rainfall values could be estimated. A short duration rainfall database was established for South Africa with the majority of the data contributed by the South African Weather Bureau (SAWB). Numerous errors such as negative and zero time steps were identified in the SAWB digitised rainfall data. Automated procedures were developed to identify the probable cause of the errors and appropriate adjustments to the data were made. In cases where the cause of the error could be established, the data were adjusted to introduce randomly either the minimum, average or maximum intensity into the data as a result of the adjustment. The adjustments were found to have no significant effect on the extracted Annual Maximum Series (AMS). However, excluding erroneous points, or events with erroneous points, resulted in significantly different AMS. The low reliability of much of the digitised SAWB rainfall data was evident from numerous and large differences between daily rainfall totals recorded by standard, non-recording raingauges, measured at 08:00 every day, and the total rainfall depth for the equivalent period extracted from the digitised data. Hence alternative techniques of estimating short duration rainfall values were developed, with the focus on regional approaches and techniques that could be derived from daily rainfall totals measured by standard raingauges. Three approaches to estimating design storms from the unreliable short duration rainfall database were developed and evaluated. The first approach used a regional frequency analysis, the second investigated scaling relationships of the moments of the extreme events, and the third used a stochastic intra-daily model to generate synthetic rainfall series. In the regional frequency analyses, 15 relatively homogeneous rainfall clusters were identified in South Africa and a regional index storm based approach using L-moments was applied. Homogeneous clusters were identified using site characteristics and tested using at-site data. The mean of the AMS was used as the index value, and in 13 of the 15 relatively homogeneous clusters the index value for 24 h durations was well estimated as a function of site characteristics only, thus enabling the estimation of 24 h duration design rainfall values at any location in South Africa. In 13 of the 15 clusters the scaling properties of the moments of the AMS were used to successfully estimate design rainfall values for durations < 24 h, using the moments of the AMS extracted from the data recorded by standard raingauges and regional relationships based on site characteristics. It was found that L-moments scaled better and over a wider range of durations than ordinary product moments.
A methodology was developed for the derivation of the parameters for two Bartlett-Lewis rectangular pulse models using only standard raingauge data, thus enabling the estimation of design values for durations as short as 1 h at sites where only daily rainfall data are available. In view of the low reliability of the majority of short duration rainfall data in South Africa, it is recommended that the regional index value approach be adopted for South Africa, but scaled using values derived from the daily rainfall data. The use of the intra-daily stochastic rainfall models to estimate design rainfall values is recommended as further independent confirmation of the reliability of the design values. / Thesis (M.Sc.)-University of Natal, Pietermaritzburg, 1998.
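A minimal sketch of the sample L-moments that underpin the regional index-storm approach is shown below, using Hosking-style probability-weighted-moment estimators. The annual maximum series is invented for illustration, and the regional pooling, homogeneity testing, and growth-curve fitting steps of the study are omitted.

```python
import numpy as np

def sample_l_moments(x):
    """First four sample L-moments via Hosking's unbiased probability-weighted
    moments; the input is an annual maximum series (AMS)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    j = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((j - 1) / (n - 1) * x) / n
    b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
    b3 = np.sum((j - 1) * (j - 2) * (j - 3) / ((n - 1) * (n - 2) * (n - 3)) * x) / n
    l1 = b0
    l2 = 2 * b1 - b0
    l3 = 6 * b2 - 6 * b1 + b0
    l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
    return l1, l2, l3, l4

# Invented 1 h annual maxima (mm); the regional index value is the mean of the AMS
ams = [23.1, 31.4, 18.9, 44.0, 27.5, 35.2, 29.8, 21.7, 38.6, 25.3]
l1, l2, l3, l4 = sample_l_moments(ams)
print("index value (mean of AMS):", round(l1, 2))
print("L-CV:", round(l2 / l1, 3), " L-skewness:", round(l3 / l2, 3))
```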
515

The dynamics and energetics of tropical-temperate troughs over Southern Africa

D'Abreton, Peter Charles January 1992 (has links)
Water vapour content and transport over southern Africa and adjacent oceans are examined. Early summer rainfall over the northern and central interior of South Africa tends to be associated with baroclinic controls whereas late-summer rainfall is barotropic in origin. This is reflected in the northwesterly water vapour transport from an Atlantic Ocean source by middle and upper tropospheric westerly waves in early summer. A thermally indirect Ferrel cell, indicated from energetics, confirms the temperate nature of the early-summer atmosphere over southern Africa. Late summer water vapour transport, in contrast, is strongly from the tropics, with a reduced eddy component, indicating an important tropical control on late summer rainfall, especially in terms of fluctuations in the position of the ascending limb of the Walker cell over southern Africa. The Hadley cell is of importance to the late summer rainfall in that dry (wet) years are associated with an anomalous cell over eastern (central) South Africa such that low level vapour transport is southerly (northerly). The anticyclone over the eastern parts of southern Africa, coupled with a trough over the interior (especially at the 700 hPa pressure level), is important for the introduction of water vapour over the subcontinent in wet and dry years and for tropical-temperate trough case studies. Water vapour source regions differ from early summer (Atlantic Ocean) to late summer (Indian Ocean), which reflects the temperate control on early and the tropical control on late summer circulation. The convergence of water vapour over southern Africa in wet years and during tropical-temperate troughs is not only important for cloud formation and precipitation, but also for latent heat release associated with convergent water vapour. Diabatic heating decreases the stability of the tropical atmosphere, thereby resulting in increased vertical motion. It also forces an anomalous Hadley circulation during wet late summers and tropical-temperate trough cases as a result of complex energy transformations. Heating increases eddy available potential energy, which is converted to zonal available potential energy by a thermally indirect circulation found in the tropics. The zonal potential energy is then converted to kinetic energy by the thermally direct Hadley cell. Water vapour and its variations are thus important for the precipitation, heating and subsequent energy of the subtropical southern African atmosphere. / GR 2017
516

Evènements météo-océaniques extrêmes / Extreme meteo-oceanic events

Mazas, Franck 17 November 2017 (has links)
This PhD on published works aims at unifying the works carried out on the topic of extreme metocean events since 2009, while working for SOGREAH, later ARTELIA. As these works went along, a leading theme progressively appeared: the notion of event, such as a storm. This concept provides a sound and relevant framework, in particular in the case of multivariate extremes (such as joint probabilities of waves and sea levels), as well as a better understanding of the notion of return period, much used for design in the field of engineering. The main results of the works carried out in the last decade are as follows:
- updating of the methodology for determining extreme wave heights or wind speeds:
  - development and justification of a two-step framework for extreme univariate over-threshold modelling, introducing the concept of event and the separation of the physical and statistical thresholds,
  - proposal of practical tools for choosing the statistical threshold,
  - introduction of the parametric bootstrap approach for computing confidence intervals,
  - identification of a problematic issue in the behaviour of the Maximum Likelihood Estimator and proposal of a solution: use of 3-parameter distributions along with the L-moments estimator;
- application of the POT (Peaks-Over-Threshold) framework to the Joint Probability Method (JPM) for determining extreme sea levels:
  - distinction between sequential values and event peaks through extremal indexes for surge and sea level,
  - construction of a mixture model for the surge distribution,
  - refinements for handling tide-surge dependence;
- application of the POT-JPM framework for the joint analysis of wave height and sea level:
  - proposal of an alternative sampling procedure,
  - separate analysis of tide and surge in order to model the dependence between wave height and surge, incorporated into the joint distribution of wave height and sea level thanks to a 2D1D convolution operation,
  - use of extreme-value copulas,
  - improved presentation of the chi-plot;
- introduction of a new classification for multivariate analyses:
  - Type A: a single phenomenon described by different physical quantities that are not of the same kind,
  - Type B: a phenomenon made of different components, described by physical quantities of the same kind from one component to another,
  - Type C: several phenomena described by physical quantities that are not of the same kind;
- interpretation of the meaning of multivariate events:
  - link with the sampling procedure,
  - link with the different definitions of the return period,
  - in the bivariate case: transformation of the joint distribution of event-describing variables into the joint distribution of sequential pairs,
  - generation of alternative output plots such as contours of density for sequential pairs;
- a dedicated R package, artextreme, for implementing the methodologies presented above
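The univariate over-threshold step can be sketched with a return-level calculation under a Peaks-Over-Threshold model. The wave heights, threshold and event rate below are invented, and a maximum-likelihood fit is used only for brevity, whereas the thesis argues for 3-parameter distributions fitted with the L-moments estimator.

```python
import numpy as np
from scipy.stats import genpareto

def pot_return_level(peaks, threshold, events_per_year, T):
    """T-year return level under a Peaks-Over-Threshold model: event-peak
    excesses over `threshold` follow a Generalised Pareto distribution and
    events occur `events_per_year` times per year on average."""
    excesses = np.asarray(peaks, dtype=float) - threshold
    shape, _, scale = genpareto.fit(excesses, floc=0)      # ML fit, for brevity
    m = events_per_year * T                                # expected events in T years
    return threshold + genpareto.ppf(1.0 - 1.0 / m, shape, loc=0, scale=scale)

# Invented storm-peak significant wave heights (m) exceeding a 4 m physical threshold
peaks = [4.3, 4.8, 5.1, 4.5, 6.2, 5.7, 4.9, 5.4, 7.1, 4.6, 5.0, 5.9]
print("100-year Hs:", round(pot_return_level(peaks, 4.0, events_per_year=4, T=100), 2), "m")
```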
517

Programação dinâmica em tempo real para processos de decisão markovianos com probabilidades imprecisas / Real-time dynamic programming for Markov Decision Processes with Imprecise Probabilities

Dias, Daniel Baptista 28 November 2014 (has links)
In sequential decision-making problems modelled as Markov Decision Processes (MDPs), it may not be possible to obtain an exact measure of the state transition probabilities. To address this, the framework of Markov Decision Processes with Imprecise Transition Probabilities (MDP-IPs) was introduced. However, while MDP-IPs are a robust framework for real-world planning applications, their solutions are time-consuming in practice. Previous work proposed efficient synchronous dynamic programming algorithms to solve MDP-IPs with factored representations of the probabilistic transition function and reward function, called factored MDP-IPs. However, when the initial state of a problem modelled as a Stochastic Shortest Path MDP (SSP MDP) is given, these solutions do not exploit this information. In this work we introduce the Stochastic Shortest Path MDP-IP (SSP MDP-IP), both in enumerative and in factored form. 
An efficient asynchronous dynamic programming solution for enumerative SSP MDP-IPs with interval-based imprecision was proposed by Buffet and Aberdeen (2005). Nevertheless, in general the problem is given in factored form, i.e., in terms of state variables, and in this case, even if interval-based imprecision over the variables is assumed, that solution is no longer applicable, since the joint transition probabilities become multilinear. We show that factored SSP MDP-IPs are more expressive than enumerative ones, and that the change from the enumerative SSP MDP-IP case to the general case of factored SSP MDP-IPs leads to a switch from a linear to a nonlinear objective in the Bellman backup. We also propose asynchronous dynamic programming algorithms: the enumerative RTDP-IP (Real-time Dynamic Programming with Imprecise Transition Probabilities), LRTDP-IP (Labeled Real-time Dynamic Programming with Imprecise Transition Probabilities), SSiPP-IP (Short-Sighted Probabilistic Planner with Imprecise Transition Probabilities) and LSSiPP-IP (Labeled Short-Sighted Probabilistic Planner with Imprecise Transition Probabilities), and the factored factRTDP-IP (factored RTDP-IP) and factLRTDP-IP (factored LRTDP-IP). These algorithms are evaluated against the previously proposed synchronous dynamic programming algorithms in terms of solution convergence time and scalability.
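The core operation in these planners is a robust Bellman backup that evaluates the worst case over the credal set. A minimal sketch for the special case of interval-bounded transition probabilities is given below; the thesis's factored algorithms must instead solve multilinear optimisation problems, and the states, actions and bounds here are hypothetical.

```python
def worst_case_expectation(values, p_low, p_high):
    """Adversarial expectation over an interval credal set: start from the
    lower bounds and greedily give the free probability mass to the successor
    states with the highest cost-to-go, respecting the upper bounds."""
    order = sorted(range(len(values)), key=lambda s: -values[s])
    p, slack = list(p_low), 1.0 - sum(p_low)
    for s in order:
        give = min(p_high[s] - p_low[s], slack)
        p[s] += give
        slack -= give
    return sum(pi * v for pi, v in zip(p, values))

def robust_backup(V, cost, P_low, P_high, state, actions):
    """One robust Bellman backup, the step RTDP-IP repeats along its trials:
    V(s) = min_a [ c(s,a) + max_{P in K(s,a)} sum_s' P(s') * V(s') ]."""
    return min(cost[state][a]
               + worst_case_expectation(V, P_low[state][a], P_high[state][a])
               for a in actions)

# Tiny hypothetical SSP MDP-IP: 3 states, 2 actions, interval-bounded transitions
V = [4.0, 2.0, 0.0]                                   # current cost-to-go estimates
cost = {0: {"a": 1.0, "b": 2.0}}
P_low = {0: {"a": [0.1, 0.2, 0.1], "b": [0.0, 0.1, 0.5]}}
P_high = {0: {"a": [0.6, 0.7, 0.6], "b": [0.2, 0.4, 0.9]}}
print("backed-up value of state 0:", robust_backup(V, cost, P_low, P_high, 0, ["a", "b"]))
```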
518

Tomada de decisão sequencial com preferências parcialmente ordenadas. / Sequential decision making with partially ordered preferences.

Kikuti, Daniel 20 August 2008 (has links)
In this thesis we explore situations where preferences are partially ordered: given two acts, the agent may prefer one to the other, find them equivalent, or find them incomparable. Such preferences stem from the uncertainty associated with some states of the decision model and are revealed by imprecision in probability values. We investigate six criteria for strategy selection in decision trees and influence diagrams with imprecise probabilities: Γ-maximin, Γ-maximax, Γ-maximix, Interval Dominance, Maximality and E-admissibility. We present new algorithms that generate strategies for all these criteria. The main contributions of this work are twofold: the implementation of these algorithms and the analysis, from a computational point of view, of the criteria considered rational in situations of uncertainty represented by sets of probabilities.
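As a toy illustration of two of these criteria, the sketch below evaluates Γ-maximin and Interval Dominance for a handful of one-shot acts over a credal set given as a finite list of distributions. The acts, utilities and credal set are made up, and the thesis's algorithms operate on sequential strategies in decision trees and influence diagrams rather than single acts.

```python
def gamma_maximin(acts, credal_set):
    """Γ-maximin: pick the act whose worst-case expected utility over the
    credal set (a finite list of candidate distributions) is largest."""
    def lower_expectation(utilities):
        return min(sum(p * u for p, u in zip(P, utilities)) for P in credal_set)
    return max(acts, key=lambda a: lower_expectation(acts[a]))

def interval_dominance(acts, credal_set):
    """Interval Dominance: keep every act whose upper expectation is not
    strictly below some other act's lower expectation."""
    def bounds(utilities):
        exps = [sum(p * u for p, u in zip(P, utilities)) for P in credal_set]
        return min(exps), max(exps)
    b = {a: bounds(u) for a, u in acts.items()}
    return [a for a in acts if not any(b[a][1] < b[o][0] for o in acts if o != a)]

# Hypothetical acts (utility per state) and a credal set of two distributions
acts = {"a1": [10, 0], "a2": [4, 5], "a3": [6, 3]}
credal_set = [[0.3, 0.7], [0.6, 0.4]]
print(gamma_maximin(acts, credal_set))      # single recommended act
print(interval_dominance(acts, credal_set)) # set of non-dominated acts
```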
519

Influences of Climate Variability on Rainfall Extremes of Different Durations

Unknown Date (has links)
The Intensity-Duration-Frequency (IDF) relationship curve has been a crucial design tool for several decades under the assumption of a stationary climate; nonetheless, the frequency and intensity of extreme rainfall appear to be increasing worldwide. Based on research conducted in recent years, the greatest increases are likely to occur in short-duration storms lasting less than a day, potentially leading to an increase in the magnitude and frequency of flash floods. In this study, a trend analysis of precipitation is conducted to examine the influence of climate variability on extreme rainfall in the state of Florida. Since these local changes are potentially or directly related to the surrounding oceanic-atmospheric oscillations, the following oscillations are analyzed or highlighted in this study: the Atlantic Multi-Decadal Oscillation (AMO), El Niño Southern Oscillation (ENSO), and Pacific Decadal Oscillation (PDO). Precipitation data from rainfall gages collected throughout the state of Florida are grouped and analyzed by duration type: short-term (minute), hourly, and daily. To assess statistical associations based on the ranks of the data, the non-parametric Kendall's tau and Spearman's rho correlation coefficients are used to determine the direction of the trend, and the test results are then used to determine the statistical significance of the analyzed data. The outcome confirms with confidence whether there is an increasing or decreasing trend in precipitation depth in the State of Florida. The main emphasis is on rainfall extremes of short-term duration over a period of about 50 years. Results from both the Spearman and Mann-Kendall tests show that the greatest percentage increase occurs for the short rainfall durations. The results highlight a tendency toward increasing trends in three different regions, two of which lie in the central and peninsular region of Florida and one in the continental region. Given its topography and the nature of its water surfaces, such as the Everglades and Lake Okeechobee, Florida experiences a wide range of weather patterns, resulting in frequent flooding during the wet season and drought in the dry season. / Includes bibliography. / Thesis (M.S.)--Florida Atlantic University, 2016. / FAU Electronic Theses and Dissertations Collection
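A minimal Mann-Kendall trend test (normal approximation, no tie correction) is sketched below on an invented series of short-duration annual rainfall maxima; the study's actual analysis also applies Spearman's rho and works with gauge records across Florida.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Mann-Kendall trend test: return the S statistic and a two-sided
    p-value using the normal approximation (no tie correction)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    return s, 2 * (1 - norm.cdf(abs(z)))

# Hypothetical annual maximum 15-minute rainfall depths (mm)
depths = [18.2, 20.1, 17.5, 22.3, 21.0, 24.8, 23.5, 26.1, 25.0, 28.4]
print(mann_kendall(depths))
```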
520

Letramento estocástico: uma possível articulação entre os letramentos estatístico e probabilístico / Stochastic literacy: a possible articulation between statistical and probabilistic literacies

Silva, Danilo Saes Corrêa da 19 April 2018 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - CAPES / The growing dissemination of news, driven by social networks, requires society to be increasingly critical. It is within this context that Statistical Education can play an important role in forming such a society, since Statistical and Probabilistic Literacy both help build a more critical citizenry. Our general objective is to analyze which elements of Statistical Literacy and Probabilistic Literacy are worked on with students in the sixth year of elementary school, and to study the possible articulation between them through activities that require a critical stance in data analysis. In the course of our studies we indicate how their points of agreement can promote a new literacy, Stochastic Literacy. To analyze how these two literacies interact and articulate with the students, we carried out an activity, based on assumptions of Didactic Engineering, which consisted of rolling cubic dice and constructing graphs to verify the relative frequency associated with each face, in order to support the learning of frequentist probability. To assess the students' critical stance, some biased dice were included in the activity, resulting in non-equiprobable outcomes. 
The results of the activity pointed to some points in common between Statistical and Probabilistic Literacy, such as the importance of working on Statistics and Probability concurrently, and the relevance of working with dice, which are objects familiar to the students, enhancing data collection drawn from the students' own context. We also indicate some difficulties encountered, such as the students' failure to check for non-equiprobability.
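The classroom activity's frequentist idea can be mimicked with a short simulation comparing relative frequencies from a fair die and from a hypothetical biased one; the weights and number of rolls below are illustrative, not taken from the thesis.

```python
import random
from collections import Counter

def relative_frequencies(weights, rolls, seed=0):
    """Roll a (possibly biased) six-sided die `rolls` times and return the
    relative frequency of each face -- the quantity the students plotted."""
    rng = random.Random(seed)
    faces = [1, 2, 3, 4, 5, 6]
    counts = Counter(rng.choices(faces, weights=weights, k=rolls))
    return {face: counts[face] / rolls for face in faces}

fair = [1, 1, 1, 1, 1, 1]
biased = [1, 1, 1, 1, 1, 3]   # hypothetical loaded die favouring face 6
print(relative_frequencies(fair, 600))
print(relative_frequencies(biased, 600))
```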
