291 |
Possibilidades e dificuldades no desenvolvimento de situações de aprendizagem envolvendo funções trigonométricas
Ribeiro, Márcia Regina Ramos Costa 03 October 2011 (has links)
Previous issue date: 2011-10-03 / Fundo de Apoio à Pesquisa do Estado de São Paulo / This study is part of ongoing research on Mathematics in the curriculum, based on the assumption that characterizing students' prior mathematical knowledge can support meaningful learning. From this perspective, the research focused on the prior knowledge of trigonometric functions held by students in the Brazilian Ensino Médio (roughly equivalent to American high school), using qualitative research and participant observation. The objective is to understand the possibilities and difficulties in using the material distributed to students of the schools of the State of São Paulo, focusing on these students' prior knowledge of trigonometric functions, identifying obstacles that may arise during the activities, and assessing the need for interventions to promote the construction of knowledge on the subject. Students were asked to carry out the activities proposed in the material supplied by the state of São Paulo. The analysis is based on Ausubel's theory of meaningful learning, on Coll's (2006) constructivist perspective, and on Pozo's studies of teaching content and learning. The results indicate that the students' prior knowledge can be divided into many groups, given its important characteristic of including conceptual knowledge as well as procedures, values, norms, and attitudes. The ability to identify this prior knowledge is an important tool for designing better pedagogical interventions and promoting meaningful learning. / This study belongs to the research line Mathematics in the Curricular Structure and Teacher Education and to the research project Meaningful learning and prior knowledge: investigating the Mathematics curriculum from a constructivist perspective; it assumes that characterizing students' prior knowledge of the mathematical content to be taught contributes to meaningful learning.
From this perspective, the objective of this research is to understand the possibilities and difficulties of using the material distributed to students of the public school system of the State of São Paulo, focusing on these students' prior knowledge of trigonometric functions, identifying difficulties that may arise while the activities are carried out, and assessing the need for interventions to promote the construction of knowledge on the topic. The investigation used qualitative research and the technique of participant observation. A group of second-year Ensino Médio students at a state public school was observed while carrying out activities proposed in the material. The analysis is grounded in Ausubel's theory of meaningful learning, in Coll's (2006) constructivist perspective, and in Pozo's (2002) studies on teaching content and learning. The results indicate that students' prior knowledge of trigonometric functions can be classified into a wide variety of groups, given its important characteristic of including conceptual knowledge as well as procedures, values, norms, and attitudes; the teacher's characterization of this prior knowledge at each new educational experience is an important tool for more effective pedagogical interventions that generate meaningful learning
|
292 |
Mapeamento da suscetibilidade a eventos perigosos de natureza geológica e hidrológica em São Carlos - SP / Susceptibility mapping of geological and hydrological dangerous events in São Carlos - SP
Eiras, Cahio Guimarães Seabra 07 July 2017 (has links)
With increasing urbanization in Brazil, the mapping of hazardous events has become ever more necessary to reduce socioeconomic disruption in Brazilian municipalities. The main objective of this research was to produce susceptibility maps for hazardous events of geological and hydrological nature for the urban and urban-expansion area (148.97 km²) of the municipality of São Carlos - SP. The analysis was carried out in a GIS (Geographic Information Systems) environment and combined the qualitative and quantitative approaches to hazardous-event mapping proposed by the Working Group Committee on Risk Assessment (1997). The method analyzed the frequency of historical events as a function of the area (km²) of the conditioning factors mapped on the thematic maps (prior probabilities). The conditioning factors were analyzed both in isolation and in combination, in the form of terrain classes. The inventory of historical event records (482 records) was compiled from newspapers, civil defense data, and satellite images covering the years 1965 to 2016. Susceptibility maps were produced for slope movements, rainfall erosion, collapsible soils, and floods/inundations. The method proved efficient, since the research objectives were achieved. The susceptibility maps could be validated against the characteristics of the physical environment observed in fieldwork and against geotechnical test data. The most critical events for the municipality are floods, inundations, and waterlogging; rainfall erosion and slope movements cause only minor disruption, mainly because of the region's gentle relief. Although the cartographic base is at the 1:10,000 scale, the maps are presented at the 1:20,000 scale so that they can be printed on A1-format paper. / With the increasing urbanization in Brazil, hazardous-event mapping has become increasingly necessary to reduce socioeconomic disruption in Brazilian cities.
The main goal of the research was to elaborate susceptibility maps for dangerous events of geological and hydrological nature for the urban and urban-expansion area (148.97 km²) of the city of São Carlos - SP. The analysis was done in a GIS (Geographic Information Systems) environment and combined qualitative and quantitative approaches to hazardous-event mapping proposed by the Working Group Committee on Risk Assessment (1997). The frequency of historical events was analyzed as a function of the area of the conditioning factors mapped on the thematic maps (prior probabilities). The conditioning factors were analyzed separately and then combined, in the form of land classes. The inventory of historical records of events (482 records) was compiled from newspaper data, civil defense records, and satellite images from 1965 to 2016. Susceptibility maps were developed for slope movements, water erosion, collapsible soils, and floods. The method proved to be efficient, since the research objectives were achieved. It was possible to validate the susceptibility maps against the characteristics of the physical environment observed in field work and against geotechnical data. The most critical events in São Carlos are floods; rainfall erosion and slope movements cause only minor damage, as a result of the region's gentle relief. Although the cartographic base is at the 1:10,000 scale, the maps are presented at the 1:20,000 scale so that they can be printed on A1-format paper.
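The frequency-per-area computation described above (historical records divided by the mapped area of each terrain class) can be sketched as follows; the class names, counts, and areas here are invented for illustration and are not values from the thesis:

```python
# Hedged sketch: per-class event density (events per km²), normalised so
# the most susceptible class scores 1.0. Inputs are illustrative.

def susceptibility_by_class(records, class_areas_km2):
    """Events per km² for each terrain class, normalised to [0, 1]."""
    counts = {}
    for cls in records:
        counts[cls] = counts.get(cls, 0) + 1
    density = {c: counts.get(c, 0) / area for c, area in class_areas_km2.items()}
    top = max(density.values())
    return {c: d / top for c, d in density.items()}

# Illustrative inventory: each historical record tagged with its terrain class.
records = ["floodplain", "floodplain", "floodplain", "hillslope", "plateau"]
areas = {"floodplain": 10.0, "hillslope": 60.0, "plateau": 78.97}

print(susceptibility_by_class(records, areas))
```

The normalisation makes the classes directly comparable on one map legend; the raw densities could equally be kept as the "prior probabilities" the abstract mentions.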
|
293 |
Estimação não linear de estado através do unscented Kalman filter na tomografia por impedância elétrica. / Nonlinear state estimation using the unscented Kalman filter in electrical impedance tomography.
Moura, Fernando Silva de 26 February 2013 (links)
Electrical impedance tomography aims to estimate the electrical impedance distribution within a region from electrical potential measurements collected only on its external boundary while electrical current is imposed on that same boundary. One application of this technology is monitoring the pulmonary condition of patients in Intensive Care Units. Among the various algorithms, Kalman filters stand out: they address the estimation problem from a probabilistic point of view, seeking the probability distribution of the state conditioned on the measurements. To use these filters, a model of the temporal evolution of the observed system must be adopted. This thesis proposes an evolution model for the variation of air volume in the lungs during the breathing of a patient under artificial ventilation. The model is used in the unscented Kalman filter, a nonlinear extension of the Kalman filter, and is adjusted in parallel with the state estimation in a dual estimation scheme. An image segmentation algorithm is proposed to identify the lung regions in the estimated images so that the evolution model can be applied. To improve the estimates, the approximation error method is used in the observation model to mitigate modeling errors, and prior information is added to the solution of the ill-posed inverse problem. The method is evaluated through numerical simulations and an experimental trial with data collected from a volunteer. The results show that the proposed method improves the estimates produced by the Kalman filter, yielding absolute, dynamic images with a good level of contrast between internal tissues and organs.
/ Electrical impedance tomography estimates the electrical impedance distribution within a region from a set of electrical potential measurements acquired along its boundary while electrical currents are imposed on that same boundary. One of the applications of this technology is lung monitoring of patients in Intensive Care Units. One class of algorithms employed for the estimation is the Kalman filter family, which deals with the estimation problem in a probabilistic framework, looking for the probability density function of the state conditioned on the acquired measurements. In order to use such filters, an evolution model of the system must be employed. This thesis proposes an evolution model of the variation of air in the lungs of patients under artificial ventilation. This model is used in the unscented Kalman filter, a nonlinear extension of the Kalman filter, and is adjusted in parallel with the state estimation, in a dual estimation scheme. An image segmentation algorithm is proposed for identifying the lungs in the images. In order to improve the estimates, the approximation error method is employed to mitigate observation model errors, and prior information is added to the solution of the ill-posed inverse problem. The method is evaluated with numerical simulations and with experimental data from a volunteer. The results show that the proposed method increases the quality of the estimates, allowing the visualization of absolute and dynamic images with a good level of contrast between the tissues and internal organs.
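The core of the unscented Kalman filter named above is the unscented transform, which propagates a small set of deterministically chosen sigma points through the nonlinearity instead of linearizing it. A minimal generic sketch, not the thesis's EIT-specific implementation:

```python
import numpy as np

def sigma_points(mean, cov, kappa):
    """Deterministic sigma points and weights for the unscented transform."""
    n = len(mean)
    S = np.linalg.cholesky((n + kappa) * cov)  # matrix square root of scaled cov
    pts = [mean] + [mean + S[:, i] for i in range(n)] \
                 + [mean - S[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return np.array(pts), w

def unscented_transform(f, mean, cov, kappa=1.0):
    """Propagate (mean, cov) through a nonlinearity f without linearizing it."""
    pts, w = sigma_points(mean, cov, kappa)
    ys = np.array([f(p) for p in pts])
    y_mean = w @ ys
    d = ys - y_mean
    y_cov = (w[:, None] * d).T @ d
    return y_mean, y_cov

# Sanity check: for a linear map the transform is exact.
A = np.array([[2.0, 0.0], [0.0, 3.0]])
m, c = unscented_transform(lambda x: A @ x, np.array([1.0, 1.0]), np.eye(2))
print(m)  # mean maps to A @ [1, 1] = [2, 3]
```

In a full UKF this transform is applied twice per step: once through the evolution model (prediction) and once through the observation model (update).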
|
294 |
Erros conceituais na aprendizagem contábil: ensine o errado! / Misconceptions in learning accounting: Teach what is wrong!
Elúbian de Moraes Sanchez 05 November 2018 (links)
Concepts and techniques are taught in educational settings and should be learned; however, large-scale national exams have shown undesirable results, revealing a gap in our students' learning. According to Sanchez (2013), in undergraduate Accounting programs the main conceptual errors made by students are the misuse of the concepts of cash and accrual bases and mathematical errors. The definition of conceptual error comes from the literature on misconceptions (Chi, 1992), in which the error emerges in a standard pattern and there is an incompatible relationship between the new concepts to be learned and the students' pre-existing prior concepts. Misconceptions have six characteristics: they are robust, consistent, persistent, homogeneous, recapitulated, and systematic, and are therefore difficult to correct. Chi et al. (1994) draw on the theory of knowledge structuring, which holds that concepts are classified into categories as they are learned. Concepts that are classified in the wrong category, however, become robust misconceptions: they are difficult to learn because of the difficulty of moving the concept to the appropriate category. Based on the definition of misconception and on the structuring of concepts into categories, we sought to understand how students form misconceptions and, based on the types of error found and on the six characteristics of error patterns, we collected evidence of the formation and overcoming of errors by the students.
This evidence helped us design a teaching strategy, built on the structuring of knowledge and different from the "common" strategy of Introductory Accounting classes, which are business students' first contact with accounting, in order to answer our research question: "What is the impact (magnitude and direction) of adopting this misconception-based teaching strategy on student learning?" The impact of the strategy was motivational: it made the students reflect on their misconceptions, but it was insufficient to increase the proportion of correct answers in the assessments administered. / Concepts and techniques are taught in educational settings and should be learned; however, large-scale national exams have shown undesirable results, evidencing a learning gap in our students. According to Sanchez (2013), in undergraduate Accounting programs the main misconceptions held by students are the misuse of the concepts of cash and accrual bases and mathematical errors. The definition of misconception follows the literature (Chi, 1992), in which the error emerges in a standard pattern and there is an incompatible relationship between the new concepts to be learned and the students' pre-existing prior knowledge. Misconceptions have six characteristics: they are robust, consistent, persistent, homogeneous, recapitulated, and systematic. They are therefore difficult to correct. Chi et al. (1994) use the theory of knowledge structuring, which holds that concepts are classified into categories as they are learned. Concepts that are misclassified, however, become robust misconceptions: they are difficult to learn because of the difficulty of transposing the concept into the appropriate category.
Based on the definition of misconception and on the structuring of concepts into categories, we sought to understand how students form misconceptions and, based on the types of error found and on their six characteristics, we collected evidence of the formation and overcoming of errors by the students. This evidence helped us create a teaching strategy, built on the structuring of knowledge, that differs from the "common" strategy of Introductory Accounting classes, which are business students' first contact with accounting, in order to answer our research question: "What is the impact (magnitude and direction) of adopting this misconception-based teaching strategy on student learning?" The impact of the strategy was motivational, as it made the students reflect on their misconceptions, but insufficient to increase the proportion of correct answers in the assessments.
|
295 |
Motif representation and discovery
Carvalho, A.M. 01 July 2011 (links) (PDF)
An important part of gene regulation is mediated by specific proteins, called transcription factors, which influence the transcription of a particular gene by binding to specific sites on DNA sequences, called transcription factor binding sites (TFBS) or, simply, motifs. Such binding sites are relatively short segments of DNA, normally 5 to 25 nucleotides long, over-represented in a set of co-regulated DNA sequences. There are two different problems in this setting: motif representation, concerning the model that describes the TFBSs; and motif discovery, focused on unravelling TFBSs from a set of co-regulated DNA sequences. This thesis proposes a discriminative scoring criterion that culminates in a discriminative mixture of Bayesian networks to distinguish TFBSs from the background DNA. This new probabilistic model provides further evidence of non-additivity among binding site positions, yielding superior discriminative power in TFBS detection. In addition, extra knowledge carefully selected from the literature was incorporated into TFBS discovery in order to capture a variety of characteristics of TFBS patterns. This extra knowledge was combined during the process of motif discovery, leading to results that are considerably more accurate than those achieved by methods that rely on the DNA sequence alone.
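For contrast with the non-additive mixture model proposed here, the standard additive motif representation is the position weight matrix (PWM), which scores each position of a candidate site independently. A minimal sketch with invented binding sites and a uniform background:

```python
import math

def pwm_log_odds(sites, background, pseudo=1.0):
    """Build a log-odds PWM from aligned binding sites (the additive model)."""
    length = len(sites[0])
    pwm = []
    for i in range(length):
        col = [s[i] for s in sites]
        scores = {}
        for b in "ACGT":
            # Pseudocounts avoid log(0) for unseen bases.
            p = (col.count(b) + pseudo) / (len(sites) + 4 * pseudo)
            scores[b] = math.log(p / background[b])
        pwm.append(scores)
    return pwm

def score(pwm, seq):
    """Additive log-odds score: sum of independent per-position scores."""
    return sum(pwm[i][b] for i, b in enumerate(seq))

sites = ["ACGT", "ACGA", "ACGT", "TCGT"]   # invented aligned TFBSs
bg = {b: 0.25 for b in "ACGT"}             # uniform background
pwm = pwm_log_odds(sites, bg)
print(score(pwm, "ACGT") > score(pwm, "GGGG"))  # consensus scores higher
```

The PWM's per-position independence is exactly the additivity assumption that the thesis's mixture of Bayesian networks relaxes by modelling dependencies between positions.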
|
296 |
Automatic Bayesian Segmentation of Human Facial Tissue Using 3D MR-CT Fusion by Incorporating Models of Measurement Blurring, Noise and Partial Volume
Sener, Emre 01 September 2012 (links) (PDF)
Segmentation of the human head in medical images is an important process in a wide array of applications such as diagnosis, facial surgery planning, prosthesis design, and forensic identification. In this study, a new Bayesian method for segmentation of facial tissues is presented. Segmentation classes include muscle, bone, fat, air, and skin. The method incorporates a model to account for image blurring during data acquisition, a prior that helps to reduce noise, and a partial volume model. Regularizations based on isotropic and directional Markov Random Field priors are integrated into the algorithm and their effects on segmentation accuracy are investigated. The Bayesian model is solved iteratively, yielding tissue class labels at every voxel of an image. Sub-methods, as variations of the main method, are generated by switching on or off combinations of the models. Testing of the sub-methods is performed on two patients using single-modality three-dimensional (3D) images as well as registered multi-modal 3D images (Magnetic Resonance and Computed Tomography). Numerical, visual, and statistical analyses of the methods are conducted. Improved segmentation accuracy is obtained through the use of the proposed image models and multi-modal data. The methods are also compared with the Level Set method and an adaptive Bayesian segmentation method proposed in a previous study.
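The voxel-wise Bayesian classification at the heart of such a method can be sketched as a MAP rule over Gaussian intensity likelihoods; the class means, standard deviations, and priors below are illustrative placeholders, and the blurring, partial-volume, and MRF models of the thesis are omitted:

```python
import math

# Hedged sketch: MAP tissue labelling per voxel from a single intensity.
# Parameters are invented, not calibrated values from the study.
CLASSES = {  # class: (intensity mean, intensity std dev)
    "air": (10, 5), "muscle": (70, 12), "skin": (90, 10),
    "fat": (110, 15), "bone": (200, 25),
}
PRIOR = {c: 1.0 / len(CLASSES) for c in CLASSES}  # uniform class prior

def map_label(intensity):
    """Pick the class maximising log-likelihood + log-prior."""
    def log_post(c):
        mu, sd = CLASSES[c]
        ll = -0.5 * ((intensity - mu) / sd) ** 2 - math.log(sd)
        return ll + math.log(PRIOR[c])
    return max(CLASSES, key=log_post)

print(map_label(205))  # classified as "bone" under these parameters
```

The thesis's full model replaces the independent per-voxel decision with an iterative solve in which MRF priors couple neighbouring voxels and the blur/partial-volume models reshape the likelihood.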
|
297 |
The Nagoya protocol: a possible solution to the protection of traditional knowledge in biodiverse societies of Africa
Moody, Oluwatobiloba Oluwayomi January 2011 (links)
There is a growing interplay of competing realities facing the international community in the general areas of innovation, technological advancement and overall economic development. The highly industrialised wealthy nations, largely located in the Northern hemisphere, are on the one hand undoubtedly at the forefront of global research, technology and infrastructure development. The developing and least developed countries, on the other hand, are mostly situated in the Southern hemisphere. They are not as wealthy or technologically advanced as their Northern counterparts, but are naturally endowed with unique variations of plant, animal and micro-organism species occurring in natural ecosystems, as well as the traditional knowledge of how to use these unique species. This knowledge has been adjudged to be responsible for the sustainable maintenance of the earth's biodiversity. Increasing exploitation of biodiversity, spurred on by the competing realities identified above, has left the earth in a present state of alarm with respect to the uncontrolled loss of biodiversity. The traditional knowledge of local peoples has offered significant leads to research institutes from the North in developing major advancements in drugs, cosmetics and agriculture. Little or no compensation, however, has been seen to go back to the indigenous communities and countries that provide the resources and indicate, through their traditional knowledge, various possibilities for the use of such resources. Efforts by some biodiversity-rich countries to address this trend through legislation developed in accordance with the principles of the Convention on Biological Diversity have been frustrated by the inability to enforce their domestic laws outside their borders. Theft of genetic resources and associated traditional knowledge from such countries has therefore remained a major challenge.
Against this backdrop, and at the insistence of biodiversity-rich developing countries, an international regime on access and benefit sharing was negotiated and its final text adopted in 2010. This international regime is contained in the Nagoya Protocol. This research sets out to examine whether the Nagoya Protocol offers a final solution to the protection of traditional knowledge associated with biodiversity in biodiverse countries. It further examines the importance of domestic legislation in achieving the objectives of the Protocol. The research has been tailored to African biodiverse countries and seeks these answers within the context of Africa.
|
298 |
Bayesian Methods in Gaussian Graphical Models
Mitsakakis, Nikolaos 31 August 2010 (links)
This thesis contributes to the field of Gaussian Graphical Models by exploring, both numerically and theoretically, various topics in Bayesian methods for Gaussian Graphical Models and by providing a number of interesting results whose further exploration points to numerous promising future research directions.
Gaussian Graphical Models are statistical methods for the investigation and representation of interdependencies between components of continuous random vectors. This thesis aims to investigate some issues related to the application of Bayesian methods to Gaussian Graphical Models. We adopt the popular $G$-Wishart conjugate prior $W_G(\delta,D)$ for the precision matrix. We propose an efficient sampling method for the $G$-Wishart distribution based on the Metropolis-Hastings algorithm and show its validity through a number of numerical experiments. We show that this method can be easily used to estimate the Deviance Information Criterion, providing a computationally inexpensive approach to model selection.
In addition, we look at the marginal likelihood of a graphical model given a set of data. This is proportional to the ratio of the posterior over the prior normalizing constant. We explore methods for the estimation of this ratio, focusing primarily on applying the Monte Carlo simulation method of path sampling. We also explore numerically the effect of the completion of the incomplete matrix $D^{\mathcal{V}}$, hyperparameter of the $G$-Wishart distribution, for the estimation of the normalizing constant.
We also derive a series of exact and approximate expressions for the Bayes Factor between two graphs that differ by one edge. A new theoretical result regarding the limit of the normalizing constant multiplied by the hyperparameter $\delta$ is given and its implications to the validity of an improper prior and of the subsequent Bayes Factor are discussed.
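A generic random-walk Metropolis-Hastings sampler over the free entries of the precision matrix gives the flavor of sampling from a $G$-Wishart target with unnormalised density $|K|^{(\delta-2)/2}\exp(-\tfrac{1}{2}\mathrm{tr}(KD))$; this is a simplified illustrative sketch, not the thesis's algorithm:

```python
import numpy as np

def log_gwishart(K, delta, D):
    """Unnormalised log-density of W_G(delta, D) at a precision matrix K."""
    eig = np.linalg.eigvalsh(K)
    if eig.min() <= 0:
        return -np.inf  # proposal left the positive-definite cone: reject
    return 0.5 * (delta - 2) * np.sum(np.log(eig)) - 0.5 * np.trace(K @ D)

def mh_gwishart(edges, p, delta, D, n_iter=2000, step=0.05, seed=0):
    """Random-walk MH over the free entries of K: the diagonal plus the
    entries allowed by the graph's edge set; all other entries stay zero."""
    rng = np.random.default_rng(seed)
    K = np.eye(p)
    free = [(i, i) for i in range(p)] + list(edges)
    lp = log_gwishart(K, delta, D)
    samples = []
    for _ in range(n_iter):
        i, j = free[rng.integers(len(free))]
        prop = K.copy()
        eps = step * rng.normal()
        prop[i, j] += eps
        if i != j:
            prop[j, i] += eps  # keep the matrix symmetric
        lp_prop = log_gwishart(prop, delta, D)
        if np.log(rng.random()) < lp_prop - lp:  # Metropolis acceptance
            K, lp = prop, lp_prop
        samples.append(K.copy())
    return samples

# Two variables joined by one edge; delta = 3, D = identity (illustrative).
draws = mh_gwishart(edges=[(0, 1)], p=2, delta=3.0, D=np.eye(2), n_iter=500)
print(draws[-1])
```

Every retained sample respects the graph's zero pattern and stays positive definite; a production sampler would tune the step size and thin the chain.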
|
299 |
Buttressing a Monarchy: Literary Representations of William III and the Glorious Revolution
Dolan, Jr., Richard L. 12 May 2005 (links)
This study examines ways in which supporters of William III and his opponents used literature to buttress their respective views of government in the wake of the Glorious Revolution. Understanding the polemical character of this art provides more insight both into the literature of the 1690s and into the modes of political debate in the period. As the English people moved from a primarily hereditary view of monarchy at the beginning of the seventeenth century to a more elective view of government in the eighteenth century, the Glorious Revolution proved to be a watershed event. Those favoring James II relied on patriarchal ideas to characterize the new regime as illegitimate, while supporters of the coregent asserted the priority of English and Biblical law to argue that the former king had forfeited his right to rule. Chapter one examines three thinkers – Robert Filmer, John Milton, and John Locke – whose thought provides a context for opinions expressed in the years surrounding William of Orange's ascension to the English throne. In chapter two, John Dryden's response to James II's abdication is explored. As the deposed Poet Laureate and a prominent voice supporting the Stuart line, Dryden sheds light, through his response to the Glorious Revolution, on ways in which Jacobites resisted the authority of the new regime. Chapter three addresses the work of Thomas Shadwell, who succeeded Dryden as Laureate, and Matthew Prior, whose poetry Frances Mayhew Rippy characterizes as "unofficial laureate verse." These poets rely on ideas similar to those expressed by Milton and Locke as they seek to validate the events of 1688-1689. The final chapter explores the appropriation of varied conceptions of government in pamphlets and manuscripts written in favor of James II and William III. Focusing on the polemical character of these works from the late 1680s and the 1690s enhances our understanding of the period's literature and the prominent interaction of politics and writing.
|